Senior Data Lead Opportunity


Skills

Apache Kafka, AWS, Azure, Big Data, Data Modeling, Databricks, GCP, Python, SQL

Join Ollion as a Senior Data Lead and drive transformative data engineering solutions for ambitious organizations worldwide. We value innovation, independence, and diversity, fostering a remote-first culture that supports professional growth and meaningful impact. As a technical leader, you will architect scalable data systems, mentor talented engineers, and collaborate with stakeholders to deliver game-changing results.

Job Overview
  • Lead and mentor data engineering teams in a collaborative, fully remote environment.
  • Architect, design, and implement scalable data pipelines using modern cloud technologies.
  • Champion data engineering best practices, governance, and technical excellence.
  • Collaborate with cross-functional stakeholders to deliver actionable data solutions.
  • Drive innovation while maintaining a customer-focused approach and business impact.
Key Responsibilities
  • Lead and mentor a team of data engineers, fostering growth and technical excellence.
  • Architect and implement scalable data pipelines and processing systems (AWS, GCP, Azure).
  • Establish and enforce data engineering standards and governance frameworks.
  • Collaborate with stakeholders to gather requirements and align data architecture.
  • Drive technical vision for data infrastructure, optimizing workflows for performance and cost.
  • Conduct code reviews and ensure maintainable, scalable solutions.
  • Automate and orchestrate complex data pipelines, reducing manual effort (a minimal ingestion sketch in Python follows this list).
  • Support pre-sales engagements and develop reusable frameworks to accelerate delivery.
  • Maintain cloud certifications and contribute to thought leadership initiatives.
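
The automation and orchestration work above typically comes down to small, repeatable pipeline steps. Below is a minimal sketch of one such batch ingestion step in Python (PySpark), assuming a Delta-enabled Spark environment such as Databricks; the bucket path, table name, and column names (order_id, order_ts, amount) are hypothetical placeholders, not details from this posting.

    # Minimal sketch of an automated batch ingestion step (PySpark + Delta).
    # All paths, table names, and columns below are illustrative assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

    # Ingest raw JSON events landed by an upstream process.
    raw = spark.read.json("s3://example-bucket/raw/orders/2024-01-01/")

    # Light cleanup: normalise types and drop rows without a key.
    cleaned = (
        raw.withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("order_id").isNotNull())
    )

    # Append into a governed Delta table, partitioned by ingestion date.
    (cleaned.withColumn("ingest_date", F.current_date())
            .write.format("delta")
            .mode("append")
            .partitionBy("ingest_date")
            .saveAsTable("analytics.orders"))

In practice a scheduler (Databricks Jobs, Airflow, or similar) would run a step like this on a cadence and handle retries and alerting.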
Required Skills & Qualifications
  • Minimum 10 years in data engineering or related roles, with 2+ years in team leadership.
  • Expertise in Python and SQL for scalable data solutions.
  • Hands-on experience with cloud platforms (AWS, GCP, Azure), Databricks, and Delta Lake.
  • Strong knowledge of data lake and data warehouse concepts, as well as real-time streaming technologies.
  • Proven skills in data modeling, ETL/ELT design, and pipeline orchestration (see the modeling sketch after this list).
  • Excellent communication, collaboration, and stakeholder engagement abilities.
  • Experience with Agile methodologies and leading cross-functional teams.
  • Relevant cloud certifications (e.g., AWS Data Analytics, Google Data Engineer).
  • Understanding of data governance, privacy, and compliance regulations.
  • Commitment to diversity, inclusion, and fostering a collaborative culture.
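
As a rough illustration of the data modeling and ELT design skills listed above, the sketch below derives a simple dimension table and fact table with Spark SQL from Python, again assuming a Delta-enabled cluster; the analytics.orders source and its columns (customer_id, customer_name, order_ts, amount, ingest_date) are hypothetical placeholders carried over from the earlier ingestion sketch.

    # Minimal ELT-style modeling sketch using Spark SQL from Python.
    # Table and column names are illustrative assumptions only.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("orders_dim_model").getOrCreate()

    # Customer dimension keyed on customer_id.
    spark.sql("""
        CREATE OR REPLACE TABLE analytics.dim_customer AS
        SELECT customer_id,
               MAX(customer_name) AS customer_name,
               MIN(ingest_date)   AS first_seen_date
        FROM analytics.orders
        GROUP BY customer_id
    """)

    # Daily order fact table aggregated per customer.
    spark.sql("""
        CREATE OR REPLACE TABLE analytics.fct_orders_daily AS
        SELECT customer_id,
               CAST(order_ts AS DATE) AS order_date,
               COUNT(*)               AS order_count,
               SUM(amount)            AS total_amount
        FROM analytics.orders
        GROUP BY customer_id, CAST(order_ts AS DATE)
    """)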

Job Type: Remote

Salary: Not Disclosed

Experience: Senior

Duration: 12 Months


Similar Jobs

Developer Advocate, DataHub

Posted 12 days ago

Empower and educate DataHub users

Foster active engagement in developer communities

Apache Kafka, Communication, Community Engagement, Data Engineering

Senior Data Engineer Role

Posted 58 days ago

Design and maintain robust data warehouses and datasets

Integrate data best practices into the software development lifecycle

Apache Kafka, BigQuery, Data Warehousing, ETL, Pipeline Development

Senior Backend Engineer Role

Posted 89 days ago

Design and maintain scalable backend systems

Optimize and manage relational databases

Apache Kafka, DevOps, Docker, Java

Full-Stack Engineer Portugal Remote

Posted 94 days ago

Develop scalable full-stack software solutions

Leverage AI tools to enhance products

AI Tools, Apache Kafka, Docker, Git

Senior NodeJS Backend Engineer

Posted 98 days ago

Develop scalable backend microservices

Enhance data security and performance

Apache Kafka, AWS, Docker, Git

Senior Kafka Architect Role

Posted 103 days ago

Architect and implement event-driven systems

Manage and optimize Kafka infrastructure

Agile Methodology, Apache Kafka

Developer Advocate, DataHub

Posted 104 days ago

Empower and support developer community

Create and deliver technical content

Apache Kafka, Communication, Community Engagement, Content Creation

SDE III - Backend

Posted 134 days ago

Design and implement scalable backend services

Optimize database performance and availability

Apache Kafka, Architecture, AWS, DevOps

Remote Scala Jobs

Posted 163 days ago

Facilitate remote job opportunities for Scala developers.

Connect talented Scala professionals with companies in need of their skills.

Apache Kafka, GraphQL, Scala

AI Customer Experience Platform

Posted 165 days ago

Architect and develop highly scalable backend services

Optimize and maintain relational databases for high performance

Apache Kafka, Architecture, AWS, DevOps

Allstate Product Engineer Project

Posted 166 days ago

Implement applications following 12-factor principles for product development

Collaborate within the team to design and build systems and apps

Apache Kafka, CI/CD, GitHub, Java

DBA at Wikimedia Foundation

Posted 181 days ago

The Wikimedia Foundation is seeking a Senior DBA. Our objective is to make the sum of all human knowledge available to everyone, and we persist most of this knowledge in MariaDB.

Implementation, maintenance and troubleshooting of relational database systems in production and staging environments.

SQL, Database Optimization, LAMP Administration, Linux