Staff Data Engineer Lead


Skills

Apache Kafka, AWS, DevOps, Docker, Java, Kubernetes, Python, Snowflake, Software Engineering, SQL, Terraform

Wiser Solutions is seeking a Staff Data Engineer to lead and architect the data platform behind our global Commerce Execution Suite. As part of our remote-first team, you'll drive the development of scalable data pipelines and analytics platforms that empower brands and retailers to optimize pricing, marketing, and operational strategies both online and in-store. You'll work with vast datasets, modern technologies, and cross-functional teams to deliver robust data solutions that create seamless shopping experiences.

Job Overview
  • Lead the design and development of data architectures for large-scale data collection and analytics.
  • Collaborate with product and engineering leaders to define solutions supporting customer needs.
  • Mentor team members and promote best practices across engineering teams.
  • Champion high standards for data quality and reliability.
  • Innovate and bring new ideas for optimizing and scaling data systems.
Key Responsibilities
  • Architect and extend data pipeline services using modern data processing tools.
  • Transform raw, unstructured data into actionable insights for customers.
  • Diagnose and resolve data abnormalities and ensure system reliability.
  • Shepherd projects from conception through production deployment.
  • Lead agile ceremonies and provide technical guidance to team members.
Required Skills & Qualifications
  • Bachelor’s or Master’s degree in Computer Science or related field.
  • 10+ years of professional software engineering experience.
  • Strong proficiency in Python, SQL, and data processing technologies (Spark, Airflow); an illustrative pipeline sketch follows this list.
  • Experience with AWS, Docker, Kubernetes, and infrastructure as code (Terraform).
  • Expertise in database solutions (Snowflake, MongoDB, Postgres, etc.).
  • Solid understanding of streaming platforms (Kafka, Kinesis).
  • Proven track record building data warehouses and lakes.
  • Experience with business intelligence tools (Tableau).
  • Familiarity with ML/Agentic AI pipelines and microservice architecture.
  • Excellent mentorship and leadership skills in an agile environment.
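
The following is a minimal, hypothetical Python/PySpark sketch of the kind of pipeline work described above: aggregating raw price observations into per-SKU insights. It assumes a local PySpark installation, and all application, dataset, and column names are illustrative assumptions, not Wiser's actual schema or codebase.

  # Minimal, illustrative sketch only; names and schema are hypothetical.
  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.appName("pricing-pipeline-sketch").getOrCreate()

  # Hypothetical raw in-store price observations (in practice, read from S3, Kafka, etc.).
  raw = spark.createDataFrame(
      [("sku-1", "store-7", 19.99), ("sku-1", "store-9", 21.49), ("sku-2", "store-7", 5.25)],
      ["sku", "store_id", "observed_price"],
  )

  # Aggregate raw observations into per-SKU pricing insights.
  insights = raw.groupBy("sku").agg(
      F.min("observed_price").alias("min_price"),
      F.avg("observed_price").alias("avg_price"),
      F.count("*").alias("observations"),
  )

  insights.show()
  spark.stop()

In practice, a job like this would read from sources such as S3 or Kafka, write to a warehouse such as Snowflake, and be orchestrated with a tool like Airflow, in line with the requirements above.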

Job Type: Remote

Salary: Not Disclosed

Experience: Staff level (10+ years)

Duration: 12 Months

