All roles at JumpCloud are Remote unless otherwise specified in the Job Description.

About JumpCloud
JumpCloud® delivers a unified open directory platform that makes it easy to securely manage identities, devices, and access across your organization. With JumpCloud®, IT teams and MSPs enable users to work securely from anywhere and manage their Windows, Apple, Linux, and Android devices from a single platform. JumpCloud® is IT Simplified.

About the role
We're looking for a Senior Data Engineer to join JumpCloud's Data Enablement team. Data Enablement's vision is for data to drive JumpCloud and our customers. The team's current mission is to put in place the foundational technology and processes that uplevel the data capabilities of our product and our Data Warehouse/Lakehouse. We are introducing an event-based architecture, developing and refining a data model that supports JumpCloud's growth strategy, and modernizing our Data Warehouse.

A successful data engineer will exhibit an entrepreneurial spirit and enjoy tackling data engineering problems that most other people cannot solve, as well as shaping the future capabilities of JumpCloud's data engineering, performance reporting, and data governance. Come be a part of an exciting new team where you will work on challenging projects and rich data sets and develop valuable skills.

This role involves frequent engagement with analytics partners, data/platform engineering, and product engineering to mature our data model, pipelines, and data practices. The role reports to the Senior Manager of Data. This is a senior-level position.

What you'll be doing:
As part of the Data Enablement team, and of the engineering organization as a whole at JumpCloud, you will be responsible for providing critical data infrastructure and systems for multiple areas of the business, including Business Analysis, Product Development, Engineering, Finance, Sales, and Executive Strategy.

On a day-to-day basis, as a senior-level data engineer, you may be asked to:
- Interface with stakeholders to define needs and develop strategies for providing data
- Integrate technologies such as Airflow, Python, and Kafka
- Plan, build, and maintain data pipelines from internal and external data sources (a brief sketch of such a pipeline follows this list)
- Implement data observability and monitoring in the pipeline and in the warehouse
- Work with the appropriate teams to ensure data security and data compliance
- Guide data analysts to ensure clean delivery of data
- Work with other senior-level engineers and architects toward top-level proficiency in core data engineering skills and business functions
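As a rough illustration of the pipeline work referenced in the list above, here is a minimal sketch of an Airflow DAG that stages events into a warehouse table. It assumes Apache Airflow 2.4 or later with the TaskFlow API; the DAG name, event shape, and loading logic are hypothetical placeholders, not JumpCloud's actual implementation.

    # Minimal sketch, assuming Airflow 2.4+ (TaskFlow API). All names are illustrative.
    from datetime import datetime, timedelta

    from airflow.decorators import dag, task


    @dag(
        schedule="@hourly",
        start_date=datetime(2024, 1, 1),
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    )
    def example_events_to_warehouse():
        @task
        def extract_events() -> list[dict]:
            # Stand-in for consuming from a broker such as Kafka, or for reading
            # files landed in object storage by an upstream producer.
            return [{"event_id": 1, "event_type": "login", "ts": "2024-01-01T00:00:00Z"}]

        @task
        def load_to_staging(events: list[dict]) -> int:
            # Stand-in for a batch insert into a warehouse staging table
            # (Snowflake or similar), emitting a row count for observability.
            print(f"loaded {len(events)} events")
            return len(events)

        load_to_staging(extract_events())


    example_events_to_warehouse()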
You have:
- Extensive hands-on experience building scalable data solutions with complex, fast-moving data sets
- The ability to lead the technology on small to large projects from start to finish
- Strong experience with cloud data warehouse and data lake architectures and implementations
- Proven proficiency in data modeling and database design, with an emphasis on designing optimized self-service data solutions
- Experience with both batch and streaming data pipelines and ELT processes
- The ability to work and communicate effectively with other engineers and with both technical and non-technical business stakeholders
- The ability to quickly integrate new technologies and industry best practices into your skill set
- Expert-level SQL skills
- Proficiency with Python programming tools and the Python ecosystem, applied with strong software engineering techniques
- Experience with some of the following: message brokers, data sync/mirroring tools, stream and batch processors, data orchestrators, and workflow engines

Nice to haves:
- Python 3 and Go (golang)
- Software development following general software engineering principles
- SQL for data transformation and analysis, with optimization and tuning in mind
- Snowflake data warehouses (or equivalent)
- Dremio data lakehouses
- Apache Airflow
- Apache Kafka (and its supporting tools)
- Experience building stream and batch processing big data systems
- Experience building observable data systems
- Experience with basic data modeling and data architecture
- Experience with cloud data storage techniques
- Familiarity with data storage formats such as JSON/Avro/Protobuf/Parquet/Iceberg (a short example follows this list)
- The ability to work effectively both independently and as part of the data engineering team as a whole
- Experience with data governance, including data contracts and schema management
- Experience with data security standards, including RBAC and sensitive data handling
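As a concrete, if simplified, illustration of the storage-format familiarity mentioned above, the snippet below writes and reads a small Parquet file with pyarrow. The schema, values, and file path are made up for the example.

    # Minimal sketch of working with a columnar format (Parquet), assuming pyarrow
    # is installed. The column names, values, and path are illustrative only.
    import pyarrow as pa
    import pyarrow.parquet as pq

    # Build a small in-memory table with an explicit schema.
    table = pa.table(
        {
            "event_id": pa.array([1, 2, 3], type=pa.int64()),
            "event_type": pa.array(["login", "logout", "login"], type=pa.string()),
        }
    )

    # Write it out with compression, then read it back and inspect the schema.
    pq.write_table(table, "events.parquet", compression="snappy")
    restored = pq.read_table("events.parquet")
    print(restored.schema)
    print(restored.to_pydict())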
#LI-MS1

Where you'll be working/Location:
JumpCloud is committed to being Remote First, meaning that you are able to work remotely within the country noted in the Job Description. This role is remote in the country of India. You must be located in and authorized to work in India to be considered for this role.

Language:
JumpCloud® has teams in 15+ countries around the world and conducts our internal business in English. The interview and any additional screening process will take place primarily in English. To be considered for a role at JumpCloud®, you will be required to speak and write in English fluently. Any additional language requirements will be included in the details of the job description.

Why JumpCloud?
If you thrive working in a fast, SaaS-based environment and you are passionate about solving challenging technical problems, we look forward to hearing from you! JumpCloud® is an incredible place to share and grow your expertise! You'll work with amazing talent across each department who are passionate about our mission. We're out-of-the-box thinkers, so your unique ideas and approaches for conceiving a product and/or feature will be welcome. You'll have a voice in the organization as you work with a seasoned executive team, a supportive board, and in a proven market that our customers are excited about.

One of JumpCloud®'s three core values is to "Build Connections." To us that means 'creating human connection with each other regardless of our backgrounds, orientations, geographies, religions, languages, gender, race, etc. We care deeply about the people that we work with and want to see everyone succeed.' - Rajat Bhargava, CEO

Please submit your résumé and a brief explanation about yourself and why you would be a good fit for JumpCloud®. Please note that JumpCloud® is not accepting third-party resumes at this time.

JumpCloud® is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.

Scam Notice: Please be aware that there are individuals and organizations that may attempt to scam job seekers by offering fraudulent employment opportunities in the name of JumpCloud. These scams may involve fake job postings, unsolicited emails, or messages claiming to be from our recruiters or hiring managers. Please note that JumpCloud will never ask for any personal account information, such as credit card details or bank account numbers, during the recruitment process. Additionally, JumpCloud will never send you a check for any equipment prior to employment. All communication related to interviews and offers from our recruiters and hiring managers will come from official company email addresses (@jumpcloud.com) and will never ask for any payment, fee, or purchase to be made by the job seeker. If you are contacted by anyone claiming to represent JumpCloud and you are unsure of their authenticity, please do not provide any personal or financial information; instead, contact us immediately at recruiting@jumpcloud.com with the subject line 'Scam Notice'.

#LI-Remote #BI-Remote