Big Data Expert for ETL, Python, AWS, and More!
Budget: $TBD

Technologies: Airflow, AWS EMR, AWS Glue, Extract Transform Load (ETL), MySQL, Oracle, PostgreSQL, PySpark, Python, SQL

This opportunity is for a Big Data Expert with deep, hands-on experience in ETL, Python, AWS Glue, AWS EMR/PySpark, Talend, Airflow for orchestration, and SQL.

The candidate will design and implement data models, process and ingest data from a variety of datasets, build orchestration workflows in Airflow, develop and run ad-hoc data ingestion jobs, interact with vendors, and evaluate tools that support business requirements. Prior experience with big data processing tools such as Sqoop, Spark, and Hive is required, along with strong SQL, the ability to write clear technical specifications for data extraction and high-quality code, experience with analytics tools such as Tableau or Qlik, and familiarity with Agile methodology.

The engagement runs for 26 weeks at 40 hours/week, following the IST time zone, and is open to candidates located in India. Compensation is $600-700 (USD) per week.
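To give a concrete sense of the orchestration work described above, here is a minimal sketch of an Airflow DAG that chains the extract, transform, and load steps of a daily ETL pipeline. The DAG id, task callables, and schedule are hypothetical placeholders, not part of the posting; a real pipeline would typically hand the transform step off to an EMR/PySpark or Glue job.

```python
# Minimal sketch of an Airflow-orchestrated ETL workflow (hypothetical names).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull the day's partition from MySQL/Oracle/PostgreSQL
    # (e.g. a JDBC read or Sqoop job) and stage it for processing.
    ...


def transform(**context):
    # Placeholder: run a PySpark/Glue transformation over the staged data.
    ...


def load(**context):
    # Placeholder: load the transformed output into the target tables.
    ...


with DAG(
    dag_id="daily_etl_pipeline",       # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency: extract -> transform -> load
    t_extract >> t_transform >> t_load
```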