Position Details: Lead Data Engineer (ML Ops) (JOB CODE - 221)

Location: Chennai, Tamil Nadu
Openings: 1
Salary Range:

Description:

Location: Chennai
Work Hours: 12:00 PM to 9:00 PM IST (US business hours)
Availability: Immediate to 8 weeks (Preference for those currently serving notice)
Experience Level: 8–12 Years


Key Responsibilities

  • Lead end-to-end delivery of Data Warehousing and BI solutions, including ETL pipelines, reporting (Power BI/Tableau), and cloud data migrations.

  • Serve as the single point of contact (SPOC) for project engagements, coordinating across teams, business units, and international client stakeholders.

  • Translate business requirements into data pipelines and applications that deliver insights and reports on time.

  • Plan and execute projects with a focus on high-quality deliverables aligned to business goals.

  • Collaborate in Agile development environments, tracking progress, resolving issues, and adapting to changing priorities.

  • Design and implement batch, near real-time, and real-time data integration solutions.

  • Build scalable data platforms and pipelines using AWS services such as Lambda, Glue, Athena, and S3 (a Lambda-to-Athena sketch follows this list).

  • Develop and optimize stored procedures and complex SQL scripts, and automate tasks using Python and shell scripting.

  • Conduct root cause analysis and performance tuning in hybrid cloud environments.

  • Perform data profiling, reverse engineering, and normalization/denormalization for both relational and NoSQL databases.

  • Translate complex technical data concepts into actionable business insights and effectively communicate them to technical and non-technical stakeholders.

  • Lead legacy data migration efforts to modern cloud platforms.

  • Collaborate with cross-functional teams to design and deliver data models and business intelligence solutions.

  • Monitor project progress and proactively manage risks, roadblocks, and deliverables.
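
For illustration, a minimal Python (boto3) sketch of the Lambda-to-Athena pattern referenced above; the database, table, bucket, and column names are placeholders assumed for the example, not details from this posting:

    import boto3

    athena = boto3.client("athena")

    def lambda_handler(event, context):
        # Start an Athena query against a hypothetical Glue catalog table and
        # direct the results to a hypothetical S3 prefix (names are placeholders).
        response = athena.start_query_execution(
            QueryString="SELECT campaign_id, SUM(clicks) AS clicks "
                        "FROM events GROUP BY campaign_id",
            QueryExecutionContext={"Database": "analytics_db"},
            ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
        )
        # Athena runs asynchronously; a caller would poll get_query_execution
        # to pick up the result set once the query completes.
        return {"QueryExecutionId": response["QueryExecutionId"]}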


Must-Have Skills

  • Strong programming skills in Python for ETL, multithreading, API integrations, and AWS Lambda scripting (a threaded-extraction sketch follows this list).

  • Expertise in advanced SQL scripting, query optimization, and performance tuning in cloud/hybrid environments.

  • Proficient in Unix/Linux shell scripting for automation.

  • Hands-on experience with AWS services: Lambda, S3, Athena, Glue, CloudWatch.

  • Solid understanding of data pipeline frameworks, data mining, ELT/ETL architectures, and performance tuning.

  • Experience in data migration projects to AWS and hybrid/multi-cloud environments.

  • Familiarity with version control and deployment tools such as GitLab, Bitbucket, or CircleCI.

  • Experience in Digital Campaign Management or marketing data environments is a strong plus.
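
A rough sketch of the first skill above (Python for ETL with multithreading and API integration), assuming the requests library, an illustrative REST endpoint, and an illustrative S3 bucket:

    import json
    from concurrent.futures import ThreadPoolExecutor

    import boto3
    import requests

    s3 = boto3.client("s3")

    def extract_and_load(page):
        # Pull one page from a hypothetical REST API and land the raw JSON in S3.
        resp = requests.get("https://api.example.com/v1/records",
                            params={"page": page}, timeout=30)
        resp.raise_for_status()
        key = f"raw/records/page={page}.json"
        s3.put_object(Bucket="example-data-lake", Key=key, Body=json.dumps(resp.json()))
        return key

    if __name__ == "__main__":
        # Extraction is I/O-bound, so a thread pool parallelizes it despite the GIL.
        with ThreadPoolExecutor(max_workers=8) as pool:
            for key in pool.map(extract_and_load, range(1, 11)):
                print("landed", key)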


Tools and Technologies

  • AWS ETL Tools: AWS Lambda, S3, Athena, Glue (including Glue Crawlers), CloudWatch, Spark (a Glue/PySpark job skeleton follows this list)

  • Databases: Microsoft SQL Server, AWS-managed databases

  • Programming: Python, Unix shell scripting, SQL

  • Reporting Tools: Power BI, Tableau, Excel

  • Optional: Azure ETL tools, Informatica
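
For reference, a minimal AWS Glue (PySpark) job skeleton of the kind the tool list above implies; the catalog database, table name, key column, and S3 path are assumptions made for the sketch:

    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read a crawled catalog table, drop rows missing a key, write Parquet to S3.
    frame = glue_context.create_dynamic_frame.from_catalog(
        database="analytics_db", table_name="raw_events"
    )
    cleaned = frame.filter(lambda row: row["event_id"] is not None)
    glue_context.write_dynamic_frame.from_options(
        frame=cleaned,
        connection_type="s3",
        connection_options={"path": "s3://example-bucket/curated/events/"},
        format="parquet",
    )
    job.commit()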


Preferred Skills

  • Knowledge of Azure/GCP cloud platforms and ETL tools like Informatica.

  • Experience with CI/CD pipelines for data applications.

  • Exposure to real-time streaming frameworks and data lake architectures (a minimal Kinesis consumer sketch follows this list).

  • Strong communication skills with client-facing responsibilities in global/multi-team environments.

  • Proficiency in visualization and reporting tools: Power BI, Tableau, Excel.
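
To make the streaming bullet above concrete, a bare-bones consumer loop against a hypothetical Kinesis stream (the stream and shard IDs are assumptions; production consumers would more likely use the KCL or enhanced fan-out):

    import json
    import time

    import boto3

    kinesis = boto3.client("kinesis")

    # Read from a single, hypothetical shard; real streams usually have several.
    iterator = kinesis.get_shard_iterator(
        StreamName="example-stream",
        ShardId="shardId-000000000000",
        ShardIteratorType="LATEST",
    )["ShardIterator"]

    while True:
        batch = kinesis.get_records(ShardIterator=iterator, Limit=100)
        for record in batch["Records"]:
            print(json.loads(record["Data"]))  # Data arrives as raw bytes
        iterator = batch["NextShardIterator"]
        time.sleep(1)  # stay under per-shard read throughput limits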


Qualifications

  • Bachelor’s degree in Computer Science, MCA, or a related technical field.

  • 8–12 years of experience in data engineering, ETL, data warehousing, and business intelligence.

  • Minimum 5 years of experience leading projects and managing teams.

  • Technical certifications in AWS and/or Data Warehousing are preferred.

  • Availability within 4–8 weeks, or currently serving a notice period.

  • Willingness to work from 3:00 PM to 1:00 AM IST (US business hours).

     
