Senior Data Engineer

Hybrid

Hyderabad, Telangana

Overview

Job Title: Senior Data Engineer
Location: Hyderabad (Onsite/Hybrid)
Experience: 6+ Years
Employment Type: Full-Time

About the Role:

We are hiring on behalf of our client, a forward-thinking technology company focused on building secure, scalable, and intelligent data platforms. As a Senior Data Engineer, you will play a critical role in designing, developing, and maintaining robust data pipelines and infrastructure that support analytics, machine learning, and business operations. The ideal candidate is hands-on, highly collaborative, and experienced with AWS cloud technologies, data engineering tools, and modern DevOps practices.

Key Responsibilities:

  • Design, build, and optimize scalable, secure, and high-performance data pipelines and platform components.

  • Collaborate with data scientists, analysts, and software engineers to ensure data availability, quality, and accessibility.

  • Develop infrastructure as code using Terraform to automate provisioning and environment management.

  • Manage and deploy containerized applications using Docker in both development and production environments.

  • Set up and optimize CI/CD pipelines using Jenkins for seamless deployment workflows.

  • Write clean, modular, and efficient code in Python and/or Scala to support ETL, data transformation, and processing workflows (an illustrative sketch follows this list).

  • Leverage AWS services (Redshift, S3, Glue, Lambda, etc.) to build and scale data infrastructure.

  • Contribute to data architecture planning, participate in design reviews, and support operational reliability and performance tuning.
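
As a rough illustration of the kind of pipeline work described above, the sketch below shows a minimal PySpark job that reads raw CSV data from S3, applies a simple cleanup step, and writes partitioned Parquet back to S3. The bucket paths, column names, and job name are hypothetical placeholders, and the snippet assumes an environment (such as AWS Glue or EMR) where S3 connectors are already configured.

    # Minimal PySpark ETL sketch; bucket paths and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

    # Read raw CSV landed in a (hypothetical) S3 bucket.
    raw = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv("s3://example-raw-bucket/orders/")
    )

    # Basic cleanup: de-duplicate, cast the amount column, keep completed orders.
    clean = (
        raw.dropDuplicates(["order_id"])
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("status") == "COMPLETED")
    )

    # Write partitioned Parquet to a (hypothetical) curated bucket.
    (
        clean.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated-bucket/orders/")
    )

    spark.stop()

In this role, a job along these lines would typically be provisioned with Terraform, packaged with Docker, and promoted through a Jenkins CI/CD pipeline, per the responsibilities above.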

Required Skills:

  • 6+ years of hands-on experience in data engineering or platform engineering roles.

  • Strong coding skills in Python, Scala, and SQL.

  • Expertise in the AWS data ecosystem: EC2, S3, Glue, Redshift, Lambda, etc.

  • Proficiency with Terraform for Infrastructure as Code (IaC).

  • Experience managing and deploying Docker containers.

  • Solid understanding of ETL pipelines, distributed systems, and data architecture principles.

  • Hands-on experience with CI/CD pipelines using Jenkins.

  • Strong problem-solving skills and the ability to mentor junior engineers.

Good to Have:

  • Experience in regulated domains such as healthcare or finance.

  • Exposure to tools like Apache Airflow, Apache Spark, or Databricks (a brief orchestration sketch follows this list).

  • Familiarity with data quality frameworks and observability tools.
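
As a small orchestration example tied to the tools listed above, here is a minimal Apache Airflow (2.x) DAG sketch wiring a daily extract/transform/load sequence together. The DAG name, schedule, and task bodies are placeholders for illustration only, not a description of the client's actual pipelines.

    # Minimal Airflow 2.x DAG sketch; DAG name, schedule, and task bodies are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Pull raw data from the source system (placeholder step).
        print("extracting")

    def transform():
        # Clean and reshape the extracted data (placeholder step).
        print("transforming")

    def load():
        # Write the transformed data to the warehouse (placeholder step).
        print("loading")

    with DAG(
        dag_id="daily_orders_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        # Run the steps in sequence each day.
        t_extract >> t_transform >> t_load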

Why Join Us?

  • Work on cutting-edge data infrastructure that powers real-world analytics and ML initiatives.

  • Collaborate in a fast-paced, innovation-driven environment.

  • Get exposure to modern DevOps, IaC, and data engineering practices at scale.

  • Flexible working model with strong career growth and learning opportunities.