Big Data Engineer – Cloud Migration

Gurugram, Haryana, India
May 14, 2025
May 14, 2026
Hybrid
Full-Time
2 Years
Job Description

We're seeking a passionate, forward-thinking Big Data Engineer to join our journey of migrating complex big data workloads from legacy on-prem systems to modern, cloud-native infrastructure on Google Cloud Platform (GCP). This role is an opportunity to work on high-impact initiatives, design scalable data solutions, and help drive a once-in-a-generation technology transformation across our data landscape.

What You'll Do

  • Be a key player in our large-scale cloud migration initiative, moving use cases from on-prem Big Data clusters to GCP-based data warehousing solutions.
  • Design, build, and optimize high-performance, scalable data platforms using technologies such as Spark, Hive, BigQuery, and other GCP-native services.
  • Collaborate with cross-functional teams including platform engineers, data scientists, product owners, and business users to translate complex requirements into robust data solutions.
  • Lead the development and deployment of automation tools to simplify data migration and accelerate cloud adoption across business units.
  • Support and guide use case owners through the transition process, offering consultative insights on best-fit tools and architecture patterns.
  • Drive excellence in engineering practices by applying DevOps principles, CI/CD pipelines, and Agile methodologies.

What You Bring

  • A Bachelor’s degree in Computer Science, Engineering, or related discipline (Master’s preferred).
  • 3+ years of professional experience in software/data engineering with hands-on expertise in Java, Python, or Scala.
  • 2+ years of experience with GCP services including BigQuery, Dataflow, Dataproc, Cloud SQL, and Pub/Sub.
  • Strong SQL skills, with the ability to write complex queries across relational databases (MySQL, PostgreSQL) and big data stores (Hive, HBase).
  • Experience working with big data technologies like Apache Spark and Hadoop.
  • Familiarity with low-code/no-code data transformation tools and frameworks.
  • Solid understanding of data warehousing concepts, ETL pipelines, and dimensional modeling.
  • Comfort with Git, code reviews, and maintaining clean, reusable code.
  • Experience building CI/CD pipelines with tools such as Jenkins and XLR, and working within Agile development frameworks (SAFe is a plus).
  • Exceptional communication, analytical thinking, and problem-solving skills.
  • GCP Professional Certification (Data Engineer or Cloud Architect) is highly desirable.

What You’ll Get in Return

  • Competitive base salary and performance-based bonuses
  • Comprehensive health benefits, including medical, dental, vision, life insurance, and disability coverage
  • Retirement planning and financial wellness programs
  • Flexible work arrangements (hybrid, remote, or onsite depending on business need)
  • Generous paid parental leave and family support policies
  • Global wellness centers and access to mental health support
  • Ongoing learning and development, mentorship, and leadership training
  • Inclusive and supportive workplace culture where diversity is celebrated

At American Express, we believe diversity makes us stronger. We’re proud to be an equal opportunity employer, and we do not make employment decisions based on race, color, religion, gender identity, sexual orientation, national origin, disability, veteran status, or age.