As a GCP Data Architect, you will play a pivotal role in leading data strategy initiatives, overseeing cloud data infrastructure, and collaborating with cross-functional teams to turn business requirements into actionable, data-driven solutions. This role is best suited for individuals with a strong technical foundation, a passion for cloud innovation, and a commitment to delivering high-quality solutions in a fast-paced, dynamic environment.
Key Responsibilities
- Architect and Design Scalable Solutions. Lead the design and implementation of end-to-end data architectures that leverage GCP services including BigQuery, Cloud Storage, Pub/Sub, and Dataflow.
- Build Robust Data Pipelines. Develop and maintain efficient ETL/ELT pipelines using tools such as Dataflow, Dataproc, and Cloud Composer, ensuring reliability, performance, and scalability.
- Data Warehouse and Lakehouse Strategy. Design and optimize modern data lake, data warehouse, and data mart architectures to support enterprise-scale analytics and reporting needs.
- Cloud Migration Leadership. Drive and manage data migration strategies from legacy on-premises systems or other cloud platforms to GCP with minimal disruption and high data fidelity.
- Cross-functional Collaboration. Partner with data engineers, analysts, business users, and product teams to understand data requirements and deliver solutions that support business goals and KPIs.
- Governance, Security & Compliance. Implement and maintain data governance frameworks, manage data privacy and access controls, and ensure compliance with regulations such as GDPR and HIPAA.
- Performance Tuning & Optimization. Recommend and apply best practices for query performance, storage optimization, cost management, and scalability.
- Monitoring and Troubleshooting. Proactively monitor data pipelines and systems, diagnose bottlenecks, resolve failures, and ensure high availability and business continuity.
- Stay Ahead of the Curve. Keep up with the latest trends and product offerings in the GCP ecosystem and identify opportunities to incorporate new tools or methodologies into existing environments.
Required Qualifications
- Educational Background. Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related technical field.
- Industry Experience. At least 7 years of professional experience in data architecture and engineering roles, including a minimum of 3 years of hands-on experience on GCP.
- GCP Proficiency. In-depth knowledge of core GCP services including BigQuery, Pub/Sub, Dataflow, Dataproc, Cloud SQL, Cloud Storage, Looker, and Cloud Composer.
Technical Skills
- Strong data modeling and database design expertise.
- Proficiency in SQL, Python, and Terraform.
- Familiarity with CI/CD pipelines and infrastructure-as-code for deploying data solutions.
- Solid understanding of distributed systems, real-time streaming, and batch processing.
Security & Compliance
- Experience implementing security best practices, including IAM roles, VPC Service Controls, and encryption, along with knowledge of data privacy regulations.
Soft Skills
- Strong analytical and problem-solving abilities.
- Excellent written and verbal communication skills.
- Ability to lead technical discussions and articulate architectural decisions to both technical and non-technical stakeholders.
- Proactive, self-motivated, and collaborative mindset.
What We Offer
- The opportunity to work with a global leader in IT services and consulting.
- A highly collaborative and innovative work culture that supports career development.
- Exposure to cutting-edge cloud technologies and real-world business challenges.
- Flexibility to work in a hybrid model with teams across India and globally.
Ready to architect the future of data with GCP?
Join us at Tech Mahindra and be part of a mission to drive digital transformation and next-gen innovation for leading enterprises across the globe.