As a Senior AWS Cloud Architect, you’ll be at the forefront of our cloud transformation efforts, responsible for architecting and implementing large-scale, real-time data solutions. Your contributions will help streamline operations across Siemens' diverse ecosystem.
Key Responsibilities
Architect & Design Solutions
- Design and implement robust, scalable, and efficient data architectures using a range of AWS services, including AWS Glue, Amazon Redshift, S3, Kinesis (or Amazon MSK for Apache Kafka), DynamoDB, Lambda, EMR, and Lake Formation.
- Develop architectures that support real-time and batch data processing pipelines.
Data Integration & Streaming
- Integrate real-time data streams from various Siemens business units into a unified AWS-based Data Lake, ensuring smooth, secure, and high-throughput data ingestion.
- Work with AWS Glue (Streaming ETL) and Kinesis/Kafka for stream processing.
Data Lake Management
- Architect and manage an enterprise-grade Data Lake leveraging Amazon S3, AWS Glue, and Lake Formation, ensuring high availability, scalability, and performance.
Snowflake Integration & Performance Optimization
- Design and manage data pipelines into Snowflake, leveraging Apache Iceberg tables for maximum flexibility and efficiency.
- Continuously monitor, analyze, and optimize performance, costs, and reliability of pipelines and systems.
Data Transformation & Quality
- Implement complex transformations to prepare data for downstream consumption by BI tools, analytics teams, and applications.
- Ensure data accuracy, integrity, and consistency across all platforms.
Security, Compliance & Governance
- Embed security, compliance, and governance best practices into every layer of the data platform, from ingestion through consumption.
Cross-Functional Collaboration
- Partner with Data Engineers, Scientists, Architects, and business stakeholders to deliver high-impact, end-to-end data solutions.
- Be a strong advocate for cloud best practices, automation, and infrastructure-as-code.
Monitoring & Troubleshooting
- Build monitoring solutions for all pipelines and services to ensure uptime, reliability, and performance.
- Troubleshoot and resolve any issues within the cloud ecosystem swiftly and effectively.
What You Bring – Skills & Experience
- 8+ years of experience in Cloud Architecture, Data Engineering, or similar roles with a strong focus on AWS.
- Deep hands-on expertise in AWS services such as AWS Glue, Amazon Redshift, S3, Lake Formation, Kinesis / Apache Kafka (Amazon MSK), Lambda, EMR, DynamoDB, and the AWS APIs and CLI tools.
- Proven experience with real-time streaming and batch processing architectures.
- Strong skills in big data tools like PySpark, Hive, and Spark SQL.
- Advanced programming knowledge in at least one of Python, Java, or Scala.
- Familiarity with Snowflake and Apache Iceberg tables is a big plus.
- Ability to design for performance, scalability, fault-tolerance, and cost-efficiency.
- Experience building secure, compliant data systems.
- Excellent problem-solving and analytical capabilities.
- Strong communication skills, with the ability to explain technical ideas to non-technical stakeholders clearly.
- AWS certifications (e.g., Solutions Architect, Data Analytics – Specialty, formerly Big Data – Specialty) are highly desirable.
What You’ll Get in Return
- Opportunity to work with a global team on innovative projects impacting real-world systems.
- Exposure to cutting-edge technologies and the latest cloud-based data engineering patterns.
- Flexible and inclusive work culture focused on well-being, innovation, and continuous learning.
- A chance to work with 312,000+ brilliant minds at Siemens, united by a shared vision: building the future, one day at a time.
At Siemens, we celebrate diversity, authenticity, and creative thinking. We believe innovation thrives when everyone brings their unique identity, background, and voice to the table.