The Enterprise Essentials team is seeking an Engineer II with a strong background in data engineering on Google Cloud Platform (GCP) or another major cloud platform. This role focuses on developing capabilities in critical areas such as Global Tax, Finance, and Conversational AI platforms. You will have the opportunity to build robust data pipelines that empower data-driven decisions across the organization.
Key Responsibilities
- Design and Implementation. Create scalable data pipelines and maintain automated ETL processes to manage large volumes of structured and unstructured data.
- Data Modeling and Architecture. Design efficient data models and architectures, ensuring data integrity and adherence to governance best practices.
- Cloud-Based Engineering. Utilize GCP tools such as BigQuery, Dataflow, and Cloud Storage to deliver high-performance data solutions.
- Workflow Optimization. Monitor and tune data processing workflows to improve reliability, throughput, and overall data operations.
- Cross-Functional Collaboration. Work closely with data scientists, analysts, and software engineers to ensure seamless integration of data engineering solutions.
- Security and Compliance. Implement data security measures in line with regulatory requirements to protect sensitive information.
- Automation and Innovation. Identify and implement automation opportunities that improve operational efficiency, and evaluate and adopt new tools where they add value.
Leadership Outcomes
- Enterprise Thinking. Align your agenda with enterprise priorities, balancing customer, partner, colleague, and shareholder needs.
- Innovation. Challenge the status quo and drive continuous improvement in existing processes.
- Agility. Make informed decisions quickly, maintaining the highest level of integrity.
- Digital Mindset. Deliver exceptional customer experiences every day through innovative solutions.
Minimum Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 2–3+ years of experience in data engineering on GCP or another major cloud platform (e.g., AWS, Azure).
- Proficiency in SQL and programming languages such as Python, Java, or Scala.
- Experience designing and developing ETL pipelines for large datasets.
- Hands-on experience with GCP tools like BigQuery, Cloud Storage, and Dataflow.
- Knowledge of data modeling and data warehousing principles.
Preferred Qualifications
- Experience with distributed processing frameworks (e.g., Apache Spark).
- Familiarity with infrastructure-as-code (IaC) tools like Terraform or CloudFormation.
- Understanding of Kubernetes or containerization tools for data pipeline deployment.
- Experience with CI/CD pipelines for automating data workflows.
- Knowledge of cloud security best practices.
Benefits
- Competitive base salaries and bonus incentives.
- Financial wellness and retirement support.
- Comprehensive health insurance (medical, dental, vision, life, disability).
- Flexible working arrangements (hybrid, onsite, or virtual).
- Generous paid parental leave policies.
- Access to global wellness centers and confidential counseling services.
- Career development and training opportunities.
Commitment to Diversity
American Express is an equal opportunity employer. We make employment decisions based on merit and without discrimination based on race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law.
Application Process
Candidates are subject to a background verification check in accordance with applicable laws and regulations. Join us at American Express, where your impact matters, and let's lead the way together!