We’re currently looking for a Senior Data Engineer to join our Finance Data Engineering team (Finance-DE). This is an exciting opportunity for someone who thrives on solving complex data challenges, enjoys building scalable solutions, and is passionate about turning data into insights that drive business decisions.
Responsibilities
- Architecting and managing a multi-petabyte scale data lake for financial data and analytics.
- Building and maintaining high-performance ETL/ELT pipelines using modern data tools and frameworks.
- Developing microservices and REST APIs to enable access to data and services across the organization.
- Designing and implementing data models and storage practices that ensure data integrity, security, and performance.
- Driving initiatives around data quality, anomaly detection, and observability.
- Creating self-service tools that empower analysts, engineers, and business users to access and use data more efficiently.
- Championing agile development practices, including TDD, CI/CD, and peer reviews.
- Collaborating with peers across multiple teams to support strategic data needs and ensure consistency in data practices.
Basic Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- Minimum of 5 years’ experience as a Senior Data Engineer or Senior Software Engineer.
- Proficiency in Python, Java, or Scala for backend and data-related programming.
- Deep expertise in SQL, data modeling, and data warehousing concepts.
- Hands-on experience with Apache Spark, Hive, Airflow, and streaming technologies.
- Proven experience building scalable data pipelines, platforms, and APIs.
- Strong understanding of data quality best practices and experience implementing automated anomaly detection.
- Familiarity with modern cloud technologies and experience working on AWS (S3, EMR, Kinesis, RDS, SQS, etc.).
- Agile mindset with experience in TDD, version control (Git), and CI/CD pipelines.
- Excellent communication skills and a collaborative approach to working with cross-functional teams.
Preferred Qualifications
- Experience with Databricks, including building pipelines and working with its APIs.
- Exposure to Kappa architecture and designing real-time data processing solutions.
- Track record of contributing to open-source projects, especially around workflow orchestration tools like Apache Airflow.
- Experience building internal self-service platforms or tools for data access and management.
Why Join Atlassian?
Atlassian isn’t just a place to work; it’s a place where your work matters. You’ll join a team that’s passionate, inclusive, and driven to build tools that unleash the power of every team, everywhere.
- Competitive compensation and equity
- Flexible work arrangements and remote-first setup
- Generous PTO and paid volunteer days
- Robust health and wellness programs
- Access to learning and development resources
- Inclusive policies and practices that embrace diversity, equity, and belonging
Commitment to Diversity & Inclusion
Atlassian is committed to creating a diverse and inclusive environment. We believe that the best teams are built from a variety of backgrounds, experiences, and perspectives. We are proud to be an equal-opportunity employer and ensure all qualified applicants receive consideration without regard to race, religion, gender identity, sexual orientation, disability, or veteran status.
If you require any accommodations or adjustments during the recruitment process, just let us know; we're here to help.