We are looking for an experienced Databricks Senior Data Engineer with deep expertise in data engineering, particularly Azure Databricks, Python, and PySpark. The ideal candidate has a strong understanding of complex data systems and the ability to contribute across all stages of the data platform lifecycle.
Skills Required
- Azure Databricks
- PySpark
- Python
- SQL
- Azure Data Lake / Delta Lake
- Azure DevOps
- CI/CD
- Git
Experience Required
- 7+ years of relevant experience
- 3+ years of hands-on experience with Azure Databricks
- 5+ years of proficiency in Python and PySpark
- 5+ years of advanced SQL experience
Job Responsibilities
- Collaborate with business SMEs to implement data validation rules
- Understand business requirements and provide input from a data perspective
- Analyze and understand complex data and data flow
- Understand the overall platform architecture and load data effectively
- Adapt to changes in processes
- Contribute to preparing design documents, unit test plans, and code review reports
- Understand data platform lifecycle
- Work in an Agile (Scrum) environment when required
Mandatory Skills
- Proficient hands-on experience with Azure Databricks (3+ years)
- Proficiency in Python and PySpark (5+ years)
- Advanced knowledge of SQL (5+ years)
- Knowledge of Azure Data Lake, Delta Lake, Azure DevOps, CI/CD, and Git
- Clear understanding of data platform lifecycle
- Experience in preparing design documents, unit test plans, and code review reports
- Knowledge of data lakehouse and business intelligence architecture
- Strong interpersonal and communication skills, with clarity and precision
- Bachelor's degree in IT/Computer Science or a related field
Nice-to-Have Skills
- Knowledge of financial markets, portfolio theory, and risk management
Additional Information
- The position is fully remote
- Candidates must have an immediate to 30-day notice period
If you have the relevant experience and skills, this role offers an excellent opportunity to work with a leading IT services and consulting company and to make a significant contribution to complex data engineering projects.