Are you passionate about crafting robust data pipelines that drive advanced data analytics? At ValueLabs, we are seeking a highly skilled Senior Data Engineer with expertise in Snowflake, Airflow, and the complete data warehousing ecosystem. If you have a proven track record in designing scalable data solutions and enjoy leveraging cutting-edge technologies, join our dynamic team!
Key Responsibilities
- Data Warehouse Architecture: Design and develop efficient data warehouses using Snowflake, optimizing data storage and retrieval for analytics.
- ETL Pipelines: Build and manage data pipelines using Airflow or Luigi, ensuring reliable extraction, transformation, and loading of data for analysis.
- Cloud Integration: Use AWS services such as S3, EC2, RDS, and EMR for data storage, processing, and analytics in the cloud.
- Python Development: Write efficient data manipulation scripts in Python and contribute to enhancing data pipeline functionality.
- Automation: Schedule and automate deployment of data pipelines using Airflow or Luigi, streamlining data processing workflows for timely insights.
- Data Transformation: Transform raw data into structured formats (e.g., Star, Snowflake, Galaxy schemas) to support comprehensive data analysis.
- dbt (Bonus): Apply dbt to streamline data transformation processes and improve overall data quality.
- Collaboration: Work closely with data analysts and data scientists to understand data requirements and translate them into effective data warehousing solutions.
Qualifications
- Extensive experience in data engineering and data warehousing technologies.
- Proficiency in Snowflake and ETL processes, with hands-on experience in building and optimizing data pipelines.
- Familiarity with data pipeline management tools such as Airflow or Luigi.
- Strong command of AWS cloud services (S3, EC2, RDS, EMR) and cloud-based data management practices.
- Advanced skills in Python for data manipulation and scripting.
- Proven experience in scheduling, automating, and deploying production data pipelines.
- Knowledge of dbt is a plus.
- Solid understanding of data lakes, enterprise data warehouse (EDW) concepts, and data modeling techniques (Star, Snowflake, Galaxy schemas).
- Excellent problem-solving abilities and analytical skills.
- Strong communication and collaboration skills, enabling effective teamwork across diverse teams.
Join Us
If you are enthusiastic about leveraging your expertise to build scalable data solutions and drive impactful insights through advanced analytics, we invite you to apply for this exciting opportunity at ValueLabs.