Job Overview
We are looking for a highly qualified Senior Data Engineer to join our data team. The ideal candidate is an expert in data engineering with deep experience in tools and languages such as AWS S3, Pentaho, AWS APIs, Snowflake, AWS CodeCommit, AWS CodeDeploy, AWS CodeBuild, Looker, shell scripting, and Python, along with data warehousing and ETL/ELT processes. This role involves creating, maintaining, and optimizing the data architectures, pipelines, and data sets that serve our diverse business needs.
Responsibilities
Data Architecture and Design:
- Design, build, and maintain scalable, robust data architectures.
- Work with Snowflake and AWS for efficient data warehousing solutions.
Data Integration:
- Develop, test, and maintain ETL/ELT processes using Pentaho.
- Build robust data pipelines to support analytics and data science initiatives.
Data Deployment and Automation:
- Use AWS CodeCommit, AWS CodeDeploy, and AWS CodeBuild for reliable deployment and automation.
- Implement data quality checks, monitoring, and validation.
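To illustrate the kind of data quality checks this role involves, here is a minimal sketch of row-level validation for a pipeline stage. The field names and rules are hypothetical, not from an actual schema:

```python
import csv
import io

# Hypothetical required fields for an incoming feed; illustrative only.
REQUIRED_FIELDS = ["order_id", "amount", "created_at"]

def validate_rows(rows):
    """Basic data quality checks: required fields present and non-empty,
    and amount parses as a number. Returns (valid_rows, errors)."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [f for f in REQUIRED_FIELDS if not row.get(f)]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
            continue
        try:
            float(row["amount"])
        except ValueError:
            errors.append((i, f"non-numeric amount: {row['amount']!r}"))
            continue
        valid.append(row)
    return valid, errors

# Sample feed with one clean row, one missing amount, one bad amount.
sample = io.StringIO(
    "order_id,amount,created_at\n"
    "1,19.99,2024-01-01\n"
    "2,,2024-01-02\n"
    "3,oops,2024-01-03\n"
)
valid, errors = validate_rows(csv.DictReader(sample))
print(len(valid), len(errors))  # 1 valid row, 2 errors
```

In production these checks would typically run inside the ETL tool or an orchestrated task, with failures routed to monitoring rather than printed.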
Collaboration and Support:
- Collaborate with data analysts, data scientists, and other stakeholders.
- Support business users with Looker for data visualization and insights.
- Provide mentorship and guidance to junior data engineers.
Scripting and Development:
- Use Python and shell scripting to build and automate data processing tasks.
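As a small example of the Python data processing tasks above, the sketch below aggregates order amounts by day, the kind of transform a pipeline step might run before loading results into a warehouse. The column names and data are illustrative assumptions:

```python
import csv
import io
from collections import defaultdict

def aggregate_daily_revenue(reader):
    """Sum order amounts per day from a CSV reader of order rows."""
    totals = defaultdict(float)
    for row in reader:
        totals[row["created_at"]] += float(row["amount"])
    return dict(totals)

# Illustrative input; in practice this might stream from S3 or a staging table.
source = io.StringIO(
    "order_id,amount,created_at\n"
    "1,10.00,2024-01-01\n"
    "2,5.50,2024-01-01\n"
    "3,7.25,2024-01-02\n"
)
totals = aggregate_daily_revenue(csv.DictReader(source))
print(totals)  # {'2024-01-01': 15.5, '2024-01-02': 7.25}
```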
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Over 5 years of hands-on experience in data engineering.
- Expertise in the following technologies:
- AWS S3, Pentaho, AWS APIs, Snowflake, AWS CodeCommit, AWS CodeDeploy, AWS CodeBuild.
- Significant experience with data warehousing and ETL/ELT processes.
- Proficiency in Python, shell scripting, and Looker.
- Strong problem-solving skills, attention to detail, and the ability to work in a fast-paced environment.
Requirements
Primary focus on three areas:
- Snowflake
- Pentaho
- S3