DataOps - La Redoute Porto
- 06/15/2024
Main Responsibilities
• Understand problems from the user's perspective and communicate clearly to clarify the issue.
• Reproduce bugs or issues that users are facing.
• Apply root cause analysis to quickly identify the source of a problem, patch it, test the fix, and communicate the resolution to the end user.
• Write postmortems summarizing every step of the resolution, helping the team track all issues.
• Monitor existing flows and infrastructure to identify potential issues, and apply the same resolution process to bugs discovered through monitoring and alerting.
• Adapt configurations to keep flows and infrastructure working as expected, so operations run smoothly without incidents.
• Track costs and processing time through dedicated dashboards.
• Alert users whose inefficient queries drive up costs.
• Identify and optimize inefficient jobs, queries, views, and tables to reduce costs and improve execution speed.
• Manage infrastructure using Terraform.
• Share and propose best practices.
• Decommission unused infrastructure such as services, tables, or virtual machines.
• Track future deployments with a Data Architect and participate in Deployment Reviews.
• Share and propose best practices for deployment.
• Support Data Engineers during the entire deployment process and subsequent active monitoring phases.
• Ensure diligent application of deployment processes, logging, and monitoring strategies.
• Take over newly deployed flows in the run process.
Required Skills
• Google Cloud Platform: General knowledge of the platform and various services, with at least one year of experience with GCP.
• Apache Airflow: At least two years of experience with the Airflow orchestrator; experience with Google Composer is a plus.
• Google BigQuery: Extensive experience (at least 4 years) with GBQ, including optimizing tables and queries and designing database architecture.
• Terraform: At least two years of experience with Terraform and knowledge of GitOps best practices.
• Apache Spark: This is an optional expertise we value; some of our pipelines use pySpark.
• Pub/Sub
• Kafka
• Azure Analysis Services
• Google Cloud Storage Optimization
Apply now to join Alter Solutions as a DataOps specialist at La Redoute Porto and help optimize and manage our cutting-edge infrastructure!