Senior Data Engineer

Company Description

Founded in 2014, Shippeo is a global leader in real-time multimodal transportation visibility, helping major shippers and logistics service providers operate supply chains that are more collaborative, automated, sustainable, profitable, and customer-focused. Hundreds of customers, including global brands such as Coca-Cola HBC, Carrefour, Renault Group, Schneider Electric, Total, Faurecia, Saint-Gobain and Eckes Granini, rely on Shippeo to monitor over 28 million shipments annually across 92 countries. Having already raised €110 million in funding, Shippeo grows by an average of 80% year on year. Its team of Shippians comprises 28 different nationalities, speaking a total of 24 languages.

Job Description

We are looking for a Data Engineer to join our Data Intelligence Tribe. The Data Intelligence Tribe is responsible for leveraging Shippeo's data, drawn from our vast shipper and carrier base, to build data products that help our users (shippers and carriers alike) and ML models deliver predictive insights. The tribe typically enables users to:

- Be accurately alerted in advance of any potential delays or anomalies on their multimodal flows, so that they can proactively anticipate any resulting disruptions.
- Extract the data they need, gain direct access to it, or analyze it directly on the platform to obtain actionable insights that help them improve their operational performance and the quality and compliance of their tracking.
- Benefit from best-in-class data quality, achieved through advanced cleansing and enhancement rules.

As a Data Engineer at Shippeo, your goal is to ensure that data is available and usable by our Data Scientists and Analysts on our different data platforms.
You will contribute to building and maintaining Shippeo's modern data stack, which consists of several technology blocks: data acquisition (Kafka, Kafka Connect, RabbitMQ), batch data transformation (Airflow, dbt), cloud data warehousing (Snowflake, BigQuery), stream/event data processing (Python, Docker, Kubernetes), and the underlying infrastructure that supports these use cases.

Qualifications

Required:

- A degree (MSc or equivalent) in Computer Science.
- 3+ years of experience as a Data Engineer.
- Experience building, maintaining, testing, and optimizing data pipelines and architectures.
- Python programming skills and experience with asynchronous event processing (asyncio).
- Advanced working knowledge of SQL, experience with relational databases, and familiarity with a diverse set of databases.
- Working knowledge of message queuing and stream processing.
- Familiarity with Docker and Kubernetes.
- Understanding of a cloud platform (preferably GCP).
- Experience with workflow management systems such as Airflow.

Desired:

- Experience with cloud-based data warehouse solutions (BigQuery, Snowflake).
- Experience with Kafka and Kafka Connect (Debezium).
- Experience with infrastructure as code (Terraform/Terragrunt).
- Experience building and evolving CI/CD pipelines with GitHub Actions.
- Monitoring and alerting with Grafana/Prometheus.
- Experience with Apache NiFi.

Additional Information

We are looking for talented people who resonate with our values:

- Ambition
- Care
- Deliver
- Collaboration

If our values align with your own and you enjoy working in a fast-paced, international environment, Shippeo could be your ideal workplace! Would you like to learn more? Click here:

- Website
- LinkedIn
- Instagram
- Shippeo Tech blog (Medium)
- Shippeo blog
- Twitter
- Facebook
- Youtube