Company Description
We Dream. We Do. We Deliver.
As a full-service, data-driven customer experience transformation agency, we partner with Top 500 companies in the DACH region and in Eastern Europe. Originally from Switzerland, Merkle DACH was created out of a merger between Namics and Isobar, two leading full-service digital agencies.
Our 1200+ digital enthusiasts are innovating the way brands are built, providing expertise in Digital Transformation strategy, MarTech platforms, Creativity, UX, CRM, Data, Commerce, Mobile, Social Media, Intranet and CMS. We are part of the global Merkle brand, the largest brand within the dentsu group, which connects us with a network of over 66,000 passionate individuals in 146 countries.
Job Description
- Use CI/CD tools to facilitate deployment of code to staging and production environments.
- Participate in architecting end-to-end solutions for our customers on AWS, Azure and other cloud platforms.
- Maintain Git repositories using the Gitflow branching model.
- Collaborate on feature deliverables to meet milestones and quality expectations.
- Communicate with stakeholders, vendors and technology subject matter experts.
- Document implemented logic in a structured manner using Confluence; plan your activities using Agile methodology in Jira.
- Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs, such as optimizing existing data delivery or redesigning infrastructure for greater scalability.
Qualifications
- Experience in building and productionizing public cloud infrastructure.
- Experience using GitHub, Bitbucket or other code repository solutions.
- Experience in setting up and using CI/CD automation tools such as GitHub Actions, Azure DevOps or AWS CodePipeline.
- Experience with infrastructure as code frameworks such as Terraform, AWS CloudFormation or ARM templates.
- Understanding of containerization (e.g. Docker) and container orchestration services such as Fargate and Kubernetes.
- Experience with scripting languages such as Python, Bash or PowerShell.
- Strong analytical skills related to working with structured and unstructured datasets.
- If you are precise, well organized, communicate well, adapt to changing circumstances, and are not afraid to take responsibility for your work, you will excel in this role.
Preferred Skills
- Understanding of big data concepts and patterns: data lakes, lambda architecture, stream processing, DWH, BI & reporting.
- Experience with data pipeline/workflow management tools such as dbt, AWS Step Functions, AWS Glue, Azure Data Factory or Airflow.
- Knowledge of SQL.
Additional Information
With us, you will become part of:
- An amazing international team where you can gain new and relevant experience.
- A dynamic and supportive environment where you will never fall into a routine.
- Opportunities to grow in line with your skills, interests, and future development goals.
- An agile, start-up atmosphere.
- A friendly international team of creative minds.
Of course, we offer even more:
⛺ 5 weeks of vacation + 3 wellness days.
❤️ 2 Volunteering days to share the kindness of your heart with others.
⏰ Flexible working hours and home office.
🎯 Fully covered certifications in Salesforce, Adobe, Microsoft, etc.
🎓 Full access to Dentsu Academy, LinkedIn Learning, on-site learning sessions.
🐶 Pet friendly offices.
💌 Edenred meal and cafeteria points.
🍹 Team events: company parties, monthly breakfasts, and pub quizzes.
🥪 Snacks and drinks at the office.
💸 Referral bonus programme.
💻 Laptop + equipment.
📞 Corporate mobile phone subscription.