Junior Backend (Big Data) Developer
Talan is an international advisory group specializing in innovation and transformation through technology. With a robust team of 5,000 employees and an impressive €600M turnover, we offer comprehensive services to assist organizations in their key stages of transformation.
Our Services
We provide a continuum of services in four main areas:
- Consulting in Management and Innovation: Supporting business, managerial, cultural, and technological transformations.
- Data & Technology: Implementing major transformation projects.
- Cloud & Application Services: Building or integrating software solutions.
- Service Centers of Excellence: Supporting our services through technology, innovation, agility, sustainability of skills, and cost optimization.
At Talan, we accelerate our clients' transformation through innovation and technology. By understanding their challenges, we enable them to be more efficient and resilient through our support, innovation, technology, and data. We firmly believe that a human-oriented approach to technology will make the new digital age an era of progress for all. Together, let’s commit!
Job Description
We are focused on developing tools for financial regulatory reporting across different geographical locations. Our aim is to create efficient, user-friendly, and scalable solutions used by various teams within our company. Robust data products are essential for mitigating risks and improving our capabilities within the Group.
We run our projects using the Scrum/Agile methodology because we believe continuous delivery and quick feedback loops are key to our success and client satisfaction. To achieve our goals, we rely on technologies such as Scala, Python, Java, SQL/HQL, Airflow, ControlM, Jenkins, GitHub, Hive, Databricks, Azure, S3, and Maven. Our commitment to quality is unwavering: we employ unit tests, automation, and pull request reviews.
What You Will Be Doing
As a Backend Spark Developer, your mission will be to develop, test, and deploy technical and functional specifications provided by Solution Designers, Business Architects, and Business Analysts. Your responsibilities include ensuring operability and compliance with internal quality standards.
- Develop end-to-end ETL processes using Spark/Scala, including data transfer from/to the data lake, technical validations, business logic, etc.
- Work within a high-performance team using Scrum methodology.
- Document your solutions in JIRA, Confluence, and ALM.
- Certify your delivery and its integration with other components by designing and performing relevant tests.
- Collaborate with technical specialists to improve the architecture and technical implementation of the solution in place.
- Work with cross-functional and cross-country teams to integrate data from various sources.
- Develop and maintain a data hub and data pipelines within a microservices architecture.
- Collaborate with other developers to maintain and improve CI/CD pipelines.
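To give a flavor of the first responsibility above (ETL with technical validations before business logic), here is a minimal, self-contained Scala sketch. The record type, field names, and validation rules are purely illustrative, not taken from any actual Talan project; in practice this kind of logic would typically run inside Spark transformations over data lake inputs.

```scala
// Hypothetical record type standing in for a row read from the data lake.
final case class Trade(id: String, notional: BigDecimal, currency: String)

object TradeValidation {
  // Illustrative reference data; a real pipeline would load this from config.
  private val KnownCurrencies = Set("EUR", "USD", "GBP")

  // A record passes technical validation when its key fields are populated
  // and its currency code is recognized.
  def isValid(t: Trade): Boolean =
    t.id.nonEmpty && t.notional > 0 && KnownCurrencies.contains(t.currency)

  // Split a batch into valid records (passed on to the business-logic stage)
  // and rejects (routed to an error sink for reconciliation).
  def partition(batch: Seq[Trade]): (Seq[Trade], Seq[Trade]) =
    batch.partition(isValid)
}
```

The same split-and-route pattern carries over to Spark, where `partition` would become a pair of `filter` transformations on a `Dataset[Trade]`.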
Qualifications
Required Qualifications:
- Bachelor's degree in Computer Science or a related field.
- 1+ years of experience in data/software engineering, preferably in banking.
- Experience with integration solutions (using API and Microservices).
- Experience programming backend applications with Big Data technologies.
- Agile approach to software development.
- Knowledge of Continuous Integration tools (Git, GitHub, Jenkins).
- Strong understanding of SQL databases.
- Proficiency in English (at least B2+ level).
Preferred Qualifications:
- Experience with Jenkins orchestration and GitHub Actions.
- Proficiency in Scala or Python, particularly with Apache Spark.
- Skills in Bash scripting, ControlM, and Airflow.
- Understanding of the software development lifecycle (e.g., HP ALM).
- Knowledge of basics in cybersecurity and quality tools (e.g., Sonar).
- Familiarity with Cloud computing (Docker, Kubernetes, S3, Azure, AWS, EMR, Databricks).
- Understanding of SOA architecture and strong analytical skills.
Additional Information
We offer a permanent, full-time contract.