Data Engineer - Remote Position

  • Full Time
Job expired!

Company Description

We are a global team of professionals who believe that the right knowledge and approach can make technology the solution to the challenges businesses face today. Since 2016, we have been bringing this knowledge and approach to our clients, helping them turn technology into their path to success.

With our roots in Switzerland and our own development team in Lima and across the region, we offer the best of both cultural worlds: the talent and passion of Latin American professionals combined with Swiss organizational skills and mindset.

Job Description

Bertoni Solutions is seeking a Data Engineer to work 100% remotely for a US client.

In this role, you will be responsible for:

  • Developing data solutions for the Azure cloud, working with teams across the organization to deliver on data strategy, architecture, and governance, and driving improvements in data quality, performance, and availability within the data engineering team.
  • Creating data solutions using cloud-based storage and assisting in the deployment of data-driven applications using Azure services.
  • Collaborating with product teams to build and maintain scalable data solutions, and with data management teams to ensure data is organized and accessible for the company's applications.
  • Designing, managing, and maintaining the organization's data architecture and pipelines, working with architects and other data professionals to ensure data services meet the organization's needs and are scalable, reliable, and resource-efficient.
  • Designing, building, and maintaining data solutions on Microsoft Azure, working with engineers and stakeholders to develop and assess them, providing expert guidance on best practices for data management in Azure, and collaborating with the product management, engineering, and operations teams to ensure data solutions meet business goals.

Qualifications

This position is only available to applicants residing in Latin America.

  • At least 2 to 3 years of experience.
  • Practical experience using Azure Data Factory (ADF) to connect to various sources (RDBMS, Salesforce, flat files) and import data into Azure Data Lake Storage (ADLS).
  • Practical experience handling incremental load and full load approaches.
  • Practical experience handling semi-structured data like JSON.
  • Practical experience using Databricks with Delta Lake concepts.
  • Strong SQL skills (Slowly Changing Dimensions [SCD], joins, Common Table Expressions [CTEs], hierarchies, etc.).
  • Practical experience with .
  • Experience with Scala and Python.
  • Practical experience using Databricks Notebooks and optimization techniques.
  • Experience with Azure Cloud Services for Data Engineering.
  • Understanding of Data Layers: Raw, Refined, and Curated Models.
  • Experience with Databricks: programming and automating notebooks using ADF.
  • Proficiency in optimizing the performance of Spark or PySpark code, including partitioning and data caching techniques.
  • Practical experience in creating a user-defined function (UDF) in Databricks and applying it to a data frame.
  • Ability to manage and operate DataFrames for Data Engineering tasks.
  • Advanced proficiency in English.
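To give candidates a sense of the SCD handling listed above, here is a minimal Slowly Changing Dimension Type 2 sketch in Python, using SQLite in place of a warehouse table. All table and column names here are hypothetical examples, not the client's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,
        is_current  INTEGER
    )
""")
# Initial load: customer 1 lives in Lima, open-ended validity.
cur.execute(
    "INSERT INTO dim_customer VALUES (1, 'Lima', '2024-01-01', '9999-12-31', 1)"
)

def apply_scd2(cur, customer_id, new_city, change_date):
    """On an attribute change, close the current row and insert a new one."""
    row = cur.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    ).fetchone()
    if row and row[0] != new_city:
        # Expire the current version of the row.
        cur.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (change_date, customer_id),
        )
        # Open a new current version carrying the changed attribute.
        cur.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
            (customer_id, new_city, change_date),
        )

apply_scd2(cur, 1, "Cusco", "2024-06-01")
history = cur.execute(
    "SELECT city, is_current FROM dim_customer "
    "WHERE customer_id = 1 ORDER BY valid_from"
).fetchall()
print(history)  # [('Lima', 0), ('Cusco', 1)]
```

The same close-then-insert pattern maps onto Delta Lake's `MERGE INTO` in Databricks; this sketch only shows the versioning logic.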

Additional Information

This is a remote opportunity to work with one of our esteemed clients in the United States. The initial contract for this position is for 6 months, with the possibility of extension. To meet the client's requirements, you must be available full-time, working 8 hours per day, Monday to Friday, in Eastern Standard Time (EST).