Azure Data Engineer Job Opportunity at BDO USA

Job Summary:

Join our team as an Azure Data Engineer, where you'll work with cutting-edge technology to deliver high-quality data analytics solutions across various industries. Collaborate with teams on diverse engagements and benefit from continuous career development opportunities. Your role involves hands-on delivery of data analytics projects, including solution development and unit testing.

Job Duties:

  • Design and implement top-tier data ingestion strategies, data warehouses, data mart structures, semantic layers, and models.
  • Create visualizations, streaming processes, API integrations, and automation (RPA) solutions for end-to-end data analytics.
  • Align solutions with client needs and business requirements, ensuring timely delivery.
  • Develop written functional and technical designs.
  • Participate in project status meetings and assist with project and program management.
  • Ensure SLA compliance and perform performance tuning and optimization of solutions.
  • Write code using multiple languages and frameworks, applying architectural patterns and software development principles.
  • Deliver scalable, secure solutions with high performance and broad impact.
  • Support the implementation of data governance programs and best practices.
  • Clean and transform data from source systems into analytics models.
  • Implement models to support data visualizations and integrations.
  • Assist with the implementation of DevOps, DataOps, and MLOps methodologies on projects.
  • Write custom integration logic in applicable programming languages.
  • Establish secure data analytics platforms and infrastructure in collaboration with clients and team members.
  • Contribute to successful solution deployments and integration of DevOps tools.
  • Maintain a broad understanding of data analytics and business intelligence strategies, cloud platforms, methodologies, and tools.
  • Build strong client relationships during project execution.
  • Participate in support activities for existing software solutions.
  • Perform other duties as assigned.

Supervisory Responsibilities:

N/A

Qualifications, Knowledge, Skills, and Abilities:

Education:

  • High School diploma or GED equivalent, required
  • Bachelor’s degree with a focus in Information Systems, Data Science, or Computer Science, preferred

Experience:

  • Three (3) or more years of experience in Data Analytics, Business Intelligence, AI, or Application Development, required

Software:

  • Strong SQL skills, including Data Definition Language (DDL), Data Manipulation Language (DML), views, functions, stored procedures, and performance tuning, required
  • Experience with Data Warehousing, Data Modeling, Semantic Model Definition, or Star Schema Construction, preferred
  • Hands-on experience in end-to-end cloud data analytics solutions on Azure or AWS, preferred
  • Experience with one or more programming languages: C#, Python, Java, Scala, preferred
  • Experience with tabular modeling in Microsoft Fabric, Power BI, or Azure Analysis Services, preferred
  • Experience with Git and DevOps deployment technologies, preferred
  • Experience with Linux, preferred
  • Experience with Data Lake Medallion Architecture, batch/streaming data ingestion into a data lake, AI Algorithms/Machine Learning, and automation tools such as UiPath or Alteryx, preferred

Other Knowledge, Skills, & Abilities:

  • High degree of professionalism and autonomy
  • Excellent verbal and written communication skills
  • Solid organizational skills and attention to detail
  • Ability to multi-task and work independently or within a group environment
  • Effective interaction with people at all organizational levels
  • Strong relationship-building skills with internal and client personnel
  • Encourages a team environment on engagements

Keywords: Data Analytics, Business Intelligence, BI, Synapse, IoT, Machine Learning, Data Lake, Stream, Cube, Microsoft, SQL Server, Tableau, .Net, C#, Qlik, Power BI, Azure Data Factory