Senior Associate 1 GTS - DES


Join Our Audit Data Engineering Team

Role Overview

We are looking for a talented and dedicated individual to join our Audit Data Engineering team at KPMG India. This is a techno-functional role in which you will develop expertise in KPMG's proprietary tools and their underlying business rules. Your responsibilities will include extracting, validating, analyzing, and visualizing data from our clients' ERP systems, whether hosted on-premises or in the cloud. In this role, you will provide standard reports, audit work papers, and insights to our Audit engagement teams across multiple business processes. You will also assist in developing solutions for various Audit Data & Analytics (D&A) services.

Key Responsibilities

Development

  • Build and configure ETL tools to extract and transform data successfully from multiple sources, whether on-premises or in the cloud.
  • Apply proficiency in Azure cloud technologies to perform data extraction, transformation, and loading (ETL), and provide technical guidance on debugging errors and issues in the process.
  • Design, code, verify, test, document, amend, and refactor moderately complex programs/scripts, adhering to agreed standards and tools to achieve a well-engineered result.
  • Develop and implement data ingestion, transformation, and validation processes using Azure cloud applications, ensuring data quality, consistency, and reliability (a brief sketch follows this list).
  • Apply data analysis, design, modeling, and quality assurance techniques based on a detailed understanding of business processes.
  • Participate in the design, development, and implementation of fixes and enhancements to new or existing modules.
  • Assist with or lead the development of operational and/or engagement-team routines.
  • Lead team deliverables and train professionals to build working knowledge of, and earn certifications in, Azure data engineering.
  • Experience with visualization tools for building dashboards and reports, or with Power Apps, is advantageous.
  • Knowledge of leveraging AI/ML algorithms (k-NN, Naïve Bayes, SVM, decision forests) and modeling frameworks (PyTorch, TensorFlow, Keras) using Python is a plus.
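
For illustration, the following is a minimal PySpark sketch of the ingestion, transformation, and validation work described above. The storage paths and column names are hypothetical examples, and on a Databricks cluster the runtime would normally supply the SparkSession:

    # Minimal sketch of an ingest -> transform -> validate step in PySpark.
    # All paths and column names are hypothetical examples.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("audit-ingest-sketch").getOrCreate()

    # Ingest: read raw ERP extracts landed in a hypothetical data-lake path.
    raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/erp/gl_entries/")

    # Transform: normalize names, cast amounts, and derive a posting period.
    gl = (
        raw.withColumnRenamed("DOC_NO", "document_number")
           .withColumn("amount", F.col("AMOUNT").cast("decimal(18,2)"))
           .withColumn("posting_period", F.date_format("POSTING_DATE", "yyyy-MM"))
    )

    # Validate: a basic completeness check before publishing to the curated layer.
    null_keys = gl.filter(F.col("document_number").isNull()).count()
    if null_keys > 0:
        raise ValueError(f"{null_keys} rows are missing document_number; aborting load")

    gl.write.mode("overwrite").partitionBy("posting_period").parquet(
        "abfss://curated@examplelake.dfs.core.windows.net/erp/gl_entries/"
    )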

Execution

  • Support clients on data extraction remotely, dealing with medium to high complexity and medium to large data sizes.
  • Assist Audit engagement teams by coordinating with the Client’s IT teams and other technical leads during the data extraction process.
  • Work with engagement teams to interpret results and provide meaningful audit insights from reports.
  • Develop transformations using Azure Databricks, Azure Data Factory, or Python, and handle any data mapping changes and customizations within Databricks using PySpark code.
  • Debug, optimize, and performance-tune processes for large datasets, resolving issues with limited guidance and proposing possible solutions.
  • Reconcile data across multiple data layers to maintain data integrity and completeness (see the reconciliation sketch after this list).
  • Maintain accurate and up-to-date project status for self and any assigned team member.
  • Prepare and review the documents supporting the engagement with the utmost attention to detail.
  • Handle and analyze enormous volumes of data using big data technologies like Azure Databricks and Apache Spark. Create data processing workflows and pipelines to support data analytics, machine learning, and other data-driven applications.
  • Coach Associates on data processing best practices and enable them to handle low complexity work.
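
As a hedged illustration of the reconciliation bullet above, this sketch compares record counts and a control total between two hypothetical data layers; it assumes both layers share a normalized schema with an "amount" column:

    # Sketch: reconcile record counts and control totals between the raw and
    # curated layers. Paths and the "amount" column are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("audit-recon-sketch").getOrCreate()

    raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/erp/gl_entries/")
    curated = spark.read.parquet("abfss://curated@examplelake.dfs.core.windows.net/erp/gl_entries/")

    def summarize(df):
        # One-row summary: record count plus a control total over the amount column.
        return df.agg(
            F.count(F.lit(1)).alias("row_count"),
            F.sum("amount").alias("amount_total"),
        ).first()

    raw_summary = summarize(raw)
    curated_summary = summarize(curated)

    # Completeness and integrity checks: both layers must agree before
    # reports or work papers are produced from the curated data.
    assert raw_summary["row_count"] == curated_summary["row_count"], "row counts differ"
    assert raw_summary["amount_total"] == curated_summary["amount_total"], "control totals differ"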

Job Requirements

Technical Skills

Primary Skills:

  • Azure Data Factory
  • Azure Data Lake Storage
  • Azure Databricks
  • Azure Synapse Analytics
  • Python or PySpark
  • SQL and PL/SQL

Experience:

  • 5+ years of IT experience, with a focus on ETL and Microsoft Azure.
  • Experience in building ETL/ELT processes, Data Ingestion, and Data Migration.
  • Building Python or PySpark notebooks that support data transformation through seamless integration with Azure Data Lake Storage (see the sketch after this list).
  • Monitoring, troubleshooting, and optimizing the performance of Databricks notebooks, Azure Data Factory, and Synapse workloads.
  • Hands-on experience with Azure Cloud Services and with using Databricks for processing large datasets.
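
To make the Data Lake integration point concrete, here is a short, hedged sketch of a Databricks notebook configured to read from Azure Data Lake Storage Gen2 with a service principal. The storage account name, secret scope, key names, and tenant placeholder are all hypothetical, and the spark and dbutils objects are supplied by the Databricks runtime:

    # Sketch: authenticating a Databricks notebook to ADLS Gen2 via OAuth with
    # a service principal. Account, scope, and key names are hypothetical;
    # spark and dbutils come from the Databricks runtime.
    account = "examplelake"
    base = f"{account}.dfs.core.windows.net"

    spark.conf.set(f"fs.azure.account.auth.type.{base}", "OAuth")
    spark.conf.set(
        f"fs.azure.account.oauth.provider.type.{base}",
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    )
    spark.conf.set(
        f"fs.azure.account.oauth2.client.id.{base}",
        dbutils.secrets.get(scope="audit-des", key="sp-client-id"),
    )
    spark.conf.set(
        f"fs.azure.account.oauth2.client.secret.{base}",
        dbutils.secrets.get(scope="audit-des", key="sp-client-secret"),
    )
    spark.conf.set(
        f"fs.azure.account.oauth2.client.endpoint.{base}",
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    )

    # With authentication configured, notebooks can read and write abfss:// paths.
    vendors = spark.read.parquet(f"abfss://raw@{base}/erp/vendors/")
    vendors.show(10)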