Data Engineering SDET (Remote in the US)

  • Full Time
PRI Talent is recruiting a Data Engineering SDET for our client. This full-time, 1099 contract staff augmentation opportunity is with a company that excels at reducing electronic waste and recovering value from gently used electronics. Our client has experienced impressive growth and made a significant positive impact on environmental protection, while fostering a unique work culture. They are seeking a capable, highly motivated Data Engineering SDET (Software Development Engineer in Test) to join their data engineering team. The role centers on certifying the quality and reliability of the client's data pipelines, data warehousing, and analytics solutions through automated testing. The ideal candidate will have experience with data engineering, ETL (Extract, Transform, Load) orchestration tools, SQL, and Tableau.

Key Responsibilities

  • Design, implement, and execute comprehensive test plans and test cases for ETL processes (verifying data quality, transformation logic, and performance), machine learning models, and APIs. Identify and report defects, and collaborate closely with data engineers to resolve issues.
  • Collaborate with data engineers to validate and optimize data warehousing solutions. Ensure data consistency, accuracy, and efficient storage.
  • Use ETL orchestration tools such as Fivetran or similar platforms to automate and schedule data workflows. Create tests to validate the functionality and reliability of these workflows.
  • Create and maintain test suites for Tableau dashboards and reports. Verify data accuracy and dashboard functionality to ensure visualizations deliver meaningful insights.
  • Write and execute SQL queries to verify data transformations, data loading, and data retrieval processes. Ensure data consistency and accuracy at each phase of the pipeline (a sample validation check is sketched after this list).
  • Become familiar with real-time data streaming technologies, specifically Amazon Kinesis. Test data streaming processes for correctness, data integrity, and performance.
  • Establish and maintain regression test suites to ensure that changes or updates to data pipelines do not introduce new issues or regressions.
  • Create and maintain automated test scripts and frameworks for data engineering processes to improve testing efficiency and coverage.
  • Work closely with data engineers, analysts, and other stakeholders to understand requirements and ensure data quality and reliability.
  • Document test cases, test plans, and test results. Create and maintain documentation on data pipelines, ETL processes, and data structures.
  • Stay current with industry best practices, emerging technologies, and data engineering and testing trends. Identify opportunities for process improvement and automation.
  • Knowledge of real-time data streaming technologies such as Kinesis is an asset.
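
For illustration only (not part of the client's stated requirements), below is a minimal sketch of the kind of automated data-validation check described above. The table names, the database file, and the use of sqlite3 as a stand-in for a warehouse such as Snowflake or Redshift are all assumptions.

    # Minimal sketch: verify that no rows were dropped between a staging table
    # and its target table. Table names and the sqlite3 connection are
    # hypothetical stand-ins, not the client's actual schema or warehouse.
    import sqlite3

    def test_row_counts_match(db_path="pipeline.db"):
        """Check that the staging and target tables contain the same number of rows."""
        conn = sqlite3.connect(db_path)
        try:
            source_count = conn.execute(
                "SELECT COUNT(*) FROM staging_orders"  # hypothetical staging table
            ).fetchone()[0]
            target_count = conn.execute(
                "SELECT COUNT(*) FROM fact_orders"  # hypothetical target table
            ).fetchone()[0]
            assert source_count == target_count, (
                f"Row count mismatch: staging={source_count}, target={target_count}"
            )
        finally:
            conn.close()

In practice, checks like this would run under a test runner such as pytest against the team's actual databases and schemas.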

Education and Experience

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Demonstrated experience in software testing, specifically in data engineering, ETL processes, and data warehousing.
  • Strong SQL skills and experience with databases such as SQL Server, as well as data warehouses such as Snowflake or Amazon Redshift.
  • Proficiency with ETL orchestration tools such as SnapLogic, Fivetran, or similar platforms.
  • Experience with data visualization tools, particularly Tableau.
  • Familiarity with real-time data streaming technologies like Amazon Kinesis or Apache Kafka is a plus.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills.
  • Ability to work independently and in a team.
  • Knowledge of scripting languages (e.g., Python) for test automation is a plus.
  • Certifications in data engineering or software testing are advantageous.


Please note that we will not consider applications without a cover letter and work samples.