Senior Big Data Engineer - Homes.com

Job Description

Overview

CoStar Group (NASDAQ: CSGP) is a leading global provider of commercial and residential real estate information, analytics, and online marketplaces. Included in the S&P 500 Index and the NASDAQ 100, CoStar Group is on a mission to digitize the world’s real estate, empowering all people to discover properties, insights, and connections that improve their businesses and lives.

With over 35 years of industry experience, we have continually refined and perfected our approach to real estate information and online marketplaces. This commitment to innovation helps us deliver exceptional value to our customers, employees, and investors. By equipping the brightest minds with the best resources available, we provide an invaluable edge in real estate.

Homes.com Overview

Homes.com is one of the fastest-growing real estate portals in the industry, driven to become #1. Following its acquisition by CoStar Group in 2014, Apartments.com quickly became the most popular place to find a place. We’re now replicating that success with the new Homes.com.

As part of the CoStar Group, Homes.com leverages over 20 years of expertise in digital marketplaces to continually improve and innovate. We are building a brand poised to define the industry and are seeking big thinkers, brave leaders, and creative talent ready to influence the new age of homebuying.

Position Summary

Homes.com is expanding rapidly and is looking for a Senior Data Engineer to help accelerate our growth. The role involves building out sitewide tracking architecture and feeding data into user-friendly KPI dashboards that track site performance and product usage and provide daily insights into consumer behavior. You will play a key role in designing and maintaining automated data infrastructure, creating BI solutions, and facilitating data access and analysis for key company teams.

Location: Washington, DC
Schedule: Hybrid (3 days onsite, 2 days remote)

Responsibilities

  • Design, build, test, deploy, and maintain real-time and batch data pipelines.
  • Develop and maintain data storage and retrieval systems.
  • Ensure data accuracy, consistency, and security.
  • Integrate data from multiple source systems.
  • Develop and maintain data models and ETL/ELT processes.
  • Collaborate with analytics teams, data scientists, and business analysts.
  • Work with cross-functional engineering teams.
  • Advocate for technical and non-technical evolution and improvement within our teams, including new technologies, tools, and best practices.

Basic Qualifications

  • Bachelor’s Degree from an accredited university, preferably in Computer Science, Data Science, or a related field. MSc or PhD is a plus.
  • A track record of commitment and tenure with prior employers.
  • Proven experience building and launching successful products using terabytes of data.
  • 5+ years of data pipeline engineering experience and deep database engineering experience.
  • Ability to analyze technical requirements and design new data architectures, models, and ETL/ELT strategies.
  • Hands-on experience with cloud-based relational and non-relational databases and proficiency in SQL.
  • Ability to deliver work products that meet specifications and are defect-free and performant.
  • Ability to define architecture and development best practices.

Preferred Skills

  • Proficiency in programming languages such as Python, R, and SQL.
  • Experience with performance tuning of database queries (e.g., in Snowflake or Databricks).
  • Proficiency in data modeling techniques.
  • Experience with NoSQL databases (e.g., DynamoDB).
  • Experience with data pipeline tools (e.g., AWS Glue, Apache Airflow, AWS Lambda).
  • Experience using Confluent Kafka.
  • Working knowledge of cloud computing platforms such as AWS, Azure, or Google Cloud.
  • Knowledge and experience with data frameworks (e.g., Apache Spark).
  • Monitoring and dashboard metric management (e.g., CloudWatch, Kibana).
  • Knowledge of data security.