Data Engineer I


Company Description

We help the world see new possibilities and inspire change for better tomorrows. Our analytic solutions bridge content, data, and analytics to help businesses, people, and society become stronger, more resilient, and sustainable.

Job Description: Data Engineer I at Verisk

About the Role

The Data Engineer will provide data pipeline expertise to guide the development and implementation of data products and reporting solutions within the Verisk Claims Solutions division. The successful candidate has experience developing data pipelines using cloud services and designing data structures that support data products and reporting systems. This hands-on role involves transitioning a large SQL Server data warehouse implementation to Snowflake and collaborating with other technical leads to tackle technical problems and expand the Data Engineering footprint and expertise in our offshore location. Additionally, the candidate will develop new dimensional models and create innovative solutions to satisfy the growing needs of internal and external customers.

Day-to-Day Responsibilities

  • Pipeline Development: Design and develop data pipelines using AWS services such as Glue, Step Functions, Lambda, SNS, SQS, and S3 in conjunction with computing platforms like Snowflake and Databricks.
  • Data Modeling: Design, review, and build Kimball-style dimensional models for data products and business intelligence tools such as Power BI and ThoughtSpot.
  • Collaboration: Work closely with product owners and other technical leads to understand and refine new feature requests. Collaborate with the data engineering team to design, review, and develop new data pipelines.
  • Communication: Communicate technical ideas in terms that product owners and non-technical users can understand.
  • Best Practices: Ensure standard practices for logging, monitoring, alerting, and data security are followed.
  • Data Security: Understand and implement Verisk's data security policies and controls.
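For candidates unfamiliar with the term, a Kimball-style model organizes data into a central fact table of measures surrounded by dimension tables of descriptive attributes. The sketch below illustrates the idea with an in-memory SQLite database; all table and column names are hypothetical examples, not Verisk's actual schemas:

```python
import sqlite3

# Minimal Kimball-style star schema: one fact table joined to two dimensions.
# Names (dim_date, fact_claims, etc.) are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240115
    full_date TEXT,
    year      INTEGER,
    month     INTEGER
);
CREATE TABLE dim_claim_type (
    claim_type_key  INTEGER PRIMARY KEY,
    claim_type_name TEXT
);
CREATE TABLE fact_claims (
    date_key       INTEGER REFERENCES dim_date(date_key),
    claim_type_key INTEGER REFERENCES dim_claim_type(claim_type_key),
    claim_count    INTEGER,
    paid_amount    REAL
);
""")

cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?, ?)", [
    (20240115, "2024-01-15", 2024, 1),
    (20240220, "2024-02-20", 2024, 2),
])
cur.executemany("INSERT INTO dim_claim_type VALUES (?, ?)", [
    (1, "Auto"), (2, "Property"),
])
cur.executemany("INSERT INTO fact_claims VALUES (?, ?, ?, ?)", [
    (20240115, 1, 10, 2500.0),
    (20240115, 2, 4, 1200.0),
    (20240220, 1, 7, 1800.0),
])

# A typical BI-style rollup: measures aggregated by dimension attributes,
# the kind of query a Power BI or ThoughtSpot report would issue.
cur.execute("""
    SELECT d.year, t.claim_type_name, SUM(f.paid_amount)
    FROM fact_claims f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_claim_type t ON f.claim_type_key = t.claim_type_key
    GROUP BY d.year, t.claim_type_name
    ORDER BY t.claim_type_name
""")
rows = cur.fetchall()
print(rows)  # [(2024, 'Auto', 4300.0), (2024, 'Property', 1200.0)]
```

In production the same shape would live in Snowflake and be populated by AWS Glue or Step Functions pipelines rather than inline inserts, but the modeling principle is identical.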

Qualifications

  • BS or MS degree in Computer Science or Engineering OR equivalent years of work experience.
  • At least 3 years of engineering experience.
  • Proven coding and debugging skills in TypeScript (mandatory).
  • Experience with AWS services like S3, DMS, CFT.
  • Proficiency with version control systems such as Git and Azure DevOps.
  • Proficient in writing SQL for aggregating and transforming data as part of a data pipeline.
  • At least 2 years of data warehousing experience building Kimball-style data models for data products.
  • Experience with Snowflake or equivalent preferred.
  • Ability to work and communicate effectively across disciplines and teams.
  • Experience building/working with CI/CD pipelines is a plus.

Why Verisk?

For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry by delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster. At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the comprehensive support you need to succeed. Recognized repeatedly as a Great Place to Work®, Verisk values learning, caring, results, and inclusivity and diversity. Join us to help translate big data into big ideas and create an exceptional experience for yourself and a better tomorrow for future generations.

Verisk Analytics is an equal opportunity employer. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran status, age, or disability.