Seasoned Data Engineer - GCP/AWS, Python, SQL

Join Wrike: The Leading Work Management Platform

Wrike stands out as the ultimate work management platform designed for teams and organizations that aim to collaborate, create, and excel daily. With Wrike, you can centralize all work processes to eliminate complexity, boost productivity, and allow your team to focus on their most vital tasks.

Our Vision

We envision a world where everyone is free to concentrate on their most significant work, together.

Become Part of the Wrike Family

At Wrike, we combine challenging and enjoyable work experiences. As a rapidly growing company, we offer abundant opportunities for professional growth. Our talented and energetic team, characterized by intelligence, passion, friendliness, and professionalism, makes Wrike a fun place to work. If you possess these qualities, we look forward to having you on board.

Job Opportunity: Seasoned Data Engineer

Title: Seasoned Data Engineer - GCP/AWS, Python, SQL

Location: Remote/Hybrid

Job Responsibilities

  • Own data assets and pipelines that provide actionable insights into customer, product, go-to-market (GTM), and other key business functions.
  • Design, develop, and maintain scalable data pipelines and transformations from various engineering and business systems (e.g., Salesforce/CPQ, NetSuite, Marketo).
  • Collaborate with Analysts to improve the data models feeding business intelligence tools, increase data accessibility, and drive BI tool adoption.
  • Partner with Data Science teams to deploy ML models, adhering to ML lifecycle best practices.
  • Implement data quality management processes and systems, ensuring production data accuracy and maintaining SLAs.

Experience Requirements

  • Experience in building and maintaining data pipelines in data-heavy environments (Data Engineering, Backend, Data Science).
  • Strong proficiency in Python.
  • Advanced working knowledge of SQL and relational databases.
  • 7+ years of experience with Data Warehousing Solutions (BigQuery, Redshift, Snowflake, Vertica, etc.).
  • 7+ years of experience with Data Pipeline Orchestration (Airflow, Dagster, Prefect, etc.).
  • Proficiency in using Git, CI/CD, and containerization technologies.
  • Experience with Google Cloud Platform, AWS, or Azure.
  • Knowledge of database architecture.

Desired Skills

  • Experience with significant B2B vendor integrations (Salesforce/CPQ, NetSuite, Marketo, etc.).
  • Strong understanding of Data Modelling (Kimball, Inmon, SCDs, Data Vault, Anchor).
  • Proficiency with Python data libraries (pandas, SciPy, NumPy, scikit-learn, TensorFlow, PyTorch).
  • Experience with Data Quality Tools, Monitoring, and Alerting.
  • Experience with Enterprise Data Governance, Master Data Management, Data Privacy, and Security (GDPR, CCPA).
  • Familiarity with Data Streaming and CDC technologies (Google Pub/Sub, Google DataFlow, Apache Kafka, Kafka Streams, Apache Flink, Spark Streaming, etc.).
  • Background in building analytic solutions within a B2B SaaS environment.
  • Experience partnering with go-to-market, sales, customer success, and marketing teams.

Interpersonal Skills

  • Excellent communication and collaboration skills.
  • Ability to work effectively in distributed, multi-functional, multinational teams.
  • Ability to promote and maintain a fun and productive team environment.

Your Recruitment Buddy

Your recruitment journey will be guided by Pavel Kucera, Senior Tech Recruiter. #LI-PK1

Our Culture and Values

We are a team of innovators and creators dedicated to solving present and future work challenges. Our hybrid work model encourages office collaboration 2-3 times a week for those near our office hubs, fostering a culture of fast problem-solving and synergistic success.