Senior Data Architect

  • Full Time

Join SADA as a Senior Data Architect!

Your Mission

As a Senior Data Architect, your mission goes beyond advising to hands-on execution: you will collaborate with a diverse team of engineers to design, build, and enhance robust full-stack architectures and data pipelines that optimize data operations, improve visibility, and ensure reliability. Backed by a Scrum Master and a team of platform, data, and security engineers, you will act as a crucial link between our customers' business needs and technical requirements, creating and implementing custom data strategies that align with their objectives. Your leadership will expand the engineering team's delivery capabilities through mentoring, code reviews, and knowledge sharing.

Your expertise will help guide customers towards optimal solutions, and your dedication to keeping pace with industry advancements will ensure our strategies remain at the forefront. Your role will be to promote a culture of open discussion and debate, encouraging the exploration of diverse viewpoints and ideas to drive innovative solutions.

You will strive for transparency and communication in all interactions, setting clear expectations and providing regular updates to ensure we're effectively meeting customer needs. In collaboration with our customers and internal teams, you'll design and implement cloud-based solutions that are secure, scalable, and reliable, aiming for operational excellence.

By forming strong relationships with key customer stakeholders and internal teams, you'll ensure our solutions align with strategic goals and customer needs. Your focus on delivery excellence will foster effective teamwork, holding team members accountable for the quality of their work and promoting a culture of continuous improvement. Through your leadership and practical work, you will be critical in optimizing data operations, improving visibility, and ensuring reliable data architectures, ultimately helping our customers succeed on their digital transformation journey.

Pathway to Success

#MakeThemRave is at the core of all our engineering work. We're motivated to provide customers with an exceptional experience when migrating, developing, modernizing, and operationalizing their systems on Google Cloud Platform.

Your success begins with positively influencing the direction of a rapidly growing practice with vision and passion. You will be assessed bi-yearly based on the breadth, magnitude, and quality of your contributions, your aptitude for accurate estimation, customer feedback at project completion, teamwork with your peers, and the advisory polish you bring to customer interactions.

As you continue to execute successfully, we will develop a personalized advancement plan together that guides you through the engineering or management growth tracks.

Expectations

Required Travel - Approximately 10% travel to customer sites, conferences, and other related events.

Customer Facing - You will interact with customers regularly, sometimes daily, other times weekly or bi-weekly. Common touchpoints occur when qualifying potential opportunities, at project kickoff, throughout the engagement as progress is communicated, and at project close. You can expect to interact with a range of customer stakeholders, including engineers, technical project managers, and executives. You will serve as a Tier 3 on-call escalation point for our largest customers' business-critical workloads.

Onboarding/Training - The first few weeks of onboarding are dedicated to learning, including learning materials/assignments and compliance training, as well as meeting with relevant individuals. More details on the timeline will be provided closer to the start date.

Job Requirements

Required Credentials:

  • Google Cloud Professional Data Engineer certification, or the ability to obtain it within the first 90 days of employment.

Required Qualifications:

  • Bachelor's degree in Computer Science, Information Systems, or a related field.
  • At least 5 years of experience in data architecture, data engineering, or a similar role.
  • Demonstrated experience in designing and implementing data solutions using Google Cloud Platform (GCP) services like BigQuery, Cloud Composer, Cloud SQL, Dataproc, and Dataflow.
  • Expertise in designing, building, and maintaining ETL pipelines using various data integration tools and frameworks.
  • Strong understanding of ETL processes, including data cleansing, data conversion, data import/export, data validation, and testing.
  • Expertise in extending and customizing monitoring solutions for resource metrics, input/output metrics, and transformation stage metrics.
  • Proficiency in BigQuery table archival and restore processes to optimize cost and on-demand data warehousing capabilities.
  • Practical experience in building data quality monitors and extending data quality reporting across various stages of data pipelines.
  • Experience in pipeline execution visualization and orchestration using Cloud Composer or alternative lightweight microservices.
  • Strong knowledge of creating reusable data transformation templates and libraries for efficient data pipeline deployment.
  • Familiarity with handling data mutations and creating streaming ingestion templates for specific customer needs.
  • Proficiency in log analytics and observability using Looker and BigQuery, focusing on leveraging cloud logging data.
  • Experience conducting feasibility studies and working with API-based data warehouse exposure, preferably using GraphQL.
  • Solid understanding of MLOps principles, including model deployment, monitoring, versioning, and continuous integration/continuous deployment (CI/CD) of machine learning pipelines.
  • Excellent problem-solving skills, attention to detail, and the ability to work both independently and in a team.
  • Strong communication skills, with the ability to explain technical concepts effectively to both technical and non-technical stakeholders.
  • Strong estimation skills.

Beneficial Qualifications:

Candidates with these qualifications will have a stronger standing, but these are not strictly necessary.

  • Understanding of PCI, SOC2, GDPR, FEDRAMP, and HIPAA compliance standards.

About SADA

Values: SADA stands for inclusion, fairness, and doing the right thing. From our very beginning, we’ve championed a diverse workplace where we support and learn from each other, amplifying the impact we make with our customers. We’re proud that our teams are made up of team members who represent a wide array of backgrounds, experiences, abilities, and perspectives. We are an equal opportunity employer. Our five core values are the foundation of all we do:

  1. Make Them Rave
  2. Be Data Driven
  3. Think One Step Ahead
  4. Drive Purposeful Impact
  5. Do The Right Thing

Work with the Best: SADA has been the largest Google Cloud partner in North America since 2016 and has been named a Google Global Partner of the Year for the sixth year running; in 2023, SADA was also named the Google Cloud Global Partner of the Year. SADA has consecutively been awarded Best Place to Work by the Business Intelligence Group and Inc. Magazine, and was recognized as a Niche Player in the 2023 Gartner® Magic Quadrant™ for Public Cloud IT Transformation Services.

Benefits: Unlimited PTO, paid parental leave, competitive compensation, performance-based bonuses, paid holidays, generous medical, dental, and vision plans, life insurance, short- and long-term disability insurance, 401K/RRSP with company match, Google-certified training programs, and a professional development stipend.

Business Performance: SADA has been named to the INC 5000 Fastest-Growing Private Companies list for over 10 consecutive years, earning Honoree status. CRN has also placed SADA on the Top 500 Global Solutions Providers list for the past 5 years. The overall culture continues to evolve with engineering at its heart: 3200+ projects completed, 4000+ customers served, 10K+ workloads, and 30M+ users migrated to the cloud.