Cloud Data Architect – Denver, Colorado or Austin, Texas

Are you passionate about transforming financial organizations by uncovering hidden relationships and patterns in billions of data connections? Do you advocate for Graph Database and Analytics tools to solve critical business problems? Do you have experience as an Application Architect or Database Architect? Western Union is hiring an experienced Cloud Data Architect. Join us for what’s next!

Applicants must be currently authorized to work in the United States on a full-time basis. Western Union will not sponsor applicants for work visas for this position.

Western Union powers your pursuit.

In this pivotal role, you will lead the strategic use of graph databases (Neo4j) across the organization to solve complex business problems.

Responsibilities

  • Analyze business problems, understand use cases, and define a common vision for application and architectural goals; promote this vision across the organization.
  • Enhance the current Neo4j data model to better handle existing complex, linked, real-time data.
  • Map out architecture roadmaps for building a production-ready, graph-based enterprise data application for fraud analysts (the sketch after this list illustrates the kind of graph query such an application runs).
  • Provide guidance and leadership on graph database technologies, acting as a senior resource for the team.
  • Lead technical design review sessions, collaborating with engineering teams in geographically distributed development centers.
  • Prototype, research, and evaluate novel approaches to overcome impediments and achieve application goals.
  • Review and assess architectural and technological solutions for ongoing projects, ensuring optimal solution selection for new initiatives and providing strategy with a focus on reusable components.
  • Design, develop, and manage the entire technology stack, including cloud hosting, graph database, data modeling/engineering, and APIs.
  • Integrate Neo4j with Snowflake and AWS to ensure seamless data flow and compatibility.
  • Optimize Neo4j databases for performance, scalability, and maintainability.
  • Automate processes and identify opportunities for cost savings.
  • Ensure the security, privacy, and integrity of data in Neo4j database systems.
  • Work closely with engineers, data teams, business partners, and other stakeholders across the entire organization.
  • Set up and manage monitoring tools to ensure the health and performance of the Neo4j database.
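
To ground the graph-modeling responsibilities above, here is a minimal, hypothetical sketch of the kind of linked-data query a fraud-analytics application on Neo4j might run, using the official Neo4j Python driver. The Account and Device labels, the USED_DEVICE relationship, and the connection details are illustrative assumptions, not an actual Western Union schema.

# Minimal sketch using the official neo4j Python driver (pip install neo4j).
# The schema below (Account, Device, USED_DEVICE) is a hypothetical example
# of a fraud-analytics graph model, not an actual production schema.
from neo4j import GraphDatabase

URI = "neo4j://localhost:7687"   # placeholder instance
AUTH = ("neo4j", "password")     # placeholder credentials

# Cypher: find pairs of accounts that transacted through the same device,
# a classic linked-data pattern a fraud analyst might flag for review.
SHARED_DEVICE_QUERY = """
MATCH (a1:Account)-[:USED_DEVICE]->(d:Device)<-[:USED_DEVICE]-(a2:Account)
WHERE a1.id < a2.id
RETURN a1.id AS account_a, a2.id AS account_b, d.id AS shared_device
LIMIT 25
"""

def shared_device_pairs(driver):
    # One session per unit of work is the idiomatic driver pattern.
    with driver.session() as session:
        result = session.run(SHARED_DEVICE_QUERY)
        return [record.data() for record in result]

if __name__ == "__main__":
    with GraphDatabase.driver(URI, auth=AUTH) as driver:
        for row in shared_device_pairs(driver):
            print(row)

Queries like this are the payoff of a graph model: relationship patterns that would require multi-way joins in a relational store become a single declarative Cypher match.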

Role Requirements

  • BS in Computer Science or a related field with 10+ years of professional experience, or MS with 8+ years of experience.
  • Strong background in designing multiple complex technical projects and architectures for graph database applications (Neo4j preferred).
  • Experience leading or guiding teams in designing overall architecture using the Architecture Centric Design Methodology (ACDM), with experimentation and quality-attribute-driven design focused on cost-efficiency, usability, modifiability, scalability, and performance.
  • Experience building highly scalable services running in Kubernetes on cloud environments (AWS preferred; Azure and GCP also relevant).
  • Solid experience with analytical data platforms such as Snowflake or Databricks.
  • Ability to work in a fast-paced, iterative development environment, adapt to changing business priorities, and thrive under pressure.
  • Experience with Angular, PySpark, Hadoop, Spark, Hive, Kafka, HBase, Elasticsearch, OpenSearch, and Apache, and with programming languages such as Java and Python.
  • Experience using DevOps tools and agile development methodologies with Test Driven Development (TDD) and Continuous Integration/Continuous Delivery (CI/CD).

Nice to Have

  • Neo4j Certified Professional and/or Neo4j 4.x Certified; SnowPro Advanced Architect and SnowPro Advanced Administrator certifications; and/or Databricks certifications.
  • Experience with Databricks and Snowflake for complex pipelines and for AI/ML and GenAI use cases.
  • Strong knowledge of AWS services such as EMR, IAM, KMS, Lambda, CloudWatch, and Data Pipeline.
  • Experience writing Spark streaming jobs (producers/consumers) using AWS MSK (Kafka) or AWS Kinesis (a minimal consumer sketch follows this list).
  • Exposure to other big data frameworks such as MapReduce, HDFS, Hive/HBase, and Cassandra.
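
For the streaming item above, here is a minimal sketch of a PySpark Structured Streaming consumer, assuming a Kafka topic hosted on something like AWS MSK. The broker address, topic name, and checkpoint path are placeholder assumptions, and the job needs the spark-sql-kafka connector on its classpath.

# Minimal PySpark Structured Streaming consumer for a Kafka topic (e.g. AWS MSK).
# Broker list, topic, and paths are placeholders. Launch with the Kafka
# connector, e.g.:
#   spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.1 job.py
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("txn-stream-sketch").getOrCreate()

# Subscribe to a hypothetical transactions topic.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "b-1.example.kafka.us-east-1.amazonaws.com:9092")
    .option("subscribe", "transactions")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to strings for downstream parsing.
events = raw.select(
    col("key").cast("string").alias("key"),
    col("value").cast("string").alias("payload"),
    col("timestamp"),
)

# Console sink for demonstration; a real job would write to S3, Snowflake,
# or back to Kafka, with a durable checkpoint location so offsets and state
# survive restarts.
query = (
    events.writeStream.format("console")
    .option("checkpointLocation", "/tmp/checkpoints/txn-stream")
    .outputMode("append")
    .start()
)
query.awaitTermination()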

We make financial services accessible to humans everywhere. Join us for what’s next.

Western Union is positioned to become the world’s most accessible financial services company, transforming lives and communities.