About FWD Group
FWD Group is a pan-Asian life insurance business with more than 11 million customers across 10 markets, including some of the world's fastest-growing insurance markets. FWD celebrated its 10-year anniversary in 2023. The company aims to simplify, accelerate and enhance the insurance experience, offering creative propositions and straightforward products, underpinned by digital technology. With this customer-centric approach, FWD is committed to changing how people perceive insurance. Visit https://www.fwd.com.hk/
PURPOSE
Contribute to the evolution of the Project Delivery department, concentrating on Azure MLOps projects development and support.
KEY ACCOUNTABILITIES
- Provide support for the FWD Group data platform on MLOps projects; major tasks include data ingestion, data transformation, data processing, AI model deployment, and AI model maintenance.
- Responsible for the coding and implementation of internal MLOps data delivery projects, with engagement across the different phases of project delivery.
- Review vendor-delivered project code to ensure high-quality MLOps data project delivery.
- Accountable for receiving projects handed over from vendors, and for supporting the improvement and maintenance of the production system.
- Participate in resolving technical challenges faced in the R&D of big data development platforms.
- Provide regular reports to the FWD Group data management team.
- Communicate with all vendor delivery teams according to assigned project responsibilities.
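The pipeline stages named in the accountabilities above (data ingestion, transformation, and processing) can be sketched as a minimal Python example. All function and field names here are illustrative placeholders, not FWD internals:

```python
import csv
import io

# Hypothetical sketch of the three data stages named above:
# ingestion -> transformation -> processing.

def ingest(raw_csv: str) -> list[dict]:
    """Ingestion: parse raw CSV rows into records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Transformation: normalise types and drop incomplete rows."""
    out = []
    for r in records:
        if r.get("premium"):  # skip rows with a missing premium
            out.append({"policy_id": r["policy_id"],
                        "premium": float(r["premium"])})
    return out

def process(records: list[dict]) -> dict:
    """Processing: aggregate a simple metric for downstream models."""
    total = sum(r["premium"] for r in records)
    return {"rows": len(records), "total_premium": total}

raw = "policy_id,premium\nP001,120.5\nP002,\nP003,80.0\n"
summary = process(transform(ingest(raw)))
print(summary)  # {'rows': 2, 'total_premium': 200.5}
```

In a real Azure deployment these stages would typically map to Azure Data Factory pipelines and Databricks jobs rather than plain functions; the sketch only shows the separation of concerns.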
QUALIFICATIONS / EXPERIENCE
- Degree in computer science or a related field, with strong programming skills (Java / Python / Scala). Experience with cloud platform development is preferred; Azure experience is a must.
- Over 5 years of data delivery project experience, including 3–5 years of development in the Spark / Spark Streaming / Flink / Kafka ecosystem. Proficient in the fundamentals of data processing and related data development technologies; real-time data processing experience and familiarity with Fluentd are preferred.
- Strong software engineering competence, with solid knowledge of ML/deep learning models and workflows.
- Good understanding of DevOps/MLOps principles.
- Experience in ML CI/CD/CT operationalization, deployment automation, and container/orchestration platforms under Microsoft Azure technologies such as Azure Pipelines, Azure Kubernetes Service (AKS), Azure Container Service (ACS), Azure Container Registry, Azure ML CLI, and Azure Repos (Git).
- Hands-on experience with the Azure data management stack, including Azure Data Factory, Azure Databricks, Azure Blob Storage (ADLS Gen2), Azure Synapse, and Azure Machine Learning Studio.
- Experience with Power BI is preferred.
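The CI/CD/CT operationalization experience asked for above usually centres on a "quality gate": a candidate model is promoted to production only if it measurably beats the current model. A minimal sketch of that pattern, with illustrative thresholds and metric names (not an Azure API):

```python
# Hypothetical continuous-training quality gate, as would run in a
# release pipeline stage (e.g. Azure Pipelines) before deployment.

def evaluate(predictions: list[int], labels: list[int]) -> float:
    """Compute accuracy of a candidate model's predictions."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

def promote_if_better(candidate_acc: float,
                      production_acc: float,
                      min_gain: float = 0.01) -> bool:
    """Gate: promote only if the candidate beats production
    by at least `min_gain` (illustrative threshold)."""
    return candidate_acc >= production_acc + min_gain

acc = evaluate([1, 0, 1, 1], [1, 0, 0, 1])  # 3 of 4 correct
decision = promote_if_better(acc, production_acc=0.70)
print(acc, decision)  # 0.75 True
```

In practice the metrics would come from an Azure ML evaluation job and the promotion step would push the model to a registry or AKS endpoint; the gate logic itself stays this simple.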
KNOWLEDGE & SKILLS
- Familiar with the construction and design of enterprise data warehouses, with experience in managing and designing large-scale data warehouses and in data modeling methodology. Familiar with the big data ecosystem and its technologies.
- Comprehensive understanding of the Azure data management stack, including Azure Data Factory, Azure Databricks, Azure Blob Storage (ADLS Gen2), Azure Synapse, and Azure Machine Learning Studio.
- Strong sense of responsibility, attention to detail, and practical work ethic, with strong hands-on capability and the ability to learn new technologies.
- Experience in AI model development and deployment.
- Good communication skills, with proficiency in English listening, speaking, reading, and writing.
- Working experience in the insurance industry is preferred.