DFI Company Brief
DFI Retail Group (the 'Group') is a leading pan-Asian retailer. As of 31st December 2021, the Group, together with its affiliates and joint ventures, operated over 10,200 outlets and employed around 230,000 people. In 2021, the Group's total annual sales exceeded US$27 billion.
The Group strives to serve Asian consumers with quality and value by offering leading brands, a compelling retail experience, and excellent service, all delivered through a strong store network supported by efficient supply chains.
The Group (including affiliates and joint ventures) runs numerous well-known brands across food, health and beauty, home furnishings, restaurants, and other retail sectors.
The Group’s parent company is Dairy Farm International Holdings Limited, incorporated in Bermuda, primarily listed on the London Stock Exchange, with secondary listings in Bermuda and Singapore. The Group's businesses are managed from Hong Kong by Dairy Farm Management Services Limited through its regional offices.
DFI Retail Group is a member of the Jardine Matheson Group.
Responsibilities:
- Manage the end-to-end execution of data analytics projects, delivering complete, validated data on schedule.
- Build ETL data pipelines for data warehouse transformation.
- Collaborate with business units, IT teams, and internal team members to ensure data is captured, understood, and translated for data analytics.
- Provide maintenance support and troubleshoot report visualization issues.
- Uphold data quality procedures to ensure clean data within the SLA.
Requirements:
- A Bachelor’s degree in Technology, Engineering, or a related field.
- 2+ years of software engineering experience is preferred; fresh graduates with strong technical skills are also welcome.
- Strong SQL and Python development skills.
- Experience in ETL, Data Lake, data warehouse pipelines, and data analytics projects.
- Experience with Google Cloud Platform (GCP) would be a strong advantage; experience developing on other cloud platforms (e.g., Azure, AWS) is also welcome.
- Experience with big data frameworks (e.g., Airflow, Hadoop, Kafka, Spark) would be a significant benefit.
- Knowledge of CI/CD would be a great asset.
- Ability to work independently in a fast-paced environment with minimal supervision.
- Good communication skills, including strong written and spoken English.