Job Description
Get to Know Our GX Bank Team:
GX Bank Berhad - the Grab-led Digital Bank - is the FIRST digital bank in Malaysia, approved by BNM to commence operations. We aim to leverage technology and innovation to serve the financial needs of unserved and underserved individuals, and micro, small, and medium enterprises.
We are driven by our shared purpose and passion to bring positive transformation to the banking industry, starting with solutions that address the financial struggles of Malaysians and businesses.
Get to Know the Role
The Data Engineer plays a key role within the data warehouse team in developing the data warehouse system, working closely with stakeholders from multiple lines of business (LOBs).
The Day-to-Day Activities
- Participate in all aspects of developing a data warehouse system. Design, develop, test, deploy and support a scalable and flexible data warehouse system that can integrate with multiple LOBs.
- Liaise with Product, Business, Design and Engineering stakeholders to define and review technical specifications.
- Design and implement scripts, ETL jobs, data models, etc.
- Provide horizontal, organisation-wide visibility through metric measurements and insights regionally across different functions and teams.
- Develop and uphold best practices with respect to change management, documentation and data protocols.
- Identify system and application metrics, develop dashboards and set up alerts for metrics and thresholds.
- Participate in technical and product discussions, code reviews, and on-call support activities.
The Must-Haves
- Bachelor's degree in Analytics, Data Science, Mathematics, Computer Science, Information Systems, Computer Engineering, or a related technical field.
- At least 3-4 years of experience developing data warehouse and business intelligence solutions.
- Sound knowledge of data warehousing concepts, data modelling/architecture and SQL.
- Ability to work in a fast-paced agile development environment.
- Knowledge of programming languages such as Java, Scala, Python, etc.
- Understanding of performance, scalability and reliability concepts.
- Experience with relational as well as NoSQL data stores.
- Experience with Big Data frameworks such as Hadoop, Spark, etc.
- Experience with stream processing technologies such as Kafka, Spark Streaming, etc.
- Experience with performance tuning & query optimization of data warehouse systems.
- Experience with cloud technologies such as Azure, AWS, etc. would be a plus.
- Ability to drive initiatives and work independently.
- Team player who can liaise with various stakeholders across the organisation.
- Excellent written and verbal communication skills.