Company Description
IQEQ is a leading service provider to the alternative asset industry. IQEQ works with managers in multiple capacities, including hedge fund, private equity fund, and mutual fund launches; private equity fund administration; advisory firm setup, regulatory registration, and infrastructure design; ongoing regulatory compliance (SEC, CFTC, and 40 Act); financial controls and operational support services; compliance and operations-related projects and reviews; and outsourced CFO/controller and administration services for private equity fund investments, including portfolio companies, real estate assets, and energy assets. Our client base continues to grow, and our existing clients are making increasingly broad use of the firm's services.
Job Description
Please note that this position is based in Belfast.
We have an incredible opportunity for an experienced Data Engineer to join our global team. This role is central to the implementation of our Group Data Strategy and Data Transformation Journey through the delivery, enhancement, and maintenance of our IQEQ Data Platform. The platform will define how our data is managed and used to deliver outcomes across key areas, optimizing business value and growth for internal and external stakeholders and clients alike.
The person in this role will be responsible for expanding and optimizing our data and data pipeline architecture. The ideal candidate is an experienced data pipeline builder and data wrangler who will support our software developers, database architects, data analysts, and data scientists on data initiatives, helping them navigate and use our extensive data assets to build optimal production models. You must be self-directed and comfortable serving the data needs of multiple teams, systems, and products.
Duties (daily responsibilities of the role)
- Develop and maintain optimal data pipelines built on the in-house stack.
- Assemble complex datasets that meet functional and non-functional business requirements.
- Develop and maintain data models over current financial data.
- Assist stakeholders, including the Executive, Product, Data, and Design teams, with data-related technical issues and support their data infrastructure needs.
- Implement data flows that connect operational systems, analytics data, and business intelligence (BI) systems.
- Document source-to-target mappings.
- Reengineer manual data flows for scalability and repeatable use.
- Write ETL (extract, transform, load) scripts and code to ensure the ETL process performs optimally.
- Develop and maintain DAG workflows for BAU processes, data pipelines, and transformations.
Expected behaviors
In addition to demonstrating our Group Values (Authentic, Bold, and Collaborative), the individual must exhibit the following:
- Effective Communication – Adjusting communication style to suit the audience and message. Providing timely information to aid others across the organization. Encouraging the open expression of diverse ideas and opinions.
- Action-Oriented – Takes prompt action on challenges without unnecessary planning, and identifies and takes ownership of new opportunities.
- Interpersonal Savvy – Relates comfortably with people across all levels, functions, cultures, and geographies. Builds rapport in an open, friendly, and accepting manner.
- An analytical mind, excellent problem-solving and diagnostic skills, and attention to detail.
Qualifications
Education/professional qualifications
- Bachelor's degree in computer science or a related field.
- 3+ years of experience in software engineering.
- Experience in the financial industry is preferred.
Background & technical experience
- Proficiency in Linux fundamentals and Bash scripting.
- Strong programming expertise in Python, with exposure to other languages, primarily Go and JavaScript.
- Expertise in Python libraries such as pandas and NumPy.
- Good knowledge of algorithms and data structures.
- Deep understanding of database systems, e.g., PostgreSQL/MySQL and Microsoft SQL Server.
- Experience with at least one cloud platform (e.g., AWS, Azure, or GCP), preferably Azure.
- Familiarity with one or more data lake/data warehouse platforms, e.g., Snowflake, Databricks, or Redshift.
- Know-how in stream processing, e.g., Kafka, Kinesis.
- Basic experience with Node.js and JavaScript.
- Experienced in the implementation of data warehousing solutions.
- Experienced in the implementation of API solutions and tooling.
Other company, product, and market knowledge
- Experience working in a multi-country professional services, financial services, or BPO organization with complex processing requirements.
- Multi-country experience and a demonstrated ability to work in a multicultural, talented, and demanding team environment.
- The skills and character to operate effectively in a fast-paced, complex global business, with deep knowledge of program management.
- Excellent written and verbal communication skills with staff members, customers, suppliers, and the management team, combined with the ability to make decisions, act, and achieve results.
- Passion, energy, and drive.
- Personal presence, integrity, and credibility.
- Ability to solve problems either independently or by using other team members where necessary.
- Strong analytical and troubleshooting skills.
- Ability to investigate and analyze information and draw conclusions, e.g., root cause analysis (RCA).
- Experience with or exposure to ISO 27001 information security compliance.
Languages
- Fluent spoken and written English; additional European languages are a plus.