In this role, you will use your data engineering and programming skills to build systems that gather, manage, and convert raw data into usable information for data scientists and data analysts. This includes transforming high-volume data into formats that can be processed and analyzed. You will implement methods to improve data reliability and quality, and build analytical tools and programs that support data extraction and transformation for predictive or prescriptive modeling.
Key skills and qualifications:
- Expertise in building and optimizing data sets, 'big data' pipelines and architectures
- Ability to perform root cause analysis on internal and external processes and data to identify improvements and answer questions
- Knowledge of programming languages, particularly Python
- Practical experience in SQL database design
- Technical proficiency with data models, data mining, and segmentation techniques
- Familiarity with SQL and AWS technologies
Requirements:
Mandatory qualifications include:
- Bachelor’s degree in a relevant field (Computer Science, Computer Engineering, Electrical Engineering, Business Information Systems) or equivalent experience.
- Over 5 years of experience designing and developing databases and other applications on classified IC networks.
- Demonstrated extensive knowledge of developing custom scripts, procedures, queries, and user interfaces for SQL databases on classified IC networks.
- Experience in designing logical data architectures and conducting data extraction, parsing, and manipulation.
- Proven experience with large-scale data flow and conditioning technologies (e.g., Hadoop, Elastic MapReduce, Snowflake, Apache NiFi) for analytics purposes.
- Experience developing and communicating technical specifications of databases to both technical and non-technical audiences.
- Extensive knowledge of IC cloud systems and services on classified networks, preferably those services available on the Commercial Cloud Enterprise (C2E) program.
- Experience developing and maintaining interfaces between data resources (databases, data lakes, etc.) and analytics software applications (e.g., Tableau, Power BI, Kibana, Grafana).
- Experience with Tableau software suite, including Server, Prep, and Conductor.
- Ability to communicate clearly, concisely, and accurately.
Desired qualifications:
- Experience developing and communicating metrics in the IC, particularly regarding the alignment of IC collection and analytic products with intelligence needs and questions, the IC's responsiveness to changes in intelligence priorities, and the IC's reliance on the variety of intelligence collection disciplines.
- Over 4 years of experience with data analysis and methodological research supporting analytic issues.
- Demonstrated experience in designing, testing, and managing software tools to support assessment and metrics production on large, aggregated datasets.
- Knowledge of tools in at least one of the following: statistics, mathematics, econometrics, operations research, survey design and analysis, or IT.
- Experience in designing, testing, and managing IT tools to support assessments.
- Experience in data extraction, parsing, manipulation, and storage.
- Experience in developing and deploying APIs for secure application access to data stores.
- Experience working successfully in an analytic team environment.
Employment conditions:
- Top Secret/SCI clearance with polygraph required
- U.S. Citizenship required
- Federal Employment Suitability
- E-Verify Eligibility required
ASG is an equal opportunity employer (EEO)
ASG participates in the USCIS Electronic Employment Eligibility Verification Program (E-Verify). E-Verify helps employers determine employment eligibility of new hires and the validity of their Social Security numbers.
Benefits:
- Health insurance
- Paid time off
- Dental insurance
- 401(k)
- Vision insurance
- Tuition reimbursement
- Life insurance
- 401(k) matching
- Disability insurance
- Retirement plan
- Referral program
- Health savings account
- Flexible spending account