Company Description
Experian harnesses the power of data to create opportunities for consumers, businesses, and society. We collect, analyze, and process data in ways that others can't. We help individuals gain financial control and access financial services, businesses make smarter decisions and thrive, lenders lend more responsibly, and organizations prevent identity fraud and crime. For over 125 years, we've helped consumers and clients prosper, and economies and communities thrive - and we're not finished. Our 17,800 staff across 45 countries believe the possibilities for you, and our world, are increasing. We're investing in new technologies, talented individuals, and innovation to contribute to a better tomorrow.
Job Description
Experian India is seeking a Data Analyst to join the Credit Bureau development team. The ideal candidate should possess a strong analytical mind, be a technical and innovative thinker, have keen problem-solving ability, and have an intolerance for inefficiencies in software development and delivery. The candidate should also have excellent written, oral, and interpersonal communication skills, along with a strong desire to continuously learn about different products and technologies.
You will work on a Big Data application as part of a team collaborating towards common goals. The team is responsible for the design, development, and support of the application.
What you’ll be doing
- Apply your software engineering skills, including Java, Spark, Python, and Scala, to analyze disparate, complex systems and collaboratively design new products and services
- Integrate new data sources and tools
- Implement scalable and reliable distributed data replication strategies
- Mentor and provide architectural and design direction to onsite/offshore developers
- Collaborate with other teams to design, develop, and deploy data tools that support both operational and product use cases
- Analyze large data sets using components from the Hadoop ecosystem
- Manage product features from development and testing through to production deployment
- Evaluate big data technologies and prototype solutions to enhance our data processing architecture
- Automate everything possible
Qualifications
What you’ll need to bring to the party
- BS degree in computer science, computer engineering, or equivalent
- 5–6 years of experience in delivering enterprise software solutions
- Proficiency in Spark, Scala, Python, and AWS Cloud technologies
- 3+ years of experience in multiple Hadoop/Spark technologies such as Hadoop, MapReduce, HDFS, HBase, Hive, Flume, Sqoop, Kafka, and Scala
- A knack for data, schemas, and data modeling, and for bringing efficiency to the big data life cycle
- An ability to quickly understand technical and business requirements and translate them into technical implementations
- Experience with Agile Development methodologies
- Experience with data ingestion and transformation
- A solid understanding of secure application development methodologies
- Experience in developing microservices using the Spring framework is a plus
- Experience with Airflow and Python is preferred
- Understanding of automated QA needs related to Big Data
- Strong object-oriented design and analysis skills
- Excellent written and verbal communication skills
Additional Information
Why us
- We’re a high-performance team, but we always make sure to celebrate success
- We offer solid career and international opportunities for high performers
- We invest heavily in our products and our people
- We offer training and support from experienced subject matter experts and managers along with dedicated learning & development time