Overview
The ideal candidate will develop high-quality applications and design and implement testable, scalable code.
Responsibilities
- Designing the environment, defining implementation steps, and driving the transition from Hadoop to a new open-source solution
Qualifications
- Bachelor's degree or equivalent experience in Computer Science or a related field
- Open-source development with modern data platforms
- Programming in Python, Scala, and Java
- Data processing with Spark, Trino, and Flink
- Workflow orchestration and streaming with Apache Airflow and Apache Kafka
- Driving modernization initiatives: transitioning from Hadoop to next-gen open-source solutions
Seniority level: Mid-Senior level
Employment type: Contract
Job function: Design and Information Technology
Industries: IT Services and IT Consulting, Financial Services, and Banking