Job type - Contract (6 to 12 months, with potential to extend)
Client - Global client with nearly USD 10B in revenue
Basic requirements -
- 4+ years of experience in Big Data technologies
- Must have: Big Data stack (Hadoop, Spark, NoSQL) with strong fundamental knowledge of internals
- Must have: Strong programming skills in Java and a good understanding of Python and SQL
- Performance tuning for Hadoop and Spark
- Experience building the big data stack on-prem and in the cloud, including HA/DR/data sync strategies using Kafka/NiFi
- Experience building data governance and security using core Hortonworks tools such as Knox, Ranger, and Kerberos
- Familiarity with the Hortonworks HDP stack; hands-on HDP experience is not required as long as the engineer has equivalent experience with the open-source stack
- Understanding of containerization frameworks: Docker, Kubernetes, Mesos
- Basic understanding of Data Science concepts
- Work with the infrastructure engineer on infrastructure automation for the big data stack
- Mentor junior members of the team and answer their technical questions