
Job Details

Hadoop Developer at Imagine Tech & Services

Imagine Tech & Services


Job Location

Work Experience

Fresher - 3 Years

Required Qualifications

Bachelor of Engineering / Bachelor of Technology /

Job Description

Imagine Technology & Services Pvt. Ltd. is the technology subsidiary of Landmark Health, headquartered in California. Landmark Health provides Complexivist Care to its members for chronic diseases not covered by other insurance, handling both insurance and medical care at minimal cost with the utmost personalized care. The offshore product development center is in Bangalore, India, and provides technology support in the form of application development, data migration, data warehousing, etc. Imagine Technology & Services builds interfaces that handle a large database of members, healthcare physicians, and professionals.


Roles and Responsibilities

  • Document, design, develop, and architect Hadoop applications
  • Install, configure, and support Hadoop
  • Write MapReduce code for Hadoop clusters; help build new Hadoop clusters
  • Convert complex techniques and functional requirements into detailed designs
  • Design web applications for querying and tracking data at high speed
  • Propose best practices and standards, and hand them over to operations
  • Test software prototypes and transfer them to the operational team
  • Pre-process data using Hive
  • Maintain data security and privacy
  • Manage and deploy HBase
  • Analyze large data stores and derive insights
  • Work closely with internal customers to analyze problems and write ad-hoc SQL scripts
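The MapReduce work mentioned above follows a map → shuffle → reduce pattern. Below is a toy, dependency-free Python sketch of that pattern using the canonical word-count example; it is an illustration only, not code from this role — a real Hadoop job would implement the `Mapper` and `Reducer` interfaces from `org.apache.hadoop.mapreduce` (typically in Java).

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    """Map: emit a (word, 1) pair for every word in the input line."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle_phase(pairs):
    """Shuffle: group all values by key, as Hadoop does between map and reduce."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    """Reduce: sum the counts emitted for one word."""
    return key, sum(values)

def word_count(lines):
    pairs = chain.from_iterable(map_phase(line) for line in lines)
    grouped = shuffle_phase(pairs)
    return dict(reduce_phase(k, v) for k, v in grouped.items())

counts = word_count(["hadoop stores data", "hive queries data"])
# "data" appears in both input lines, so counts["data"] is 2
```

In Hadoop the shuffle step is performed by the framework itself across the cluster; only the map and reduce functions are written by the developer.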


Desired Skills

  • Expert skills in writing SQL queries
  • Experience with HDFS as used in Hadoop applications, and with Hadoop Common (the Java libraries and utilities that support the other Hadoop modules)
  • Good knowledge of the concepts of multi-threading and concurrency
  • Analytical and problem-solving skills, and the ability to apply them in the Big Data domain
  • Understanding of data-loading tools such as Flume, Sqoop, etc.
  • Good knowledge of database principles, practices, structures, and theories
  • Familiarity with schedulers
  • Ability to write reliable, manageable, and high-performance code
  • Expert knowledge of Hadoop, Hive, and HBase
  • Working experience with HQL
  • Experience writing MapReduce jobs
  • Hands-on experience in backend programming, particularly Java, JavaScript, OOAD, and Node.js
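The multi-threading and concurrency bullet above is about exactly one hazard: unsynchronized read-modify-write on shared state. A minimal illustrative sketch (in Python's standard `threading` module, not tied to any framework named in this posting) of guarding a shared counter with a lock:

```python
import threading

class SafeCounter:
    """A counter that is safe to increment from many threads at once."""

    def __init__(self):
        self._value = 0
        self._lock = threading.Lock()

    def increment(self):
        # Without the lock, the read-modify-write below could interleave
        # across threads and lose updates.
        with self._lock:
            self._value += 1

    @property
    def value(self):
        return self._value

counter = SafeCounter()
threads = [
    threading.Thread(target=lambda: [counter.increment() for _ in range(1000)])
    for _ in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
# counter.value is exactly 4000: the lock serializes every increment
```

The same principle carries over to Java (`synchronized` blocks or `java.util.concurrent.atomic.AtomicInteger`), which this role names as its primary backend language.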
