
Job Details

Data Engineer at PLAYSIMPLE

Company

PLAYSIMPLE

Job Location

Bangalore

Requirement

Immediate

Work Experience

2 Years - 4 Years

Required Qualifications

Undergraduate /

Job Description

As a member of Data Engineering at PlaySimple Games, you will be at the epicenter of an amazing company experiencing extreme growth and solving the challenges that come with scaling rapidly. Accessibility of insight requires accessibility of data, quality of insight requires quality of data, and everyone deserves access to quality data. As PlaySimple Games faces exponential growth, our data has followed the same growth curve.

As a member of the Data Engineering team, you will build and scale a data platform that delivers reliable, trustworthy, and approachable data, which can be used to derive meaningful insights about our players. Your software will deliver, model, and curate the data that powers core data platforms at PlaySimple Games.

What’s required of you

  • Build a world-class data platform which can handle terabytes of data daily.
  • Deliver real-time machine learning to millions of users at sub-second latency.
  • Power dozens of data-hungry applications and internal intelligence tools.
  • 2-4 years of experience with Apache Spark and ecosystem tools such as Kafka and Airflow.
  • Strong programming experience in Scala or PySpark.
  • In-depth understanding of data lakes and big data warehouses such as Redshift or BigQuery, and query optimization on both.
  • Hands-on experience with query optimization in Apache Spark.
  • Hands-on experience with Extract, Transform, Load (ETL) techniques on large datasets.
  • Understanding of NoSQL databases, with prior experience working on one or more NoSQL servers.
