FAQ

Questions and Answers

How do I write and run a MapReduce job in CloudxLab?

Generally, users write the MapReduce code on their local machine using their preferred IDE, such as Eclipse or IntelliJ, unit test it, build the JAR, upload it to CloudxLab, and execute it there. A minimal sketch of such a job is shown below.
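For reference, here is a minimal word-count job written against Hadoop's org.apache.hadoop.mapreduce API. The class name, package layout, and paths are illustrative placeholders and not necessarily the sample code used in the screencast; adapt them to your own project before building the JAR.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Mapper: emits (word, 1) for every token in the input line.
        public static class TokenizerMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private final static IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        // Reducer: sums the counts emitted for each word.
        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        // Driver: configures the job; input and output paths come from the command line.
        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }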

Please go through this screencast on how to run a MapReduce job. You can find the sample code used in the screencast here.

After building the JAR on your local machine, you can upload it to CloudxLab using WinSCP or SCP and execute it there using the hadoop jar command, as in the example below.
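For example, assuming the JAR is named wordcount.jar, the main class is WordCount, and your lab username is abc (the hostname, username, and HDFS paths below are placeholders for whatever your CloudxLab account uses), the upload and run steps look roughly like this:

    # Upload the JAR from your local machine to your CloudxLab home directory
    scp target/wordcount.jar abc@e.cloudxlab.com:~/

    # On the CloudxLab console: run the job with an HDFS input path and a new output path
    hadoop jar wordcount.jar WordCount /user/abc/input /user/abc/output

Note that the output directory must not already exist; the job creates it and writes the results there.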

Please go through this screencast if you are planning to run MapReduce in Python.
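Python MapReduce jobs are typically run through Hadoop Streaming, where the mapper and reducer are ordinary scripts that read from standard input and write to standard output. A rough invocation is shown below; the streaming JAR path varies by Hadoop distribution, and mapper.py, reducer.py, and the HDFS paths are placeholders for your own scripts and data.

    # Run a Python mapper and reducer via Hadoop Streaming (JAR path and names are placeholders)
    hadoop jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar \
        -input /user/abc/input \
        -output /user/abc/output \
        -mapper mapper.py \
        -reducer reducer.py \
        -file mapper.py \
        -file reducer.py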