In this chapter, we learn the basics of Big Data, including key concepts, use cases, and an overview of the ecosystem.
This chapter doesn't require any knowledge of programming or technology. We believe it is useful for everyone to learn the basics of Big Data. So, jump in!
Happy Learning!
Big Data is a term of great fascination in the present-day era of computing. It is in high demand in today’s IT industry and is believed to revolutionize how technical solutions are built.
Upon learning the Big Data concepts, we will get a vivid picture of why clusters of machines (distributed systems) are needed, and appreciate how this architecture solves the critical problems of storing and processing humongous amounts of data. In addition, we will get an idea of system design concepts, which help us design scalable and resilient systems - the most desirable kind of …
Welcome to a course in Scala Foundations.
As part of this course, you will learn how to write programs using Scala.
Scala is a programming language like Java or Python. Its syntax is concise, much like Python's, while under the hood it compiles to Java bytecode and runs on the JVM. It also comes with both an interactive interpreter (the REPL) and a compiler.
Further, Scala is designed for scalable computing: code can be shipped to where the data lives and run in parallel.
Scala is used in the enterprise world and has gained a lot of traction.
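To get a feel for the language, here is a minimal sketch you could paste into the Scala REPL or run as a small program. It shows the Python-like conciseness and, assuming Scala 2.12 (where parallel collections ship with the standard library), a computation spread across CPU cores with .par. The names used here are purely illustrative.

    // A small taste of Scala: concise syntax plus easy parallelism.
    // Assumes Scala 2.12, where .par parallel collections are built in.
    object Taste extends App {
      // Immutable values with type inference
      val numbers = (1 to 10).toList

      // Transform the collection with a higher-order function
      val squares = numbers.map(n => n * n)
      println(s"Squares: $squares")

      // The same computation, spread across CPU cores
      val parallelSum = numbers.par.map(n => n * n).sum
      println(s"Sum of squares (computed in parallel): $parallelSum")
    }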
This chapter covers various NumPy constructs and functions, along with an overview of Pandas, Matplotlib, and the Linear Algebra that is commonly used in Machine Learning projects.
Welcome to this project on Churning the Emails Inbox with Python. In this project, you will use Python to access data from files and process it to achieve certain tasks. You will explore the MBox email dataset and use Python to count lines, headers, and subject lines by email address and domain. Along the way, you will learn your way around working with data in Python.
Spark Streaming is an extension of the core Spark API that enables scalable, high-throughput, fault-tolerant processing of live data streams. Learn Spark Streaming from industry experts.
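As a first taste, here is a minimal Scala sketch of the classic socket word-count example. The local master, host, and port are placeholders, and you would typically feed the socket with a tool such as netcat.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object StreamingWordCount {
      def main(args: Array[String]): Unit = {
        // Two local threads: one receives the stream, one processes it
        val conf = new SparkConf().setMaster("local[2]").setAppName("StreamingWordCount")
        val ssc = new StreamingContext(conf, Seconds(5))

        // Listen for text lines on a TCP socket (placeholder host/port)
        val lines = ssc.socketTextStream("localhost", 9999)

        // Count words within each 5-second batch
        val counts = lines.flatMap(_.split(" "))
                          .map(word => (word, 1))
                          .reduceByKey(_ + _)
        counts.print()

        ssc.start()
        ssc.awaitTermination()
      }
    }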
MapReduce is both a framework and a paradigm of computing. By way of map and reduce operations, we are able to break a complex computation down into smaller pieces that run across a distributed cluster.
As part of this chapter, we are going to learn how to build MapReduce programs using Java.
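Before diving in, it may help to see the paradigm in miniature. The sketch below mimics the map and reduce phases of the classic word-count problem using plain Scala collections; it is only an illustration of the idea, not the Hadoop Java API that the chapter actually uses, and all names here are made up for the example.

    // Word count as a miniature of the MapReduce paradigm, using plain Scala
    // collections rather than the Hadoop Java API covered in this chapter.
    object MiniMapReduce extends App {
      val documents = List("big data is big", "map reduce is a paradigm")

      // Map phase: emit a (word, 1) pair for every word in every record
      val mapped = documents.flatMap(_.split(" ")).map(word => (word, 1))

      // Shuffle + reduce phase: group the pairs by key, then sum the counts per key
      val reduced = mapped.groupBy(_._1).map { case (word, pairs) =>
        (word, pairs.map(_._2).sum)
      }

      reduced.foreach(println)  // e.g. (big,2), (data,1), ...
    }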
Please make sure you work along with the course instead of just sitting back and watching.
Happy Learning!
Learn from industry experts how to load and save data using Spark, apply compression, and handle various file formats.
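For a flavour of what this looks like in Scala, here is a minimal sketch using Spark's DataFrame reader and writer. The file paths are placeholders, and gzip is just one of the compression codecs Spark supports.

    import org.apache.spark.sql.SparkSession

    object LoadSaveSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("LoadSave").master("local[*]").getOrCreate()

        // Load: read a CSV file with a header row, letting Spark infer the schema
        val df = spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("/path/to/input.csv")          // placeholder path

        // Save: write the same data as gzip-compressed JSON and as Parquet
        df.write.option("compression", "gzip").json("/path/to/output_json")
        df.write.mode("overwrite").parquet("/path/to/output_parquet")

        spark.stop()
      }
    }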
Whenever you request a page from a web server, the server records that request in a file called a log.
Web server logs are gold mines for gaining insight into user behaviour. Data scientists usually look at the logs first to understand how users behave. But since the logs are humongous in size, it takes a distributed framework like Hadoop or Spark to process them.
As part of this project, you will learn to parse the text data stored in the logs of a web server using Apache Spark.
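To give a rough idea of what the parsing involves, here is a minimal Scala sketch that reads log lines with Spark and extracts a few fields with a regular expression. The file path is a placeholder and the pattern assumes the Apache Common Log Format; a real project would adapt it to the actual log layout.

    import org.apache.spark.sql.SparkSession

    object LogParserSketch {
      // Simplified pattern for the Apache Common Log Format (an assumption;
      // adjust it to match the actual format of your logs)
      val LogPattern = """^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) \S+" (\d{3}) (\S+)""".r

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("LogParser").master("local[*]").getOrCreate()
        val lines = spark.sparkContext.textFile("/path/to/access.log")  // placeholder path

        // Keep only the lines that match the pattern and pull out (ip, url, status)
        val requests = lines.flatMap { line =>
          LogPattern.findFirstMatchIn(line).map { m =>
            (m.group(1), m.group(4), m.group(5))
          }
        }

        // Example analysis: the ten most requested URLs
        requests.map { case (_, url, _) => (url, 1) }
                .reduceByKey(_ + _)
                .sortBy(_._2, ascending = false)
                .take(10)
                .foreach(println)

        spark.stop()
      }
    }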