Flume - Introduction
Apache Flume is a simple, robust, and extensible tool for ingesting data from various sources into Hadoop. It is used for collecting, aggregating, and transporting large amounts of streaming data, such as events and logs, from many sources to a centralized data store such as HDFS.
Flume - Use Case
Let's assume an e-commerce company wants to analyze customer behavior. To do so, it needs to move customer logs and event data from various sources, such as web servers and databases, into HDFS or HBase for analysis. Flume can be used to move this customer data into HDFS, as sketched in the configuration below.
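As a concrete illustration, the following is a minimal sketch of a Flume agent configuration that tails a web server access log and writes the events into HDFS. The agent name (a1), the log file path, and the HDFS directory are assumptions chosen for this example; in a real deployment they would match the company's own servers and cluster layout.

```
# Name the components of this agent (a1 is an example agent name)
a1.sources  = r1
a1.channels = c1
a1.sinks    = k1

# Source: tail the web server access log (path is an assumed example)
a1.sources.r1.type    = exec
a1.sources.r1.command = tail -F /var/log/httpd/access_log

# Channel: buffer events in memory between source and sink
a1.channels.c1.type     = memory
a1.channels.c1.capacity = 10000

# Sink: write events to HDFS as plain text, partitioned by date
a1.sinks.k1.type                   = hdfs
a1.sinks.k1.hdfs.path              = hdfs://namenode:8020/user/flume/weblogs/%Y-%m-%d
a1.sinks.k1.hdfs.fileType          = DataStream
a1.sinks.k1.hdfs.writeFormat       = Text
a1.sinks.k1.hdfs.useLocalTimeStamp = true

# Wire the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel    = c1
```

The agent could then be started with Flume's standard launcher, for example: bin/flume-ng agent --conf conf --conf-file weblog.conf --name a1. The memory channel keeps the sketch simple; a file channel would be the more durable choice if events must survive an agent restart.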