Flume - Introduction
Flume is a simple, robust, and extensible tool for ingesting data from various sources into Hadoop. It collects, aggregates, and transports large volumes of streaming data, such as events and logs, from many sources to a centralized store such as HDFS.
Flume - Use Case
Let's assume an e-commerce company wants to analyze customer behavior. To do so, it needs to move customer log and event data from sources such as web servers and databases into HDFS or HBase for analysis. Flume is a good fit for moving this customer data into HDFS.
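As a rough sketch, a Flume agent for this use case could be defined in a properties file wiring a source, a channel, and a sink together. The agent name, log path, and HDFS URL below are illustrative assumptions, not part of the scenario above:

```properties
# Hypothetical agent "agent1" tailing a web server log into HDFS
agent1.sources = weblog-source
agent1.channels = mem-channel
agent1.sinks = hdfs-sink

# Exec source tails the access log (path is an assumption)
agent1.sources.weblog-source.type = exec
agent1.sources.weblog-source.command = tail -F /var/log/httpd/access_log
agent1.sources.weblog-source.channels = mem-channel

# Memory channel buffers events between source and sink
agent1.channels.mem-channel.type = memory
agent1.channels.mem-channel.capacity = 10000

# HDFS sink writes events to a date-partitioned directory
agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.hdfs.path = hdfs://namenode:8020/flume/weblogs/%Y-%m-%d
agent1.sinks.hdfs-sink.hdfs.fileType = DataStream
agent1.sinks.hdfs-sink.hdfs.useLocalTimeStamp = true
agent1.sinks.hdfs-sink.channel = mem-channel
```

The agent would then be started with the `flume-ng agent` command, pointing it at this configuration file and the agent name `agent1`.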