Flume

Flume - Introduction

Flume is a simple, robust, and extensible tool for ingesting data from a wide range of sources into Hadoop. It collects, aggregates, and transports large volumes of streaming data, such as events and logs, from those sources to a centralized data store such as HDFS.
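
Under the hood, a Flume agent wires together a source (where events enter), a channel (a buffer that holds events in transit), and a sink (where events are written out). An agent is described in a properties file. The sketch below is a minimal, hypothetical example (the agent name, log path, and NameNode address are placeholders) of an agent that tails a web server log and writes the events to HDFS:

    # Name this agent's components ("agent1" is an arbitrary agent name)
    agent1.sources  = weblog
    agent1.channels = memch
    agent1.sinks    = tohdfs

    # Source: tail the web server access log (path is hypothetical)
    agent1.sources.weblog.type = exec
    agent1.sources.weblog.command = tail -F /var/log/httpd/access_log
    agent1.sources.weblog.channels = memch

    # Channel: buffer events in memory between source and sink
    agent1.channels.memch.type = memory
    agent1.channels.memch.capacity = 10000

    # Sink: write events into HDFS (NameNode address is hypothetical)
    agent1.sinks.tohdfs.type = hdfs
    agent1.sinks.tohdfs.hdfs.path = hdfs://namenode:8020/flume/weblogs
    agent1.sinks.tohdfs.hdfs.fileType = DataStream
    agent1.sinks.tohdfs.channel = memch

The memory channel keeps events in RAM for speed; a file channel can be substituted when events must survive an agent restart.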

Flume - Use Case

Let's say an e-commerce company wants to analyze customer behavior. To do so, it needs to move customer logs and event data from various sources, such as web servers and databases, into HDFS or HBase for analysis. This is where Flume helps: it can collect the data from each source and deliver it to HDFS.
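
To run such a pipeline, the agent is launched with the flume-ng script that ships with Flume; the configuration file name and agent name below refer to the hypothetical example above:

    $ flume-ng agent --conf ./conf --conf-file weblog-agent.conf --name agent1 \
        -Dflume.root.logger=INFO,console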

