Flume - Introduction
Flume is a simple, robust, and extensible tool for ingesting data from various sources into Hadoop. It is used for collecting, aggregating, and transporting large amounts of streaming data, such as events and logs, from many sources to a centralized data store such as HDFS.
Flume - Use Case
Let's assume an e-commerce company wants to analyze customer behavior. To do so, it needs to move customer logs and event data from various sources, such as web servers and databases, into HDFS or HBase for analysis. Flume is well suited to moving this customer data into HDFS.
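A use case like this maps onto Flume's agent model: a source reads events, a channel buffers them, and a sink writes them out. Below is a minimal sketch of an agent configuration for this scenario; the agent name (agent1), log path, and HDFS URL are hypothetical placeholders you would adapt to your cluster.

```properties
# Name the components of this agent (hypothetical agent name "agent1")
agent1.sources = weblog-source
agent1.channels = mem-channel
agent1.sinks = hdfs-sink

# Source: tail the web server's access log (path is a placeholder)
agent1.sources.weblog-source.type = exec
agent1.sources.weblog-source.command = tail -F /var/log/httpd/access_log
agent1.sources.weblog-source.channels = mem-channel

# Channel: buffer events in memory between source and sink
agent1.channels.mem-channel.type = memory
agent1.channels.mem-channel.capacity = 10000

# Sink: write events into date-partitioned directories on HDFS
agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.channel = mem-channel
agent1.sinks.hdfs-sink.hdfs.path = hdfs://namenode:8020/user/flume/weblogs/%Y-%m-%d
agent1.sinks.hdfs-sink.hdfs.fileType = DataStream
# Use the agent's local time for the %Y-%m-%d escape, since the exec
# source does not add a timestamp header to events
agent1.sinks.hdfs-sink.hdfs.useLocalTimeStamp = true
```

With this file saved as, say, weblog.conf, the agent would be started with `flume-ng agent --conf-file weblog.conf --name agent1`. The memory channel is fast but loses buffered events if the agent dies; a file channel is the usual choice when durability matters.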