How much memory is consumed for loading an RDD?
Spark will consume whatever memory the RDD partition needs. If that memory is not available, the job will fail.
Spark will dedicate 60% of the memory to RDDs. If an RDD does not fit, older RDD partitions will be dropped and recomputed (or reloaded) when needed.
Spark will only consume memory for the single record currently being processed, keeping the rest of the records on disk.
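For context on the 60% figure in the options above: it matches the default storage memory fraction in Spark's configuration. The sketch below shows the relevant `spark-defaults.conf` entries; the values shown are the documented defaults, but which key applies depends on your Spark version, so treat this as an illustration rather than a recommended setting.

```
# Pre-1.6 ("legacy") memory management: fraction of executor heap
# reserved for cached RDD partitions. Partitions that do not fit are
# evicted (LRU) and recomputed from lineage when accessed again.
spark.storage.memoryFraction   0.6

# Spark 1.6+ unified memory management: execution and storage share
# this fraction of (heap - reserved memory); storage can still be
# evicted under pressure and recomputed when needed.
spark.memory.fraction          0.6
```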