RDD caching is used to avoid re-computation of an RDD:
Between different Spark applications
Within the same Spark application, when an RDD is used multiple times
1 Chapter Overview - Advanced Spark Programming
2 Adv Spark Programming - Understanding Persistence
3 Spark Programming - RDD caching is used for avoiding re-computation of an RDD:...
4 Spark Programming - With persistence we can replicate the RDD so that spark...
5 Spark Programming - RDD caching is basically persisting an RDD into the RAM?...
6 Adv Spark Programming - Persistence StorageLevel
7 Spark Programming - The persist() method accepts an argument which is an object...
8 Spark Programming - StorageLevel does not specify the following configuration?...
9 Spark Programming - Which of the following is not true for MEMORY_AND_DISK_2 storage...
10 Spark Programming - Say, you are creating an RDD rdd1 after a lot...
11 Adv Spark Programming - Data Partitioning
12 Adv Spark Programming - Partitioning HandsOn
13 Adv Spark Programming - Data Partitioning Example
14 Spark Programming - Apache Spark's Data partitioning in RDD is useful if...
15 Spark Programming - Operations that cannot benefit from Partitioning?...
16 Spark Programming - Which of the following is not true about Data partitioning?...
17 Spark Programming - We had a key-value RDD having 1 to 10 as...
18 Spark Programming - We had a key-value RDD having 1 to 10 as...
19 Spark Programming - Which of the following is not an example of default...
20 Adv Spark Programming - Custom Partitioner
21 Spark Programming - You can create your own partitioner?...
22 Adv Spark Programming - Shared Variables
23 Spark Programming - When we pass a function say f to map or...
24 Spark Programming - If we have some data, which of the following is...
25 Spark Programming - If we have some data of few kilobyte size, which...
26 Adv Spark Programming - Accumulators
27 Spark Programming - Accumulators are used for...
28 Spark Programming - Which of the use-case is not right for accumulator:...
29 Spark Programming - Accumulators will give incorrect results in the cases where:...
30 Adv Spark Programming - Custom Accumulators
31 Adv Spark Programming - Broadcast Variables
32 Spark Programming - Broadcast variables are used because...
33 Spark Programming - To share few bytes data with workers we use:...
34 Adv Spark Programming - Broadcast Variables Example
35 Adv Spark Programming - Key Performance Considerations - Parallelism
36 Spark Programming - If you have an RDD with 1000 records. The function...
37 Adv Spark Programming - Key Performance Considerations - Partitions
38 Spark Programming - If we have a text file of 8GB in HDFS...
39 Spark Programming - If we have 1GB data as an array and we...
40 Spark Programming - Which of the following is not a right way of controlling...
41 Adv Spark Programming - Serialization Format
42 Spark Programming - Serialization happens when...
43 Spark Programming - Performance of serializer is important because in Spark the serialization...
44 Spark Programming - Which serializer is slowest?...
45 Adv Spark Programming - Memory Management
46 Spark Programming - How much memory is consumed for loading an RDD?...
47 Spark Programming - Spark reserves 20% of memory for shuffle and aggregation buffers....
48 Spark Programming - Spark reserves last 20% of memory to code. What does...
49 Adv Spark Programming - Hardware Provisioning
50 Spark Programming - Which of the following hardware parameters do not matter for...
51 Spark Programming - Spark is able to achieve linear scaling. What does it...
52 Spark Programming - Why is memory beyond 64GB discouraged?...
53 Adv Spark Programming - Slides