Cloud Computing Applications, Part 2: Big Data and Applications in the Cloud Quiz Answers
Cloud Computing Applications, Part 2: Big Data and Applications in the Cloud Week 1 Quiz Answers
Quiz 1: Orientation Quiz
Q1. This course lasts for ______ weeks.
- 6
- 5
- 8
- 4
Q2. I am required to read a textbook for this course.
- True
- False
Q3. Which of the following activities are required to qualify for the Course Certificate? Check all that apply.
- Reading Assignments
- Forum discussion
- Programming Assignments
- Quizzes
Q4. The following tools will help me use the discussion forums:
- “Up-voting” posts that are thoughtful, interesting, or helpful.
- Reporting inappropriate posts
- Following a thread
- All of the other options are correct.
Q5. If I have a problem in the course, I should:
- Report it to the Learner Help Center (if the problem is technical) or to the Content Issues forum (if the problem is an error in the course materials).
- Email the instructor
- Call the instructor
- Drop the class
Quiz 2: Module 1 Quiz
Q1. Which technology is the best suited for the following use case?
“Finding the set of words utilized in the Wikipedia website”
- YARN
- Spark
- HADOOP
- HDFS
Q2. Which technology is the best suited for the following use case?
“Training a machine learning model on a large dataset with several iterations”
- HDFS
- HADOOP
- YARN
- Spark
Q3. Which technology is the best suited for the following use case?
“Storing a large set of images on thousands of computers”
- Spark
- HDFS
- HADOOP
- YARN
Q4. Which technology is the best suited for the following use case?
“Assigning resources to a highly parallel application”
- HDFS
- YARN
- HADOOP
- Spark
Q5. Which technology is the best suited for the following use case?
“Exploring a new large dataset”
- Spark
- HADOOP
- YARN
- HDFS
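The first five use cases contrast the core Hadoop-ecosystem roles: HDFS for distributed storage, YARN for resource assignment, Hadoop MapReduce for one-pass batch jobs, and Spark for iterative or exploratory work that benefits from in-memory caching. Purely as a rough illustration of the “exploring a new large dataset” case, here is a minimal PySpark sketch; the application name and HDFS path are placeholders, not from the course.

```python
# Minimal PySpark sketch for interactive exploration of a large text dataset.
# The input path and application name are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("explore-dataset").getOrCreate()

# Load the raw text (e.g., a dump stored on HDFS) as an RDD of lines.
lines = spark.sparkContext.textFile("hdfs:///data/sample_corpus/*.txt")

# Cache the data so repeated, exploratory queries reuse the in-memory copy
# instead of re-reading from storage each time.
lines.cache()

# A few quick exploratory queries over the same cached RDD.
print("line count:", lines.count())
words = lines.flatMap(lambda line: line.split())
print("distinct words:", words.distinct().count())

spark.stop()
```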
Q6. Which feature of the Spark Scheduler avoids extra shuffles?
- Dryad-like DAG
- Pipelining functions within a stage
- Partitioning-aware
- Cache-aware work reuse and locality
Q7. Which technology is the best suited for the following use case?
“Interactive data analytics”
- Hive
- Apache Zeppelin
- Storm
- HADOOP MapReduce
Q8. Which technology is the best suited for the following use case?
“SQL-like datastore”
- Hive
- Apache Zeppelin
- Storm
- HADOOP MapReduce
Q9. Which technology is the best suited for the following use case?
“Real-time data processing”
- Hive
- Storm
- HADOOP MapReduce
- Apache Zeppelin
Q10. Which technology is the best suited for the following use case?
“Batch data processing”
- Apache Zeppelin
- Storm
- HADOOP MapReduce
- Hive
Cloud Computing Applications, Part 2: Big Data and Applications in the Cloud Week 2 Quiz Answers
Quiz 1: Module 2 Quiz
Q1. Which guarantee can be relaxed for the following use case?
“Data can be served on a single server.”
- Availability
- Consistency
- Partition tolerance
Q2. Which guarantee can be relaxed for the following use case?
“Data is too big to be served on a single server, and is not modified after creation.”
- Consistency
- Partition tolerance
- Availability
Q3. Which guarantee can be relaxed for the following use case?
“Data is too big to be served on a single server and should be queried within a certain time.”
- Consistency
- Partition tolerance
- Availability
Q4. Which mechanism is the best fit for the following use case?
“Distributed coordination service with a reliable leader node”
- HBase
- HDFS
- ZooKeeper
- Paxos
Q5. Which mechanism is the best fit for the following use case?
“Distributed coordination service without a reliable leader node”
- ZooKeeper
- HDFS
- HBase
- Paxos
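Q4 and Q5 contrast two coordination approaches: ZooKeeper provides a coordination service built around a replicated ensemble with an elected leader, while Paxos-style consensus does not rely on any single reliable leader node. Purely as an illustration of how application code might use such a coordination service, here is a minimal sketch assuming the third-party kazoo Python client for ZooKeeper; the host string, election path, and worker identifier are made-up placeholders.

```python
# Sketch of leader election via ZooKeeper using the kazoo client library.
# Host, path, and identifier values below are illustrative placeholders.
from kazoo.client import KazooClient

zk = KazooClient(hosts="127.0.0.1:2181")
zk.start()

def lead():
    # Runs only while this process holds leadership.
    print("I am the leader; doing coordinated work...")

# The election recipe blocks in run() until leadership is acquired, then
# invokes the callback; ZooKeeper handles failover if the leader dies.
election = zk.Election("/app/election", identifier="worker-1")
election.run(lead)

zk.stop()
```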
Q6. You want to build a word count program. Which of the following pseudo-code snippets is the proper Map function for this program? Note that the indenting is not accurate.
“Word Count Program: You have a huge text file that consists of many lines. The goal is to count the number of times each distinct word appears in the file.”
- Map(key = line, value = contents):
  - result = 0;
  - for each word in value:
  - result += value;
  - emit(key, result)
- Map(key, values):
  - for each value in intermediate values:
  - value += 1;
  - emit intermediate(key, values)
- Map(key = line, values = uniq_counts):
  - Sum all 1’s in values list
  - Emit result (word, sum)
- Map(key = line, value = contents):
  - for each word in value:
  - emit intermediate (word, 1)
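For reference, the Map step of word count can be written as real code. Below is a minimal plain-Python sketch (framework-agnostic, with illustrative names): the mapper receives a line and emits a (word, 1) pair for every word it contains.

```python
# Plain-Python sketch of a word-count mapper: emit (word, 1) per word.
# Function and parameter names are illustrative, not from the course.
def word_count_map(key, value):
    """key: line identifier (e.g., byte offset); value: the line's text."""
    for word in value.split():
        yield (word, 1)
```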
Q7. You want to build a word count program. Which of the following pseudo-code snippets is the proper Reduce function for this program? Note that the indenting is not accurate.
“Word Count Program: You have a huge text file that consists of many lines. The goal is to count the number of times each distinct word appears in the file.”
- Reduce(key = line, values = uniq_counts):
  - Sum all 1’s in values list
  - Emit result (word, sum)
- Reduce(key = line, value = contents):
  - for each word in value:
  - emit intermediate (word, 1)
- Reduce(key, values):
  - for each value in intermediate values:
  - value += 1;
  - emit intermediate(key, values)
- Reduce(key = line, value = contents):
  - result = 0;
  - for each word in value:
  - result += value;
  - emit(key, result)
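The matching Reduce step sums the 1’s collected for each word. The sketch below pairs a plain-Python reducer with a toy in-memory group-by step so the whole word count can be run end to end; the sample input and all names are illustrative.

```python
# Plain-Python sketch of the word-count reducer, plus a toy in-memory
# "shuffle" so the map and reduce steps can be exercised together.
from collections import defaultdict

def word_count_reduce(key, values):
    """key: a word; values: the list of partial counts emitted for it."""
    yield (key, sum(values))

def run_word_count(lines):
    # Map + toy shuffle: emit one (word, 1) pair per word, grouped by word.
    grouped = defaultdict(list)
    for line in lines:
        for word in line.split():        # the Map step from the sketch above
            grouped[word].append(1)
    # Reduce: sum the grouped counts for each word.
    return {word: total
            for key, counts in grouped.items()
            for word, total in word_count_reduce(key, counts)}

print(run_word_count(["to be or not to be"]))  # {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```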
Q8. You want to build an image smoother program. Which of the following is the proper Map function for this program? Note that the indenting is not accurate.
“Image Smoother Program: To smooth an image, use a sliding mask and replace the value of each pixel.”
- Map(key = x,y, value = list of R,G,B)
  - compute average of R,G,B
  - emit intermediate(key, average R,G,B)
- Map(key = x,y, value = R,G,B)
  - emit intermediate(key, value)
Q9. You want to build an image smoother program. Which of the following is the proper Reduce function for this program? Note that the indenting is not accurate.
“Image Smoother Program: To smooth an image, use a sliding mask and replace the value of each pixel.”
- Reduce(key = x,y, value = list of R,G,B)
  - compute average of R,G,B
  - emit (key, average R,G,B)
- Reduce(key = x,y, value = R,G,B)
  - emit (key, value)
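The smoother can be sketched the same way: one reading of the question is that the Map step forwards each pixel’s value to every position covered by the sliding mask, and the Reduce step averages whatever a position received. The plain-Python sketch below makes those assumptions explicit, uses a 3x3 mask, and uses single grayscale intensities instead of R,G,B triples to stay short; it is an illustration, not the course’s reference solution.

```python
# Plain-Python sketch of a MapReduce-style image smoother with a 3x3 mask.
# Grayscale pixel values are used instead of R,G,B tuples to keep it short.
from collections import defaultdict

def smooth_map(key, value):
    """key: (x, y) pixel coordinate; value: that pixel's intensity."""
    x, y = key
    # Emit this pixel's value to every position covered by the 3x3 mask,
    # so each output pixel later receives all of its neighbours' values.
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            yield ((x + dx, y + dy), value)

def smooth_reduce(key, values):
    """key: (x, y); values: intensities contributed by the neighbourhood."""
    yield (key, sum(values) / len(values))

# Toy driver over a 2x2 image; edge pixels simply average fewer contributions.
image = {(0, 0): 10, (0, 1): 20, (1, 0): 30, (1, 1): 40}
grouped = defaultdict(list)
for coord, intensity in image.items():
    for out_coord, v in smooth_map(coord, intensity):
        if out_coord in image:               # ignore emissions outside the image
            grouped[out_coord].append(v)
smoothed = {k: v for key, vals in grouped.items() for k, v in smooth_reduce(key, vals)}
print(smoothed)                              # every pixel becomes 25.0 here
```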
Q10. Which data model is the best for the following use case?
“A system that needs to reflect a change immediately”
- ACID
- BASE
Cloud Computing Applications, Part 2: Big Data and Applications in the Cloud Week 3 Quiz Answers
Quiz 1: Module 3 Quiz
Q1. What is load shedding?
- The process of eliminating events to keep up with the rate of events
- Distributing applications across many servers
- Enabling a system to continue operating properly in the case of the failure of some of its components
- Distributing the data across different parallel computing nodes
Q2. Which of the following is correct?
- A topology is a network of tuples and streams.
- A bolt processes input streams and produces new streams.
- A spout can receive output from many streams.
- A stream connects a bolt to a spout.
Q3. In a Storm program that produces a sorted list of the top K most frequent words encountered across all the documents streamed into it, four kinds of processing elements (bolts in Storm) might be created: QuoteSplitterBolt, WordCountBolt, MergeBolt, SortBolt.
What is the order in which words flow through the program?
- QuoteSplitterBolt, SortBolt, WordCountBolt, MergeBolt
- QuoteSplitterBolt, WordCountBolt, MergeBolt, SortBolt
- WordCountBolt, QuoteSplitterBolt, SortBolt, MergeBolt
- WordCountBolt, QuoteSplitterBolt, MergeBolt, SortBolt
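Because the question is only about the order in which word tuples travel between the four bolts, the flow can be mimicked in plain Python without the Storm API: split quotes into words, count them per task, merge the partial counts, then sort to get the top K. Everything below (names, sample quotes, the single “task”) is illustrative.

```python
# Plain-Python mimic of the bolt chain (not the Storm API): quotes are split
# into words, counted per "task", partial counts merged, then sorted for top-K.
from collections import Counter

def quote_splitter_bolt(quotes):
    for quote in quotes:
        for word in quote.split():
            yield word

def word_count_bolt(words):
    return Counter(words)                     # per-task partial counts

def merge_bolt(partial_counts):
    merged = Counter()
    for counts in partial_counts:
        merged.update(counts)                 # combine counts from all tasks
    return merged

def sort_bolt(merged, k):
    return merged.most_common(k)              # sorted top-K list

quotes = ["to be or not to be", "to err is human"]
words = quote_splitter_bolt(quotes)
partial = [word_count_bolt(words)]            # a single "task" in this toy run
print(sort_bolt(merge_bolt(partial), k=3))    # e.g. [('to', 3), ('be', 2), ('or', 1)]
```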
Q4. What does Trident do?
- Provides a persistent state for the bolts, with a predefined set of characteristics
- Provides a persistent state for the topology, with a predefined set of characteristics
- Provides a persistent state for the bolts, but the exact implementation is up to the user
- Provides a persistent state for the spout, but the exact implementation is up to the user
Q5. What are Streams in Apache Storm?
- Unbounded sequences of tuples
- Processors of input
- A network of spouts and bolts
- Aggregators
Q6. What are Spouts in Apache Storm?
- Unbounded sequences of tuples
- Source of Streams
- Processors of input
- Network of spouts and bolts
Q7. What are Bolts in Apache Storm?
- Unbounded sequences of tuples
- Source of Streams
- Networks of spouts and bolts
- Processors of input
Q8. What are Topologies in Apache Storm?
- Unbounded sequences of tuples
- Networks of spouts and bolts
- Source of Streams
- Processors of input
Q9. In “At Least Once” message processing, what happens if there is a failure?
- Storm’s natural load-balancing takes over.
- Storm’s natural fault-tolerance takes over.
- Events are double processed.
- You must create and implement your own load-balancing algorithm.
Q10. How does Thrift contribute to Storm?
- Enables the usage of streams
- Allows Storm to be used from any language
- Provides scalability
- Provides load-balancing functionality
Get All Course Quiz Answers for the Cloud Computing Specialization
Cloud Computing Concepts, Part 1 Coursera Quiz Answers
Cloud Computing Concepts, Part 2 Coursera Quiz Answers
Cloud Computing Applications, Part 1: Cloud Systems and Infrastructure Quiz Answers
Cloud Computing Applications, Part 2: Big Data and Applications in the Cloud Quiz Answers
Cloud Networking Coursera Quiz Answers