Quiz Part I
Multiple Choice Questions
The following quiz will test your understanding of Advanced NLP Applications and the two hands-on projects: Machine Translation with MarianMT and Text Summarization with T5. The questions cover key concepts, tools, and practical implementations you have explored in this section.
1. What is the key advantage of transformer-based models like MarianMT for machine translation?
a) They use rule-based translations for better accuracy.
b) They efficiently capture long-range dependencies in text.
c) They require no tokenization of the input text.
d) They rely on statistical models for word alignment.
2. Which of the following prefixes is used when performing text summarization with T5?
a) translate:
b) summarize:
c) paraphrase:
d) compress:
3. What is the primary difference between extractive and abstractive summarization?
a) Extractive summarization generates entirely new text, while abstractive summarization selects sentences directly from the source text.
b) Extractive summarization selects key sentences from the source text, while abstractive summarization generates new sentences.
c) Extractive summarization only works with long text, while abstractive summarization works with short text.
d) Abstractive summarization is faster but less accurate than extractive summarization.
4. What does the num_beams parameter control when generating text summaries using T5?
a) The maximum number of words in the summary.
b) The trade-off between fluency and relevance.
c) The number of beams explored for beam search during generation.
d) The penalty applied to longer outputs.
5. Which Hugging Face model would you load to perform machine translation from English to German?
a) Helsinki-NLP/opus-mt-en-fr
b) Helsinki-NLP/opus-mt-de-en
c) Helsinki-NLP/opus-mt-en-de
d) bert-base-multilingual-cased