2.6 - Limitations of LLMs
Practice Questions
Test your understanding with targeted questions
What is hallucination in LLMs?
💡 Hint: Think of it as the model creating stories that sound true.
Name one limitation related to context length in LLMs.
💡 Hint: Consider how much information you share in one question.
Interactive Quizzes
Quick quizzes to reinforce your learning
What does the term 'hallucination' refer to in LLMs?
💡 Hint: Think about how a story can sound true but be made up.
True or False: LLMs have built-in persistent memory across sessions.
💡 Hint: Consider whether a model can recall what you discussed in previous sessions.
Challenge Problems
Push your limits with advanced challenges
Create a prompt that could lead to hallucination, then identify why that might happen.
💡 Hint: Consider the relationship between factual accuracy and the model's training data.
Design an experiment to test sensitivity to prompt wording. Write two prompts about the same topic and compare the responses.
💡 Hint: Focus on how altering just a few words can change the output's focus.
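The prompt-sensitivity experiment above can be sketched in a few lines. This is a minimal illustration, not a definitive protocol: the two responses below are placeholder strings standing in for real model output (no specific LLM API is assumed), and word-level Jaccard similarity is just one simple way to quantify how much two responses diverge.

```python
def jaccard_similarity(a: str, b: str) -> float:
    """Word-level Jaccard similarity between two texts (1.0 = identical word sets)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 1.0

# Two prompts about the same topic, worded differently.
prompt_a = "Summarize the causes of the 2008 financial crisis."
prompt_b = "Why did the 2008 financial crisis happen?"

# Placeholder responses; in a real experiment these would come from
# sending each prompt to the model under test.
response_a = "Subprime mortgage lending and excessive leverage triggered the crisis."
response_b = "The crisis happened because banks took on risky subprime mortgages."

overlap = jaccard_similarity(response_a, response_b)
print(f"Word overlap between responses: {overlap:.2f}")
```

A low overlap score for semantically equivalent prompts is one concrete signal of sensitivity to wording; running each prompt several times would also surface output variability.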