Personal Practice activities drive content selection at the end of a unit of learning to prepare learners for a summative assessment or a capstone activity. These activities adapt by learning objective: a learner's performance on the formative practice activities completed earlier in the unit determines the questions and content presented in the adaptive activity.
How does this work? What is being measured?
- A learner’s performance on the formative practice activities within a unit of learning produces a unique Learning Estimate for each learner on each learning objective.
- The Learning Estimate is produced by a model that predicts how well a learner will perform on a given learning objective in a summative assessment.
- The model gauges the difficulty level of the content being assessed, and then assigns the learner a Low, Medium, or High estimate based on their performance to date.
Based on these Learning Estimates, learners are presented with a set of Personal Practice questions, plus additional support or remediation content, adapted to their level on each learning objective. Personal Practice activities are formative and therefore not graded by default; customers can change this.
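The exact model behind the Learning Estimate is internal to the platform, but the general flow can be illustrated with a minimal sketch. The function, thresholds, and difficulty weighting below are assumptions for illustration only, not the actual algorithm.

```python
# Illustrative sketch only: the real Learning Estimate model is internal to the
# platform. The thresholds, difficulty weighting, and names here are assumptions.

def learning_estimate(responses):
    """Assign a Low/Medium/High estimate for one learning objective.

    `responses` is a list of (correct, difficulty) tuples gathered from the
    learner's formative practice in the unit, where `difficulty` is a weight
    between 0 and 1 reflecting how hard the question was.
    """
    if not responses:
        return None  # no data: the learner will see all available questions

    possible = sum(difficulty for _, difficulty in responses)
    if possible == 0:
        return None

    # Weight each answer by question difficulty (an assumed weighting scheme).
    earned = sum(difficulty for correct, difficulty in responses if correct)
    ratio = earned / possible

    if ratio >= 0.8:
        return "High"
    if ratio >= 0.5:
        return "Medium"
    return "Low"
```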
User Experience
- Student’s learning estimate (driven by formative assessment) determines the questions and scaffolding the student receives
- The questions offered are based on student performance throughout the module. Students who struggle with a specific objective are given more questions that scaffold to the core questions, which all students see.
- You can set Personal Practice activities for a completion grade or a score. The score is based only on core questions.
- If students enter the Personal Practice without doing any work in the module, they will not have any learning estimates for that content and will be given all the available questions (see the sketch after this list).
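The selection behavior described above can be sketched roughly as follows. The data shapes and the exact rules applied at each estimate level are assumptions for illustration, not the platform's implementation.

```python
# Illustrative selection logic: core (high-level) questions go to every student;
# learners with weaker estimates also receive the scaffolding questions for that
# objective, and learners with no estimate at all receive every question.
# The data shapes and level-to-question mapping are assumptions for this sketch.

def select_questions(question_bank, estimates):
    """question_bank: {objective: {"low": [...], "medium": [...], "high": [...]}}
    estimates:     {objective: "Low" | "Medium" | "High" | None}
    """
    selected = {}
    for objective, levels in question_bank.items():
        estimate = estimates.get(objective)
        core = levels["high"]  # all students see the core questions
        if estimate == "High":
            selected[objective] = core
        elif estimate == "Medium":
            selected[objective] = levels["medium"] + core
        else:  # "Low" or no estimate: include the full scaffold
            selected[objective] = levels["low"] + levels["medium"] + core
        # If a score is enabled, it would be computed from core questions only.
    return selected
```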
Requirements for Adaptivity
Prerequisite for Personal Practice: formative activities require a minimum of 10 questions/data points per learning objective. See Question Data Points.
Personal Practice activities require a minimum of one question (and a maximum of four questions) for each learning estimate level (low-medium-high) per learning objective.
Questions must include correct/incorrect feedback, and you must supply additional remediation content to scaffold the Personal Practice questions.
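These constraints lend themselves to a mechanical pre-publish check. The sketch below assumes a simple dictionary representation of the module's questions; it is not a platform feature, just an illustration of the rules above.

```python
# Illustrative pre-publish check for the requirements above. The dictionary
# layout and field names ("correct_feedback", "incorrect_feedback") are
# assumptions for this sketch, not the platform's data model.

def check_personal_practice_readiness(formative_counts, practice_bank):
    """formative_counts: {objective: number of formative questions/data points}
    practice_bank:     {objective: {"low"/"medium"/"high": [question dicts]}}
    Returns a list of problems; an empty list means the requirements are met.
    """
    problems = []
    for objective, count in formative_counts.items():
        if count < 10:
            problems.append(f"{objective}: only {count} formative data points (minimum 10)")
    for objective, levels in practice_bank.items():
        for level in ("low", "medium", "high"):
            questions = levels.get(level, [])
            if not 1 <= len(questions) <= 4:
                problems.append(f"{objective}/{level}: {len(questions)} questions (need 1 to 4)")
            for q in questions:
                if not q.get("correct_feedback") or not q.get("incorrect_feedback"):
                    problems.append(f"{objective}/{level}: question missing answer feedback")
    return problems
```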
Authoring Requirements
For Personal Practice to function correctly, the module must contain enough formative data points for the learning estimate to work properly, and the Personal Practice activity itself must meet the question and feedback requirements summarized below.
Basic Requirements
- A minimum of one low, one medium, and one high question per learning objective
- Scaffolding for low and/or medium questions
- ALL questions must include correct and incorrect answer feedback
Scaffolding
Scaffolding is successive levels of support that help students reach higher levels of comprehension. Scaffolded questions and content move students toward a stronger understanding of what they are learning (mastery).
Targeted Feedback
All questions have targeted feedback for both correct and incorrect answer options. The feedback provides additional learning opportunities and guides the student as they continue to work the problem.
Authoring Questions
When authoring assessments, consider the following:
- High-stakes questions are considered "core": all students will get them in a Personal Practice.
- What does a high-stakes question for each Learning Objective look like?
- What information do students need to understand in order to answer the core questions? Develop low and medium questions that build student knowledge and confidence up to the high-stakes (core) level.
The following table provides guidance on authoring questions:
| Good questions | Bad questions |
| --- | --- |
Authoring Easy Questions
The following are suggestions for writing easy questions:
- Definitional/vocabulary questions tend to be easier than questions requiring application or critical judgments.
- Stick with the phrasing used by the text. Don’t require students to paraphrase.
- Keep vocabulary simple.
- Keep the questions short.
- Any question can be made easier by making the wrong choices easier to eliminate. But don't make them so obviously wrong that they become silly.
Authoring Hard Questions
The following are suggestions for writing hard questions:
- Test deeper skills. Go beyond memorization.
- Require the students to paraphrase and recognize the same content in a different context.
- Make the wrong answer choices more subtle (yet still wrong). Bring in common misconceptions to tempt the unskilled.
Authoring Feedback
Be cautious when drafting feedback. Provide enough direction to guide the student without giving away the answer, and remember that students see the feedback immediately.
Feedback can be useful in scaffolding an assessment. Ask yourself: What kind of guidance or hints work best? How will you phrase these questions and hints?
Creating Scenarios
While Personal Practice activities do not have to be scenario-based, Acrobatiq recommends scenarios to make them a richer experience. Ask yourself: How have students used this knowledge up to this point in the course? Is there a scenario in which students will need to draw on this knowledge?