
Welcome to the last criterion in the MYP Design Cycle, Criterion D – Evaluating. If you’ve been following my posts about the MYP Design criteria, you know I have been focusing on my teaching and learning experiences with Year 1 (grade 6) students in the IB’s MYP Design Program.
What is Criterion D in MYP Design?
After completing MYP Design Criterion C – Creating the Solution, students evaluate the success of the solution in Criterion D. For Year 1 students, by the end of the school year, they should be able to outline the following according to the MYP Design Guide:
- simple, relevant testing methods, which generate data, to measure the success of the solution
- the success of the solution against the design specification
- how the solution could be improved
- the impact of the solution on the client/target audience
The specific word the MYP Design Guide uses is “outline”. Students need to be able to outline, or summarize, the above four bullet points to demonstrate the strongest understanding in MYP Design Criterion D – Evaluating.
What is Criterion D in the Design Cycle Really?
For my Year 1 students, I used the MYP Design Cycle to organize my approach to completing Criterion D. I would go through each strand as deliberately as possible, given that this class was my students’ first in MYP Design.
Criterion D – Evaluating starts about four to four and a half weeks after beginning the unit with Criterion A – Inquiring and Analyzing. So, Criterion D for my classes lasted about one and a half to two weeks (i.e., four to six one-hour classes).
The reflective strands of Criterion D (D.3 and D.4) were not at the top of my students’ list of favorites. Overall, we did a lot of writing in each criterion throughout the unit. Had I offered options other than writing (e.g., Padlet, Seesaw), maybe the reflective writing in these final strands would have been met with greater enthusiasm.
When we conducted an evaluation of a physical product (e.g., performance testing), Criterion D was very popular! It can be exciting and fun to see how designs hold up to testing, how they fail, and to enjoy a little friendly competition.
MYP Design Statement of Inquiry and the GRASPS Scenario
Before getting too deep into this criterion, it’s important to spend some time reviewing and discussing the “why” of the lesson. The last time the reason for solving the design problem was discussed was probably at the start of Criterion A.
Go over the GRASPS again (example). This brief task scenario outlines the goal of the unit, the student’s role, and who will benefit from the solution. A review of the statement of inquiry (example) will reconnect students to the big-picture purpose of their learning as well.
MYP Design Criterion D – Getting Started
For each criterion in an MYP unit, I had students complete one Google Doc over multiple class periods. MYP Design can be taught in other ways, but this is how I framed it. Here’s an example of the Criterion D document for my first unit, the paper water tank.
I found it helpful for students to warm up to Criterion D with a “zero strand”. That is, before students started Strand D.1, they completed D.0, a quick-win warm-up of fill-in-the-blank questions about information from previous criteria.
For example, in our third unit, we upcycled plastic into a gift for a specific audience. For this unit, Strand D.0 – Design Reference and Identification was created to “provide elements of the design to fully communicate the solution.” Students restate the gift recipient and the gift name, and reproduce the final sketch of the gift from Criterion B – Developing Ideas.
As I mentioned before, the overall amount of writing can be a turn-off in Criterion D. With this being the last criterion, fatigue can begin to set in as well. Even though Strand D.0 is a little extra work, these first few easy steps can help make Criterion D less disagreeable and more inviting.
Strand D.1 – Design Testing Methods
Specifically, at the end of Year 1, the MYP Design Guide states that for the highest achievement level (a 7-8 score), the student “outlines simple, relevant testing methods, which generate data, to measure the success of the solution.” At the 5-6 achievement level, the student “defines relevant testing methods, which generate data, to measure the success of the solution.” How does “outlining” show more of what a student knows and understands than “defining” does? This difference has never been clear to me.
Set up the testing methods to acquire accurate data. For engineering-based units, setting up testing to get good data is straightforward. For designs that are meant to influence an audience’s behavior (e.g., cyber safety public service announcement), devising a testing method to yield good data may not be as straightforward.
How do you Design Testing Methods?
The hard-to-find MYP Design Teacher Support Material (IB login required) resource, developed to accompany the 2014 MYP Design Guide, classifies testing methods into five areas:
- Expert Appraisal
- Field Trial
- Performance Testing
- User Observation
- User Trials
This resource also states that at the end of Year 1, students, with guidance, should be able to “design simple tests to evaluate the solution against the requirements of the design specification.”
I did try to address Strand D.1 – Design Testing Methods to honor its role as part of the MYP Design Cycle. However, I never felt like I had this strand sufficiently figured out, and I was too inconsistent across my MYP Design units. My alignment between the appropriate task and this strand could have used some improvement!
For example, for my first unit in the school year, the paper water tank, students noted their tank’s mass, created a hypothesis, tested their tanks, shared a video of their test, recorded test results, and curated observers’ notes. We conducted the test during this strand (okay) but had written the test plan with the tank building plan in Strand C.1 – Construct a Logical Plan (pdf; maybe less than okay).
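To make the idea of testing that “generates data” concrete, here is a minimal sketch, in Python, of one way such performance-test records could be captured. Everything in it (team names, masses, results, field names) is a hypothetical illustration, not data from my classes.

```python
# A minimal sketch of one way to record performance-testing data for a
# unit like the paper water tank. All names and values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class TankTest:
    team: str
    tank_mass_g: float    # mass noted before testing
    hypothesis: str       # prediction written before the test
    water_held_ml: float  # measured result
    observer_notes: list = field(default_factory=list)  # curated classmate notes

tests = [
    TankTest("Team A", 42.0, "Our tank will hold 500 ml before leaking.",
             455.0, ["Leaked at a taped seam", "Base stayed dry"]),
    TankTest("Team B", 55.5, "Our tank will hold 400 ml.",
             610.0, ["No leaks", "Walls bulged but held"]),
]

for t in tests:
    print(f"{t.team} ({t.tank_mass_g} g): hypothesis {t.hypothesis!r} "
          f"-> measured {t.water_held_ml} ml")
```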
For the second unit of the school year, students created an animated cyber safety PSA for the morning announcements. In this strand, students showed their final animated PSA to two classmates and asked each to identify the PSA’s strongest characteristic among these three: relevant, memorable, or persuasive. Students then needed to explain why they agreed or disagreed (or both) with their peers’ observations.
An engineering unit like the paper water tank would use performance testing to gather quantitative data. The cyber safety PSA might employ user trial focus groups (i.e., peer review) and/or a field trial to gather qualitative data.
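For the qualitative side, a user-trial peer review like the PSA’s boils down to tallying categorical responses. A minimal sketch, again with hypothetical responses:

```python
# A minimal sketch of tallying the qualitative user-trial (peer review)
# data described above. The responses are hypothetical examples.
from collections import Counter

# Each classmate named the PSA's strongest characteristic.
responses = ["memorable", "persuasive", "memorable", "relevant"]

for trait, count in Counter(responses).most_common():
    print(f"{trait}: {count}")
# Students can then agree or disagree with the most common peer pick.
```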
Simple, Relevant Testing Methods?
So, for these MYP units, I didn’t exactly have students outline “simple, relevant testing methods, which generate data, to measure the success of the solution.” This strand on its own is difficult to address. Regardless, the testing methods should reflect the design specifications developed in Criterion B. One could argue that some sort of testing method would be needed for each design specification if time allowed. The MYP Design Teacher Support Material (IB login required) does essentially state this as “the student has tested against every aspect of the design specification.”
One way to improve D.1 would be to start with a review of the five types of testing used in MYP Design. A discussion about which of the five testing methods best fits the current unit would deepen understanding of this strand. If possible, have students generate hypotheses before testing for a bit more analysis and to cultivate buy-in. Having a conversation around hypotheses can lead to connections to the appropriate testing methods to get at the best data to evaluate the solution. For Year 1 students especially, address the purpose of the strand as accurately as possible.
Strand D.2 – Evaluate the Success of the Solution
The MYP Design Guide calls for Year 1 students to “outline the success of the solution against the design specification.” I interpreted this to mean that students evaluate their finished product against each of the design specifications established in Criterion B. How deeply you examine the final design against each specification can depend on how much energy you want students to dedicate to the remaining strands. Regardless, the goal/problem introduced in the GRASPS should be among the design specifications studied to determine the success of the solution.
Canned descriptors can provide sufficient depth of evaluation. For the paper water tank unit in Criterion D (pdf), the evaluation of the design against each specification was in terms of these five descriptors (a simple way to record them is sketched after the list):
- Exact – Your team’s tank exactly met the design specification.
- Close – Your team’s tank mostly met the design specification.
- Middle – Your team’s tank met some of the design specification.
- Far – Your team’s tank met none or very little of the design specification.
- NA – You are not sure how your team’s tank met the design specification (try to avoid using this).
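For anyone who likes to see the bookkeeping spelled out, here is a minimal sketch of recording those descriptors against a set of specifications. The specifications and ratings shown are hypothetical examples, not the actual water tank specs.

```python
# A minimal sketch of recording the canned descriptors against each design
# specification. The specifications and ratings are hypothetical examples.
from collections import Counter

DESCRIPTORS = ["Exact", "Close", "Middle", "Far", "NA"]

evaluation = {
    "Holds 500 ml of water": "Close",
    "Uses no more than two sheets of paper": "Exact",
    "Stands at least 20 cm tall": "Middle",
}

for spec, rating in evaluation.items():
    assert rating in DESCRIPTORS, f"Unknown descriptor: {rating}"
    print(f"{spec}: {rating}")

# Summarize how many specifications landed at each level of success.
print(Counter(evaluation.values()))
```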
Another way to evaluate the success of the solution is to rank how well each specification was met, with a short explanation for each. This is more work. If you’re under a time constraint, ask students to identify which design specification was met most fully and which was met least sufficiently. Students can also explain and justify their choices when looking at these extremes.
Benefits of Data Collection in MYP Design
At this point in Criterion D, data have been gathered from evaluating the solution against the design specifications. These data should be collected and archived to establish authentic reference points for future classes.
For example, current classes can use the data generated from previous classes’ Criterion D work as credible and authoritative research material for Strand A.2 – Identify and Prioritize the Research.
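As a minimal sketch of what that archive could look like (the filename and fields are hypothetical, not a system I actually used), each unit’s results could be appended to a running CSV file:

```python
# A minimal sketch of archiving Criterion D test results so future classes
# can cite them in Strand A.2. The filename and fields are hypothetical.
import csv

rows = [
    {"year": 2023, "unit": "paper water tank", "team": "Team A",
     "spec": "Holds 500 ml of water", "result": "Close"},
    {"year": 2023, "unit": "paper water tank", "team": "Team B",
     "spec": "Holds 500 ml of water", "result": "Exact"},
]

with open("criterion_d_archive.csv", "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["year", "unit", "team", "spec", "result"])
    if f.tell() == 0:  # new file: write the header once
        writer.writeheader()
    writer.writerows(rows)
```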
Strand D.3 – Explain How the Solution Could Be Improved
If you are working with Year 1 students and want each strand addressed distinctly, note that responses about improvement (Strand D.3) can naturally cross over into impact (Strand D.4). Also, be aware that some student energy can be lost at this point, and answers might be less detailed than a teacher would prefer.
The MYP Design Guide states Year 1 students “outline how the solution could be improved,” while Year 5 students “explain how the solution could be improved.” Regardless, I have found that Year 1 students naturally go beyond “outline” and do explain how to improve the solution in D.3.
This strand is straightforward and one paragraph of writing should be enough for a student to show what they know. Flipgrid and Seesaw offer verbal, video-based choices for strands D.3 and D.4, but may require more time for the teacher to assess.
It’s best to be precise about the next steps for improvement. Students should therefore explain how the solution can be improved in terms of the design specifications most critical to its success. If there are many specifications to choose from, designate an essential few for students to address.
Strand D.4 – Explain the Impact of the Solution
MYP Design Criterion D – Evaluating ends the MYP Design Cycle for the unit of study. Review the GRASPS scenario for a final refocus on the “why” of the problem and the “who” (the client/target audience). A short, final discussion informed by the statement of inquiry questions can reinvigorate purpose and meaning, and cultivate empathy as well.
Year 1 students’ responses might be speculative because their scenarios tend to be more contrived than those for Year 5 students. Regardless, students should focus their explanation of the impact on the target audience: the benefit the client gains (or would gain) from the solution to the problem.
Repeat the MYP Design Cycle?
Should you repeat the cycle? After all, a design cycle or design process is meant to improve the solution each time the designer goes through the steps. The repetition is good! However, repeating a full six-week unit is not logistically feasible. Plus, students would probably, and legitimately, lose interest.
At the end of the semester or year, students could engage in an accelerated, abbreviated version of an entire previously completed unit to apply their knowledge, honor the design process, and hopefully have some fun. Engineering-based units are the easiest to repeat in this manner.
MYP Design Criterion D Evaluating Summary
MYP Design Criterion D depends heavily on evaluating the success of the solution against the design specifications. To do this effectively, students must establish testing methods to generate accurate data. A helpful and natural progression is to share testing data with future classes as research material for Criterion A.
Criterion D is about improving prototypes through an iterative design process: students use the data generated by the testing methods they set up and consider how effective their solution is for the target audience.
Students can draw on several testing methods: expert appraisal, field trials, performance testing, user observation, and user trials. Each method yields a different type of data to help improve the prototype. Performance testing, for example, gives objective data for comparing designs and choosing the best one for the task.
Overall, MYP Design Criterion D is a crucial step in the design process, although students may see it differently! It helps students evaluate their solution’s effectiveness and improve using the best real-world data available. By the end of Criterion D, students will have tested their design systematically and evaluated its impact on the target audience.