Wednesday, March 20, 2013

Preparing Diagnostic Assessments


Recently, Melissa was on our members' site and asked me to respond to the following question: “Can you explain the best method of preparing a math diagnostic assessment?”

When one considers the best method of preparing a math diagnostic assessment, it is important to ask a few questions:

  1. What is the purpose?
  2. Who is the audience?
  3. What kind of evidence (e.g., data or information) is needed?
  4. How can you ensure reliability and validity?
  5. How can the findings be reported?

From a classroom assessment perspective, we think of diagnostic assessment as gathering “baseline data” by engaging learners in the tasks they are going to be learning more about. The purpose of diagnostic assessment from the classroom perspective is to understand what students know and what they need to know, so instructional plans can be made with specific student needs in mind. Two examples:

Task: Students engage in representing their learning in relation to specific learning expectations/intentions. These tasks might be done by individuals, by small groups, or by the entire class. There are many possibilities for performance tasks based on grade-level curriculum: building patterns using manipulatives, problem solving and representing mathematical thinking in a variety of ways (words, symbols, graphs, equations, and so on), or any other task that involves the application of mathematical concepts. Powerful performance tasks result not only in a product but also in an opportunity to observe students and ask them to articulate their understandings. This collection of evidence from multiple sources gathered over time (baseline tasks repeated more than once) provides for reliability and validity.

Test: Other times teachers take an end-of-unit test or quiz and ask students to do as much as they can, noting which questions are easy, which are not too bad, and which are really difficult. It is helpful to ask students to use a common set of visual symbols (e.g., a target) or colours (e.g., green for easy, yellow for moderate, red for difficult) to code the test items. Teachers explain to students that they don’t expect them to know everything because this is an END-of-learning test. Students are being asked to do the test so teachers will know more about what needs to be taught.

Other times the purpose of the diagnostic assessment is to identify trends and patterns across a large group of students so programs can be designed, or to identify learning difficulties. These standardized diagnostic tests and tasks have their own quality standards. If these are the kind of diagnostic assessments you are interested in, you might want to read a column by Jim Popham titled “Diagnosing the Diagnostic Test.”

Whatever you decide to do, think carefully about your purpose and ask yourself, “Do my planned next steps in terms of the diagnostic assessment support student learning?” If you can respond with a “Yes!” then proceed. If not, revise your plans. After all, diagnostic assessments are about supporting student learning first. Fulfilling the information needs of adults is a distant second purpose.

As teachers plan their classroom assessment in support of student learning, they find it helpful to build an assessment plan. You might want to use the end-of-chapter activities in Making Classroom Assessment Work to build your own assessment plan. This third edition will help you figure out which tasks could be a source of important baseline data for you and your students. I recommend you pay particular attention to the end-of-chapter activities for Chapters 3, 4, 5, and 9.

All my best,
Anne

PS Consider attending one of our summer Institutes in Canmore, AB or Fredericton, NB to find out more about diagnostic assessments and building an assessment plan.


Wednesday, March 13, 2013

Report Card Planning

Sarah is working on a project related to reporting. (A while ago I tweeted about an interesting blog by Andrew Campbell that you may also want to read.)

As I reflected on the questions she posed, I invited her to get in touch so we could have a longer conversation. And, I posted the following quick comment...

I'm happy to talk some more about reporting... especially when we conceive of reporting as a process rather than an event and when we think about how children can be involved in communicating evidence of their own learning. Technology is beginning to make it possible for students to take control of communicating the evidence of learning and for teachers to communicate their professional judgement in relation to grade level expectations... two thoughts come to mind...

1. The person working the hardest is learning the most...why shouldn't students be working harder (and smarter) when it comes to reporting?

2. Teachers’ professional judgement is more reliable and valid than external tests when they have been engaged in co-constructing criteria, looking at samples of student work, scoring that work, checking for inter-rater reliability, and so on....

What happens when we help students understand quality, learn the language of assessment, and self-monitor their way to success??? Even young children can do this! We have documentation. We have research evidence. Why not have students deeply engaged in collecting and sharing evidence of their learning?


Why don't you post your thoughts also? Here is the link again.

Cheers,
Anne

PS This is a topic we focus on during our summer Institutes and is often what people ask us to focus on as part of our sessions with schools and districts. Get in touch with me via Twitter (on this blog page) or through Kathy Burns at our office 1.800.603.9888/250.703.2920 if you want to find out more.

Monday, March 4, 2013

Questions to Explore Thinking


On Friday, I was working alongside a group of about 350 school leaders who were thinking and talking about ways to move from evidence and data to classroom practice.  The data that they had been called to look at came from a variety of sources:  classroom-based evidence, large-scale assessment results at grades three and six, and data sets that included contextual, attitudinal, and demographic information.  Indeed, they were ‘data rich.’ 

As we explored the nature of evidence and the many ways that it can be a ‘call to action’, I proposed several questions that these teams might want to use back at their schools in order to uncover what the data might be telling them:

  • What do you see in the data…patterns, trends, and anomalies?
  • What might be some of the reasons that these patterns, trends, and anomalies exist?
  • In what areas does the data indicate that the students are doing well?  What are some of your hunches as to the reasons for these strengths?
  • In what areas does the data indicate that there is room to grow?  What are some of your hunches as to the reasons for this?
  • What does the classroom-based data for this group of students tell you?  Strengths?  Areas for growth?
  • What questions does this data raise for you that you might want to pursue?
  • What are you learning?  What new perspectives might you be gaining?
  • What do you want to think more about?

These questions are not earth-shattering; they have been asked before.  However, they are mediative in nature — that is, invitational and intentional.  Bob Garmston and Art Costa, founders of Cognitive Coaching, assert that mediative questions engage and transform thinking when they are invitational.

Invitational questions are posed:
  • in the plural form (e.g., some of your hunches…, some of the reasons…),
  • using tentative language (e.g., What might…? What are some…?),
  • with positive presuppositions (e.g., What are you learning?), and
  • as open-ended questions rather than ‘yes’ or ‘no’ questions (e.g., What do you want to think more about?).

As the group considered the eight questions above, we used a carousel strategy to think more deeply about them.  One of the questions that the participants responded to was ‘What other questions might we ask about these data sets?’  I have included a picture of one of the chart papers that was posted.

Participants created questions that intentionally opened up thinking, rather than questions that ‘shut down’ thinking.  It is in this way that we move toward deeper meaning and understanding.  As Eugène Ionesco wrote in Découvertes, “It’s not the answers that enlighten us, but the questions.”