Friday, 10 November 2017

The no-calculator part of the Alberta PAT

Under pressure from people who want the mathematics curriculum to revert to an earlier age, the Government of Alberta introduced a fifteen-minute no-calculator section as part of the grade 6 Provincial Achievement Test. The results for 2016 were not spectacular, unless you think mediocrity is worthy of applause.

Because of my association with the SNAP Mathematics Foundation, which promotes the use of math-based puzzles in the K-9 classroom, I am very interested in K-9 math education.  And I am dejected when I read that our students are doing poorly.

I have recently taken a look at the grade 6 Math PAT. It did not leave me feeling warm and fuzzy.  The test by itself cannot account for the less-than-stellar results, but I do have some quibbles.


Part A questions

The Alberta Grade 6 math PAT has two parts. The questions for Part A, the no-calculator part,
are described as being of low cognitive difficulty. For example: 


It is difficult to believe that a typical grade-sixer would stumble over questions like these. But apparently a lot did: almost 35% failed to meet the acceptable standard.

Wow.

Are you still unwilling to jump onto the back-to-basics bandwagon? 

When the results of Part A were released, David Eggen, our education minister, reacted like this:
And there it was. Boom. Big place for room for improvement for basic skills
BUT . . . He also said that he expected to see poor results on Part A of the test.

I like what David Eggen is doing with education, but I didn't expect that last comment.

Permit me to digress a bit.  A few years back, my gas company sent consultants to do a free inspection of my heating system.  You can guess what happened: my system was found to need rehabilitation, and they tried to convince me to buy an expensive high-efficiency furnace.

In case you somehow miss my point, I reacted to Part A as if Alberta Education was trying to sell me a costly high-efficiency basic math curriculum.

What is the test like?

The best way to understand what a test is like is to try it yourself. That means you should actually work out the answers, not just look at them and judge whether they are too hard or too easy for a grade six student. Here's a sample test you can download and try:

Ted's Sample Part A test

It should take less than fifteen minutes. Be sure to read and follow the instructions to the letter.
The time you take to check your answers should be part of the 15 minutes.

(My sample test is a very slight modification of the one provided by Alberta Education.  Their sample test may be found in the link at the end of this post.)

The full PAT has two parts. The questions for Part A, the no-calculator part, have no context. They are arithmetic computations. The questions for Part B are placed in "real-life contexts," which is to say that they are arithmetic computations dressed in real-world clothing. (To call them real-life is a stretch. You're not fooling the kids, ABED.)

Sad to say, the grade 6 math PAT sample questions quite accurately portray the math world I lived in when I was in grades 6 and 7.  Sixty-five.  Years.  Ago. 

I have met many K-6 teachers through our SNAP workshops, and the math environment that they create in their classrooms is not at all like the one evoked by the Grade 6 math PAT.  (This is a good thing! The effect of my elementary math education was that, afterwards, throughout all of high school, I habitually brushed off math as a possible source of interesting material. I certainly did not anticipate that math would become a major part of my life.)

To be candid, I'm not an "evaluation" expert. But my intuition suggests that any gulf between the math world of the classroom and the math world of the test will negatively affect the PAT outcomes.

The format of Part A

The heavy promotion of Part A as a no-calculator test suggests that it is a pencil-and-paper exam. Well it is, sort of. But, like Part B, it is essentially a fill-in-the-bubbles test. It is what you would get if you transferred a computer-based test to paper, with the students shading in the bubbles by pencil instead of mouse-clicking the appropriate radio buttons.

Apart from the possibility of grading rough work (and I don't know if that happens), the main difference between Part A and Part B is that Part B is multiple choice. In Part A, students get to enter a numerical answer. 

Students are expected to answer each question in Part A using two different formats: first, by filling in boxes with written numerals, and then by shading in the appropriate bubbles. The answer template for each question is as shown to the right. (In my sample test, I provided only the boxes, because the students were able to complete the bubble portion after the fifteen minutes.)

The numerical answer has to be written in the set of four boxes as follows:
One digit per box, beginning in the left-hand box.
A decimal point, if needed, goes in its own box.
Unused boxes must be left blank.
So the "correct" way to fill in the boxes for the question

        "What is 7 ÷ 2 ?" 

is as follows:



Hmm. What about the smart-ass student who decides to write the answer in one of the following ways:



(By the way, what is an unused box? If it is unused is it not already blank??)
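The box-filling rules can be made concrete with a small sketch. To be clear, this is hypothetical code of my own, not anything Alberta Education uses; it just encodes the three rules stated above:

```python
def fill_boxes(answer: str, width: int = 4):
    """Place an answer in the answer boxes: one character per box,
    starting at the left-hand box; a decimal point, if needed, gets
    its own box; unused boxes are left blank (here, empty strings)."""
    if len(answer) > width:
        raise ValueError("answer does not fit in the boxes")
    return list(answer) + [""] * (width - len(answer))

# 7 ÷ 2 = 3.5 fills three boxes and leaves the fourth blank
print(fill_boxes("3.5"))  # → ['3', '.', '5', '']
```

Note that the "unused boxes must be left blank" rule does no work here: once the digits are placed left to right, the remaining boxes are already blank.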


The bubble portion is to enable machine grading. 

The instructions for completing the bubble portion are:
You may fill in the bubbles below each of your answers as you do the test; however, you may also fill in the bubbles after you have completed both Part A and Part B of the Grade 6 Provincial Achievement Test and your teacher has collected your test booklets.
So the completely finished answer should look like this:




I'm not sure if any of the students wasted part of the 15 minutes filling in the bubbles. The box and bubble answer templates for all fifteen questions are on an answer sheet separate from the examination booklet. 


About learning the basics

Children do need to learn the basics, but you and I will probably disagree about what the basics are. Students should be fluent with the basics, but you and I might disagree on what fluency means.

The Part A questions were easy, and the grades should be higher. But I'm not sure that fifteen isolated numerical responses to Part A reveal the extent of a student's ability.  And, as I intimated above, I have the uncomfortable suspicion that Part A was intended to prove incompetence rather than competence. 




Link:


Alberta Grade 6 Math PAT Bulletin




Saturday, 21 January 2017

Changing percent of students at each PISA math proficiency level


This is a graphical summary of how Canadian students have trended in the triennial math tests offered by the PISA/OECD consortium.

It is not so much about their international standing as about how their achievement levels have varied over the five rounds from 2003 through 2015.

For each participating region or country, PISA publishes how the students’ grades are distributed across seven different proficiency levels. The levels are the same for all jurisdictions, and they have remained the same for the five rounds since 2003.


The grades defining the boundaries of the proficiency levels are shown in boldface. (The ones in parentheses are what you get when the PISA-assigned grades are converted in a reasonable way to grades out of 100.)  The yellow and blue levels identify low-performing and high-performing students. The interval widths for each of Level 1 through Level 5 are the same. PISA does not define the widths for the lowest (below Level 1) and highest (Level 6) proficiency levels.
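For concreteness, here is a sketch of such a conversion. The cut scores are the OECD-published boundaries for the PISA mathematics scale; the division by 8 (treating 800 PISA points as 100%) is only my guess at one "reasonable" conversion, since the post's table is not reproduced here:

```python
# OECD-published cut scores separating the PISA math proficiency
# levels (boundaries of Levels 1 through 6); each of the five
# intermediate intervals is about 62.3 points wide
cuts = [357.77, 420.07, 482.38, 544.68, 606.99, 669.30]

# One plausible conversion to a grade out of 100: treat 800 PISA
# points as 100 percent (an assumption -- the post does not say
# which conversion its author used)
def to_percent(score: float) -> float:
    return 100 * score / 800

print([f"{to_percent(c):.1f}" for c in cuts])
```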

The table below summarizes the distributions for Canadian students for each of the five PISA rounds.



In effect, each row of the table is an estimated probability distribution. For example, if you were to randomly choose a Canadian student in 2003, the probability that he or she would have achieved Level 5 is 14.8 percent.
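In code, treating a row of the table as a probability distribution looks like this. Only the Level-5 figure of 14.8 percent comes from the post; the other weights below are made-up placeholders (chosen to sum to 100) standing in for the rest of the 2003 row:

```python
import random

levels = ["below 1", "1", "2", "3", "4", "5", "6"]
# Percent of Canadian students at each proficiency level in 2003.
# Only the Level-5 value (14.8) is from the post; the others are
# illustrative placeholders.
percents = [2.4, 7.8, 18.3, 26.2, 24.5, 14.8, 6.0]

# Randomly choose a student: the chance of drawing Level 5 is 14.8%
student_level = random.choices(levels, weights=percents, k=1)[0]
print(student_level)
```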

Actually, the reason I wrote this post was to try to figure out how to use slides in Blogger -- I think slides give a more vivid visual depiction of the changes than a table does.

The following slideshow displays the table as a sequence of histograms. There is some concern that Canadian students are trending in the wrong direction. Perhaps this graphical representation will help you decide whether the trend is cause for alarm.