Friday, 10 November 2017

The no-calculator part of the Alberta PAT

Under pressure from people who want the mathematics curriculum to revert to an earlier age, the Government of Alberta introduced a fifteen-minute no-calculator section as part of the grade 6 Provincial Achievement Test. The results for 2016 were not spectacular, unless you think mediocrity is worthy of applause.

Because of my association with the SNAP Mathematics Foundation, which promotes the use of math-based puzzles in the K-9 classroom, I am very interested in K-9 math education. And I am dejected when I read that our students are doing poorly.

I have recently taken a look at the grade 6 Math PAT. It did not leave me feeling warm and fuzzy.  The test by itself cannot account for the less than stellar results, but I do have some quibbles.


Part A questions

The Alberta Grade 6 math PAT has two parts. The questions for Part A, the no-calculator part,
are described as being of low cognitive difficulty. For example: 


It is difficult to believe that a typical grade-sixer would stumble over questions like these. But apparently a lot did: almost 35% failed to meet the acceptable standard.

Wow.

Are you still unwilling to jump onto the back-to-basics bandwagon? 

When the results of Part A were released, David Eggen, our education minister, reacted like this: "And there it was. Boom. Big place for room for improvement for basic skills."
BUT . . . he also said that he expected to see poor results on Part A of the test.

I like what David Eggen is doing with education, but I didn't expect that last comment.

Permit me to digress a bit.  A few years back, my gas company sent consultants to do a free inspection of my heating system.  You can guess what happened: my system was found to need rehabilitation, and they tried to convince me to buy an expensive high-efficiency furnace.

In case you somehow miss my point, I reacted to Part A as if Alberta Education was trying to sell me a costly high-efficiency basic math curriculum.

What is the test like?

The best way to understand what a test is like is to try it yourself. That means you should actually work out the answers, not just look at them and judge whether they are too hard or too easy for a grade six student. Here's a sample test you can download and try:

Ted's Sample Part A test

It should take less than fifteen minutes. Be sure to read and follow the instructions to the letter.
The time you take to check your answers should be part of the 15 minutes.

(My sample test is a very slight modification of the one provided by Alberta Education.  Their sample test may be found in the link at the end of this post.)

The full PAT has two parts. The questions for Part A, the no-calculator part, have no context. They are arithmetic computations. The questions for Part B are placed in "real-life contexts," which is to say that they are arithmetic computations dressed in real-world clothing. (To call them real-life is a stretch. You're not fooling the kids, ABED.)

Sad to say, the grade 6 math PAT sample questions quite accurately portray the math world I lived in when I was in grades 6 and 7.  Sixty-five.  Years.  Ago. 

I have met many K-6 teachers through our SNAP workshops, and the math environment that they create in their classrooms is not at all like the one evoked by the Grade 6 math PAT. (This is a good thing! The effect of my elementary math education was that, afterwards, throughout high school, I habitually brushed off math as a possible source of interesting material. I certainly did not anticipate that math would become a major part of my life.)

To be candid, I'm not an "evaluation" expert. But my intuition suggests that any gulf between the math world of the classroom and the math world of the test will negatively affect the PAT outcomes.

The format of Part A

The heavy promotion of Part A as a no-calculator test suggests that it is a pencil and paper exam. Well it is, sort of. But, like Part B, it is essentially a fill-in-the-bubbles test. It is what you would get if you transferred a computer-based test to paper, with the students shading in the bubbles by pencil instead of mouse clicking the appropriate radio buttons.

Apart from the possibility of grading rough work (and I don't know if that happens), the main difference between Part A and Part B is that Part B is multiple choice. In Part A, students get to enter a numerical answer. 

Students are expected to answer each question in Part A using two different formats: first by filling in boxes with written numerals, and then by shading in the appropriate bubbles. The answer template for each question is as shown to the right. (In my sample test, I provided only the boxes, because the students were able to complete the bubble portion after the fifteen minutes.)

The numerical answer has to be written in the set of four boxes as follows:
  • One digit per box, beginning in the left-hand box.
  • A decimal point, if needed, goes in its own box.
  • Unused boxes must be left blank.
So the "correct" way to fill in the boxes for the question

        "What is 7 ÷ 2 ?" 

is as follows:



Hmm. What about the smart-ass student who decides to write the answer in one of the following ways:



(By the way, what is an unused box? If it is unused, is it not already blank??)
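To make the box format concrete, here is a minimal Python sketch of those three rules. It is my own illustration, not anything supplied by Alberta Education:

```python
def box_answer(answer, boxes=4):
    """Lay out a numeric answer in PAT-style answer boxes:
    one character per box, starting from the left-hand box;
    a decimal point occupies its own box; unused boxes stay blank."""
    chars = list(str(answer))
    if len(chars) > boxes:
        raise ValueError("answer does not fit in the boxes")
    return chars + [""] * (boxes - len(chars))

print(box_answer(3.5))   # ['3', '.', '5', ''] -- the "correct" form for 7 ÷ 2
```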


The bubble portion is to enable machine grading. 

The instructions for completing the bubble portion are:
You may fill in the bubbles below each of your answers as you do the test; however, you may also fill in the bubbles after you have completed both Part A and Part B of the Grade 6 Provincial Achievement Test and your teacher has collected your test booklets.
So the completely finished answer should look like this:




I'm not sure if any of the students wasted part of the 15 minutes filling in the bubbles. The box and bubble answer templates for all fifteen questions are on an answer sheet separate from the examination booklet. 


About learning the basics

Children do need to learn the basics — but you and I will probably disagree about what the basics are. Students should be fluent with the basics, but you and I might disagree on what fluency means.

The Part A questions were easy, and the grades should be higher. But I'm not sure that fifteen isolated numerical responses to Part A reveal the extent of a student's ability.  And, as I intimated above, I have the uncomfortable suspicion that Part A was intended to prove incompetence rather than competence. 




Link:


Alberta Grade 6 Math PAT Bulletin




Saturday, 21 January 2017

Changing percent of students at each PISA math proficiency level


This is a graphical summary of how Canadian students have trended in the triennial math tests offered by the PISA/OECD consortium.

It is not so much about their international standing, but about how their achievement levels have varied over the five rounds from 2003 through 2015.

For each participating region or country, PISA publishes how the students’ grades are distributed across seven different proficiency levels. The levels are the same for all jurisdictions, and they have remained the same for the five rounds since 2003.


The grades defining the boundaries of the proficiency levels are shown in boldface. (The ones in parentheses are what you get when the PISA-assigned grades are converted in a reasonable way to grades out of 100.) The yellow and blue levels identify low-performing and high-performing students. The interval widths for each of Level 1 through Level 5 are the same. PISA does not define the widths for the lowest (below Level 1) and highest (Level 6) proficiency levels.

The table below summarizes the distributions for Canadian students for each of the five PISA rounds.



In effect, each row of the table is an estimated probability distribution. For example, if you were to randomly choose a Canadian student in 2003, the probability that he or she would have achieved Level 5 is 14.8 percent.

Actually, the reason I wrote this post was to try to figure out how to use slides in Blogger — I think slides give a more vivid visual depiction of the changes than a table does.

The following slideshow displays the table as a sequence of histograms. There is some concern that Canadian students are trending in the wrong direction. Perhaps this graphical representation may help you decide if the trend is cause for alarm.
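For anyone who wants to reproduce a frame of the slideshow, here is a minimal matplotlib sketch. Only the 14.8% figure for Level 5 in 2003 appears in this post; the other bar heights are placeholders to be replaced with the real row of the table.

```python
import matplotlib.pyplot as plt

levels = ["<1", "1", "2", "3", "4", "5", "6"]
# Placeholder percentages for Canada, 2003 -- only the 14.8 for Level 5
# is taken from the post; replace the rest with the actual table row.
pct = [2.4, 7.7, 18.3, 26.2, 25.1, 14.8, 5.5]

plt.bar(levels, pct)
plt.xlabel("PISA proficiency level")
plt.ylabel("Percent of students")
plt.title("Canada, PISA math 2003 (illustrative values)")
plt.show()
```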


     






Monday, 12 December 2016

The Meaning of the 2015 PISA Canadian Math Results


For the past few years various news media have repeatedly conveyed the belief that:
  1. The mathematical abilities of Albertan and Canadian students have declined significantly.
  2. This decline was caused by the introduction of a new curriculum.
  3. Evidence is provided by the results of the PISA tests.

That sentiment is what this post is about, but let me start with this question.
At a certain junior high school the averages of the math grades for the past three years were 73%, 75%, and 73%. How would you rate this school?
  1. Definitely a top notch school.
  2. That’s a pretty good school.
  3. That’s what an average school should be like.
  4. Mediocre — room for improvement.
  5. Poor — I would think about sending my child to a different school.

Myself, I would choose (2). I would expect grades of 65%-70% to be about average, so I feel the school is a pretty good one. Not absolutely fantastic, but pretty good.

The real story:


The averages cited above are not for a particular school, but for an entire country. The figures are the average math scores for top-notch Singapore in the 2009, 2012, and 2015 rounds of the PISA tests. (If you are familiar with the PISA tests, you know that the scores are not recorded as percentages. The numbers 73%, 75%, and 73% are what you get when you convert the published PISA scores in a reasonable way.1)

For comparison, here are the Alberta and Canadian results for the years from 2003 through 2015, again converted to percentages. (Singapore did not compete prior to 2009).


The variation across the PISA rounds for all three regions seems pretty unexceptional. I would expect changes of ±3% at the very least — the year-to-year variability in my university math classes was at least as large as that.
[As an aside, I predicted in a previous post that Canada would get a grade in the range from 65% to 67% on the math part of the 2015 PISA test. I promised a Trumpian gloat if the prediction was correct, and a Clintonian sob if it was wrong. No guru here, I’m afraid — it was pure luck — but "Gloat. Gloat" anyway.]
In spite of the stable relationship between the results for Alberta and Canada, there is alarm that Alberta's performance in the 2015 round has pushed the province below the rest of Canada. That may be numerically accurate when you look at the raw scores published by PISA, but the results are so close that when converted to percentages the difference vanishes. And, more importantly, the PISA consortium itself reports that the difference is so minuscule that it is not statistically significant. Which means there is no reliable evidence that there is any difference whatsoever.

I don’t think that our ranking among the PISA participants indicates that our math curriculum is either superb or deficient. But using the PISA rankings to critique the curriculum is what people do. To confirm this, all you have to do is read the opening foray in the petition that stoked the current math wars in Alberta. Here is what it says:
"The Organization for Economic Co-operation and Development (OECD) recently released a report confirming that math scores, measured in 15yr olds, have declined significantly in the ten years since the new math curriculum was introduced across Canada."

Additionally, there is this report in the Globe and Mail in January 2014:
Robert Craigen, a University of Manitoba mathematics professor who advocates drilling basic math skills and algorithms, said Canada’s downward progression in the international rankings – slipping from sixth to 13th among participating countries since 2000 – coincides with the adoption of discovery learning.


So, what about that new curriculum?


The petition was presented to the Alberta legislature on Tuesday, Jan 28, 2014. (A later article gives a date of March 2014.)

So apparently the new curriculum was introduced a decade earlier, in 2004. However, that's not what happened — no students were taught the new curriculum prior to 2008-2009. To see just how far off the 2004 date is, let's look at how the curriculum was actually phased in. Here is the schedule for the years that each grade was first exposed to the curriculum. (I thrashed around for a long time trying to find this. Thanks to David Martin and John Scammell for providing me with the information. Since Alberta Education does not have this information on its website, I cannot be sure that Alberta actually followed the implementation schedule, but I'm assuming that was the case.)


The PISA exams take place every three years, and Canadian students take the tests in Grade 10. Just like David and John did in their 2014 posts2, I’ll save you some time and do the math so you can see when the students were first exposed to the new curriculum. 
  • Those who took the 2003 test - never exposed
  • Those who took the 2006 test - never exposed
  • Those who took the 2009 test - never exposed
  • Those who took the 2012 test - first exposed in grade 7
  • Those who took the 2015 test - first exposed in grade 4
The curriculum changes could not be reflected in any way by the PISA results prior to 2012. And those who wrote the PISA test in 2012 were not exposed to the new curriculum in the elementary grades (which is where those who blame the new curriculum for our decline are pointing their fingers).
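If you want to check the arithmetic yourself, here is a small Python sketch. The phase-in years are my own reconstruction from the exposure facts above (grades 4 and 7 new in 2008-09, with the remaining grades following over the next two years), not an official Alberta Education schedule:

```python
# First school year (identified by its starting calendar year) in which each
# grade used the new curriculum. These dates are reconstructed from the
# exposure facts quoted above, not taken from an official source.
FIRST_YEAR = {1: 2008, 2: 2009, 3: 2010, 4: 2008, 5: 2009,
              6: 2010, 7: 2008, 8: 2009, 9: 2010}

def first_exposure(test_year):
    """PISA students write the test in grade 10, so a student tested in
    `test_year` was in grade g during the school year that started in
    test_year - 11 + g. Return the first grade whose school year falls
    within the new curriculum, or None if none does."""
    for g in range(1, 10):
        if test_year - 11 + g >= FIRST_YEAR[g]:
            return g
    return None

for year in (2003, 2006, 2009, 2012, 2015):
    print(year, first_exposure(year))
# 2003 None, 2006 None, 2009 None, 2012 7, 2015 4
```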

The biggest drop in the PISA scores seems to have occurred in 2006, and the PISA grades since then have remained quite steady. What's going on here? Did the introduction of the new curriculum in 2008 cause a drop in the PISA results in 2006? Is this a brand-new type of quantum entanglement? As you all know, with ordinary everyday quantum entanglement, changes in physical properties can be transmitted across great distances instantaneously.3 Yes, instantaneously, but never backwards through time. As someone might say, that would be really spooky!

Sorry for the sarcasm.

If you accept the PISA results as evidence of a decline of our students' math abilities, then those same results show that the decline occurred well before the new curriculum could have had any influence. And, this also means that the PISA results provide no evidence that returning to the previous curriculum will repair things.


Do the PISA scores matter?


By the way, if you are concerned about our sinking rank in the math PISA league tables, note that for the 2015 round, Canada moved up from 13th to 10th place.

David Staples (a columnist for the Edmonton Journal) has said that the people who want to return to the pre-2008 curriculum and teaching methods are not really concerned with our PISA rank (despite the comments quoted above). Their concern is that the proportion of lower-performing students is increasing while the proportion of higher-achieving students is decreasing.

Although that's true, and it might be serious, the decline started prior to 2015, and none of the students who took the PISA test before 2015 had been exposed to the new curriculum in elementary school. So, again, the PISA results are not evidence that the new curriculum is at fault. We'll have to go elsewhere to get evidence that it was caused by the new curriculum.

And if we can't find any, please let us not blame the front-line workers and have this discussion degenerate into a round of teacher bashing.





REFERENCES

1 The conversion from the published PISA scores to percentages was done as follows:

Percent_grade = 60 + (Pisa_Score − 482) × (10 / 62.3). 

More details can be found here.


2 David Martin's post about the 2012 PISA round is here, and John Scammell's is here.

3 Quantum entanglement — I'll let others explain it. Here are two YouTube videos that do a pretty good job.

Quantum Entanglement & Spooky Action at a Distance

Yro5 - Entanglement

** post was edited Dec 14 to correct two typos





Thursday, 24 November 2016

Use it or Lose it!


This is not a column about sexual health.


I'm talking about the basic math facts — times tables and all that stuff that should be stored in our long-term memory:

Will we really forget those facts if we don’t use them? 

Our brains are flexible. Neuroscientists use the word "plastic". They tell us that the brain's circuits can be repurposed, new ones can grow, and that frequent use strengthens neural connections. My guess is that YES, our memory of these basic facts will fade if we don't use them.


But I have a follow-up question.

Do you know that 1/1.04 = 0.96154?

I do know that! And it's a fact that I don't use.


Here's the story.

Fifty years ago I worked for a large Canadian insurance company. I was a clerk in the mathematics section of the Policy Change department. My job was to compute how much to charge or refund when clients changed their insurance policies.

Among the many options available was the possibility of paying premiums in advance. The company liked it when clients did this, and to encourage them the company offered a discount.

We used an interest rate of 4%.  So, if the premium was, say, $150.00, and the payment was being made a year in advance, the client would pay $144.23, which is what you get from the following computation:

$150.00 / 1.04.

Rounded to five decimal places, 1/1.04 = 0.96154, so we could also calculate the discounted premium this way:

$150.00  ×  .96154.

Actually, 1/1.04 is slightly smaller than 0.96154, but for all practical purposes the two methods are identical. Yet, despite their equivalence, our boss made us use the second one. Instead of dividing by 1.04 we had to multiply by 0.96154. It was not because our boss had some weird mathematical fetish: an official company policy memo explicitly stated that when computing prepayments of premiums, we had to use the factors from the supplied interest tables. Policy trumps mathematics.
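For the skeptical, here is a quick Python check of the near-equivalence of the two computations:

```python
premium = 150.00

by_division = premium / 1.04      # discounting directly at 4%
by_factor   = premium * 0.96154   # the table factor: 1/1.04 rounded to five places

print(round(by_division, 2), round(by_factor, 2))  # 144.23 144.23
print(1 / 1.04)  # 0.9615384615384616 -- indeed slightly smaller than 0.96154
```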

After a short time on the job, the "fact" that 1/1.04 equalled 0.96154 had established permanent residence in my mind, and it still remains there even though I haven't used it for over half a century.

* * * * *

This sort of thing is not unusual.

  • My parents-in-law, veterans of WWII, could recall their military ID numbers for their entire lives, even though they seldom used them.
  • My wife and her mother used to play a game when she was a child. They recited the alphabet backwards as rapidly as they could. It's a skill that my wife has not used for over 60 years. Yet yesterday she showed me that she can still reel off the alphabet backwards without hesitation. 

So my intuition that facts fade from memory when they are not used may be wrong. Some of us retain basic facts and practiced skills even if we don't use them for an extremely long time.

* * * * *

On the other hand - 

Sometimes when people say "Use it or lose it" they really mean

Use it and you won't lose it!

Well, we all know that statement is false.

How else to account for my spotty recall of simple multiplication facts? I've used those basic multiplication facts quite a lot since leaving that insurance company, and I'm still not "fluent" with them. I can fire off the squares of the natural numbers up to 12 × 12, but ask me what 12 × 7  or 12 × 11 is and I have to perform a mental computation. And 13 × 13 is 169, but what is 13 × 9?

Some people can recall basic math facts quickly and without conscious effort — they automatically know them without thinking. Like the way I know that 1/1.04 is 0.96154.

Others will never completely achieve this type of fluency. When someone says "Twelve times seven," it may not provoke the involuntary "Eighty-four." Instead, the reaction may be "seventy plus fourteen," having broken 12 × 7 into 10 × 7 plus 2 × 7.

To be truthful, I am fluent with most of the basic multiplication facts. But, I really do still have trouble with a few parts of the 6, 7, 8 and 12 times tables. And, as I prepared this post, I actually did have to retrieve 12×7 by that quick mental computation. However, once done, I could instantaneously recall the answer for the rest of the post, and I know that the automaticity will remain for several days, maybe even a couple of weeks. But it’s as impermanent as those facts acquired by cramming for an exam: very soon it will fade from my memory.



My advice 

If my children were young, this is what I would tell them today:

Practice your times tables. Memorize as much as you can. It is worth the effort. Maybe you will never be able to keep all of them in your head. But if you can keep a few there, and if you learn how to be flexible with numbers, you will find ways to derive the more difficult ones very quickly — almost as fast as if you had actually memorized them — and it won’t stop you from learning more mathematics.

As for getting every single one of those basic facts into genuine long term memory in a way that they can be instantly and automatically recalled: forget it — for me, and maybe for quite a few others:

It's • Not • Gonna • Happen


Addendum

Just as I was finishing this post, I came across an article (linked below) by Maria Konnikova in The New Yorker magazine. It's about the tenuous relationship between practice and achievement across a variety of fields (including mathematics). It's really worthwhile reading. 




Friday, 11 November 2016

Understanding the PISA Tables


The PISA math results show Canada trending downwards internationally. Here are the rankings of the top 20 regions (of approximately 65) that participated in the 2003, 2006, 2009, and 2012 rounds:


Note: Shanghai and Singapore have been greyed out for 2003 and 2006 because neither took part in those two rounds. Since it is hard to assess a trend when the population base changes, I think that either Shanghai and Singapore should be included for all rounds, or both should be excluded for all rounds. I have chosen to include them, and, based on their results in 2009 and 2012, I have assumed that they would have finished atop the PISA table had they competed in the earlier rounds.

With Shanghai and Singapore included, the perspective on the Canadian trend changes somewhat. However, some slippage is definitely happening, and for many people the trend is alarming.

But I wonder if we are concentrating too closely on the rankings. Here is the same table when grades are included:


Note: OK. This doesn't look like the PISA scores that you have seen. The results shown above are percentage grades, and the PISA consortium does not report the scores this way. Instead, they use a numerical scale that ranges from about 300 to over 600.

I have a fondness for percentage grades. They provide me with a more serviceable scale than grades out of 600. For example, to me, a class average of 67% indicates an ordinary but acceptable performance, while I am not sure what an average of 527 on the PISA scale conveys.

The conversion from the PISA scale to percentage grades in the table above was done linearly:

Percent_grade = 60 + (Pisa_Score − 482) × (10 / 62.3).
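As a small Python sketch, here is the same conversion applied to Canada's published PISA math means (the scores below come from the reports in the references):

```python
def percent_grade(pisa_score):
    """Linear conversion used in this post: a PISA score of 482 maps to
    60%, and each 62.3-point PISA band spans 10 percentage points."""
    return 60 + (pisa_score - 482) * (10 / 62.3)

# Canada's published PISA math means, 2003 through 2012.
for year, score in [(2003, 532), (2006, 527), (2009, 527), (2012, 518)]:
    print(year, round(percent_grade(score)))  # 68, 67, 67, 66
```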

As is evident, Canada has remained remarkably stable: 68%, 67%, 67%, 66%. 


Proficiency Levels


The PISA results are intended to be a measure of mathematical proficiency, and PISA describes six different levels. The grade ranges for each level are as follows:



The descriptions above are extremely abbreviated. Detailed descriptions can be found in the 2012 PISA technical report (link [5] below). Also, PISA does not use verbal descriptions like Highly proficient or Basic proficiency.

Some features of the PISA proficiency levels

  • For each round, the PISA consortium adjusts the grades so that the proficiency levels retain the same meaning over the years. A percentage score of 65% in 2003 has exactly the same meaning as a percentage score of 65% in 2012. In other words, PISA takes care to avoid grade inflation.
  • On the percentage scale, each Level band is 10 points wide. On the PISA scale, the bands are 62.3 points wide.
  • The PISA scales are designed so that the OECD mean is 500 (or very close to 500). On the percentage scale, the OECD mean is 63%.
  • Over the four rounds from 2003 to 2012, no participating region's average has reached Level 6. One participant's average has reached Level 5, and five have reached Level 4. In those four rounds, Canada's average has stayed within the Level 3 band.
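In code, the banding looks something like this — a sketch assuming, as the bullet points say, that Level 3 starts at a PISA score of 482 (i.e., 60%) and that every band is 62.3 PISA points wide:

```python
import math

def proficiency_level(pisa_score):
    """Map a PISA score to a proficiency band: 0 means below Level 1.
    Assumes Level 3 begins at 482 and each band is 62.3 points wide."""
    band = math.floor((pisa_score - 482) / 62.3) + 3
    return max(0, min(band, 6))

print(proficiency_level(527))  # 3 -- Canada's typical band
```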


Is the slippage significant?


The difference between Canada's scores in 2003 and 2012 is statistically significant. Those words mean that, with high probability, the difference in the scores is real and is not a result of the chance variation that arises from testing only a sample of the population. Statistical significance tells us that the statistics are reliable, but it doesn't tell us anything about whether or not the statistics are important.

(The word "significant" has considerable emotional impact. I remember a Statistics professor telling us that the word should be outlawed when we are discussing statistical results.)

Do the differences in the PISA tables show that Canada must do something to improve how our students learn mathematics? Some people believe that the statistically significant difference between 2003 and 2012 is also of great practical significance, and they have strong opinions about what should be done.

As well, the PISA group itself has investigated what will and won't work to improve students' mathematical proficiency, and has published its findings in a 96-page document [6]. Some of the conclusions appear to run counter to what the back-to-basics people advocate — one conclusion in the document is that dependence on intensive memorization and rote practice of the basic algorithms is a poor strategy for learning anything but the most trivial mathematics.

That particular PISA document may leave some of my friends in a slightly awkward place: Those who have used the PISA tables as a rallying cry for a return to traditional methods now find themselves having to rebut some of PISA's own conclusions about what constitutes good teaching and good curriculum.

In any event, when I look at the percentages for Canada from 2003 to 2012, I see changes, but I don't think that they are earth-shattering. Would you be concerned if one section of your Calculus course averaged 68% and another section averaged 66%? I’ve run into that situation teaching back-to-back sections of the same course, and I was not convinced that the section with the higher score was the superior one.

* * *

In a few weeks, the OECD will release the results of the 2015 PISA round. I am very interested in seeing if Canada's results are significantly different in a practical sense. My expectation (based on no evidence whatsoever) is that the results will be much the same as before, and will fall within the 65% to 67% range. (Hey! A Halifax weatherman had a very successful forecasting run by predicting tomorrow's weather to be the same as today's.) I'll do a Trumpian gloat if my prediction is correct, and a Clintonian sob if it's wrong.



References


[1] 2003 PISA report (Math scores are on page 92)

[2] 2006 PISA report (Math scores are on page 53)

[3] 2009 PISA report (Math scores are on page 134)

[4] 2012 PISA report (Math scores are on page 5)

[5] PISA 2012 Technical report (Proficiency levels are described on pages 297 - 301)

[6] Ten Questions for Mathematics Teachers… and How PISA Can Help Answer Them, OECD Publishing, Paris (2016).



Thursday, 13 October 2016

Alberta's new math curriculum


Will David Eggen be prudent in his redesign of the K-6 Alberta math curriculum?


A few days ago, the Provincial Achievement Test results were released, and the media reacted:
Math scores dip as Alberta Education releases latest PAT results
— Metro News, Oct 7
Calgary students show well on Alberta PAT tests, but concerns over math scores
— Calgary Sun, Oct 8
Math results continue to slide on Alberta provincial exams
— Edmonton Journal, Oct 8
Grade 6 math marks concern Edmonton school boards, Alberta education minister
— Global News, Oct 7
Province and schools eye changes as grade six math students struggle
— CHED, Oct 8

So what are the results?


Here are the percentages of students that achieved the acceptable standard and the standard of excellence in the Grade 6 math PATs from 2011/12 to 2015/16 (the data is from the Alberta Education website):


The bandaid solution?


Already some curriculum changes have been made. Recent amendments include memorization of the addition and multiplication tables and learning the standard/traditional algorithms. The format of the algorithms is not explicitly specified. This means, for example, that any of the following three would qualify as the standard division algorithm:



Where did these come from? The leftmost algorithm is the one I was taught umpteen years ago. The other two are the ones my grandsons learned a few years ago during the midst of the math wars. I don’t know if any of these were being taught in elementary schools throughout Alberta. I presume not, otherwise there would be little need for the Summary of Clarifications published here. (Alberta Education does provide a very simple example of a division algorithm. It resembles the one I learned as a youngster — one which I did not really understand at the time.)

A brand-new wrinkle in the curriculum is that the Math 6 PATs will now have a no-calculators-allowed part: 
Part A [of the PAT] is a 15-question test including five multiplication/division questions, five "connecting experiences" questions, and five "number relationship" questions, according to a guide for testers. Calculators are not permitted, and the test is designed to be finished in 15 minutes. 
— Edmonton Journal, Sep 1.
Some people have said that the time pressure created by this part of the test helps prepare the students for the pressure they will have in the real world. (Whose real world is that? Mine? The teacher’s? Part of education is to help students understand their real world, which is radically different from an adult's real world. I think that a child's real world provides plenty of pressure without a timed test.)


Do the PAT scores mean that more changes are needed?


Without knowing how the cut points for the PAT tables are determined, and without knowing why the average test scores are increasing while the percentage of successful students is decreasing, it is difficult to tell if the PAT results indicate that more curriculum changes are needed.

I'm not sure that the PAT results even matter — everyone knows that the real driver for curriculum change is the belief that we must get higher scores on international assessments like PISA. However, as signalled by the media headlines, the grade 6 PAT scores will surely be used to apply more pressure to David Eggen's real world.


A final thought


Currently I'm involved in preparing material that ties our math fair puzzles to the Alberta K-6 mathematics curriculum and to the American Common Core State Standards for K-8 mathematics.

By necessity, I have spent quite a bit of time examining the Alberta K-6 part of the curriculum. It is based upon four mathematical strands and seven overarching mathematical processes which together provide ample mathematical proficiency. I don't think that wholesale rejigging is needed. I feel that the curriculum as it stands already encourages the growth of mathematical thought along with practice and understanding of the basic numeracy skills. I hope this balance will not be upended by the new curriculum.


Thursday, 7 July 2016

Three simple ordinal number puzzles


There are two fundamentally different ways that we understand numbers, namely as cardinals and ordinals. When we think of a number as magnitude or size, that's a cardinal number. When order and sequence are the things we want, then we are dealing with ordinal numbers.

In a previous post I related how I handle ordinal numbers more easily than ordered sequences of other things (like the English alphabet). I thought it might be interesting to create a couple of puzzles that deal with ordinal numbers.

Here are three puzzles that are intended for children in the elementary grades, but they can be scaled up to higher levels. The first two are quite new (but I recently discovered a version of the first one in Kordemsky's "Moscow Puzzles"). The third puzzle is one from a small set that I made up over 15 years ago.

Each puzzle asks you to rearrange a set of numbered disks (or tiles) so that they are in numerical order. What makes the puzzles a challenge is that there are restrictions on how you can move the disks.


Swap positions 1


For this puzzle you require five disks numbered 1 through 5. Place them in a line in random order:




The puzzle is to rearrange the numbers so that they are in order:







You cannot just rearrange them any way you want.


Rules:

  • You must solve the puzzle in a step by step fashion.
  • For each step you must swap the positions of two numbers.



Here are two possible steps for the puzzle above. (Not saying that these steps are part of the solution.)



You always have to swap two numbers. You are not allowed to squeeze one number between two others like this:




Small Spoiler:  (Use your mouse to select the hidden text in the following box.)

With these rules, the solver may soon figure out that you can swap two numbers that are next to each other and successively move a number up and down the line (thereby inventing an algorithm that solves the puzzle).



Swap Positions 2


Let's adjust the rules and make it just a bit more difficult. And because it is more challenging, try it using four numbers instead of five.


New Rules:

  • You must solve the puzzle in a step by step fashion.
  • For each step you must swap the positions of two numbers.
  • The two numbers must have at least one other number between them.


This is what you can and cannot do under the new rules:





Some other modifications are suggested at the end of this post.




The criss-cross puzzle


For this puzzle you will need a grid of squares forming a cross, and three tiles numbered 1, 2, 3 that will fit in the squares of the grid:






Place the tiles in the grid in the order 3-2-1 from left to right. By pushing the tiles within the grid, rearrange them so that they are ordered 1-2-3 from left to right. The tiles must end up in the position shown.






Rules:

  • The tiles must always move and stay inside the grid.
  • You can push tiles up, down, left, or right as long as you stay inside the grid. You cannot push a tile past, or over, or under another tile.
  • If two or more tiles are touching you can push them together at the same time. 
  • You cannot separate tiles that are touching by pulling them apart in the same line.







This move is OK. Here, the three tiles have together been pushed to the left.







This move is OK. You can push a tile up or down as long as it stays inside the grid.







But this move is Not OK. You are not allowed to pull tiles apart in the same line if they are touching.





You can also try the puzzle with different starting positions:




Note:  Depending upon the student(s), you might want to prepare them for the puzzle by letting them get familiar with moving the tiles. For example, have them figure out how to solve the following puzzle which only asks them to switch the position of the red tile. It's an interesting puzzle in its own right. The rules for moving the tiles are the same.






Possible modifications

Both swap positions puzzles can be solved for any set of numbers, e.g.,



What if we change the rules so that you are only allowed to swap numbers if they have at least two numbers between them? What other modifications could you make for the puzzle?


You can extend the criss-cross puzzle (to include more tiles) by changing the playing board. Here are two possibilities. The rules remain the same.





Computer science students: Can you write a program that solves these puzzles?
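Here is one possible approach for the swap puzzles — a minimal breadth-first search sketch in Python. The `min_gap` parameter is my own device: 0 allows any swap, as in Swap Positions 1; 1 requires at least one number between the swapped pair, as in Swap Positions 2.

```python
from collections import deque

def solve_swap_puzzle(start, min_gap=0):
    """Breadth-first search for a shortest sequence of swaps that sorts
    `start`. A swap of positions i and j is legal when j - i > min_gap,
    so min_gap=0 gives Swap Positions 1 and min_gap=1 gives Swap Positions 2."""
    start, goal = tuple(start), tuple(sorted(start))
    parent = {start: None}              # state -> (previous state, swap made)
    frontier = deque([start])
    while frontier:
        state = frontier.popleft()
        if state == goal:
            swaps = []                  # walk the parent links back to the start
            while parent[state] is not None:
                state, swap = parent[state]
                swaps.append(swap)
            return list(reversed(swaps))
        for i in range(len(state)):
            for j in range(i + 1 + min_gap, len(state)):
                nxt = list(state)
                nxt[i], nxt[j] = nxt[j], nxt[i]
                nxt = tuple(nxt)
                if nxt not in parent:
                    parent[nxt] = (state, (i, j))
                    frontier.append(nxt)
    return None                         # no solution under these rules

print(solve_swap_puzzle((3, 1, 4, 2), min_gap=1))  # a shortest list of (i, j) swaps
```

The criss-cross puzzle can be attacked the same way: represent the board as a tuple of tile positions, generate the legal pushes instead of swaps, and let the breadth-first search do the rest.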