A front-page article in the Herald-Leader described mathematicians who used calculators loaded with a software program called Zoom Math to take the ACT Compass, a test used for college placement. The mathematicians said they were conducting an experiment about Zoom Math.
I think they had an experience, not an experiment.
Using Zoom Math, they answered only the items on the test where the software program was applicable. On the remaining items, they marked the first answer.
Essentially by getting all of the Zoom Math items correct, they scored above an arbitrary cut-point labeled college readiness. I guess they are college ready.
They concluded "about 55 percent of the COMPASS problems can be done without thought using Zoom Math."
In another report, they said "Our experiment demonstrates that essentially any student who has a calculator equipped with Zoom Math and who has been trained to use it can become college ready in mathematics in Kentucky and will be guaranteed placement into a college-level math class at any public college or university in our state."
It should be obvious what they really want: Don't let students use Zoom Math because "any student, without thought" can be labeled "college ready" simply by knowing how to manipulate the software.
But how does one get from college professors taking a test with a piece of software on a calculator to conclusions about not needing any thought to answer questions? And who is "any student"? What does it mean to do something without thought? And how is the mathematics knowledge of mathematics professors comparable to that of the so-called any student? Their "experiment" is about them, not about students.
Their conclusions, then, are nonsense. And, their experience demonstrates nothing about thought or students.
Actually, they wasted a lot of time getting those results. They could have gotten a very good estimate of what they found merely by scoring the test as if all the "calculator software" items were answered correctly, then adding one-fifth of the other items (assuming the correct answers are distributed at random among five choices).
It would have saved time and made more sense. One would then know directly the influence of certain items on a total test score.
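That back-of-the-envelope estimate can be written out directly. The sketch below assumes the article's figure of 55 percent software-solvable items and five answer choices per item; both numbers are illustrative, not taken from the test itself.

```python
# Estimate the expected fraction of items answered correctly when every
# software-solvable item is correct and the rest are effectively guesses.
zoom_fraction = 0.55               # assumed share of items the software handles
guess_fraction = 1 - zoom_fraction # remaining items, answered blindly
choices = 5                        # assumed answer choices per item

# Expected score: all software items right, plus one-fifth of the rest.
expected = zoom_fraction + guess_fraction / choices
print(f"Expected fraction correct: {expected:.2f}")  # prints 0.64
```

Under these assumptions, the expected score follows immediately, with no test-taking required, which is the column's point: the result was predictable from arithmetic alone.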
There is a body of research about the effects of the use of calculators and software on performance on tests. It is true that under certain conditions, particular students using specific software have an advantage. There are many other settings, however, where there is no advantage.
What is the case for Zoom Math? This experience provides no information about that question, with the possible exception that mathematicians can use the software.
Perhaps, it would be appropriate for these mathematicians to review the research on calculator and calculator software use, not only for the findings but also to learn how to conduct experiments that yield applicable results and valid conclusions.
It is particularly difficult, perhaps impossible, to conceive of a valid study of how students use calculators and software that does not first include students.