Underlying, Deep, Critical?

Here’s a very reasonable statement, from this book, on techniques used by researchers to investigate conceptual knowledge of arithmetic:

The most commonly used method is to present children with arithmetic problems that are most easily and quickly solved if children have knowledge of the underlying concepts, principles, or relations. For example, if children understand that addition and subtraction are inversely related operations, even when presented with a problem, such as 354297 + 8638298 – 8638298, they should be able to quickly and accurately solve the problem by stating the first number. This approach is typically called the inversion shortcut.

This, though, borders on the problematic (for me at least). Why should ‘underlying’ be a prerequisite for calling something conceptual knowledge, as opposed to plain old knowledge? Even the straightforward addition and subtraction here presumably requires knowing what to do with the numbers and symbols presented in this (likely) novel problem, and thus involves conceptual knowledge of some kind.

Still, it makes some sense to distinguish between knowing how to add and subtract numbers and knowing that adding and then subtracting (or subtracting and then adding) the same number is the same as adding zero (or doing nothing); in symbols, \(\mathtt{a + b - b = a + 0 = a}\). But the following, which appears just a few paragraphs later, doesn’t make much sense to me:

The use of novel problems is important. Novel problems mean that children must spontaneously generate a new problem solving procedure or transfer a known procedure from a conceptually similar but superficially different problem. In this way, there is no possibility that children are using the rote application of a previously learned procedure. Application of such a rotely learned procedure would mean that children are not required to understand the concepts or principles being assessed in order to solve the problem successfully.


The biggest problem is that the concept of ‘conceptual knowledge’ of arithmetic laid out here relies on the fact that the “inversion shortcut” is not typically taught as a procedure. But it seems easily possible to train a group of students on the inversion shortcut and then sneak them into a research lab somewhere. After the experiment, the researcher would likely decide that all of the students had ‘conceptual knowledge’ of arithmetic, even though the subjects would be using the “rote application of a previously learned procedure”—something which contradicts the researcher’s own definition of ‘conceptual knowledge’. On a larger scale, instead of sneaking a group of trained kids into a lab, we could emphasize the concept of inversion in beginning arithmetic instruction in schools. If researchers were not ready for this, it would have the same contradictory effect as the smaller group of trained students. If the researchers were ready for it, then the inversion test would have to be thrown out, as they would be aware that inversion would be more or less learned and, thus (for some reason) not qualify as conceptual knowledge anymore.

Second, why should adding and subtracting the numbers from left to right count as an application of a rote procedure (which does not evidence conceptual knowledge) rather than as a transfer of a known procedure from a conceptually similar but superficially different problem (which does show evidence of conceptual knowledge)? The problem is novel and students would be transferring their knowledge of addition and subtraction procedures to a situation also involving addition and subtraction (conceptually similar) but with different numbers (superficially different).

Clearly I Don’t Get It

I still see the value of knowing the concept of inversion, as described above. A person who notices the numbers above and can solve the problem without calculating (by just stating the first number given) is, most other things being equal, at an advantage compared to someone who can do nothing else but start number crunching (it’s also possible to not notice the equal numbers because you’re tired, not because you lack some as-yet undefined ‘critical thinking’ skill). What constantly perplexes me is why people insist on making something like knowing the inversion shortcut so damned mysterious and awe-inspiring.

You can know how to number crunch. That’s good to know. You can also know how to notice equal numbers and that adding and then subtracting the same value is the same as adding 0. That’s another good thing to know. The latter is probably rarer, but that fact alone doesn’t make it a fundamentally different kind of knowledge from the former. It almost certainly appears more rarely in instruction than calculation directions do, so it should be no surprise that students are generally weaker on it. Let’s work to make it not as rare. A good place to start would be to acknowledge that inversion is not some deep or critical knowledge; it’s just ordinary knowledge that some people don’t know or don’t apply well.

Coda

The section in question concludes:

Other concepts, such as commutativity, that is if a + b = c then b + a = c, have been investigated, but as they have not received as much research attention it is more difficult to draw strong conclusions from them compared to the concepts of inversion and equivalence. Also, concepts, such as commutativity are usually explicitly taught to children so, unlike novel problems, such as inversion and equivalence problems, it is not clear whether children are applying their conceptual knowledge when solving these problems or applying a procedure that they were taught and the conceptual basis of which they may not understand.

But how does testing students on something they haven’t been taught (do not know) show that ‘conceptual knowledge’ is being applied? Where is the knowledge in conceptual knowledge supposed to be coming from? As long as it’s not from the teacher, it must be ‘conceptual’?



Just Some Data

I’ve got nothing much lately. Here’s some data I’ve been playing with from the Department of Ed. It might take a second or two (or ten) to load.

These are data showing school-wide (all grade levels) state-assessment mathematics achievement for over 68,000 schools in the United States for the 2014–15 school year. Each point represents a school, and each school’s location on the plot represents (x) the percent of male students at the school scoring at or above proficient and (y) the percent of female students at the school scoring at or above proficient.

You’ll notice some rectangularity to the data. This is due to the fact that many of the percent-proficient values were given as ranges. For each gender reported, I translated the data to the top value of the range. So, if a school reported 50–54 percent at or above proficient for females and 50–54 percent at or above proficient for males, that school would be placed at (54, 54).
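As a minimal sketch of that translation (my own illustration, assuming the ranged values arrive as strings like “50-54”; the actual Department of Ed file layout may differ):

```python
def top_of_range(value):
    """Translate a percent-proficient value to the top of its range.

    Many values in the raw data are given as ranges like "50-54";
    this returns the upper bound (54). Exact values pass through.
    """
    text = str(value).strip()
    if "-" in text:
        return int(text.split("-")[-1])
    return int(text)

# A school reporting "50-54" for both genders lands at (54, 54).
print(top_of_range("50-54"), top_of_range("50-54"))  # 54 54
```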

Another noticeable feature of the plot is that it doesn’t look at all like there are over 68,000 points represented. This is because many of the values are stacked on top of each other. The lightest shade of blue that is present on the plot is the color of every data point, so if you’re seeing dark blue, you’re likely seeing 4 or 5 schools all at one location.

The data cut straight down the middle—perhaps much closer to the middle than you might expect. So, in general, the scores for males and females on state math assessments are very close. The regression line is \(\mathtt{y = 0.9396x + 3.98204}\), which shows an almost indiscernible advantage for the boys across all these data.

The regression line tells us that, for a male percent proficient or above anywhere from 0% to about 65%, the model predicts a better female performance. From about 66% upward, that prediction is reversed. You can see from the data points that what seems to weigh the line downward is what you might call outlier male–female disparities at the top of the range.
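That crossover is just the fixed point of the regression line, where predicted female performance equals male performance:

\(\mathtt{x = 0.9396x + 3.98204 \implies x = \frac{3.98204}{1 - 0.9396} \approx 65.9}\)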


Building Systems

It’s always fun to build things that allow for (a) generative responses and (b) flexibility in responding. This “in action” video from our upcoming lesson app on systems of equations does those two things.

Students are asked to build a linear system with a given solution (generative), and there are infinitely many ways of doing this (flexibility)!
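Here is a minimal sketch of why the answers are infinite (my own illustration, not the app’s actual code): any two lines with distinct slopes passing through the target point form a valid system.

```python
import random

def build_system(x0, y0):
    """Return two lines y = m*x + b that intersect at (x0, y0).

    Any pair of distinct slopes through the point works, which is
    why there are infinitely many correct systems.
    """
    m1, m2 = random.sample(range(-5, 6), 2)  # two distinct integer slopes
    b1 = y0 - m1 * x0  # each intercept is forced by b = y0 - m*x0
    b2 = y0 - m2 * x0
    return (m1, b1), (m2, b2)

# Example: one of infinitely many systems whose solution is (2, 3).
(m1, b1), (m2, b2) = build_system(2, 3)
print(f"y = {m1}x + {b1}")
print(f"y = {m2}x + {b2}")
```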

We’re always looking for ways to incorporate generativity and flexibility in students’ work, along with the more typical stuff. It helps make the learning a little more interesting without dispensing with the rigor.


Transfer and Forgetting

I discovered a paper recently whose title is probably more interesting than its content: Unstable Memories Create a High-Level Representation that Enables Learning Transfer. Quite a thought—that the instability of memory could be advantageous for transfer.

Researchers conducted two experiments, asking participants in the first experiment to learn a word list and then a motor task and in the second experiment a motor task and then a word list. There were three conditions within each experiment: (1) the word recall and motor task had the same structure (see the supplemental material for how ‘same structure’ was operationalized here), (2) the two tasks had different structures, and (3) the tasks had no determined structure.

It’s Not “Transfer”, It’s Domain Similarities


In the first experiment, participants first learned the word list and then their skill at the motor task was measured over three practice blocks. When the word list and motor task were of the same structure, participants did significantly better across the three motor-task practice blocks. Similarly, in the second experiment, after the motor skill was learned, participants who then practiced the word list with a similar structure to the motor task improved significantly more than participants in the other conditions. This improvement on an unrelated though similarly structured task was measured as transfer, and it occurred in both directions.

Somewhat surprisingly, however, this transfer of learning between word and motor tasks (or motor and word tasks) was correlated with a stronger decrease in performance on the original task, when participants were tested 12 hours later. That is, subjects who learned the word list and then successfully transferred that learning to the motor task (because the tasks were of similar structure) showed a sharper decline in their word list recall than subjects in other conditions. The same results appeared in the experiment where subjects first learned the motor task and then the word list.

At first blush, this seems obvious. The subjects who actually transferred their learning saw their learning on the original task displaced by the similarly structured, and thus interfering, second task. But when researchers inserted a 2-hour interval between the original task and the practice blocks, this decline disappeared—and the transfer learning was no longer present. Thus, it seems that both the similar structure of the two tasks and the instability of the memory for the first task were responsible for the effective transfer learning. The authors put it this way:

By being unstable, a newly acquired memory is susceptible to interference, which can impair its subsequent retention. What function this instability might serve has remained poorly understood. Here we show that (1) a memory must be unstable for learning to transfer to another memory task and (2) the information transferred is of the high-level or abstract properties of a memory task. We find that transfer from a memory task is correlated with its instability and that transfer is prevented when a memory is stabilized. Thus, an unstable memory is in a privileged state: only when unstable can a memory communicate with and transfer knowledge to affect the acquisition of a subsequent memory.

Forgetting, Spacing, and Transfer

This is intriguing. In some sense, it reinforces results related to the spacing effect. Spacing causes forgetting, which creates “unstable memories.” When learning is revisited after a period of forgetting, it finds this unstable memory in a “privileged state”: a state which allows it to strengthen the connections of the original learning.

But the above also suggests that extending learning for transfer to other situations or to other concepts may be done optimally in concert with spaced practice. In other words, the best time for transfer teaching might be after a space allowing for forgetting.


Spacing and The Practice Meter

Without a doubt, students need to practice mathematics thoughtfully. Classroom instruction of any kind is not enough. Practicing not only helps to consolidate learning, but it can be a source of good extended instruction on a topic. And in recent years, research has uncovered—or rather re-uncovered—a very potent way to make that practice effective for long-term learning: spacing.

Dr. Robert Bjork here briefly describes the very long history and robustness of the research on the effectiveness of spacing practice:

It seems that not only is spaced practice more effective than so-called “massed” practice, but spaced learning is more effective than massed learning. A recent study by Chen, Castro-Alonso, Paas, and Sweller, for example, provides some evidence that spaced learning is more effective for long-term retention because it does not deplete working memory resources to the same extent as massed learning.

In one experiment, Chen et al. provided massed and spaced instruction on operations with negative numbers and solving equations with fractions to counterbalanced groups of 82 fourth-grade students (from a primary school in Chengdu, China) in regular classroom settings. In both conditions, students were instructed using three worked example–problem-solving pairs: a worked example was studied and then a problem was attempted, for a total of three pairs (the example and problem were not presented together). In the massed condition, these pairs were given back to back in a single 15-minute session. In the spaced condition, the same 15 minutes was spread out over 3 days.

In both conditions, a working memory test was administered immediately after the final worked example–problem-solving pair. And immediately following the working memory test, students were given a post-test on the material covered in the instruction. In the massed condition, this post-test occurred at the end of Day 1. In the spaced condition, the post-test occurred at the end of Day 4.

Students in the spaced condition scored significantly higher on the post-test than students in the massed condition. And there were some indications that working memory resource depletion had something to do with these results.

In the absence of…stored, previously acquired information, it was assumed that for any given individual, working memory capacity was essentially fixed. Based on the current data, that assumption is untenable. Working memory capacity can be variable depending not just on previous information stored via the information store, the borrowing and reorganizing, and the randomness as genesis principles, but also on working memory resource depletion due to cognitive effort.

Shorter, Smaller Chunks


Taken together, the research on the spacing effect for both practice and instruction suggests that both instruction and practice should happen in shorter, smaller chunks over time rather than packed all together in one session.

As an example of this, here is a video of a module from the lesson app Add and Subtract Negatives. The user runs through this very quickly (and correctly), skipping the video and worked examples on the left side and the student Notes—and a lot of other things that accompany the instructional tool—to demonstrate how the work of this module flows from beginning to end. The Practice Meter is shown in the center of the modules (and instructor notes) on the homepage as a circle with the Guzinta Math logo. If you want to skip most of the video, just forward to the end (2:11) to see how the Practice Meter on the homepage changes after completing a module.

You can see that the Practice Meter fills up to represent the percent of the lesson app a student has worked through (approximately 55% in the video). Although not shown in the video above, hovering over the logo on the homepage reveals this percent. The green color represents a percent from 25 up to 80. Under 25%, the color is red, and at or above 80%, the color is blue.

Whether or not the lesson is used in initial instruction, the Practice Meter fades over time. Specifically, the decay function \(\mathtt{M(t) = C \cdot 0.75^t}\) is used in the first week after either initial instruction or initial practice to calculate the Practice Meter level, where \(\mathtt{C}\) represents the current level and \(\mathtt{t}\) represents the time, in days, since the student last completed a module.

In our example above, during the first week after initial instruction or practice, the student’s Practice Meter level of 55 will drop into the red in about 3 days. If she returns to the app in 15 minutes to see a Practice Meter level of 54 and then raises that up to an 80 by completing the same module again or a different module (100 is max score at any time), then her Practice Meter level will drop to below 25 in about 4 days. If she raises it up to 100, then that will decay to below 25 in a little less than 5 days.
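Those day counts come from solving the decay function for the time at which the meter level falls to 25:

\(\mathtt{C \cdot 0.75^t = 25 \implies t = \log_{0.75}\left(\frac{25}{C}\right)}\), which gives roughly 2.7 days for \(\mathtt{C = 55}\), 4.0 days for \(\mathtt{C = 80}\), and 4.8 days for \(\mathtt{C = 100}\).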

This fairly rapid decay rate applies only to the first week. After Day 7, and up until Day 28, the decay rate changes to \(\mathtt{M(t) = C \cdot 0.825^t}\), whether the student practiced during that time or not. This provides some incentive for spacing out practice a little more over time. Mapping this onto our example above, an initial Practice Meter level of 55 would decay to below 25 in a little over 4 days. A level of 80 would decay to below 25 in a little over 6 days, and a level of 100 would take a little over 7 days to go red.
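Here is a minimal sketch of those two published phases in code (my own illustration; the 28–90 day and post-90-day rates mentioned below aren’t given here, so they’re omitted):

```python
import math

def days_to_red(C, rate):
    """Days for the meter to fall from level C to 25 ("going red"),
    using the decay M(t) = C * rate**t described above."""
    return math.log(25 / C) / math.log(rate)

def meter_color(level):
    """Red under 25, green from 25 up to 80, blue at or above 80."""
    if level < 25:
        return "red"
    return "blue" if level >= 80 else "green"

# First-week rate (0.75): the "about 3, 4, and 5 days" figures above.
print([round(days_to_red(C, 0.75), 1) for C in (55, 80, 100)])   # [2.7, 4.0, 4.8]

# Day 7-28 rate (0.825): the "little over 4, 6, and 7 days" figures.
print([round(days_to_red(C, 0.825), 1) for C in (55, 80, 100)])  # [4.1, 6.0, 7.2]
```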

There are also decay rates for 28–90 days and after 90 days. For more information, see this Practice Meter Info page, which comes with the instructor notes in every lesson app.

(Lack of) Implementation Notes

The design of the Practice Meter is such that, if a student does not use a lesson for spaced practice, they will feel no interruption in their use of it. And it is important to implement the meter in a way that does not create extra responsibilities for students unless their teacher requires them. But if students and parents or students and their teachers do want to implement spaced practice, it can be easy to check in on the Practice Meter every so often, asking students to, say, keep their Practice Meter levels above 25 or above 80—perhaps differentiating for some students to start—at regular check-in intervals.

As always, though, implementing shorter and smaller in both instruction and practice is much more difficult than reading about it in research, especially when current practice or one’s institutional culture may be focused on “more” and more massed instruction and practice. But conclusions about spacing drawn from research are not regal edicts. We can keep them in mind as ideas for better practice and work to implement the ideas in the small ways we can—and then eventually in big ways.

Update: The Learning Scientists’ Podcast features a brief discussion of lagged homework, which definitely connects to what I discuss above. Henri wrote up something about it a few years ago.


Welcome to Guzinta Math’s Blog

Okay, all set. Welcome, welcome, welcome to the Guzinta Math blog! Here you’ll see things about what we’re developing and thinking about at Guzinta Math, along with content we can’t yet predict.

For the past two years, and likely for the next year or so, we have been and will continue to be mostly heads-down developing supplemental math lesson apps for middle school—6th, 7th, and 8th grades. As of this writing, we have fifteen Grade 6 lesson apps, seven Grade 7 lesson apps, and one Grade 8 lesson app. All of our math lesson apps are FREE Chrome apps, which can be found and installed on the Chrome Web Store or on our website. The Grade 7 and 8 lesson apps are compatible with Chrome OS (Chromebooks) only, and the Grade 6 apps are currently compatible with all operating systems (though that will stop being the case in early 2018). We have plans down the road to make sure that all of our apps work (in Chrome) with all operating systems. Stay tuned.

How Do the Apps Work?

Each lesson app contains 3 modules and instructor notes, which can be downloaded as a PDF. Each module contains video instruction, text or animated worked examples, and/or interactive instructional tools, along with short-response questions for students. Below, for example, is a short “in action” video from the first module of the Grade 6 Equations and Inequalities lesson app. It features Sal the Lizard, an interactive instructional game for teaching students about inequalities and graphing inequalities. Not shown here is that students are given audio directions for the interactive when they click on it.

The lesson app Equations and Inequalities is a bit of an outlier, because it is heavy on interactives. There is one in each module of that lesson app, actually. This is usually not the case. But the above layout is what you see in pretty much every module of every lesson app—interactives, videos, and worked examples on the left side and short-response questions on the right. Both sides are separately scrollable, and both sides also automatically scroll as students answer questions. You can see this in the “in action” video below, from the Grade 6 Quadrants in the Plane lesson app.

This is also a bit of an outlier, in that both sides scrolled together after just one question. That happens in the Quadrants in the Plane lesson app for a few questions, because we need the full right side to allow students to plot points to complete the problem. In most cases, a worked example, video, or interactive is used for a handful of questions on the right. These scroll automatically as students answer them, and then the left side scrolls automatically once the lesson is ready to move on to the next example. You can see that when you answer incorrectly, a red X is shown for a moment. When you answer correctly, a yellow star appears and a “ding” plays. No penalty applies to answering questions incorrectly. The red X is not accompanied by any sound at all.

We’ll come back to talk about features in a different post. But one thing that is visible in the video above is the drawing canvas, which is powered up by clicking on the left sidebar—the sidebar is present in all three modules of every lesson app. Turning on the drawing canvas allows teachers and students to draw all over the top of the screen—to highlight information, write notes, etc. You can choose four different colors: red, green, black, and blue.

When students complete all the items in a module correctly, an “applause” is played and a certificate of completion is displayed, which can be downloaded by a simple click. Below is an example of a certificate from the Grade 7 Proportion Equations lesson app.


More to Come

Okay, we’ve gone on with this welcome for far too long already. Thanks for sticking it out, and watch this space in the future for longer-form updates on our lesson apps, their features, and middle school math in general.