Waltham, Northeast Elementary School, Grade 3
Purpose and context:
The purpose of this five-week trial in a class of 23 third grade students was to investigate how technology that allows students to draw and use representational tools can support their learning of multiplication and division concepts. The school in which we worked has been using the Math in Focus (MiF) curriculum for three years, after many years of using Trailblazers. Students’ math reasoning (and their test scores) had improved significantly since the switch to MiF, but the math coach still felt that students would benefit from using our software, as MiF provides limited opportunities for students to create their own representations, relying instead on asking them to label pre-created representations.
Students used CLP in the context of a MiF unit called “Multiplication Tables for 6, 7, 8, and 9.” With the support of the classroom teacher and math coach, we added instruction on division with those numbers. Porting the unit to the tablets involved significant modification of the curriculum to add opportunities for students to use our tools to create their own mathematical representations. Instead of focusing on multiplication tables, algorithms, and division as “turning around multiplication facts,” as was the case in MiF, we created lessons that taught the underlying concepts of multiplication and division using a grouping model and illustrated these concepts with the CLP tools of stamps, arrays, and number lines. We encouraged students to make their own choices about how, whether, and when to use the tools. Students did the vast majority of their work on tablets during this five-week period, although they still did homework and occasional classwork on paper.
Data collected:
As formal assessments, we used the pre- and post-tests that came with MiF. We collected test results from all three third grade classes in the school, with the other two serving as controls for the class that used CLP and our modified curriculum. Students in our experimental class were allowed to do the post-test assessment using CLP on the tablet. We also collected observation notes for each day of the trial; student electronic math notebooks, which include re-playable interaction histories as well as final representations for each page; pre-unit, mid-unit, and post-unit interviews with six students, and post-unit interviews with two additional students; and some work done on paper for homework and during class.
Data analysis:
One of the advantages of using CLP for student work is that all of a student’s actions using the software are logged, so that we are able to analyze not only the final answer but also the student’s process of arriving at it. In order to analyze student work in a consistent and reproducible way, we developed a coding manual, using a grounded theory approach and several rounds of coding and re-coding a representative subset of work. Codes are in two categories: History codes reflect the sequence of steps that students took in solving the problem, primarily creating and manipulating representations. Analysis codes reflect our interpretation of the sequence of student actions, e.g., whether a representation was a correct one for a particular problem or whether a student had made a common error, such as skip-counting by the wrong dimension on an array.
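To make the two categories concrete, the sketch below shows one way a logged page history could be mapped to history codes and then interpreted with an analysis code. The event names, code labels, and fields here are illustrative placeholders, not CLP’s actual log format or coding manual.

```python
# A minimal sketch under a hypothetical log format; the event names, history
# code labels ("ARR", "STAMP", "NL", "ANS"), and analysis code
# ("ARR_WRONG_DIMS") are illustrations, not CLP's actual scheme.
from dataclasses import dataclass, field

@dataclass
class HistoryEvent:
    action: str                                  # e.g. "create_array", "fill_in_answer"
    details: dict = field(default_factory=dict)

CODE_FOR_ACTION = {"create_array": "ARR", "create_stamp": "STAMP",
                   "create_number_line": "NL", "fill_in_answer": "ANS"}

def history_codes(events: list[HistoryEvent]) -> list[str]:
    """Map raw logged actions to the ordered sequence of history codes."""
    return [CODE_FOR_ACTION[e.action] for e in events if e.action in CODE_FOR_ACTION]

def analysis_codes(events: list[HistoryEvent], factors: set[int]) -> list[str]:
    """Interpret the sequence, e.g. flag an array whose dimensions don't fit the problem."""
    codes = []
    for e in events:
        if e.action == "create_array":
            dims = {e.details.get("rows"), e.details.get("cols")}
            if dims != factors:
                codes.append("ARR_WRONG_DIMS")   # array doesn't match the problem's factors
    return codes

events = [HistoryEvent("fill_in_answer", {"answer": 7}),
          HistoryEvent("create_array", {"rows": 7, "cols": 8})]
print(history_codes(events))                     # ['ANS', 'ARR']
print(analysis_codes(events, factors={7, 8}))    # [] (array matches 56 ÷ 8)
```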
Our team of four coders (including the two PIs) has coded all of the post-tests for the experimental classroom and is in the process of testing inter-rater reliability. We are also developing scripts for analyzing the history and analysis codes. Our eventual goal is to do much of this analysis automatically, with CLP generating as many of the history and analysis codes as possible. If CLP can take over much of the coding labor, we will be able to analyze the rest of the students’ work from the trial, with the potential of learning a great deal about how students’ use of representations changed over the course of the trial.
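As one illustration of what the reliability check involves, the sketch below computes Cohen’s kappa for two coders’ labels on the same set of pages. The code labels are hypothetical, and the choice of kappa here is for illustration only, not a claim about the statistic we will ultimately report.

```python
# A sketch of checking agreement between two coders with Cohen's kappa;
# the code labels below are hypothetical examples.
from collections import Counter

def cohen_kappa(coder_a: list[str], coder_b: list[str]) -> float:
    """Cohen's kappa for two coders labeling the same pages (one label per page)."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# e.g. the primary representation each coder recorded for five post-test pages
print(cohen_kappa(["ARR", "NL", "ARR", "STAMP", "ARR"],
                  ["ARR", "NL", "ARR", "ARR", "ARR"]))   # ≈ 0.58
```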
We highlight here two preliminary findings and then include several other examples of questions we are pursuing with these data.
We were surprised to discover that sometimes students recorded their answer (more often than not the correct answer), then created a representation that would have enabled them to figure out the answer. Sometimes they even needed the answer to create the representation; for example, for the problem 56÷8, some students would enter the answer “7,” then create a 7×8 array. We have created codes for both of these situations so that we can see how prevalent they are, whether they occur more for some kinds of problems than others, and whether some students tend to carry out this sequence of actions more than others. We also have the unanswered question: Why did these students, who seemed to know the answer to a problem without the use of a representation, still create a representation (and sometimes more than one)? We have several hypotheses, including: 1) Some students found it fun to use the computer tools, even if they didn’t have to use them to get an answer; 2) Using the computer tools was a way to check an answer; 3) There was a growing “norm” in the class that creating a representation was a good thing, appreciated by the teacher, the math coach, and the software designers. Our ongoing analysis will attempt to distinguish among these, if possible, by examining other details of the situations in which students created representations after filling in answers.
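Because these sequences are visible in the coded histories, detecting the pattern reduces to a simple ordering check. The sketch below shows one way to flag pages on which an answer was filled in before any representation was created; the code labels follow the hypothetical scheme sketched above rather than CLP’s actual codes.

```python
# A sketch of flagging the "answer first, representation after" pattern from a
# page's ordered history codes; labels follow the hypothetical scheme above.
REPRESENTATION_CODES = {"ARR", "STAMP", "NL"}

def answer_before_representation(history: list[str]) -> bool:
    """True if an answer was filled in before any representation was created."""
    if "ANS" not in history or not REPRESENTATION_CODES & set(history):
        return False
    first_answer = history.index("ANS")
    first_rep = min(i for i, c in enumerate(history) if c in REPRESENTATION_CODES)
    return first_answer < first_rep

# e.g. for 56 ÷ 8: the answer "7" entered first, then a 7 × 8 array created
print(answer_before_representation(["ANS", "ARR"]))   # True
```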
Another finding that is emerging from our analyses is that individual students appear to have preferences for particular modes of representational use. Some students, for example, used almost no representations, while others (almost all girls) frequently used stamps, which afforded them the opportunity to draw. Some students used number lines as a default for division problems, but at least one student would not use a number line for a partitive division problem, since the representation suggests a known group size rather than a “dealing out” scenario. The codes we have developed will allow us to report how often each student used each representational tool and for what kinds of problems.
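The sketch below illustrates the kind of tally involved in that report, counting each student’s use of each tool by operation. The page fields and code labels are again hypothetical placeholders for the coded data, not CLP’s schema.

```python
# A sketch of tallying representational tool use per student and operation;
# the page fields and code labels are hypothetical, not CLP's schema.
from collections import defaultdict

def tool_usage(coded_pages: list[dict]) -> dict:
    """Count, per student, how often each tool appears, split by operation."""
    counts = defaultdict(lambda: defaultdict(int))
    for page in coded_pages:
        for code in page["history"]:
            if code in {"ARR", "STAMP", "NL"}:
                counts[page["student"]][(code, page["operation"])] += 1
    return counts

pages = [{"student": "S07", "operation": "division",       "history": ["NL", "ANS"]},
         {"student": "S07", "operation": "multiplication", "history": ["ANS", "ARR"]}]
for student, tally in tool_usage(pages).items():
    print(student, dict(tally))   # S07 {('NL', 'division'): 1, ('ARR', 'multiplication'): 1}
```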
Other questions we are investigating with the coded student work include:
Did students use number lines more often for division than for multiplication problems? Did some students use number lines exclusively for one or the other operation?
How often and in what circumstances did students use CLP’s number line feedback (i.e., what size arc they had drawn) to modify their jumps? Preliminary analysis suggests that this happened quite frequently; further analysis will be necessary to determine whether it was due to students having trouble with the precision of the tablet pen or whether they were unsure of an arc’s ending number.
Which representations did students try to use and then delete? Can we interpret these as students not being comfortable with particular representations? Can we tell what aspect of the representation was problematic for students?
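As a first pass on this last question, the sketch below flags tools that appear in a page’s history and are later deleted. The “DEL_” prefix convention is a hypothetical stand-in for however deletions end up being coded, and the labels again follow the illustrative scheme above.

```python
# A sketch of flagging representations created and then deleted on a page;
# the "DEL_" prefix convention is hypothetical, not CLP's actual coding.
def deleted_representations(history: list[str]) -> list[str]:
    """Return the tools a student created and later deleted on a page."""
    deleted = []
    for i, code in enumerate(history):
        if code.startswith("DEL_"):
            tool = code[len("DEL_"):]
            if tool in history[:i]:          # the tool had been created earlier on the page
                deleted.append(tool)
    return deleted

print(deleted_representations(["NL", "DEL_NL", "ARR", "ANS"]))   # ['NL']
```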