== The Effect of Context Personalization on Problem Solving in Algebra ==
''Candace Walkington (DiBiano), Anthony Petrosino, Jim Greeno, and Milan Sherman''
  
 
=== Summary Tables ===

{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"
| '''PIs''' || Candace Walkington & Anthony Petrosino
|-
| '''Other Contributors''' ||
* Graduate Student: Milan Sherman
* Staff: Jim Greeno
|}

<br>
'' Pilot Study ''

{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"
| '''Study Start Date''' || September 2008
|-
| '''Study End Date''' || May 2009
|-
| '''Study Site''' || Austin, TX
|-
| '''Number of Students''' || ''N'' = 24
|-
| '''Average # of hours per participant''' || 2 hrs.
|}

<br>
  
'' In Vivo Study ''

{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"
| '''Study Start Date''' || October 2009
|-
| '''Study End Date''' || April 2010
|-
| '''LearnLab Site''' || Hopewell High
|-
| '''LearnLab Course''' || Algebra
|-
| '''Number of Students''' || ''N'' = 111
|-
| '''Average # of hours per participant''' || 3 hrs.
|-
| '''Data in DataShop''' || Yes - Personalization Hopewell 2010
|}

<br>
  
 
=== Abstract ===

In the original development of the PUMP Algebra Tutor (PAT), teachers had designed the algebra problem scenarios to be "culturally and personally relevant to students" (Koedinger, 2001). However, observations and discussions with teachers have suggested that some of the Cognitive Tutor problem scenarios may be disconnected from the lives and experiences of many students. This study investigated whether students’ personal interest in story contexts affects performance and [[robust learning]].

The first stage of this research was a pilot study of the personal interests of students at an urban Texas high school. Freshman algebra students were surveyed and interviewed about their out-of-school interests, and were also asked to describe how they use mathematics in their everyday lives. Twenty-four of these students solved a number of Cognitive Tutor Algebra-style problems while thinking aloud. Results of this pilot study were used to critically examine the idea that personalization of story problems has the potential to support student learning, using qualitative data analysis methods.

The second stage of this research was an “in vivo” study that took place in Fall of 2009 at a Pennsylvania Learnlab site. Based on the results of the pilot study and additional student surveys from Pennsylvania, the 27 problems in Section 5 ("Linear Models and Independent Variables") of the [[cognitive tutor|Cognitive Tutor]] software were rewritten to each have 4 “personalized” versions corresponding to different student interests. The [[cognitive tutor|Cognitive Tutor]] software was programmed to give participating students an initial interests survey, and then select problem scenarios that match their interests. The resulting [[robust learning]], measured by a delayed post-test (measuring long-term retention) and mastery of knowledge components in a future section (measuring transfer), has been analyzed with a 2-group design (experimental vs. control) to measure the effect of [[personalization]] on learning. Measures from within Section 5 were also analyzed to measure the effect of personalization on performance.
  
 
=== Background and Significance ===

This research direction was initiated by the observation of classrooms in Texas using the [[cognitive tutor|Cognitive Tutor]] Algebra I software, as well as discussions with teachers who had implemented this software at some point in their teaching practice. Teachers explained that their urban students found problems about harvesting wheat “silly,” “dry,” and irrelevant. Teachers also complained that some of the vocabulary words in the [[cognitive tutor|Cognitive Tutor]] problem scenarios (one example was the word "greenhouse") confused their students, because urban freshmen do not typically discuss these topics in their everyday speech. A review of the literature showed limited evidence for the potential of relevant story contexts to increase learning, and little research had been done at the secondary school level. This study was designed to empirically test the claim that the personal relevance of story problems affects [[robust learning]] and performance.
=== Theoretical Framework ===

This study is situated in the new “Motivation and Metacognition” thrust. The foundation of this study is that the relevance of problem scenarios affects robust learning through increased intrinsic motivation (Cordova & Lepper, 1996). For learners who have the cognitive capacity to solve algebra story problems, enhancing motivation may increase the likelihood that they exert effort to make sense of the scenarios by forming more elaborated and better-connected situation and problem models (Nathan, Kintsch, & Young, 1992), thus encouraging generative processing (Mayer, 2011). Mayer (2011) states the personalization principle as “People learn better when the instructor uses conversational style rather than formal style” (p. 70). Here, we use the PSLC’s modified version of this principle, which states “Matching up the features of an instructional component with students' personal interests, experiences, or typical patterns of language use, will lead to more robust learning through increased motivation, compared to when instruction is not personalized.” This is related to what Mayer (2011) refers to as the “Anchoring” principle.

The construct through which personalization enhances intrinsic motivation is increased personal interest (also called individual interest). Personal interest refers to stable, enduring preferences that individual learners bring with them to different situations (Anderman & Anderman, 2010). Interest promotes more effective processing of information and greater cognitive engagement. Students who have high interest may be more likely to relate new knowledge to prior knowledge and form more connections between ideas. They also may be more likely to generate inferences, examples, and applications relating to the subject area they are trying to learn (Ormrod, 2008).
=== Pilot Study ===

The first stage of this research began in Fall of 2008 with a pilot study of personalization at an "Academically Unacceptable" school in Texas (75% free/reduced lunch). Twenty-four freshman algebra students were interviewed about their out-of-school interests, such as sports, music, and movies, and were also asked to describe how they use mathematics in their everyday lives. These interviews were audio recorded, and were used to write “personalized” algebra story problems for each student. The research questions investigated were:

* What is the impact of personalizing algebra story problems to individual student experiences, in terms of strategy use, language comprehension, and students’ epistemological frames about mathematical activity? (qualitative)
* How does personalizing algebra story problems to individual experiences impact student performance, compared to performance on normal story problems from the Cognitive Tutor curriculum with the same underlying structure? (quantitative)

A problem set containing five algebra problems on linear functions was written for each student; two of these were story problems personalized to the ways in which the individual student had described using mathematics in their everyday life during the initial interview. The problem set also contained normal story problems from the Cognitive Tutor curriculum, completely abstract symbolic equations, story problems that contained symbolic equations, and story problems with simplified language and general referents (“generic” story problems). Each problem had four parts: the first two parts were “Result Unknowns” or “concrete cases” (i.e., solve for y given x), and the fourth and final part was a “Start Unknown” (i.e., solve for x given y). For normal, personalized, and generic problems, the third part asked students to write a general symbolic equation or “algebra rule” representing the story. For normal story problems that already contained equations, students were asked to interpret the parameters in terms of the story. For completely abstract symbolic problems, students were asked to write a story that could go with the equation.
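
For concreteness, the “result unknown” and “start unknown” part types can be sketched with a linear function of the kind underlying these items (the equation y = 4x + 11 is one of the item structures mentioned below; this sketch is illustrative, not code from the study):

```python
# Illustrative sketch of the two "unknown" part types for the item y = 4x + 11.
# The function and the input values below are examples, not the study's items.

def result_unknown(x):
    """Result unknown ("concrete case"): given x, compute y = 4x + 11."""
    return 4 * x + 11

def start_unknown(y):
    """Start unknown: given y, solve 4x + 11 = y for x."""
    return (y - 11) / 4

print(result_unknown(3))   # y when x = 3 -> 23
print(start_unknown(51))   # x when y = 51 -> 10.0
```

The third part type (writing an “algebra rule”) corresponds to producing the symbolic form y = 4x + 11 itself from the story.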

Each of the 24 students was given their problem set of 5 problems, and asked to solve each problem while “thinking aloud” and being audio recorded. Transcripts and student work were blocked such that one block was one student working one part of one problem. Blocks were coded for strategies, mistakes, and other issues the students had solving story problems (such as reading issues); kappa values of 0.79 or higher were obtained using two coders.
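
The agreement statistic reported for the two coders is presumably Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch of the computation, on invented codes rather than the study's data:

```python
from collections import Counter

def cohens_kappa(codes1, codes2):
    """Cohen's kappa for two raters assigning one code per block."""
    assert len(codes1) == len(codes2)
    n = len(codes1)
    # Observed agreement: fraction of blocks where the raters agree
    observed = sum(a == b for a, b in zip(codes1, codes2)) / n
    # Expected agreement if the raters coded independently at their base rates
    freq1, freq2 = Counter(codes1), Counter(codes2)
    expected = sum(freq1[c] * freq2.get(c, 0) for c in freq1) / n ** 2
    return (observed - expected) / (1 - expected)

# Two hypothetical raters coding ten blocks with made-up strategy labels
r1 = ["informal", "formal", "informal", "informal", "none",
      "formal", "informal", "formal", "informal", "informal"]
r2 = ["informal", "formal", "informal", "formal", "none",
      "formal", "informal", "formal", "informal", "informal"]
print(round(cohens_kappa(r1, r2), 2))  # -> 0.82
```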

Results showed that students regularly used informal, arithmetic approaches to solve result- and start-unknown story problems, especially when the problem had been personalized. Personalized problems had the lowest “No Response” rate (1%), the highest use of informal strategies (80% of the time), and students overwhelmingly perceived personalized problems as the “easiest” when asked (82% of the time). Personalized problems also had higher success rates and lower student use of “non-coordinative” strategies, where situational reasoning was not well connected to formal problem-solving computations. When asked why they were given story problems in algebra class, students described how these problems would help them in the real world and in the workplace.

However, personalized problems still had a relatively high overall use of non-coordinative approaches (16% of the time), and students also struggled with reading on personalized problems at rates similar to other problems (also 16% of the time, with some overlap). Students’ overwhelming use of informal strategies when solving personalized problems could be framed as problematic in a course whose overall goal is to have students use symbolic equations as representational tools. Finally, there was evidence that students still sometimes epistemologically framed personalized problems as “school mathematics” tasks, disconnected from their lived experiences.

Quantitative analyses aimed specifically at comparing performance on personalized story problems versus normal story problems were carried out, replicating the methodology of Koedinger & Nathan (2004). Students solved personalized problems correctly 61% of the time overall, and solved normal story problems correctly 45% of the time overall. However, using two 2-factor mixed-model ANOVAs that treated students (ANOVA 1) and items (ANOVA 2) as random effects, no statistically reliable overall differences in performance were found between normal and personalized problems. “Items” in this case described the underlying mathematical structure of the story problem – i.e., the story described the equation “y = 4x + 11.” The two ANOVAs were repeated using only the hardest items, and using only the weakest students, and statistically reliable (p < .05), positive effects were found for personalization. The effect size (Cohen’s d) for the hardest problems was 0.9, and for the weakest students was 1.5.
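
The effect sizes above are Cohen's d, the difference in group means scaled by the pooled standard deviation. A small sketch of the computation with invented per-student scores (not the study's data):

```python
from statistics import mean, stdev
from math import sqrt

def cohens_d(group_a, group_b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / sqrt(pooled_var)

# Hypothetical per-student success rates on personalized vs. normal problems
personalized = [0.8, 0.6, 0.7, 0.9, 0.5]
normal       = [0.5, 0.4, 0.6, 0.5, 0.3]
print(round(cohens_d(personalized, normal), 2))  # -> 1.74
```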

These results need to be interpreted with caution, as the sample size was small (24 students), the personalization was done at a level of correspondence to real experiences that a computer could not replicate, and this was a population of students who were, overall, especially weak in mathematics.
 
   
 
   
=== Research Questions for In Vivo Study ===

* How will performance and time on task be affected when [[personalization]] through relevant problem scenarios is implemented instead of the current problem scenarios in the [[cognitive tutor|Cognitive Tutor]] Algebra I software?
* How will [[robust learning]] be affected when [[personalization]] through relevant problem scenarios is implemented instead of the current problem scenarios in the [[cognitive tutor|Cognitive Tutor]] Algebra I software?
  
=== Independent Variables for In Vivo Study ===

This experiment manipulated the level of [[personalization]] through a two-group design:
* Control: Students who received the current Cognitive Tutor Algebra story problems for Unit 5
* Experimental: Students who received problems with the same mathematical structure, but whose cover stories were personalized to individual students based on an interests survey
 
<BR>

{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"
| '''Treatment''' || '''Example Problem''' || '''Received By'''
|-
| Normal Cognitive Tutor Algebra problem scenarios || A skier noticed that she can complete a run in about 30 minutes.  A run consists of riding the ski lift up the hill, and skiing back down.  If she skis for 3 hours, how many runs will she have completed? || 54 randomly-assigned Algebra I students at Learnlab site
|-
| [[personalization|Personalized]] problem scenarios || (student selects personal interest in T.V. shows; survey/interview shows strong interest among urban youth in reality shows)

You noticed that the reality shows you watch on T.V. are all 30 minutes long.  If you’ve been watching reality shows for 3 hours, how many have you watched?
|| 57 randomly-assigned Algebra I students at Learnlab site
|}

<BR>
=== Dependent Variables for In Vivo Study ===

[[Robust learning]] was measured through:
* '''''Delayed Post-test''''' measuring [[long-term retention]]
** A pre-test was administered before Unit 5, and a delayed post-test was administered at the end of Unit 6.
* '''''Mastery of knowledge components''''' in the [[cognitive tutor|Cognitive Tutor]] software, including in subsequent units:
** The students’ performance in Unit 7 was also examined, to see if there were performance differences between the experimental and control groups even after the treatment was no longer in effect.

'''Intrinsic Motivation''' was measured through:
* Hint-seeking and reading behavior in the Cognitive Tutor software
* Time on task in the Cognitive Tutor software
* A questionnaire asking how interesting and fun students found the problems in the affected unit
 
  
=== Hypotheses for In Vivo Study ===

Students in the treatment with [[personalization|personalized]] problem scenarios will:

H1) Demonstrate higher levels of correct performance in Section 5

H2) Show improved “time on task” and fewer instances of “gaming the system” in Section 5

H3) Show improvement on some measures of [[robust learning]], as measured by pre/delayed-post differences and by performance in subsequent sections.
  
=== Method for In Vivo Study ===

Interest surveys were administered to algebra students in Pennsylvania (''N'' = 47) and algebra students in Texas (''N'' = 29). The surveys contained sections where students ranked their interest in 9 different topics and answered 20 open-response questions about specific topics they were interested in. The algebra students in Texas also participated in one-on-one interviews about their out-of-school interests (part of the pilot study). Based on the results of the surveys and interviews, personally relevant problem scenarios corresponding to current problem scenarios in [[cognitive tutor|Cognitive Tutor]] Algebra I were formulated for Section 5, Linear Models and Independent Variables. 27 problem scenarios from the selected section were rewritten to have 4 different variations each, corresponding to the 9 different topics students were interested in (sports, music, movies, computers, stores, food, art, TV, games). The personally relevant problems had the same underlying mathematical structure as the original problems, with changes made to the objects or nouns (what the problem is about) in the story and the pronouns (who the problem is about). See the table above for an example of how these changes occurred. The personally relevant problem scenarios were reviewed by two master Algebra I teachers for language and clarity, and were modified based on teacher feedback.

The new problem scenarios were integrated into Unit 5 of the [[cognitive tutor|Cognitive Tutor]] Algebra software at the high school site with the cooperation of Carnegie Learning. 111 students at the school site were randomly assigned to either the experimental group (personalized problems) or the control group (normal problems). The experiment was in-sequence, meaning that all students encountered Section 5 at their own pace (i.e., at the time they naturally reached that point in the software). Immediately before students entered Unit 5, they were prompted to answer an interest survey where they ranked their level of interest in the 9 different topics, and took a pre-test where they solved two multi-part normal story problems. After the students completed Unit 6, they were given a delayed post-test.
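
The exact selection logic used by the tutor is not detailed here; a hypothetical sketch of how one of a problem's 4 personalized versions might be chosen from a student's interest rankings (the topic names come from the survey, but the selection rule and story texts are invented for illustration):

```python
# Hypothetical sketch: pick which of a problem's 4 personalized versions a
# student sees, based on their ranked interest in the 9 survey topics.
# Topic names are from the survey; the selection rule itself is an assumption.

TOPICS = ["sports", "music", "movies", "computers", "stores",
          "food", "art", "TV", "games"]

def choose_version(problem_versions, interest_rankings):
    """problem_versions: dict topic -> story text (4 of the 9 topics).
    interest_rankings: dict topic -> rank (1 = most interested).
    Returns the story for the highest-ranked topic that has a version."""
    available = sorted(problem_versions, key=lambda t: interest_rankings[t])
    return problem_versions[available[0]]

# Invented example: a problem with 4 versions, and a student who ranked TV first
versions = {
    "sports": "You run a lap in 30 minutes...",
    "TV": "The reality shows you watch are 30 minutes long...",
    "music": "Each album you listen to lasts 30 minutes...",
    "games": "Each round of your favorite game takes 30 minutes...",
}
rankings = {t: r for r, t in enumerate(
    ["TV", "games", "music", "sports", "movies",
     "computers", "stores", "food", "art"], start=1)}
print(choose_version(versions, rankings))  # -> the TV version
```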

=== Results ===

H1) Students receiving personalized problems will demonstrate higher levels of performance in Unit 5 than students receiving normal problems.

In order to test this hypothesis, a logistic regression model was formulated with the following properties. The unit of analysis was one student solving one part of one problem.

* Dependent Variable – whether the student got the problem part correct on their first attempt, without asking for a hint.
* Random Effects – the student ID, the item (the linear function underlying the problem), and the problem name (which personalized version the student was given, or which set of numbers the student was given for result and start unknowns)
* Fixed Effects – condition (whether the student was in the experimental or control group) and which knowledge component was covered by the problem part

Each of these effects significantly improved the model. Interactions did not significantly improve the model. The main effect for the treatment (personalization) was statistically significant at the 5% level: personalization had a positive overall effect on student performance. The size of the overall impact of personalization on performance was around 5.3 percentage points; if a student had a 50% base chance of getting a problem part correct on the first attempt, personalization would increase that chance to 55.3%.
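
The 50% → 55.3% figure can be recovered by applying the treatment's log-odds coefficient to the base probability; a sketch of the conversion (the coefficient value below is back-calculated from the reported percentages, not taken from the model output):

```python
from math import exp, log

def shift_probability(base_p, coef):
    """Apply a logistic-regression coefficient (log-odds) to a base probability."""
    log_odds = log(base_p / (1 - base_p)) + coef
    return 1 / (1 + exp(-log_odds))

# Coefficient implied by the reported 50% -> 55.3% shift:
# logit(0.553) - logit(0.5) is roughly 0.213
beta_personalization = 0.213
print(round(shift_probability(0.50, beta_personalization), 3))  # -> 0.553
```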

Although interaction terms were not significant in this model, this seemed to be a combination of a lack of statistical power and the addition of many parameters when interactions were modeled. Thus a second model was specified in which the knowledge components were classified as easy, medium, and hard, and here there was a significant condition-by-knowledge-component interaction. Personalization had a significantly larger, positive impact on the two most difficult knowledge components, which related to writing symbolic expressions, compared to the medium-difficulty knowledge components. For the most difficult knowledge components, personalization increased success rates from 50% to 58%.

More results coming soon.
 
  
 
=== References ===

Anderman, E., & Anderman, L. (2010). Classroom Motivation. Columbus, OH: Pearson.

Clark, R. C., & Mayer, R. E. (2003). E-Learning and the Science of Instruction. Jossey-Bass/Pfeiffer.

Cordova, D. I., & Lepper, M. R. (1996). Intrinsic motivation and the process of learning: Beneficial effects of contextualization, personalization, and choice. Journal of Educational Psychology, 88(4), 715-730.

Koedinger, K. R., & Nathan, M. J. (2004). The real story behind story problems: Effects of representations on quantitative reasoning. The Journal of the Learning Sciences, 13(2), 129-164.

Mayer, R. (2011). Applying the Science of Learning. Pearson.

Nathan, M., Kintsch, W., & Young, E. (1992). A theory of algebra-word-problem comprehension and its implications for the design of learning environments. Cognition and Instruction, 9(4), 329-389.

Ormrod, J. (2008). Human Learning. Columbus, OH: Pearson/Merrill/Prentice Hall.

[[Stoichiometry_Study|McLaren, B., Koedinger, K., & Yaron, D. (2006). Studying the Learning Effect of Personalization and Worked Examples in the Solving of Stoichiometry Problems. The PSLC Wiki. Retrieved June 21, 2007, from http://www.learnlab.org]]

Latest revision as of 11:37, 31 August 2011

The Effect of Context Personalization on Problem Solving in Algebra

Candace Walkington (DiBiano), Anthony Petrosino, Jim Greeno, and Milan Sherman

Summary Tables

PIs: Candace Walkington & Anthony Petrosino
Other Contributors:
  • Graduate Student: Milan Sherman
  • Staff: Jim Greeno


Pilot Study

Study Start Date: September 2008
Study End Date: May 2009
Study Site: Austin, TX
Number of Students: N = 24
Average # of hours per participant: 2 hrs.


In Vivo Study

Study Start Date: October 2009
Study End Date: April 2010
LearnLab Site: Hopewell High
LearnLab Course: Algebra
Number of Students: N = 111
Average # of hours per participant: 3 hrs.
Data in DataShop: Yes - Personalization Hopewell 2010


Abstract

In the original development of the PUMP Algebra Tutor (PAT), teachers had designed the algebra problem scenarios to be "culturally and personally relevant to students" (Koedinger, 2001). However, observations and discussions with teachers have suggested that some of the Cognitive Tutor problem scenarios may be disconnected from the lives and experiences of many students. This study investigated whether students’ personal interest in story contexts affects performance and robust learning.

The first stage of this research was a pilot study of the personal interests of students at an urban Texas high school. Freshman algebra students were surveyed and interviewed about their out-of-school interests, and were also asked to describe how they use mathematics in their everyday lives. Twenty-four of these students solved a number of Cognitive Tutor Algebra-style problems while thinking aloud. Results of this pilot study were analyzed with qualitative data analysis methods to critically examine the idea that personalization of story problems has the potential to support student learning.

The second stage of this research was an "in vivo" study that took place in Fall of 2009 at a Pennsylvania LearnLab site. Based on the results of the pilot study and additional student surveys from Pennsylvania, the 27 problems in Section 5 ("Linear Models and Independent Variables") of the Cognitive Tutor software were rewritten to each have 4 "personalized" versions corresponding to different student interests. The Cognitive Tutor software was programmed to give participating students an initial interests survey, and then select problem scenarios that match their interests. Robust learning, measured by a delayed post-test (long-term retention) and by mastery of knowledge components in a later section (transfer), was analyzed with a 2-group design (experimental vs. control) to measure the effect of personalization on learning. Measures from within Section 5 were also analyzed to measure the effect of personalization on performance.

Background and Significance

This research direction was initiated by observations of classrooms in Texas using the Cognitive Tutor Algebra I software, as well as discussions with teachers who had implemented this software at some point in their teaching practice. Teachers explained that their urban students found problems about harvesting wheat "silly," "dry," and irrelevant. Teachers also reported that some of the vocabulary words in the Cognitive Tutor problem scenarios (one example was the word "greenhouse") confused their students, because urban freshmen do not typically discuss these topics in their everyday speech. A review of the literature showed limited evidence for the potential of relevant story contexts to increase learning, and little research had been done at the secondary school level. This study was designed to empirically test the claim that the personal relevance of story problems affects robust learning and performance.

Theoretical Framework

This study is situated in the new "Motivation and Metacognition" thrust. Its premise is that the relevance of problem scenarios affects robust learning through increased intrinsic motivation (Cordova & Lepper, 1996). For learners who have the cognitive capacity to solve algebra story problems, enhancing motivation may increase the likelihood that they exert effort to make sense of the scenarios by forming more elaborated and better connected situation and problem models (Nathan, Kintsch, & Young, 1992), thus encouraging generative processing (Mayer, 2011). Mayer (2011) states the personalization principle as "People learn better when the instructor uses conversational style rather than formal style" (p. 70). Here, we use the PSLC's modified version of this principle, which states: "Matching up the features of an instructional component with students' personal interests, experiences, or typical patterns of language use, will lead to more robust learning through increased motivation, compared to when instruction is not personalized." This is related to what Mayer (2011) refers to as the "Anchoring" principle.

The construct through which personalization enhances intrinsic motivation is increased personal interest (also called individual interest). Personal interests are stable, enduring preferences that individual learners bring with them to different situations (Anderman & Anderman, 2010). Interest promotes more effective processing of information and greater cognitive engagement. Students with high interest may be more likely to relate new knowledge to prior knowledge and to form more connections between ideas. They may also be more likely to generate inferences, examples, and applications relating to the subject area they are trying to learn (Ormrod, 2008).

Pilot Study

The first stage of this research began in Fall of 2008 with a pilot study of personalization at an "Academically Unacceptable" school in Texas (75% free/reduced lunch). Twenty-four freshman algebra students were interviewed about their out-of-school interests, such as sports, music, and movies, and were also asked to describe how they use mathematics in their everyday lives. These interviews were audio recorded and used to write "personalized" algebra story problems for each student. The research questions being investigated were:

  • What is the impact of personalizing algebra story problems to individual student experiences, in terms of strategy use, language comprehension, and students’ epistemological frames about mathematical activity? (qualitative)
  • How does personalizing algebra story problems to individual experiences impact student performance, when compared to their performance on normal story problems from the Cognitive Tutor curriculum with the same underlying structure? (quantitative)

A problem set containing five algebra problems on linear functions was written for each student; two of these were story problems that were personalized to the ways in which the individual student described using mathematics in their everyday life during their initial interview. The problem set also contained normal story problems from the Cognitive Tutor curriculum, completely abstract symbolic equations, story problems that contained symbolic equations, and story problems with simplified language and general referents (“generic” story problems). Each problem had four parts – the first two parts were “Result Unknowns” or “concrete cases” (i.e. solve for y given this x), and the fourth and final part was a “Start Unknown” (i.e. solve for x given this y). For normal, personalized, and generic problems, the third part of each problem asked students to write a general symbolic equation or “algebra rule” representing the story. For normal story problems that already contained equations, students were asked to interpret the parameters in terms of the story. For completely abstract symbolic problems, students were asked to write a story that could go with the equation.
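The two unknown types differ only in which quantity is given. For a story whose underlying rule is y = 4x + 11 (one of the item structures mentioned in the pilot analysis), a minimal sketch; the function names are illustrative, not from the study:

```python
# Worked example of the pilot's problem-part taxonomy for the rule y = 4x + 11.

def result_unknown(x):
    # "Result unknown" / concrete case: y is computed from a given x.
    return 4 * x + 11

def start_unknown(y):
    # "Start unknown": the rule is inverted to recover x from a given y.
    return (y - 11) / 4
```

An informal, arithmetic strategy for a start unknown is exactly this unwinding: subtract 11, then divide by 4, rather than setting up and solving a symbolic equation.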

Each of the 24 students was given their problem set of 5 problems, and asked to solve each problem while “thinking aloud” and being audio recorded. Transcripts and student work were blocked such that one block was one student working one part of one problem. Blocks were coded with strategies, mistakes, and other issues the students had solving story problems (like reading issues); kappa values of 0.79 or higher were obtained using 2 coders.
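Inter-rater agreement of the kind reported here (kappa of 0.79 or higher) is chance-corrected agreement between the two coders' block labels. A self-contained sketch with hypothetical codes, not the study's data:

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Chance-corrected agreement between two coders over the same blocks."""
    n = len(coder1)
    # Observed agreement: fraction of blocks the coders labeled identically.
    p_o = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Expected chance agreement from each coder's marginal label frequencies.
    f1, f2 = Counter(coder1), Counter(coder2)
    p_e = sum((f1[c] / n) * (f2[c] / n) for c in f1)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical strategy codes for ten blocks (illustrative only).
coder_a = ["A", "A", "B", "B", "A", "B", "A", "A", "B", "A"]
coder_b = ["A", "A", "B", "A", "A", "B", "A", "A", "B", "B"]
kappa = cohens_kappa(coder_a, coder_b)  # ≈ 0.583
```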

Results showed that students regularly used informal, arithmetic approaches to solve result and start unknown story problems, especially when the problem had been personalized. Personalized problems had the lowest “No Response” rate (1% No Response), the highest use of informal strategies (80% of time), and students overwhelmingly perceived personalized problems as being “easiest” when asked (82% of time). Personalized problems also had higher success rates and lower student use of “non-coordinative” strategies where situational reasoning was not well-connected to formal problem-solving computations. When asked why they were given story problems in algebra class, students described how these problems would help them in the real world and in the workplace.

However, personalized problems still had a relatively high overall use of non-coordinative approaches (16% of time), and students also struggled with reading on personalized problems at similar rates to other problems (also 16% of time; some overlap). Students’ overwhelming use of informal strategies when solving personalized problems could be framed as problematic in a course where the overall goal is to have students use symbolic equations as representational tools. Finally, there was evidence that students still sometimes epistemologically framed personalized problems as “school mathematics” tasks, disconnected from their lived experiences.

Quantitative analyses comparing performance on personalized story problems with performance on normal story problems were carried out, replicating the methodology of Koedinger & Nathan (2004). Students solved personalized problems correctly 61% of the time overall, and normal story problems correctly 45% of the time overall. However, using two 2-factor mixed model ANOVAs that treated students (ANOVA 1) and items (ANOVA 2) as random effects, no statistically reliable overall differences in performance were found between normal and personalized problems. "Items" here refer to the underlying mathematical structure of the story problem; for example, the story described the equation y = 4x + 11. The two ANOVAs were repeated using only the hardest items, and using only the weakest students, and statistically reliable (p < .05), positive effects were found for personalization. The effect size (Cohen's d) was 0.9 for the hardest problems and 1.5 for the weakest students.

These results need to be interpreted with caution, as this was a small sample size (24 students), the personalization was done at a level of correspondence to real experiences that a computer could not replicate, and this was a population of students who overall were especially weak in mathematics.
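The effect size reported above, Cohen's d, is the standardized mean difference using a pooled standard deviation. A sketch with made-up group scores (not the study's data):

```python
import math

def cohens_d(group1, group2):
    """Standardized mean difference using the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    # Sample variances of each group.
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Made-up scores for two conditions (illustrative only).
d = cohens_d([1, 2, 3, 4, 5], [0, 1, 2, 3, 4])  # ≈ 0.632
```

By the usual benchmarks, the d of 0.9 (hardest problems) and 1.5 (weakest students) reported above are large effects, though from a small sample.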

Research Questions for In Vivo Study

  • How will performance and time on task be affected when personalization through relevant problem scenarios is implemented instead of the current problem scenarios in the Cognitive Tutor Algebra I software?
  • How will robust learning be affected when personalization through relevant problem scenarios is implemented instead of the current problem scenarios in the Cognitive Tutor Algebra I software?

Independent Variables for In Vivo Study

This experiment manipulated the level of personalization through a two-group design:

  • Control: Students who receive current Cognitive Tutor Algebra story problems for Unit 5
  • Experimental: Students who receive problems that have the same mathematical structure, but whose cover stories are personalized to individual students based on an interests survey


Treatment | Example Problem | Received By
Normal Cognitive Tutor Algebra problem scenarios | A skier noticed that she can complete a run in about 30 minutes. A run consists of riding the ski lift up the hill and skiing back down. If she skis for 3 hours, how many runs will she have completed? | 54 randomly-assigned Algebra I students at the LearnLab site
Personalized problem scenarios (student selects a personal interest in T.V. shows; cultural survey/interviews showed strong interest among urban youth in reality shows) | You noticed that the reality shows you watch on T.V. are all 30 minutes long. If you've been watching reality shows for 3 hours, how many have you watched? | 57 randomly-assigned Algebra I students at the LearnLab site


Dependent variables for In Vivo Study

Robust learning was measured through:

  • Delayed Post-test measuring long-term retention
    • A pre-test was administered before Unit 5, and a delayed post-test was administered at the end of Unit 6.
  • Mastery of knowledge components in the Cognitive Tutor software, including in subsequent units:
    • The students’ performance in Unit 7 was also examined, to see if there were performance differences between the experimental and control group even after the treatment was no longer in effect.

Intrinsic motivation was measured through:

  • Hint-seeking and reading behavior in Cognitive Tutor software
  • Time on task in Cognitive Tutor software

Hypotheses for In Vivo Study

Students in the treatment with personalized problem scenarios will:

H1) Demonstrate higher levels of correct performance in Section 5

H2) Show improved “time on task” and fewer instances of “gaming the system” in Section 5

H3) Show improvement on some measures of robust learning, as measured by pre/delayed post differences and by performance in subsequent sections.

Method for In Vivo Study

Interest surveys were administered to algebra students in Pennsylvania (N = 47) and algebra students in Texas (N = 29). The surveys contained sections where students ranked their interest in 9 different topics and answered 20 open-response questions about specific topics they were interested in. The algebra students in Texas also participated in one-on-one interviews about their out-of-school interests (part of the pilot study). Based on the results of the surveys and interviews, personally relevant problem scenarios corresponding to current problem scenarios in Cognitive Tutor Algebra I were formulated for Section 5, "Linear Models and Independent Variables." The 27 problem scenarios from the selected section were rewritten to have 4 different variations each, corresponding to 9 topics students were interested in (sports, music, movies, computers, stores, food, art, TV, games). The personally relevant problems had the same underlying mathematical structure as the original problems, with changes made to the objects or nouns in the story (what the problem is about) and the pronouns (who the problem is about). See the table above for an example of these changes. The personally relevant problem scenarios were reviewed by two master Algebra I teachers for language and clarity and were modified based on teacher feedback.

The new problem scenarios were integrated into Unit 5 of the Cognitive Tutor Algebra software at the high school site with the cooperation of Carnegie Learning. A total of 111 students at the school site were randomly assigned to either the experimental group (personalized problems) or the control group (normal problems). The experiment was in-sequence, meaning that all students encountered Section 5 at their own pace (i.e., at the time they naturally reached that point in the software). Immediately before students entered Unit 5, they were prompted to answer an interest survey where they ranked their level of interest in the 9 topics, and took a pre-test where they solved two multi-part normal story problems. After the students completed Unit 6, they were given a delayed post-test.
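The matching step, from a student's survey rankings to one of a problem's 4 topic versions, is not specified in detail here; one plausible sketch, where the selection rule and data layout are assumptions, not the software's actual logic:

```python
def pick_variant(interest_ranks, variant_topics):
    """Pick the available topic version the student ranked highest.
    interest_ranks maps topic -> rank (1 = most interested); topics the
    student did not rank fall to the bottom."""
    return min(variant_topics, key=lambda t: interest_ranks.get(t, float("inf")))

# Hypothetical survey ranking over a subset of the study's nine topics.
ranks = {"sports": 3, "music": 1, "movies": 2, "TV": 4}
chosen = pick_variant(ranks, ["sports", "music", "movies", "TV"])  # "music"
```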

Results

H1) Students receiving personalized problems will demonstrate higher levels of performance in Unit 5 than students receiving normal problems.

In order to test this hypothesis, a logistic regression model was formulated with the following properties. The unit of analysis was one student solving one part of one problem.

  • Dependent Variable – whether the student got the problem part correct on their first attempt, without asking for a hint.
  • Random Effects – the student ID, the item (the linear function underlying the problem), and the problem name (which personalized version the student was given, or which set of numbers the student was given for result and start unknowns)
  • Fixed Effects – Condition (whether the student was in the experimental or control group) and what knowledge component was covered by the problem part
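The dependent-variable coding above (correct on the first attempt, without a hint) can be illustrated on a hypothetical transaction sequence for one problem part; the action labels are illustrative, not DataShop's actual field values:

```python
def first_attempt_correct(actions):
    """1 if the first transaction on a problem part is a correct entry
    (no prior error or hint request), else 0."""
    return 1 if actions and actions[0] == "CORRECT" else 0

# Hypothetical transaction sequences for three problem parts.
y1 = first_attempt_correct(["CORRECT"])               # 1
y2 = first_attempt_correct(["HINT", "CORRECT"])       # 0
y3 = first_attempt_correct(["INCORRECT", "CORRECT"])  # 0
```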

Each of these effects significantly improved the model, while interactions did not. The main effect for the treatment (personalization) was statistically significant at the 5% level: personalization had a positive overall effect on student performance of around 5.3 percentage points. If a student had a 50% base chance of getting a problem part correct on the first attempt, personalization would increase that chance to 55.3%.
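On the logit scale used by this model, the reported effect corresponds to a fixed shift in log-odds; a quick check of the 50% to 55.3% arithmetic:

```python
import math

def logit(p):
    """Log-odds of a probability."""
    return math.log(p / (1 - p))

def inv_logit(x):
    """Probability from log-odds."""
    return 1 / (1 + math.exp(-x))

# Treatment coefficient implied by moving a 50% baseline to 55.3%.
beta = logit(0.553) - logit(0.5)          # logit(0.5) = 0, so beta = logit(0.553)
p_treated = inv_logit(logit(0.5) + beta)  # recovers 0.553
```

Because the shift is constant on the log-odds scale, the percentage-point gain it produces depends on the baseline success rate.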

Although interaction terms were not significant in this model, this appeared to reflect a combination of limited statistical power and the large number of parameters added when interactions were modeled. A second model was therefore specified in which the knowledge components were classified as easy, medium, and hard; in this model there was a significant condition-by-knowledge-component interaction. Personalization had a significantly larger, positive impact on the two most difficult knowledge components, which related to writing symbolic expressions, compared to the medium-difficulty knowledge components. For the most difficult knowledge components, personalization increased success rates from 50% to 58%.

More results coming soon.


References

Anderman, E., & Anderman, L. (2010). Classroom Motivation. Columbus, OH: Pearson.

Clark, R. C., & Mayer, R. E. (2003). E-Learning and the Science of Instruction. San Francisco, CA: Jossey-Bass/Pfeiffer.

Cordova, D. I., & Lepper, M. R. (1996). Intrinsic motivation and the process of learning: Beneficial effects of contextualization, personalization, and choice. Journal of Educational Psychology, 88(4), 715-730.

Eskenazi, M., Juffs, A., Heilman, M., Collins-Thompson, K., Wilson, L., & Callen, J. (2006). REAP Study on Personalization of Readings by Topic (Fall 2006). The PSLC Wiki. Retrieved June 21, 2007, from http://www.learnlab.org

Koedinger, K. R. (2001). Cognitive tutors as modeling tool and instructional model. In Forbus, K. D., & Feltovich, P. J. (Eds.), Smart Machines in Education: The Coming Revolution in Educational Technology. Menlo Park, CA: AAAI/MIT Press.

Koedinger, K. R., & Nathan, M. J. (2004). The real story behind story problems: Effects of representations on quantitative reasoning. Journal of the Learning Sciences, 13(2), 129-164.

Mayer, R. (2011). Applying the Science of Learning. Pearson.

McLaren, B., Koedinger, K., & Yaron, D. (2006). Studying the Learning Effect of Personalization and Worked Examples in the Solving of Stoichiometry Problems. The PSLC Wiki. Retrieved June 21, 2007, from http://www.learnlab.org

Nathan, M., Kintsch, W., & Young, E. (1992). A theory of algebra-word-problem comprehension and its implications for the design of learning environments. Cognition and Instruction, 9(4), 329-389.

Ormrod, J. (2008). Human Learning. Columbus, OH: Pearson/Merrill/Prentice Hall.