LearnLab - User contributions feed (en), retrieved 2020-02-21; MediaWiki 1.24.1

== Elaborated Explanations ==
''Revision by Kamearobinson, 2011-08-31''
<hr />
<div>Elaborated explanations are [[Self-explanation]]s that make use of multiple sources of information or multiple modes of expression. <br />
<br />
For example, an elaborated explanation in geometry requires students to explain their problem solving steps by referring both to verbal information (e.g., naming geometry principles that justify an answer) and visual information (e.g., naming the diagram features that are relevant to chosen geometry principles). <br />
<br><br />
<b>Relevant studies include:</b><br />
[[Using Elaborated Explanations to Support Geometry Learning (Aleven & Butcher)|Elaborated Explanations in Geometry (Aleven & Butcher)]]<br />
<br />
[[Category:Glossary]]<br />
<br />
[[Category:Interactive Communication]]<br />
<br />
[[Category:Independent Variables]]<br />
<br />
[[Category:Dependent Variables]]<br />
<br />
[[Category:Visual-Verbal Learning (Aleven & Butcher Project)]]<br />
<br />
</div>

== Effect of adding simple worked examples to problem-solving in algebra learning ==
''Revision by Kamearobinson, 2011-08-31''
<hr />
<div>''Lisa Anthony, Jie Yang, Kenneth R. Koedinger''<br />
<br />
=== Summary Table ===<br />
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"<br />
| '''PIs''' || Lisa Anthony, Jie Yang, & Ken Koedinger<br />
|-<br />
| '''Other Contributors''' || n/a<br />
|-<br />
| '''Study Start Date''' || December 4, 2006<br />
|-<br />
| '''Study End Date''' || December 20, 2006<br />
|-<br />
| '''LearnLab Site''' || Central Westmoreland Career & Technology Center (CWCTC)<br />
|-<br />
| '''LearnLab Course''' || Algebra<br />
|-<br />
| '''Number of Students''' || 38<br />
|-<br />
| '''Total Participant Hours''' || 114<br />
|-<br />
| '''DataShop''' || To be completed ASAP<br />
|}<br />
<br />
=== Abstract ===<br />
This ''in vivo'' experiment compared the learning that occurs when students solve problems on their own versus when their problem solving is aided by worked [[example]]s. Students worked in the standard Cognitive Tutor Algebra lesson on 2-step problems. Those in the worked-examples condition copied the worked example given to them using the solver's interface the first time they saw a particular problem type (''i.e.'', ax+b=c or a/x=c); after that, an analogous example appeared each time the students solved a similar problem.<br />
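The presentation logic described above can be sketched as follows; the function and activity names are illustrative assumptions, not part of the actual tutor.

```python
# Hypothetical sketch of the worked-examples condition: the first time a
# problem type appears, the student copies a worked example; on later
# encounters, an analogous example is shown alongside the problem to solve.
def plan_presentation(problem_type, seen_types):
    """Decide the activity for the next problem of the given type."""
    if problem_type not in seen_types:
        seen_types.add(problem_type)
        return "copy-worked-example"
    return "solve-with-analogous-example"

seen = set()
first = plan_presentation("ax+b=c", seen)   # first encounter: copy the example
later = plan_presentation("ax+b=c", seen)   # afterwards: solve, example shown
```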
<br />
The hypothesis of this study was that students who were given the worked examples would experience improved learning both on normal learning measures and on the [[robust learning]] measures of [[transfer]] and [[accelerated future learning]]. Copying the example the first time the students encountered a particular problem type acts as additional scaffolding for solving the problems.<br />
<br />
Results are forthcoming.<br />
<br />
=== Glossary ===<br />
Forthcoming, but will probably include:<br />
* Sample worked-out example:<br />
[[Image:lanthony-example-unit9.jpg]]<br />
<br />
=== Research question ===<br />
Is robust learning affected by the addition of scaffolded worked examples to the problem-solving process?<br />
<br />
=== Background & Significance ===<br />
...Worked-examples studies conducted at PSLC and beyond...<br />
<br />
See VanLehn's paper on students using examples -- copying vs. as feedback ...<br />
LeFevre & Dixon (1986). Do written instructions need examples? Cognition and Instruction.<br />
<br />
See Koedinger & Aleven's Assistance Dilemma explanation ...<br />
<br />
=== Independent Variables ===<br />
One independent variable was used:<br />
* Inclusion of worked example: present or not present.<br />
<br />
=== Hypothesis ===<br />
The inclusion of worked examples during the problem-solving process will have benefits for learning by virtue of the scaffolding provided by having the students copy the example the first time they see a particular problem type.<br />
<br />
=== Dependent variables ===<br />
* ''Near [[transfer]], immediate'': Students were given a 15-minute post-test after their sessions with the computer tutor had concluded.<br />
<br />
* ''Near transfer, [[retention]]'': We intend to analyze the log data from the students' Cognitive Tutor usage in the equation solving unit that followed the 2-step problems, to determine if there was any difference in performance at the start of that lesson.<br />
<br />
* ''Far transfer'': Far transfer items such as 3-step problems and literal equations were included on the immediate post-test.<br />
<br />
* ''[[Accelerated future learning]]'': We intend to analyze the log data from the students' Cognitive Tutor usage in the equation solving unit that followed the 2-step problems, to determine if there were learning curve differences during training.<br />
<br />
=== Findings ===<br />
Final findings in progress.<br />
<br />
=== Explanation ===<br />
This study is part of the [[Coordinative Learning]] cluster and addresses the examples and explanation sub-group.<br />
<br />
The students were given examples throughout their use of the tutor. On the first instance of a particular problem type, students were asked to copy out a worked example; on subsequent instances, examples remained on the screen while students solved analogous problems.<br />
<br />
=== Descendants ===<br />
<br />
None.<br />
<br />
=== Annotated Bibliography ===<br />
<br />
Analysis and write-up in progress.<br />
<br />
=== Further Information ===<br />
Connected to [[Lab study proof-of-concept for handwriting vs typing input for learning algebra equation-solving]] in the [[Refinement and Fluency]] cluster.<br />
<br />
=====Plans for June 2007-December 2007=====<br />
<br />
* Complete transition of log data to DataShop.<br />
* Analyze data to determine effect of including examples on pre to post test gains and/or learning curves.<br />
* Write up results for publication in a learning science conference.<br />
* Lab study comparing alternative methods of delivering and presenting worked examples is a possible side avenue for the parent project of this study ([[Handwriting Algebra Tutor]]).<br />
<br />
</div>

== Declarative ==
''Revision by Kamearobinson, 2011-08-31''
<hr />
<div>[[Category:Glossary]]<br />
[[Category:PSLC General]]<br />
Declarative memory is the aspect of human memory that stores facts and experiences. It is so called because it refers to memories that can be consciously discussed, or declared. It applies to standard textbook learning and knowledge, as well as memories that can be 'travelled back to' in one's 'mind's eye'. It is contrasted with procedural memory, which applies to skills. Declarative memory is subject to forgetting, but frequently accessed memories can last indefinitely. Declarative memories are best established by using active recall combined with mnemonic techniques and spaced repetition.<br />
<br />
Declarative Knowledge is knowledge of objects and facts. Also called declarative memory, this includes sensory knowledge. Declarative knowledge is essential in both interpreting the external world and in introspectively placing one's self in context.<br />
</div>

''Revision by Kamearobinson, 2011-08-31''
<hr />
<div>== Does learning from worked-out examples improve tutored problem solving? ==<br />
''Alexander Renkl, Rolf Schwonke, Vincent Aleven, & Ron Salden''<br />
<br />
=== Abstract ===<br />
In our research we combine two branches of educational research. The first is the Cognitive Tutor branch, which offers students the possibility to acquire and extend their skills through a computerized curriculum of problems. The second is the research on worked-out examples and the fading of worked-out steps. Because students' prior knowledge of novel elements is usually low, it is widely believed that presenting a worked-out example gives the student a better grasp of those elements. By presenting more information about the novel elements, the example spares the student from having to bridge the gap between prior knowledge and the new material entirely alone.<br />
<br />
However, as the student progresses through a curriculum, his or her knowledge grows and fully worked-out examples may provide redundant information. Hence, as the curriculum advances, the number of worked-out steps in a problem is gradually decreased until all steps are left blank (i.e., pure problem solving). By continually fading the worked-out steps, the student encounters more nearly optimal learning instances than with either fully worked-out examples alone or problem solving alone. Our studies address the main hypothesis that an example-enriched Cognitive Tutor can create deeper conceptual understanding.<br />
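The fading regime described above can be sketched as a simple schedule; the one-step-per-problem rate is an illustrative assumption, since the text does not specify the actual pace.

```python
# Illustrative sketch of a fixed fading schedule: each successive problem has
# one fewer worked-out step, until the student solves all steps alone.
# The one-step-per-problem fading rate is an assumption for illustration.
def worked_steps(problem_index, total_steps):
    """Number of steps shown worked out for the problem at this 0-based index."""
    return max(0, total_steps - problem_index)

# starts fully worked out and fades to pure problem solving
schedule = [worked_steps(i, total_steps=4) for i in range(6)]
```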
<br />
In our recent ''in vivo'' study ('''Study 4''') we compared a control condition with two experimental conditions that faded worked-out examples. In the ''control condition'' the students solved problems and entered explanations for each step, as they were accustomed to doing. In the ''fixed fading condition'' the same preset fading of worked-out steps was applied to all students. In the ''adaptive fading condition'' the fading of the worked-out steps was based on the individual student's progress, as measured by their problem solving and their reasoning. The data of this study have been collected and are currently being uploaded to the DataShop, after which data analysis will commence.<br />
<br />
=== Background and Significance ===<br />
The background of this research is twofold. (1) The very successful approach of Cognitive Tutors (Anderson, Corbett, Koedinger, & Pelletier, 1995; Koedinger, Anderson, Hadley, & Mark, 1997) is taken up. These computer-based tutors provide individualized support for learning by doing (i.e., solving problems): they select appropriate problems to be solved, provide feedback and problem-solving hints, and assess the student's learning progress on-line. Cognitive Tutors individualize instruction by selecting problems based on a model of the student's present knowledge state that is constantly updated through a Bayesian process called "knowledge tracing" (Corbett & Anderson, 1995). Although problem solving supported by [[cognitive tutor]]s has been shown to be successful in fostering the initial acquisition of cognitive skill, this approach does not seem to be optimal with respect to focusing the learner on the domain principles to be learned. <br />
(2) The research tradition on worked-out examples rooted in Cognitive Load Theory (Sweller, van Merriënboer, & Paas, 1998) and, more specifically, the instructional model of example-based learning by Renkl and Atkinson (in press) are taken up in order to foster skill acquisition that is grounded in deep conceptual understanding. By presenting examples instead of problems to be solved at the beginning of a learning sequence, learners have more attentional capacity available to self-explain and thus deepen their understanding of problem solutions.<br />
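The "knowledge tracing" process mentioned in (1) can be sketched as a standard Bayesian update (Corbett & Anderson, 1995); the parameter values below are illustrative assumptions, not the tutor's actual settings.

```python
def bkt_update(p_know, correct, slip=0.1, guess=0.2, learn=0.3):
    """One knowledge-tracing step: posterior P(known) given the observed
    response, followed by the chance of learning at this opportunity.
    The slip/guess/learn values are illustrative assumptions."""
    if correct:
        posterior = p_know * (1 - slip) / (p_know * (1 - slip) + (1 - p_know) * guess)
    else:
        posterior = p_know * slip / (p_know * slip + (1 - p_know) * (1 - guess))
    return posterior + (1 - posterior) * learn

after_correct = bkt_update(0.4, correct=True)   # mastery estimate rises
after_error = bkt_update(0.4, correct=False)    # mastery estimate falls
```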
<br />
In order to foster a deep understanding of domain principles and how they are applied in problem solving, we combine the theoretical rationales of Cognitive Tutors and example-based learning. <br />
<br />
Specifically, we address the following main hypotheses:<br />
# Enriching a Cognitive Tutor unit with [[example]]s whose worked-out steps are gradually faded leads to better learning.<br />
# Individualizing the fading procedure based on the quality of self-explanations that the learners provide further improves learning.<br />
# Using free-form self-explanations is more useful in this context as compared to the usual menu-based formats.<br />
# Learning can be enhanced further by providing previously self-explained examples – including the learner’s own self-explanations – as support at problem-solving [[impasse]]s.<br />
<br />
This project is significant in several respects:<br />
<br />
(1) To date, the positive effects of examples have been shown in comparison to unsupported problem solving. We aim to show that example study is also superior to supported problem solving at the very beginning of a learning sequence.<br />
<br />
(2) The Cognitive Tutor approach can be enhanced by ideas from research on example-based learning.<br />
<br />
(3) The example-based learning approach can be enriched by individualizing instructional procedures such as fading.<br />
<br />
=== Glossary ===<br />
<br />
[[:Category:Salden Learning-from-Examples|Salden Learning-from-Examples Glossary]]<br />
<br />
[[Learning by worked-out examples]]<br />
<br />
[[Learning by problem solving]]<br />
<br />
[[Self-explanation]]<br />
<br />
[[Fading]]<br />
<br />
=== Research question ===<br />
Can the effectiveness and efficiency of Cognitive Tutors be enhanced by including learning from worked-out examples?<br />
<br />
=== Independent variables ===<br />
The independent variable refers to the following variation: <br />
<br />
(a) Cognitive Tutor with problems to be solved <br />
<br />
versus <br />
<br />
(b) Cognitive Tutors with initially fully worked-out examples, then partially worked-out examples, and finally problems to be solved.<br />
<br />
(b1) The fading occurs according to a fixed procedure which is used for all students.<br />
<br />
(b2) The fading occurs adaptively based on the knowledge ''and'' reason tracing of the individual student's progress.<br />
<br />
(c) Previously performed problems are accessible as hints, meaning that the regular textual hints are replaced by screenshots of the previously performed problems.<br />
<br />
Although self-explanation prompts are a typical "ingredient" of example-based learning, and not of learning by problem solving, such prompts were included in all conditions of our studies. Thereby, the potential effects can be clearly attributed to the presence or absence of example study.<br />
The exception to this is Study 6 which includes two Untutored conditions. For more information on this study please scroll down to the '''Further Information''' section.<br />
<br />
=== Dependent variables ===<br />
1) [[Normal post-test]] of [[procedural|procedural knowledge]]: measured by the students' performance on new problems dealing with the content they learned in the Cognitive Tutor.<br />
<br />
2) [[Transfer]] test assessing [[conceptual knowledge]]: measured by a variety of questions on the post-test that are different in format, non-isomorphic, from those used in training. These include drawing problems, multiple-choice questions, and open questions concerning the specific geometric principles the students were exposed to in the Cognitive Tutor.<br />
<br />
3) [[Accelerated future learning|Acceleration of future learning]] (in future experiments): we plan to use a unit related to the angles and/or circles units that follows each of these units in the curriculum.<br />
<br />
4) Learning time: measured in total time to complete the unit in the Cognitive Tutor.<br />
<br />
5) Efficiency of learning (relating learning time to learning outcomes): based on the mental efficiency measure developed by Paas and van Merriënboer (1993).<br />
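Dependent variable 5 can be illustrated with a small computation in the spirit of Paas and van Merriënboer's (1993) measure, which combines standardized performance and standardized effort (here, learning time) as E = (z_P - z_T) / sqrt(2); the standardization details below are an assumption for illustration.

```python
from math import sqrt
from statistics import mean, pstdev

def efficiency(performance, time_on_task):
    """Learning efficiency in the spirit of Paas & van Merriënboer (1993):
    E = (z_performance - z_time) / sqrt(2), with learning time as the effort
    measure. Using population z-scores is an illustrative assumption."""
    def zscores(xs):
        m, s = mean(xs), pstdev(xs)
        return [(x - m) / s for x in xs]
    zp, zt = zscores(performance), zscores(time_on_task)
    return [(p - t) / sqrt(2) for p, t in zip(zp, zt)]

# higher performance achieved in less time yields higher efficiency
e = efficiency([1, 2, 3], [3, 2, 1])
```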
<br />
=== Hypotheses & Results ===<br />
The provision of examples in Cognitive Tutors should lead to better conceptual understanding and, thereby, better transfer performance. In addition, examples in Cognitive Tutors should reduce learning time.<br />
<br />
On the whole, the present results confirm the hypotheses with respect to conceptual knowledge and learning time. The expected effects on transfer were not found.<br />
<br />
===Study 1 (lab study at Freiburg, Geometry Cognitive Tutor) ===<br />
*''Summary''<br />
**Lab Study: 8th grade and 9th grade students from a German high school in Freiburg<br />
**Domain: translated Circles Unit in the Geometry Cognitive Tutor<br />
**Start Date: March 1, 2006<br />
**End Date: March 31, 2006<br />
**Number of Students: 50<br />
**Participant Hours: 1.5<br />
**Data in Datashop: Yes<br />
<br />
*The students were randomly assigned to one of two conditions:<br />
**Problem Solving Condition: In this control condition students solved answer steps and entered explanations on all problems<br />
[[Image:Freiburg_Summer_2006_-_problem_solving_versus_fixed_fading_examples_study_-_Problem_condition-small.jpg]]<br />
**Worked Example Condition: In this experimental condition, students were first presented with problems that had worked-out (i.e., filled-in) answer steps but still had to enter the explanations for these steps. As they progressed through the Unit, these worked-out answer steps were faded, meaning that, towards the end of the Unit, the students had to fill in answer steps and explanations.<br />
[[Image:Freiburg_Summer_2006_-_problem_solving_versus_fixed_fading_examples_study_-_Example_condition-small.jpg]]<br />
<br />
*''Findings''<br />
**No overall effect of experimental condition on students' conceptual and procedural knowledge on the post-test: ''t'' < 1.<br />
**However, about the same learning outcomes were achieved in shorter learning time in the example-enriched Cognitive Tutor: ''t''(48) = -3.11, ''p'' < .001 (one-tailed), ''r'' = .41.<br />
**Accordingly, the efficiency of learning was superior in this latter learning condition: ''t''(48) = 1.73, ''p'' < .05 (one-tailed), ''r'' = .24 for conceptual knowledge, and ''t''(48) = 1.82, ''p'' < .05 (one-tailed), ''r'' = .25 for the acquisition of transferable knowledge.<br />
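The effect sizes above follow from the standard conversion between a t statistic and the correlation r; this small check is a sketch, not part of the original analysis.

```python
from math import sqrt

def r_from_t(t, df):
    """Correlation effect size from a t statistic: r = sqrt(t^2 / (t^2 + df)).
    This standard conversion reproduces the effect sizes reported above."""
    return sqrt(t * t / (t * t + df))

# learning-time comparison from Study 1: t(48) = -3.11 gives r of about .41
r_time = round(r_from_t(-3.11, 48), 2)
```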
<br />
===Study 2 (lab study at Freiburg, Geometry Cognitive Tutor) ===<br />
*''Summary''<br />
**Lab Study: 9th grade and 10th grade students from a German high school in Freiburg<br />
**Domain: translated Circles Unit in the Geometry Cognitive Tutor<br />
**Start Date: April 1, 2006<br />
**End Date: April 30, 2006<br />
**Number of Students: 30<br />
**Participant Hours: 1.5<br />
**Data in Datashop: Yes<br />
<br />
*The students were randomly assigned to one of two conditions:<br />
**Problem Solving Condition: In this control condition, students solved answer steps and entered explanations on all problems. See Study 1 for a screenshot of this condition.<br />
**Worked Example Condition: In this experimental condition, students were first presented with problems that had worked-out (i.e., filled-in) answer steps, but they still had to enter the explanations for these steps. As they progressed through the Unit, these worked-out answer steps were faded, meaning that, towards the end of the Unit, the students had to fill in answer steps and explanations. See Study 1 for a screenshot of this condition.<br />
<br />
*''Findings''<br />
**With regard to conceptual understanding part of the transfer post-test, an advantage of the example condition over the problem condition was found: ''t''(28) = 1.85, ''p'' < .05 (one-tailed), ''r'' = 0.33.<br />
**There were no significant differences on other, procedural, transfer items: ''t'' < 1.<br />
**Similar to Study 1, students in the example condition spent significantly less time than students in the problem solving condition: ''t''(28) = -3.14, ''p'' < .001 (one-tailed), ''r'' = 0.51.<br />
**Hence, on a measure of robust learning efficiency (i.e., relating performance in terms of the acquisition of conceptual knowledge to the effort in terms of time on task), a large effect was obtained: ''r'' = 0.55, ''t''(28) = 3.48, ''p'' < .001.<br />
<br />
===Study 3 (lab study at CMU, Geometry Cognitive Tutor) ===<br />
*''Summary''<br />
**Lab Study: 8th grade and 9th grade students in rural Pennsylvania schools<br />
**Domain: Circles Unit in the Geometry Cognitive Tutor<br />
**Start Date: October 1, 2006<br />
**End Date: November 30, 2006<br />
**Number of Students: 45<br />
**Participant Hours: 2<br />
**Data in Datashop: Yes<br />
<br />
*The students were randomly assigned to one of two conditions:<br />
**Problem Solving Condition: In this control condition, students solved answer steps and entered explanations on all problems. See Study 1 for a screenshot of this condition.<br />
**Worked Example Condition: In this experimental condition, students were first presented with problems that had worked-out (i.e., filled-in) answer steps, but they still had to enter the explanations for these steps. As they progressed through the Unit, these worked-out answer steps were faded, meaning that, towards the end of the Unit, the students had to fill in answer steps and explanations. See Study 1 for a screenshot of this condition.<br />
<br />
*''Findings''<br />
**No overall effect of experimental condition on students' conceptual and procedural knowledge on the post-test: ''t'' < 1.<br />
**No difference in time to complete the Circles Unit in the Tutor: ''t'' < 1.<br />
<br />
===Study 4 (In Vivo study at CMU, Geometry Cognitive Tutor) ===<br />
*''Summary''<br />
**In Vivo Study: 10th grade geometry classes in a rural Pennsylvania high school<br />
**Domain: Angles Unit in the Geometry Cognitive Tutor<br />
**Start Date: January 8, 2007<br />
**End Date: March 9, 2007<br />
**Number of Students: 51<br />
**Participant Hours: 5<br />
**Data in Datashop: Yes<br />
<br />
*The students were randomly assigned to one of three conditions:<br />
**Problem Solving Condition: In this control condition, students solved answer steps and entered explanations on all problems.<br />
[[Image:January_2007_-_adaptive_fading_study_CWCTC_-_Problem_solving_condition-small.jpg]]<br />
**Fixed Fading of Worked Examples Condition: In this experimental condition, students were first presented with problems that had worked-out (i.e., filled-in) answer steps, but they still had to enter the explanations for these steps. As they progressed through the Unit, these worked-out answer steps were faded, meaning that, towards the end of the Unit, the students had to fill in answer steps and explanations.<br />
[[Image:January_2007_-_adaptive_fading_study_CWCTC_-_Fixed_fading_condition-small.jpg]]<br />
**Adaptive Fading of Worked Examples Condition: This experimental condition is similar to the Fixed Fading condition but differs in that the fading of the filled-in answer steps is based on the individual student's performance on both the answer and the reason steps.<br />
See the screenshot of the Fixed Fading condition; the interface was identical in both conditions.<br />
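A minimal sketch of the adaptive fading rule described above, assuming mastery estimates for the answer and reason steps and an arbitrary threshold; neither the form of the estimates nor the threshold value is specified in the text.

```python
def fade_step(p_answer, p_reason, threshold=0.85):
    """Fade (blank out) a worked-out step only once the student's estimated
    mastery of both the answer step and the reason step passes the threshold.
    The 0.85 threshold is an illustrative assumption."""
    return p_answer >= threshold and p_reason >= threshold

# strong answers alone do not trigger fading; reasoning must keep up too
ready = fade_step(0.9, 0.9)
not_ready = fade_step(0.9, 0.5)
```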
<br />
*''Findings''<br />
**Positive effect of the Adaptive Fading Condition over the Problem Solving Condition on the delayed post-test: ''t''(20) = 2.15, ''p'' < .05, ''d'' = .96.<br />
**No difference in time to complete the Angles Unit in the Tutor: ''t'' < 1.<br />
<br />
===Study 5 (lab study at Freiburg, Geometry Cognitive Tutor) ===<br />
*''Summary''<br />
**Lab Study: 9th grade and 10th grade students from a German high school in Freiburg<br />
**Domain: translated Circles Unit in the Geometry Cognitive Tutor<br />
**Start Date: March 19, 2007<br />
**End Date: March 26, 2007<br />
**Number of Students: 57<br />
**Participant Hours: 2<br />
**Data in Datashop: Yes<br />
<br />
*The students were randomly assigned to one of three conditions:<br />
**Problem Solving Condition: In this control condition, students solved answer steps and entered explanations on all problems.<br />
**Fixed Fading of Worked Examples Condition: In this experimental condition, students were first presented with problems that had worked-out (i.e., filled-in) answer steps, but they still had to enter the explanations for these steps. As they progressed through the Unit, these worked-out answer steps were faded, meaning that, towards the end of the Unit, the students had to fill in answer steps and explanations.<br />
**Adaptive Fading of Worked Examples Condition: This experimental condition is similar to the Fixed Fading condition but differs in that the fading of the filled-in answer steps is based on the individual student's performance on both the answer and the reason steps.<br />
<br />
*''Findings''<br />
**A planned contrast comparing the adaptive fading condition with the problem solving + fixed fading conditions revealed higher transfer performance for the adaptive fading condition on the regular post-test: ''F''(1, 54) = 5.05, ''p'' < .05, ''η²'' = .09. This effect was replicated on the delayed post-test: ''F''(1, 54) = 4.42, ''p'' < .05, ''η²'' = .08.<br />
**There were no differences in time spent on either of the post-tests: ''F''s < 1.<br />
<br />
===Study 6 (lab study at CMU, Geometry Cognitive Tutor) ===<br />
*''Summary''<br />
**Lab Study: 8th, 9th and 10th grade students in a rural Pennsylvania school<br />
**Domain: Circles Unit in the Geometry Cognitive Tutor<br />
**Start Date: April 18, 2007<br />
**End Date: April 19, 2007<br />
**Number of Students: 62<br />
**Participant Hours: 2<br />
**Data in Datashop: Yes<br />
<br />
*The students were randomly assigned to one of 4 conditions:<br />
**Tutored Problem Solving Condition: In this control condition, students solved answer steps and entered explanations on all problems in the standard Cognitive Tutor.<br />
**Tutored Worked Examples Condition: In this experimental condition, students were first presented with problems that had worked-out (i.e., filled-in) answer steps, but they still had to enter the explanations for these steps. As they progressed through the Unit, these worked-out answer steps were faded, meaning that, towards the end of the Unit, the students had to fill in answer steps and explanations.<br />
**Untutored Problem Solving Condition: In this experimental condition students solved answer steps and entered explanations on all problems in a stripped down version of the Cognitive Tutor. Students could NOT ask for hints and only after entering all answer and reason steps could they receive corrective feedback. They could still browse the Glossary while working on a problem.<br />
[[Image:Screenshot-notutoring_problem_solving.jpg]]<br />
**Untutored Worked Examples Condition: In this experimental condition, students were first presented with problems that had worked-out (i.e., filled-in) answer steps, but they still had to enter the explanations for these steps. As they progressed through the Unit, these worked-out answer steps were faded, meaning that, towards the end of the Unit, the students had to fill in answer steps and explanations.<br />
Furthermore, students were working in a stripped down version of the Cognitive Tutor. Students could NOT ask for hints and only after entering all answer and reason steps could they receive corrective feedback. They could still browse the Glossary while working on a problem.<br />
[[Image:Screenshot-notutoring_worked_examples.jpg]]<br />
<br />
*''Findings''<br />
**No overall effect of experimental condition on students' conceptual and procedural knowledge on the post-test: ''t'' < 1.<br />
**No difference in time to complete the Circles Unit in the Tutor: ''t'' < 1.<br />
<br />
===Study 7 (In Vivo study at CMU, Geometry Cognitive Tutor) ===<br />
*''Summary''<br />
**In Vivo Study: 10th grade geometry classes in two rural Pennsylvania high schools<br />
**Domain: Circles Unit in the Geometry Cognitive Tutor<br />
**Start Date: April 30, 2007<br />
**End Date: June 1, 2007<br />
**Number of Students: 104<br />
**Participant Hours: 5<br />
**Data in Datashop: Yes<br />
<br />
*The students were randomly assigned to one of 4 conditions:<br />
**Problem Solving Condition: In this control condition, students solved answer steps and entered explanations on all problems in the standard Cognitive Tutor and received step-by-step textual hints. For a screenshot, see the Problem Solving Condition of Study 1.<br />
**Worked Examples in Hints Condition: In this experimental condition, students solved answer steps and entered explanations on all problems in the standard Cognitive Tutor and for each Geometry concept received a worked example as a hint. In contrast to the control condition which had several hints levels for each step, this experimental condition had only one hint level: the worked example.<br />
[[Image:Spring_2007_-_examples_in_hints_study_CWCTC_and_Wilkinsburg_-_Examples_condition-small.jpg]]<br />
<br />
*''Findings''<br />
**No overall effect of experimental condition on students' conceptual and procedural knowledge on the post-test: ''t'' < 1.<br />
**No difference in time to complete the Circles Unit in the Tutor: ''t'' < 1.<br />
<br />
<br />
===Study 8 (In Vivo study at CMU, Geometry Cognitive Tutor) ===<br />
*''Summary''<br />
**In Vivo Study: 10th grade geometry classes in three rural Pennsylvania high schools<br />
**Domain: Circles Unit in the Geometry Cognitive Tutor<br />
**Start Date: April 15, 2009<br />
**End Date: June 10, 2009<br />
**Number of Students: 151<br />
**Participant Hours: 9-10<br />
**Data in Datashop: No<br />
<br />
*The students were randomly assigned to one of 3 conditions:<br />
**Problem Solving Condition: In this control condition, students solved answer steps and entered explanations on all problems in the standard Cognitive Tutor.<br />
**All Examples Condition: In this experimental condition, students received worked-out answer steps for all the required problems. There was no fading of these steps; thus, the first time students got to solve answer steps was on the remedial problem set.<br />
**Adaptive Fading of Worked Examples Condition: This experimental condition is similar to the All Examples condition but differs in that the filled-in answer steps are faded, with the fading based on the individual student's performance on both the answer and the reason steps.<br />
<br />
<br />
*''Findings''<br />
**The study is still in progress.<br />
<br />
<br />
=== Summary of Findings and Explanation ===<br />
This project belongs to the interactive communication cluster because it investigates a variation in the relative contributions of the system and the learner: who provides the solutions to the initial solution steps? <br />
<br />
More specifically, this study is about changes in path choices that occur when a tutoring system includes partially worked examples. The basic idea is that when a tutor relieves a student of most of the work in generating a line by providing part of it, the student is more likely to engage in deep learning to fill in the rest. However, the instruction must be engineered so that students still become autonomous problem solvers who can eventually do all the work themselves.<br />
<br />
In the first German laboratory study, the standard Cognitive Tutor was compared with an example-enriched Cognitive Tutor. While no effects on procedural and conceptual knowledge transfer items were found, the students working with the example-enriched Tutor completed the curriculum faster than the students in the standard Cognitive Tutor. Using the learning time to measure the condition efficiency showed that the example-enriched Tutor obtained higher learning efficiency on the transfer test. Since the German students were inexperienced with the Cognitive Tutor, more detailed instructions were provided in the follow up study. Consequently, the students working on the example-enriched Tutor showed a higher gain on the conceptual knowledge items of the transfer test than the students working with the standard Tutor. Furthermore, similar to the first study the example-enriched Tutor led to significantly shorter learning time than the standard Cognitive Tutor. Lastly, using the learning time to measure efficiency revealed higher learning efficiency for the example-enriched Tutor on the conceptual knowledge items of the transfer test.<br />
In terms of the robust learning framework, these results show that worked-out examples lead to the same level of foundational skills in less time. Furthermore, the second study shows that fading worked-out examples can improve sense-making, which in turn leads to more robust learning.<br />
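The efficiency comparisons described above can be sketched as follows. This is a minimal illustration assuming the standard instructional-efficiency formula of Paas & van Merriënboer (1993), with learning time substituted for the mental-effort rating as in the studies above; the data values are invented for illustration and are not the studies' results.<br />

```python
# Sketch of the instructional-efficiency measure (Paas & van Merriënboer, 1993),
# with learning time standing in for mental effort. Data are invented.
from statistics import mean, pstdev

def z_scores(xs):
    """Standardize a list of values using the population SD."""
    m, s = mean(xs), pstdev(xs)
    return [(x - m) / s for x in xs]

def efficiency(performance, time):
    """E = (z_performance - z_time) / sqrt(2): higher performance
    achieved in less learning time yields higher efficiency."""
    zp = z_scores(performance)
    zt = z_scores(time)
    return [(p - t) / 2 ** 0.5 for p, t in zip(zp, zt)]
```

Under this measure, a condition that reaches the same transfer performance in less time (as the example-enriched Tutor did) receives a higher efficiency score.<br />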
<br />
<br />
<br />
=== Annotated bibliography ===<br />
Salden, R. J. C. M., Aleven, V., Renkl, A., & Wittwer, J. (2006). Does Learning from Examples Improve Tutored Problem Solving? In 2006 Proceedings of the 28th Annual Meeting of the Cognitive Science Society (p. 2602), Vancouver, Canada. [http://www.learnlab.org/uploads/mypslc/publications/salden.pdf Link to paper]<br />
<br />
Presentation to the PSLC Advisory Board, Fall 2006.<br />
<br />
Schwonke, R., Wittwer, J., Aleven, V., Salden, R. J. C. M., Krieg, C., & Renkl, A. (2007). Can tutored problem solving benefit from faded worked-out examples? Paper presented at The European Cognitive Science Conference 2007, May 23-27. Delphi, Greece. [http://www.learnlab.org/uploads/mypslc/publications/eurocogsci%202007%20fading%20of%20woe%20in%20cogtutor_rs_jw_ar_rs_jw_rs.doc Link to paper]<br />
<br />
Salden, R., Aleven, V., & Renkl, A. (2007). Can tutored problem solving be improved by learning from examples? Proceedings of the 29th Annual Conference of the Cognitive Science Society (p. 1847), Nashville, USA.<br />
<br />
Salden, R., Aleven, V., Renkl, A., & Schwonke, R. (2008). Worked Examples and the Assistance Dilemma. Paper presented at The American Educational Research Association 2008, March 23-27. New York, USA. [http://www.learnlab.org/research/wiki/images/AERA_2008_paper_-_Salden%2C_Aleven%2C_Renkl_%26_Schwonke.pdf Link to paper]<br />
<br />
Salden, R., Aleven, V., Schwonke, R., & Renkl, A. (2008, June). Are Worked Examples and Tutored Problem Solving Synergistic Forms of Support? Poster presented at the 8th International Conference of the Learning Sciences (ICLS).<br />
[http://www.learnlab.org/research/wiki/images/6/60/ICLS_2008_poster_Salden%2C_Aleven%2C_Schwonke_%26_Renkl.pdf Link to paper]<br />
<br />
Salden, R., Aleven, V., Renkl, A., & Schwonke, R. (2008, July). Worked Examples and Tutored Problem Solving: Redundant or Synergistic Forms of Support? Paper presented at the 30th Annual Meeting of the Cognitive Science Society, July 23-26. Washington DC, USA.<br />
[http://www.learnlab.org/research/wiki/images/e/e8/Fp495-salden.pdf ''Winner of the "Cognition and Student Learning" award'']<br />
<br />
Schwonke, R., Renkl, A., Krieg, C., Wittwer, J., Aleven, V., & Salden, R. J. C. M. (2009). The Worked-example Effect: Not an Artefact of Lousy Control Conditions. ''Computers in Human Behavior, 25'', 258-266.<br />
<br />
Salden, R. J. C. M., Aleven, V. A. W. M. M., Renkl, A., & Schwonke, R. (2009). Worked examples and tutored problem solving: Redundant or synergistic forms of support? ''Topics in Cognitive Science, 1'', 203-213.<br />
<br />
Salden, R. J. C. M., Aleven, V. A. W. M. M., Schwonke, R., & Renkl, A. (2009). ''The expertise reversal effect and worked examples in tutored problem solving''. Manuscript submitted for publication.<br />
<br />
<br />
<br />
=== References ===<br />
Anderson, J. R., Corbett, A. T., Koedinger, K. R., & Pelletier, R. (1995). Cognitive tutors: Lessons learned. ''The Journal of the Learning Sciences, 4'', 167-207.<br />
<br />
Corbett, A. T., & Anderson, J. R. (1995). Knowledge tracing: Modeling the acquisition of procedural knowledge. ''User Modeling and User-Adapted Interaction, 4'', 253-278.<br />
<br />
Koedinger, K. R., Anderson, J. R., Hadley, W. H., & Mark, M. A. (1997). Intelligent tutoring goes to school in the big city. ''International Journal of Artificial Intelligence in Education, 8'', 30-43.<br />
<br />
Paas, F., & van Merriënboer, J.J.G. (1993). The efficiency of instructional conditions: An approach to combine mental-effort and performance measures. ''Human Factors, 35'', 737-743.<br />
<br />
Renkl, A., & Atkinson, R. K. (in press). Cognitive skill acquisition: Ordering instructional events in example-based learning. In F. E. Ritter, J. Nerb, E. Lehtinen, & T. O’Shea (Eds.), ''In order to learn: How ordering effects in machine learning illuminate human learning and vice versa''. Oxford, UK: Oxford University Press.<br />
<br />
Sweller, J., van Merriënboer, J. J. G., & Paas, F. G. (1998). Cognitive architecture and instructional design. ''Educational Psychology Review, 10'', 251-296.<br />
<br />
<br />
</div>Kamearobinsonhttp://learnlab.org/research/wiki/index.php?title=DiBiano_Personally_Relevant_Algebra_Problems&diff=12162DiBiano Personally Relevant Algebra Problems2011-08-31T09:56:20Z<p>Kamearobinson: </p>
<hr />
<div>== The Effect of Context Personalization on Problem Solving in Algebra ==<br />
''Candace Walkington (DiBiano), Anthony Petrosino, Jim Greeno, and Milan Sherman''<br />
<br />
=== Summary Tables ===<br />
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"<br />
| '''PIs''' || Candace Walkington & Anthony Petrosino<br />
|-<br />
| '''Other Contributors''' || <br />
* Graduate Student: Milan Sherman<br />
* Staff: Jim Greeno<br />
|}<br />
<br><br />
'' Pilot Study ''<br />
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"<br />
| '''Study Start Date''' || September 2008<br />
|-<br />
| '''Study End Date''' || May 2009<br />
|-<br />
| '''Study Site''' || Austin, TX<br />
|-<br />
| '''Number of Students''' || ''N'' = 24<br />
|-<br />
| '''Average # of hours per participant''' || 2 hrs.<br />
|}<br />
<br><br />
<br />
'' In Vivo Study ''<br />
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"<br />
| '''Study Start Date''' || October 2009<br />
|-<br />
| '''Study End Date''' || April 2010<br />
|-<br />
| '''LearnLab Site''' || Hopewell High<br />
|-<br />
| '''LearnLab Course''' || Algebra<br />
|-<br />
| '''Number of Students''' || ''N'' = 111<br />
|-<br />
| '''Average # of hours per participant''' || 3 hr<br />
|-<br />
| '''Data in DataShop''' || Yes - Personalization Hopewell 2010<br />
|}<br />
<br><br />
<br />
=== Abstract ===<br />
<br />
In the original development of the PUMP Algebra Tutor (PAT), teachers had designed the algebra problem scenarios to be "culturally and personally relevant to students" (Koedinger, 2001). However, observations and discussions with teachers have suggested that some of the Cognitive Tutor problem scenarios may be disconnected from the lives and experiences of many students. This study investigated whether students’ personal interest in story contexts affects performance and [[robust learning]]. <br />
<br />
The first stage of this research was a pilot study of the personal interests of students at an urban Texas high school. Freshman algebra students were surveyed and interviewed about their out-of-school interests, and were also asked to describe how they use mathematics in their everyday lives. Twenty-four of these students solved a number of Cognitive Tutor Algebra-style problems while thinking aloud. Results of this pilot study were analyzed with qualitative data analysis methods to critically examine the idea that personalization of story problems has the potential to support student learning. <br />
<br />
The second stage of this research was an “in vivo” study that took place in Fall of 2009 at a Pennsylvania Learnlab site. Based on the results of the pilot study and additional student surveys from Pennsylvania, the 27 problems in Section 5 ("Linear Models and Independent Variables") of the [[cognitive tutor|Cognitive Tutor]] software were rewritten to each have 4 “personalized” versions corresponding to different student interests. The [[cognitive tutor|Cognitive Tutor]] software was programmed to give participating students an initial interests survey, and then select problem scenarios that match their interests. The resulting [[robust learning]], measured by a delayed post-test (measuring long-term retention), and mastery of knowledge components in a future section (measuring transfer), has been analyzed with a 2-group design (experimental vs. control) to measure the effect of [[personalization]] on learning. Measures from within Section 5 were also analyzed to measure the effect of personalization on performance.<br />
<br />
=== Background and Significance ===<br />
<br />
This research direction was initiated by the observation of classrooms in Texas using the [[cognitive tutor|Cognitive Tutor]] Algebra I software, as well as discussions with teachers who had implemented this software at some point in their teaching practice. Teachers explained that their urban students found problems about harvesting wheat “silly,” “dry,” and irrelevant. Teachers also complained that some of the vocabulary words in the [[cognitive tutor|Cognitive Tutor]] problem scenarios (one example was the word "greenhouse") confused their students because urban freshmen do not typically discuss these topics in their everyday speech. A review of the literature showed limited evidence for the potential of relevant story contexts to increase learning, and little research had been done at the secondary school level. This study is designed to empirically test the claim that the personal relevance of story problems affects [[robust learning]] and performance. <br />
=== Theoretical Framework===<br />
<br />
This study is situated in the new “Motivation and Metacognition” thrust. The foundation of this study is that the relevance of problem scenarios affects robust learning through increased intrinsic motivation (Cordova & Lepper, 1996). For learners who have the cognitive capacity to solve algebra story problems, enhancing motivation may make them more likely to exert effort to make sense of the scenarios by forming more elaborated and better-connected situation and problem models (Nathan, Kintsch, & Young, 1992), thus encouraging generative processing (Mayer, 2011). Mayer (2011) states the personalization principle as “People learn better when the instructor uses conversational style rather than formal style” (p. 70). Here, we use the PSLC’s modified version of this principle, which states “Matching up the features of an instructional component with students' personal interests, experiences, or typical patterns of language use, will lead to more robust learning through increased motivation, compared to when instruction is not personalized.” This is related to what Mayer (2011) refers to as the “Anchoring” principle.<br />
<br />
The construct through which personalization enhances intrinsic motivation is increased personal interest (also called individual interest). Personal interests are considered to be stable, enduring preferences that individual learners bring with them to different situations (Anderman & Anderman, 2010). Interest promotes more effective processing of information and greater cognitive engagement. Students who have high interest may be more likely to relate new knowledge to prior knowledge and form more connections between ideas. They also may be more likely to generate inferences, examples, and applications relating to the subject area they are trying to learn (Ormrod, 2008).<br />
<br />
=== Pilot Study===<br />
<br />
The first stage of this research began in Fall of 2008 with a pilot study of personalization at an "Academically Unacceptable" school in Texas (75% free/reduced lunch). Twenty-four freshman algebra students were interviewed about their out-of-school interests, such as sports, music, movies, etc., and were also asked to describe how they use mathematics in their everyday lives. These interviews were audio recorded, and were used to write each student “personalized” algebra story problems. The research questions being investigated were:<br />
<br />
* What is the impact of personalizing algebra story problems to individual student experiences, in terms of strategy use, language comprehension, and students’ epistemological frames about mathematical activity? (qualitative)<br />
<br />
* How does personalizing algebra story problems to individual experiences impact student performance, when compared to their performance on normal story problems from the Cognitive Tutor curriculum with the same underlying structure? (quantitative)<br />
<br />
A problem set containing five algebra problems on linear functions was written for each student; two of these were story problems that were personalized to the ways in which the individual student described using mathematics in their everyday life during their initial interview. The problem set also contained normal story problems from the Cognitive Tutor curriculum, completely abstract symbolic equations, story problems that contained symbolic equations, and story problems with simplified language and general referents (“generic” story problems). Each problem had four parts – the first two parts were “Result Unknowns” or “concrete cases” (i.e. solve for y given this x), and the fourth and final part was a “Start Unknown” (i.e. solve for x given this y). For normal, personalized, and generic problems, the third part of each problem asked students to write a general symbolic equation or “algebra rule” representing the story. For normal story problems that already contained equations, students were asked to interpret the parameters in terms of the story. For completely abstract symbolic problems, students were asked to write a story that could go with the equation.<br />
<br />
Each of the 24 students was given their problem set of 5 problems, and asked to solve each problem while “thinking aloud” and being audio recorded. Transcripts and student work were blocked such that one block was one student working one part of one problem. Blocks were coded with strategies, mistakes, and other issues the students had solving story problems (like reading issues); kappa values of 0.79 or higher were obtained using 2 coders.<br />
<br />
Results showed that students regularly used informal, arithmetic approaches to solve result and start unknown story problems, especially when the problem had been personalized. Personalized problems had the lowest “No Response” rate (1% No Response), the highest use of informal strategies (80% of time), and students overwhelmingly perceived personalized problems as being “easiest” when asked (82% of time). Personalized problems also had higher success rates and lower student use of “non-coordinative” strategies where situational reasoning was not well-connected to formal problem-solving computations. When asked why they were given story problems in algebra class, students described how these problems would help them in the real world and in the workplace.<br />
<br />
However, personalized problems still had a relatively high overall use of non-coordinative approaches (16% of time), and students also struggled with reading on personalized problems at similar rates to other problems (also 16% of time; some overlap). Students’ overwhelming use of informal strategies when solving personalized problems could be framed as problematic in a course where the overall goal is to have students use symbolic equations as representational tools. Finally, there was evidence that students still sometimes epistemologically framed personalized problems as “school mathematics” tasks, disconnected from their lived experiences.<br />
<br />
Quantitative analyses specifically aimed at comparing performance on personalized story problems versus normal story problems were carried out, replicating the methodology of Koedinger & Nathan (2004). Students solved personalized problems correctly 61% of the time overall, and normal story problems correctly 45% of the time overall. However, using two 2-factor mixed-model ANOVAs that treated students (ANOVA 1) and items (ANOVA 2) as random effects, no statistically reliable overall differences in performance were found between normal and personalized problems. “Items” in this case describe the underlying mathematical structure of the story problem – e.g., the story described the equation “y=4x+11.” The two ANOVAs were repeated using only the hardest items, and using only the weakest students, and statistically reliable (p<.05), positive effects were found for personalization. The effect size (Cohen’s d) was 0.9 for the hardest problems and 1.5 for the weakest students.<br />
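For readers unfamiliar with the effect-size measure reported above, Cohen's d can be sketched as follows. This is a generic illustration using the common pooled-standard-deviation form; the sample values are invented for the test case and are not the study's data.<br />

```python
# Illustrative computation of Cohen's d (standardized mean difference),
# the effect-size measure reported above. Sample values are invented.
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """d = (mean_a - mean_b) / pooled SD, using the pooled sample SD."""
    na, nb = len(group_a), len(group_b)
    sa, sb = stdev(group_a), stdev(group_b)
    pooled = (((na - 1) * sa ** 2 + (nb - 1) * sb ** 2) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled
```

By the conventional benchmarks (0.2 small, 0.5 medium, 0.8 large), the reported values of 0.9 and 1.5 are large effects.<br />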
<br />
These results need to be interpreted with caution, as this was a small sample size (24 students), the personalization was done at a level of correspondence to real experiences that a computer could not replicate, and this was a population of students who overall were especially weak in mathematics.<br />
<br />
=== Research Questions for In Vivo Study===<br />
<br />
* How will performance and time on task be affected when [[personalization]] through relevant problem scenarios is implemented instead of the current problem scenarios in the [[cognitive tutor|Cognitive Tutor]] Algebra I software?<br />
* How will [[robust learning]] be affected when [[personalization]] through relevant problem scenarios is implemented instead of the current problem scenarios in the [[cognitive tutor|Cognitive Tutor]] Algebra I software?<br />
<br />
=== Independent Variables for In Vivo Study ===<br />
<br />
This experiment will manipulate the level of [[personalization]] through a two-group design:<br />
*Control: Students who receive current Cognitive Tutor Algebra story problems for Unit 5<br />
*Experimental: Students who receive problems that have the same mathematical structure, but whose cover stories are personalized to individual students based on an interests survey<br />
<BR><br />
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"<br />
| '''Treatment'''|| '''Example Problem''' || '''Received By'''<br />
|-<br />
| Normal Cognitive Tutor Algebra problem scenarios || A skier noticed that she can complete a run in about 30 minutes. A run consists of riding the ski lift up the hill, and skiing back down. If she skis for 3 hours, how many runs will she have completed? || 54 randomly-assigned Algebra I students at Learnlab site<br />
|-<br />
| [[personalization|Personalized]] problem scenarios || (student selects personal interest in T.V. shows, cultural survey/interview shows strong interest among urban youth in reality shows)<br />
You noticed that the reality shows you watch on T.V. are all 30 minutes long. If you’ve been watching reality shows for 3 hours, how many have you watched?<br />
|| 57 randomly-assigned Algebra I students at Learnlab site<br />
|}<br />
<BR><br />
=== Dependent variables for In Vivo Study ===<br />
<br />
[[Robust learning]] was measured through: <br />
*'''''Delayed Post-test''''' measuring [[long-term retention]]<br />
** A pre-test was administered before Unit 5, and a delayed post-test was administered at the end of Unit 6.<br />
* '''''Mastery of knowledge components''''' in the [[cognitive tutor|Cognitive Tutor]] software, including in subsequent units: <br />
**The students’ performance in Unit 7 was also examined, to see if there were performance differences between the experimental and control group even after the treatment was no longer in effect.<br />
<br />
'''Intrinsic Motivation''' will be measured through:<br />
*Hint-seeking and reading behavior in Cognitive Tutor software<br />
*Time on task in Cognitive Tutor software<br />
<br />
=== Hypotheses for In Vivo Study ===<br />
<br />
Students in the treatment with [[personalization|personalized]] problem scenarios will:<br />
<br />
H1) Demonstrate higher levels of correct performance in Section 5<br />
<br />
H2) Show improved “time on task” and fewer instances of “gaming the system” in Section 5<br />
<br />
H3) Show improvement on some measures of [[robust learning]], as measured by pre/delayed post differences and by performance in subsequent sections.<br />
<br />
=== Method for In Vivo Study ===<br />
<br />
Interest surveys were administered to algebra students in Pennsylvania (N=47) and algebra students in Texas (N=29). The surveys contained sections where students ranked their interest in 9 different topics and answered 20 open-response questions about specific topics they were interested in. The algebra students in Texas also participated in one-on-one interviews about their out-of-school interests (part of the pilot study). Based on the results of the surveys and interviews, personally relevant problem scenarios corresponding to current problem scenarios in [[cognitive tutor|Cognitive Tutor]] Algebra I were formulated for Section 5, Linear Models and Independent Variables. Each of the 27 problem scenarios in the selected section was rewritten to have 4 different variations, drawn from the 9 topics students were interested in (sports, music, movies, computers, stores, food, art, TV, games). The personally relevant problems had the same underlying mathematical structure as the original problems, with changes made to the objects or nouns (what the problem is about) in the story and the pronouns (who the problem is about). See the table above for an example of how these changes occurred. The personally relevant problem scenarios were reviewed by two master Algebra I teachers for language and clarity and were modified based on teacher feedback.<br />
<br />
The new problem scenarios were integrated into Unit 5 of the [[cognitive tutor|Cognitive Tutor]] Algebra software at the high school site with the cooperation of Carnegie Learning. 111 students at the school site were randomly assigned to either the experimental group (personalized problems) or the control group (normal problems). The experiment was in-sequence, meaning that all students encountered Section 5 at their own pace (i.e., at the time they naturally reached that point in the software). Immediately before students entered Unit 5, they were prompted to answer an interest survey in which they ranked their level of interest in the 9 different topics, and took a pre-test where they solved two multi-part normal story problems. After the students completed Unit 6, they were given a delayed post-test.<br />
<br />
===Results===<br />
<br />
H1) Students receiving personalized problems will demonstrate higher levels of performance in Unit 5 than students receiving normal problems.<br />
<br />
In order to test this hypothesis, a logistic regression model was formulated with the following properties. The unit of analysis was one student solving one part of one problem.<br />
<br />
* Dependent Variable – whether the student got the problem part correct on their first attempt, without asking for a hint.<br />
* Random Effects – the student ID, the item (linear function underlying the problem), and the problem name (which personalized version the student was given, or which set of numbers the student was given for result and start unknowns)<br />
* Fixed Effects – Condition (whether the student was in the experimental or control group) and what knowledge component was covered by the problem part<br />
<br />
Each of these effects significantly improved the model. Interactions did not significantly improve the model. The main effect for the treatment (personalization) was statistically significant at the 5% level. Personalization had a positive overall effect on student performance. The size of the overall impact of personalization on performance was around 5.3%. If a student had a 50% base chance of getting a problem correct on the first attempt, personalization would increase that chance to 55.3%.<br />
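The 50% to 55.3% conversion above follows from how a logistic-regression coefficient shifts log-odds. The sketch below back-derives the implied coefficient from the reported probabilities; the coefficient is an illustration, not a value taken from the study's fitted model.<br />

```python
# How a logistic-regression coefficient shifts a baseline probability.
# The coefficient below is back-derived from the reported 50% -> 55.3%
# change; it is not taken from the study's fitted model.
import math

def shifted_probability(base_p, coef):
    """Add a log-odds coefficient to a baseline probability, then
    convert back to a probability via the logistic function."""
    logit = math.log(base_p / (1 - base_p)) + coef
    return 1 / (1 + math.exp(-logit))

# Implied treatment coefficient: log-odds at 55.3% minus log-odds at 50%.
coef = math.log(0.553 / 0.447)  # roughly 0.21
```

Applying the same coefficient to other baselines shows the probability shift is largest for baselines near 50%, which is where the logistic curve is steepest.<br />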
<br />
Although interaction terms were not significant in this model, this appeared to reflect a lack of statistical power combined with the many additional parameters introduced when interactions were modeled. A second model was therefore specified in which the knowledge components were classified as easy, medium, and hard, and here there was a significant condition-by-knowledge-component interaction. Personalization had a significantly larger positive impact on the two most difficult knowledge components, relating to writing symbolic expressions, than on the medium-difficulty knowledge components. For the most difficult knowledge components, personalization increased success rates from 50% to 58%.<br />
<br />
More results coming soon.<br />
<br />
<br />
=== References ===<br />
<br />
Anderman, E., & Anderman, L. (2010). Classroom Motivation. Pearson: Columbus, OH.<br />
<br />
Clark, R. C. & Mayer, R. E. (2003). E-Learning and the Science of Instruction. Jossey-Bass/Pfeiffer.<br />
<br />
Cordova, D. I. & Lepper, M. R. (1996). Intrinsic Motivation and the Process of Learning: Beneficial Effects of Contextualization, Personalization, and Choice. Journal of Educational Psychology, 88(4), 715-730.<br />
<br />
[[REAP_Study_on_Personalization_of_Readings_by_Topic_%28Fall_2006%29|Eskenazi, M.; Juffs, A., Heilman, M., Collins-Thompson, K., Wilson, L., & Callen, J. (2006). REAP Study on Personalization of Readings by Topic (Fall 2006). The PSLC Wiki. Retrieved June 21, 2007, from http://www.learnlab.org]]<br />
<br />
Koedinger, K. R. (2001). Cognitive tutors as modeling tool and instructional model. In Forbus, K. D. & Feltovich, P. J. (Eds.) Smart Machines in Education: The Coming Revolution in Educational Technology. Menlo Park, CA: AAAI/MIT Press.<br />
<br />
Nathan, M., Kintsch, W., & Young, E. (1992). A theory of algebra-word-problem comprehension and its implications for the design of learning environments. Cognition and Instruction, 9(4), 329-389.<br />
<br />
Ormrod, J. (2008). Human Learning. Pearson/Merrill/Prentice Hall: Columbus, OH.<br />
<br />
Mayer, R. (2011). Applying the Science of Learning. Pearson.<br />
<br />
[[Stoichiometry_Study|McLaren, B., Koedinger, K., & Yaron, D. (2006). Studying the Learning Effect of Personalization and Worked Examples in the Solving of Stoichiometry Problems. The PSLC Wiki. Retrieved June 21, 2007, from http://www.learnlab.org]]<br />
<br />
<br />
[[Category:Study]]<br />
<br />
<br />
</div>Kamearobinson