Composition Effect Kao Roll - old, please keep

From LearnLab
== The Composition Effect - What is the Source of Difficulty in Problems which Require Application of Several Skills? ==

Ido Roll, Yvonne Kao, Kenneth E. Koedinger
  
 
=== Abstract ===
Composite problems, i.e., problems that require the application of more than one skill, have been shown to be harder than a collection of single-step problems that together require the same set of skills. A common explanation is that the composition itself imposes an additional level of difficulty. An alternative explanation suggests that the composition makes the application of the individual skills harder: poor feature validity and shallow domain rules make it harder for students to apply the individual skills correctly in the cluttered environment of composite problems, regardless of the need to apply additional skills. Our study investigates these issues in two ways: (1) using a Difficulty Factor Analysis (DFA) that evaluates performance on composite problems and on single-step problems using the same data, and (2) evaluating the effect of instruction that targets a common misconception in single-step problems on performance on composite problems.
  
 
=== Glossary ===
- Composite problems: Problems which require the application of several skills, such as solving 3x+6=0 for x.

- Single-step problems: Problems which require the application of a single skill, such as y+6=0 or 3x=-6.

- DFA (Difficulty Factor Analysis): A test that includes pairs of items varying along one dimension only. It makes it possible to evaluate the difficulty imposed by each single dimension along which the problems differ.

- The Composition Effect: The finding that composite problems are harder than a set of single-step problems using the same skills.
  
  
 
=== Research question ===
What is the main source of difficulty in composite problems?
  
  
 
=== Background and Significance ===
Significance: This study can shed light on the source of difficulty in composite problems, and thus can inform the design of relevant instruction and remediation.
  
 
=== Independent Variables ===
Instruction in the form of a solved example, targeting a common misconception: identifying the base and height in a cluttered environment.
  
 
=== Dependent variables ===
Three tests are used in the study:

- Pre-test: given before all instruction.

- Mid-test: given after students have learned about single-step problems and before composite problems.

- Post-test: given after students have learned and practiced all the material.

The tests include the following items. Some of them are [[transfer]] items that evaluate robust learning, since they require an adaptive application of the knowledge learned and practiced in class.
* Simple diagram:
*# no distractors, canonical orientation
*# distractors, canonical orientation
*# no distractors, tilted orientation
*# distractors, tilted orientation
* Complex diagram:
*# Given a complex diagram, ask for skill A
*# Given a complex diagram, ask for skill B
*# Given measures A and B, ask for skill C (which requires A and B)
*# Given a complex diagram, ask for skill C (which requires A and B)
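As a purely hypothetical sketch of how a DFA could use the simple-diagram item pairs above, the snippet below averages the drop in success rate over item pairs that differ along a single dimension. All success rates are invented for illustration, and `factor_effect` is not a function from the study.

```python
# Invented success rates per simple-diagram item type
# (distractors x orientation); NOT data from the study.
rates = {
    ("no_distractors", "canonical"): 0.90,
    ("distractors",    "canonical"): 0.75,
    ("no_distractors", "tilted"):    0.80,
    ("distractors",    "tilted"):    0.60,
}

def factor_effect(rates, factor):
    """Mean drop in success rate attributable to one difficulty factor,
    computed over item pairs that differ only along that factor."""
    if factor == "distractors":
        pairs = [(("no_distractors", o), ("distractors", o))
                 for o in ("canonical", "tilted")]
    else:  # factor == "orientation"
        pairs = [((d, "canonical"), (d, "tilted"))
                 for d in ("no_distractors", "distractors")]
    drops = [rates[easy] - rates[hard] for easy, hard in pairs]
    return round(sum(drops) / len(drops), 3)

print(factor_effect(rates, "distractors"))  # 0.175 (mean of 0.15 and 0.20)
print(factor_effect(rates, "orientation"))  # 0.125 (mean of 0.10 and 0.15)
```

Because every pair differs in exactly one dimension, each difficulty factor can be estimated separately from the same test data.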
  
 
=== Hypothesis ===
# The difficulty of composite problems originates in poor feature validity of the single skills.
#* An operationalized version of this hypothesis is that performance on items of the type "Find measure C based on the diagram" will be equivalent to the product of the success rates on the items "Find measure C based on measures A and B", "Find measure A based on the diagram", and "Find measure B based on the diagram".
# Tilted orientation and distractors still impose difficulty even once students have mastered the skills.
# Direct instruction during the test, targeting these misconceptions in the form of a solved example, can improve performance.
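To make the first hypothesis concrete, here is a small numeric sketch of its operationalized prediction: composite performance should equal the product of the component success rates. The numbers are invented for illustration, not study data.

```python
# Invented success rates for the single-dimension items (NOT study data).
p_A = 0.80           # "Find measure A based on the diagram"
p_B = 0.90           # "Find measure B based on the diagram"
p_C_given_AB = 0.95  # "Find measure C based on measures A and B"

# Hypothesis 1 predicts that the composite item "Find measure C based on
# the diagram" succeeds only when all three component skills succeed:
predicted_composite = round(p_A * p_B * p_C_given_AB, 3)
print(predicted_composite)  # 0.684

# If observed success on the composite item is well below this product,
# the composition itself adds difficulty; if it matches the product, the
# difficulty is fully explained by the component skills.
```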
  
 
=== Findings ===
None yet.
  
 
=== Explanation ===


=== Descendents ===


=== Annotated bibliography ===
Bransford (2000). How People Learn: Brain, Mind, Experience, and School. Washington, DC: National Academy Press.

Heffernan, N.T., & Koedinger, K.R. (1997). The composition effect in symbolizing: The role of symbol production vs. text comprehension. In Proceedings of the Nineteenth Annual Conference of the Cognitive Science Society, 307-312. Hillsdale, NJ: Erlbaum.

Koedinger, K.R., & Anderson, J.R. (1997). Intelligent tutoring goes to school in the big city. International Journal of Artificial Intelligence in Education, 8, 30-43.

Koedinger, K.R., & Cross, K. (2000). Making informed decisions in educational technology design: Toward meta-cognitive support in a cognitive tutor for geometry. Presented at the annual meeting of the American Educational Research Association, New Orleans, LA.

Owen, E., & Sweller, J. (1985). What do students learn while solving mathematics problems? Journal of Educational Psychology, 77, 272-284.

Simon, H.A., & Lea, G. (1974). Problem solving and rule induction: A unified view. In L.W. Gregg (Ed.), Knowledge and Cognition. Hillsdale, NJ: Erlbaum.
  
  
 
[[Category:Empirical Study]]
[[Category:Protected]]

Latest revision as of 14:40, 9 April 2008
