DataShop 3.x Features
- v3.1 (November 2008)
- v3.2 (February 2009)
- v3.3 (March 2009)
- v3.4 (June 2009)
- v3.5 (August 2009)
- v3.6 (October 2009)
v3.1 (November 2008)
Learning Curve Point Info [Actual: 6 weeks?]
As a researcher exploring learning curves, I'd like to see more information about a data point.
- This is the first half of a feature to drill down on points in the learning curve.
v3.2 (February 2009)
CFG Stored Procedure [Estimate: 6 weeks]
As a system administrator of DataShop, I want the CFG to run much faster so that log conversion completes within a couple of hours rather than days, and users can see new data on a daily basis.
- DS766: (Speed: use stored procedures to speed up the CFG)
SSSS Creation Speed-up [Estimate: 2 weeks]
As a researcher using LFA, I want LFA to run on more KCMs, so that I can compare BICs for different KCMs.
- DS792: (LFA: SSSS generation throws an OutOfMemory error.)
- I noticed that an LFA run got an 'out of memory error' before even getting to Hao's code. -- Alida, 11/21/2008
v3.3 (March 2009)
Sample Creation Speed-up [Estimate: 6 weeks]
As a DataShop user, I want to be able to create new samples on big datasets so that I can analyze the data more easily.
v3.4 (June 2009)
Learning Curve Point Info Details [Estimate: 6 weeks]
As a researcher exploring learning curves, I want to be able to easily go from a data point in a curve to a list of the problems, steps, students, and KCs that produced that point so that I can better analyze the reasons for a ragged curve.
- "Being able to easily go from a data point in a learning curve to a list of the problems (could be only a single problem!) or problem steps that produced that point. Would help in analyzing the reasons for ragged curves and in improving a cognitive model. Probably, it would not be hard to do the same analysis in Excel, after a Data Shop export, assuming that each line in the Excel indicates the skill-opportunity-number. So – maybe that argues against implementing this in the Data Shop."
- "How do I find the KC for a specific opportunity; for example when there is a spike in the learning curve, how do I identify that KC?" -- Kirsten Butcher, Winter Workshop 1/23/2008
User Export Speed-up [Guesstimate: 4 weeks]
As a potential new user of DataShop, I want to be able to preview the export data on a very large dataset quickly, so that I can decide if it's worth exporting.
- Use the new stored procedures created for the CFG from the web application.
- This means we need to allow multiple exports to run in parallel, for example by generating and loading the stored procedure per request so that the temporary tables have unique names.
- Also do some benchmarking so we understand whether the stored procedure's running time is linear in the number of transactions, and what our threshold is for how many can run simultaneously.
- The progress bar will have to be more of a guesstimate.
- What if two different users request the same sample?
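The per-request isolation described above can be sketched as follows. The table-naming scheme and the sqlite3 backend are assumptions for illustration only; the production system would use MySQL stored procedures.

```python
import sqlite3
import uuid

def export_sample(conn, sample_id, transactions):
    """Run one export using a temp table with a unique name, so that
    concurrent exports (even for the same sample) cannot collide."""
    # Unique suffix per request, even if two users request the same sample.
    table = f"export_tmp_{uuid.uuid4().hex}"
    cur = conn.cursor()
    cur.execute(f"CREATE TEMPORARY TABLE {table} (tx_id INTEGER, sample_id INTEGER)")
    cur.executemany(
        f"INSERT INTO {table} VALUES (?, ?)",
        [(tx, sample_id) for tx in transactions],
    )
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    (count,) = cur.fetchone()
    cur.execute(f"DROP TABLE {table}")
    return count

conn = sqlite3.connect(":memory:")
# Two requests against the same sample each get their own temp table.
print(export_sample(conn, 42, [1, 2, 3]))  # 3
print(export_sample(conn, 42, [4, 5]))     # 2
```

Because the temp-table name is derived from a per-request UUID, benchmarking parallel runs (as noted above) only needs to worry about resource contention, not name collisions.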
As someone trying out DataShop for the first time and unfamiliar with the datasets, I'd like to receive some guidance on which dataset to try so that I can explore DataShop without being lost or hindered by a large, slow dataset.
- aka the Susan Goldman story: she started with an Algebra dataset, as it was public and alphabetically first. The web app was not responsive, and she didn't know what to do with DataShop.
- Also see related "feedback" story below
- aka "Getting Started" datasets
- Might include "Geometry Area 1996-1997"
v3.5 (August 2009)
Step Duration [Estimate: 3 weeks]
This combines three user stories into one document.
Change "Assistance Time" to "Step Duration"
As a researcher using DataShop, I want "Assistance Time" to capture only time spent on this step, excluding time spent on other steps that occurred between the first and last transactions for this step, and have it renamed so that I have a more accurate measure and name for this step. -- Phil, Bob, email thread, Nov/Dec 2008
- Calculate transaction durations so that time spent on other steps is not double-counted
- Rename "Assistance Time" to "Step Duration"
- Rename "Correct Step Time" to "Correct Step Duration"
- Evidence from Jack Mostow at DS v3.0 Release Event on Oct 30. Jack mentioned that "assistance time" could also capture time when there was no assistance.
- Recommended by Phil in email thread from Nov/Dec 2008
- Recommended by Bob in phone conversation on Jan 5 2009. Bob would use the "time spent on step" to calculate "time spent on KC" (a KC rollup).
- Ryan, Brett and Alida met on Dec 15, 2008 and agreed that finding the duration of each transaction would enable us to find time spent on a step (without double-counting).
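The no-double-counting rule the team agreed on could be sketched as follows. Attributing each transaction's duration to the gap before the student's next transaction is one plausible rule, not necessarily the exact one adopted:

```python
from datetime import datetime, timedelta

def step_durations(transactions):
    """Compute per-step duration by attributing each transaction's
    duration (the gap to the student's next transaction, on any step)
    to that transaction's own step. Interleaved work on other steps is
    therefore never double-counted.

    `transactions` is a list of (timestamp, step_name) tuples for one
    student, assumed sorted by timestamp. The last transaction
    contributes no duration because no following event bounds it.
    """
    totals = {}
    for (t, step), (t_next, _) in zip(transactions, transactions[1:]):
        totals[step] = totals.get(step, timedelta()) + (t_next - t)
    return totals

# Student works on step A, switches to B, then returns to A.
ts = lambda m: datetime(2009, 1, 5, 10, m)
txs = [(ts(0), "A"), (ts(2), "B"), (ts(5), "A"), (ts(6), "A")]
print(step_durations(txs))
# Step A gets 2 min + 1 min = 3 min; step B gets 3 min; no overlap.
```

Under the old "Assistance Time" (first to last transaction of a step), step A above would have been charged 6 minutes, including the 3 minutes actually spent on B.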
As a researcher, I want the step rollup columns renamed or changed so that I understand these columns (without calling you). -- Bob Hausmann, phone call, 01/2009
- Order time-related columns in student-step rollup as follows:
- Step Start Time
- First Transaction Time
- Correct Transaction Time [this is a new field; it can be null]
- Step End Time
- Remove "Step Time" from student-step rollup
- Optionally, rename "Step Time" in code to "Correct or Last Transaction Time"
Add dependent variable "Error Step Duration" to learning curve and step rollup
As a researcher concerned with step latencies, I want an "Error Time" variable--the total time on the step when the first attempt was an incorrect attempt or a hint request, covering all steps that did not qualify as a "correct time"--so that I can get step latency for correct-first-attempt and error-first-attempt student-steps, with every student-step falling into one of these categories. -- Phil Pavlik, email thread, Nov/Dec 2008
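A minimal sketch of the two-bucket partition described above; the attempt labels ("correct", "incorrect", "hint") are assumed names for illustration:

```python
def latency_buckets(student_steps):
    """Split total step latency into 'correct' vs 'error' buckets based
    on the first attempt, so the two measures partition all
    student-steps. A hint request counts as an error."""
    buckets = {"Correct Step Duration": 0.0, "Error Step Duration": 0.0}
    for first_attempt, seconds in student_steps:
        key = ("Correct Step Duration" if first_attempt == "correct"
               else "Error Step Duration")
        buckets[key] += seconds
    return buckets

# (first attempt, step duration in seconds) per student-step
steps = [("correct", 12.0), ("incorrect", 30.5), ("hint", 18.0)]
print(latency_buckets(steps))
# {'Correct Step Duration': 12.0, 'Error Step Duration': 48.5}
```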
Change Sort of KC Models [Actual: .5 days]
As a researcher comparing KC Models, I want the sort to be by BIC or BIC within current groupings, whichever is easier, so that I can compare models easily.
- This does not involve UI changes which would be part of the 'KC Model Sort' user story.
- Don't do if it takes longer than one day. -- Ken Koedinger, team meeting, Nov 7, 2008
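The sort could look like the following sketch; model names, grouping labels, and BIC values are made up for illustration (lower BIC indicates a better fit):

```python
def sort_kc_models(models):
    """Sort KC models by BIC within their current grouping, keeping
    groups together and putting better-fitting (lower-BIC) models
    first within each group."""
    return sorted(models, key=lambda m: (m["group"], m["bic"]))

models = [
    {"name": "Item",          "group": "hand-built", "bic": 5200.1},
    {"name": "KTracedSkills", "group": "hand-built", "bic": 4980.7},
    {"name": "Single-KC",     "group": "auto",       "bic": 6100.0},
]
for m in sort_kc_models(models):
    print(m["name"], m["bic"])
```

Sorting by the (group, BIC) tuple keeps the current groupings intact, which matches the "BIC within current groupings" option in the story.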
v3.6 (October 2009)
Web Services Feature (Authentication, Get Dataset Metadata, Get Sample Metadata)
As a researcher (EDM, M&M, CMDM), I want to use DataShop's web services to get information about what datasets and samples are available, so that I can retrieve transaction or step-rollup data programmatically from DataShop.
- The goal of DataShop web services is to provide a way for researchers with a background in programming to enable their program or web site to retrieve DataShop data and (eventually) insert data back to the central repository. We've created the start of such a service--right now, the service allows you to authenticate with DataShop, and retrieve metadata about datasets and samples in DataShop. Coming next will be the ability to retrieve transaction and step-level data. -- Brett, v3.6.8 release notes, October 2009
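A sketch of how a client might construct an authenticated metadata request. The base URL, string-to-sign, and header format below are assumptions in the general style of HMAC-signed web services, not DataShop's documented scheme:

```python
import base64
import hashlib
import hmac
from datetime import datetime, timezone

# Assumed base path for the service; not taken from DataShop docs.
BASE_URL = "https://pslcdatashop.web.cmu.edu/services"

def sign_request(method, path, access_key, secret_key):
    """Build headers for a signed request to a metadata endpoint such
    as /datasets. The signature covers the method, date, and path so a
    replayed or altered request would fail verification."""
    date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    to_sign = "\n".join([method, date, path])
    digest = hmac.new(secret_key.encode(), to_sign.encode(), hashlib.sha1).digest()
    signature = base64.b64encode(digest).decode()
    return {
        "Date": date,
        "Authorization": f"DATASHOP {access_key}:{signature}",
    }

headers = sign_request("GET", "/datasets", "my-access-key", "my-secret-key")
print(headers["Authorization"])
```

A client would attach these headers to an HTTP GET for `BASE_URL + "/datasets"` and parse the returned dataset metadata; the response format is not shown here.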
The Logging Activity report provides a logging diagnostic by displaying counts of all recent log messages received by the logging server, organized by dataset and student session. It is intended for use by technical researchers or staff who are in the process of verifying logging activity from a study site.