As a parent who watched DreamBox fail to teach my daughter simple mathematical concepts, I don’t trust DreamBox. I have not given the program much thought recently, but reading this teacher’s description of DreamBox on EdSurge brought the memory of my experience back to the forefront of my mind:

The more help a teacher gives the student, the less the program is able to identify what the student truly knows. So, although teachers may be inclined to explain activities to students, this somewhat diminishes the functionality of the program.

Can an algorithm identify what a student really knows? No software should make a teacher feel like they need to get out of the way and let the program do its job. This is not an appropriate use of technology, especially in the early grades. The comment resonates because it matches exactly how I felt watching my own daughter work with DreamBox. Any help I offered moved her through the exercises too quickly and was not welcomed by her.

My final, sincere effort to give DreamBox a chance involved packing boxes. Packing boxes is the abstraction they chose to illustrate base 10 notation. In the activity, students click and drag individual objects to pack them into a box. Once a box is filled, the box moves to another area of the screen, leaving the remaining objects to be packed into another box. Each box holds 10 items. If you have three full boxes plus four items left over, you have 3 in the tens place and 4 in the ones place: 34 items.
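The arithmetic behind the packing-boxes abstraction can be sketched in a few lines. This is only an illustration of the base 10 idea, not DreamBox’s actual code; the function name `pack_boxes` is mine:

```python
def pack_boxes(items: int, box_size: int = 10) -> tuple[int, int]:
    """Split a count of items into full boxes and leftovers.

    With boxes of 10, the number of full boxes is the tens digit
    and the leftover items are the ones digit.
    """
    full_boxes, leftover = divmod(items, box_size)
    return full_boxes, leftover

print(pack_boxes(34))  # -> (3, 4): 3 full boxes, 4 items left over
```

So 34 items pack into 3 full boxes with 4 left over, mirroring the 3 in the tens place and the 4 in the ones place.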

Sounds great, BUT it did not work. My daughter got the abstraction right away. She could pack boxes like nobody’s business. Click-drag, click-drag, click-drag, click-drag, click-drag, click-drag, click-drag, click-drag, click-drag, click-drag. Full box. Move it over. Repeat until you get a star. The problem was that she never recognized the importance of a full box holding exactly 10 items. The assessment was ineffective: her ability to pack boxes allowed her to pass on to the next level without understanding base 10 notation.

Of course, without actually learning what she was supposed to learn, she was completely unable to do the next level of activities. After watching her struggle for a while, I did some adaptive teaching. I turned off the computer and found a different way to make sure she understood base 10 notation.

Abstractions are ubiquitous in education. I use many in my own teaching. When they work, they work well. However, they can also get in the way. I see this all the time. As an instructor, part of my job is to gauge student understanding, recognize when a particular abstraction is not working, and be ready with an alternative.

This is what I did two years ago when DreamBox could not and did not adapt to my daughter’s situation. Younger students do not need adaptive learning; they need adaptive teaching.

Who is working on that?


Thanks for sharing your experience with DreamBox. I’m the Curriculum Director at DreamBox, and my son had the exact same problem a few years ago with the place value lessons. At the time, I was a district math administrator for 18 elementary schools and 10 secondary schools. I was using my son as a guinea pig to see whether DreamBox was worth using with students in my district. The place value lessons are among the most challenging lessons we have. Place value is a complex concept, especially for young students. Students need to understand why 34 is equivalent to 3 tens and 4 ones as well as 2 tens and 14 ones or 1 ten and 24 ones. This depth of conceptual understanding takes time to develop.

It may seem counter-intuitive that DreamBox helps students by asking parents and teachers not to intervene. DreamBox is continuously assessing student thinking and progress, and it’s critical to isolate a student’s own ideas and proficiency. Every mouse click and answer a student submits is a piece of useful evidence about the student’s thinking and understanding. Our technology is able to support teachers as they work tirelessly to differentiate and help all students learn mathematics. If you’d like to talk further about assessment, our place value lessons, or how we transition students from concrete to abstract representations, I’d be happy to follow up with you personally.

Tim Hudson

Director of Curriculum Design

DreamBox Learning

Tim,

Thanks for the comment. I love the idea behind DreamBox. I should have said this in the original post.

However I’d like to see more evidence for its effectiveness.

It is not at all counterintuitive to get out of the way of a program that you trust will do what it claims.

The problem is that your description of how DreamBox works does not match the experience I had watching my daughter use it. She hit a conceptual wall, got frustrated and tuned out.

How much effort has gone into rewriting the place value lessons since I saw them two years ago?

You’re right to look for evidence of effectiveness. Things in ed tech are moving fast and often bring a lot of hype that may or may not be substantiated by measurable improvements in student learning. For DreamBox, our website has a link to some existing effectiveness studies, and other research projects are currently underway. The most compelling evidence thus far is an independent SRI study that used NWEA measures and concluded that students using DreamBox made learning gains equivalent to progressing 5.5 points in percentile ranking in just 16 weeks.

Improving the place value lessons was a priority for us. Regarding your daughter’s experience with that particular lesson, there are a few design principles we use that would be helpful to clarify before discussing the specific changes we made. When we introduce students to a new virtual manipulative, we include a tutorial lesson first. In this tutorial, we ask students to click, drag, etc. and interact with the tool in all of the ways they will need in the actual lessons. Each successful interaction gets a check mark at the top of the screen, and the lesson ends when students complete all of the interactions. There’s no mathematical thinking involved; we use this as a means of making sure the assessment in each lesson is reflective of mathematical proficiency rather than a result of challenges using the manipulative. It’s a validity issue.

Right after the tutorial, the usual next lesson is actually a pre-test that we refer to as a “placement.” It’s designed to determine whether a student already knows the content (and should therefore skip the upcoming lessons). Young students don’t realize these are pre-tests, which is important because we don’t want them to have any added stress that might skew their performance. So these placements look virtually identical to the other lessons. The differences are that they are marked with a star on the lesson map (as opposed to a number), and instead of a ‘hint’ button in the lesson, we provide a “too hard” button so students can exit the lesson. It’s new content, so it likely will be too hard if they don’t already know it. Understandably, then, these placements are the instances in which parent, sibling, or teacher help would have the most detrimental impact on a student’s experience because the student will be skipped ahead to more challenging content based on invalid assessment data. Our adaptive engine is able to move the student “backwards,” but it takes a bit of time and is a frustrating experience for the student. All that said, we certainly have room to improve our communication with parents about this design feature. One last point to note is that the placement is necessarily at higher levels of abstraction, and students who do not pass the placement begin with lessons that are very concrete.

Based on student performance data as well as using feedback from other parents like you and me, our teachers made a couple of key changes to the lesson. First, we adjusted the numbers in each lesson to make the learning progression more gradual for students (if they don’t pass the placement). With my own son (before I joined DreamBox), I saw the lessons jump from place value problems involving small numbers like 24 and 37 all the way up to 398 and 457 far too quickly. After our teachers fixed that, we saw significant changes in student progress and proficiency through these lessons. Second, there were a few minor changes we made to the way we assess student answers for each question in the lessons. These small changes were very significant in terms of helping ensure students receive better feedback and adaptation while in the place value lessons. The algorithms for recommending the next problem and next lesson to a student are always built from our teachers’ assessment standards, and adjustments to these standards improved student learning and progress.

Sorry for the length of this comment. I hope this helps explain a bit more about what’s happening inside DreamBox and how it may have impacted your daughter. I’m really sorry she got frustrated and tuned out. We continually monitor how students progress through our lessons in order to improve student learning and the overall learning experience. I’m happy to answer more questions if you have them, and if your daughter ever gets back on DreamBox, I think she’d have a better experience.

Thank you for such a thoughtful reply. I am glad to hear that the activities are being modified in response to teacher input.

The hype is a problem. When I hear the DreamBox process described as having captured “what the best teacher would do when interacting with a student”, I know the hype has outpaced reality.

I am also pleased to hear that you are doing more evaluation studies, particularly in light of the cautionary statements at the end of the SRI International report regarding the limits of their study.

As a victim of the IPI movement in the 1970s, I see too many parallels between what you call adaptive learning and the individualized learning initiatives of old. I wish you could bring Stanley Erlwanger back from the grave to tell me that the DreamBox approach is fundamentally different. Certainly the technology has changed, but has the underlying philosophy changed?

I’d like to see more stress placed on how DreamBox’s assessments allow teachers to get frequent, fine-grained data on a few important aspects of student performance. This seems to be where it is having the greatest effect.

To make my point, I need look no further than your own case studies. Take these quotes from an intervention involving high-risk kindergarten students:

“We found we could use the teacher reports from DreamBox to adjust the placement of students in classroom lessons. The DreamBox program has become that important in our arsenal of assessment data.”

“DreamBox enabled teachers to work 1:1 with Tier 3 intervention students in smaller groups.”

“DreamBox enabled me to have some kids working on the computer in conjunction with groups that I was leading so I only had to pull them off as needed”

“In [comparison to the previous year] the combination of targeted intervention services and DreamBox Learning’s adaptive online program was equally powerful, with intervention students moving from the 1st percentile to the 50th percentile nationally in a single school year.” (emphasis added)

Links:

The SRI report: http://www.dreambox.com/downloads/pdf/DreamBox_Results_from_SRI_Rocketship_Evaluation.pdf

The case study:

http://www-static.dreambox.com/wp-content/uploads/2012/08/case-study-early-childhood-center-impactful-math-intervention.pdf

I love the capabilities DreamBox gives me to support my children with multiple ways of learning to reason about fundamental mathematical concepts. I highly recommend this program to my peers with children.

Oddly enough, though, one year after this post, one of my children and I have run into some issues involving the same lesson.

The GUI for packing the boxes is mechanically slow, which I feel has discouraged experimentation with packing because of the high time overhead of rendering the animations. The kids seem to take a long time to solve each problem rather than quickly gaining the experience needed to uncover the underlying patterns. It seems like the animation speed gets in the way a little here.

A second issue for us was that these two lessons seem to serve as a keystone activity for other lessons that are perhaps not obviously dependent on them. In our case, one of my daughters was somewhat bottlenecked, unable to access new lessons until this one was complete. I would have preferred that she continue to work on it but have more latitude to explore other lessons at the same time. This could have helped prevent her from becoming discouraged and given her options to grow in other areas while she gained the context, or simply spent the time needed, to truly understand the place value lessons.

I am interested to see how my younger daughter does when she reaches this point. It does make me wonder, though, whether there might be alternative ways to present and teach this concept. It’s a linchpin skill, but maybe this representation doesn’t resonate with some kids quite as well.

Also wanted to say, thanks for a great product, your team has done some amazing work here.

-Keith

As you rightly point out, our partnership with teachers is critical. As the quotes describe, we provide teachers with more ways to strategically use their class time because they know DreamBox will effectively differentiate and help all students learn. Providing teachers with real-time, fine-grained data is one of our key supporting features. Our adaptive engine collects and uses around 48,000 data points per student per hour. We turn those data into useful information and accessible reports for teachers, parents, and principals. It’s definitely a partnership and a team effort.

To your second point, our approach and philosophy are fundamentally different from IPI. In reading other articles and blogs, it seems there’s a perception that “adaptive” is necessarily “behaviorist.” While that may be accurate in other cases, it’s not true with DreamBox. Using Erlwanger’s lens, here are a few reasons why. DreamBox (1) doesn’t define mathematical learning objectives in behavioral terms, (2) has a very different pedagogy than IPI, (3) doesn’t recommend lessons linearly or narrowly, (4) adapts not only based on correct answers but also on a student’s strategies, and (5) provides instant feedback and mathematically structured manipulatives to ensure misconceptions don’t have time or opportunity to take hold.

Our lessons and adaptive engine were informed by the work of advisors John Bransford, Cathy Fosnot, Skip Fennell and others who are committed to transfer, fluency, conceptual understanding and strategic thinking. Whereas IPI laid out a sequential hierarchy of behavioral objectives for students to follow, DreamBox was designed using ideas like Fosnot’s “landscape of learning,” in which students do not come to understand various models, strategies, and big ideas in a linear fashion. My son, when stumped by the place value lessons we were discussing, was able to choose to work on DreamBox lessons involving skip counting and early multiplication. After learning in those lessons for a while, he had improved success when he returned to the place value lessons because he had developed better understanding of the multiplicative structures upon which place value is built. Building coherent learning progressions into our lessons and sequencing ensures students can make connections between important ideas in mathematics.

Thanks for clarifying your concerns and asking the important questions.