Maine Laptop Program Studies Fail to Convince Me

Someone I believe is associated with (or supportive of) the Maine laptop program pointed me to three downloadable studies, archived here under MLTI Research Reports, that evaluate the program's ongoing effects on students and teachers.

I've read the reports, and I'm still seeing the same problems I've written about before:

  • Significant, objective, quantitative results are absent from the reports (see below)
  • Subjective surveys are useful as a touchpoint, but they don't address actual performance changes, just subjective experiences; the two must be correlated. If the 90 percent of teachers who report that laptops made a significant difference in students' attendance could be checked against attendance records, and 75 percent of those reports lined up with a meaningful drop in absenteeism (5 percent? 10 percent?), that would mean something. (A rough sketch of what I mean follows this list.)
  • Obvious correlation appears to be missing. Couldn't a random sampling of student essays be taken from classes with high and low laptop use over the period surveyed, and evaluated using standard tools for language skills? If laptop-using classes showed an average improvement of half a grade level in writing and thinking, or any significant amount, that would be phenomenal. (See the second sketch below.)
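
Here's a minimal sketch of the cross-check I have in mind, with entirely made-up numbers: pair each teacher's yes/no survey answer with the measured change in absenteeism for that teacher's classroom, and test whether the subjective reports actually track the objective records. The data and variable names are hypothetical; only the method matters.

```python
# Hypothetical data: does the subjective survey answer ("laptops made a
# significant difference in attendance") line up with the objective record
# (percentage-point change in absenteeism, negative = fewer absences)?
from scipy.stats import pointbiserialr

reported_difference = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]   # survey: 1 = yes, 0 = no
absenteeism_change = [-7.2, -5.1, -9.4, 1.3, -4.8,
                      0.6, -6.0, -8.3, 2.1, -5.5]      # per-classroom change, pct points

r, p = pointbiserialr(reported_difference, absenteeism_change)
print(f"point-biserial r = {r:.2f}, p = {p:.3f}")
# A strongly negative r would mean the teachers who perceive an attendance
# difference are the ones whose classrooms actually saw absenteeism fall.
```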
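
And a second sketch, for the essay comparison, again with hypothetical inputs: score each sampled essay on a standard grade-level scale (here the Flesch-Kincaid formula, via the textstat package, standing in for whatever evaluation tool the researchers prefer) and run an ordinary two-sample t-test between the high- and low-laptop-use groups.

```python
import textstat
from scipy.stats import ttest_ind

def grade_levels(essays):
    """Score each essay on the Flesch-Kincaid grade-level scale."""
    return [textstat.flesch_kincaid_grade(text) for text in essays]

# Hypothetical samples; in practice these would be randomly drawn student
# essays from classes with high and low laptop use over the survey period.
high_use_essays = ["Essay text from a high-laptop-use class...",
                   "Another essay from a high-laptop-use class..."]
low_use_essays = ["Essay text from a low-laptop-use class...",
                  "Another essay from a low-laptop-use class..."]

high = grade_levels(high_use_essays)
low = grade_levels(low_use_essays)

t, p = ttest_ind(high, low)
print(f"mean grade level: high use {sum(high)/len(high):.1f}, "
      f"low use {sum(low)/len(low):.1f} (t = {t:.2f}, p = {p:.3f})")
# A half-grade improvement that survives the t-test would be exactly the
# kind of objective, quantitative result these reports lack.
```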

The report from March 2003 is forward-looking: it asks about anticipated outcomes. The general report from February 2004 sticks to subjective questions for teachers, principals, and students until we finally reach page 27 (the page number printed on the page, not the PDF page count):

Of the 154 schools who responded to the survey, 114 report that they were able to track data for at least one of these three areas. However, due to time constraints at the school level, as well as difficulties gathering non-computerized information, data was received by the evaluation team from only 8 schools.

This is exactly the information I'm curious about, and the sample is statistically tiny: 8 of 154 schools is about a 5 percent response rate. The report states:

…the limited amount of concrete evidence to support these claims indicates substantial further research is needed in this area.

Which is my point. The response rates from school employees noted on page 7 are totally unacceptable for producing results. The subsample of responses isn't balanced against all schools in any way that I can see, so the results charts and graphs can't reflect the true range of response and behavior.

About teacher training, another concern of mine, the report says:

One message heard consistently by educators during the 15 month Phase One evaluation is the lack of sufficient time for teachers to become more skilled technically in using the laptops, and more skilled pedagogically in integrating the laptops into their instruction.

Right.

There is a lot of discussion of technical problems, but there is no accounting that weighs the hours spent dealing with them, and the resulting disruption of education, against the many anecdotes citing improved efficiency: getting to the heart of the lesson by bypassing mechanical computation and repetitive tasks.

My conclusion from reading the three reports is that teachers, students, and administrators love the currency of information that laptops connected to the Internet provide. It's extremely clear that students are being better educated as active participants in the information economy: they have a better sense of what's happening now and around them, and teachers are much better able to introduce the most accurate, complete, and comprehensive material available.

I wish more of the reports focused on ways to measure whether having more current information and more information in general produced students who became better lifelong learners, earners, and citizens.