AAHESGIT152: Evaluating Ed. Impact of I.T.

This is a combination of several postings from the AAHESGIT mailing list, from Steve Ehrmann of AAHE <SEhrmann@aahe.org>, Jim Greenberg of SUNY Oneonta <greenbjb@snyoneva.cc.oneonta.edu>, and Richard Hake of Indiana University <hake@ix.netcom.com>.

It starts with Greenberg's Question, followed by Ehrmann's Comments, then Hake's report on research, and more comments from Ehrmann, with Steve Gilbert's editorial comments at the end.


Original question from James Greenberg

Subject: Does Technology Really Make Students Learn Better?

In the March/April 1997 issue of Educom Review, Larry Irving, an administrator of the NTIA is quoted as saying, "Study after study is beginning to demonstrate that students who use technology learn better and learn differently from kids who don't."

In the same issue of Educom Review, Thomas Russell, Director of Instructional Telecommunications at NC State writes, "Technology is not neutral, despite the fact that study after study has concluded that using it in the classroom neither improves nor diminishes instruction for the masses."

My question is:

Which is it and where are all these studies being cited?

I have been in Instructional Technology for 20 years now and have heard a lot of talk like this, but I have not seen ANY serious research addressing the question. Can anyone point me to it so I can become educated?

Thanks.

James B. Greenberg
Assistant Director Computing Services
Fitzelle 204, SUNY Oneonta,
Oneonta New York 13820
phone (607) 436-2701
Internet: <GREENBJB@ONEONTA.EDU>
Ignorance is curable, stupidity is forever.

Does Technology Really Make Students Learn Better?

Jim Greenberg's question above reflects all those points of view. Steve Gilbert and I hear virtually all of them as we visit campuses and run TLTR and Flashlight events. The viewpoints certainly seem inconsistent. But what if, instead, Greenberg's question had been about paper: does research indicate that paper makes students learn better?

I think the two questions, about "technology" and about paper, are comparable. If they are, the comparison helps explain both quotes in Greenberg's question: paper is extremely helpful in learning, yet it doesn't "make" anyone learn anything by itself. (Is something written on the paper? If the writing is meant to teach, how good is that content? If the student is doing the writing, under what circumstances is it being done? Or is an art student using the paper for a sketch? And so on.)

For a more elaborate review of the educational research bearing on these questions, you might take a look at my article, "Asking the Right Questions: What Research Tells Us about Technology and Higher Learning," which originally appeared in Change Magazine and is now available in the TLTR Workbook. It is also available on the Web at Learner Online:

http://www.learner.org/content/ed/strat/eval/ACPBRightQuestion.html

Steve Ehrmann

P.S. I try to collect educational research on technology and higher learning that _does_ have something useful to say for faculty and students. If you see something really good, please pass it along and I'll try to share at least some such pieces here in AAHESGIT.

Stephen C. Ehrmann, Ph.D.                 202-293-6440
Director, The Flashlight Project          202-293-0073(fax)
c/o The American Association for Higher Education    
One Dupont Circle, Suite                  SEhrmann@aahe.org
Washington, DC 20036-1110                      
http://www.aahe.org/TLTR.htm
http://www.learner.org/content/ed/strat/

Interactive-Engagement in Physics

I have read Ehrmann's thoughtful article "Asking the Right Questions: What Research Tells Us about Technology and Higher Learning." I think the results below are consistent with his contention that "strategies matter most," and that, in agreement with Richard Clark and Robert Kozma, it is useful to study (a) which teaching/learning strategies are best, and (b) which technologies are best for supporting those strategies.

A survey (1, 2) of pre/post test data for 62 introductory physics courses enrolling a total of N = 6542 students in high-schools (14 courses, N = 1113), colleges (16 courses, N = 597), and universities (32 courses, N = 4832) throughout the country strongly suggests that "interactive-engagement methods" can increase mechanics-course effectiveness in promoting both conceptual understanding and problem-solving ability well beyond that achieved by "traditional" methods.

The data were sent to me in response to requests for test results using the well-known conceptual "Force Concept Inventory" (FCI) (3) and problem-solving "Mechanics Baseline" (MB) (4) exams of Hestenes et al. This mode of data solicitation tends to pre-select results biased in favor of outstanding courses that show relatively high gains on the FCI. When relatively low gains are achieved (as they often are), they are sometimes mentioned informally, but they are usually neither published nor communicated except by those who (a) wish to use the results from a "traditional" course at their institution as a baseline for their own data, or (b) possess unusual scientific objectivity and detachment. Fortunately, several in the latter category contributed data to the present survey for courses in which interactive-engagement methods were used but relatively low gains were achieved. Some suggestions for increasing course effectiveness have been gleaned from those cases. (1, 2)

As in any scientific investigation, bias in the detector can be put to good advantage if appropriate research objectives are established. We did not attempt to assess the average effectiveness of introductory mechanics courses. Instead we sought to answer a question of considerable practical interest to physics teachers: Can the classroom use of IE methods increase the effectiveness of introductory mechanics courses well beyond that attained by traditional methods?

"Interactive Engagement" (IE) methods are defined as those designed at least in part to promote conceptual understanding through interactive engagement of students in heads-on (always) and hands-on (usually) activities which yield immediate feedback through discussion with peers and/or instructors, all as judged by their literature descriptions. "Traditional" (T) courses are defined as those reported by instructors to make little or no use of IE methods, relying primarily on passive-student lectures, recipe labs, and algorithmic-problem exams. The average normalized gain <g> for a course is defined as the ratio of the actual average gain <G> to the maximum possible average gain, i.e.,

<g> = %<G> / %<G>max = (%<Sf> - %<Si>) / (100 - %<Si>), where %<Sf> and %<Si> are the final (post) and initial (pre) class averages.
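To make the arithmetic concrete, here is a minimal Python sketch of the calculation. The pre/post class averages are made-up numbers for illustration, not survey data:

    def normalized_gain(pre_pct, post_pct):
        # Average normalized gain <g> = (%<Sf> - %<Si>) / (100 - %<Si>),
        # i.e., the actual average gain divided by the maximum possible gain.
        return (post_pct - pre_pct) / (100.0 - pre_pct)

    # Hypothetical class averages (percent correct, pre and post):
    print(normalized_gain(45.0, 60.0))  # -> ~0.27 (low normalized gain)
    print(normalized_gain(45.0, 75.0))  # -> ~0.55 (high normalized gain)

Note that <g> normalizes away where the class starts: a class moving from 45% to 75% and one moving from 80% to 91% both have <g> = 0.55.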

The interactive-engagement courses were, on average, more than twice as effective as traditional courses in promoting conceptual understanding, since <<g>>IE = 2.1 <<g>>T. Here the double carets "<<X>>NP" indicate an average of averages, i.e., an average of <X> over N courses of type P, and sd denotes a standard deviation (not to be confused with random or systematic experimental error). The difference <<g>>48IE - <<g>>14T = 0.25 is 1.8 standard deviations of <<g>>48IE and 6.2 standard deviations of <<g>>14T, reminiscent of the difference seen in comparing instruction delivered to students in large groups with one-on-one instruction. (5) As discussed in ref. 1, it is extremely unlikely that systematic error played a significant role in the large difference observed in the average normalized gains of the T and IE courses.
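As a back-of-envelope check, the spreads implied by the figures just quoted can be recovered directly; the sketch below uses only numbers given in the paragraph above:

    # Implied standard deviations, from the numbers quoted above.
    diff = 0.25           # <<g>>48IE - <<g>>14T
    sd_ie = diff / 1.8    # difference = 1.8 sd's of <<g>>48IE -> ~0.14
    sd_t = diff / 6.2     # difference = 6.2 sd's of <<g>>14T  -> ~0.04
    print(round(sd_ie, 2), round(sd_t, 2))  # 0.14 0.04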

Among the survey courses of total enrollment N = 6542, the most widely used interactive-engagement methods, in terms of students using the methods, were Collaborative Peer Instruction, 4458 (all IE-course students); Microcomputer-Based Laboratories, 2704; Concept Tests, 2479; Socratic Dialogue Inducing Labs, 1705; Overview Case Study and Active Learning Problem Sets, 1101; Modeling, 885; and research-based text or no text, 660.

As for uses of technology, Collaborative Peer Instruction in large "lecture" sections has been effectively accomplished with systems such as Classtalk. Classtalk (6) provides hand-held computers for students, a master computer for the instructor, and a classroom network that allows immediate feedback from Concept Tests or a lecturer's questions. In microcomputer-based laboratories (10), students use an ultrasonic motion detector to observe the motion of any object (including their own bodies) and to display real-time graphs of position, velocity, and acceleration. There is some use of computers in Socratic Dialogue Inducing Labs (11), Overview Case Study and Active Learning Problem Sets (12), and in Modeling (13).
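To illustrate the kind of immediate feedback such classroom networks provide, the sketch below (hypothetical Python, not Classtalk's actual software) shows the aggregation step: tally the multiple-choice responses and display the distribution before and after peer discussion:

    from collections import Counter

    def tally(responses):
        # Aggregate multiple-choice answers into percentages for
        # immediate display to the instructor and the class.
        counts = Counter(responses)
        total = len(responses)
        return {choice: round(100 * n / total)
                for choice, n in sorted(counts.items())}

    # Hypothetical votes on one Concept Test question:
    before = ["A", "C", "B", "C", "C", "A", "D", "C"]
    after = ["C", "C", "C", "C", "B", "C", "C", "C"]
    print(tally(before))  # scattered answers before peer discussion
    print(tally(after))   # convergence toward one answer afterward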

The present survey suggests that interactive engagement, rather than technology per se, may be the crucial factor in improving the effectiveness of introductory mechanics classes, but that technology can be very important when it supports such an approach. The same may hold for other undergraduate science, mathematics, engineering, and technology (SME&T) courses (14-16), and the situation for non-SME&T courses may be similar.

REFERENCES

  1. R.R. Hake, "Interactive-engagement vs traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses," Am. J. Phys., in press, and on the Web at http://carini.physics.indiana.edu/SDI/
  2. R.R. Hake, "Interactive-engagement methods in introductory mechanics courses," submitted on 5/16/97 to the potential new "Journal of Physics Education Research," and on the Web at http://carini.physics.indiana.edu/SDI/
  3. D. Hestenes, M. Wells, and G. Swackhamer, "Force Concept Inventory," Phys. Teach. 30, 141-158 (1992). I. Halloun, R.R. Hake, E.P. Mosca, and D. Hestenes, Force Concept Inventory (Revised, 1995) in ref. 7a.
  4. D. Hestenes and M. Wells, "A Mechanics Baseline Test," Phys. Teach. 30, 159-166 (1992). A plot (ref. 1) of the average percentage score on the problem-solving Mechanics Baseline posttest vs the average percentage score on the conceptual Force Concept Inventory posttest for all the available data shows an extremely strong positive correlation, with coefficient r = +0.91 (a short sketch of this standard computation follows the reference list below). Thus it would appear that problem-solving capability is actually enhanced (not sacrificed, as some would believe) when concepts are emphasized.
  5. B.S. Bloom, "The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring," Educational Researcher 13(6), 4-16 (1984): "Using the standard deviation (sigma) of the control (conventional) class, it was typically found that the average student under tutoring was about two standard deviations above the average of the control class....The tutoring process demonstrates that most of the students do have the potential to reach this high level of learning. I believe an important task of research and instruction is to seek ways of accomplishing this under more practical and realistic conditions than the one-to-one tutoring, which is too costly for most societies to bear on a large scale. This is the '2 sigma' problem."
  6. Classtalk, a classroom communication system, is a product of Better Education, Inc., 4824 George Washington Highway, Suite 103; Yorktown, VA 23692, 804-898-1897, <info@bedu.com>; J.C. Webb, G.R. Webb, R. Caton, and F. Hartline, "Collaborative Learning and Insights on Students' Thinking: Computer Technology Offers New Possibilities for Large Classes," AAPT Announcer 24 (4), 64 (1994); Classtalk use at Harvard for ConcepTests and student evaluation of Classtalk is discussed in ref. 7; Classtalk usage at the Univ. of Massachusetts (Amherst) is discussed at http://www-perg.phast.umass.edu/pages/MainMenu.html

    A commercial wireless classroom communication system is described by R.A. Burnstein and L.M. Lederman, "Report on progress in using a wireless keypad response system," "Proceedings of the 1996 International Conference on Undergraduate Physics Education" (College Park, MD, in press). Pedagogical advantages and utilization of Classroom Communication Systems (CCS) are discussed by R.J. Dufresne, W.J. Gerace, W.J. Leonard, J.P. Mestre, and L. Wenk, "Classtalk: A classroom communication system for active learning," J. Computing in Higher Ed. 7(2), 3-47 (1996). CCS may allow a cost-effective Socratic approach (see, e.g., ref. 8) to instruction in large-enrollment "lecture" sections.

  7. Eric Mazur, (a) "Peer Instruction: A User's Manual" (Prentice Hall, 1997); (b) "Qualitative vs. Quantitative Thinking: Are We Teaching the Right Thing?" Optics and Photonics News 3, 38 (1992); (c) "Are science lectures a relic of the past?" Physics World 9(9), 13-14 (1996). For assessment data, course syllabus, User's Manual, information on Classtalk, and examples of Concept Tests see at http://galileo.harvard.edu/. For a case study of Mazur's methods, see ref. 9, Chap. 8, p. 114-122, "Students Teaching Students, Harvard Revisited."
  8. A. B. Arons, "A Guide To Introductory Physics Teaching" (Wiley, 1990).
  9. S. Tobias, "Revitalizing Undergraduate Science: Why Some Things Work and Most Don't" (Research Corporation, Tucson, AZ, 1992).
  10. R.F. Tinker, "Computer Based Tools: Rhyme and Reason," in "Proc. Conf. Computers in Physics Instruction," ed. by E. Redish and J. Risley (Addison-Wesley, 1989), pp. 159-168; R. K. Thornton, "Tools for scientific thinking: Learning physical concepts with real-time laboratory measurement tools," ibid. pp. 177-189; "Tools for scientific thinking - microcomputer-based laboratories for physics teaching," Phys. Educ. 22, 230-238 (1987); R.K. Thornton and D. R. Sokoloff, "Learning motion concepts using real-time microcomputer-based laboratory tools," Am. J. Phys. 58, 858-867 (1990); R.K. Thornton, "Conceptual dynamics: changing student views of force and motion," in "Thinking Physics for Teaching", ed. by C. Bernardini, C. Tarsitani, and M. Vicintini (Plenum, 1995). See also at http://www.tufts.edu/as/csmt/research.html.
  11. R.R. Hake, "Promoting student crossover to the Newtonian world," Am. J. Phys. 55, 878-884 (1987); S. Tobias and R.R. Hake, "Professors as physics students: What can they teach us?" Am. J. Phys. 56, 786-794 (1988); R.R. Hake, "Socratic Pedagogy in the Introductory Physics Lab," Phys. Teach. 30, 546-552 (1992). For a summary of recent work and an updated listing of electronically available SDI materials (e.g., manuals, teacher's guides, sample lab exams, equipment set-up lists) see http://carini.physics.indiana.edu/SDI/ or contact R.R. Hake at hake@ix.netcom.com.
  12. Alan Van Heuvelen, "Learning to think like a physicist: A review of research-based instructional strategies," Am. J. Phys. 59, 891-897 (1991); "Overview, Case Study Physics," ibid., 898-907 (1991); "Experiment Problems for Mechanics," Phys. Teach. 33, 176-180 (1995). The "ActivPhysics" CD-ROM with workbook is available from Addison Wesley Interactive, http://awi.aw.com/products.html#ActivPhysics. For a case study of Van Heuvelen's methods see ref. 9, pp. 100-103. Some materials are available commercially from Hayden-McNeil Publishing Inc., 47461 Clipper St., Plymouth, MI 48170; 313-455-7900.
  13. D. Hestenes, "Towards a modeling theory of physics instruction," Am. J. Phys. 55, 440-454 (1987); "Modeling Games in the Newtonian World," ibid. 60, 732-748 (1992); "Modeling Methodology for Physics Teachers," in "Proceedings of the 1996 International Conference on Undergraduate Physics Education" (College Park, MD, in press); M. Wells, D. Hestenes, and G. Swackhamer, "A modeling method for high school physics instruction," Am. J. Phys. 63, 606-619 (1995). See also at http://modeling.la.asu.edu/modeling.html.
  14. R.C. Hilborn, "Guest Comment: Revitalizing undergraduate physics - Who needs it?" Am. J. Phys. 65, 175-177 (1997).
  15. "Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering, and Technology," Advisory Committee to the NSF Directorate for Education and Human Services, 1996, available at http://www.ehr.nsf.gov/EHR/DUE/documents/review/96139/start.htm or as a hard copy by request to <undergrad@NSF.gov>.
  16. "Science Teaching Reconsidered," NRC Committee on Undergraduate Science Education, National Academy Press, 1997. See http://www2.nas.edu/wwwcat/Education.html.
Richard Hake <hake@ix.netcom.com>
Emeritus Professor of Physics, Indiana University
24245 Hatteras Street, Woodland Hills, CA 91367


Further response from Steve Ehrmann

As I asked in AAHESGIT #146: how would I answer if Greenberg's question had been about paper?

In fact we do lots of educational research that is implicitly about the educational value of paper. And we see a similar variety of findings -- improves, hurts, no difference -- for the same reasons: different uses, different contexts. Anyone feel compelled to cut paper budgets while we settle this?

Similarly, depending on how modern technology is used, research shows different things: better, worse, different, no difference. For example, Tom Russell (quoted by Greenberg) has compiled a huge list of the 'no significant difference' studies in distance learning

http://tenb.mta.ca/phenom/phenom.html

but there are also huge numbers of positive studies on technology and learning. There are even some negative findings.

What's really important is to look hard for relevant research or evaluation on educational practices that parallel your own. If a certain technique works elsewhere, it _could_ work for you, too. If a certain type of failure happens elsewhere, it _might_ happen for you, too. Other people's research and evaluation can expand your imagination about what to try, why, and what errors to watch out for. I think the most important function of research and evaluation is to furnish what a mathematician might call "existence proofs" and an engineer might casually label as a "performance envelope": an increased sensitivity to the boundaries of the possible.

Corollary: there's no substitute for studying your own educational practices, technology-based or otherwise, because no one else's successes or failures will tell you for sure what's happening with your own students.

For example, there's a very nice little study by Jerald Schutte about a big jump in student achievement in a course using the Web at Cal State Northridge

http://www.csun.edu/sociology/virexp.htm

Recently someone said to me, "Schutte's study shows that asynchronous learning works!" As Nero Wolfe, the fictional detective, used to say: "Pfui!" Schutte showed something very important, but it wasn't that. He showed that his particular approach to instruction _can_ work. Big difference. To discover whether something similar does work for you, you need to look for yourself at your own practices. That's the purpose of our Flashlight evaluation tool kit project: to help you create such studies of your institution's educational uses of technology. For an introduction, see "The Flashlight Project: Spotting an Elephant in the Dark".

We'll start to offer site licenses for Flashlight this summer. Watch AAHESGIT.


(6/10/97 AAHESGIT #146.  Approx. 100 Lines from
Steve Ehrmann of AAHE <SEhrmann@aahe.org> and 
Jim Greenberg of SUNY Oneonta
<greenbjb@snyoneva.cc.oneonta.edu>

I asked Steve Ehrmann to provide some additional 
intro/comments to the posting from Greenberg.  Both messages 
below raise important questions about and help clarify the 
availability of research on the educational impact of various 
uses of information technology.  

I believe that most leaders of educational institutions are 
now convinced of the need for educational uses of information 
technology to compete for students (and faculty!) and to 
support career/job/work preparation.  However, many board 
members, presidents, et al. are still faced with major 
resource allocation decisions about the broader educational 
role of technology.  These leaders could make major 
investment decisions more enthusiastically and comfortably if 
better information were available confirming the educational 
value of increasing the academic use of technology.  
Anecdotal reports from faculty and students about their 
strong preferences and personal beliefs in the educational 
effectiveness of various applications of information 
technology are rapidly accumulating, but good research data 
would still help wherever it could be applied.  Please let 
Ehrmann and Greenberg know if you have any!)


(6/24/97 AAHESGIT #152.  
Approx. 5 pages from Richard Hake of Indiana U. 
<hake@ix.netcom.com> and Steve Ehrmann of AAHE 
<sehrmann@aahe.org>.

This posting begins with Hake's extremely thoughtful and 
well-documented discussion of the evaluation of "interactive 
engagement" methods for teaching intro. physics/mechanics.  
That is followed by a copy of an earlier posting by Jim 
Greenberg of SUNY questioning what research has to say about 
the educational effectiveness of information technology.  
This posting ends with an additional response to Greenberg by 
Ehrmann. 

Hake:  ""Interactive Engagement" (IE) methods are defined as 
those designed at least in part to promote conceptual 
understanding through interactive engagement of students in 
heads-on (always) and hands-on (usually) activities which 
yield immediate feedback through discussion with peers and/or 
instructors, all as judged by their literature 
descriptions.... The interactive engagement courses were, on 
average, more than twice as effective as traditional courses 
in promoting conceptual understanding..."

Ehrmann:  "What's really important is to look hard for 
relevant research or evaluation on educational practices that 
parallel your own. If a certain technique works elsewhere, it 
_could_ work for you, too. If a certain type of failure 
happens elsewhere, it _might_ happen for you, too. Other 
people's research and evaluation can expand your imagination 
about what to try, why, and what errors to watch out for. I 
think the most important function of research and evaluation 
is to furnish what a mathematician might call "existence 
proofs" and an engineer might casually label as a 
"performance envelope": an increased sensitivity to the 
boundaries of") 
Steve Gilbert ===============================================



This letter from Prof. Richard Hake is the best response I've
gotten so far to AAHESGIT posting #146.  It appears above,
along with the original post from James Greenberg and some
further notes from me about it.

Steve Ehrmann
Director, The Flashlight Project
AAHE

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
Information below last updated:  4/27/97
TLTR Summer Institute -- July 11-16, 1997 Phoenix, Arizona

    Steven W. Gilbert, Director, Technology Projects
       American Association for Higher Education
    202/293-6440 X 54              FAX:  202/293-0073
                  GILBERT@CLARK.NET
      http://www.aahe.org [includes TLTR Web Site]

    SCHEDULE FOR 1997 TLTR WORKSHOPS AVAILABLE FROM
   AMANDA ANTICO 202 293 6440 EXT 38 ANTICO@CLARK.NET
      Order TLTR Workbook at Special AAHESGIT Reader Rate:  
        Call 202/293-6440 x 11 and give code "SGIT 4/97"   
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - 
NOTE:  Anyone can subscribe to the AAHESGIT Listserver by 
sending the EMail message (with subject line left blank):
SUBSCRIBE AAHESGIT yourfirstname yourlastname
to  LISTPROC@LIST.CREN.NET
    If you would like to post a message to the AAHESGIT Listserv,
send it to  AAHESGIT@LIST.CREN.NET
     With over 6,000 subscribers, not all messages sent to
AAHESGIT can be posted.  Those that are selected for posting
are reviewed and may be edited for clarity.  Authors are
often asked to expand or clarify their messages before
distribution to the List.  Facts, including URLs, are not
checked or confirmed by me.  Opinions expressed in AAHESGIT's
postings do not necessarily reflect those of anyone
officially affiliated with AAHE.
     I intend that each posting be protected by copyright as a
compilation.  As the copyright holder for the posting, I give
permission ONLY for duplication and transmission of each compilation
complete and intact including this paragraph.  Duplication and/or
transmission of any portion should be guided by "fair use"
principles, and explicit permission should be obtained when needed.
Permission to duplicate or transmit any portion written by a
contributor must be obtained from that person.
 - Copyright 1997 Steven W. Gilbert
