Sigma Xi Statement - Gerry W. Clarkson

Gerry W. Clarkson, Howard Payne University

My interest and limited experience in the use of assessment in course design began in 1993, when I attended the Conference on the Introductory Physics Course (May 20-23, Rensselaer Polytechnic Institute). Along with insight into new models for teaching, I learned of the work of Halloun and Hestenes (American Journal of Physics, v. 53, pp. 1043-1065; v. 55, pp. 440-454) concerning students' lack of conceptual understanding in introductory physics and the development and validation of the Mechanics Diagnostic Test. After reading their work, I decided to use the diagnostic test with my next semester's class, giving it as a pre-test and post-test to assess students' conceptual understanding. The results were similar to those described by Halloun and Hestenes, i.e., not very good. This assessment was instrumental as a motivation for me, and as justification to my department, to redesign the introductory physics course. The course now includes many problem-solving activities and concept discussion questions based on published models (e.g., Mazur's 'Peer Instruction' and Moore's 'Six Ideas That Shaped Physics'). I use these as student learning tools and as a way to assess student understanding of concepts on a daily, topical basis. I continue to use the diagnostic test to assess student learning and as a reality check on whether the course is accomplishing my goals.

I was able to put this experience to use in 1997, when my university became involved in a statewide initiative to improve science education for prospective and current teachers. As part of this project I redesigned a physical science course along the lines of the physics model, using problem-solving activities and concept discussion questions as learning tools and for assessment of student understanding. The redesigned course is now taken by Elementary Education majors along with a parallel course in Life Science.
Unfortunately, we had no assessment tool to quantitatively measure the overall effectiveness of the courses for student understanding. We did survey attitudes toward science using an inventory based on work by Gogolin and Swartz (Journal of Research in Science Teaching, 1992) and Weinburgh (American Educational Research Association paper presentation, 1994). Students in traditional course sections were compared with those in redesigned sections, both before and after the courses. The results were interesting, but due to small sample sizes and a lack of control over some variables, their significance is difficult to determine. I have also used some informal testing strategies in my courses to assess whether recurring problem areas result from misunderstanding of science concepts or from a lack of mathematical proficiency. Although I have not been able to do quantifiable assessment in this area, the qualitative results have been useful in suggesting a course sequence for Elementary Education majors.

Currently I am interested in improving what I already do and in finding ways to do quantitative assessment in the physical science course. I would also like to learn about assessment in other areas so that I can encourage colleagues to use assessment in course design. In attending conferences associated with science education reform, I have heard several new models proposed or promoted for various courses, but often with no consideration given to assessing the effectiveness of the model. I would like to be able to offer better encouragement toward assessment in such attempts to improve science education.