Sigma Xi Statement - Marshall D. Sundberg

Marshall D. Sundberg, Emporia State University

Since 1990 I have been actively involved in developing and using assessment strategies to improve student learning in undergraduate biology classes. Until 1997 this work took place at Louisiana State University, where it was instrumental in our development of the interdepartmental biology program that has since grown into a unified Department of Biology. For the past four years I have been at Emporia State University in Kansas, a comprehensive public university of approximately 6,000 students. Although these are very different institutions, they pose similar problems in terms of: 1) educating faculty about the connection between their teaching and student learning; 2) convincing science faculty that assessment strategies are useful tools; and 3) assisting faculty in developing student-active approaches to their teaching. In my current capacity as a department chair, I continue to find that the greatest source of faculty resistance to implementing any teaching strategy endorsed by educationalists is the belief that these strategies are just another passing fad. Ten years ago I naively thought that all it would take was some data to convince my scientific colleagues that change can be good. I now know that it can still be a hard sell, even with long-term data derived from a variety of instruments. I am very interested in this Roundtable because I still find myself to be a salesman in my department and in the University.

Content pre- and post-tests are the tool we have used most consistently throughout this period. Our instrument was designed to measure student understanding of a dozen basic concepts known to be difficult to teach because of a variety of well-entrenched (and well-documented) misconceptions. Some of these results have already been published (Sundberg and Dini, 1993; Sundberg, Dini and Li, 1994). The long-term data allow us to examine the effect of a variety of teaching strategies on promoting more sophisticated student understanding.

Throughout this period we have also employed a variety of other tools to answer specific questions. For instance, we have several semesters of data using the Watson-Glaser Critical Thinking Appraisal to assess changes in students' ability to: 1) draw inferences; 2) recognize assumptions; 3) make deductions; 4) interpret evidence; and 5) evaluate arguments. We have tracked a cohort of these students, university-wide, through four years to provide a baseline for future comparisons, and we are working with the General Education Council and the Honors Program to implement a series of multidisciplinary, multisemester interventions using alternative teaching strategies.

A major secondary focus of our early work was the effect of instructional content-richness on student attitudes as well as conceptual understanding. We found an inverse relationship between the amount and rigor of content presented and both 1) positive change in student attitude and 2) increased conceptual understanding! Students developed more sophisticated conceptual understanding, and a more positive regard for science, when fewer specifics were taught (Sundberg and Dini, 1993; Sundberg, Dini and Li, 1994). While most of our early work concentrated on lecture classes, we quickly included the laboratory as an obvious place to emphasize student-active approaches. In addition to the above-mentioned assessment tools, we have used student interviews, journals, and concept maps to track the effect of different strategies (Moncada, 1993; Sundberg and Moncada, 1994; Sundberg, 1997).