Volume IV: What works, what matters, what lasts

Emergent strategy for the assessment of student learning over time


There is a growing body of "wisdom-based-on-experience" about how to initiate, implement, and institutionalize efforts to ensure that the STEM education of all students is of the highest quality. Just as the work of pedagogical pioneers and early adopters was advanced by studying what did and did not work for their peers, the work of today's leaders influences the quality and character of student learning in STEM fields.

One of the sessions at the recent annual meeting of Chemlinks and MC² participants was set aside for the discussion of campus-based strategies for the assessment of student learning. Prior to the meeting, participants from four Chemlinks campuses agreed to come and discuss with each other, and with others interested in student assessment, how they might best demonstrate student learning in their modules. Each campus group was invited to bring to the meeting one colleague from their department who teaches a standard chemistry class that is the closest parallel to a particular module in content and student level, but that is taught by traditional methods. Two of the four groups were able to do this. A third group have a 'traditional' colleague with whom they can work (but who was unable to attend); the fourth group (who are in a small department) have no 'traditional' colleagues, and are considering alternative ways to cross-monitor their students' learning. (This is an important variation for other campus groups who share their departmental situation.)

The essence of the campus plan based on "pairs" (i.e., traditional and non-traditional) is that they jointly consider whether and where their learning objectives for students concur, and how, in those shared areas, they might jointly test, or otherwise monitor the performance levels of, both groups of students.

Other important student assessment questions are:

  1. What do students know when they enter each class, and what do they remember when they enter the next chemistry (or other) class?
  2. How may students' performance and career progression be monitored over time?

Where "convincing colleagues" is an issue, planning might also begin (as it did in the University of Wisconsin's Chemistry 110 assessment experiment) by asking colleagues what kinds of evidence would demonstrate the efficacy (or harmlessness) of the modular approach to student learning.

The workshop group discussed these issues and made a commitment to develop comparative 'pairs' plans for each module, and to develop these into an overall assessment plan for their campus group. The outlines of these two kinds of plans are attached.

The perceived value of these strategies is that:

  1. They will give immediate feedback to faculty (both module and traditional) on their students' performance compared with a matched student group. This can be replicated over time.
  2. They emphasize 'grass-roots', rather than externally imposed, evaluation strategies.
  3. They support the professional habit of collegial pedagogical review (as a parallel to review of research activities).
  4. The student assessment data thus generated can be matched with data of similar type from other campuses and contribute to the overall evaluation of student learning in modular classes and labs.
  5. They are, inherently, a vehicle for dissemination within departments.

The MC² group are currently discussing which of their campus groups might join this initial group of comparative student assessment experimenters. The two evaluators are also discussing whether, and how best, to build on the work of the initial campus pairs/groups in designing the next round of assessment workshops.


At the outset:

  • What do students REMEMBER from the last chemistry class? (relevant skills & knowledge)
  • What chemistry do they know/what skills DO THEY ENTER WITH that they will need for this class? What do they expect/want from the class?
  • What are my LEARNING OBJECTIVES for this class? (What are my PRIORITIES? How do I weight them?)
  • How can I OPERATIONALIZE my (prioritized) learning objectives into student assessments (assignments and tests)?
  • Which elements of my learning objectives and assessment plans are/could be:
    • SHARED WITH COLLEAGUES (cross-assessment; collaborative assessment)
    • MONITORED/EVALUATED BY COLLEAGUES, chemistry or other (quality check)?

As the class progresses & at the end:

  • What do I/we know about what my/my colleagues' students know/can do?
  • How do I/we know this?
  • What form do our data take?

When my/our students enter the next chemistry (or other) class:

  • What do they remember/what can they do on entry?
  • How do we learn this?


  1. The specifics of the assessment plan for each module class, with an outline of the elements that are:
    • shared with colleagues
    • monitored/evaluated by colleagues
  2. The design for your 'over-time' monitoring of student assessment outcomes:
    1. for individual module classes, as compared with traditional classes
    2. for student progression from 1st/2nd year classes to more senior classes
    3. for any other forms of time-tracking:

      • which majors modular students choose
      • how they perform (comparatively)
      • whether they persist, switch, or drop out
      • their graduation record
      • comparative student evaluation by later science/other faculty
  3. Describe the kinds of data you will be able to collect by all of these methods.
  4. Describe how the data will be:
    • collected
    • analyzed
    • reported

Who will do each of these?

In what forms will the data be offered?

Who will act as data coordinator with the Chemlinks/MC² Evaluators?