“Assessment is an ongoing process aimed at understanding and improving student learning. It involves making our expectations explicit and public; setting appropriate and high standards for learning quality; systematically gathering, analyzing, and interpreting evidence to determine how well performance matches those expectations and standards; and using the resulting information to document, explain, and improve performance.”

Angelo, T.A. (1995). Reassessing and defining assessment. AAHE Bulletin, 48(3): 149.

I’m Mary Wright, Director of Assessment and Associate Research Scientist at the Center for Research on Learning and Teaching at the University of Michigan.




I’m Amber Smith, Instructional Consultant at the Center for Research on Learning and Teaching at the University of Michigan.



Assessment is a vital part of any curriculum design or implementation project related to instructional interventions. Assessment data can be used to understand the impact of learning objects and to encourage others to use them. The resources on this website will help you to design an assessment plan, understand the types of evidence you can collect, and help you think through key ethical considerations.

1. What Do I Want to Assess About My Learning Object?

Planning for assessment should begin by thinking about the goal(s) of the Learning Object. Start by asking: What will students know or be able to do as a result of using the Learning Object?

In the following video, Christine Modey, of the University of Michigan’s Sweetland Center for Writing, describes some research questions she would like to ask about “The Revision Project,” which uses student video testimonials to offer strategies for revising written work. Its goal is to help novice writers engage more effectively with the revision process.

Once you are clear about the learning goals for your Learning Object and the assessment questions you are interested in, you can develop an assessment plan. The plan should outline your research questions, the methods you will use to gather and analyze evidence, and how the results will be disseminated.

The questions below, grouped by type, may be useful to consider as you create your assessment plan.

Process of learning with Learning Objects
  • What does the process of students learning with Learning Objects look like?
  • How does the use of Learning Objects change the learning process for students?
  • Does this process vary for subgroups of students (e.g. majors/non-majors, men and women, novices and experts)?
Impact of Learning Object on learning
  • Do students achieve my learning goals for the Learning Object?
  • How does student learning change over time, before and after the use of a Learning Object?
  • Do students learn better because of this Learning Object (i.e., can gains be attributed to it)?
Quantifying and cataloging participation
  • How many students use the Learning Object and at what points in the term?
  • Which Learning Objects do students find most valuable?

2. How Do I Answer My Questions?

Once you have identified your research questions, the next step is to decide how to answer them through data collection. Some possible approaches appear below; ideally, your plan will draw on multiple categories: direct measures of student learning, indirect measures of student satisfaction or reported learning, measures of process or participation, student background, and teaching practice.

Examples of Direct Measures
These assess student learning of your objectives through Learning Object use:

  • Presentations or projects scored with a rubric
  • Quiz or exam questions
  • Pre-/post-tests

In the next video, Ginger Schultz examines the impact of VoiceThread, an online discussion tool, on students’ learning of spectroscopy in an organic chemistry course. She describes a direct measure: a pre- and post-assessment that included problems similar to homework problems, plus a question asking students to explain how they arrived at their answers.
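A pre-/post-test direct measure like this is often summarized with raw and normalized gains. The sketch below uses hypothetical student IDs and percentage scores (not data from the study in the video); the normalized-gain formula follows Hake (1998).

```python
# Sketch: summarizing a pre-/post-test direct measure.
# Student IDs and scores are hypothetical; scores are percentages (0-100).
from statistics import mean

pre  = {"s1": 40, "s2": 55, "s3": 60, "s4": 35}
post = {"s1": 70, "s2": 75, "s3": 80, "s4": 60}

# Raw gain per student, then the class average.
gains = [post[s] - pre[s] for s in pre]
avg_gain = mean(gains)

# Normalized gain: fraction of the available improvement each student achieved.
norm_gains = [(post[s] - pre[s]) / (100 - pre[s]) for s in pre]
avg_norm_gain = mean(norm_gains)

print(f"average raw gain: {avg_gain:.1f} points")
print(f"average normalized gain: {avg_norm_gain:.2f}")
```

Normalized gain is useful because it does not penalize students who started with high pre-test scores and so had little room to improve.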

Examples of Indirect Measures
These assess students’ reported learning, learning experiences, and satisfaction with the Learning Object:

  • Focus groups, in which 8-10 students discuss and reflect on the Learning Object
    • Guidelines on planning and facilitating focus groups
  • Surveys of students

In this video, MELO co-project leader Nancy Konigsberg Kerner (Chemistry) provides an example of using a survey to understand what students found helpful about interactive online resources. Click the survey title to view the survey discussed in the video, E1SoftchalkSurvey chem.125.
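Survey results of this kind are often reported as a distribution of Likert-scale ratings plus a simple "percent favorable" figure. The sketch below uses hypothetical responses (not data from the survey above), with a question assumed to be rated from 1 (not helpful) to 5 (very helpful).

```python
# Sketch: summarizing Likert-scale survey responses about a Learning Object.
# The rating scale and responses are hypothetical.
from collections import Counter

responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]  # 1 = not helpful ... 5 = very helpful
counts = Counter(responses)

# Share of students who rated the resource helpful (4 or 5).
favorable = sum(1 for r in responses if r >= 4) / len(responses)

for rating in range(1, 6):
    print(f"{rating}: {'#' * counts[rating]}")
print(f"rated helpful (4 or 5): {favorable:.0%}")
```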

Examples of Process or Participation Measures
These capture instructors’ and students’ use of the Learning Object:

  • Classroom observations by instructor and peers
  • Instructor self-reflection
  • Students’ reported use of the Learning Object
    • This poster presents a sample project that asked students to report on their change in usage of learning resources over the term in medical school.
  • Analytics from online sites (e.g., usage data recording how often students interact with a website)

In this video, MELO co-project leader Brenda Gunderson (Statistics) describes how she analyzed “use data” to understand when — and how many — introductory statistics students were using an interactive exercise.
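Analyzing "use data" of this kind usually means tallying timestamped interaction records, for instance by week of the term. The sketch below uses a hypothetical activity log (not data from the statistics course in the video); the record format is an assumption.

```python
# Sketch: tallying "use data" from a hypothetical activity log to see
# when in the term students interacted with a Learning Object.
# Each record is (student_id, ISO date string); the format is assumed.
from collections import Counter
from datetime import date

log = [
    ("s1", "2024-09-03"), ("s2", "2024-09-04"),
    ("s1", "2024-09-10"), ("s3", "2024-10-15"),
    ("s2", "2024-10-16"), ("s3", "2024-10-16"),
]

# Count interactions per ISO calendar week, and distinct students overall.
by_week = Counter(date.fromisoformat(ts).isocalendar().week for _, ts in log)
unique_students = len({sid for sid, _ in log})

for week, n in sorted(by_week.items()):
    print(f"week {week}: {n} interactions")
print(f"distinct students: {unique_students}")
```

Real learning-management systems export richer logs (page views, durations, click paths), but the same group-and-count approach answers "when, and how many" questions.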

Student Demographics or Background

Aspects of a student’s identity and academic background may influence their use of, or experience with, Learning Objects. It is often important to understand the variation among students within a cohort to make sense of assessment data. For example, did students do better because they used the Learning Object, or because they had more educational experience when they used it? Is there a difference in how students of different genders or races use the Learning Object?

You can likely gain access to this type of data through an office at your institution. For some data you will want to go through the Office of the Registrar, but you might also look for an Office of Academic Planning or an Institutional Research (IR) office.

The University of Michigan has two major sources of data related to student performance:

  • The LSA Academic Reporting Toolkit (ART) enables LSA faculty to create course-specific data reports on a range of topics, such as enrollment and grade histories of students in a given course or across courses, and the relationship of course grades to pre-college measures such as ACT/SAT scores.
  • A second data repository, the U-M Data Warehouse, is a collection of data that supports reporting activities for University business and includes student demographic data and course data. To request access to these data, instructors should contact their school/college’s data steward or the Office of the Registrar.

Documentation of Teaching Practice

The teaching practice or learning environment may influence the effectiveness of the Learning Object. Documenting the course structure, and any deviations from the course plan, is an important part of the assessment plan. Characteristics you may want to record include:

    • course’s meeting structure (e.g., frequency and type of meeting)
    • resources for students
    • challenges, barriers, and strategies (logistical and pedagogical)
    • content delivery mode (e.g., online, in person)
    • types of assignments
    • timing and frequency of practice and/or feedback


3. What Ethical Issues Should I Consider?

Human Subjects or Institutional Review Boards (IRBs) are primarily concerned with protecting the rights and welfare of human research subjects, but they also work with researchers on the design and conduct of sound research. IRB practices and requirements vary by institution, so be sure to consult with your institution’s IRB before beginning a research project to understand its specific requirements. FERPA (the Family Educational Rights and Privacy Act) and copyright regulations may also affect dissemination of your research findings.

Here are some sample questions to ask about your Learning Object assessment plan:

    • Are students able to freely consent to participating in the research, without fear that their grades or status in the program will be affected?
    • If using a control/comparison group, are you depriving some students of a more powerful educational experience?
    • In dissemination of your results, is it possible to identify individuals?
    • If displaying examples of student products (e.g., projects or papers), have students given written permission?


This page was developed by Amber Smith and Mary Wright, Center for Research on Learning and Teaching, University of Michigan.
