Assessment of student learning is a faculty-driven process that takes place at the department level. The purpose of this page is to provide information and resources to faculty and others interested in conducting assessment, both at the course level and the program level, and in creating an assessment plan.

Data Collection
Types of assessment data and how to use rubrics in data collection.

Learning Outcomes
How to create clear, measurable student learning outcomes.

Program Assessment
How to assess at the program level, and examples of curriculum maps.

Assessment Plans
Develop a basic assessment plan.

Closing the Loop
Take action on the data collected.

Additional Resources
Further assessment information from IUP and external institutions.

Assessment Data Collection

Assessment data can be collected both directly and indirectly. At the course level, having at least one direct and one indirect assessment is ideal.

Direct assessments offer clear and compelling evidence of what students learned. Examples of direct evidence could include:

  • Pass rates on national exams
  • Written work or presentations scored using a rubric
  • Classroom response systems (clickers)
  • Behavioral observations of students performing a task
  • Conference presentations
  • Culminating project (capstone projects, senior theses, senior exhibits)
  • Embedded assignments, course activities, tests/quizzes
  • Employer/supervisor direct evaluations of student performance
  • Licensing and professional examinations
  • Peer-reviewed journal articles submitted for publication
  • Pre-test/Post-test evaluations

Indirect assessments demonstrate that learning took place, but do not show how or in what context. Examples of indirect evidence could include:

  • Individual course grades or overall GPA
  • Awards and honors, including graduation honors
  • Student/alumni perceptions of skills and satisfaction
  • Starting salaries of graduates or placement rates
  • Course evaluations
  • Enrollment in higher degree programs
  • Student interviews
  • Analysis of syllabi and transcripts

Using Rubrics in Assessment

Using a rubric helps instructors assess student work against an established set of criteria and expected performance standards. All rubrics should state the criteria by which the objectives will be assessed, delineate a range of performance levels, and describe the performance corresponding to each level. While rubrics may not be appropriate for every course or level of learning, they can identify systematic gaps in understanding and highlight content areas to address.

Example Rubrics

The following are examples of rubrics that may be used in a range of disciplines.

General Rubric Template (University of West Florida CUTLA Center)

Reflection Assignment (Harvard Graduate School of Education)

Quantitative Reasoning (Peggy Maki)

Information Literacy (Peggy Maki)

Critical Thinking (Washington State University)

Rubric Banks

Links to externally maintained rubric banks for all subject areas.

University of Hawaii at Manoa

Association for the Assessment of Learning in Higher Education

Creating Course-Level SLOs

Assessment begins with creating clear and measurable student learning outcomes (SLOs). Course-level SLOs are distinct from course goals, course descriptions, the list of topics covered, teaching techniques, or learning activities. They describe the knowledge, skills, and attitudes that should characterize students after completing a given course. Individual courses typically have three to six SLOs.

Well-written student learning outcomes should:

  • Be clearly related to topics, assignments, and exams that were included in the course.
  • Describe student learning in ways that suggest direct measures.
  • Represent learning that is appropriate for the course level.
  • Be aligned with program goals and objectives.
  • Use simple language.

When developing course-level learning outcomes, you should ask yourself the following questions:

  • What do I want students to learn as a result of having taken this course?
  • What do I want students to be able to do after completing this course?
  • Am I able to observe and measure the outcomes as they are stated?
  • Are students able to perform or demonstrate the outcomes?

The following example provides a basic template to help write your own student learning outcomes:

"Upon completion of this course, students will be able to (knowledge, concept, or skill they should acquire) by (how they will apply the skill or knowledge)."

Creating Measurable Learning Outcomes

Avoid words that are difficult for a student to demonstrate, such as "know," "appreciate," or "understand."

The following examples illustrate the difference between clear SLOs that use measurable verbs and unclear SLOs that do not.

Examples that are unclear and unmeasurable:

  • Students will explore the literature on teaching effectiveness.
  • Students will understand the benefits of exercise.
  • Students will demonstrate knowledge of economic theory.

Examples that are clear and measurable:

  • Students will be able to write a paper based on an exploration of the literature on teaching effectiveness.
  • Students will be able to explain how exercise lowers the risk of cardiac disease.
  • Students will apply empirical evidence to evaluate the validity of economic theory.

For more examples of measurable verbs, see the revised version of Bloom's taxonomy.

Once the learning outcome has been identified and the assessment method selected, it is helpful to specify the desired performance criteria. Course grades alone are not appropriate performance criteria because they do not give insight into students' strengths and weaknesses on a specific learning outcome. Rubrics, by contrast, highlight areas of high achievement and areas that still need improvement.

Using the clear and measurable example from above:

"Students will apply empirical evidence to evaluate the validity of economic theory, achieving at least a 7 out of 10 from the rubric."

By being specific about the goals we are seeking to achieve, we can more effectively improve the quality of our results and, in turn, increase our students' level of intellectual development.

Program-Level Assessment

Assessment at the program level should begin with student learning outcomes that are more general than those at the course level. They are broad descriptions of the knowledge, skills, and abilities that should characterize all students after they complete a program of study.

Program SLOs should be designed to fulfill the program's mission, just as course-level SLOs should be designed to support the respective program-level SLOs.

Depending on the length of the program of study, each program should develop between three and six SLOs. In addition, each track or concentration should have at least one SLO that is distinct from the main program-level SLOs and from those of other tracks or concentrations.

Program-level outcomes worksheet: a useful tool to help programs get started with developing their own outcomes.

Creating Curriculum Maps

To ensure that instruction aligns with the stated program-level outcomes, each program should map its curriculum. A curriculum map is a tool that describes the structure and coherence of an academic program by demonstrating how each course contributes to the program-level learning outcomes. Every required course in the curriculum should contribute to at least one program-level SLO.

A basic curriculum map should have one column for each program-level outcome and one row for each course or experience (or vice versa).

The following is an example of a fictitious Geography program:

I = Introduced, R = Reinforced, M = Mastery, A = Assessed

  • SLO 1: Apply spatial research tools such as GIS
  • SLO 2: Describe human-environment interactions
  • SLO 3: Identify the planet's physical characteristics
  • SLO 4: Interpret geographic processes

Course     SLO 1    SLO 2    SLO 3    SLO 4
Geo 101             I        I
Geo 201    I        R
Geo 301    R        M, A              I
Geo 401                      M

In this example, the Geography program has four program-level outcomes and four courses. The curriculum map indicates where each learning outcome is addressed and to what level. SLO 1, for example, is not addressed in Geo 101, introduced in Geo 201, and reinforced in Geo 301. The map also shows that students are never asked to demonstrate mastery of that learning outcome, indicating a gap in the curriculum. Likewise, SLO 4 is introduced in Geo 301, but not addressed in the other parts of the curriculum. While students are asked to demonstrate mastery of SLO 3, the program is not assessing whether the students have achieved that level of learning, and that outcome was never reinforced in lower-level courses.

Creating a curriculum map, therefore, enables programs to identify gaps, such as having a program-level SLO that no course supports, and to identify if students receive enough practice with a skill before they are expected to demonstrate mastery. The practice also highlights where assessment data are collected or need to be collected.

Creating an Assessment Plan

Having a system or plan in place to guide assessment initiatives is a great way to establish a culture of sustained assessment in your program or department. At a minimum, an assessment plan should have three elements.

  1. Learning Goals
  2. Measures
  3. Follow-Up

Learning Goals

All degrees, certificates, and tracks or concentrations in your program should have their own learning outcomes. While the goals may overlap, each certificate, track, or concentration should have at least one learning outcome that is distinct from the others in the program of study.

Measures

At a minimum, use direct measures to assess each outcome. Ideally, your plan should incorporate both direct and indirect measures. The plan should also specify a time frame for when data collection will take place and when the results will be reported.

The most appropriate choice of direct measure will depend upon the focus of the discipline. In some fields, the results of a certification exam provide an important measure, while others may not have that option available. An effective option for an indirect measure is to conduct student interviews, focus groups, or surveys to gather information about how students rate their learning experiences. Job placement rates provide another indirect measure that may be important in some fields.

Follow-Up

Once the data are collected, the program needs to decide on a forum in which to discuss and act upon the results. Meetings should discuss the data gathered for each degree program, determine how to use the data to improve student learning, and assign responsibility for following up on the plan to improve. Keeping minutes at the meetings not only provides documentation if needed but also helps you stay on course when following up on the action items.

Once you have developed the program-level student learning outcomes, this Assessment Plan Template is a useful tool to map out how they will be incorporated into the program and acted upon.

Closing the Loop

The final step in an assessment cycle is "closing the loop," meaning that we must use the data that were collected to inform decisions or make changes. Closing the loop helps a program determine whether students are learning what we intend them to learn and identify strengths and weaknesses in the curriculum.

This process ensures that any revisions are made on the basis of qualitative or quantitative evidence as opposed to intuition or anecdotes. Assessment is not complete until the results have been used to make improvements that contribute to student success.

Apart from making improvements to courses, teaching, or programs, closing the loop also entails sharing assessment results with appropriate members of the campus community. The purpose of sharing results should never be to identify individual faculty or courses, but rather to initiate a campus-wide discussion of assessment results and initiatives, promote shared decision-making, and celebrate successes.

As a best practice, appoint a person or committee responsible for following up and taking action once data are collected, and specify how and with whom data will be shared.

Additional Resources

IUP Resources

Professional Organizations

External Assessment Resources

Bibliography of Print Resources (PDF)