Assessment Glossary
Introduction
Assessment Cycle
The cycle of actions that determines how assessment should be conducted: identify learning outcomes or other program outcomes, gather evidence, interpret evidence, and implement change based on the results. The following resource includes more details on the assessment cycle:
Culture of Evidence
Intentionally using assessment, research, and evaluation to guide planning, resource allocation, and decision-making to ensure a focus on student success. A culture of evidence empowers staff at all levels to engage in thoughtful evaluation and meaningful innovation.
Program Outcomes
Bloom's Taxonomy
A hierarchy of action terms that refer to different levels of learning - for example, "evaluating," "applying," or "remembering." The levels vary in the complexity of students' learning - e.g., evaluating an argument is more complex than remembering facts. These terms are useful for creating learning outcomes (IOWA). The following workshop slides include more information about how to use Bloom's Taxonomy in assessment:
Direct Measure
A measure that directly evaluates student learning (IOWA). Examples include an open-ended survey question that asks students to define leadership in their own words, or observing students as they engage in leadership activities and evaluating their performance with a rubric or checklist.
Indirect Measure
A measure that evaluates perceived, rather than actual, learning. For example, a survey question that asks students to rate their understanding of leadership on a scale from 1 (very low understanding) to 5 (very high understanding).
Learning Outcomes
What students should know, think, and be able to do after they participate in a program or service (IOWA). The following workshop slides provide additional background and guidance on creating learning outcomes:
Mission, Vision, and Values
Unique to an institution, the mission, vision, and values determine what is a priority and can help guide the development of outcomes. The links below provide the mission statement for UNCW's Division of Student Affairs, as well as more background on what mission, vision, and values entail:
Program Activities
Curriculum Maps
Tables that identify where an outcome is being addressed in the curriculum; these can be adapted for Student Affairs, with student activities and experiences in a program taking the place of academic courses. The workshop slides below provide additional information about curriculum maps:
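A curriculum map adapted this way is just a table relating outcomes to the activities that address them. The sketch below shows one minimal way to represent and print such a map; every outcome and activity name is made up for illustration and is not from UNCW's actual curriculum.

```python
# Hypothetical curriculum map adapted for Student Affairs: rows are learning
# outcomes, columns are program activities, and an "X" marks where each
# outcome is addressed. All names here are illustrative only.

activities = ["Orientation", "Leadership Retreat", "Peer Mentoring"]

curriculum_map = {
    "Define leadership in their own words": {"Orientation", "Leadership Retreat"},
    "Deliver persuasive presentations":     {"Leadership Retreat"},
    "Mentor a first-year student":          {"Peer Mentoring"},
}

# Render the map as a plain-text table.
header = "Outcome".ljust(40) + " | " + " | ".join(activities)
print(header)
print("-" * len(header))
for outcome, covered in curriculum_map.items():
    marks = " | ".join(
        ("X" if activity in covered else "").center(len(activity))
        for activity in activities
    )
    print(outcome.ljust(40) + " | " + marks)
```

A gap in the table (an outcome with no marks, or an activity addressing no outcomes) is exactly the kind of misalignment a curriculum map is meant to surface.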
Rubric
A set of dimensions that define and describe the important components of an outcome being assessed; e.g., a rubric for an outcome such as "Students will deliver persuasive presentations about how to lead a healthy lifestyle" may include dimensions such as "engaging the audience" or "organization and coherence of presentations." Each dimension of a rubric contains multiple levels of competence, with a score assigned to each level and a clear description of the criteria that must be met to attain the score at each level (IOWA). Defining levels of performance for a learning or experiential outcome in a rubric can guide decisions about program activities and structure. Access additional resources, including an example of a rubric, at the following links:
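The structure described above - dimensions, each with scored levels and criteria descriptions - can be sketched as a small data structure. The dimension names, level descriptions, and scoring rule below are invented for illustration, not taken from any actual UNCW rubric.

```python
# Hypothetical rubric for the outcome "Students will deliver persuasive
# presentations about how to lead a healthy lifestyle". Each dimension maps
# level scores (1-3) to a description of the criteria for that level.

rubric = {
    "engaging the audience": {
        1: "Rarely makes eye contact; audience is not involved.",
        2: "Some eye contact; occasional audience involvement.",
        3: "Sustained eye contact; actively involves the audience.",
    },
    "organization and coherence": {
        1: "No clear structure; points are hard to follow.",
        2: "Recognizable structure with some gaps in flow.",
        3: "Clear introduction, body, and conclusion; logical flow.",
    },
}

def score_presentation(ratings):
    """Total the level scores a rater assigned, one per rubric dimension."""
    for dimension, level in ratings.items():
        if level not in rubric[dimension]:
            raise ValueError(f"Level {level} is not defined for {dimension!r}")
    return sum(ratings.values())

# One rater's judgment of one student's presentation:
ratings = {"engaging the audience": 3, "organization and coherence": 2}
print(score_presentation(ratings))  # 5
```

Because each level carries an explicit criteria description, two raters applying the same rubric should assign similar scores - which is the main advantage of a rubric over an unstructured impression.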
Evidence Gathering
Institutional Review Board (IRB)
IRBs review and approve all research at the University of North Carolina Wilmington. Assessment work done solely to improve internal programming and services, and that will not be published or shared outside the institution, generally does not meet the federal definition of human subjects research and so does not require IRB approval. Assessment done for internal improvement that will also be shared with external audiences does require such approval. UNCW's IRB website can be found at the following link:
Methods
The methods used to gather information for the purposes of assessment (IOWA); examples include surveys, rubric-scored portfolios, focus groups, and interviews, among others. The following resources include more information about specific methods of measurement:
Quantitative Methods
Involves the assignment of numbers or ranks to objects, events, or observations according to some rule - for example, assigning a rank of 1 to respondents who select "strongly agree" for a survey item, or a score of 90 to students who answer 90% of items on a test correctly. Ideally, instruments with established psychometric properties are used to collect data, and statistical methods and data visualization techniques are used to analyze data and draw conclusions. (IOWA)
Qualitative Methods
Involves the detailed description of situations, events, people, interactions, and observed behaviors; the use of direct quotations from people about their experiences, attitudes, beliefs, and thoughts; and the analysis of excerpts or entire passages from documents, correspondence, records, and case histories. (Upcraft and Schuh, 2001, via IOWA)
Research
Differs from assessment in that it guides theory development, tests concepts, and has implications that extend beyond a single institution. The role of the research investigator is to describe what has been done, and in some cases explain implications for practitioners within a field such as student affairs. In contrast, assessment guides good practice within a specific context; its implications can rarely be generalized beyond a single institution; and the assessment investigator's role is not only to describe what has been done but what should be done given the findings of the study (IOWA).
Sampling
The method by which a pool of participants is selected from the population of interest in an assessment or research project (IOWA). The following workshop slides provide guidance on constructing surveys:
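One common sampling method is a simple random sample, in which every member of the population has an equal chance of selection. The sketch below illustrates that one method only (there are many others, such as stratified or convenience sampling); the population roster and sample size are made up for illustration.

```python
import random

# Hypothetical population of interest: a roster of 500 program participants.
population = [f"student_{i:03d}" for i in range(1, 501)]

# Draw a simple random sample of 50 participants without replacement.
# A fixed seed makes the draw reproducible for documentation purposes.
rng = random.Random(42)
sample = rng.sample(population, k=50)

print(len(sample))       # 50
print(len(set(sample)))  # 50 (no participant selected twice)
```

How the pool is selected matters because it determines whose experiences the evidence reflects - a sample drawn only from students who already attend events will not represent the population of interest.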
Changes for Improvement
Accountability
Involves data collection, analysis, and storytelling for external audiences (e.g., legislators, donors, parents) and demonstrates the effectiveness of programs and services to stakeholders (IOWA).
Closing the Loop
The process of analyzing the strengths and weaknesses of student affairs programs and services, and/or overall divisional effectiveness, after data have been collected, and then implementing change based on the findings. The following links provide helpful resources on closing the loop:
Stakeholders
Anyone who has a vested interest in our work and needs to be informed of the results of assessment projects that speak to that work's effectiveness; stakeholders may include students, parents, the public at large, political leaders, faculty, staff, and/or administrators (IOWA).