Planning to conduct an assessment? Not sure where to begin?
In addition to assessment planning support, SAIRO has compiled resources that provide guidance about how to design and administer effective assessments at every step. For more specific information about the assessment process, see our Assessment Guide.
Assessment Cycle
The assessment cycle is a continuous process that involves planning and identifying outcomes, collecting data, engaging in sense-making/analysis, implementing improvement strategies and evaluating progress.
Questions to Consider Before Conducting an Assessment
- Does this information already exist within the institution?
- What specific outcomes related to student experience and success are you trying to accomplish?
- How do you plan to analyze the data about student experience and outcomes?
- What timeline do you have for conducting the assessment?
- What methods will you use to collect the data, and why (e.g., focus groups, surveys, interviews)?
- What resources are required?
- How will you take action based on the results to improve student support services?
- How will you communicate the results to community stakeholders, including students and staff?
Step 1: Plan and Identify Outcomes
What are you trying to learn about?
An outcome is a clearly defined, measurable goal, typically focused on students or on programs and services. Outcomes matter in assessment because they help you know whether you are meeting big-picture goals and help you understand how the topic you are assessing fits into your department and into Student Affairs more broadly. It is generally recommended to assess 2-3 outcomes in the academic year. When developing your outcomes, consider available resources, such as the number of staff.
Developing Outcomes
There are two types of outcomes you should consider for your area: program-level outcomes and student learning outcomes. Both types explain how the mission of the program is put into action and/or operationalized.
Program-Level Outcomes
- What it is: Examines what a program is intended to accomplish for improvement, usually driven by needs or satisfaction data.
- Example: Decrease the percentage of students experiencing stress, depression and anxiety by x% within the next [time frame].
Student Learning Outcomes
- What it is: Examines what a student should be able to do or learn as a result of the program or service, described in measurable actions.
- Example: Students practice mindfulness, exercise, time management and relaxation after participating in [program].
All outcomes should be SMART (Strategic/Specific, Measurable, Achievable, Realistic, and Time-Bound):
- Strategic/specific: Reflects important dimensions of what your area seeks to accomplish (i.e., programmatic or capacity-building priorities).
- Measurable: Includes a standard or benchmark to be met
- Achievable: Is challenging enough that achievement would mean significant progress. It is a "stretch" for the organization.
- Realistic: Is not overly challenging and takes resources, capacity, and execution into consideration; it is possible to track progress and worth the time and energy to do so.
- Time-bound: Includes a clear timeline/deadline.
Who should be involved?
Developing outcomes should involve student affairs professionals, administrators, and students to ensure a holistic approach and consideration of diverse needs and perspectives.
How often are outcomes assessed?
Outcomes should be assessed regularly. We recommend that outcomes be reviewed annually and more comprehensively every five years.
Step 2: Collect Data
Student affairs assessment often involves quantitative data (e.g., utilization rates, numerical data points from surveys) or qualitative data (e.g., student testimonies and anecdotes). A mixed-methods approach can provide a richer understanding of student experiences: quantitative data can show trends and reveal opportunity gaps within services, while qualitative data can explain the "why" behind what is happening.
Collecting Existing Data
The Student Affairs Information and Research Office (SAIRO) administers several surveys aimed at understanding the experiences and outcomes of the UCLA student population. Some surveys, such as the UC Undergraduate Experience Survey (UCUES) and the First Destination Survey (FDS), can be connected to utilization data using UIDs. There are also additional resources available via our campus partners and the UC Information Center. Check out all of the available resources here: Data Resources at a Glance.
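If your department has permission to work with identified extracts, linking survey responses to utilization records is essentially a join on the UID column. Below is a minimal sketch in Python with pandas; the file names, column names, and the utilization measure are hypothetical placeholders, not SAIRO's actual data formats.

```python
import pandas as pd

# Hypothetical extracts: survey responses and service utilization, both keyed by UID.
survey = pd.read_csv("ucues_extract.csv")    # e.g., columns: uid, sense_of_belonging
visits = pd.read_csv("service_visits.csv")   # e.g., columns: uid, visit_date, service

# Count visits per student, then join onto the survey records by UID.
visit_counts = visits.groupby("uid").size().rename("visit_count").reset_index()
linked = survey.merge(visit_counts, on="uid", how="left").fillna({"visit_count": 0})

# Compare a survey outcome for students who did and did not use the service.
linked["used_service"] = linked["visit_count"] > 0
print(linked.groupby("used_service")["sense_of_belonging"].mean())
```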
Collecting Department/Program Level Data
Department/Program level data can be collected via focus groups, interviews, surveys, Question Wall method, space utilization, etc. to understand student experiences in accessing services. See common data collection methods and data collection tip sheets below.
Common Data Collection Methods
- Surveys
- Focus Groups
- Interviews
- Question Wall method
- Analysis of Student Records
- Systematic Observations
Interested in knowing more? See our Assessment Guide (pg. 5)
Data Collection Tip Sheets
Step 3: Analyze Data
Quantitative Data Analysis Tips
- Use descriptive statistics (such as averages, medians, etc.) to understand overall trends
- Disaggregate data: break down by demographics (e.g., gender, race/ethnicity, first-generation status, class level, etc.) to understand group differences in experiences
- Visualize the data using charts or graphs to make patterns more visible
- Look at trends over time (can be quarter-to-quarter or year-to-year)
For more information, see: Quantitative Data Collection Best Practices (coming soon)
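As a minimal sketch of how the tips above might look in practice, the following Python (pandas) example computes descriptive statistics, disaggregates by a demographic variable, and summarizes a trend over time. The file name and column names (satisfaction, first_gen, term) are hypothetical placeholders for your own export; the same steps can be done in Excel or any statistics tool.

```python
import pandas as pd

# Load a hypothetical survey export; file and column names are placeholders.
df = pd.read_csv("survey_responses.csv")

# Descriptive statistics: overall counts, mean, median (50%), and spread.
print(df["satisfaction"].describe())

# Disaggregate: compare groups (e.g., first-generation status) to surface
# differences in experiences.
print(df.groupby("first_gen")["satisfaction"].agg(["mean", "median", "count"]))

# Trend over time: average satisfaction by term (quarter or year).
trend = df.groupby("term")["satisfaction"].mean().sort_index()
print(trend)

# Visualize the trend so patterns are easier to see (requires matplotlib).
trend.plot(kind="line", title="Average satisfaction by term")
```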
Qualitative Data Analysis Tips
- Organize and clean your data in Excel (or other software)
- Create codes based on themes or patterns, starting with a few broad categories and refining as you go
- Identify recurring themes, concepts, or sentiments in the data
- Use quotes to give voice to student experiences
- Use qualitative insights to deepen understanding of quantitative trends
- Be aware of bias and acknowledge any assumptions that come up during the analysis process
For more information, see: Qualitative Data Collection Best Practices
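As a minimal sketch of how coded qualitative data might be organized and summarized, the following Python (pandas) example counts how often each code appears and pulls a few representative quotes. The file name and columns (comment, codes) are hypothetical placeholders; the same steps can be done in Excel with filters and pivot tables.

```python
import pandas as pd

# Hypothetical coded data: one row per response, codes separated by semicolons.
# Expected columns: response_id, comment, codes
df = pd.read_csv("coded_responses.csv")

# Split multi-code cells into one row per code so themes can be counted.
codes = (
    df.assign(code=df["codes"].str.split(";"))
      .explode("code")
      .assign(code=lambda d: d["code"].str.strip())
)

# Count how often each code appears to identify recurring themes.
print(codes["code"].value_counts())

# Pull a few representative quotes for the most frequent theme
# to give voice to student experiences.
top_theme = codes["code"].value_counts().idxmax()
print(codes.loc[codes["code"] == top_theme, "comment"].head(3).to_list())
```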
Equity-Minded Sensemaking
Review data to determine where the program or service must adapt to achieve desired outcomes. Consider using Equity-Minded Sensemaking during the analysis stage. When reviewing data disaggregated by various student groups, consider the following:
- What patterns do you see in the data?
- Which student groups are experiencing disadvantage?
- What additional information is needed to better understand opportunity gaps?
- What additional questions might you ask for additional understanding?
- How might this information inform goal setting?
Interpreting data in this way helps us identify gaps in our services.
Technique to Dig Deeper: Root Cause Analysis
Root cause analysis is an interrogative technique used to explore cause-and-effect relationships and reach the fundamental reasons behind disparities. It is most effective when used in a group setting with varied perspectives and knowledge:
- Create a focused problem statement and ensure group consensus
- Unpack root causes
- Have the group write a problem statement and list sub-problems (or causes) underneath
- Ask, "Why did this happen?" for each cause until you get to the root cause. Generally, there are five "why" questions before reaching the "root cause."
Step 4: Share Results
Once you have your results, it can be challenging to figure out the best way to analyze and present the information. It is good practice to share your results with others. Sharing results with stakeholders, including students, increases transparency about why data is collected and how it is used in decision-making. Here are some resources that will help you share your results in effective and illuminating ways.
Step 5: Identify and Implement Changes
A thorough, well-crafted plan is essential for successfully implementing changes. Your plan should build on outcomes identified during the assessment process and clearly define:
- Key outcomes aligned with departmental goals
- Who will lead the implementation efforts
- Action steps to close service gaps and improve student support services.
Use the Outcomes and Departmental Plan template to guide your planning. Once your plan is in place, you can begin making data-informed adjustments to your programming. Any changes should be rooted in data and insights gathered through the assessment process to ensure the program is meeting students' needs.
Step 6: Assess Impact of Change
The objective of assessment is continuous improvement, so once changes have been implemented, the assessment cycle begins again. A change is successful when you have met your outcome goal, whether that means closing opportunity gaps in access or increasing success outcomes for student groups.