DASA Assessment Highlights 2019-2020
Assessment within the Division of Academic and Student Affairs seeks to promote continuous improvement of courses, programs, and services. Each year, through the annual assessment reporting process, DASA units measure their progress toward their outcomes and reflect on ways to enhance their unit. This process requires significant effort from DASA staff and faculty to gather data, analyze results, and disseminate and discuss findings. This showcase highlights the best practices DASA units demonstrated in their 2019-2020 assessment activities: utilizing existing data, triangulating multiple data points, collaborating, and taking action with data. When units engage in these best practices, staff may save time, generate more meaningful results, and develop a culture of assessment within their unit.
While DASA Assessment applauds the strong work many DASA units did in 2019-2020, this page highlights DASA Facilities, Academic Advising Programs and Services, the Academic Success Center, and Advising Technology as examples of DASA units striving to conduct assessment that is intentional, meaningful, and collaborative.
Utilizing Existing Data
DASA faculty and staff may have access to existing systems or datasets such as the Student Information System (SIS), “attendance” or card-swipe data, student products, and university surveys. These data are often passively collected, convenient, and ready for cleaning and analysis. DASA units are encouraged to utilize existing data sources whenever possible and appropriate for their assessment plan.
2019-2020 Highlight: DASA Facilities
DASA Facilities’ commitment to engaging in best practices is all the more impressive because the unit was only onboarded to the annual assessment process in 2019-2020. As they prepared to engage in assessment, DASA Facilities staff conducted an audit of their existing datasets and mapped the data to their new outcomes. The audit revealed that DASA Facilities already collects enough ongoing, relevant data in their work order system to measure most of their outcomes without additional data collection. During data analysis, staff identified opportunities to improve students’ experiences with Facilities through additional staff training and the refinement of existing processes. Integrating assessment into their existing workflows saved staff time and offered them an opportunity to analyze their datasets from a different perspective.
The assessment process does not require that you create new data or instruments. Staff in DASA Assessment can help you identify other relevant sources of data that may be available to you on campus. You could even use existing data to triangulate your results!
Triangulation
Data are a snapshot in time. When you utilize multiple data sources to “triangulate” your results, you can often develop a more complete picture of your ability to meet your outcomes. Incorporate more than one dataset in your assessment to gain even more insight into your population.
2019-2020 Highlight: Academic Advising Programs and Services
Academic Advising Programs and Services (AAPS) used multiple data collection methods to assess one of their learning outcomes: “students will be able to plan their degree progression.” Through the use of multiple methods, faculty and staff in AAPS collected direct and indirect evidence of students’ abilities to plan their degrees. Faculty applied rubrics to student work and ran queries in the Student Information System (SIS) to observe how students were demonstrating the degree-planning skills they learned in class; additionally, they analyzed survey data to better understand students’ self-reported confidence and behaviors related to planning their degrees. Through triangulation, AAPS could be confident they were meeting this outcome because they measured it three different ways! Using methods that produce both direct and indirect evidence can make your results more reliable.
Direct evidence is required by the DASA assessment process. However, we know that faculty and staff appreciate indirect evidence (often in the form of survey results). You may utilize indirect evidence in conjunction with your direct evidence to triangulate your results.
Collaboration
Assessment is a team effort! Involving multiple faculty and staff members in your assessment process can foster a culture of assessment within DASA. Collaboration is especially important when it comes to sharing and utilizing results for program improvement.
2019-2020 Highlight: Academic Success Center
The Academic Success Center (ASC), formerly the University Tutorial Center, has engaged in assessment best practices for many years. A strength of their process is the emphasis staff place on growing a culture of assessment through collaboration. As ASC cycles through their outcomes assessment, staff share responsibility for leading various assessment activities, and through this shared work ASC has developed a robust toolkit of methods and strategies for assessing outcomes. ASC has built its culture of assessment by encouraging staff to take the lead on assessment projects and by providing resources, tools, and support for their efforts.
Develop a culture of assessment in your unit by sharing assessment responsibilities. Reviewing assessment results as a team can increase buy-in when implementing changes to services, programs, or processes.
Using Data to Make Changes
The DASA assessment process requires that units report data at a level where it is possible to identify strengths and areas for improvement. The areas for improvement a unit identifies provide clear direction for making data-informed changes to teaching or service delivery.
2019-2020 Highlight: Advising Technology
Advising Technology collects robust datasets through Student Success GPS (NC State’s advising platform). This year, Advising Technology assessed how well faculty and staff engage with Student Success GPS as an advising tool. Staff found that faculty advisors across the board were less engaged with the platform than full-time professional advisors. Results also indicated that engagement levels were uneven across the academic colleges and within specific academic and co-curricular units. To address these areas for improvement, Advising Technology decided to customize workshops with faculty, intentionally partner with colleges showing lower levels of engagement, and devote additional effort to supporting campus partners with minimal usage.
Actions taken on assessment data should be focused and aligned with the areas for improvement identified in the data. Actions do not have to be large-scale to promote change; often they are small tweaks to existing teaching and service delivery.