North Carolina State University
SACS Compliance Certification
August 15, 2003

Comprehensive Standards: Institutional Mission, Governance, and Effectiveness 3.3.1 (outcomes assessment)
The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.

Compliance
North Carolina State University is in compliance with this standard.

Explanation

This document includes brief summaries of assessment activities for each major administrative area, including academic and administrative units.  The references list includes a link to a menu of more extensive area summaries with supporting documentation.  Assessment of the colleges’ academic activities is summarized in the sections on Undergraduate Education, the Graduate School, Research Administration, Extension and Engagement, and the College of Veterinary Medicine.

For more information on the university’s ongoing, integrated, and institution-wide research-based planning and evaluation processes, see Core Requirements #5.


Introduction

Because NC State University’s land-grant background fosters attention to effectiveness and impact, outcomes assessment and the use of assessment results in planning have been part of the university’s institutional effectiveness activities for many years.  In a large, complex, doctoral/research-extensive university like NC State, individual academic and administrative units define and evaluate their contributions to the university’s mission in different ways.  Notable current activities related to outcomes assessment and outcomes-based planning include:

  • Compact Planning, a cyclical process used to establish priorities for action by academic and administrative units and by the university as a whole.  Compact plans utilize assessment results to help shape various planning and decision-making processes, as described in the compliance document for Core Requirements #5.
  • Some individual academic and administrative units have maintained and expanded outcomes assessment as part of institutional effectiveness activities initiated in the early 1990s.  The summaries of assessment activities attached to this document indicate that the major effort since then has gone into developing and supporting student learning outcomes assessment: for campus-wide undergraduate program review, for the College of Engineering’s preparation for ABET program-accreditation reviews, and, most recently, for revitalizing the university’s undergraduate general education requirements and the Graduate School’s program review process.  These experiences also inform current efforts to coordinate outcomes assessment processes in Student Affairs and in Finance and Business.
  • As the assessment summaries indicate, extensive support for student-related assessment activities includes data from regularly scheduled, campus-wide surveys of second-semester sophomores, graduating seniors, and baccalaureate alumni, and from graduating students in graduate programs; special research projects such as the campus diversity survey and a study of retention rates; student demographics and performance measures such as enrollments and graduation rates; and extensive information, training, and consultation about assessment.

The collective impact of these activities gives NC State University a climate that stresses development of outcomes assessment from the ground up, tailored to the individual unit’s programs and services at the level of delivery.  This is particularly evident in Undergraduate Academic Program Review (UAPR) and is becoming progressively more evident in other areas’ assessment activities.  The university has worked hard to elicit and nurture outcomes assessment processes that are developed and "owned" by individual academic departments and programs, service departments, and administrative and managerial offices rather than mandated from above.  We believe ground-up activities produce more lasting and effective assessment. 

Our experience is that top-down mandates often interfere with and stifle genuine assessment.  For example, we were implementing top-down, centrally managed assessment processes at the time of our last SACS review.  While some units continued these assessment activities after the review, many were not sufficiently invested in them for their own purposes and soon stopped using them. 

Not long after the 1994 review we began embedding assessment in ongoing processes like UAPR, linking it to existing planning processes like our current compact plans, and developing support and training to help ground assessment in units' own regular activities.  In this approach, central administration's role is to support unit-based assessment and to invite and use units' results for planning.  We believe the current level of institutional support and the widespread, still-expanding unit-based outcomes assessment demonstrate the success of our encourage-and-nurture approach.


History: Systematic Development of Outcomes Assessment
In the late 1980s, NC State University’s assessment activities were centrally organized and managed.  The direction changed in the early 1990s as we learned that this approach was not an effective way to develop a general climate of assessment rooted in individual units’ activities.  Key events include:

  • The 1989 Fifth-Year Report for SACS, which outlined how we would implement outcomes assessment on campus, and the related assessment planning for the 1994 SACS review;
  • Attendance by key faculty members and administrators at national assessment conferences, which sparked organized discussion of assessment issues, a University Institutional Effectiveness Committee, and initial TQM/CQI discussions and planning;
  • Regular campus-wide surveys of undergraduates’ opinions and reflections (see the summaries included in the references section for more information on student surveys);
  • Formation in 1997 of the Committee on Undergraduate Academic Program Development and Review Process Improvement (CUAPDRPI), which redesigned the existing program review process, piloted an outcomes-based process, and established the Committee on Undergraduate Program Review (CUPR) to refine the revisions further and implement them campus-wide. The resulting UAPR process includes extensive training and support components and draws on the expertise and experience of those involved in already-existing assessment activities (see the summaries included in the references section for more information on these issues);
  • Full-time assessment directors hired by the Division of Undergraduate Affairs, the Office of Institutional Strategy and Analysis, and the College of Engineering to help faculty members and academic programs plan and implement outcomes assessment;
  • Development of a common language to facilitate discussion of student learning outcomes assessment across academic areas, which also serves as a guide for assessment discussions in administrative and management units (see the summaries included in the references section for more information on the effort to communicate across areas);
  • Special assessment activities in administrative and management units such as Facilities Management’s “Team Excellence” project and Human Resources’ organizational study (see the summaries included in the references section for more information on these special assessment activities);
  • Use of assessment results in the university’s compact planning process and encouragement of planning initiatives that result from units’ monitoring their own effectiveness and efficiency.

Much of the success of NC State University’s outcomes assessment activities comes from identifying programs and projects that already have mature assessment processes, connecting the people involved with each other, and having them help others whose assessment activities are less developed.  In this collaborative environment, assessment projects with successful histories serve as “best practice” examples for projects at earlier stages of development.  The connections, overlaps, and training/support activities establish a broad base for outcomes assessment at NC State University, with dozens of people and more time and resources involved than would be the case with a more centralized process.  In particular, undergraduate assessment processes have provided a model considered by other units, including Student Affairs and the Graduate School.


Process for this Review of NC State University Institutional Effectiveness Activities
First, an Institutional Effectiveness Compliance Team was formed.  Members representing the university’s major administrative areas were responsible for summarizing outcomes assessment in their areas, which include Undergraduate Affairs, the Graduate School, other Provost’s Office functions, Student Affairs, Research Administration, Extension and Engagement, Finance and Business, and (collectively) the remaining units reporting directly to the Chancellor.  Each area’s report is briefly summarized below.  Each summary includes a link to the more detailed reports of assessment in that area and links within each report lead to further details.

In preparing and discussing the area summaries, team members characterized the overall level of maturity of each area's assessment activities.  Several evaluative rubrics provided guidance, such as the levels-of-implementation matrix developed by the Higher Learning Commission of the North Central Association of Colleges and Schools, and the rubrics developed as part of NC State University's UAPR process to evaluate individual academic programs' assessment activities and the overall undergraduate assessment process.  These rubrics are available from the UAPR website in the supporting documents for the detailed report of assessment in undergraduate education.

The next section of this document briefly summarizes assessment activities in each area, with links to the more extensive area summaries and their “drilldown” details.  The references link at the end of this document also leads to the full area summaries and their details.


Overviews of Assessment Activities by Major Academic and Administrative Area
Each overview links to a more detailed summary of assessment in that area, and links within each report lead to further details.

Undergraduate Education
Outcomes assessment in undergraduate education at NC State University is extensive, well organized, and supports programs’ ownership, understanding, and acceptance of assessment activities.  Degree-granting programs, curriculum-renewal and development projects, and educational-support programs all participate.  We believe faculty and co-curricular experts know their programs best and have the professional and pedagogical expertise to obtain evidence that demonstrates their programs’ strengths and weaknesses and can be used in planning improvements.  As is appropriate in a ground-up environment, individual programs’ assessment processes are at various levels of development.  Some are mature, a few are just beginning, and most could be classified as either “developing” or “advanced.”

The overall environment for undergraduate program assessment at the university is advanced.  The Committee on Undergraduate Program Review (CUPR) provides extensive training and support activities for departments and colleges doing outcomes assessment.

The more detailed overview of assessment in undergraduate education outlines these developments.

Systematic development of student learning outcomes assessment began at NC State University in the late 1980s with centrally organized and managed assessment activities.  The direction changed in the mid-1990s as we recognized that this approach was not an effective way to develop a general climate of assessment rooted in individual faculty members’ activities.  Since then, development has focused on fostering unit-based and unit-owned assessment and incorporating it into existing processes such as UAPR and assessment of General Education Requirements.  Key events in this history are listed above and in the more detailed report on assessment in undergraduate education.

Undergraduate Academic Program Review (UAPR) is a systematic and well-organized process intended to promote meaningful and manageable assessment of academic programs and undergraduate affairs administrative programs.  Distance education programs are included in undergraduate program review.  To support assessment activities between periodic full program reviews, CUPR requests regular progress reports and provides extensive feedback to help programs develop and maintain their assessment activities.  CUPR and the Division of Undergraduate Affairs also nurture the university’s assessment culture with a variety of resources.  Workshops are often customized for a particular college or set of departments.  Web-based resources include an extensive set of answers to frequently asked questions and access to dozens of campus assessment experts and other faculty and staff with assessment experience who serve as UAPR “facilitators” available for one-on-one consultation and assistance to departments and programs.

In keeping with NC State University’s ground-up approach to assessment, departments, colleges, and programs vary in their assessment processes and in how they document what they have learned.  For example, the Department of Foreign Languages and Literatures’ process involves faculty at all levels.  The department has made the details of its process, its assessment results, and its use of those results fully available on the Web.  This work illustrates the kind of “best practices” example provided by a well-established assessment program within a department.  At the college level, the College of Engineering has a very organized, sophisticated process for assessment and for documentation of results and plans for program improvements.  Some colleges and programs share their assessment processes and results publicly via the Web.

The more detailed summary of assessment in undergraduate education includes links to these parts of the university’s extensive UAPR website.  Restricted files include all assessment plans, reports of assessment results and how results are used, and CUPR feedback on these plans and reports; the detailed report includes information about access to these files for the Compliance Certification Team and others involved in NC State University’s compliance review for SACS reaffirmation.

Outcomes assessment in special projects and curriculum development: the Campus Writing and Speaking Program actively helps undergraduate departments develop explicit writing and speaking outcomes for their disciplines and methods to assess them.  Curriculum development projects in Inquiry-Guided Learning (IGL), Service Learning, and diversity and inclusiveness include regular assessment of the programs’ effectiveness.  Faculty-driven experiments in computer-assisted instruction include comparing student learning in computer-assisted and conventional versions of the courses involved.  In addition, outcomes assessment processes in student-support services of the Division of Undergraduate Affairs (UGA) follow the general guidelines developed for UAPR.

Assessment of general education requirements: NC State University’s current undergraduate general education requirements (GERs) were formalized in 1992 and implemented campus-wide in 1994.  As in many other universities with “smorgasbord” requirements, general education assessment was initially localized in results from campus-wide student surveys and in the work of focused programs such as the Campus Writing and Speaking Program, assessment of critical thinking in a variety of Inquiry-Guided Learning projects, and program assessment processes in mathematically oriented undergraduate programs.  Our faculty wanted to assess general education in a manner that would improve the general education course offerings and the program as a whole.  The Council on Undergraduate Education, the university’s standing committee that advises on general education policy, developed a new assessment plan in 2002-03 with specific objectives for each GER category and course-specific outcomes and assessments related to these objectives in each GER course.  The new process is being piloted in spring and fall 2003 and will then be phased into general use.

Campus-wide surveys of undergraduate experience: sophomores, graduating seniors, and baccalaureate alumni are surveyed regularly about their educational experience and its contributions to their knowledge, skills, and development.  Departments may request special survey inserts with the graduating-senior and baccalaureate-alumni surveys that ask additional questions specific to the student’s college and major program.  Results from the campus-wide surveys are provided to colleges and academic departments to assist their assessment activities and campus-wide and college-level results are on a public website.  Many units use these results for program improvement as part of their own assessment processes.

Graduate School
The Graduate School at NC State University is responsible for 165 degree programs (106 master’s-level, 59 doctoral) that conferred 1,479 graduate degrees in 2001-02 and enrolled 5,450 students in Fall 2002.  The Graduate School uses a variety of assessment tools to evaluate its effectiveness on an ongoing basis:

  • Dashboard indicators of the quality and diversity of applying, admitted, and enrolled students help the Graduate School and individual programs evaluate recruitment efforts.  For example, these data indicate that we have had much success in diversifying the graduate student population.
  • Surveys of graduate program directors and of participants in fellowship, traineeship, student professional development, and diversity programs provide information used for improvements, such as streamlined processes and shorter turnaround times for processing plans of graduate work and graduate applications.
  • Exit surveys are collected from students when they submit their theses or dissertations, or when they apply for graduation checkout; survey data are made available annually and indicate student satisfaction with their departments and programs.
  • Periodic self-study and onsite review of individual graduate programs help the Graduate School and individual programs monitor and improve their programs.
  • In addition to these formal assessments, the Graduate School collects informal feedback at annual meetings and summer workshops with graduate program directors and graduate secretaries.

Information from all these assessments is used to improve both programs and services and strongly influences the Graduate School’s compact planning process.

Currently, the Graduate School conducts formal graduate program reviews on a ten-year cycle.  These reviews engage program faculty, other faculty inside and outside the university, and the graduate dean in study and evaluation of the programs' academic performance in relation to the mission of NC State University.  Each review consists of a self-study by the appropriate department and a review by external faculty.  External reviewers’ reports generally include constructive comments for improving the program, identification of possibilities for collaboration with other departments, and descriptions of activities the department does well that can serve as examples for other departments.

Evaluation of the program review process by a Graduate Program Review Task Force suggests that the current system achieves some goals of program review (evaluating the program’s purposes within NC State, its effectiveness in achieving these purposes, and its overall quality) but does not adequately challenge programs being reviewed to identify future objectives and the changes necessary to achieve those objectives.  In addition, the process does not yet fully reflect the changing climate on campus, typified by UAPR, toward more outcomes-based assessment of academic programs and continuous use of that information in program planning.  Supporting material in the area summary of assessment in the Graduate School includes the resulting recommendations for explicit training in and support for outcomes-based assessment to enhance graduate program review.  The recommendations have been reviewed by the Administrative Board of the Graduate School, the academic associate deans in the colleges, and the graduate program directors, and broad distribution and faculty comment have been invited.

The Graduate School continues to evaluate the assessment tools currently in place to further improve their effectiveness.  This includes formalizing assessment activities such as management surveys, adding questions to the graduate student exit survey about Graduate School services, and documenting information gathered through informal assessment activities.  The outcome of these efforts will be a “unit effectiveness” plan that will allow the Graduate School to more systematically assess services and programs and to improve processes for making and monitoring changes based on assessment data.

Distance education programs are included in graduate program review.  Students in the off-campus versions of on-campus programs participate in a survey conducted by the UNC Office of the President to compare on- and off-campus students’ satisfaction with their degree programs.

Office of the Provost
The chief administrators of ten service units and the deans of the ten colleges report to the provost.  This section summarizes institutional effectiveness activities in the service units and in the administrative offices of the colleges.  The service units reporting to the provost include the provost’s office itself and the offices of the vice provosts for Academic Affairs, Academic Administration, NCSU Libraries, Enrollment Management and Services, Equal Opportunity and Equity, Diversity and African-American Affairs, Distance Education and Learning Technology Applications (DELTA), Information Technology, and International Affairs.  (Assessment activities in service units reporting to the vice provost for Undergraduate Affairs are described above in the section on Undergraduate Education.)

The units are at various stages of development in their assessment-based institutional effectiveness processes.  The NCSU Libraries illustrates a well-established, mature approach to assessment, with a wide range of procedures conducted on a regular basis producing quantitative as well as qualitative information used to improve services.  A number of the units are in the early stages of institutional effectiveness work, either because they were created within the past three years (e.g. DELTA, International Affairs, Diversity and African-American Affairs) or because they were recently reorganized or given new assignments (e.g. Academic Affairs, Equal Opportunity and Equity, Academic Administration).

Taken as a whole, the units in the provost’s office demonstrate a growing commitment to assessment.  Units know what they are trying to accomplish, understand how to gather information about their performance, and are increasingly using that information to improve their work.  A range of assessment activities is used, including “counts and amounts” (e.g. Information Technology, Enrollment Management and Services), regular meetings (e.g. the provost’s office), client surveys and focus groups (e.g. the Libraries), customer surveys (e.g. Academic Administration), and external consultants (e.g. the Office of Academic Affairs’ Faculty Center for Teaching and Learning).

At the conclusion of each semester, a distance education student evaluation form is sent to each student enrolled in a distance education course.  The collected surveys are tabulated and summarized by an independent entity, and the results are used to assess and refine existing courses, programs, and services and to provide future direction for program and service development.  Distance education students also participate in the graduate survey administered by the UNC Office of the President that compares the quality of off-campus programs with similar on-campus programs.

Similarly, units vary in how formally assessment results are used.  The main provost’s office, for example, holds regularly scheduled meetings with six different groups, and feedback garnered in these meetings helps guide office activities in informal, qualitative ways.  In contrast, the NCSU Libraries uses findings directly to improve services.  For example, when students’ ratings on campus-wide surveys indicated high satisfaction with library services but less satisfaction with the training provided in how to use those services, the libraries provided more ways for students to access training and information and customized instruction that faculty members can incorporate in their courses.

Certain assessment processes are widespread in the college deans’ offices.  For example, university policy requires five-year administrative reviews of college leadership and programs.  Most colleges work with one or more professional accreditation associations, and their periodic reviews include assessment of administrative effectiveness.  The section on Research Administration, later in this document, indicates that research centers, institutes, and laboratories are reviewed on a regular basis. 

Some colleges do additional assessment.  For example, in the College of Design the dean holds “dinner with the dean” meetings once or twice each semester to give students a chance to provide direct feedback regarding the effectiveness of the college’s programs.  The College of Textiles has annual all-day faculty retreats at which key strategies and programs are reviewed and working groups prepare analyses of the college’s progress, strengths, and weaknesses and offer specific suggestions for improvement.  Like the service units, the colleges vary in the degree to which assessment findings are used explicitly to modify administrative functioning.

An example of an advanced assessment effort is found in the College of Engineering.  A weeklong workshop for new faculty has been offered in each of the past three years.  The program’s effectiveness is evaluated through immediate participant feedback, a survey of faculty participants in their second year, and review of participants’ teaching and research performance.  As a result, improvements in this faculty-development program have been made each year.

In the College of Education, feedback from the teacher education accrediting body has guided the college in constructing departmental websites about assessment, training faculty in the use of assessment tools, sponsoring faculty discussions on the use of assessment evidence for continuous program improvement, and piloting a new student teaching performance-based observation process.

College of Veterinary Medicine—Academic Programs
The college’s PhD programs are reviewed and evaluated by the Graduate School, but its professional program (Doctor of Veterinary Medicine) is evaluated internally by the college.

The college pursues a range of assessment and institutional effectiveness efforts relative to its academic programs, including identification of the college’s goals and objectives, student course evaluations, peer review of teaching, and course evaluations on a three-year cycle.

These assessments are used in regularly revising courses and in annual faculty appraisals and for promotion and salary considerations.  Feedback from students is sought at the end of year two and in the last semester of the senior year, when the dean conducts exit interviews.  This feedback, and information from the alumni survey conducted every five years, is used to plan improvements such as changes in the curriculum to address alumni suggestions that students needed more opportunity to practice surgical skills, more training in business and interpersonal skills, and more attention to their specialties while still in school.

Student Affairs
The Division of Student Affairs contributes to the mission of NC State University by providing programs and services for students and the larger community to enhance quality of life, facilitate intellectual, ethical and personal growth, and create a culture of respect for human diversity.  The division’s website links to information about its 35 service areas. All units participate in the university’s compact planning process, with assessment plans as an integral component.

Many units in the division work closely with student advisory groups to help shape the educational experience provided for NC State students.  These student groups offer direct feedback and input for enhancing programs and activities and improving services to students.  A number of units conduct regular evaluations of services through customer surveys and comment cards, some of which are offered on the Web.  Units such as the Counseling Center and Student Health Services use information from student client records to help direct program planning.  Student Affairs units that teach academic classes (Physical Education, the Music Department, ROTC units) conduct end-of-term class evaluations.  This feedback, shared with the instructors and department heads, leads to syllabus and other class improvements.

A few units within the division are fully engaged in assessment with detailed program outcomes and assessment plans.

  • As the result of an extensive self-study, the Office of Student Conduct has developed specific program outcomes and learning outcomes for faculty, staff and students, and has implemented an assessment plan in phases with specific deadlines.  Data are used to make decisions about program effectiveness such as improvements in the training provided for students, in the documentation of policies and procedures, and in the quality of the assessment process itself.
  • Educational Talent Search has specified program outcomes with target measures and corresponding deadlines; this provides the basis for the annual work plan against which progress can be measured.
  • Student Health Services also has a defined performance improvement program with specified outcomes and assessment deadlines.  Their performance improvement plan includes an annual outline of reports and assessments to be made, and a planning retreat each summer includes reviewing/evaluating the past year and setting goals and objectives for the coming year.  Results have led to improvements in the information available to students on Health Services’ website.
  • Multicultural Student Affairs has goals for each of its twelve focus areas with outcomes, strategies, measures, deadlines and the person responsible for each.  Data are used to set goals for each program. 

Overall, Student Affairs’ assessment process is “developing.”  To develop further, the division has entered into a partnership with the Office of Assessment in Undergraduate Affairs, which is providing support in updating objectives, outcomes, and assessment plans; identifying and creating measures; providing workshops; producing assessment impact reports; relating assessment to budgeting and planning; and other issues.  As part of this partnership, Student Affairs has created an assessment committee that disseminates information regarding assessment, sets timelines, reviews plans and reports, and provides constructive feedback.  Every Student Affairs unit has submitted an assessment plan to this committee and will submit periodic progress reports.

Research Administration
Research Administration’s central office, Sponsored Programs and Regulatory Compliance Services (SPARCS), is responsible for developing and implementing internal policies and procedures affecting research and sponsored activity, for helping faculty succeed in the sponsored research activities that are part of their career development, and for training and supporting college research offices that deal with research budgets, proposals, and agreements.  Another responsibility is to manage Centers, Institutes and Laboratories (CILs), which are multidisciplinary and have a strong graduate educational component. 

Research Administration is decentralized to provide more effective service to college and research faculty.  It is challenging to coordinate information and achieve economies of scale in a decentralized context, so extensive efforts to organize, support, and promote efficiencies are underway through SPARCS.  CILs are also reviewed biennially by the Office of the President of the University of North Carolina.

Three primary feedback and oversight groups are central to SPARCS’ effort: the Research Operations Council (ROC), composed of the associate deans for research and selected others; the Research Support Council (RSC), composed of college-level and key sub-college-level research administrators; and the University Research Committee (URC), composed of two senior faculty members from each college.

These committees meet at least monthly and discuss new and ongoing initiatives.  Periodic progress reports are made at monthly meetings or via the various listservers used to communicate with the committee and councils.  The committees come together twice a year to share their objectives and discuss accomplishments and plans. 

One of these semi-annual meetings is a two-and-a-half-day off-site retreat designed to establish priorities on issues of paramount concern to the research community.  Faculty, staff, and leadership are represented on all of the committees and are invited to participate in the semi-annual meetings.  Action plans are developed to deal with issues raised during the retreat and the other semi-annual joint meeting.  The Research and Graduate Studies Compact Plan includes Research Administration strategies developed at the monthly committee/council meetings, the research retreat, the semi-annual joint meetings, and related venues.

One way Research Administration evaluates its effectiveness is by establishing, monitoring, and updating these committees’ goals and the goals of other, related processes.  This often leads to process improvements such as development of the web-based Grant Application and Management Systems (GAMS) to facilitate faculty members’ access to necessary information and procedures.

Research Administration submits monthly reports to the associate deans for research, the deans, the vice chancellors, the chancellor and the president of the university system, including, when appropriate, information from national and local rankings and efforts in which we participate.  Supporting material in the Research Administration section of this institutional effectiveness report includes many of these reports.

The reports are discussed at the chancellor’s executive officers meetings, at ROC meetings, and at the senior staff meetings of the college deans.  In addition, Research Administration uses several tools to ensure efficient operation of the institution's research enterprise.  These include bi-weekly staff meetings, frequent use of e-mail list servers to announce methods to improve performance, and use of computerized tools such as the Remedy Solution Tracking system managed by NC State University's Network and Computer Services division.

As noted in the area summary, Research Administration’s assessment efforts so far have been problem-focused, and centrally organized assessment efforts are only in the early stages.  Developments include more explicit activities by the various oversight and feedback groups to evaluate research administration.  Formal incorporation of assessment efforts began at the February 2003 Research Administration Retreat.

Extension and Engagement
As part of NC State University’s land-grant mission, Extension and Engagement provides leadership for university partnerships with external communities, which facilitate research and discovery; teaching and learning; and outreach and service.  This work involves four primary thrusts: economic development, K-12 education excellence, leadership development, and sustainable community development and environmental stewardship.

Extension and engagement activities representing NC State University’s colleges and units are present in each of North Carolina’s 100 counties plus the Cherokee Indian Reservation.  The activities may involve one-on-one consultations with small business owners to devise more effective business, marketing and development plans for future growth; in-plant assessment of manufacturing processes and operations to improve efficiencies through the integration of new technologies; program planning sessions with advisory committees; development of customized, contractual training modules based on the needs of the company’s workforce to improve profitability and stimulate economic development; and other interactions that provide access to learning opportunities and to the resources of NC State for individuals, organizations, businesses and industries, and communities.

Assessment of these efforts ranges from formal outcome measures to traditional end-of-program evaluations and customer satisfaction surveys to revenue generated by repeat customers.  Supporting material in the Extension and Engagement area summary documents assessment of representative activities of each college.

  • The College of Agriculture and Life Sciences Cooperative Extension Service uses a variety of assessment methods including end-of-program evaluations, one-on-one interviews with participants, case studies, participant testimonials, participant response to on-farm demonstrations, success stories, cost-benefit analysis, and the Cooperative Extension Advisory Leadership System.
  • The College of Textiles used assessment results to refocus training material for its Spun Yarn Manufacturing Program.
  • The College of Humanities and Social Sciences’ Humanities Extension has long-term partnerships with the Department of Public Instruction (DPI) and the State Library of North Carolina.  Assessment efforts include evaluations, requests for feedback, and interim and final participant responses.
  • Other assessment activities include those used by the College of Engineering’s Industrial Extension Service (IES), the College of Physical and Mathematical Sciences’ Science House, the College of Veterinary Medicine’s Office of Continuing Education and Outreach, and the McKimmon Center for Extension and Continuing Education.
  • Examples of individual projects’ follow-ups and impacts are in the supporting material for the Extension and Engagement area summary, such as the “Success Stories” on IES’s website and the summary of feedback from Science House participants.

These programs are well developed and produce substantial evaluation information.  Other college extension programs are currently working toward greater coordination in their assessment processes:

  • The College of Education’s extension efforts consist of relatively autonomous projects and centers.  The extent of systematic assessment in these efforts ranges from recording hits on a website to rigorous, formal evaluation research producing articles published in professional journals.  Evaluation of on-site extension efforts (e.g., workshops, demonstrations) not part of externally funded projects typically consists of output data such as the number of workshops taught, the number of teachers in the workshops, and the number of students served by those teachers.  A Faculty Activity Report filed with the Dean’s Office twice a year documents the outreach activity that has occurred between faculty members and a school district or teachers.  This report, however, does not evaluate the impact or outcome of the activity being reported.
  • The College of Design does not yet have a formally organized evaluation process.  Responsibility for organizing the college’s extension activities has recently been assigned to a faculty member, who will incorporate assessment and evaluation into future activities and programs that the college delivers.

Finance and Business
The Office of Finance and Business includes six service divisions: Financial Services, Environmental Health and Public Safety, Resource Management and Information Systems, Treasurer, Human Resources, and Facilities.  All but one of the divisions routinely survey customer satisfaction levels, and two have comprehensive, organization-wide assessment efforts or plans for such efforts.  All divisions that use formal workshops and training sessions to provide information to customers solicit feedback through course evaluation forms at the end of the sessions and use the information at least informally for program planning.

The only division without any direct customer surveys or focus groups is the Treasurer’s Division, which is appropriate.  That division is responsible for oversight of endowment assets, financing activities and student bookstore operations.  These areas, especially endowments and financing, do not lend themselves to customer feedback.  Instead, this division monitors return on investment and other appropriate measures.  The bookstore does rely on a university standing committee to help with policy development and to make recommendations for improvements in operations.  The University Standing Committee on Bookstores includes students, faculty and staff members.

The Human Resources Division (HR) has the greatest involvement in assessment and related improvements.  For example, training classes and workshops are evaluated at several different levels.  Every session ends with an evaluation form designed to gather information about course content and facilitator delivery.  HR’s Pathways program has mid-year and end-of-year surveys to assess participant satisfaction, supervisory support, and participant progress toward individual and organizational goals, and ultimately an evaluation to measure the return on investment.  Participants and supervisors look at how the participant uses learned skills and determine the impact of those skills on the organization.

Another example of assessment in HR is the facilitator evaluations used to evaluate customized training efforts and organizational interventions.  Project plans and Statement of Work goals are reviewed with the customer at the beginning of the work to ensure mutual understanding of expectations and goals and at the end to ensure that the customer is satisfied and that goals have been reached.  Facilitators contact customers via telephone and seek feedback after the closing session. 

In addition, HR has begun extensive assessment as part of an organizational study to improve processes and services, implement partner relationships throughout the university, and generally become more focused on the client.  The study was paused after HR assumed additional university-wide duties and responsibilities last year, but plans are to resume the study during Fall 2003.

Another development effort that includes organized assessment processes is Facilities’ TEAM Excellence project, which seeks to improve communication, customer service, and the work environment.  In addition, Facilities uses a number of assessment methods, including customer satisfaction surveys to make improvements in service delivery, project satisfaction surveys that are used to evaluate staff, and a “Door Hanger Program” that provides opportunities for immediate feedback and communication.  The “Door Hanger Program” results in a measured response within 24 hours.  Facilities also uses a formal process to evaluate designers, contractors, and project managers.

Other examples of assessment activities can be found in the Finance and Business area summary accompanying this institutional effectiveness report and in the associated supporting material. 

Chancellor’s Area
The chief administrators of seven independent NC State University units report directly to the chancellor. These units do not have similar or related missions and they are not organized under a larger coordinating administrative structure the way other university units are (e.g., academic affairs units, finance and business units). The “chancellor’s area” units are the Department of Athletics; the Internal Audit Division; the William R. Kenan Institute for Engineering, Technology, and Science; the Office of Legal Affairs; the Park Scholarships program; University Advancement; and University Planning and Analysis.

These units’ objectives and assessment activities are as varied as their missions.  "Counts and amounts" are typically used to monitor progress toward financial-development objectives such as building the university’s endowment and funding athletics programs.  Programs such as the Park Scholarships and student-athlete development activities look at students’ participation and resulting performance. 

Units that provide service to other university units, such as Internal Audit, Legal Affairs, and Institutional Strategy and Analysis, examine client units’ use of and satisfaction with their services.  Service units such as these monitor which services clients use most often and survey their “customers” on a regular basis.

All the units use assessment information for planning.  While some have formal processes to “close the loop,” others use assessment information informally and as needed.  For example, Internal Audit and Legal Affairs use assessment information in weekly meetings to review current and pending cases, and Park Scholars program staff discuss results with their advisory board several times a year.  University Planning and Analysis’ self-assessment illustrates how processes continue to develop as a function of experience and changing needs, with focus groups, interviews, and both focused and general “customer” surveys used as appropriate. 

The feedback from these assessments has led to changes such as adding historical trends to more reports, moving routine reports to the Web, developing tools so that customers can find what they are looking for easily, and eliminating reports that are infrequently used or requested.

Overall, these units’ assessment and institutional effectiveness activities reflect the general culture of assessment that is becoming progressively stronger and more explicit at NC State.  Most units are clear about what they want to accomplish, how they evaluate their progress, and how they use that information.  Where this is not the case, units are generally moving toward assessment that is more formal.


Using the Results of Assessment

Academic departments use assessment results in program review, and both academic and administrative units use assessment results in compact planning. Perhaps the best use of assessment results, however, is as feedback to faculty and administrators responsible for planning a particular program or service.

Because assessment in academic and administrative units has matured at different rates in different units, the use of assessment results varies, too. Below are some examples from units with well-developed assessment processes.

Faculty have made a variety of curricular changes in response to results from their performance assessments and student surveys.  For example, in exit questionnaires and structured interviews conducted by faculty, graduating seniors in Foreign Languages and Literatures requested more emphasis on cultural and work-related content.  Several elective courses in the French program have been dropped, added, or consolidated as a result.

Similarly, after reviewing course syllabi and graduating seniors’ and baccalaureate alumni survey responses, faculty in Electrical and Computer Engineering increased their curriculum’s emphasis on continuity in tools for required courses, with more use of specific mathematical software in lower level courses and experience with standard laboratory equipment repeated across courses.

Students’ performance in Wood and Paper Science senior projects led faculty to develop methods that help students understand more clearly what the faculty expect in a project’s deliverable items (project proposal, notebook, progress report, final report, final presentation), including detailed course rubrics that are given to the students at the start of the semester.

As a result of recommendations from a comprehensive Graduate Program Review, Crop Science faculty analyzed trends in job opportunities for their graduates and developed ways to increase graduate enrollment, charged their Graduate Education Committee to continually assess the program and bring recommendations for changes to the faculty, and worked with the College of Agriculture and Life Sciences Career Services Office to enhance their post-graduate employment survey.

The College of Engineering’s Industrial Extension Service provides several success stories, such as its work with a North Carolina clothing manufacturer whose use of a constant supply of hot water to dye, set, and wash its product created high energy demands.  Specialist engineers from NC State University helped the company identify and make improvements that reduced fuel costs by more than 42%, made dye-process workers more efficient by giving them an uninterrupted supply of sufficiently hot water, and increased the quality of the company’s products.

Administrative offices have also used assessment results to improve services.  Surveys of the Office of Legal Affairs’ clients led to more effective client education and user-friendly website information.

In response to many types of customer feedback, including surveys and focus groups, Institutional Strategy and Analysis moved all of its routine reports to the Web and developed web-based strategies to allow customers to find information more easily through improved navigation and query tools.

Details for these examples, as well as additional examples, are in the supporting documentation that accompanies the individual area summaries.  The References section below has a link to a menu of these summaries and documentation.


What We Have Learned as a Result of this Compliance Review
The overviews in the previous section of this document, and the individual area reports and supporting documentation, indicate broad and continuously growing outcomes assessment activity at NC State University.  Each academic and administrative area’s units engage in outcomes assessment and use assessment results in planning, which has resulted in improvements ranging from changes in individual courses to changes in academic and administrative programs.  These activities vary across administrative areas and across units within each area, as should be expected when outcomes assessment processes are developed from the ground up.

The most intense and organized developments during the past decade have been in assessing undergraduate student learning outcomes.  As the introduction mentions, the success of NC State University’s undergraduate outcomes assessment activities comes from identifying programs and projects that already have effective assessment processes, connecting people with each other, and letting those people teach their techniques to others.  This has established a broad base for outcomes assessment, with more people, time, and resources involved than would be the case with a more centralized process.  The connections extend beyond undergraduate education, most clearly so far in developments in the graduate school and in student affairs, but also to other administrative areas.


References


