Over the holiday, in preparing for the New Year, I set to paper the projects planned for the Spring 1996 semester in the Assessment Resource Office. For planning purposes this was probably a good idea. For the purposes of relaxing with family and enjoying my hometown in California, it probably was not. Regardless, what is done is done, and I'd like to use part of this DATAWave to discuss the many activities planned in outcomes assessment, all of which, I hope, will help ENMU be a more responsive institution.
Though I will discuss these in greater detail below, briefly, the ARO's activities will be to:
In 1995, ENMU and all other institutions accredited by the North Central Association of Colleges and Schools (NCA) were required to submit a Plan for Assessment of Student Academic Achievement. This document was recently reviewed by the NCA's consultant/evaluator corps and approved. As part of the Plan, each academic department was required to develop an Academic Outcomes Assessment Plan (AOAP) for implementation in academic year 1995-1996.
As part of the University's ongoing efforts to improve the services offered to its various constituencies, and to demonstrate to these constituencies that we are faithful stewards of the public's resources and trust, the Assessment Committee has determined that all nonacademic programs and services of ENMU should prepare, implement, and participate in a program of outcomes assessment. This is consistent with the charge to the Committee from President Frost:
It is known that many of the nonacademic areas are actively involved in various forms of measuring their effectiveness. Typically these efforts take the form of surveys, anecdotal information, and employee evaluation. Some areas use more strategic methods, such as Management by Objectives. Such activities, however, do not facilitate a systematic review and reporting of program effectiveness. Effective measurement of program success, and the eventual improvement it makes possible, will lead to a higher-functioning University. It is anticipated that most units can articulate what is required to do their jobs well, and can demonstrate, when asked, that they do so. OAPs consist of:
Beginning in AY 1997-1998, each non-instructional unit will submit a report to the appropriate senior administrator by November 1 of each year. These reports will include:
The Assessment Committee is in the process of developing timelines for the development of OAPs and a protocol for their review. This brief article is intended to inform the campus community and to solicit suggestions for improvement.
Outcomes assessment, for both instructional and non-instructional areas, has as its purpose demonstrating institutional effectiveness and discovering ways to improve the services offered to the University's various constituencies. Ultimately, students will be rewarded with an improved educational process, employees with a more pleasant workplace, and taxpayers with more efficient use of the resources made available to the institution. The Assessment Committee recognizes that many units of the campus already provide excellent services and monitor their attainment of goals.
The 1996 State of New Mexico Higher Education conference will be held on February 22 and 23 in Albuquerque at the La Posada de Albuquerque hotel. The conference theme is "The Route to Excellence." Dr. Testa of the Assessment Resource Office is the chair of this year's conference steering committee.
The conference is designed for faculty, staff, and administrators involved in outcomes assessment at New Mexico higher education institutions. A special emphasis of the conference will be on outcomes assessment activities that work: those that have led to documented student learning and to change.
This conference has been designed and developed by local practitioners in order to share the successes and failures experienced at New Mexico's higher education institutions. Conference participants will have the opportunity to learn how similar institutions and programs have addressed outcomes assessment, share their own experience, and discover instruments and tools available from national publishers. The conference program has workshops for the person just learning about outcomes assessment as well as the experienced practitioner.
Several members of the Eastern faculty and administration will be presenting at the conference. They are Dr. Andrea Broomfield, Making It Matter: Assessment From the Junior Faculty Member's Perspective; Dr. William Calton, Development of the Assessment Plan at Eastern New Mexico University; Dr. David Eisler and Dr. Calvin DeWitt, Assessing the Benefits of Technology in the Teaching/Learning Process; Dr. Greg Erf, The Art of Assessment; Ms. Janeice Scarbrough and Ms. Angie Eskins, Audition/Portfolio Review in the Fine Arts; Dr. Anthony Schroeder, ITV Services Assessment; Dr. Nile Stanley, Outcomes-Based Assessment from a Vygotskian Perspective; and Dr. Alec Testa, Assessment 101.
The Assessment Resource Office will be able to assist a number of faculty and administrators from Eastern in attending the conference. Support will include conference registration, per diem, and mileage for those who share a ride. Conference brochures and information regarding support are available to those interested in attending. Contact Dr. Alec Testa of the Assessment Resource Office at extension 4313.
One of the main responsibilities of the Assessment Resource Office is to increase faculty participation in the analysis of assessment data. In the DATAWave last year, analyses of assessment data from the Cooperative Institutional Research Project (CIRP) Freshman Survey, the College Outcomes Measures Program (COMP), the Collegiate Assessment of Academic Proficiency (CAAP), the Student Satisfaction Inventory (SSI), retention data, and other sources were reported. Frankly, to date, these reports have only begun to scratch the surface.
The wealth of data that is available could provide interesting research opportunities to new faculty or to faculty looking for new research endeavors. Interested parties are encouraged to talk to Dr. Testa regarding research activities. Moreover, the Assessment Resource Office can provide support to interested individuals; such support could include release time, special pay, or travel to conferences. Should funding be provided to a faculty member to analyze assessment data, one condition would be that the results be disseminated through the DATAWave, publication, or presentation.
Specific guidelines and procedures for funding of faculty efforts may be developed during the course of the semester. In the meantime, interested parties should contact Dr. Testa at extension 4313.
In past editions of the DATAWave, Collegiate Assessment of Academic Proficiency (CAAP) results have been reported. Beginning with this third volume of the DATAWave, CAAP results will be reported in a systematic, formalized fashion. Over the next six editions, results from the COMP from 1987 to 1992 will be discussed, along with the CAAP scale scores (writing, reading, mathematics, science reasoning, and critical thinking). These results will be updated every spring and reported through the DATAWave so that the campus community might remain informed of outcomes assessment results.
The analysis I plan to conduct will include comparisons of mean scores for a variety of subgroups: transfer students versus students who began at Eastern as native freshmen, CAAP performance by grade earned in general education courses (e.g., English 102 for writing scores, Biology 113 for science reasoning scores), students who pass versus students who do not, Eastern's performance versus national norms, trends in CAAP scores by test administration date, and trends in CAAP scores by the term in which courses were completed. Some regression analysis might also be conducted, in which grades earned, terms completed, or other variables are used to account for variance in CAAP scores.
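To make the planned analysis concrete, the sketch below illustrates the two kinds of computation described above: a comparison of subgroup means and a simple regression in which one variable (terms completed) is used to account for variance in CAAP scores. This is only an illustration, not the ARO's actual analysis; all scores and group sizes are invented for demonstration.

```python
# Illustrative sketch of the subgroup-mean comparison and simple regression
# described above. All data are hypothetical, invented for demonstration.

def mean(xs):
    """Arithmetic mean of a list of numbers."""
    return sum(xs) / len(xs)

def simple_ols(x, y):
    """Closed-form simple linear regression of y on x.

    Returns (slope, intercept, r_squared), where r_squared is the
    proportion of variance in y accounted for by x.
    """
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical CAAP writing scores: native freshmen vs. transfer students
native = [62, 64, 61, 65, 63]
transfer = [60, 63, 59, 62, 61]
print("mean difference:", round(mean(native) - mean(transfer), 2))  # → 2.0

# Hypothetical regression: terms completed used to predict CAAP score
terms = [2, 4, 4, 6, 8]
scores = [58, 61, 62, 64, 67]
slope, intercept, r2 = simple_ols(terms, scores)
print("slope per term:", round(slope, 2), "R^2:", round(r2, 3))  # → 1.46, 0.983
```

In practice an analysis like this would also report significance tests (e.g., a t-test on the mean difference) and use multiple predictors, but the structure is the same: group, compare, and model.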
It's my intention when reporting these results throughout the semester to offer only the briefest explanation of the data, so as to invite the campus community to analyze the results.