
In last week’s DATAWave, students’ performance on the CAAP math scale was reported, with special attention to how students performed when classified by the courses they had completed. In this edition of the DATAWave, students’ performance by incoming ACT score, gender, and ethnicity is reported, along with students’ remarks about the general education courses in mathematics (collected during the fall 1995 focus groups) and related survey data.
Three hundred twenty-two students had both CAAP math scores and ACT math scores available. Of those students, 119 (37%) scored below the diagonal; that is, when their scores were converted to deciles, they scored lower on the CAAP math scale than on the comparable ACT scale. One hundred twenty-three (38%) scored above the diagonal, improving their math abilities as measured by the CAAP math score relative to their ACT math score. Eighty students (25%) remained about the same. (Please see Figure 1 on page 3 for the decile distributions of these scores.)
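For readers curious about the mechanics of this comparison, the short Python sketch below illustrates one way to convert paired scores to deciles and classify each student relative to the diagonal. The paired scores and the helper functions are illustrative assumptions on my part; the actual analysis used the 322 student records and may well have relied on ACT's national norm tables, rather than the local distributions assumed here, to define deciles.

```python
# Illustrative sketch of the CAAP-vs-ACT decile comparison.
# Placeholder data; deciles here come from each sample's own
# distribution (an assumption -- norm tables may have been used).
from bisect import bisect_left

def decile_cutoffs(scores):
    """Nine cut points splitting the sorted scores into ten deciles."""
    s = sorted(scores)
    n = len(s)
    return [s[int(n * k / 10) - 1] for k in range(1, 10)]

def decile(score, cutoffs):
    """Decile rank (1-10) of a score, given the nine cut points."""
    return bisect_left(cutoffs, score) + 1

# Hypothetical paired (ACT math, CAAP math) scores, one pair per student.
pairs = [(14, 52), (17, 50), (19, 55), (21, 53), (23, 56),
         (25, 58), (27, 54), (29, 60), (31, 57), (33, 62)]

act_cuts = decile_cutoffs([a for a, _ in pairs])
caap_cuts = decile_cutoffs([c for _, c in pairs])

below = above = same = 0
for act, caap in pairs:
    a, c = decile(act, act_cuts), decile(caap, caap_cuts)
    if c < a:
        below += 1   # below the diagonal: lower CAAP decile than ACT
    elif c > a:
        above += 1   # above the diagonal: improved relative to ACT
    else:
        same += 1    # on the diagonal: about the same

n = len(pairs)
print(f"below: {below} ({below / n:.0%}), "
      f"above: {above} ({above / n:.0%}), "
      f"same: {same} ({same / n:.0%})")
```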
When CAAP math scale scores are compared by gender, men scored slightly higher than women (56.48 vs. 55.96). Possible reasons for this difference were discussed in the DATAWave during the spring of 1995, and this author continues to believe that the most likely explanation is socialization: men are encouraged to do well in courses involving mathematics, whereas women often are not.
On the CAAP math scale at Eastern New Mexico University, students whose self-reported ethnicity was White or Caucasian (n=539) scored highest (mean = 56.66, standard deviation = 3.61), and African-Americans (n=48) scored lowest (mean = 54.63, standard deviation = 3.08). The average scores for the other ethnic groups (these are the categories provided by ACT) were: American-Indian/Alaskan (n=24) 55.25, Mexican-American or Chicano (n=98) 55.18, Asian or Pacific Islander (n=10) 55.40, Puerto Rican, Cuban, or Hispanic (n=68) 55.26, Filipino (n=2) 56.00, other (n=13) 56.07, and those who preferred not to respond (n=73) 55.95. ACT does not provide norms for the various ethnic groups; therefore, comparisons of ethnic minorities at Eastern to a national cohort cannot be made.
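As a footnote on how such figures are produced, the brief sketch below shows the per-group summary computation using Python's statistics module. The score lists are placeholders, not the actual CAAP records, and the sample standard deviation (stdev) is assumed; whether the reported figures used the sample or the population formula is not stated.

```python
# Minimal sketch of the per-group summaries; placeholder scores only.
from statistics import mean, stdev  # stdev = sample standard deviation

scores_by_group = {
    "White or Caucasian": [56.0, 57.2, 56.8, 55.9],  # hypothetical values
    "African-American":   [54.1, 55.0, 54.8],        # hypothetical values
}

for group, scores in scores_by_group.items():
    print(f"{group}: n={len(scores)}, "
          f"mean={mean(scores):.2f}, sd={stdev(scores):.2f}")
```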
Commencing in the fall of 1995, graduating students were randomly selected to participate in focus groups and to complete a short survey on their general education experiences at Eastern New Mexico University. In the fall of 1995, ten such students participated. When asked to respond to the statement, “I understand and can apply basic math principles,” 4 strongly agreed, 4 agreed, 1 disagreed, and 1 strongly disagreed; that is, 80% of the students agreed or strongly agreed that they understood and could apply basic math principles. Students were also asked to respond to the statement, “The general education program at ENMU improved my basic mathematical abilities.” Three strongly agreed, 3 agreed, 1 disagreed, 2 strongly disagreed, and 1 student did not respond. Of the nine students in this small sample who responded, six (66.67%) either agreed or strongly agreed that the general education program at ENMU improved their mathematical abilities.
Before completing the survey, students were also asked a series of questions probing their opinions of the general education courses in mathematics. Some of their responses are as follows:
Two weeks ago I attempted to draw conclusions from these data, observing that the results indicated overall enhancements in student learning. That conclusion may be as much intuitive as scientific, and comments from the campus are welcomed.
Recently I attended the annual meeting of the North Central Association Commission on Institutions of Higher Education. I am pleased to report that at this meeting the assessment plan for Eastern New Mexico University was one of 33 plans included in a special resource area on outcomes assessment (ENMU-Roswell was also included). The impressions I took from the annual meeting are that outcomes assessment continues to be an important endeavor for the Commission; that clear and concise guidance is still limited, yet increasing; and that the Commission is more explicit than ever before about the need to implement outcomes assessment plans. This increased emphasis on outcomes assessment is evidenced by the distribution of a working draft in which the Commission proposes changes to two of the criteria for accreditation (the institution is accomplishing its educational and other purposes; and the institution can continue to accomplish its purposes and strengthen its educational effectiveness). In this document, eight elements of an effective assessment program are offered. In this article, I would like to report these items to you with comments, in italics, giving my perspective on Eastern’s compliance.
A strong, readily identifiable relationship exists between overall institutional mission and objectives and the specific educational objectives of individual departments or programs.
At Eastern, all departmental Academic Outcomes Assessment Plans (AOAPs) include an Expanded Statement of Institutional Purpose. Essentially, departments prepare a goal statement that flows from the University’s Mission Statement. Though the institution has recently adopted (fall 1996) a new Mission Statement, units will have the opportunity to incorporate it when they review their plans next fall.
Faculty, including on-campus and off-campus faculty, own and drive the program and use it to find ways to improve the education they provide. The institution motivates, recognizes, and rewards faculty efforts in assessment.
Faculty are responsible for the development of the plans in their departments. They are also an important part of the assessment committee, which set up the protocol for the plans and is responsible for their review. ENMU-Roswell, with a separate accreditation, is not part of the assessment effort here at the Portales campus. There is a need to determine how faculty at the Ruidoso Instructional Center should be included in our assessment efforts. Currently, through a coordinated effort of the College of Liberal Arts and Sciences, the Vice President for Academic Affairs, the Ruidoso Instructional Center, and the Assessment Resource Office, students who are graduating in the spring of 1996 will be assessed. An individual will be identified as the assessment coordinator for the Ruidoso campus for AY 96-97.
More needs to be done to motivate, recognize, and reward faculty efforts in assessment. The Assessment Resource Office would welcome suggestions. The Research and Public Service Award, supported by Research and Public Service Funds from the state legislature, may be an appropriate source for such recognition and rewards. Increased motivation, recognition, and rewards should be a goal for assessment efforts in AY 96-97.
The design and operation of outcomes assessment for academic programs at Eastern are in compliance with University governance and academic administration. The chief academic officer, Dr. George Mehaffy, is ultimately responsible for the outcomes assessment plan.
As mentioned in the previous statement, Dr. Mehaffy is responsible for the oversight of assessment efforts at the institution. The Assessment Resource Office is responsible for monitoring and reporting to him on each unit's compliance with its assessment plan.
This statement is specific to those institutions that do not have separate plans for the measurement of student learning but instead use existing measures. Here at Eastern, we have a separate process for the assessment of student learning; however, the plan does suggest that all curricular changes brought to the general education, curriculum, program review, and graduate committees ought to be accompanied by assessment results. Assessment results are not part of the faculty evaluation process.
Assessment results are to be used in support of changes in curriculum brought to the general education, curriculum, program review, and graduate committees. Beginning next fall, all academic units are to report annually on the results of their outcomes assessment endeavors, including how the results will be used. These reports are sent from the department chairs to the college deans and then to the Vice President for Academic Affairs.
It is perhaps in this area that Eastern should pay the most attention. It is unclear to this writer whether students have a clear understanding of outcomes assessment and the role it plays in their instruction. Over the summer, this office will consider a number of ways to increase students’ understanding, awareness, and appreciation of outcomes assessment, with the hope of piloting some new efforts in the fall.
Currently, a number of direct and indirect measures are used here at Eastern. For example, measures of student performance are direct measures, while alumni surveys and employer surveys are indirect measures. In the working draft previously mentioned and in other materials circulated by Commission staff, it is becoming increasingly clear that organizing assessment around measures of cognitive learning, behavioral learning, and affective learning is attractive. This office will endeavor to show the administration, the University's self-study committee, and the accreditation team (scheduled to visit in March 1997) how our assessment efforts do or do not fit this stratification.
In conclusion, the faculty, staff, and administration of Eastern can be assured of their compliance with the Commission’s intentions for outcomes assessment. In the past, other institutions in the state have been able to say the same, but now find themselves out of compliance. The expectations for outcomes assessment are constantly increasing. The institution will be well served by paying attention to how faculty are motivated and recognized for their contributions to outcomes assessment, and to how students are informed of its importance. This is not to suggest that these efforts do not occur, but rather that they could be increased, with the overall result being an improved outcomes assessment program.
