V 4.1 Date: 9.11.96
In the final edition of the Spring 1996 DATAWave, members of the campus community were asked to complete a survey. The Assessment Resource Office uses these surveys to assess the services it provides to the campus community.
The DATAWave was mailed to 266 recipients during the semester; of those, 80 (30%) responded. Forty-four respondents were faculty (55.0%), 12 were administrators (15.0%), 22 were professional staff (27.5%), and 2 were support staff (2.5%).
When asked how often they read the DATAWave, three respondents (3.8%) answered never; 25 (31.3%), occasionally; 10 (12.5%), half of the time; 21 (26.3%), frequently; and 21 (26.3%), always. Administrators reported that they almost always read the DATAWave; nearly half of the faculty members read it only occasionally; professional staff responses varied greatly; and support staff indicated that they read it about half of the time.
Forty-six respondents (57.5%) agreed or strongly agreed that the information in the DATAWave was useful, 26 (32.5%) were neutral, and 7 (8.8%) disagreed or strongly disagreed that the information was useful.
When asked if the information in the DATAWave is presented clearly, 51 respondents (63.8%) indicated that they agreed or strongly agreed, 14 respondents (17.5%) were neutral, 12 respondents disagreed (15.0%), and 2 strongly disagreed (2.5%).
The use of statistics in the DATAWave proved to be important to many respondents. When asked if the presentation of information in the DATAWave was too technical (relying on statistics and research methodology), 47 respondents (58.8%) stated that they strongly disagreed or disagreed, 20 respondents were neutral (25.0%), and 13 respondents (16.3%) stated they agreed or strongly agreed that the information presented in the DATAWave was too technical.
When asked about the content that comprises the DATAWave (i.e., words; pictures, graphs, and charts; or a nearly equal representation of both), 45 respondents (56.3%) indicated they preferred about an equal mix of words and pictures, graphs, or charts; 5 (6.3%) would prefer more words; 15 (18.8%) would prefer more pictures, graphs, or charts in future issues; and 15 (18.8%) did not respond to the question.
Respondents were asked to indicate, on a scale of 1 through 5 with 1 being low and 5 being high, their interest in a number of potential topics for future editions of the DATAWave. Responses coded 4 or 5 were combined and used to determine readers’ interests. These are, in rank order: students’ development and growth at ENMU (65.0%), students’ values and attitudes at ENMU (65.0%), student satisfaction inventories (55.1%), student learning as measured by the CAAP (51.3%), the results of assessment conducted by various academic departments (50.1%), theory on student learning (37.6%), general information on how to do assessment (32.5%), and theory on student development and growth (28.8%). Listed below are some of the responses elicited by each question. For a comprehensive list of answers to each question, or a full report of the survey results reported in this issue of the DATAWave, contact the Assessment Resource Office at ext. 4313.
What recommendation(s) would you make to increase student participation in assessment activities?
· If assessment comes from program priorities, students may see the assessment as “relevant.”
· Have students talk about assessments in forums with faculty, especially those that have experienced both traditional and alternative formats.
· Educate them as to what assessment is and what it involves. I don’t think most of the students on this campus really know what it entails.
· A.R.O. needs to “sell” the process to the student. The effort to inform and educate them as to the “value” of the process is very weak to nonexistent. Explain it to them.
· Students need to understand why we do assessment. Many perceive assessment as a negative.
· Many of our “working” or nontraditional students are so stressed out and overwhelmed that they have no time to be involved. Single parents have no time.
What recommendation(s) would you make to increase student awareness of assessment efforts?
· Unless faculty discuss its importance, I’m not sure how. Assessment should be a discussion faculty have with students.
· Have students talk about assessment in forums with faculty, especially those that have experienced both traditional and alternative formats. Include other students in the dialogue.
· Possibly put excerpts from articles in the student paper/Monday Memo. Maybe a coordinated effort article from you and a Chase staff member.
· Ask faculty to discuss outcomes as part of their class presentations; put it on the course syllabus.
What recommendation(s) would you make to increase faculty involvement in assessment activities?
· As programs come on-line with their own assessment, involvement will (should) follow.
· I would like to see more workshops on alternative assessment; too much emphasis is given to traditional assessment at this university, but the Art Dept. is doing wonderful things that they need to be invited to share throughout the university community.
· We just have too damn much stuff to do!
· Who cares?
· Assistance with understanding which types of assessment will give the type of information to support changes in curriculum design.
If you are a faculty member, monies are available to support travel, research, and new endeavors in assessment. What would be necessary to invite your involvement?
· Just invite me. I’d love to talk with others about my ideas on assessment and to see what the Art and Theater departments are doing. The ELED department has a lot to share and we’d love to do it.
· None. I would be interested—qualitative assessment.
· First of all, I didn’t know this! What is the process? Who is responsible for getting this information out?
· Let it count for promotion and tenure consideration across campus.
· Seven respondents indicated that a shortage of time was the problem they faced in becoming involved with assessment.
· As usual, we are at a critical point in assessment. Programs will need to show how assessment informs department planning and curriculum. Several units will need help articulating this and interpreting their data.
· I think the A.R.O. has done a very credible job in their brief existence at Eastern. The office is working at a level beyond its year of service—keep it up. The novelty will be over soon and the job just gets harder. A.R.O can handle the challenge.
· This is a joke! Number crunching will NEVER properly assess outcomes. Lies, damned lies and statistics.
Assessing the Assessment Resource Office
When asked how they rated the service of the Assessment Resource Office, 13 respondents (16.3%) answered with a rating of excellent, 23 responded with very good (28.8%), 18 stated that the services were good (22.5%), 11 gave the A.R.O. a fair rating (13.8%), one responded with a poor rating (1.3%) and 14 did not respond (17.5%).
When asked to rate the responsiveness of the Assessment Resource Office, 39 respondents (48.8%) rated the A.R.O. as excellent or very good, 16 responded with good (20.0%), 10 respondents gave the A.R.O. a fair rating (12.5%), one responded with a poor rating (1.3%), and 14 people did not respond (17.5%).
Upcoming Assessment Activities
Originally funded through a Research and Public Service Program from the State of New Mexico Legislature in 1994, the Assessment Resource Office is beginning its third year at Eastern. The purpose of the Assessment Resource Office is to support faculty, administrators, and staff in developing outcomes assessment plans, to conduct research in student learning and development, to link faculty to funding opportunities, and to analyze the more than ten years of outcomes assessment data collected at Eastern. The office is staffed by a coordinator, Dr. Alec Testa; a secretary, Deborah Bentley; and two graduate assistants, Irene Nolen and Sarah Stacy. Primarily, the role of the office and its staff is to assist all members of the campus community in determining how well their office or program contributes to the University’s Mission and to discover how best to enhance student learning and growth.
Members of the Eastern campus community should look for the following from the Assessment Resource Office in 1996-1997: