Alumni Assessment in the ABET 2000 Environment
David L. Soldan, Ph.D.
Professor and Head, Electrical and Computer Engineering
261 Rathbone Hall, Kansas State University
Manhattan, Kansas 66506-5105
soldan@ksu.edu

Abstract - In the mid-1980s the Kansas Board of Regents mandated that all schools in the state system develop a program for assessment of alumni experiences. The objective of this program is to provide feedback on many issues having to do with the experiences that alumni had while in school. The program developed at Kansas State surveys new graduates, one-year graduates, and four-year graduates on a periodic basis. Feedback from these surveys has been used to make adjustments in academic programs, advising, and enrollment processes. This paper will summarize some of the actions initiated in these areas.

The ABET Engineering Criteria 2000 indicate that acceptable evidence of success can be obtained from alumni surveys that document professional accomplishments and career development activities. No mention of alumni feedback regarding program quality issues is made in the criteria. This paper will discuss this issue and the other assessment groups mentioned in the ABET Engineering Criteria 2000. Questions about the various methods of obtaining evidence for ABET will be raised, and the appropriate use of the results of the various methods will be discussed. This paper will describe the alumni survey system in place at Kansas State University, how it has led to program improvements, and the relation of this program to the ABET Engineering Criteria 2000. Finally, the overall assessment process for ABET Engineering Criteria 2000 will be discussed with a focus on assessment techniques.
Introduction

Engineering education and engineering program accreditation have traditionally focused on the curriculum and course content as measures of program success. Recent changes in the Accreditation Board for Engineering and Technology (ABET) criteria [1] for evaluating programs have shifted the focus to assessment of student outcomes as a means of program evaluation. Criterion 3 of ABET Engineering Criteria 2000 specifies that each program must have a process for assessing student outcomes. It also specifies that each program have evidence that the results of these assessments have been used for program improvement. One of the items mentioned as an assessment tool is alumni surveys. The Kansas Board of Regents in the mid-1980s mandated that all schools in the state system develop a program for assessment of alumni experiences. The objective of this program is to provide departments with feedback on issues having to do with the experiences that alumni had while in school. Feedback from these surveys has been used to make adjustments in academic programs, advising, and enrollment processes.

Assessment Programs

A complete assessment program must assess three areas [2]: a) student inputs; b) the environment; and c) student outputs. Student inputs refer to the qualities that incoming students have. The environment characterizes the students' experiences during the educational program. The outputs refer to the outcomes that a graduate of the program is supposed to possess. A fundamental purpose of assessment should be to learn as much as possible about how to structure the educational environment so as to maximize the outputs of the program. The American Association for Higher Education has identified the elements of good assessment. The Principles of Good Practice for Assessing Student Learning [3] are:
1. The assessment of student learning begins with educational values.
2. Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time.
3. Assessment works best when the programs it seeks to improve have clear, explicitly stated purposes.
4. Assessment requires attention to outcomes but also and equally to the experiences that lead to those outcomes.
5. Assessment works best when it is ongoing, not episodic.
6. Assessment fosters wider improvement when representatives from across the educational community are involved.
7. Assessment makes a difference when it begins with issues of use and illuminates questions that people really care about.
8. Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change.
Examples of the educational outcomes questions included in the surveys are:

- Developing skills in leadership or participating on teams
- Becoming more aware of world issues and pressing social, political, and economic problems
- Thinking clearly, meeting a problem, and following it to a sensible conclusion

The responses received to the educational outcomes questions can be very valuable in identifying areas that recent graduates consider particularly strong or weak. The process of strengthening areas identified as weak would probably involve additional information gathering and the involvement not only of alumni but of other constituencies, such as companies that hire graduates.

Respondents are also asked to give their opinion on which areas of the curriculum should have more (or less) emphasis placed on them. Examples of these areas are:

- Written and oral communications skills
- Computer skills
- Problem solving skills

Additional information is gathered from seniors and alumni having to do with their job placement and career development. Specific information is requested dealing with salaries, job titles, and how the job was located. Information on job satisfaction and continuing career development is also requested.

In addition to alumni surveys, the College of Engineering at Kansas State uses other assessment techniques for program improvement. The Career and Employment Services office gathers and publishes data concerning initial student placement, including employer, job title, and beginning salary. Data are also gathered on graduates who pursue graduate school or other forms of further education. The College of Engineering has an Industrial Advisory Council that provides feedback on many issues, and some departments have Advisory Committees. The Electrical and Computer Engineering (EECE) Department gathers data from all companies that interview on campus, plus several other companies that have hired KSU graduates. Special surveys may also be done to gather more in-depth feedback on critical issues.

The surveys also ask graduating seniors about aspects of the educational environment, such as:

- Satisfaction with academic advising
- Availability of leisure time activities
- Support in finding appropriate employment
- Availability of faculty for student questions
- Attitude of faculty toward students
- Student services (residence halls, student union, student health center, computing services, etc.)

Most of these issues are not as closely related to ABET Engineering Criteria 2000 as the educational outcomes discussed above, but they can have a major impact on the overall perception of a program, and universities and engineering programs will need to deal with major difficulties in these areas. Graduating seniors are certainly one group that can provide valid feedback on these types of experiences.
Feedback Examples
There are many examples where feedback received from the alumni survey and other assessment tools has been used for program improvement. An early example of a curriculum change made on the basis of feedback from the Advisory Council and other alumni is the addition of ENGL 415, Written Communications for Engineers, to the curriculum. This was done in the early 1980s and was intended to improve the technical writing skills of engineering graduates. A special survey of alumni [6] was conducted in 1989 to gather additional feedback on the effectiveness of adding ENGL 415 to the curriculum. Approximately 600 alumni who had taken Written Communications were asked to respond to questions about its importance to their jobs and how effective the course had been in improving their technical writing skills. This feedback strengthened the commitment of the College to Written Communications and demonstrated to the University Administration the continuing need to have resources available in the English Department for this course.

Most curriculum changes can be attributed to feedback obtained from several of the assessment tools available, and the alumni surveys are in many cases part of that feedback. Recent changes in the EECE curriculum that are attributable to employers and alumni include the addition of a required class on microcontrollers for all electrical engineering and computer engineering majors, and the addition of a required software engineering course for all computer engineering majors. Similar examples exist in all engineering programs at KSU.
Assessment for ABET Engineering Criteria 2000

Criterion 3 of the ABET Engineering Criteria 2000 requires that:

- Each program must have an assessment process with documented results
- Evidence must be given that assessment results are applied to program improvement
- The assessment program must demonstrate that outcomes are being assessed

This criterion also enumerates eleven areas in which engineering programs must demonstrate outcomes assessment of their graduates. While there is mention of applying the results of outcomes assessment to improve the program (the environment), there is no direct mention of input assessment. Criterion 1 does indicate that student quality and progress must be monitored and evaluated; however, assessment of incoming students is not mentioned. An institution that wants a complete assessment program should consider incoming student assessment in some form. The issue of incoming student capabilities can also be addressed in the program objectives. A public-supported institution will probably have to deal with students with a wider range of backgrounds than will a private institution that recruits nationally.

Criterion 3 lists several assessment methods. These include student portfolios, nationally-normed subject exams, alumni surveys of professional accomplishment and career development, employer surveys, and placement data. Each of these can provide some input on educational outcomes assessment. Very little information will be gained about the educational environment, however. Principle of Good Practice 2 [3] would seem to indicate that the use of nationally-normed exams by themselves is not desirable, since they provide performance data for only one day. Used with other ongoing assessment techniques, these exams may provide some comparative data.

Recent discussions at educational meetings indicate that many people do not feel that assessing all of the items in Criterion 3 is possible. Several areas of required assessment appear to be straightforward (letters refer to Criterion 3 subparagraphs):

a) an ability to apply knowledge of mathematics, science, and engineering
c) an ability to design a system, component, or process to meet desired needs
g) an ability to communicate effectively

Student portfolios of graded work (assuming the instructor is competent) would certainly provide evidence that these outcomes were being met. Other areas, however, do not seem to have easy answers. For instance:

d) an ability to function on multi-disciplinary teams
h) the broad education necessary to understand the impact of engineering solutions in a global and societal context
i) a recognition of the need for, and an ability to engage in, life-long learning

Fewer techniques seem to be available to evaluate these. At a recent National Electrical Engineering Department Heads Association (NEEDHA) meeting there was a lengthy discussion of how to assess all of the items in Criterion 3. While the results of this discussion are not an official NEEDHA position, they do represent a first step in finding valid assessment techniques. A summary of this discussion is available on the web [7].
Conclusions
Engineering programs in the United States will be implementing assessment programs not only to satisfy accreditation requirements but also to better serve their students. Complete assessment programs must look beyond educational outcomes assessment and include assessment of the educational environment and of entering students' abilities. Many tools are available for assessment, and use of a broad range of tools will lead to more complete results. Assessment results must be used to improve program quality to satisfy accreditation requirements. As the discussion of the KSU alumni surveys indicates, tools beyond those mentioned in the ABET Engineering Criteria 2000 can provide valuable input on both educational outcomes and educational environment topics.
References
1. Engineering Criteria 2000, Engineering Accreditation Commission, Accreditation Board for Engineering and Technology, http://www.abet.ba.md.us/EAC/eac2000.html, 1995.
2. Astin, Alexander W., Assessment for Excellence, MacMillan Publishing Company, New York, NY, 1991.
3. Principles of Good Practice for Assessing Student Learning, American Association for Higher Education, Washington, D.C., 1992.
4. Banta, Trudy W., Jon P. Lund, Karen E. Black, and Frances W. Oberlander, Assessment in Practice, Jossey-Bass Publishers, San Francisco, CA, 1996.
5. Kansas State University Major Field Assessment Report, Office of Educational Advancement, Kansas State University, Manhattan, Kansas, 1995.
6. Hackett, Ann M., and Ronald G. Downey, Alumni Reactions to Written Communication for Engineers, Office of Educational Research, Kansas State University, Manhattan, Kansas, September 1989.
7. NEEDHA Special Committee on Criteria 2000 Report, http://www.needha.org/1996-97/Outcomes.shtml, March 18, 1997.