Cross-Institutional Assessment: Development and Implementation of the On-line Student Survey System

RAYMOND HOARE,1 MARY BESTERFIELD-SACRE,2 DAN ERTMAN,1 JESSICA GERCHAK,2 TROY JOHNSON,1 ROBERT SHIELDS,1 AND LARRY SHUMAN2
1Department of Electrical Engineering, School of Engineering, University of Pittsburgh, Pittsburgh, PA 15261
2Department of Industrial Engineering, School of Engineering, University of Pittsburgh, Pittsburgh, PA 15261
Received 23 October 2001; accepted 1 May 2002
ABSTRACT: As ABET has increased the need for routine student assessments, engineering faculty are faced with the problem of doing this in an efficient manner that minimizes the time required to conduct, tabulate, and analyze the requisite surveys. To meet this need, researchers at the University of Pittsburgh have developed the On-line Student Survey System (OS3) to facilitate EC 2000 assessment and cross-institutional benchmarking. OS3 allows multiple engineering schools to conduct customized, routine program evaluations using Web-based surveys specifically designed to meet EC 2000 objectives. Since its inception, seven engineering schools have adopted OS3. This article provides an overview of the system, a description of its survey instruments, and an evaluation of the system. © 2002 Wiley Periodicals, Inc. Comput Appl Eng Educ 10: 88-97, 2002; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.10013
Keywords: assessment and evaluation; EC 2000; outcomes assessment; Web-based assessment
INTRODUCTION
With the introduction of the Accreditation Board for
Engineering and Technology’s (ABET) challenging
‘‘EC-2000’’ accreditation criteria [1], the more than
1600 US engineering programs must develop
and implement systems for continuous improvement.
Faculty must demonstrate that the outcomes important
to the mission of the institution and the objectives of
the program are being met via sound measurement,
and must give evidence that these results are applied
to further develop and improve the program. Unfortunately, most engineering programs simply do not have
sufficient expertise, time, funds, and infrastructure to
conduct complete program assessments.
One cost-effective method to enhance an engineering school's evaluation program is the use of closed-form questionnaires. Such surveys allow faculty to obtain quantitative information that yields usable statistics and can be generated and fed back in a timely manner.
Closed-form questionnaires are relatively easy to administer and analyze [2] compared to other assessment methods, such as portfolios, authentic assessment, and concept maps, and can be effectively used with large numbers of students. Further, questionnaire results can be used for comparison purposes, including tracking students across academic levels (e.g., freshman, sophomore, junior, and senior), across departments within a school, or for benchmarking specific departments across several schools. Even so, accurate assessments require well-developed and tested surveys, an administration technique that makes minimal demands on faculty and student time, an ability to capture data for analysis, and the capability to generate meaningful reports for programmatic improvements. Currently, the majority of engineering surveys utilize a ‘‘paper-pencil’’ format that is administered during valuable class time. Extra time is then required to record student responses so that the resultant data are ready for analysis and reporting. Using ‘‘paper-pencil’’ surveys also increases the opportunity for data errors. For example, students can mark several responses or accidentally skip a question; if scanned, questionnaire responses can be misread; and if manually tabulated, data entry errors routinely occur.
A cost-effective solution would be the development of a system for assessing student outcomes that uses Web-based surveys, so that questionnaires can be taken on-line and reports generated to facilitate student tracking and various programmatic and institutional comparisons. Researchers at the University of Pittsburgh have developed such a system—the On-line Student Survey System (OS3). One important
objective of OS3 is to allow multiple engineering
schools to conduct customized, routine program
evaluations using EC 2000 related Web-based
survey instruments. To date, OS3 has been used by
several engineering schools with as many as 1000
students concurrently taking various questionnaires
and surveys.
This paper first provides an overview of the
system, its underlying architecture, and a description
of each of the components. A description of the various survey instruments is also provided, followed by a
discussion of how the system has been used to date.
Finally, a short discussion is provided about the future
direction of OS3.
OVERVIEW OF THE OS3
The goal in developing OS3 has been to provide
a Web-based system for administering surveys that
can be utilized by faculty at various institutions
without requiring extensive technical knowledge for
implementation and use. As Figure 1 depicts, the OS3 consists of three levels of users: student, local administrator, and global administrator. The student users participate by taking one or more surveys throughout their college experience. To maintain cross-institutional data integrity, the global administrator controls the addition and modification of surveys to the system, and provides local administrators access to the system. The local administrator is the responsible
individual at the school where the survey is being
conducted. The local administrator decides which
survey should be administered, to whom, and for what
length of time. The goal is to provide the local
administrators with the necessary tools to administer
the survey themselves. While the OS3 server is located
at the University of Pittsburgh, the local administrator
has control over the survey appearance and can
customize the interface to include the local school’s
logo and colors.
To start a survey, the local administrator ‘‘cuts and pastes’’ the targeted user (student) email addresses into a Java Applet running in a Web browser.
The local administrator also creates initial and
reminder email messages that will be sent to each
student on selected dates. Once this information has
been entered, the Java Applet sends the information
back to the OS3 server. The OS3 server application
(also written in Java) then connects to an Oracle
database and records the information.

Figure 1 A network view of OS3.

The OS3 system creates random passwords for each student; once a
particular student has taken a survey, his/her password
is invalidated. A local administrator can also start a
survey without emails by creating random passwords
from student names, which can then be exported to
a mail merge document and handed out individually to students. Student names are not maintained in
the database. (As with ‘‘pencil and paper’’ surveys,
schools are highly advised to consult their institution’s
institutional review board for policies concerned with
the use of human subjects. In many cases, human
subject consent is exempt for surveys because the data
is used for educational improvements.)
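To make the mechanism concrete, the following minimal sketch illustrates single-use random password generation of the kind described above. The class name, password length, and alphabet are illustrative assumptions; the actual OS3 source is not reproduced in this article.

    import java.security.SecureRandom;

    // Illustrative sketch of single-use password generation (hypothetical names).
    // The alphabet omits easily confused characters such as 0/O and 1/I.
    public class PasswordIssuer {
        private static final char[] ALPHABET =
            "ABCDEFGHJKLMNPQRSTUVWXYZ23456789".toCharArray();
        private final SecureRandom random = new SecureRandom();

        // Generate one random password; OS3 stores it with the StudentSurvey
        // record rather than with the student's name, and invalidates it once
        // the survey has been submitted.
        public String issue() {
            StringBuilder sb = new StringBuilder(8);
            for (int i = 0; i < 8; i++) {
                sb.append(ALPHABET[random.nextInt(ALPHABET.length)]);
            }
            return sb.toString();
        }
    }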
On the date specified by the local administrator,
the OS3 sends an email (created by the local administrator) requesting that students take the on-line
survey. While the University of Pittsburgh’s mail
server is used to send the email messages, the return
email address is that of the local administrator at
the student’s institution. Each student receives a
customized email message that gives the Web address
link for the survey. The student then logs into the
system using the password provided as shown in
Figure 2 (left).
The Java Applet downloads from the OS3 server
to the student’s browser and is only 33 KB; i.e., no
larger than a moderate size Web page with images. A
typical survey that has 50 to 70 questions takes less
than 15 min to complete. Only one question is presented to the student at a time, as shown in Figure 2
(right). Once the student submits the survey, the
results are compressed and sent to the OS3 server.
The results are then stored in the OS3 database and
the student’s password is invalidated. A confirmation
screen thanks the student for taking the survey. A
major benefit of using a Java Applet rather than HTML
is fast response time between questions, as the applet
contains all the survey questions and only sends the
results when the survey is submitted.
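As a rough illustration of this submit-once design, the sketch below serializes all answers into a single compressed payload. The use of GZIP and a tab-separated record format are assumptions made for illustration; the article states only that the results are compressed before transmission.

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.util.zip.GZIPOutputStream;

    // Sketch of a one-shot compressed survey submission (hypothetical names).
    class SurveySubmitter {
        void submit(String[] answers, OutputStream toServer) throws IOException {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            GZIPOutputStream gz = new GZIPOutputStream(buf);
            for (int i = 0; i < answers.length; i++) {
                // one line per question: index, tab, response
                gz.write((i + "\t" + answers[i] + "\n").getBytes("UTF-8"));
            }
            gz.finish();            // flush all compressed data into buf
            buf.writeTo(toServer);  // a single round-trip at submission time
            toServer.flush();
        }
    }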
Figure 3 provides a sequence diagram for OS3 that
delineates the general interactions between the student
applet, the local administrator, the server program, and
the Oracle database for a survey that has been scheduled
by the local administrator using student emails.
OS3 summarizes all active surveys for each
participating school daily. Summary reports on the
number of students who have and have not taken the
surveys are sent to the local administrators. In addition, local administrators may access the system to
inquire about the status of individual students, send
reminder emails to students, and extend a survey
beyond its previous end date.
Global administrators have the ability to create,
modify, and delete local administrator accounts and
surveys.
OS3 SYSTEM ARCHITECTURE
The OS3 system architecture depends on six main
components: three user interface applets, the server
application, the database, and the mail server. The
user interface applets, as described in the previous
section, are written in Java 1.1 so that they can be
executed in older browsers. The Student Applet
enables a student to take a survey, the Local Administrator Applet enables the administration of surveys,
and the Global Administrator Applet enables survey
addition and maintenance and addition/deletion of
local and global administrators.
Rather than enabling each applet to connect
directly to the Oracle database, a server-side Java
application sits between the applets, the database, and
the mail server. This provides an additional level of
security, drastically reduces the size of each of the
applets, and provides an efficient means of testing.
The Server Architecture, the Database Structure, and
the Application Protocol are individually discussed.
Figure 2 Sample screen images from the student applet. [Color figure can be viewed in the online
issue, which is available at www.interscience.wiley.com.]
Server Architecture
As shown in Figure 4, the OS3 server application is responsible for providing an interface to the client applets (see Application Protocol description), managing the database, sending out emails to students and administrators, and automatically starting/stopping surveys as they are scheduled. The server was developed using the Java language, and software libraries, such as the Java DataBase Connectivity (JDBC) library, were used to ensure versatility and platform independence. The production environment for OS3 is a Windows NT 4.0 server with dual Pentium II Xeon processors, a RAID 5 disk subsystem, and Oracle as the database. One of the OS3 development environments is a Linux workstation with dual Pentium II processors and PostgreSQL as the database.
Figure 3 Typical OS3 sequence diagram.
While development and deployment used two different operating systems and two different databases, there were only minor data type conflicts. This heterogeneous development/deployment environment demonstrates the portability of Java, SQL, and OS3. As shown in Figure 4, the three user interface applets are run within browsers remotely on the Internet. The Server Application and the Oracle Database both currently reside on the same machine, but this is not a requirement. The Mail Server is a separate machine currently maintained by the University of Pittsburgh. During testing of the pilot OS3, the system was able to maintain over a thousand concurrent connections without failure.
The ServerApplication is the main module of
the server. It initializes the major components of the
server and sets up the network sockets to handle
incoming connections. Once the system setup and
initialization have been completed successfully, the
ServerApplication checks for connections on the
socket it has created, and initiates SessionThreads to
handle each connection. This allows each connection
to be handled separately, but executed concurrently.
SessionThreads do the bulk of the processing work in
the server. Each SessionThread is responsible for
managing a single connection to a client. The SessionThread sits in a loop waiting for a message to be received from the client. Once a message is received, the SessionThread requests an appropriate MessageHandler from the MessageHandlerManager, processes
the message, sends back the response, and then goes
back to waiting. This continues until either party
requests termination of the connection.
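The accept-and-dispatch structure described above is a conventional one-thread-per-connection design. A minimal sketch follows; only the class names ServerApplication and SessionThread come from the text, while the port number and method bodies are placeholders.

    import java.io.IOException;
    import java.net.ServerSocket;
    import java.net.Socket;

    // Sketch of the accept loop: each incoming connection is handed to its
    // own SessionThread so connections are handled separately but concurrently.
    public class ServerApplication {
        public static void main(String[] args) throws IOException {
            ServerSocket listener = new ServerSocket(5000); // illustrative port
            while (true) {
                Socket client = listener.accept();  // block until a client connects
                new SessionThread(client).start();  // one thread per connection
            }
        }
    }

    class SessionThread extends Thread {
        private final Socket client;
        SessionThread(Socket client) { this.client = client; }
        public void run() {
            // loop: read a message, obtain a MessageHandler, send the response,
            // until either party requests termination of the connection
        }
    }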
The MessageHandlers and MessageHandlerManager are responsible for implementing the application protocol used by OS3. Each MessageHandler is designed to process a single message in the application protocol, and the MessageHandlerManager is responsible for finding the correct MessageHandler for a given message. When a SessionThread receives a message from the client, it simply requests a handler from the MessageHandlerManager, and uses the returned MessageHandler to process the message. The ServerApplication registers the MessageHandlers with the MessageHandlerManager during initialization. Examples of MessageHandlers include login requests and responses, requests for school lists and responses, and successes and failures of an operation.
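This registry pattern can be sketched in a few lines; the interface, the lookup key (a message-type token), and all identifiers other than MessageHandler and MessageHandlerManager are assumptions for illustration.

    import java.util.HashMap;
    import java.util.Map;

    // Each handler processes exactly one kind of protocol message.
    interface MessageHandler {
        String handle(String message); // returns the reply to send back
    }

    // Sketch of the manager that maps a message type to its handler.
    class MessageHandlerManager {
        private final Map<String, MessageHandler> handlers =
            new HashMap<String, MessageHandler>();

        // Called by ServerApplication for each handler during initialization.
        void register(String messageType, MessageHandler handler) {
            handlers.put(messageType, handler);
        }

        // Called by a SessionThread when a message arrives.
        MessageHandler lookup(String messageType) {
            return handlers.get(messageType);
        }
    }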
The SchedulerThread is created by the ServerApplication during initialization, and manages the starting and stopping of scheduled surveys and administrator updates, waking once every 24 hours to perform its operations, then returning to sleep. The final components of the server are the EmailQueue and SurveyDatabase. The EmailQueue acts as a mail spooler and handles all interactions with the Simple Mail Transfer Protocol (SMTP) server necessary to send each message. The SurveyDatabase sits between the actual database and the rest of the server, providing a simple database interface and managing all of the details internally. Interactions with the actual database are conducted using the JDBC library.
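The SchedulerThread's wake-sleep cycle can be sketched as follows; the method body is a placeholder, as the article does not reproduce the implementation.

    // Sketch of the daily scheduler: wake once every 24 hours, start surveys
    // whose start date has arrived, stop surveys past their end date, then sleep.
    class SchedulerThread extends Thread {
        public void run() {
            while (true) {
                startAndStopScheduledSurveys();
                try {
                    Thread.sleep(24L * 60 * 60 * 1000); // 24 hours
                } catch (InterruptedException e) {
                    return; // server shutdown
                }
            }
        }

        private void startAndStopScheduledSurveys() {
            // placeholder: query ActiveSurveys through the SurveyDatabase/JDBC
        }
    }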
Database Structure
The entity relationship diagram for the OS3 is depicted in Figure 5.

Figure 5 The entity-relationship diagram of the OS3 database.

The Surveys table contains a record
for each survey in the system. Each survey consists of
a number of SurveyItems, such as multiple-choice
questions or instruction text blocks. Any given record
may correspond to a number of ActiveSurveys. An
ActiveSurvey record represents a scheduled survey for
a given school. For example, if a local administrator
wants to conduct a particular survey from April 2 to
May 5, an ActiveSurvey record will be created with
the relevant information. This allows multiple schools
to use the same survey at the same (or overlapping)
times. In addition to the ActiveSurvey record, when
the specific local administrator schedules a survey,
StudentSurvey records will be created for each student
that is to take the survey. The StudentSurvey records
are used to determine if a student is allowed to take
a certain survey, and once a student has taken a
particular survey, the appropriate StudentSurvey
record is deleted from the database, preventing the
student from logging back in and retaking the survey.
StudentSurvey records are used to allow combinations
of students to be registered for multiple surveys.

Figure 4 The server architecture of OS3.

When a student completes a survey and the StudentSurvey
record is deleted, a SurveyResponse record is created
which contains the student’s responses to the survey
just submitted. In addition, the student’s alphanumeric
responses to open-ended questions are stored individually for easy processing.
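The lifecycle just described (check the StudentSurvey record at login; record a SurveyResponse and delete the StudentSurvey record at submission) maps naturally onto a few JDBC statements. The sketch below assumes column names suggested by Figure 5; they are illustrative rather than the actual schema, and error handling and resource cleanup are omitted for brevity.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    // Sketch of the StudentSurvey record lifecycle (hypothetical schema).
    class SurveyDatabase {
        private final Connection conn;
        SurveyDatabase(Connection conn) { this.conn = conn; }

        // A student may take the survey only while a StudentSurvey record exists.
        boolean mayTake(String password, int activeSurveyId) throws SQLException {
            PreparedStatement ps = conn.prepareStatement(
                "SELECT 1 FROM StudentSurvey WHERE password = ? AND active_survey_id = ?");
            ps.setString(1, password);
            ps.setInt(2, activeSurveyId);
            ResultSet rs = ps.executeQuery();
            return rs.next();
        }

        // On submission: insert the responses, then delete the StudentSurvey
        // record so the password cannot be reused.
        void submit(String password, int activeSurveyId, String responses)
                throws SQLException {
            PreparedStatement ins = conn.prepareStatement(
                "INSERT INTO SurveyResponse (active_survey_id, responses) VALUES (?, ?)");
            ins.setInt(1, activeSurveyId);
            ins.setString(2, responses);
            ins.executeUpdate();
            PreparedStatement del = conn.prepareStatement(
                "DELETE FROM StudentSurvey WHERE password = ? AND active_survey_id = ?");
            del.setString(1, password);
            del.setInt(2, activeSurveyId);
            del.executeUpdate();
        }
    }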
The Schools table consists of a single record for
each school containing the school’s name, the applet
customization information for that school (their logo,
welcome message, and text colors), and the unique
identifier that the system gives to the school. The
Users (Student, Local Administrator, or Global Ad-
ministrator) are also maintained in a table, which
records the username, password, email address, and
access level.
Application Protocol
The OS3 Application Protocol is the defined interface
between the client and server, and is used for all
network communication in the system. It is a text-
based protocol and uses TCP/IP for the network
communication. The protocol is broken down into a
collection of messages, each of which is terminated by
a newline character. By decomposing the system into
individual messages and handlers, the system can be
easily expanded and tested. Each of the three applets
(student, local administrator, and global administra-
tor) communicates with the server using these mes-
sages. The only difference is in the authorization level
that is associated with the particular login name. A user logs in once per TCP/IP connection, and the authorization level determined for that user is retained with the TCP/IP connection information.
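Reading such a protocol is straightforward; the sketch below reads one newline-terminated message at a time and dispatches on a leading type token, reusing the MessageHandlerManager sketch from the Server Architecture discussion. The message format shown is an assumption, as the article does not specify the message grammar.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.Socket;

    // Sketch of the per-connection message loop for the text-based protocol.
    class ProtocolSession {
        void serve(Socket client, MessageHandlerManager manager) throws Exception {
            BufferedReader in = new BufferedReader(
                new InputStreamReader(client.getInputStream(), "US-ASCII"));
            PrintWriter out = new PrintWriter(client.getOutputStream(), true);
            String line;
            while ((line = in.readLine()) != null) {   // one message per line
                String type = line.split(" ", 2)[0];   // leading token picks the handler
                MessageHandler handler = manager.lookup(type);
                out.println(handler == null
                        ? "ERROR unknown-message"
                        : handler.handle(line));
            }
        }
    }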
BATTERY OF AVAILABLE EC 2000 SURVEYS
A series of questionnaires was developed and tested by faculty in the Department of Industrial Engineering and is an integral part of the University of Pittsburgh School of Engineering's ongoing assessment and accreditation process [3-5]. The surveys were designed to assess and track students at appropriate points in their academic careers. Depending on their level, all engineering students take the Pittsburgh Freshman Engineering Attitudes Survey© (pre and post surveys), the Sophomore Engineering Learning and Curriculum Evaluation Instrument©, or the Junior Engineering Learning and Curriculum Evaluation Instrument© each academic year. When students become seniors, they take the Senior Exit Survey© during the semester prior to graduating. Three to five years post graduation, students will be asked to take the Pittsburgh Alumni Survey©.
Each survey was developed using a conceptual
model of the engineering education system based on
Bloom’s general taxonomy of the cognitive domain
[6] and Krathwohl's taxonomy of the affective domain [7].
See Besterfield-Sacre et al. for a general description of
the educational underpinnings of the model [8].
The content of each questionnaire includes the following. The Pittsburgh Freshman Engineering Attitudes Survey© (PFEAS) measures several facets of students' attitudes; e.g., their opinions about aspects of the engineering profession and the reasons that they chose to study engineering. The pre-questionnaire also rates the students' self-assessed confidence in their background (preparatory) knowledge and skills, and their perceived ability to succeed in engineering.
Students also rate their study skills and their interest in working in groups. The post version of the questionnaire contains the same measures as the pre-questionnaire plus additional items that capture the eleven outcomes enunciated by EC 2000. The Sophomore Engineering Learning and Curriculum Evaluation Instrument© and the Junior Engineering Learning and Curriculum Evaluation Instrument© measure the same aspects as the PFEAS, but also add the student's self-assessed preparedness for his/her current level given the previous year's educational experiences. The junior instrument asks additional questions about students' work experiences and how these relate to the outcomes. The Senior Exit Survey© asks supplementary questions about the student's future educational plans and employment information. The Pittsburgh Alumni Survey© instrument was developed from the alumnus's perspective of reflecting on the overall educational system he/she experienced as a student. Alumni are asked to provide overall ratings of their degree program and the engineering school, as well as to rate their level of competence in achieving each EC 2000 outcome at the time of graduation. Individuals then provide a candid critique of their education in terms of the curriculum, the culture of the school, the in-class instruction, and their work experiences.
All questionnaires have been thoroughly tested
for reliability and validity. Since the inception of the
first questionnaire in 1993, over 20 schools have
adopted one or more of the questionnaires (paper-
pencil versions) in an effort to better understand the
characteristics of their engineering students [9], study
attrition and probation issues [10–12], provide an
evaluation tool for educational interventions [13–14],
as well as to measure EC 2000 outcome-related issues. Further, University of Pittsburgh engineering programs have used the questionnaires as part of their ABET accreditation self-studies under the new criteria.
EVALUATION OF THE OS3
An initial pilot of the system was conducted in spring 2000 at a single remote school using the Pittsburgh Freshman Engineering Attitudes Survey© (PFEAS) with over 800 students. This ‘alpha’ test verified that the overall system capabilities and database worked appropriately. After the ‘alpha’ test, a ‘beta’ pilot test was conducted in the fall 2000 semester with three remote schools (two small schools with approximately 60 students each and one large institution with over 500 students) running the PFEAS survey concurrently.
For the first two pilots, the load on the system was greatest during the first 4-24 hours after the initial email was sent and during the 4-12 hours after each reminder email. Thereafter, the load on the system was dramatically reduced. Such information is helpful in predicting peak loads on the system, as well as determining when reminder emails should be sent to students. One concern is that the response rates from the two pilots were between 32 and 42% of the students surveyed. As Dillman [15] indicates, this response rate is typical for a single email and one reminder email. However, if the data are to be used for viable assessment and evaluation of the EC 2000 outcomes, then higher response rates are desired. In a third pilot that utilized three different surveys for University of Pittsburgh engineering students, the response rates were 70% for the Pittsburgh Freshman Engineering Attitudes Survey© (post survey), 48% for the Sophomore Engineering Learning and Curriculum Evaluation Instrument©, and 44% for the Junior Engineering Learning and Curriculum Evaluation Instrument©. Each survey was available to the students for two weeks with one reminder email.
Faculty and student advisors made frequent reminders in the freshman engineering courses, hence the respectable response rate; for the sophomore and junior engineering students, no reminders were given in person. In the future, courteous reminders will be given in the engineering courses to improve the response rates. However, such reminders must be made cautiously. Several students indicated in this last pilot that they did not wish to take the survey and requested that they be removed from the reminder list. In addition, a few students were misclassified in terms of the survey they were asked to take (e.g., a student assigned to take the sophomore questionnaire was actually a junior) and requested a different survey. These requests prompted new features in the local administrator applet: the ability to add and drop individual users.
In the fourth and final pilot, the PFEAS post-survey was conducted at a large state university; survey submissions made over a two-week period were logged to uncover any underlying distributions. From the log, the date and time of each survey submission were recorded. Similar to the previous pilots, the load on the system was greatest during the 4-24 hours following the survey initialization email. Survey submissions then declined sharply until a reminder email was sent, causing another, smaller surge of submissions. Over the 12-15 hours following the reminder email, 26% of the surveys were submitted. At the completion of the pilot, over 57% of the students had responded. The load on the different days of the week was also analyzed. The load was greatest on days that included the survey initialization email or the reminder email, indicating that load depends on these email events rather than on the day of the week. The time of day that surveys were submitted was also analyzed, revealing that 43% of the students completed the survey between the hours of 2 and 4 pm. These indications may help predict the load on the system on any given day and time. If so, the load can be controlled through systematic queuing.
DISCUSSION AND FUTURE WORK
As an increasing number of engineering schools
address the requirements of EC 2000, the need for a
mechanism to routinely elicit student self-assessments
and evaluations and to facilitate both tracking and
cross-institutional benchmarking is apparent. Many, if
not most, engineering programs simply do not have
sufficient expertise and resources to conduct these
necessary program assessments. The development
of the OS3 allows multiple engineering schools to
conduct routine program evaluations using EC 2000
related Web-based survey instruments.
Several information technology and assessment issues need to be tackled before such a system can be used on a national level. From the technology perspective, although great lengths have been taken to maximize the performance of the server-side components of OS3 while minimizing the amount of communication required for each survey, further improvements need to be made if the system is to be used on a nationwide basis. Currently, the Student applet is 33 KB and only communicates with the server when a student logs into the system and when the student has completed the survey. While this minimal communication overhead has greatly expanded the number of surveys that can be taken concurrently, multiple and geographically distributed servers should be deployed to load balance the system and enhance server speed, especially since the pilot work indicates that server load distributions can be predicted. The overall load on the system can be reduced further by planned queuing of the initial and reminder email messages to the students based on when various schools conduct the questionnaire(s) and the size of the school. Thus, a more uniform distribution of the overall system load can be maintained and better distributed across the geographically distributed servers.
From the EC 2000 assessment perspective, a flexible reporting system is needed to allow the local administrator (and other faculty) to examine different student groups (e.g., gender, ethnicity) and engineering programs, compare different programmatic initiatives, track students longitudinally as they matriculate, evaluate how responses correlate with quality indicators (e.g., performance, retention), and make comparisons to other programs given certain criteria (e.g., size of school/program, Carnegie Foundation classification). In doing this, the OS3 database system can act as a survey assessment warehouse for engineering programs conducting benchmarking studies.
ACKNOWLEDGMENTS
This research is being funded by grants from the Engineering Information Foundation: Perception versus Performance: The Effects of Gender and Ethnicity Across Engineering Schools (EiF 98-4) and the National Science Foundation: Engineering Education: Assessment Methodologies and Curricula Innovations (EEC-9872498). We would like to thank John Merrill, Ohio State University; Teri Reed Rhoads, University of Oklahoma; Barbara Olds and Ron Miller, Colorado School of Mines; and Rick Olson and Susan Lord, University of San Diego, for their participation in the OS3 development and pilot work.
REFERENCES
[1] Engineering Criteria 2000, Third Edition: Criteria for accrediting programs in engineering in the United States, Accreditation Board for Engineering and Technology (ABET), Baltimore, MD, 1997. http://www.abet.org/EAC/eac2000.html
[2] M. E. Besterfield-Sacre, C. J. Atman, and L. J. Shuman,
Engineering student attitudes assessment, J Eng Edu
87(2) (1998), 133–141.
[3] M. E. Besterfield-Sacre and C. J. Atman, Survey
design methodology: Measuring freshman attitudes
about engineering, Am Soc Eng Edu Confer Proc
(1994), 236–242.
[4] M. E. Besterfield-Sacre, L. J. Shuman, C. J. Atman, and H. Wolfe, Three approaches to outcomes assessment: Questionnaires, protocols, and empirical modeling, Am Soc Eng Edu Confer Proc, 1997 (CD ROM).
[5] M. Moreno, M. E. Besterfield-Sacre, L. J. Shuman,
H. Wolfe, and C. J. Atman, Self-assessed confidence
in EC-2000 outcomes: A study of gender and
ethnicity differences across institutions, Proceedings
of the 2000 Frontiers in Education Conference,
Kansas City, MO, October, 2000.
[6] Bloom et al., Taxonomy of Educational Objectives,
Handbook I: Cognitive Domain, New York: David
McKay Company (1956).
[7] Krathwohl et al., Taxonomy of Educational Objec-
tives, Handbook II: Affective Domain, New York:
David McKay Company (1964).
[8] R. Hoare, M. E. Besterfield-Sacre, L. J. Shuman, R.
Shields, and T. Johnson, Cross-institutional assess-
ment with a customized web-based survey system,
2001 Frontiers in Education Conference, Reno, NV,
October, 2001.
[9] M. E. Besterfield-Sacre, M. Moreno, L. J. Shuman, and C. J. Atman, Comparing entering freshman engineers: Institutional differences in student attitudes, Am Soc Eng Edu Confer Proc, June 1999, Charlotte, NC.
[10] M. E. Besterfield-Sacre, C. J. Atman, and L. J. Shuman, Characteristics of freshman engineering students: Models for determining student attrition and success in engineering, J Eng Edu 86(2) (1997).
[11] A. Scalise, M. E. Besterfield-Sacre, L. J. Shuman, and H. Wolfe, First term probation: Models for identifying high risk students, Proc of the 2000 Frontiers in Education Conference, Kansas City, MO, October 2000.
[12] L. J. Shuman, A. Scalise, H. Wolfe, C. Delaney, and
M. E. Besterfield-Sacre, Students who leave engi-
neering over time: An in-depth analysis of attrition
and interventions to improve retention, 1999 Amer-
ican Society for Engineering Education Conference
Proceedings, June 1999, Charlotte, NC.
[13] M. E. Besterfield-Sacre, L. J. Shuman, C. J. Atman, R. L. Porter, R. M. Felder, and H. Fuller, Changes in freshman engineers' attitudes—A cross-institutional comparison: What makes a difference? Frontiers in Education Conference, November 1996 (CD ROM).
[14] R. L. Porter and H. Fuller, A new ‘‘contact-based’’
first year engineering course, J Eng Edu 87(4)
(1998).
[15] D. A. Dillman, Mail and Internet Surveys: The Tailored Design Method, New York: Wiley, 2000.
BIOGRAPHIES
Raymond Hoare is an assistant professor
of electrical engineering at the University
of Pittsburgh. He received his master’s
degree from the University of Maryland
and his PhD from Purdue University.
Dr. Hoare teaches hardware design meth-
odologies and software engineering. His
research focus is on the architectural design of digital systems. Within a single VLSI chip, his focus is synchronization and application-specific high-performance architecture. For larger applications, custom-designed network interconnection is needed. Applications for these two
domains include networking appliances and System-on-a-Chip
(SoC) consumer devices. On the software engineering side, Dr.
Hoare’s research involves specialized systems of servers that utilize
a custom server network when the servers are centralized. For
distributed systems, the Internet and specialized communication
infrastructures are utilized to provide a foundation for software
development. Software applications include the Online Student
Survey System (OS3) and Pegasus, A Distributed System for
Mechanically Engineered Products.
Mary Besterfield-Sacre is an assistant
professor in the Department of Industrial
Engineering at the University of Pitts-
burgh. Her principal research interests
are in empirical modeling applications
for quality improvement in manufacturing and service organizations and in engineering education evaluation methodologies. She is widely published in the
engineering education literature and is
coauthor of the forthcoming Total Quality Management (third
edition; Prentice Hall, 2002). She is also the educational director
for the Swanson Center for Product Innovation at the University of
Pittsburgh. Dr. Besterfield-Sacre has been principal or co-principal
investigator on several sponsored research projects with funding
by the National Science Foundation, U.S. Department of Edu-
cation, Sloan, and the Engineering Information Foundation. She
received her BS in engineering management from the University
of Missouri-Rolla, her MS in industrial engineering from Purdue
University, and a PhD in industrial engineering at the University
of Pittsburgh. Prior to joining the faculty at the University of
Pittsburgh, Dr. Besterfield-Sacre worked as an industrial engineer with ALCOA and with the U.S. Army Human Engineering Laboratory. She is a member of ASEE, IIE, HFES, and
INFORMS.
Dan Ertman is a graduate student in the Department of Electrical
Engineering at the University of Pittsburgh. He has recently
obtained a BS in computer engineering and a BA in English
Writing at the University of Pittsburgh, both with honors. His
research is in the area of medical imaging annotation and
evaluation. This study is his first journal publication. Dan improved
the functionality and stability of the OS3.
Jessica Gerchak is a graduate student at the University of
Pittsburgh in the Department of Industrial Engineering, where she
also received her BS degree. Her research is based in the area of
evaluation methodologies in engineering education. Jessica is a
member of IIE and was the winner of the Technical Paper
Competition at the Regional Conference.
Troy A. Johnson received his BS in computer engineering from the
University of Pittsburgh in December 2000. He is currently pursuing
an MS and PhD in electrical and computer engineering at Purdue
University. His research interests include speculative architectures,
optimizing compilers, and parallel/distributed systems. He was one
of the original designers of OS3.
Robert L. Shields currently works for CombineNet as a developer. He is a student majoring in computer engineering at the
University of Pittsburgh. He was one of the original designers
of OS3.
Larry J. Shuman is associate dean for
academic affairs, School of Engineering,
University of Pittsburgh, and a professor
of industrial engineering. His areas of
interest are improving the engineering
educational experience and the study of
the ethical behavior of engineers. He is a
coauthor of Engineering Ethics: Balancing Cost, Schedule and Risk—Lessons Learned from the Space Shuttle (Cambridge University Press, 1997). He has published widely in the
engineering education literature and serves on the editorial board
of the Journal of Engineering Education and the Journal of SMET
Education. Dr. Shuman has been principal or co-principal investigator on 25 sponsored research projects funded by such government agencies and foundations as the National Science Foundation, the U.S. Department of Health and Human Services, the Department of Transportation, the Robert Wood Johnson Foundation, and the Engineering Information Foundation. He holds a
PhD in operations research from the Johns Hopkins University and
a BSEE from the University of Cincinnati. He is active in the
ERM section of ASEE and served as general co-chair of the 1997
Frontiers in Education Conference. He served as the academic dean
for the ‘‘Semester at Sea’’ Program for Spring 2002.