Improving the Quality of Student Advising in Higher Education – A Case Study
CARL B. MONTANO*, MADELYN D. HUNT** & LOWELL BOUDREAUX†
*College of Business, Lamar University, Beaumont, Texas, USA, **Center for General Studies, Lamar University, Beaumont, Texas, USA, †Lamar State College at Port Arthur, Texas, USA
ABSTRACT This paper demonstrates the application of the quality improvement process (Deming Cycle) to improve the quality of student advising at Lamar University in Beaumont, Texas. The quality improvement project was undertaken by the Continuous Quality Improvement (CQI) Team at the Center for General Studies, assisted by two facilitators. All steps in the process are shown with detailed information, in the hope that colleagues elsewhere who would like to apply the method can do so successfully. These steps include construction of a SIPOC (Suppliers, Inputs, Process, Outputs, Customers) diagram and flowchart, the nominal group technique, control chart analysis, a focus group, a customer satisfaction survey, recommendations implemented, and follow-up. The president of the university credited this project as having had a definitive impact on the retention of General Studies students.
KEY WORDS: Deming cycle, quality of student advising, Total Quality Management, SIPOC, voice of the process, voice of the customer, nominal group technique, focus group
Introduction
Why is Improving Product Quality Important in Business as well as in a University?
The importance of improving the quality of an organization’s product (good/service) may
wax and wane in the news media, but hard-nosed business people take it as a given.
Lessons get etched in our minds when we see how bad things could become in business.
Take the case of Firestone tyres. Because of tread separation problems with its tyres
on Ford Explorer sport utility vehicles, numerous fatal accidents occurred. Recently, the honeymoon between the two companies ended: Ford Motor Company dropped Bridgestone/Firestone as its tyre supplier (see Greenwald, 2001). This, along with a string of product liability lawsuits, has staggering bottom-line implications for Firestone.
Total Quality Management
Vol. 16, No. 10, 1103–1125, December 2005
Correspondence Address: C. B. Montano, Department of Economics & Finance, Lamar University,
PO Box 10045, Beaumont, Texas 77710, USA. Email: [email protected]
1478-3363 Print/1478-3371 Online/05/101103–23 © 2005 Taylor & Francis. DOI: 10.1080/14783360500235843
What about in a university setting? How bad could things get before top management
takes action? In the area of student advising, my colleagues and I have seen and heard
horror stories. There was one about an adviser who told the student, ‘You came here
for my advice. Here’s the catalogue. You know how to read it.’ One wonders how this
kind of student advising service affects student enrolment and retention (see also
Yorke, 2000). Consider next the cascading effect of declining enrolment or semester
credit hours (SCH) on the university's bottom line. In Texas and most other states, budgets of state-supported universities are determined through formula funding, which depends on past SCH production.
What has Quality of Student Advising got to do with Higher Education?
It may not be obvious at first blush, but if we apply the systems thinking taught by Deming
and other quality gurus, higher education is a production system with suppliers, inputs,
processes, outputs and customers (Figure 1). Student counselling or advising is a component of this production system. How this process is managed will have a significant impact on the university's outputs, customers, and financial well-being.
What Happens when the Quality of a Product is Improved?
Numerous research studies in business have shown that improving product quality has a
double-whammy impact on the firm’s profitability. Note that there are only two sides to
the economic profit formula (Economic Profit = Total Revenue − Total Cost): revenue
and cost. Both sides benefit from improving quality (Figure 2). Improved quality of
design (‘nice Lexus’) raises the perceived value of a firm’s product, thereby increasing
the willingness of customers to pay a higher price for it. This higher price, plus the
increase in market share, contributes to higher revenues. Furthermore, improved quality
of conformance (‘less/no defects’) lowers manufacturing and service costs. This is
similar to Deming’s chain reaction, whereby improving quality decreases costs because
of less rework, fewer mistakes, fewer delays and snags, and better use of time and
materials. Consequently, productivity (= outputs/inputs) improves. And, in turn, the lower per-unit cost enables firms to charge lower prices, thus becoming more competitive in gaining market share. We add to this analysis the demand- or revenue-enhancing effect of repeat buying by satisfied customers over time.

Figure 1. Higher education as a production system. (Source: Evans & Lindsay, 1999: 51)

1104 C. B. Montano et al.
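As a toy illustration of this double whammy, the sketch below compares economic profit before and after a hypothetical quality improvement that supports a slightly higher price, a larger market share, and a lower unit cost. All figures are invented for illustration; nothing here comes from the paper's data.

```python
# Economic Profit = Total Revenue - Total Cost, per the formula above.
def economic_profit(price, units, unit_cost):
    return price * units - unit_cost * units

# Before: baseline price, volume, and cost (hypothetical numbers).
before = economic_profit(price=10.0, units=1000, unit_cost=8.0)

# After: better design quality supports a higher price and more buyers,
# while better conformance quality lowers the unit cost.
after = economic_profit(price=10.5, units=1100, unit_cost=7.5)
```

Both sides of the formula move in the firm's favour at once, which is the point of Figure 2.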
Many organizations in the private and public sectors use Total Quality Management
(TQM) to improve the quality of everything they do. This management philosophy builds
quality throughout the entire production process, in contrast to the old and discredited
approach of building quality by inspection of the product at the end of the production line
(Spanbauer, 1995). The General Accounting Office listed the following among the expected
benefits of TQM: customer satisfaction up, customer retention up, customer complaints
down, costs down, cycle time down, employee turnover down, employee satisfaction up,
safety and health up, productivity up, and, finally, market share up, and profits up.
Core Principles of TQM
The core principles of TQM are: (1) focus on the customer; (2) participation and team-
work; and (3) continuous improvement and learning (Evans & Lindsay, 1999). A
simple but unique way of explaining why an organization must focus on its customers
was heard from a quality supervisor who works at North Star Steel. He said, ‘This is
the person who puts bread on the table for my family.’ The second principle, participation
and teamwork, recognizes the fact that an organization’s most valuable resource is its
people. Humans, unlike money or machines, think, and they are the ones who make
things happen. The person who is actually doing the job/process possesses vast knowledgevital to understanding problems with that job/process. Managers must learn how to tap
this knowledge through participation and teamwork. And in the end, these are the same
people who must implement changes in the system. Given the opportunity, people do
their best and take pride in their work. The third principle, continuous improvement and
learning, is what we consider the heartbeat of TQM. An organization may have all the
components of TQM, but without implementing continuous improvement, no process,
and thereby quality of product, gets improved.
The Deming Cycle (PDSA)
Quality improvement is a journey, not a destination. It is a cyclical and never-ending process, often referred to as the Deming Cycle or PDSA Cycle (Figure 3). PDSA stands for Plan, Do, Study, and Act.

Figure 2. The double whammy of improving quality. (Source: Evans & Lindsay, 1999: 22)

In their concept of kaizen, the Japanese prefer to view
process improvement as very incremental (Clayton, 1995). One may make only a minute
improvement per PDSA cycle, but if it is done continuously, in the long run it will amount
to a dramatic change. Thus, one author views the PDSA cycle as a wheel that goes up a
hill as it turns or cycles. The higher it goes, the higher the performance level of the
business in all aspects, especially financial. The four steps involved in this cycle can
also be summarized in the form of a flowchart (Figure 4).
Purpose of this Paper
A strong motivation for us in writing this paper is the desire to share with a wider audience
our experience in applying the Deming Cycle for improving the quality of student advising
in a university setting. We provide detailed information on the steps involved, in the hope that colleagues elsewhere who would like to apply the method can do so successfully. Our
Continuous Quality Improvement (CQI) project progressed through the Plan and Do
steps of the cycle and was about to commence the Study phase. We will present key findings that became the bases of the Plan. Then we will detail recommendations implemented
to improve the quality of student advising at Lamar University (the Do phase). Lastly,
we will outline our plans for the Study phase of the cycle.
How the Project got Started
This CQI project in the Center for General Studies is an offshoot of a similarly successful
CQI project conducted earlier in the Admissions Office of Lamar University (Montano &
Utter, 1999). We would like to stress at the outset that no TQM/CQI project can
succeed without a champion from the top management of the organization (Spanbauer,
1995). Our champion was Dr William Cale, then Executive Vice President for Academic
Affairs at Lamar University. He is now the Chief Executive Officer of the University of
Pennsylvania at Altoona.
Figure 3. The Deming Cycle (Source: Deming, 1993: 135)
The Quality Improvement Process at Lamar University
Orientation
A two-day orientation was conducted beginning with the introduction of facilitators and
participants. The facilitators briefed the team members on the project and the rationale
for selecting team leaders and their members. The orientation also included TQM
philosophy, quality improvement methodology, and quality improvement tools.
Team Formation
Dr Carl Montano, Professor of Economics, and Lowell Boudreaux, Instructor of
Economics, served as facilitators of this project. The remainder of the team consisted of the staff of the Center for General Studies: Dr Madelyn Hunt, the Executive Director (the team leader), two academic advisers, and the administrative assistant.

Figure 4. Process improvement methodology. (Source: handouts of Dr Bryan Cole)

One of the advisers
served as recorder during the meetings. Although team participation by all individuals who
provide services in the Center is very important (Spanbauer, 1995), student assistants were
not included on the team because of scheduling problems.
The project began in June 1999 and the final report was submitted to the administration
in May 2000. With few exceptions, we met on a weekly basis throughout the project year.
The PLAN Phase of the Deming Cycle
After the two-day orientation, the TQM/CQI team began the Plan phase of the Deming
cycle. In the Center for General Studies, this plan was to improve the advising process.
Five steps had to be undertaken before recommendations to improve the advising
process could be made and carried out (Figure 4).
1. SIPOC Analysis
The first step was SIPOC analysis. Used to identify all the components of the advising
process, SIPOC stands for
S = 'Suppliers of inputs', the source of students and information, i.e. high schools, other universities, and colleges;
I = 'Inputs to the process', such as information used in advising, i.e. catalogues, student files and records;
P = 'Process', the activities in the Center, directly or indirectly having to do with student advising, which produce output/product in the form of a service;
O = 'Outputs', the results of the advising process, i.e. graduates, students with 2.0 or above GPAs, changes of major, etc.; and
C = 'Customers', the people who receive the output, i.e. employers of our graduates, graduate schools, etc. Primary customers are our General Studies (GS) majors, undecided students, and students experiencing academic difficulties.
It took several brainstorming meetings to develop the SIPOC for the Center (Table 1).
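For teams who want to keep the SIPOC under revision between brainstorming meetings, it can be recorded as a simple data structure. This is only an illustrative sketch with an abbreviated subset of Table 1's entries; the structure and helper are our invention, not part of the original project.

```python
# Abbreviated SIPOC for the Center (subset of Table 1; not exhaustive).
sipoc = {
    "Suppliers": ["High schools/GED", "Admissions Office", "All LU departments"],
    "Inputs": ["Catalogues", "Student files/records", "SIS data"],
    "Process": ["Advise, register, monitor, and inform students "
                "until they change major or graduate"],
    "Outputs": ["Graduates", "Students with 2.0 or above GPA", "Changes of major"],
    "Customers": ["GS majors", "Undecided students", "Employers"],
}

def describe(sipoc):
    """One line per SIPOC component, in S-I-P-O-C order."""
    return [f"{key}: {'; '.join(items)}" for key, items in sipoc.items()]
```

Printing `describe(sipoc)` gives the team a compact view to mark up at the next meeting.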
2. Flowchart
The next step was to develop a flowchart (Figure 5). The TQM/CQI team discussed each step of the advising process, from the time the student enters the Center to the time the student graduates, changes majors, or drops out.
The advising process begins with the student intake. When the student contacts the
Center, either through calling for an appointment or walking in for an appointment, the
student’s social security number is obtained and entered in the Student Information
System (SIS). This is to determine whether or not the student should be advised in the
Center, a service intended for undecided and GS majors only (Figure 5). On Mondays,
walk-in days, a student can circumvent the waiting period for an appointment and see an adviser while in the Center. As will be shown later, walk-in days create advising backlogs just before each semester begins.
After receiving advice on course selection, students were then registered into courses. This, too, created problems on the days just before a semester begins, when most of the classes have reached capacity and closed.
Table 1. SIPOC of the Center for General Studies (CGS), Lamar University (LU)

Suppliers: Lamar State College–Port Arthur; Lamar Institute of Technology; Lamar State College–Orange; high schools/General Education Diploma (GED); all LU departments; students; Admissions Office; conferences; Institutional Research; Texas Academy for Leadership in the Humanities (TALH)/Lamar Early Access Program (LEAP); Office of the Executive Vice President for Academic Affairs (EVPAA); Texas Academic Skills Program (TASP) Office; web sites; Academic Advisors Forum; Computer Center; faculty; parents; out-of-state institutions of higher learning; in-state institutions of higher learning

Inputs: catalogues; files/records; grade reports (TALH/LEAP); Student Information System (SIS) data; general information; TASP results – class placement; change of major; policies of EVPAA; GS majors (2.0 or above grade point average (GPA); below 2.0 GPA)

Process: the process of advising, registering, monitoring, and providing information to students until those students change their major or graduate*

Outputs: graduates; students with 2.0 or above GPA; change of major; TALH/LEAP folders; phone calls; customer contact letters (monitored probation; off-monitored probation; suspension warning; congratulatory letters: off-probation, improved GPA); drops, adds, withdrawals; waiver of suspension; suspension; monitored probation study; referrals to Career Center and departments; degree plans; participation in University Services (Orientation, Springfest, Committees)

Customers: employers; graduate schools; any department within LU; TALH/LEAP programs; students**; high schools; parents; Council of Instructional Departments (CID); Career Center

*Although not described here, the McNair Project and Supplemental Instruction are also under the CGS program.
**Primary customers.
Figure 5. (a) Flowchart of the advising process at the Center for General Studies
Next, the student’s desired major is determined (Figure 5). That is, whether the student
is undecided or desires a Bachelor of General Studies (BGS). If the student does not desire
a BGS and has good academic standing (a GPA of 2.0 or higher), he/she is referred to the
Career Center or scheduled to visit with a faculty member for career exploration. When the
student decides on a major, a change of major from GS to the department of the student's choice is initiated.

Figure 5. (b) Flowchart of the advising process at the CGS (continued)

If the student has a GPA of less than 2.0, the student is placed on
probation and receives remediation based upon the level of the GPA. While on probation,
the student will visit the Career Center or a faculty member for career exploration. Once
the student achieves good academic standing, a change of major is initiated.
Figure 5. (c) Flowchart of the advising process at the CGS (continued)
If the student desires a BGS and is in good academic standing (Figure 5), a degree plan
is prepared. The student is then advised in the Center until he/she graduates. Three semesters before graduation, a formal degree plan is prepared and submitted to the Registrar's Office.
Several meetings were required to fine-tune the flowchart. The SIPOC and flowchart
enabled the team to understand clearly the mission and processes of the Center.
3. Nominal Group Technique (NGT)
The initial determination of quality-of-service problems was done by the CGS staff using
the NGT (Mosley, 1974). Each staff member contributed to the listing of problems, as well
as to the ranking of these problems according to their importance in affecting the quality of
advisement service at the CGS. The lowest number in the ranking represented an area
having the greatest effect on advising. Eleven areas were identified (Table 2). The NGT
results indicated the following areas to be of greatest concern to the staff: the customer
service skills of other departments (ranked 1st), lack of advisers in other departments
during holidays and between semesters (2nd), and volume of students or student/adviser ratios are too high (3rd). The two highest ranking priorities were beyond the
control of the Center, as were the 5th, 6th, 9th and 10th. However, items ranked 3rd,
4th, 7th, and 8th were within the Center’s control. The 3rd priority, student/adviser ratiobeing too high, could be addressed by hiring an additional CGS adviser. The 4th priority
concerned students’ waiting time for advisement. The staff felt that, at times, the students
waited too long to be advised. The staff also recognized that registering students in the
Center (7th priority) was very time consuming, especially on the days before the start of each semester.

Figure 5. (d) Flowchart of the advising process at the CGS (continued)

The 8th priority, mis-routed calls, reflected the fact that attending to
these calls is time consuming, especially on high volume days.
What the staff saw as problems through the NGT were similarly revealed by students in
the focus group, as shown later.
4. Control Chart Analysis
The above SIPOC analysis and flowchart bring home the point that student advising
is a complicated process with suppliers, inputs, outputs and customers. A control chart
is a diagnostic tool to determine if a process is ‘in control’. It is a tool for management
to hear the voice of the process. It is important that the process is not ‘out of control’ to
ensure that the product/output is free of defects. Note that, in any production process,
the quality of the product/output can only be improved through improving the process.
This is what underlies the TQM philosophy of building quality throughout the
process, and not by inspection of the product at the end of the production line.
Since a control chart is designed for monitoring the output of the process over time,
there has to be a measurement of the output or of its various (quality) dimensions. For
example, if the process involves cutting of a widget into a certain length, the output can
be measured not only in terms of numbers of widgets cut, but more importantly, in
terms of the length of each to see if it meets the customer’s specification. If the length
is ‘out of spec’, the output produced (widget) is defective.
Table 2. Results of the NGT

Individual rankings
by Center staff   Tally  Rank  Area of concern
6 2 3 2            13     1    Customer service skills of other departments
3 7 4 1            15     2    Lack of advisers in other departments during holidays and between semesters
4 3 6 4            17     3    Volume of students or student/adviser ratio is too high
10 5 1 3           19     4    Wait for advisement
1 11 2 8           22     5    Students not required to attend mandatory orientation (lack of knowledge of policies and prerequisites)
2 10 5 7           24     6    Need to enforce deadlines on submitting admissions applications, getting transcripts
11 1 9 5           26     7    Registration in the Center
8 4 10 6           28     8    Mis-routed calls
5 8 7 11           31     9    Forgotten TASP information
7 6 8 10           31     9    Delay of removing holds at departmental level
9 9 11 9           38    10    Students failing to appear for appointments at Career Center
Nominal Group Technique (NGT) results indicate that the high-priority areas of concern are as follows: customer service skills of other departments, lack of advisers in other departments during holidays and between semesters, and the volume of students or student/adviser ratio being too high.
NGT results also indicate that students failing to appear for appointments, students forgetting TASP information, and delays in removing departmental holds are the low-priority areas of concern.
Note: Center staff members ranked their individual areas of concern from most important to least important. The
lower the total, the more immediate the concern. The higher the total, the less immediate the concern.
It turns out that student advising at the CGS involves processing students until, at the
end of the production line, they graduate with a BGS degree or change majors. Meanwhile,
along the way, some would have dropped out or transferred to other schools. Any management intervention at this endpoint, to lower the dropout rate (or increase the retention rate),
would be too late. Therefore, it is important that the measure of output (or its quality) is
done early in the process.
In this project, we were fortunate that the CGS office kept a record of the number of
students advised. This was the output data we used in the control chart analysis. We developed control charts for: (1) monthly; (2) daily; and (3) hourly (on Mondays) advisement data.
The control chart of monthly advice data, from August 1998 to December 1999, is
shown in Figure 6. Three parts of the control chart are worth noting: Center Line (or
Average), Upper Control Limit (UCL) and Lower Control Limit (LCL). The Average
or Center Line shows that, on the average, the CGS advises 233 students per month.
The UCL is defined as three standard deviations above the Average or Mean. Correspondingly, the LCL is defined as three standard deviations below the Mean or Average. In the
case we are dealing with here, the number of students advised per month, the LCL of −288 is meaningless and should be ignored, because the smallest possible number of students advised per month is zero (0). The relevant control limit to watch out for is the UCL.
Since none of the observations fell outside the UCL of 754, we conclude that, on a
monthly basis, the advising process at the CGS is 'in control'. This means that the process is stable.

Figure 6. Control chart of monthly advisement data, Center for General Studies, August 1998 to December 1999

Hence, any variation/fluctuation in output (or its quality) is predictable
and can be attributed to ‘common/systemic causes’ and/or random factors. When the
process signals that it is stable or ‘in control’, there is no need for management to make
any adjustment or alteration. Doing so amounts to unnecessary ‘tampering’, which can
make the process worse. The fact of the matter is, from the viewpoint of advisers,
student advising is an hourly or daily phenomenon or service encounter. Therefore, our
team decided also to do daily and hourly control chart analyses.
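The control limits described above can be computed as in the sketch below. The paper does not say how the standard deviation was estimated (classical individuals charts use moving ranges), so as an assumption this sketch simply takes the standard deviation of the observed counts; the counts themselves are invented, not the Center's data.

```python
import statistics

def control_limits(counts):
    """Center line and three-sigma limits, as in the paper's charts.
    The LCL is floored at zero because a count of students advised
    cannot be negative."""
    center = statistics.mean(counts)
    sigma = statistics.pstdev(counts)  # assumption: plain SD of the counts
    ucl = center + 3 * sigma
    lcl = max(center - 3 * sigma, 0)
    return center, ucl, lcl

def out_of_control(counts):
    """Indices and values of observations outside the control limits."""
    center, ucl, lcl = control_limits(counts)
    return [(i, x) for i, x in enumerate(counts) if x > ucl or x < lcl]

# Invented daily counts: a stable process with one special-cause spike.
daily = [10, 12, 11, 13, 12, 11, 12, 10, 13, 12, 90]
```

Running `out_of_control(daily)` flags the spike, which is exactly the kind of point the team traced back to pre-semester rush days.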
The control chart of the daily advisement data (Figure 7) shows that, on average,
the CGS advised about 12 students per day. The Upper Control Limit (UCL) is 27.
The chart shows that there are many out-of-control points, or observations that lie
above the UCL. Therefore, we concluded that, on a daily basis, the advisement process
at the CGS is out of control. Out-of-control points are attributed to ‘special causes’. It
is the job of management to find these special causes and eliminate them, in order to
bring the system back ‘in control’. If a system operates ‘out-of-control’, its product will
be defective. Again, using the earlier analogy of producing widgets, the process is out-
of-control if the length of the widgets produced is longer than the Upper Control Limit
(UCL) or ‘out of spec’. In such a manufacturing environment, the ‘special cause’ might
be the breakdown of the cutting machine, or a machine-setting mistake made by a
sleepy operator on midnight shift.
Figure 7. Control chart of daily advisement data, Center for General Studies, August 1998 toDecember 1999
Happily for our team, a few brainstorming sessions enabled us to pinpoint the ‘special
causes’ of the out-of-control points. Note that the out-of-control points bunched up in mid-
August, late November, early December, and mid-January. These dates are the busiest
advising days at the Center, occurring before the start of fall and spring semesters. The
Center advisers knew first-hand how the quality of their advising service deteriorated
when an unusually high number of students showed up for advisement (i.e. more than
27 per day). The average waiting time of each student got longer and advisers were
under pressure to hurry up, thus creating a situation in which errors could occur.
Hourly advisement data were recorded only on Mondays, when students can walk in
without an appointment. The control chart of hourly advisement data (Figure 8) followed
a similar pattern as the daily one. Out-of-control points bunched up a few weeks before the
beginning of each semester. Most out-of-control points on Mondays occur at 11:00 a.m.
and 1:00 p.m. (Figure 9). These are the hours when the advising resources of the CGS
are stressed to the limit, and the quality of the advising service deteriorates.
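Tallying out-of-control points by hour, as in the histogram of Figure 9, amounts to a frequency count. The hours below are hypothetical stand-ins, not the Center's recorded data:

```python
from collections import Counter

# Hypothetical hour-of-day of each Monday out-of-control point.
ooc_hours = [11, 13, 11, 10, 13, 11, 14]

histogram = Counter(ooc_hours)             # hour -> number of occurrences
peak_hour, peak_count = histogram.most_common(1)[0]
```

In the paper's actual data the peaks fell at 11:00 a.m. and 1:00 p.m., pointing to the lunch-hour staffing squeeze.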
5. Customer Satisfaction Survey
Focus group. In a free-market economy like ours, the ultimate judge of the quality of a
product is the external customer, the person who pays the money to buy the product. Thus,
it is crucial for the seller/producer to hear the voice of the customer.

Figure 8. Control chart of hourly advisement data, on Mondays, Center for General Studies, August 1998 to December 1999

Before conducting a
full-blown customer satisfaction survey, which is fairly expensive, it is customary in marketing research to conduct a focus group. With a few people, carefully selected to represent a cross-section of external customers, a free-wheeling atmosphere is created whereby a facilitator can ask probing questions to determine: (1) those quality characteristics of a product which the customer is looking for; as well as (2) his/her preferences (likes or dislikes). Our team applied the same approach.
A cross-section of students visiting the Center was selected to participate. Five students
were selected on the basis of gender, classification, standing with the university, area of
interest, and transfer status. The group was designed to include two males and three
females; a graduate student and a freshman; two students on probation and one qualifying
for suspension; five students with different areas of interest; and a student who had transferred from another university. The selected students were mailed a letter
inviting them to participate in the focus group. The meeting was deliberately held at
lunch hour, with lunch provided as an incentive for participation. The agenda of the
meeting included a welcome and introductions of the students, staff, and TQM facilitators;
a disclaimer that emphasized the confidential nature of the information shared and assurance that students' responses would not in any way affect their grades or status in the Center; the purpose of the meeting; the mission of the CGS; and discussion of the flowchart of the advisement process. The students were asked to comment on the delivery
of advising services in the Center, to share their experiences in the Center, and to state
their likes/dislikes about the Center. The central question posed to the focus group was,
‘What do you consider to be a high quality student advisement service?’
Brainstorming to form affinity sets. Following the meeting of the focus group, the
TQM team conducted a brainstorming session to systematically analyse focus-group responses to the above central question.

Figure 9. Histogram of out-of-control points, hourly advisement data, on Mondays, Center for General Studies, August 1998 to December 1999

The comments of the students were listed and then
grouped into seven affinity sets: (1) Customer Friendly; (2) Adviser and Advising;
(3) Physical Facilities and Accessibility; (4) Availability of Advisers; (5) Perception of
GS Degree; (6) Other Departments; and (7) Students Lacked Information (Table 3).
The last two affinity sets were deleted because they were beyond the control of the
Center. The remaining affinity sets, plus NGT results mentioned earlier, constituted the
bases for question items in the survey questionnaire. Thus, the satisfaction survey questionnaire reflected the concerns of both students and Center staff.
Sample size determination. To ensure statistically valid conclusions, a sample size of
91 students was needed to provide an estimate of the population mean with a 95% confidence level and a sampling error of ±0.3 (Evans & Lindsay, 1999, p. 615). Originally, students were randomly selected for mailing questionnaires; however, there was only a 1.5% response rate. To fill out the balance of the sample size requirement, our team decided to
survey students who came into the Center for advice or inquiry. In all, 106 students
responded. Forty-two percent of the respondents were freshmen, 33% sophomores, 14%
juniors, and 11% seniors.
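The sample size quoted above follows the standard formula n = (zσ/E)² for estimating a mean within error E. The excerpt does not state the planning value of σ behind the figure of 91, so the example call below uses an assumed σ = 1.5 (roughly range/4 for a 7-point Likert scale); with that assumption the formula gives a slightly different n than the paper's.

```python
import math

def sample_size_for_mean(sigma, error, z=1.96):
    """Smallest n such that a z-level confidence interval for the mean,
    with planning standard deviation sigma, has half-width <= error."""
    return math.ceil((z * sigma / error) ** 2)

# Assumed sigma = 1.5, sampling error = +/-0.3, 95% confidence (z = 1.96).
n = sample_size_for_mean(sigma=1.5, error=0.3)
```

The result is sensitive to the σ planning value, which is why pilot data or a focus group often precedes the full survey.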
The questionnaire. Questions were developed using a Likert scale of 1–7 (1 = Strongly disagree; 7 = Strongly agree). To measure the quality of our advisement
services, quality-of-service attributes included in the questionnaire pertained to the
receptionist, advisers, and other aspects of advising services such as waiting time and
the Center’s location.
Two open-ended questions were included at the end of the questionnaire to determine
problems encountered by students, as well as their suggestions on how to improve our
advisement services.
Survey Findings
1. Quality of Advising Service
Receptionist. Survey results indicated that the receptionist greeted students promptly,
and that she was friendly and accommodating. Students’ level of satisfaction was very
high for these two attributes with a mean of 6.2 and 6.3, respectively (Table 4).
Adviser. We used seven quality-of-service attributes pertaining to the adviser
(Table 5). The mean ratings ranged from 5.2 to 5.9. All ratings indicated that students were very satisfied with the service provided by our advisers, with only two exceptions. Questions relating to transferring either to or from another school received the lowest satisfaction levels (means of 5.4 and 5.2, respectively). About 25% of students advised at the Center had transferred from another school or were planning to transfer from Lamar to another school at the time of the survey. Many times, students do not provide the adviser with adequate information during advisement about the university to/from which they are transferring.
Advising service. Both the staff and students felt that the waiting time to see an adviser was excessive. Over 29% of the respondents indicated they had to wait 16 minutes or longer to see an adviser. Consequently, many (50%) felt that there is a need for more advisers in the Center. The staff also indicated in the NGT that registering students into classes
presented a quality-of-service problem. This also surfaced in the survey of the students.
A slim majority of the respondents (56%) desired being registered by the adviser, while
Table 3. The seven affinity sets

Customer Friendly
- Personal touch
- Caring attitude
- Know my name
- Relationship building
- I was treated very well
- The adviser was nice to talk to
- The adviser looks after the best interest of the student

Adviser and Advising
- Very professional; everything (information) is ready
- Adviser knew what I would take
- The adviser helped plan my schedule
- The adviser helped me get into a full class
- The adviser looked at my whole background and made sure prerequisites are met
- I really liked the registration help (balanced against freedom to register themselves)
- It is in my best interest to see the same adviser each time
- Transfer student advising – course selection (include student responsibility with prerequisites based on catalogue where to transfer)

Physical Facilities and Accessibility
- We would like to have a study room as a way to connect with other GS students
- Central location; it did not make sense to be at the Physics building
- Parking convenience

Availability of Advisers
- Flexibility of schedule for walk-ins; some people's schedules are messed up
- More advisers in GS
- More advisers in departments
- Waiting time for advisement
- Lack of time for CGS in-house registration close to beginning of classes

Perception of GS Degree
- Need recognition and respect as a GS major (stigma)
- I feel that at GS, I have a broader liberal arts background
- The catalogue does not generally describe the purpose of GS

Other Departments
- Customer-service skills of other departments (affect student attitude towards LU)
- Lack of advisers in other departments during holidays and between semesters
- Students are not required to attend mandatory orientation; they lacked knowledge of policies & prerequisites
- Need to enforce deadlines on submitting admissions applications and getting transcripts
- Forgotten TASP information
- Delay of removing holds at departmental level

Students Lacked Information
- Mis-routed calls
- Forgotten TASP information
- Students failing to appear for appointments at Career Center
- Mis-directed students (often TASP-related)
44% disagreed with the idea or were indifferent/neutral. Additionally, a vast majority (80%; mean = 5.8; median = 7.0) of the respondents preferred to see the same adviser for each advisement session.
Other findings indicated the Center to be conveniently located (52.4% strongly agreed; mean = 6.2) and easy to find (86% agreed; mean = 6.0). The focus group had earlier revealed to the staff a lack of cohesion among GS majors. The staff therefore felt that cohesion could be promoted by providing a study room in the GS office where students could meet, study and socialize. However, we were not sure whether the students would agree with the idea, so a question pertaining to it was included in the survey. The results indicated that the students were divided on this issue (mean = 3.5), with more students disagreeing (48.5%) than agreeing (20.8%) with the idea of a study room. Nearly 31% were neutral on this matter. This crucial information caused us to abandon the idea of adding to the Center a costly study room that possibly would not be used by the students.
The GS staff were also interested in the students' perception of the GS degree. When the students were asked the question, 'have you ever considered the Bachelor of General
Table 4. Quality-of-service attributes pertaining to the receptionist
(Likert scale: 1 = Strongly disagree, 7 = Strongly agree)

Quality Attribute | Mean | Std. Dev. | Median
The receptionist at the CGS greeted me promptly | 6.2 | 1.26 | 7.00
The receptionist was friendly and accommodating | 6.3 | 1.09 | 7.00
Table 5. Quality-of-service attributes pertaining to advisers
(Likert scale: 1 = Strongly disagree, 7 = Strongly agree)

Quality Attribute | Mean | Std. Dev. | Median
1. The adviser was knowledgeable about the classes I should take and the prerequisites for those classes | 5.9 | 1.31 | 6.00
2. The adviser took a personal interest in me and my academic goals | 5.7 | 1.61 | 6.00
3. The adviser looked at my entire background (test scores, other classes attempted or completed) when helping me choose courses | 5.6 | 1.58 | 6.00
4. My adviser is understanding and a caring individual | 5.8 | 1.42 | 6.00
5. My adviser provided enough time to discuss my needs | 5.8 | 1.43 | 6.00
6. As a transfer student from another school, the adviser explained to me which classes were accepted at Lamar and how they fit into my degree requirements | 5.4 | 1.75 | 6.00
7. I informed my adviser that I would be transferring to another school and the adviser helped me select courses that would transfer | 5.2 | 1.79 | 5.00
Studies as your degree?', 77% of them answered 'No'. Of those students who answered 'No', the following main reasons were given (in descending order of frequency): (a) plan on going into another specific field/major (24); (b) not sure or undecided yet (10); and (c) not aware/informed of the Bachelor of General Studies (BGS) degree (5). This result was presented on a Pareto diagram, which separates the vital few (reasons) from the trivial many; that is, a few factors account for most of our problems.
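The arithmetic behind a Pareto diagram is simply a ranked frequency table with cumulative percentages. A minimal sketch (Python, used here purely for illustration; the counts are the ones reported above) might look like:

```python
# Pareto tabulation: rank reasons by frequency and accumulate percentages,
# so the "vital few" reasons that dominate the total stand out.
# Counts are the 39 explained 'No' answers reported in the survey.
reasons = {
    "Plan on going into another specific field/major": 24,
    "Not sure or undecided yet": 10,
    "Not aware/informed of BGS degree": 5,
}

total = sum(reasons.values())
cumulative = 0.0
for reason, count in sorted(reasons.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += 100.0 * count / total
    print(f"{reason:<48} {count:>3}  cumulative {cumulative:5.1f}%")
```

Ranked this way, the first reason alone accounts for over 60% of the responses, which is what the diagram's 'vital few versus trivial many' reading expresses.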
The first reason (a) noted above did not bother the GS staff since students had already
made a decision concerning their major. However, the second (b) and third (c) reasons
were of concern and are addressed in our recommendations presented in the Do phase
of the Deming Cycle.
The staff were also curious regarding students' knowledge of, or perception about, the BGS degree. Therefore, two questions were included in the survey to address these concerns. Most of the students (61%) did not respond to the first statement, 'the BGS degree is a highly regarded degree', and 22% did not respond to the second statement, 'the BGS degree provides a broader liberal arts background than other degrees.' Of those who responded to the first statement, 44% disagreed, while 29% agreed and 27% were neutral (mean = 3.6; median = 4.0).
2. Overall Degree of Satisfaction
Lastly, students were asked to rate their overall degree of satisfaction with the advisement service they received in the Center. On average, CGS students were satisfied with the service they received (mean = 5.9; median = 6.0). Thirty-seven percent were extremely satisfied, and only 7% were not satisfied with the service.
3. Open-Ended Questions
Problems encountered. The CGS staff were very interested in knowing whether students had experienced any problems with their advisement service. The open-ended question asked the students to list these problems. A verbatim list of student responses was compiled, and similar problems were tallied; the frequency with which a similar problem was listed by the various respondents was used as an indicator of its importance. A Pareto diagram was used to display respondents' comments. The most important problem experienced by the CGS customers was 'long waiting time to be advised'. The other problems, listed with less frequency, were: (1) misinformed about scholarship availability and developmental math; (2) advised wrong class; (3) non-caring or rude treatment; and (4) not informed about the BGS degree. The long waiting period to be advised was addressed in our recommendations. The remaining problems were not addressed because of their low frequency of occurrence.
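The tally step described above, counting how often respondents raise a similar problem and ranking by frequency, is easy to mechanize once similar responses have been grouped. A small sketch follows (Python; the response strings are hypothetical stand-ins for the verbatim list, and the grouping of 'similar' free-text answers into categories remains a manual judgement):

```python
from collections import Counter

# Hypothetical responses, already mapped by the team to short problem
# categories (grouping similar free-text answers is a human step).
categorized_responses = [
    "long waiting time to be advised",
    "long waiting time to be advised",
    "advised wrong class",
    "long waiting time to be advised",
    "non-caring or rude treatment",
]

# The frequency of each problem serves as its importance indicator;
# most_common() returns the ranking used to order a Pareto diagram's bars.
for problem, freq in Counter(categorized_responses).most_common():
    print(f"{problem}: {freq}")
```

With these stand-in data, 'long waiting time to be advised' tops the ranking, matching the survey's most frequent complaint.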
Suggestions for improvement. Similarly, students were asked the open-ended question, 'list the things the CGS can do to improve its advisement services.' The results were also presented on a Pareto diagram. The students recommended that the CGS should: (1) add/hire more advisers/staff; (2) move to a more centralized/convenient location; (3) assign each student the same adviser each time; and (4) provide more information about the GS degree program. All of these concerns were addressed in our recommendations except number 2, which was beyond our control.
The DO Phase of the Deming Cycle
After completing the Plan phase of this project, the team formulated recommendations
aimed at improving the quality of student advising in the CGS (the Do phase) and pre-
sented them to the Executive Vice President for Academic Affairs. All of the recommen-
dations were approved and implemented. (1) A full-time adviser was hired, which greatly
reduced the adviser/student ratio. With another adviser on staff, each student can now see
the same adviser when he/she visits the Center. (2) Additionally, the new adviser’s main
responsibility is to monitor and assist students who are experiencing academic difficulties.
This allows for more effective tracking of those who are on monitored probation. (3) The
way in which the Center scheduled appointments was changed to alleviate out-of-control
periods, which often occurred on Mondays, traditionally walk-in days. Appointments are
now made for all days of the week, including Mondays. This has resulted in better control of student flow through the Center and of the time allocated to each student with the adviser.
(4) Advisers no longer schedule and register students during the advising session, unless
the student expresses a desire for the adviser to do so. Advisers can now advise within the
time constraints, and students are seen at or near their scheduled appointment times.
(5) Since students indicated that there was a need for more information about the GS
degree, a newsletter is planned for distribution each long semester informing our
majors of job opportunities with a GS degree and the programs that are offered through
the Center.
The STUDY Phase
The Study phase of the Deming Cycle – studying the results of the changes implemented
in the Do phase – began in Spring 2002. We will determine what works and what does
not work. Data were continually collected in the form of daily logs, and another student
satisfaction survey was conducted.
The ACT Phase
We entered the Act phase of the Deming Cycle in the Spring 2002 semester, when we analysed the data obtained during the Study phase. We will keep what works and discontinue what does not.
Conclusions
Our experience demonstrated the proper application of the Deming cycle in improving the
quality of advisement service in higher education. In doing this, we learned some key
lessons.
In a university, as in the business world, there is no substitute for management by data. How else can a manager understand the system that he/she manages without obtaining real-world observations/data/facts such as those presented in the SIPOC, flowchart, control chart, and customer satisfaction survey? Data are a solid basis for management decisions. A manager asking for an additional resource is usually more effective if the request is accompanied by data to back it up. For instance, in requesting an additional adviser for the CGS, Center Director Dr Madelyn Hunt got what she requested because the Vice President for Academic Affairs saw the need for it in our report. The Voice of the
Process (control chart) was screaming, ‘We need more advisers!’ The students in the
customer satisfaction survey (Voice of the Customer) were saying, ‘The Center needs more
advisers!’ How can the Vice President deny this?
Avoid costly mistakes by keeping a Customer Focus. After the Focus Group exercise described earlier, the CGS Director thought that it was a good idea to set aside a study room at the Center where General Studies majors could meet. One question in the customer satisfaction survey asked the opinion of students about this idea. To the Director's surprise, the BGS majors were not supportive of it, and so the Director abandoned the idea. Without hearing the voice of the customer, such a study room would have ended up as a white elephant.
Lastly, for an institution like a university to undertake a full-blown TQM project, it will require the participation of everybody, from the President on down, and it will involve all academic and non-academic departments or units. Considerable monetary and non-monetary resources will be needed, and it will definitely take a long time to complete and succeed. Compared to what some leading corporations have accomplished, it would be a tremendous accomplishment if the programme were up and running in three years. So, short of a full-blown TQM approach, what can a university do to start improving the quality of its service? Indirectly, this article is a suggestion for an incremental approach, much like the Japanese concept of kaizen. Viewing the university as a forest, we can tackle it one tree at a time. That is, undertake a CQI project unit by unit. Eventually, we will cover the whole forest. This, of course, implies that one success breeds another, and that the university does not run out of top managers who would serve as champions of a worthy cause.
Acknowledgements
CGS team members included Frances Morris, Coordinator for Academic Advising; Julie
Alford, Coordinator for Retention/Academic Counsellor; and Tina Johnson,
Administrative Assistant, all of whom provided valuable input in defining the problems
in the CGS and offering ideas for solutions to these problems.
Grateful acknowledgement is extended also to Dr Bryan Cole of Texas A&M Univer-
sity, who conducted the Training Workshops for Facilitators of Continuous Quality
Improvement Teams in Educational Institutions, which was attended by Dr Carl Montano.
Dr Cole’s handouts on the quality improvement process are very much appreciated and
liberally applied.
References

Clayton, M. (1995) Encouraging the kaizen approach to quality in a university, Total Quality Management, 6(5), pp. 593–602.

Deming, W. E. (1993) The New Economics for Industry, Government, Education (Cambridge, MA: Massachusetts Institute of Technology, Center for Advanced Engineering Study).

Evans, J. R. & Lindsay, W. M. (1999) The Management and Control of Quality, 4th edn (Cincinnati, OH: South-Western College Publishing).

Greenwald, J. (2001) Tired of each other, Time, pp. 50–56.

Montano, C. B. & Utter, G. H. (1999) Total quality management in higher education, Quality Progress, 32(8), pp. 52–59.
Mosley, D. C. (1974) Nominal grouping as an organizational development intervention technique, Training and Development Journal, 28(3), pp. 30–37.

Spanbauer, S. J. (1995) Reactivating higher education with total quality management: using quality and productivity concepts, techniques and tools to improve higher education, Total Quality Management, 6(5), pp. 519–538.

Yorke, M. (2000) The quality of the student experience: what can institutions learn from data relating to non-completion?, Quality in Higher Education, 6(1), pp. 61–75.