
Analytics July August 2010


Despite all of the philosophers past and present urging us to “live for the moment,” most folks just seem to be more interested in the future. The past? Forget about it. The present? Who’s got time? The future? Bingo! Now you’re talking. So tell me, what’s going to happen tomorrow so I can make all the right moves today?

Humanity’s intense interest regarding the future has kept countless soothsayers, fortune-tellers and astrologists busy for centuries, but basing predictions on the alignment of the stars is akin to tossing a coin: you’re wrong as often as you’re right.

Perhaps no one is more interested in the future than corporate officers whose company’s success – and perhaps their jobs – depend on their ability to make the “right” business decisions “going forward.” Until fairly recently, C-level executives made these critical decisions based on personal experiences, intuition and what their “gut” told them. Unfortunately, their “gut” was about as reliable as tossing a coin.

Thanks to the explosion of data and the analytics to derive insight from it, virtually every enterprise now has the ability to see more clearly into the future than ever before. As Andy Boyd points out in introducing a trio of cover stories on the subject, forecasting is “one of the most powerful and widely used tools by analytics professionals, and rightly so” because “good forecasts lead to better decisions.”

Of course, predicting the future remains tricky business, even in the age of analytics and data-driven decision-making tools like forecasting. Boyd, for example, points out the myriad barriers forecasters typically encounter in a corporate setting and how to overcome them in order to achieve forecasting’s full potential (page 8).

Next, co-authors Michael Gilliland and Udo Sglavo list the “worst practices in business forecasting” (page 12). It’s not a pretty picture, but it provides valuable insights for any forecaster whose forecasts, as the co-authors put it, “never seem to be as accurate as we would like them to be – or need them to be.” Jack Yurkiewicz wraps up our forecasting features by addressing perhaps the toughest question a forecaster can face: With so many forecasting products to choose from, which one is right for me (page 18)? ❙



Analytics (ISSN 1938-1697) is published six times a year by the Institute for Operations Research and the Management Sciences (INFORMS). For a free subscription, register at http://analytics.informs.org. Address other correspondence to the editor, Peter Horner, [email protected]. The opinions expressed in Analytics are those of the authors, and do not necessarily reflect the opinions of INFORMS, its officers, Lionheart Publishing Inc. or the editorial staff of Analytics. Analytics copyright ©2010 by the Institute for Operations Research and the Management Sciences. All rights reserved.


REGISTER FOR A FREE SUBSCRIPTION: http://analytics.informs.org

INFORMS BOARD OF DIRECTORS

President: Susan L. Albin, Rutgers University
President-Elect: Rina Schneur, Verizon Network & Tech.
Past President: Don M. Kleinmuntz, Strata Dec. Tech.
Secretary: Anton J. Kleywegt, Georgia Tech
Treasurer: Stephen M. Robinson, Univ. of Wisconsin
Vice President-Meetings: Robin Lougee, IBM Research
Vice President-Publications: Terry P. Harrison, Penn State University
Vice President-Sections and Societies: Ariela Sofer, George Mason University
Vice President-Information Technology: Warren Lieberman, Veritech Solutions
Vice President-Practice Activities: Jack Levis, UPS
Vice President-International Activities: Jionghua “Judy” Jin, Univ. of Michigan
Vice President-Membership and Professional Recognition: Pinar Keskinocak, Georgia Tech
Vice President-Education: Jayashankar M. Swaminathan, UNC
Vice President-Marketing and Outreach: Anne G. Robinson, Cisco Systems Inc.
Vice President-Chapters/Fora: Stefan Karisch, Jeppesen

INFORMS OFFICES

www.informs.org • Tel: 1-800-4INFORMS
Executive Director: Mark G. Doherty
Marketing Director: Gary Bennett
Communications Director: Barry List
Corporate, Member, Publications and Subdivision Services: INFORMS (Maryland), 7240 Parkway Drive, Suite 300, Hanover, MD 21076 USA, Tel.: 443.757.3500, E-mail: [email protected]

Meetings Services: INFORMS (Rhode Island), 12 Breakneck Hill Road, Suite 102, Lincoln, RI 02865 USA, Tel.: 401.722.2595, E-mail: [email protected]

ANALYTICS EDITORIAL AND ADVERTISING

Lionheart Publishing Inc., 506 Roswell Street, Suite 220, Marietta, GA 30060 USA
Tel.: 770.431.0867 • Fax: 770.432.6969

President & Advertising Sales: John Llewellyn, [email protected], Tel.: 770.431.0867, ext. 209
Editor: Peter R. Horner, [email protected], Tel.: 770.587.3172
Senior Art Director: Alan Brubaker, [email protected], Tel.: 770.431.0867, ext. 218
Art Director: Kat Wong, [email protected], Tel.: 770.431.0867, ext. 223
Advertising Sales: Sharon Baker, [email protected], Tel.: 813.852.9942


Stepping off a high dive for the first time can be a frightening experience. It’s something many of us never forget. I well remember my first plunge. It must have taken a dozen trips up the ladder – and back down – before I gathered the courage to jump. Fear kept me from my goal, even though I knew there wasn’t any real danger. I’d seen many people do it before. I certainly knew the water would break my fall. Yet my apprehension kept me from taking the final leap.

The use of analytics can require a similar leap. The tools of analytics are often mathematically sophisticated. How can decision-makers be expected to rely on numbers if they don’t fully understand where those numbers come from? Consider the following example:

The CFO of a division of a large corporation is evaluating five different investments. If undertaken, each investment requires a capital outlay now and a capital outlay in six months, with revenue realized 12 months in the future. For example, Project 1 requires a capital investment of $11 million now and $3 million six months from now, and will generate $16 million in 12 months.

The investments along with their capital outlays are shown in Table 1. The amounts of capital the CFO has available to spend now ($26 million) and in six months ($12 million) are also shown in this table. Capital that is not used now may not be carried forward and invested in six months.

A project may be undertaken in a fractional amount at a fractional cost, but it also generates a fractional return. For example, the CFO may choose to take half a position in Project 1 at a cost of $5.5 million today and $1.5 million in six months, but will only realize a return of $8 million. Of the five potential projects, what positions should the CFO take in order to maximize revenue one year hence?

The problem is straightforward as stated, and open to whatever logic the CFO might want to use to solve it. He might, for example, recognize that Project 5 has the highest return and invest in that project alone, generating $40 million (it would exhaust all available investment capital in the sixth month, so no other projects could be undertaken). When he arrives at a solution, he can trace the logic and defend his position.

Analytics professionals approach the problem differently, expressing the problem in a slightly more mathematical form, as shown in Table 2.

Any solution to the mathematical constraints in Table 2 corresponds in a natural way to a solution of the original problem. A key advantage to expressing the problem mathematically is that it’s very precise. The mathematical formulation can be provided to an optimization algorithm with a request to return an optimal solution – values x1, x2, x3, x4 and x5 that, among all feasible solutions, do so with the highest revenue.

Table 1: The CFO problem (amounts in millions of dollars).

Table 2: The CFO problem expressed as a mathematical model.
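
Table 2 itself is not reproduced in this text-only version. As a stand-in, here is a minimal sketch of the kind of linear program the table describes, using only the figures given in the article (Project 1's outlays of $11 million and $3 million, its $16 million return, and the $26 million and $12 million capital budgets); the remaining coefficients a_i, b_i and r_i would come from Table 1.

\begin{align*}
\text{maximize}\quad & \textstyle\sum_{i=1}^{5} r_i x_i && \text{revenue in 12 months } (r_1 = 16)\\
\text{subject to}\quad & \textstyle\sum_{i=1}^{5} a_i x_i \le 26 && \text{capital available now } (a_1 = 11)\\
& \textstyle\sum_{i=1}^{5} b_i x_i \le 12 && \text{capital available in six months } (b_1 = 3)\\
& 0 \le x_i \le 1, \quad i = 1, \dots, 5 && \text{fractional positions allowed}
\end{align*}

Here x_i is the fraction of Project i undertaken, so the half position in Project 1 described above corresponds to x_1 = 1/2.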


Analytics professionals develop algorithms for finding such solutions, and understanding these algorithms requires advanced mathematical training. But analytics professionals also devote considerable time to modeling: taking business problems like those faced by the CFO and converting them to mathematics.

Analytics professionals don’t concern themselves with optimal solutions when modeling. Instead, they focus on constraints. How much capital is available? What positions can be taken? If the constraints are correctly expressed, they accept the answer supplied by the optimization algorithm. They take a leap of rationality.

And a leap of rationality is often required. It’s not difficult to generate some good solutions to the CFO Problem, and the reader is encouraged to try. The optimal solution is given at the end of this article. However, it’s not intuitive, and even those trained in optimization rarely find it by trial and error. Analytics professionals have no trouble accepting the solution because they understand the underlying mathematics. Without such an understanding, however, acceptance is understandably more difficult.

How can non-mathematicians gain the confidence to use solutions from mathematical optimization? First and foremost, it requires help from an analytics professional who understands more than mathematics. Not surprisingly, people who enjoy analytics enjoy mathematics. But seasoned analytics professionals are acutely aware that success is the most important aspect of an analytics project, and that success requires far more than good mathematics. Such professionals understand how to communicate with examples and straightforward language, not Greek characters.

And that level of communication is vital. Decision-makers get where they are because they make decisions. That means finding solutions to problems, not modeling them and handing them to a computer. Decision-makers who have no familiarity with optimization will reason out a solution. But doing so can be extremely costly when better solutions exist. Decision-makers don’t need to learn the mathematics of optimization, but they must understand the concepts well enough to overcome their fear. Only then are they freed to take the final leap of rationality and use analytics with confidence. ❙

Andrew Boyd served as executive and chief scientist at an analytics firm for many years. He can be reached at [email protected]. The optimal solution to the CFO Problem is x1 = 1, x2 = 3/4, x3 = 0, x4 = 1 and x5 = 1/4, with an associated revenue of $52 million.


SCENE I: I recently received a voice mail from my old friend “Elk.” The tone of his message was urgent: “I’ve got a court date on Friday to fight this speeding ticket, and I need your help with some serious math!”

As I listened to his message, I figured the dude was almost surely guilty as charged. Elk drives fast cars and he drives them fast. A well-preserved memory from college: Elk racing his sports car down a two-lane country road at 90 mph with me in the passenger seat, the windows open, rock-n-roll blaring from the stereo. I immediately envisioned that I would be stuck solving some kind of differential equation problem, a much less well-preserved collegiate memory. Finally, I was afraid that I would get sucked into spending precious time that I didn’t have to help him out, and inevitably end up even further behind on my mounting to-do list.

I called him back anyway (I am, alas, that kind of friend). Turns out he was driving a minivan and he swore to me that he was not speeding and that the police officer had grossly exaggerated his speed “to meet his monthly quota.” In just a few minutes, looking at the map together online, we managed to put together a simple spreadsheet that, even with conservative assumptions, demonstrates that the officer’s version of the facts literally did not add up.

The next day, he brings printouts from three different scenarios into court with him, and calmly explains our logic to the judge. Elk is rewarded with a full acquittal.
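
The column gives none of the actual numbers, so the figures below are purely hypothetical, but a minimal sketch of the arithmetic such a spreadsheet might contain is: take the distance and elapsed time from the officer's own account and check whether they are consistent with the speed on the citation.

# Hypothetical sanity check in the spirit of the spreadsheet described above.
# All figures are invented placeholders, not the actual case facts.
distance_miles = 0.9      # stretch over which the officer says he paced the van
pacing_seconds = 75.0     # duration of the pacing, per the officer's account
alleged_mph = 58.0        # speed written on the citation

implied_mph = distance_miles / (pacing_seconds / 3600.0)
print(f"Speed implied by the officer's own distance and time: {implied_mph:.1f} mph")
print(f"Speed alleged on the citation: {alleged_mph:.1f} mph")
# 0.9 miles in 75 seconds works out to about 43 mph, well short of the alleged 58 mph.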

SCENE II: Last winter I received an e-mail from a nursing professor across campus who is a consultant to a local research hospital. The focus of her efforts is improving the hospital’s pediatrics operations, and she has been trying to understand how to improve the flow of patients. Groping around for ideas, she had come to believe that queueing and simulation models might be the key to improving patient flow.

I agreed to meet with her to learn more (I am, alas, that kind of colleague). She explains that the goal of her work is to reduce the number of patients who are turned away due to lack of available beds and also to minimize the number of sick patients stranded in waiting rooms for long periods of time while waiting for a bed. Given the wide range of patients, illnesses, treatments, equipment and outcomes, the pediatrics facility that she describes is a very complex system. Though I was told that there is a huge amount of historical data, when I ask about very specific information, the most common answer is, “I think we can probably get that.” After spending time observing the folks who actually decide which patient gets a bed in which room in which unit at which time, I realized that we are a long, long way from being able to create a simulation model that will have any business value at all.

Along the way, however, what I do come to understand is that these folks could benefit greatly from a forecasting model that could help them anticipate when different types of patients are likely to arrive. Along with another quant colleague, we’re working on a prototype of a data-driven forecasting system for them now.

SCENE III: A few weeks ago, I received an e-mail from the director of customer care analytics at a local company with a big national profile and a reputation for excellent customer service (referred to herein as “Slick”). She had contacted me because of my background in call center operations, both as a consultant and as a researcher, though the conceptual problem that she outlined in her e-mail appeared to be quite straightforward to me.

I responded immediately to her e-mail, thinking that I could probably learn something from talking to folks who are considered to be on the cutting edge (I am, alas, that kind of professor). And I definitely did. Although Slick had acquired a leading commercial system for call forecasting and agent scheduling some years ago, they had only just recently begun using it on a regular basis. The customer care analytics team at Slick had also recognized that, because of some very innovative data-driven management practices their customer service group has adopted, they also have a unique need for another layer of prescriptive analytics that no commercial vendor was likely to address.


I am going to visit them next week with some ideas about how they might proceed; it turns out to be quite a challenging problem.

CONCLUSION: My business is studying and applying data-driven models and teaching my students to do the same. Somehow each of these three encounters with other people’s problems has helped me understand my own world a little bit more clearly. Sometimes, as in the case of Elk, a simple model gives you all the insight that you need for the situation you find yourself in. Other times, as in the case of the pediatrics group at the hospital, the complexity of your environment means that you have to start with something really focused and simple to get immediate value and generate ideas about what might come next. And finally, as in the Slick situation, one set of analytics-driven innovations may end up requiring you to come up with another more challenging one as well.

Anyway, these are a couple of my recent adventures. I am looking forward to hearing some of these types of stories from you as well. Drop me an e-mail or give me a call. You can be confident that I’ll respond. (I am, alas, that kind of guy.) ❙

Vijay Mehrotra ([email protected]) is an associate professor, Department of Finance and Quantitative Analytics, School of Business and Professional Studies, University of San Francisco. He is also an experienced analytics consultant and entrepreneur and an angel investor in several successful analytics companies.


FORECASTING IS one of the most powerful and widely used tools by analytics professionals, and rightly so. It rests on a firm mathematical foundation. It’s taught in universities the world over. And it’s extremely useful. Whether forecasting the number of people expected to buy newspapers or the demand for computers, good forecasts lead to better decisions.

At the same time, forecasting is examined and impugned more than any other area of analytics – so much so that forecasts are routinely ignored or thrown out. In this article, we discuss some of the barriers to forecast acceptance within an organization and ways to overcome them.

FORECASTS RECEIVE scrutiny because they’re easy to understand. For purposes of discussion, consider the number of cars rented on a Monday morning by a car company at a major airport. The company has reservations information, but some people who have booked won’t show up. Others will walk up to the counter looking to rent at the last moment. Forecasts of the actual number of people who rent are invaluable for both profitability and customer satisfaction.

A basic forecast of next week’s demand can be generated by looking at Mondays in the past. Figure 1 shows both observed demand and forecasts made by averaging the observed demand from all prior weeks.

Are these forecasts good or bad? Some common responses might include:
• They look too flat. Something is being missed.
• They aren’t close enough to what actually occurred.
• They look pretty good.
• The forecasting methodology is too simple. Certainly something more sophisticated can be employed to generate better forecasts.

Figure 1: Observations and forecasts for the number of car rentals at an airport location on Mondays. Forecasts are made one week in advance and represent the average of observed demand in all prior weeks.


All are valid responses. Individuals have their own ideas about “good” that affect their evaluation of forecasts. Differing perspectives are natural and can be a healthy part of the forecasting process, but they can’t be ignored or treated as a nuisance rather than a fundamental issue. Individuals who create forecasts must be willing to devote as much effort to two-way communication about their forecasts as they do to creating them. A consensus must be reached that forecasts are good – or at least good enough – if they are to be used.

WHILE “GOOD” IS largely subjective, some forecasting methods are clearly better than others. An average of past observations is better than a forecast based on phases of the moon, but is an average better than exponential smoothing? When predicting the sale of umbrellas on a street corner, a forecast that incorporates the likelihood of rain is better than one that doesn’t. But are there other predictors that can further improve forecasts? Good forecasts must meet certain basic criteria, most notably, the choice of reasonable predictors and a reasonable forecasting method. But these basic criteria aren’t by themselves enough to ensure forecasts will be perceived as good.

Measuring forecast error is also vital to establishing whether a forecasting method is good – at least in comparison to alternatives. Error measurement is especially useful for eliminating poor forecasts that may on the surface appear good. Consider, for example, the forecasts shown in Figure 2. They clearly overcome the objection of looking too flat, and they visually appear to track the actual observations quite well, but the mean absolute deviation is 40 percent higher than that of the forecasts in Figure 1. The method used in Figure 2 uses this week’s observation as next week’s forecast.
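
Both methods are simple enough to sketch in a few lines of Python. The weekly rental counts below are invented (the article's data is not reproduced here), but the mechanics match the description: Figure 1's forecast is the average of all prior weeks, Figure 2's is simply last week's observation, and mean absolute deviation (MAD) measures the error of each.

# Illustration with made-up Monday rental counts; not the article's data.
demand = [112, 98, 105, 120, 101, 99, 118, 107, 95, 110, 103, 116]

def cumulative_average_forecasts(history):
    """Forecast each week as the average of all prior weeks (the Figure 1 method)."""
    return [sum(history[:t]) / t for t in range(1, len(history))]

def last_value_forecasts(history):
    """Forecast each week as the previous week's observation (the Figure 2 method)."""
    return history[:-1]

def mad(actuals, forecasts):
    """Mean absolute deviation between actuals and forecasts."""
    return sum(abs(a - f) for a, f in zip(actuals, forecasts)) / len(forecasts)

actuals = demand[1:]  # forecasts start in week 2
print("MAD, cumulative average:", round(mad(actuals, cumulative_average_forecasts(demand)), 1))
print("MAD, last value:", round(mad(actuals, last_value_forecasts(demand)), 1))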

Useful as error measurement is when comparing forecasting methods, it can’t resolve the fundamental question of whether forecasts are perceived as good. The forecasts in Figure 1 show less error than those in Figure 2, but are they good?

DISAGREEMENTS ABOUT a best forecasting method can create barriers to success, though many of these barriers can be overcome by recognizing there’s no absolute right in forecasting. If multiple forecasting methods are considered good, and the question is which is best, then a forecasting initiative is well on its way to success.

A more significant obstacle is when the best forecasting efforts fail to generate any forecasts that are perceived as good. There are times when forecasts are truly inadequate. However, frequently the forecasts are quite adequate for the task at hand, but decision-makers won’t accept them because of preconceptions about what good forecasts should look like. Through extensive data collection and experimentation, a forecaster may determine that a simple average provides the lowest overall forecasting error. On the other hand, if a decision-maker thinks the forecasts are too flat to be indicative of actual demand, he may refuse to use them.

Further complicating the situation is that no matter how much time a forecaster may spend experimenting, there are always more predictors and more forecasting methods to be evaluated. This is often coupled with an underlying belief that if the right predictors can just be found and woven together with the appropriate forecasting methodology, better (good) forecasts will emerge. Thus, extensive time, effort and money may be devoted to a task with little hope of achieving significant improvement. The number of people who show up to rent cars at an airport is, in the end, a random process. If variability is high enough, even the best efforts can reduce forecast error only so much.

Figure 2: Observations and forecasts for the number of car rentals at an airport location on Mondays. Forecasts are made one week in advance, using the present week’s demand as the following week’s forecast.

Another common barrier to deeming forecasts good is that as forecasting grows more sophisticated, it also becomes more difficult to determine exactly why a particular forecast is what it is. Consider, for example, the forecasts shown in Figure 3. Looking at this figure, most individuals would agree that the forecast in week 15 is bad – that something must be wrong with the forecasting method. In actuality, the forecast includes the fact that a large convention is scheduled to be in town.

A convention is straightforward to understand, and most decision-makers would react by saying that the forecast wasn’t as bad as they initially thought. But what if the forecast was the result of the interaction of a number of predictors through a complex mathematical formula? One of the simplest ways to evaluate forecasts is to view them in a format similar to Figure 3. Because it’s so simple, it’s frequently used by decision-makers to evaluate forecasts even though it can be quite misleading. When the formulas grow complicated enough, there’s rarely a picture that can adequately explain exactly what’s going on.

The anecdotal “failure” of a handful of poor forecasts also has the potential to call an entire forecasting methodology into question. If millions of forecasts are being generated, as is often the case in operational forecasting systems, it’s likely that some forecasts will look bad due to uncommon data conditions – quite possibly errors in the input data. A forecast of one rental for a Monday at La Guardia would rightly raise eyebrows. While such forecasts require investigation, they shouldn’t necessarily cast doubt on the remaining forecasts, though they often do.

WITH SO MANY potential barriers, getting an organization to accept forecasts as good is challenging. However, there are a number of guidelines all forecasting initiatives can benefit from.

• Always be aware of the goal. Forecasting is undertaken to improve some aspect of running a business. If no forecasting methodology is in place (someone is pulling numbers out of the air), then virtually any systematic effort based on numbers will represent an improvement. If a forecasting methodology is already being used, then a baseline for a forecasting initiative is to improve upon what exists. In either case, these are very different goals than generating forecasts that achieve some platonic ideal of good. Is a flat forecast not good enough when the alternative is no forecast at all? Forecasting initiatives far too often lose sight of the goal of improvement.

• Don’t get lost in the search for improved forecasts that don’t exist. Improving forecasts is a laudable goal and worth pursuing whenever there’s evidence that additional effort will indeed yield better forecasts, but it’s counterproductive to seek improvement when it doesn’t appear improvement is possible. Switching to an entirely different paradigm (from regression to neural nets, for example) rarely if ever works. More often than not, the underlying random process simply won’t allow for the desired improvement. Accept the state of affairs, deal with the business problem and move on.

Figure 3: Forecasts for the number of car rentals at an airport location on Mondays.

• Recognize that forecasting isn’t always the answer. Forecasting is sometimes used as a band-aid for trying to solve more fundamental business problems. Consider the monthly demand for container space experienced by a container shipping company at one particular port, as shown in Figure 4. If the company could forecast demand with sufficient accuracy, it could position and schedule its ships profitably. But the demand is lumpy and erratic, and forecasting can’t solve the problem. Trying to forecast a way out of the predicament would lead to failure.

Alternatively, the company could examine its business processes. For example, if the company booked space by simply waiting for its customers to call a few days in advance, it could consider working with its largest customers to improve the flow of communication so that large shipments weren’t a surprise. Or the company could consider selling blocks of space well in advance for a reduced rate. Similar approaches address the fundamental business problem, which is not a forecasting problem and can’t be solved with forecasting.

In addition to general guidelines, there are techniques experienced forecasters can use to improve the chances of a successful forecasting initiative.

• Educate. Preconceptions change with experience. Individuals without forecasting experience, when introduced to an array of examples tailored for educational purposes, will come to better understand what can and can’t be achieved with forecasting.

• Don’t set unrealistic expectations. It’s good to set out with fervor in an effort to generate the best possible forecasts. Exuberance is a good thing. But it can raise expectations among others to a point that those expectations can’t be met. It’s best to convey a sense of benign diligence when a forecasting initiative begins and to share exciting results if and when they become available. Frequently, in an effort to gain support for an initiative, sponsors may set very high, even unrealistic, expectations. In such cases, it’s necessary to go back and carefully reset those expectations once actual work begins.

• Consider simpler forecasting methods. Sophisticated forecasting methods offer potential forecast improvements but are often difficult to understand without an advanced degree in statistics. Yet, many good methods exist that are comprehensible to a much broader audience and often yield forecasts of comparable quality to their sophisticated cousins. Forecasts are more likely to gain acceptance when there’s some understanding of where they’ve come from. Also, sophisticated methods are often employed with the expectation of big improvements over simpler methods. If actual improvements don’t meet expectations, a new barrier to success is constructed.

Forecasting is a powerful tool for improving business operations. To take advantage of this tool, forecasts must be embraced by those who use them. By recognizing that individuals have their own ideas about what makes forecasts good, and elevating the process of organizational acceptance to its rightful place, forecasting can fully realize its potential. ❙

Andrew Boyd served as executive and chief scientist at an analytics firm for many years. He can be reached at [email protected].

Figure 4: Observed monthly demand by a container shipping company over a two-year period.


OUR FORECASTS NEVER seem to be as accurate as we would like them to be – or need them to be. But why are forecasts wrong, and sometimes so terribly wrong? Forecasts fail to deliver the level of accuracy desired for at least four reasons:
• Unsound software that lacks necessary capabilities, has mathematical errors in it or facilitates inappropriate methods.
• Untrained, unskilled, inexperienced or unmotivated forecasters exhibiting behaviors that fail to improve the forecast or even make it worse.
• Political contamination of the forecasting process, driven by the whims and personal agendas of process participants who use the forecast to represent what they want to occur, rather than as an “unbiased best guess” of what is really going to occur.
• Unforecastable behavior – the nature of what is being forecast (e.g., customer demand) is such that it cannot be forecast to the degree of accuracy desired.

There may be a strong temptation for management to just throw money at the forecasting problem in hopes of making it go away. But how many organizations are you aware of – perhaps even your own – that have thrown thousands or even millions of dollars at the forecasting problem, only to end up with the same old lousy forecasts?

No software, no matter how powerful, and no analyst, no matter how talented, can guarantee perfect (or even highly accurate) forecasts. The objective should be to deliver forecasts as accurate as can reasonably be expected given the nature of what is being forecast. For example, if we are asked to forecast heads or tails in the toss of a fair coin, we will be correct about 50 percent of the time over a large number of trials. It doesn’t matter that our boss may want us to achieve 60 percent accuracy or that our efforts are funded with millions of dollars in new computers and software. Forecast accuracy is ultimately limited by the nature of the behavior being forecast. Investment in forecasting process, systems and people will pay off up to this limit of accuracy but can take you no further. Unfortunately, most organizations fail to achieve this limit because of “worst practices” that confound their forecasting efforts. This article identifies several common worst practices in business forecasting.

1. Overly complex and politicized forecasting process

The forecasting process can be degraded in various places by the biases and personal agendas of participants. The more elaborate the process, with more human touch points, the more opportunity exists for these biases to contaminate what should be an objective and scientific process. Can you believe anyone when it comes to forecasting? Perhaps not.

Those who have some input or authority in the forecasting process can use this influence to their own benefit. If you are a sales rep and it is quota-setting time, aren’t you going to try to lower expectations and get easier-to-beat quotas? If you are a product manager with a new product idea, aren’t you going to forecast at least high enough to meet the minimum hurdles for new product development approval? (No one is going to forecast their new product idea to fail in the marketplace, even though that is the most likely outcome.) Just about every participant has a special interest of some sort, and these must be accounted for.

Elaborate and overly complex forecasting processes may also be a result of poor use of organizational resources. Does each step and each participant in your process actually make the forecast better? Can these participants be reassigned to more worthwhile activities in the organization? Consider using a method called Forecast Value Added (FVA) analysis [1] that streamlines the process by identifying process waste and inefficiencies (activities that are not making the forecast any better).
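
FVA analysis itself is described in the white paper cited in note 1; the sketch below, with invented error numbers, only illustrates the basic bookkeeping of one simple variant: compare the forecast error at each step of the process against a naive baseline and flag steps that fail to add value.

# Hypothetical MAPE values (percent) for each step of a forecasting process.
# The naive model is the baseline; a step "adds value" only if it lowers error.
steps = [
    ("Naive model (random walk)", 32.0),
    ("Statistical forecast", 26.5),
    ("Analyst override", 27.8),
    ("Consensus meeting", 29.1),
]

baseline_name, baseline_mape = steps[0]
print(f"Baseline: {baseline_name}, MAPE {baseline_mape:.1f}%")
for name, mape in steps[1:]:
    fva = baseline_mape - mape  # value added relative to the naive baseline
    verdict = "adds value" if fva > 0 else "does no better than the naive baseline"
    print(f"{name}: MAPE {mape:.1f}%, FVA {fva:+.1f} points ({verdict})")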

2. Selecting model solely on “fit to history”

One practice common in unsound software, or in the misuse of good software, is choosing the forecasting model based solely on the model’s “fit to history.” The software provides (or the forecaster builds) several models, and these are each evaluated compared to recent history. The model that most closely matches the recent history is then chosen and used for creating forecasts of the future.

Remember that our objective isn’t just to fit a model to history – it is to find an appropriate model for forecasting future behavior. It so happens that fitting a model to history is easy, anybody can do it, and it is always possible to find a model that has perfect fit. However, having perfect fit to history is no guarantee that the model will generate good forecasts or is at all appropriate for forecasting. “Over-fitting” models to randomness in the behavior, rather than to the systematic structure, is a common consequence of focusing attention solely on fit to history.
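
A small illustration of the point, with invented data: a high-order polynomial can be made to fit a short demand history almost exactly, yet forecast the next period far worse than a plain average (this sketch assumes only that numpy is available; it is not tied to any particular forecasting package).

import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)
history = 100 + rng.normal(0, 10, size=12)   # 12 periods of noisy, trendless demand
actual_next = 100 + rng.normal(0, 10)        # the period we are trying to forecast
t = np.arange(12)

# "Perfect" fit: a degree-11 polynomial essentially interpolates the 12 points.
p = Polynomial.fit(t, history, deg=11)
fit_error = np.mean(np.abs(p(t) - history))   # near zero by construction
overfit_forecast = p(12.0)                    # extrapolate one period ahead

simple_forecast = history.mean()              # plain average of the history

print(f"In-sample fit error of the degree-11 polynomial: {fit_error:.3f}")
print(f"Polynomial forecast error for the next period: {abs(overfit_forecast - actual_next):.1f}")
print(f"Simple-average forecast error for the next period: {abs(simple_forecast - actual_next):.1f}")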

3. Assuming model fit = forecast accuracy

How accurate can we expect a forecasting model to be? Inexperienced forecasters, and those outside of forecasting, may assume that a model’s fit to history indicates how accurately the model will forecast the future. So if the error of the historical fit is 20 percent, then the error of the future forecasts will also be 20 percent. This is a very bad assumption. One of the dirty tricks of software vendors is to only show you how well they can fit their models to your history, but never show you how well they really forecast.

For a lot of reasons, forecast accuracy will almost always be worse, and often much worse, than the fit of the model to history.


For example, we may have chosen an inappropriate model – one that happens to fit the history but does not capture the underlying mechanisms that guide the behavior. Or, we may have specified the right model that does correctly express the behavior – but then the behavior may change in the future. Whenever you are reviewing software to purchase, or even just reviewing the performance of the software you already have, make sure to focus on the accuracy of the future forecasts . . . and not on the accuracy of the fit to history.

4. Inappropriate accuracy expectations

Forecast accuracy is ultimately limited by the nature of the behavior we are trying to forecast. If the behavior exhibits smooth, stable, repeating patterns, then we should be able to forecast it quite accurately with simple methods. If the behavior is wild and erratic with no structure or stability, then we have no hope of forecasting it well, no matter how much time and money and resources we invest trying to do so. The most sophisticated methods in the world aren’t going to let us forecast unforecastable behavior, and we have to learn to live with that reality.

The worst practice is having inappropriate expectations for forecast accuracy and wasting resources trying to pursue unachievable levels of accuracy.

A better practice is to utilize what is called a naïve forecasting model. A naïve model is something simple and easy to compute, like a random walk (using the last known value as your future forecast), a seasonal random walk (such as using the known value from a year ago as your forecast for the same period this year), or a moving average of observations with a small sliding window (such as the average value of the last three periods).

You can think of the naïve model as being something free – you don’t need expensive systems or an elaborate forecasting process – you don’t need anything at all. A naïve model will achieve some level of forecast accuracy, say 60 percent. This 60 percent accuracy then becomes the baseline against which all your forecasting efforts are evaluated. If your process cannot do any better than a naïve model, why bother?
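
Each of the naïve models named above is a one-liner. The sketch below, with a made-up two-year monthly demand series, shows how they generate forecasts and how a baseline error could be computed from them; it illustrates the idea, not any vendor's implementation.

# Made-up monthly demand, two years; used only to illustrate the naive models.
demand = [120, 115, 130, 160, 170, 180, 210, 205, 175, 150, 140, 190,
          125, 118, 133, 158, 175, 185, 215, 208, 180, 152, 145, 195]

def random_walk(history):
    """Naive forecast: the last known value."""
    return history[-1]

def seasonal_random_walk(history, season=12):
    """Naive forecast: the value from one season (here, one year) ago."""
    return history[-season]

def moving_average(history, window=3):
    """Naive forecast: the average of the last few observations."""
    return sum(history[-window:]) / window

history, actual_next = demand[:-1], demand[-1]
for name, forecast in [("random walk", random_walk(history)),
                       ("seasonal random walk", seasonal_random_walk(history)),
                       ("3-period moving average", moving_average(history))]:
    error_pct = abs(actual_next - forecast) / actual_next * 100
    print(f"{name}: forecast {forecast:.0f}, actual {actual_next}, error {error_pct:.1f}%")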

5. Inappropriate performance objectives

Failing to understand what forecast accuracy is reasonable to achieve with your demand patterns can lead to setting inappropriate performance objectives. As mentioned above, you cannot consistently call the tossing of a fair coin correctly more than 50 percent of the time, so it makes no sense to give you a goal of achieving 60 percent accuracy. The same applies to forecasting demand. While management may want to achieve 90 percent accuracy, the nature of the demand patterns may be such that 90 percent is not achievable.

Goals are often assigned based on what accuracy management feels it “needs.” For example, it isn’t uncommon to have blanket goals such as “accuracy > 80 percent” across all products, with no consideration for whether this is reasonable.

Goals are sometimes based on industry benchmarks that purport to identify “best in class” forecasting performance. However, industry benchmarks are subject to a number of perils and should not be used to set forecasting objectives for your organization.

6. Perils of industry benchmarks

Benchmarks of forecasting performance are available from several sources, including professional organizations and journals, academic research, and private consulting/benchmarking organizations. But there are many reasons why industry benchmarks are irrelevant in setting your own forecasting performance objectives.
1) Can you trust the data? Are the numbers based on rigorous audits of company data or on responses to a survey? If un-audited survey responses, do the respondents actually know the answers or are they just making it up?
2) Is measurement consistent across the respondents? Are all organizations forecasting at the same level, such as by product, customer or region? Are they forecasting in the same time bucket, such as week or month? Are they forecasting with the same lead-time offset, such as three weeks in advance, or three months? And are they using the same metric? Even metrics as similar sounding as MAPE (mean absolute percentage error), weighted MAPE and symmetric MAPE can deliver very different values for the same data (a short illustration appears at the end of this section).
3) Finally and most important, is the comparison even relevant? Does the benchmark company have equally forecastable data? Let’s consider a worst-case example: Suppose a benchmark study shows that Company X has the lowest forecast error. Consultants and academics then converge on Company X to study its forecasting process and publish reports touting Company X’s “best practices.” Other companies read these reports and begin to copy Company X’s “best practices.” However, upon further review, FVA analysis is applied, and we discover that Company X had very easy-to-forecast demand, and that they would have had even lower forecast error had they just used a moving average. In other words, Company X’s so-called “best practices” just made the forecast worse!

This example is not far-fetched. And many organizational practices, even purported best practices, may only make their forecast worse.

Industry benchmarks for forecasting performance should be ignored. Benchmarks tell us what accuracy the so-called “best in class” companies are able to achieve, but they do not tell us how forecastable their demand is. Companies at the top of the benchmark lists may be there simply because they have the easiest-to-forecast demand – not because their forecasting processes are worthy of admiration. Without information on forecastability, industry benchmarks are irrelevant and should not be used to set performance objectives.

Also, objectives should not be set arbitrarily, based on management’s wants or needs. It makes no sense to set arbitrary, blanket objectives such as “forecast accuracy must be > 80 percent” without any consideration of the forecastability of the demand. If the objective is set too high, it will demoralize the forecasters and encourage them to cheat. If the objective is set too low, such that a naïve forecast could beat it, then the forecasters can simply idle at their desks all year and still make the goal.

The better practice is to tie the forecasting performance objective to the underlying forecastability of the demand patterns, and the way to do this is to use a naïve forecast as the baseline. Perhaps the only reasonable objective is to beat the naïve model (or at least do no worse!) and to continuously improve the forecasting process. You improve the process not only by making the forecasts more accurate and less biased, but by making the process more efficient – using fewer and fewer resources and automating as much as possible. This is where good automated forecasting software can be very effective.
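
To illustrate the earlier point about metrics (item 2 in the benchmark checklist), here is a small sketch with invented actuals and forecasts computing three similar-sounding error measures. Even on the same data they give different values, which is one more reason cross-company comparisons are shaky. Exact definitions vary by practitioner; the ones below are common forms.

# Invented actuals and forecasts for one product over six periods.
actuals = [100, 80, 120, 60, 140, 90]
forecasts = [90, 95, 100, 75, 120, 95]

def mape(a, f):
    """Mean absolute percentage error: average of |actual - forecast| / actual."""
    return 100 * sum(abs(x - y) / x for x, y in zip(a, f)) / len(a)

def weighted_mape(a, f):
    """Weighted MAPE: total absolute error divided by total actuals."""
    return 100 * sum(abs(x - y) for x, y in zip(a, f)) / sum(a)

def symmetric_mape(a, f):
    """Symmetric MAPE: errors scaled by the average of actual and forecast."""
    return 100 * sum(abs(x - y) / ((x + y) / 2) for x, y in zip(a, f)) / len(a)

print(f"MAPE: {mape(actuals, forecasts):.1f}%")
print(f"Weighted MAPE: {weighted_mape(actuals, forecasts):.1f}%")
print(f"Symmetric MAPE: {symmetric_mape(actuals, forecasts):.1f}%")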

7. Adding variation to demand

The forecastability of demand is largely dependent on the volatility of that demand. When demand is smooth and stable, it can be forecast accurately with simple methods. When demand is erratic and random, it is unreasonable to expect accurate forecasts.

The scatter plot in Figure 1 compares forecast accuracy (from 0 percent to 100 percent on the vertical axis) to the volatility of the sales pattern (as measured by the coefficient of variation) along the horizontal axis. It is based on one year of weekly forecasts for 5,000 stock-keeping units (SKUs) at a consumer goods company. For SKUs with greater volatility (moving to the right in the plot), forecast accuracy tended to decrease.

Volatility analysis suggests that whatever we can do to reduce volatility in the demand for our products will make them easier to forecast. Unfortunately – and this is the worst practice here – most organizational policies and practices are designed to add volatility to demand rather than make it more stable.

Figure 1. Forecast accuracy versus volatility.
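
The volatility measure on the horizontal axis of Figure 1 is the coefficient of variation, the standard deviation of a demand series divided by its mean. A minimal sketch of how such a plot's inputs could be computed, using two invented SKU histories:

import statistics

def coefficient_of_variation(series):
    """Volatility of a demand series: standard deviation divided by mean."""
    return statistics.pstdev(series) / statistics.mean(series)

# Invented weekly demand for two SKUs: one smooth, one erratic.
sku_demand = {
    "SKU-A (stable)": [100, 98, 103, 101, 99, 102, 100, 97],
    "SKU-B (erratic)": [100, 10, 230, 40, 180, 5, 260, 30],
}

for sku, series in sku_demand.items():
    print(f"{sku}: coefficient of variation = {coefficient_of_variation(series):.2f}")
# The article's scatter plot pairs each SKU's coefficient of variation with its
# achieved forecast accuracy; higher volatility generally went with lower accuracy.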


Everyone is familiar with the quarter end push, or “hockey stick” – where companies do everything possible at the end of the quarter to make the short-term sales target. Figure 2 shows shipments from a consumer goods manufacturer to its customers, the retail stores. Shipments are shown by the thin line, and you can see the big spikes at quarter end and the big drop off at the start of every new quarter.

The thicker line shows consumer purchases from the retail store. Consumption is quite stable, and just using the mean would have provided a fairly accurate forecast.

The variation of the shipment pattern is three times the variation of the retail sales pattern. These highly erratic and hockey stick patterns are encouraged by our financial practices, such as hitting the quarter end revenue targets, and by our sales and promotional practices, such as cutting prices or offering other incentives that spike the demand. In many industries, customers have been trained to wait for quarter end to get the best deals.

Figure 2: Shipments versus consumption.

Instead of policies that encourage volatile demand from your customers, a better practice (at least to improve forecasting) is to remove those kinds of incentives, or create incentives that encourage smooth and stable demand. In addition to being able to forecast smooth demand more accurately, smooth demand should be easier and cheaper to service, so you can reduce costs. Organizations can apply these same sorts of analyses with their own data and obtain visibility into the issues that may be created by their existing policies and practices [2].

8. New product forecasting

New product forecasting (NPF) is inherently difficult, and usually inaccurate. A worst practice is making business decisions based on the assumption that new product forecasts are going to be accurate – because they probably won’t be!

Since there is no historical demand data for the new product, the forecast is largely based on judgment. Often the forecast is provided by the product manager or general manager who is advocating introduction of the product, and almost assuredly the forecast will be high enough to exceed internal hurdles to get new products approved for development. When justification for the forecast is required, a common method is to refer to past products, sometimes called “like items,” that are similar to the new product. This is known as forecasting by analogy. While this approach is legitimate, it is subject to selection bias – of only utilizing prior products that were successful. Since most new products fail in the marketplace, basing a forecast only on successful product introductions creates an unjustifiably optimistic perception.

While there are dozens of methods available purporting to improve new product forecasting accuracy, the most important thing is being aware of the uncertainties and likely range of outcomes. Too much confidence in the accuracy of your new product forecast can lead to dangerously risky business decisions, and that is what we want to avoid. A structured analogy approach can be useful in many NPF situations [3]. It augments human judgment by automating historical data handling and extraction, incorporating statistical analysis and providing visualization of the range of historical outcomes. The software makes it possible to quickly extract candidate products based on the user-specified attribute criteria. It aligns, scales and clusters the historical patterns automatically, making it easier to visualize the behavior of past new products. This visualization helps the forecaster realize the risks, uncertainties and variability in new product behavior.
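
The structured analogy tool described above is a commercial offering whose internals are not given here. The sketch below is only a generic illustration of forecasting by analogy in that spirit: pull past launches that share attributes with the new product, scale their early sales curves to a common base, and summarize the range of outcomes rather than a single point. All product names and numbers are invented.

# Generic forecasting-by-analogy sketch; all launch histories are invented.
past_launches = {
    # product: (attributes, monthly unit sales for the first six months)
    "Widget Classic": ({"category": "widget", "price_tier": "mid"}, [40, 70, 90, 85, 80, 75]),
    "Widget Lite": ({"category": "widget", "price_tier": "low"}, [60, 95, 80, 60, 50, 45]),
    "Gadget Pro": ({"category": "gadget", "price_tier": "high"}, [10, 25, 45, 60, 70, 75]),
}

new_product_attrs = {"category": "widget"}  # user-specified attribute criteria

# 1. Extract candidate analogs that match the requested attributes.
analogs = [curve for attrs, curve in past_launches.values()
           if all(attrs.get(k) == v for k, v in new_product_attrs.items())]

# 2. Scale each analog's curve to a common base (its own first-month sales).
scaled = [[month / curve[0] for month in curve] for curve in analogs]

# 3. Summarize the range of outcomes month by month, rather than a point forecast.
for month in range(len(scaled[0])):
    values = sorted(s[month] for s in scaled)
    print(f"Month {month + 1}: between {values[0]:.2f}x and {values[-1]:.2f}x of launch-month sales")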

Expectations for the accuracy of new product forecasts should be modest, and acknowledgement of this uncertainty should be at the forefront. The structured analogy approach allows the organization to both statistically and visually assess the likely range of new product demand so that it can manage accordingly. Rather than lock in elaborate sales and supply plans based on a point forecast that is likely to be wrong, the organization can use the structured analogy process to assess alternative demand scenarios and mitigate risk.

Judgment is always going to be a big part of new product forecasting – as of today, a computer will not be able to tell us what is going to be the hot new fashion color. But judgment needs assistance to keep it on track and as objective as possible. While the structured analogy approach can be used to generate new product forecasts, it is also of great value in assessing the reasonableness of forecasts that are provided from elsewhere in the organization. The role of structured analogy software is to do the heavy computational work and provide guidance – making the NPF process as automated, efficient and objective as possible.

THIS ARTICLE HAS identified several common worst practices in business forecasting. By identifying and eliminating these practices through methods such as FVA analysis, organizations can stop making the forecast worse and can achieve the level of forecast accuracy that is reasonable to expect given the nature of their demand patterns. ❙

Michael Gilliland ([email protected]) is a product marketing manager and Udo Sglavo ([email protected]) is a global analytic solutions manager at SAS Institute Inc., a leading business analytics software and services company based in Cary, N.C.


NOTES

1. See the SAS white paper “Forecast Value Added Analysis: Step-by-Step.”

2. For a more thorough discussion of volatility analysis, worst practices and FVA analysis, see Michael Gilliland, “The Business Forecasting Deal,” John Wiley & Sons, 2010.

3. See the SAS white paper “New Product Forecasting Using Structured Analogies” for more information on the service offering.


WE CAN FORECAST this: forecasting will continue to be in the news. The U.S. government gives the current unemployment figures, and then we hear pundits and news sources predict what those figures will be next quarter and next year. We read about the Congressional Budget Office’s projections [1] for the country’s deficits (see Figures 1 and 2) through 2019 while speakers and articles quote and then extol or deride the projections [2]. The CBO’s opening sentence, “Since the Congressional Budget Office last issued its baseline projections … the outlook for the budget deficit has deteriorated further,” gives one pause. The projections made just a few months earlier for what will happen in a decade were wrong and had to be updated. In other words, as more data comes in, the forecasts may change.

Economic forecasts can be dicey, especially when the forecasts may themselves affect the future. With the health care reform package now law, will there be enough doctors to serve the larger insured base? Of lesser importance, how many movies will be filmed in 3D this and next year and how many screens in 2011-12 will be equipped to show these films? Everyone can add his or her own examples to the ones mentioned. What is common to many of these examples is that they fall into the forecasting category frequently called trend analysis.

Most management science, data analysis and operations management texts include at least one forecasting chapter. Many business schools’ curricula indicate that they cover some forecasting in at least one mandatory graduate or undergraduate quantitative course. The level of coverage varies. Generally, if computer software is not used in a text or course, the forecasting techniques discussed are those in which the math is relatively easy to demonstrate (e.g., simple exponential smoothing, moving averages, etc.).

For a directory of the forecasting software products and their capabilities referenced in this article, go to: http://lionhrtpub.com/orms/surveys/FSS/fss-fr.html

Figure 1 (top): CBO’s projections for U.S. deficits through 2019. Figure 2 (bottom): CBO’s projections for revenues and outlays through 2019.


Trend analysis, which frequently involves regression models, may be limited to linear trend, and if the trend appears to be logarithmic, exponential or a simple S-curve, transformations are made to utilize linear regression. Even if the course or text does use the computer, if the accompanying software does not do the more complex forecasting procedures (e.g., Box-Jenkins methods), then these techniques are not covered or are barely delved into. This is changing. As more commercial statistical vendors make student versions available for student budgets, texts and courses can and should cover more powerful forecasting procedures. Some examples of full-featured statistical analysis programs having Box-Jenkins and exponential smoothing forecasting capabilities that are available in academic versions are SPSS, SAS OnDemand, Minitab, Statgraphics, NCSS and Systat. Generally, dedicated forecasting products (software that only does forecasting), such as AutoBox and Forecast Pro, do not offer student versions.
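
As one concrete example of the transformation approach mentioned above: an exponential trend becomes a straight line after taking logarithms, so ordinary linear regression can be used and the result transformed back. The series below is invented, and the sketch assumes only that numpy is available.

import numpy as np

# Invented yearly series with roughly exponential growth.
y = np.array([12.0, 15.5, 19.0, 24.5, 31.0, 40.0, 50.5, 64.0])
t = np.arange(len(y))

# Fit log(y) = a + b*t by ordinary least squares, then transform back.
b, a = np.polyfit(t, np.log(y), deg=1)
forecast = np.exp(a + b * len(y))   # forecast for the next period
growth_rate = np.exp(b) - 1         # implied period-over-period growth

print(f"Estimated growth per period: {growth_rate:.1%}")
print(f"Forecast for period {len(y) + 1}: {forecast:.1f}")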

WHAT TO LOOK FOR IN THE SOFTWARE

FORECASTING SOFTWARE CAN fall into one of three categories. Automatic forecasting software will quickly do an analysis of the data and then make the forecasts using a methodology that it deemed the most appropriate for that particular data. The chosen technique may come from the software minimizing some statistic (AIC, BIC, RMSE, etc.). The software will give the optimal parameters of the model, confidence intervals for the forecasts, plots and various statistical summary measures. The user always has the option of bypassing the recommended or chosen methodology and specifying some other technique. The software then gives similar output for the prescribed procedure.
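
A rough sense of what the automatic category involves, sketched with hand-rolled candidate methods rather than any particular commercial package: each candidate is evaluated on a held-out portion of the history by a chosen statistic (RMSE here; a real package might instead minimize AIC or BIC), and the winner is used to produce the forecast. The data is invented.

# Invented monthly history; the last six observations are held out for evaluation.
history = [210, 205, 220, 230, 228, 240, 245, 238, 250, 262, 258, 270,
           268, 275, 280, 290, 288, 300]
train, holdout = history[:-6], history[-6:]

def forecast_mean(data, h):
    return [sum(data) / len(data)] * h

def forecast_last_value(data, h):
    return [data[-1]] * h

def forecast_moving_average(data, h):
    return [sum(data[-3:]) / 3] * h

def rmse(actuals, forecasts):
    return (sum((a - f) ** 2 for a, f in zip(actuals, forecasts)) / len(actuals)) ** 0.5

candidates = {
    "overall mean": forecast_mean,
    "last value": forecast_last_value,
    "3-month moving average": forecast_moving_average,
}

# Evaluate every candidate on the holdout and keep the one with the lowest RMSE.
scores = {name: rmse(holdout, fn(train, len(holdout))) for name, fn in candidates.items()}
best = min(scores, key=scores.get)
print("Holdout RMSE by method:", {k: round(v, 1) for k, v in scores.items()})
print(f"Selected: {best}; next-period forecast = {candidates[best](history, 1)[0]:.0f}")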

Semiautomatic forecasting software asks the user to specify a methodology from a list of available techniques. The software then finds the optimal parameters for the chosen model, and then gets forecasts, confidence intervals, statistical measures and plots. Finally, manual software requires the user to specify both the technique and the parameters for the model. For example, if the user’s time plot of the data shows seasonality,


Page 21: Analytics July August 2010

frequently involves regression mod-els, may be limited to linear trend, and if the trend appears to be loga-rithmic, exponential or a simple S-curve, transformations are made to utilize linear regression. Even if the course or text does use the computer, but the accompanying software used does not do the more complex fore-casting procedures (e.g., Box-Jenkins methods), then these techniques are not covered or barely delved. This is changing. As more commercial statis-tical vendors are making student ver-sions available for student budgets, these texts and courses are, could and should cover more powerful fore-casting procedures. Some examples of full-featured statistical analysis programs having Box-Jenkins and ex-ponential smoothing forecasting capa-bilities that are available in academic versions are SPSS, SAS OnDemand, Minitab, Statgraphics, NCSS and Sys-tat. Generally, dedicated forecasting (software that only does forecasting) products (e.g., AutoBox and Forecast Pro) do not offer student versions.

FORECASTING SOFTWARE CAN fall into one of three categories. Automatic forecasting software will quickly do an analysis of the data and then make the forecasts using a methodology that it deemed the most appropriate for that particular data. The chosen technique may come from the soft-ware minimizing some statistic (AIC, BIC, RMSE, etc.). The software will give the optimal parameters of the model, confidence intervals for the forecasts, plots and various statistical summary measures. The user always has the option of bypassing the rec-ommended or chosen methodology and to specify some other technique. The software then gives similar out-put for the prescribed procedure.

Semiautomatic forecasting soft-ware asks the user to specify a methodology from a list of available techniques. The software then finds the optimal parameters for the chosen model, and then gets forecasts, con-fidence intervals, statistical measures and plots. Finally, manual software requires the user specify both the technique and the parameters for the model. For example, if the user’s time plot of the data shows seasonality,

Here are just some of the Community web sites you can visit to find out more about community activites and the following member benefits: » Access to specialized knowledge through Community newsletters, list serves, discussion boards and meetings » Increased ability for you to learn about the specific “tools of the trade” » Discounts on registration rates for special interest conferences » Ability to help shape the INFORMS Annual and Regional meetings through community -sponsored and -presented meeting tracks » Discounts on selected journal(s) » Ability to compete for specialized awards » Access to sometimes hidden analytics job opportunities » Volunteer leadership opportunities » Opportunities to make a positive contribution to the profession

You may join a community without being a member of INFORMS. Click here to join a community.(Note: not all INFORMS Communites offer all of the above benefits.)

www.informs.org/Subdiv

:: Societies :: Sections :: Fora :: Chapters :: Student Chapters ::

· Artificial Intelligence · Aviation Applications · Behavioral Operations Management · CPMS: The College on the Practice of Management Science · Data Mining · eBusiness · Energy, Natural Resources, and the Environment · Financial Services · Group Decision and Negotiation · Health Applications

· Applied Probability · Computing · Decision Analysis · Information Systems · Manufacturing & Service Operations Management

· Marketing Science · Military Applications · Optimization · Simulation · Transportation Science and Logistics

· Location Analysis · Organization Science · Public Programs Service and Needs· Quality, Statistics, and Reliability · Railway Applications · Revenue Management and Pricing · Service Science · SpORts · SPRIG: Spreadsheet Productivity Research Interest Group · Technology Management · Telecommunications

SOCIETIES click on name to visit society website

SECTIONS click on name to visit section website

INFORMS COMMUNITIES

Chhapters :::: SSectiions :: Chhapters :: S

re to join a community

Click here to downloadSubdivision Application

It’s fast and it’s easy! Visit: http://analytics.informs.org/button.html

Page 22: Analytics July August 2010

the user may ask the software to use Winters' method. The user must also supply the three smoothing constants, and the software then gets the forecasts, plots, statistical measures, etc. Few commercial programs fall into this group because finding the optimal parameters of the model can be a tedious trial-and-error process.
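As a hedged illustration of the manual case, the sketch below fits Winters' (Holt-Winters) method with all three smoothing constants supplied by the user rather than optimized; the series and the constants are made up for the example.

```python
# A sketch of "manual" Winters' method: the user fixes alpha, beta and gamma
# and the software only computes fitted values and forecasts.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

idx = pd.date_range("2008-01", periods=36, freq="MS")
monthly = pd.Series(100 + np.arange(36)
                    + 10 * np.sin(np.arange(36) * 2 * np.pi / 12),
                    index=idx)                     # synthetic seasonal series

model = ExponentialSmoothing(monthly, trend="add", seasonal="add",
                             seasonal_periods=12,
                             initialization_method="heuristic")
manual_fit = model.fit(smoothing_level=0.2,        # alpha
                       smoothing_trend=0.1,        # beta
                       smoothing_seasonal=0.3,     # gamma
                       optimized=False)            # keep the user's constants
print(manual_fit.forecast(6))
```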

The user must decide whether to use a dedicated forecasting program or a general statistical product that has the desired forecasting capabilities. Dedicated products are more likely to be automatic programs. They may also offer more sophisticated forecasting techniques (e.g., ARIMA intervention, multivariate ARIMA transfer functions, etc.) that general statistical programs may not have [3]. Most general statistical

products do not have an automatic forecasting mode, but a few now do. SPSS has an "expert modeler" that will find the best ARIMA or exponential smoothing model for the data. Statgraphics has an "automatic model selection" option, and its StatAdvisor explains why the software chose that particular model.

USERS REPORT that dedicated forecasting and general statistical products frequently do well at forecasting time series data that has seasonality, with or without trend. Because many recent news topics involved trend analysis, for this survey I tried something similar but less crucial to the national scene. Every Sunday evening we hear

or read which movie made the most money over the weekend. From www.boxofficemojo.com, a Web site that has accurate box office information for more than 10,000 films, I took the daily box office returns of "Alice in Wonderland," released on March 5 by Buena Vista (Disney) [4]. My question was simple:

Considering I had the daily, and thus cumulative, domestic gross of the film for its first 23 days (to March 27) of release, how much money will "Alice" eventually gross? Figure 3 shows the time plot of the data. As of this writing (May 20), nearing the end of its domestic run, the movie has made approximately $332 million and is the 19th highest grossing film (domestic figures, not adjusted for inflation) of all time.

I put the data in an Excel spreadsheet.

Figure 3: Box office gross for “Alice in Wonderland.”

Figure 4: Forecast Pro’s Excel layout for data entry.


The first column, Time, had integers from 1 to 23. The second column, Alice, had the cumulative gross, in millions of dollars. Most programs can read or import a variety of formats, and all can import Excel worksheets. However, sometimes getting the forecasting software to recognize the Excel spreadsheet can be counterintuitive or even cumbersome. For example, Forecast Pro requires that the Excel spreadsheet have six initial rows before the data starts in row seven.

Figure 4 shows what the Excel layout should look like before you can import it into Forecast Pro. The first row must have the variable name, the second row has a description of what the variable is, the third row indicates the year the data starts, the fourth gives the starting period (e.g., for Alice, March or the third month), the fifth has the number of periods per year, and the sixth gives the number of periods per cycle.

Thus, it may not be obvious to a new or casual user how to get the forecasting software to recognize the spreadsheet without resorting to the dreaded procedure of reading the user's guide. Other products import the spreadsheet but ask the user to supply similar information in a separate dialog box.
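For readers who prefer to script this step, a minimal sketch of the same two-column layout is below; the file name is hypothetical and the layout simply mirrors the Time/Alice worksheet described above.

```python
# Read a two-column worksheet (Time = 1..23, Alice = cumulative gross in $M)
# into a series indexed by day of release. The workbook name is a placeholder.
import pandas as pd

df = pd.read_excel("alice_boxoffice.xlsx")                  # hypothetical file
alice = pd.Series(df["Alice"].values, index=df["Time"].values, name="Alice")
print(alice.head())
```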

All the products I tried have this in common: clicking on "Help" gave excellent tutorials or explanations to guide users. With these products – even if you have an aversion to reading PDF manuals – I urge you to at least glance at them.


Figure 5: Statgraphics’ forecasts for “Alice.”


I strongly recommend that you check if a trial version of the product is available from the vendor's Web site. All of the general statistical software firms offer a trial program of the complete product that works for 10 to 30 days. Unfortunately, not all of the dedicated forecasting software vendors make a trial version available. If a trial version is available, verify that it allows you to use your own data, and not just the "trial" data that comes with the trial software. While evaluating its capabilities, judge how easy the software is to learn and, once you have mastered its nuances, how easy it is to use.

MOST PRODUCTS GIVE the usual output – forecasts, statistical measures, graphs, etc. – without any prompting. Many will give more, or less, output if the user specifies the details via dialog boxes. What did the automatic programs say about "Alice in Wonderland"?

All the products assumed linear growth. Statgraphics gave Figure 5, Forecast Pro gave Figure 6 and SPSS gave Figure 7. All predicted that the film, by mid-May, would be the highest grossing film of all time. SPSS' "Expert Modeler" said that it was 95 percent confident that "Alice" could end with a gross as high as $1.4 billion or as low as a $700 million loss. What the "Alice in Wonderland" example indicates is that users should know what the program is geared to do if it has an automatic mode. Using the software as a "black box" could lead to reasonable or outrageous forecasts [5].
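The sketch below reproduces the flavor of the problem rather than any vendor's output: a straight-line trend fit to cumulative grosses happily extrapolates far past any plausible total, with prediction intervals wide enough to include nonsense. The data file is a hypothetical stand-in for the series plotted in Figure 3.

```python
# Fit a linear trend to cumulative gross and extrapolate it with 95 percent
# prediction intervals, illustrating how a "black box" linear model can
# produce outlandish long-range forecasts for S-shaped data.
import numpy as np
import statsmodels.api as sm

t = np.arange(1, 24)                                 # first 23 days of release
gross = np.loadtxt("alice_cumulative.csv")           # hypothetical series ($M)

ols = sm.OLS(gross, sm.add_constant(t)).fit()
future = sm.add_constant(np.arange(24, 80))          # roughly eight more weeks
pred = ols.get_prediction(future).summary_frame(alpha=0.05)
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]].tail())
```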

In a previous survey [http://viewer.zmags.com/publication/a52b897c#/a52b897c/44], I found that different products gave different forecasts for the same data using the same model.

For this survey, using the most current versions of the software, I got similar results. That is, I used another data set that exhibited linear growth and monthly seasonality, and I always told the software to use Winters' method. The various programs I tried gave different smoothing parameters and thus different forecasts. Why? The software did not use the same initial conditions to find the "optimal" smoothing constants for Winters' method. Worse, very few of the products tell the user how they determined those initial conditions, or what they were. Thus, the heads-up given in the previous survey applies again.
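The effect is easy to reproduce even within a single package. In the hedged sketch below, the same Winters' model is fit to the same synthetic series under three different initialization schemes; the smoothing constants and the one-step forecasts all shift. The series and the scheme names are illustrative, not taken from any product in the survey.

```python
# Same model, same data, different initial conditions: the estimated
# smoothing level and the resulting forecast move around.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

t = np.arange(48)
y = pd.Series(50 + 2 * t + 8 * np.sin(2 * np.pi * t / 12)
              + np.random.default_rng(0).normal(0, 2, 48),
              index=pd.date_range("2006-01", periods=48, freq="MS"))

for init in ("estimated", "heuristic", "legacy-heuristic"):
    fit = ExponentialSmoothing(y, trend="add", seasonal="add",
                               seasonal_periods=12,
                               initialization_method=init).fit()
    print(f"{init:16s} alpha={fit.params['smoothing_level']:.3f} "
          f"next={fit.forecast(1).iloc[0]:.1f}")
```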

THE AUTHOR, in conjunction with Analytics and OR/MS Today magazines, recently conducted a new survey of forecasting products. The survey asked the vendor to check off the capabilities and features of the software and allowed him or her to include additional details not addressed by the questions. We tried to identify as many products as possible, using reader and vendor feedback, advertising, displays at professional conferences, information from previous surveys, etc. We e-mailed the vendors and asked them to respond to our online questionnaire, followed by some gentle nudging with subsequent phone calls.

Figure 6: SPSS’ forecasts for “Alice.”


Figure 7: Forecast Pro’s forecasts for “Alice.”


The goal was to be as comprehensive as possible in identifying and polling the vendors. To those who say, "They left out (my) product X!" please accept our apology. Let us know the company and product [the questionnaire is available at www.lionhrtpub.com/ancill/fssurvey.shtml], and we will add it to the online directory and listing of products.

THE PURPOSE OF the survey [see http://lionhrtpub.com/orms/surveys/FSS/fss-fr.html] is to inform the reader of what is available. The information comes from the manufacturers, and no effort was made to verify the submissions. My remarks about specific software should not be construed as an overall review or evaluation of that program, but rather as a few musings from one user working with one data set.

If you are interested in buying a new forecasting program, or want to try a product other than the one you have, you should first examine what techniques the software offers and compare those with your needs.

I recommend that the software be at least semiautomatic. Get a trial version if you can. Finally, contact the vendor with your specific questions, and if the Web site does not mention a trial version, bring up the issue with the vendor directly. Users tell me that they found the vendors to be extremely helpful. ❙

Jack Yurkiewicz ([email protected]) is a professor of management science in the MBA program at the Lubin School of Business, Pace University, New York. In addition to management science, he teaches data analysis, operations management and simulating financial models. His current interests include developing distance-learning courses for these topics and assessing their effectiveness.


NOTES AND REFERENCES

1. www.cbo.gov/ftpdocs/100xx/doc10014/Chapter1.5.1.shtml

2. To my anxious family, forecasts for my evil LDL levels through 2019 show good levels hovering around 75 mg/dL, with a peak of 83 in the summer of 2016 and then a slight decay to 77 by the end of 2019.

3. An excellent reference text for some of these techniques is “Forecasting Principles and Applications,” by Stephan A. DeLurgio, McGraw Hill, 1998.

4. www.boxofficemojo.com/movies/?page=daily&id=aliceinwonderland10.htm

5. Postscript: Figure 3's time plot of "Alice" looked like an S-curve to me. I made an Excel template for the Weibull function. With four parameters, this model gives more flexibility than the simple S-curve. Finding the four parameters that minimize the root mean square error was a nonlinear program for Solver. The template predicted "Alice" would gross $335 million, approximately $3 million more than the film actually made.
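The Weibull formula itself appeared as an image in the original layout and is not reproduced here, so the sketch below simply assumes one common four-parameter Weibull growth curve, G(t) = a(1 - exp(-(t/b)^c)) + d, and fits it by least squares in place of the Excel/Solver template; the data file is a placeholder.

```python
# Fit an assumed four-parameter Weibull growth curve to cumulative gross
# and read off its asymptote as the "eventual gross" estimate.
import numpy as np
from scipy.optimize import curve_fit

def weibull_growth(t, a, b, c, d):
    return a * (1.0 - np.exp(-(t / b) ** c)) + d

t = np.arange(1, 24)                                  # first 23 days of release
gross = np.loadtxt("alice_cumulative.csv")            # hypothetical series ($M)

params, _ = curve_fit(weibull_growth, t, gross, p0=[300.0, 10.0, 1.5, 0.0],
                      maxfev=10000)
a, b, c, d = params
print("estimated eventual gross ($M):", round(a + d, 1))   # curve's asymptote
```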



A QUICK GOOGLE SEARCH of the phrase “predictive analytics” produces more than 375,000 results – the number of search

results crosses 1 million when the phrase "predictive modeling" is used instead. Just five years ago things were quite different. Some of the early converts in the business world – primarily, experienced professionals who are "mathematical enthusiasts" – still remember our struggles to convince senior executives in large

corporations why predictive analytics and related technologies will change the way they do business. The good news is those days are behind us, thanks primarily to IBM and Accenture, who have been spending serious marketing dollars since mid-2009 to evangelize to executives and managers in almost all verticals – and horizontals – the virtues of business analytics, in general, and predictive analytics, in particular. As readers of this magazine, you and I are probably "believers" – and

we owe a debt of gratitude to these two large consulting firms (especially IBM), and their primary competitors who are following in very similar footsteps, for making analytics an integral part of many important business discussions.

This article is Part 1 of a two-part series highlighting some of the major uses of predictive analytics in the after-sales service domain. Part 1 focuses on customer service; Part 2 will focus on field service.

LET’S DEFINE a few simple terms to keep us all on the same canvas.

Customer Service indicates the after-sales service provided through contact centers. This after-sales service can be for technical or non-technical issues that the customer is experiencing with the product or service she purchased from a company. Contact, usually inbound (i.e., a customer contacting a company), can be made by phone, e-mail, chat and Web.


Field Service indicates the after-sales service provided by sending an expert to the customer site to resolve an issue, technical or otherwise, that a customer is experiencing with the product or service she purchased from a company. This usually takes place after the customer has already communicated with the company's customer service division, and Customer Service has decided to engage Field Service to address the issue.

Predictive Analytics answers the questions of what will happen, and when. It is a domain – coined by practitioners and industry experts – that uses data, algorithms and business rules to provide forward-looking visibility for a process or an initiative. Predictive analytics is much more of a business discipline than a scientific discipline – however, it borrows heavily from the mathematical sciences.

TODAY'S LARGE CONTACT center operations have three primary objectives:

• Improve customer satisfaction and loyalty
• Reduce cost per contact
• Transform from a cost center to a profit center

Customer service executives monitor and manage a variety of metrics – also known as key performance indicators (KPIs) – to track progress toward the above objectives. These KPIs can be broadly grouped into the following four categories:

Satisfaction and Loyalty. This category includes KPIs that are specifically designed to understand if a customer is satisfied with after-sales service provided by a company's contact center(s) and whether this customer is likely to continue to do business with the company. Examples include customer satisfaction (also known as event satisfaction), net promoter score, brand satisfaction, satisfaction with agent, recommend likelihood, repurchase likelihood, etc.

Resolution. Usually a customer contacts a company's after-sales service to seek resolution for an issue – technical or non-technical – that she may be experiencing with one of the company's offerings that she has purchased. It is of paramount importance to the company to resolve the customer's issue quickly to ensure her satisfaction, loyalty,

etc. Examples of KPIs in this category include first contact resolution (also known as resolved in one), resolved within two, etc.

Operations & Productivity. This is the oldest and the most widely understood category in the contact center world. Examples of KPIs in this category include incoming contact volume, variance from forecast, handle time, queue time, utilization, occupancy, adherence to schedule, etc.

Cost and Revenue. This category is about saving and making money. The latter piece is achieved by selling through the service channel, a practice many large companies have embraced during the past few years (perhaps the recent recession accelerated the adoption). Examples of KPIs here include cost per contact, revenue per contact (also per shift, per day, per week, per agent, etc.), sales conversion rate, actual cost vs. budget, etc.


PREDICTIVE ANALYTICS IS starting to have a profound effect on customer service. The value proposition of knowing about an issue or an opportunity, before it becomes an issue or an opportunity, is compelling in the customer service environment. Proper identification and quantification of an upcoming issue can enable a corporation to take measures to preempt it. Similarly, proper identification and quantification of an upcoming opportunity can enable a corporation to take advantage of it – otherwise, a missed opportunity can be looked upon as an issue by itself.

While contact center practitioners have been projecting forward the KPIs in the Operations & Productivity category for a while now, until recently that hasn't been the case for most of the KPIs in the other three categories. So, let's explore the applications of predictive analytics in these three categories in more detail.

Each of the KPIs in the Satisfaction and Loyalty category is a response variable (i.e., output) that depends on many moving parts (i.e., inputs or explanatory variables). Take for example customer satisfaction (CSAT). Say we are talking about a customer service environment providing phone-based technical support for a technology product (such as

hardware, software, electronics, etc.). In this scenario, the CSAT depends on the following factors:
• The customer. Most large companies

collect and store lots of data on their customers. For example, CSAT may be different based on how tech-savvy the caller is.

• The product. CSAT may also vary depending on the product. Some products are more complex than others; some have more issues than others do; and so on. Large corporations usually have lots of data on their products.

• The agent. CSAT may depend on the characteristics of the agent taking the service call. Some agents have more extensive training than others; some are more experienced; some are easier to understand. Leading companies actively collect and manage data on their contact center agents.

• The issue. CSAT can vary based on the issue about which the customer is calling. Some issues are more difficult to solve over the phone – and some of these complex issues may even need follow-up calls. Leading companies track and store lots of data on the issues related to their major product offerings.

• The call. CSAT may also be different based on the quality of the call itself. Maybe the customer waited in queue for a long time before speaking with a live agent. Maybe the customer found the IVR (interactive voice response) directions misleading or confusing. Today's sophisticated phone switches, also known as automatic call distributors, collect a wealth of information on each call.

In addition, customer satisfaction – which is usually measured by surveying a representative sample of the customers via e-mail or automated post-call IVR – may also be affected depending on whether the customer bought the product through a channel (e.g., a notebook computer manufactured by HP bought from a retail store such as Best Buy) or directly from the manufacturer. Then there are environmental factors (for example, economic factors such as recession) that may play a role. CSAT is the response variable that can be modeled – and its value predicted for future time horizons of interest – by taking into account the data from all the moving parts listed above. This is the case for net promoter score ("NPS") and the other KPIs in this category as well. For NPS, it may be more insightful to model the promoters and the detractors separately and then combine them to get the

resultant NPS for the future time horizons that may be of interest to a company.
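As a hedged sketch of what such a model might look like in practice, the snippet below trains a classifier that predicts a "satisfied" outcome from customer, product, agent, issue and call features; every column name and the data file are hypothetical placeholders for the kinds of inputs described above.

```python
# Model CSAT as a response variable driven by customer, product, agent,
# issue and call features (all column names are illustrative).
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

calls = pd.read_csv("service_calls.csv")             # hypothetical history
features = ["customer_tenure", "tech_savvy_score",   # the customer
            "product_family", "product_age_days",    # the product
            "agent_tenure", "agent_training_hours",  # the agent
            "issue_category",                        # the issue
            "queue_seconds", "handle_seconds"]       # the call
X = pd.get_dummies(calls[features])                  # encode categoricals
y = (calls["csat_score"] >= 4).astype(int)           # 1 = satisfied

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("holdout accuracy:", round(model.score(X_te, y_te), 3))
```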

It is important to understand that the relationship between a response variable (such as CSAT, NPS, etc.) and its explanatory variables (such as data on customer/agent/product/issue/call/etc.) is dynamic. So, the models need to be refitted, and the predictions need to be updated, continually (the periodicity of recalibration can be decided by the domain experts at the company). Another important matter is to focus on the explanatory variables under management control so they can be properly manipulated, where possible, to preempt an upcoming issue or to take advantage of an upcoming opportunity. By the way, we are not suggesting you exclude factors not under management control from your models. We are suggesting you put an additional emphasis on the factors that you can control so you can get as much insight as possible on how to best control them for your specific objectives.

The KPIs in the Resolution category, for this phone-based technical support environment, can be modeled and predicted similarly to those in the Satisfaction and Loyalty category. The Resolution KPIs are also response variables that are affected by the evolving dynamics among the customer, the agent, the product, the issue


and the call (and the channel, if there is one). Also, as is well known among the experienced practitioners, first contact resolution (“FCR”) significantly influences customer satisfaction – when FCR goes up, so does CSAT.

The KPIs in the Cost and Revenue category, for this phone tech-support example, may behave somewhat differently. Predictive analytics can predict cost per contact, for the future time horizons of interest, since cost per contact is also dependent on customer/agent/product/issue/call/(channel). A very useful application of predictive analytics in the revenue generation category – i.e., selling through the service channel – is to automatically generate an appropriate offer during the call itself. The predictive model produces the offer with the highest likelihood of customer acceptance; in short, the right offer to the right customer by the right agent at the right time. The right time is immediately upon resolution of the customer issue. It is imperative to remember that the customer called the company to resolve an issue she is having with the company's product. The first responsibility of customer service is to address the primary

reason for the call, and then leverage the goodwill generated (through effective problem resolution) by making an offer that the customer has a high likelihood to accept. Figure 1 explains how this predictive up-/cross-selling may work in the customer service environment.
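A minimal sketch of the scoring step is below. It assumes a previously trained acceptance-propensity model with a scikit-learn-style predict_proba method and a user-supplied feature encoder; both are stand-ins, not a description of any particular vendor's engine.

```python
# Score every candidate offer for this customer and return the one with the
# highest predicted acceptance probability (the "right offer, right time" step).
def best_offer(customer_features: dict, offers: list, propensity_model, encode):
    scored = []
    for offer in offers:
        row = encode({**customer_features, "offer_id": offer})  # feature vector
        p_accept = propensity_model.predict_proba([row])[0][1]  # P(accept)
        scored.append((p_accept, offer))
    return max(scored)[1]   # argmax over acceptance probability
```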

Another interesting place where predictive analytics is having a significant impact is in customer service for new product introductions. The IBM CEO Study of 2010 shows that most CEOs, across industries, are expecting an increasing portion of their future revenue to come from new offerings. As product lifecycles continue to shrink and innovations continue to proliferate, predicting what the contact centers can expect upon a new product launch is getting more and more difficult. Leading companies are using predictive analytics today

to predict incoming contact volumes, for the future time horizons of interest, so they can appropriately staff their contact centers in anticipation of the new product launches. This is not only in terms of the number of agents per shift/day/week per region, but also in terms of the qualifications of these agents.

CUSTOMER SERVICE IS evolving. The introduction of social media is already impacting how companies are approaching this key function. Social media have made possible what most of us already knew intuitively – good news travels slowly, but bad news travels instantly. Even though the false positives (i.e., flagging something that shouldn't be flagged) and the false negatives (i.e., not flagging something that

should be flagged) have slowed the adoption of some new technologies (such as speech analytics, sentiment analysis, etc.), these technologies are getting better.

Advancements in predictive analytics technologies – and the promise to synergistically include different data types (numbers, text, audio, video, etc.) with huge data volumes to make useful predictions – should change customer service as we know it. In fact, the change has already begun. ❙

Atanu Basu is the CEO and president of DataInfoCom, an analytics software company headquartered in Texas. Basu has more than 16 years of experience in the semiconductor and software industry. Dell and Microsoft are DataInfoCom’s reference customers. DataInfoCom has recently won an “emerging technology” investment award from the Texas Governor’s office. Basu can be reached at [email protected]. Tim Worth is the senior manager of Delivery Operations at DataInfoCom. Worth has 20 years of frontline service industry experience with Dell (16 years in contact center analytics), Sallie Mae and Aditya Birla Minacs. He can be reached at [email protected] and 512-635-9203.


Figure 1: How predictive up-/cross-selling may work in the customer service environment.


CONVERSATION AROUND business analytics is becoming a boardroom discussion. With the abundance of data, information and content that enterprises have today, enhanced focus is on the discipline of business analytics to enable better business decision-making across the enterprise. This article provides a unique perspective on how companies should create an enterprise-wide approach to deploy analytics decision-making, as well

as a summary of the journey that Hewlett Packard took in deploying analytics in a shared services model.

IN TERMS OF an enterprise-wide approach to deploying analytics, organizations pass through different levels of analytical maturity as shown in Figure 1, typically evolving from companies with limited capability of data-driven

decision-making to sophisticated analytical organizations. The enterprise-wide analytics approach helps to identify processes with low analytical maturity, where the organization can use common data, analytical tools and techniques to enable better, efficient and proactive decision-making.

The ability of the organization to bring together certain key enablers determines the analytics maturity of the enterprise. The key enablers are:


• People – Enhancing the depth of analytical skills, consultative skills and building a critical mass of business domain experts.

• Techniques – Deploying simple to use techniques and constructs of logical frameworks, mathematical modeling and visualization to analyze the business data and represent findings effectively.

• Tools – Deploying a standard set of scalable tools to carry out data management, reporting and data mining.

• Technology – Deploying an enterprise-wide data and technology platform, both in terms of hardware and software, to deliver high-quality analytics services in a cost-effective and real-time manner.

However, getting to an enterprise-wide analytics platform calls for a long-term strategy and requires considerable management commitment (as shown in Figure 2). The journey entails:
• Developing successful pilots,

establishing the initial talent pool/leadership and building credibility.

• Establishing standard processes and frameworks to deliver the output in a replicable and predictable manner.

• Scaling up the footprint across business units (BUs) and regions to attain sponsorship.

• Building innovative solutions/IP that become an asset to be leveraged across the enterprise.

Based on the HP experience of

building up the enterprise-wide analytics shared services model, this is a four- to five-year journey at minimum. In 2005, HP Global Business Services (GBS), which forms the shared services organization for HP, decided to set up a central group of analytics practitioners to drive business analytics across the enterprise. The group has since evolved significantly.

Figure 2


THE ANALYTICS SHARED services group within HP is called Decision Support and Analytics Services (DSAS). DSAS is a multi-functional analytics center based in India that drives data-driven decision-making across sales, marketing and supply chain functions by leveraging structured and unstructured data. This is done with a key focus to influence key business outcomes for HP in terms of revenue growth, risk minimization and cost reduction. The analytics team at HP has invested in top-notch talent comprising professionals with advanced degrees in quantitative disciplines and rich industry and functional experience.

At DSAS, the analytics practitioners apply sophisticated research techniques, data-mining and predictive modeling to

enable better decision-making. Over the last five years, this has led to the creation of pan-HP Analytics Centers of Excellence (COEs) (Figure 3). The COEs work on enterprise-wide data sources/structures, apply a common set of data-mining and modeling tools and leverage best practices to deliver actionable intelligence in a cost-effective and efficient manner.

Market Insights is one of the most mature COEs in DSAS, and the evolution of Market Insights in DSAS – from standard market share reporting to advanced competitive analysis to predicting market size for HP – illustrates the value of housing analytics in a shared services organization (Box 1).

AS IN ANY OTHER SETTING, innovation in a shared service model requires a combination of skills, organizational commitment, planning and investment. In DSAS, innovation is driven through a multi-fold approach.
• Innovation bubbles up organically

through the domain expertise and deep understanding of the data that the team works with.

• Complementary partnerships with internal innovation hubs in HP such as HP Labs & SPaM (Strategic Planning and Modeling).

• Targeted investment in areas where DSAS is best positioned in the organization to generate business value.

These efforts have resulted in

recognition from both internal HP innovation forums and external competitions. Recently, a DSAS project on Supply Chain Stock Outs Prediction won the Wharton Innovation Award. DSAS projects have also been recognized in HP-internal events such as Tech Con and the HP Circle Awards.

WITH INCREASING MATURITY of DSAS' analytics portfolio and enhanced focus on innovation, HP businesses today see DSAS as an integral partner in achieving their core business objectives – be it boosting revenue growth, optimizing costs or mitigating risks. DSAS' engagement with HP's Direct-to-Consumer Business (hpdirect.com) is a good example of how analytics is seen as a critical driver for transformational impact to the business.

Driving efficiency and effectiveness in market intelligence

In 2005, Market Share & Sizing reporting and analysis in the enterprise business (EB) of HP was delivered locally in disparate formats with significant differences in quality and timeliness. With internal and external benchmarking, DSAS identified best-in-class reporting and analysis methods and redesigned the market intelligence process to a standardized, central model with significant automation, improved accuracy and data transparency. Using tools such as SAS, Access and Excel, DSAS established a standard data cleansing, data validation and data organization process for different market data sources (such as IDC trackers) to make it consistent with HP's view of the market. A set of key design principles around the taxonomy, presentation, competitive and regional views was

also identified for all reports. A clear set of design principles and an organized database also helped automate the report generation process.

All this led to productivity gains of ~80 percent. The success in transforming market intelligence for EB led to recognition from other business units as well. Today, Market Share & Sizing reports are centrally delivered for all three business groups in HP. The team has also invested in developing desktop apps and intuitive tools that provide self-service capabilities to the broader market intelligence community. Over time, the team has been able to deliver similar benefits to the competitive intelligence and primary market research processes. More time has been freed up for the marketing teams for investment in higher-impact marketing activities.



IN SUMMARY, investing in a central analytics group like DSAS to deliver enterprise-wide analytics has yielded rich dividends for HP. These include:
• Economies of scale from

multifunctional analytical services that help drive an end-to-end solution for the enterprise.

• Economies of location – Centralization of the talent pool in a few centers drives best practices and career development.

• Economies of skill – Getting the right analytical talent, equipping them with the right tools and building intra-organization knowledge networks has enabled creation of centers of expertise.

• Economies of process through process standardization and simplification, common enabling tools and technologies, and continuous process improvement.


Analytics-based tool to predict stock outs in supply chains

In early 2009, HP's channel partners in the Europe region for top value product lines such as laptops, desktops and handhelds were faced with higher stock outs. It was unclear why stock outs had increased from 1 percent to about 5 to 6 percent. What this meant for HP was not only a significant loss of revenue, but also a loss of customer goodwill and credibility. DSAS was called upon to investigate the causes for these stock outs and to develop a tool to predict future stock outs.

The problem was solved in two phases. In the first phase of the project, the team investigated root causes for stock outs and identified the

correct variables that would help generate signals around stock outs. In the second phase, an early detection tool – "Signature" – was developed, which would help predict a stock out in the future so that action could be taken to prevent it.

For each product, the tool calculates a "stock-out parameter." Based on whether the stock-out parameter exceeds different threshold levels (thresholds T1 and T2), the tool identifies "high," "medium" and "low" probability of stock outs. Within three weeks of implementation, the stock-out level decreased from approximately 5 percent (pre-tool implementation) to about 3 percent.
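The article does not disclose how the stock-out parameter is computed, so the sketch below only illustrates the thresholding step it describes; the threshold values and product names are made up.

```python
# Map each product's stock-out parameter to a high/medium/low risk flag
# using two thresholds (T1 > T2), mirroring the logic described above.
def stockout_risk(parameter: float, t1: float = 0.6, t2: float = 0.3) -> str:
    if parameter >= t1:
        return "high"
    if parameter >= t2:
        return "medium"
    return "low"

parameters = {"laptop_A": 0.72, "desktop_B": 0.41, "handheld_C": 0.12}
print({sku: stockout_risk(p) for sku, p in parameters.items()})
# {'laptop_A': 'high', 'desktop_B': 'medium', 'handheld_C': 'low'}
```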



• Business impact – From predictive analytics and close collaboration with businesses and other niche innovation hubs, the organization was credited with millions of dollars in direct and indirect business impact.

As data explodes within and

outside the enterprise, business impact and innovation targets will only

increase for organizations. Companies that invest in a central analytics organization to elevate the analytical maturity are well positioned today to compete effectively in the marketplace. ❙

Sanjay Singh is vice president, HP Global Business Services. Prithvijit Roy is director, HP Global Business Services. Arnab Chakraborty ([email protected]) is an analytics service delivery leader, HP Global Business Services.

Analytics for the U.S. direct-to-consumer store

The online consumer segment presents a multibillion-dollar market opportunity for HP. In 2008, HP Direct engaged with GBS-DSAS to realize the objective of gaining share in this highly competitive and growing market.

From developing a standard performance measurement framework that helps manage daily operations and marketing campaigns that drive traffic/sales, to building sophisticated predictive models that help plan and meet financial commitments, GBS-DSAS is becoming critical to HP Direct's success in the marketplace.

The end-to-end analytics support spans the planning, demand generation, operations and category management business functions. DSAS, with its deep knowledge of customer and business performance data and the application of analytics/computing

tools such as SAS, R and Omniture, provides:

• An enhanced understanding of customer behavior on the Web site through Web analytics – click-stream analysis, cart abandonment analysis, segmentation, etc.

• Recommendations on which marketing levers work (price, promotion, placement) and what needs to be tested across the main marketing channels – online Web store and call centers.

• Measurement of impact on performance metrics through statistical testing (currently done) and experimental design techniques (work in progress).

• Predictive models to improve demand generation forecast accuracy, which helps the business improve the accuracy of financial forecasts and adherence to them.



ON MAY 7, the most-watched stock market measurement in the world – the Dow Jones Industrial Average – plunged

nearly 1,000 points (almost 10 percent) in a matter of minutes. The size and speed of the drop shocked the financial world, but what was truly shocking was that weeks later, the best reason any Wall Street wonk could offer to explain the breathtaking roller-coaster ride (the Dow recovered most of what it lost within a half hour) was

a “trading glitch” that triggered massive, computerized sell orders.

Untold billions of dollars in paper wealth disappear in a matter of minutes because someone hit the wrong computer key? Are you kidding me? As one blogger put it, "If this was really primarily caused by a 'computer glitch,' how are investors supposed to have any confidence at all in the market? After all, if a computer error can wipe out half your account in less than an hour, why invest at all?"

Good question.

The May 7 market meltdown served

as yet another reminder that financial markets, especially in these difficult economic times, must be beyond reproach when it comes to security, accuracy and efficiency. Serendipitously and just a few

weeks earlier, the Institute for Operations Research and the Management Sciences (INFORMS) bestowed its most prestigious prize, the Franz Edelman Award, on Indeval for its innovative use of operations research to help strengthen the banking and securities exchange system

For videos of the 2010 Edelman Awards Gala and presentation, see: http://meetings2.informs.org/Practice2010/wrapup.html

The Edelman Award-winning team representing Indeval celebrates on stage at the awards gala.


in Mexico – securely, accurately and efficiently.

Indeval, the Mexican Central Securities Depository for financial securities, manages Dali, the securities settlement system (SSS) that settles securities operations and exchanges that average more than $250 billion a day. To put that figure in perspective, every five working days Dali settles in trades the equivalent of Mexico's annual gross domestic product. With operations research (O.R.) as its core engine, Dali settles its massive volume of trades securely, accurately and efficiently in near real-time (one or two minutes), a record unmatched by any other exchange in the world.

INDEVAL AND ITS PARTNERS, the Mexican Central Bank and Instituto Tecnológico Autónomo de México (ITAM), began implementing Dali in 2005 with two key requirements for the new system: 1. The SSS had to function with minimal financial reserves, which is invaluable during financial crises when liquidity is scarce; and 2. The SSS had to employ

near real-time settlement capability to meet ever-increasing demands for intraday liquidity and risk management tools. With these premises, communication protocols, business rules and best market practices were revised, and a new settlement solution was created.

The project not only met its primary goals, but the near real-time settlement capability also saved Indeval's clients/stakeholders (domestic and international financial institutions, the Mexican Central Bank and various government ministries) $150 million collectively a year by significantly reducing daily liquidity requirements (and thus intraday loans and accompanying interest payments). The unique application of O.R., combined with its dramatic impact and potential on a national and international scale, made Indeval the first Mexican organization and the first financial organization to win the Edelman in its 38-year history.

Former INFORMS President Tom Cook, who headed the famed Sabre group at American Airlines during its rise to legendary analytical status during the 1980s and 1990s, served as master of ceremonies for the Oscar-like Edelman Awards Gala held in conjunction with the INFORMS Practice Meeting in Orlando, Fla. INFORMS President Susan Albin presented the Edelman Award to Indeval CEO Hector Anaya.


“I don’t know what to say. I haven’t practiced this part,” quipped Anaya, who led Indeval’s well-rehearsed presentation during the competition earlier in the day. He was joined on stage by a large contin-gent from industry, government and aca-demia that comprised the prize-winning presentation team.

“To tell you the truth, what I want to say is ‘thank you’ to my team,” Anaya contin-ued. “They … established a commitment to work as a team. We spend a lot of our lives at work and they are my friends. ... I

Indeval CEO Hector Anaya accepts the Edelman Award.

I shouldn't be here – they should be here. Thank you very much."

AFTER THE CEREMONY, Anaya admitted he had never heard of the term "operations research" before his company entered the Edelman competition. So what does he think of O.R. now?

“I think it’s magic,” he said. “It’s magic to settle billions of dollars of trades in a minute. Our settlement system, with O.R. as its core engine, queries more than 22,000 database records and sends mes-sages to all of our clients confirming the settlement in one or two minutes. In my opinion, that’s magic.”

The dramatic award announcement – greeted by a standing ovation from the capacity crowd of conference attendees – capped a daylong competition in which six finalists from around the world (see accompanying story) made a series of presentations before a panel of judges. In his opening remarks introducing Indeval's Edelman presentation to the judges, Anaya described the work as a "major development that will profoundly change the way our business is conducted."

Anaya added that Indeval was "very proud" that the Dali securities settlement system was "developed from scratch using O.R. techniques," that it was capable

of “incorporating the entire Mexican finan-cial sector,” that Dali “efficiently settles $250 billion in transactions on average each day,” and that it was “implemented on budget and on schedule.”

Describing the genesis of the initiative, Anaya noted that the desire of Indeval's Board of Directors to adopt international best practices dovetailed closely with the vision of the Mexican Central Bank, which resulted in a close partnership between the two entities. Together, they recognized the need for "an additional

partner that would bring the technical and O.R. expertise required for the areas of cross-engineering, systems design, optimization and simulation." Enter Instituto Tecnológico Autónomo de México (ITAM) – and operations research – thus creating, as Anaya put it, "a great team of industry, government and academia working together … with a common goal: to deliver a world-class, state-of-the-art securities settlement system."

The Edelman judges were obviously impressed.


“They accomplished something in Mexico that apparently hasn’t been done anywhere else,” said Edelman judge Cindy Barnhart, a professor at MIT and a past president of INFORMS. “That impressed us, and they did it through the use of operations research.”

“They provided something new, which is ‘netting’ buyers and sellers every two minutes in near real time instead of at the end of the day,” added Andres Weintraub of the University of Chile, another Edel-man judge. “That takes away a lot of the risk in the operation. It’s an excellent inte-gration of IT and operations research.”

Like the other judges, former Edelman winner Grace Lin was impressed by the innovation, uniqueness and impact of Indeval's SSS: "They showcased the use of O.R. in an area [financial] that hasn't been widely represented in the Edelman competition," she said. "The integration of O.R. with IT can really help reduce risk because they have real-time settlement capabilities. If you don't settle right away and have to wait until late in the afternoon or even after hours, there's potential to create big problems."

THE KEY TO Dali's near real-time settlement capabilities is an operations research-based engine that is executed automatically and offers a continuous and secure operation, ensuring that settlement is irreversible. This single clearing and settlement engine incorporates a linear programming model that chooses which pending operations can be settled with the depositors' available balances, maximizing the value of the transactions settled.

Thanks to the model, many transactions that would remain pending if they were processed individually are settled together, thus reducing liquidity requirements dramatically – by 52 percent in cash and 26 percent in securities. The most important benefit of the implementation of Indeval's new settlement system is the enhancement and strengthening of the Mexican financial infrastructure.
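To give a flavor of the underlying optimization, here is a deliberately toy-sized sketch of the settlement idea: choose a subset of pending trades that maximizes settled value while keeping every participant's cash and security positions feasible. It uses binary selection variables (a mixed-integer variant of the linear program described above) and invented balances; it is not Indeval's actual formulation.

```python
# Toy batch-settlement model: maximize the value of settled trades subject to
# cash and securities feasibility for every participant. Data are invented.
import pulp

trades = [  # (trade_id, buyer, seller, security, quantity, cash_value)
    ("t1", "A", "B", "BOND1", 10, 100.0),
    ("t2", "B", "C", "BOND1", 10,  99.0),
    ("t3", "C", "A", "BOND2",  5,  50.0),
]
cash = {"A": 120.0, "B": 20.0, "C": 60.0}            # available cash
holdings = {("B", "BOND1"): 10, ("C", "BOND2"): 5}   # available securities
securities = {sec for _, _, _, sec, _, _ in trades}

prob = pulp.LpProblem("settlement_batch", pulp.LpMaximize)
x = {tid: pulp.LpVariable(tid, cat="Binary") for tid, *_ in trades}

# Objective: total cash value of the trades selected for settlement.
prob += pulp.lpSum(x[tid] * val for tid, _, _, _, _, val in trades)

for p in cash:
    # Net cash paid by p (purchases minus sales) cannot exceed p's cash.
    prob += (pulp.lpSum(x[tid] * val for tid, b, s, _, _, val in trades if b == p)
             - pulp.lpSum(x[tid] * val for tid, b, s, _, _, val in trades if s == p)
             <= cash[p])
    # Net quantity of each security delivered by p cannot exceed p's holdings.
    for sec in securities:
        prob += (pulp.lpSum(x[tid] * q for tid, b, s, sc, q, _ in trades
                            if s == p and sc == sec)
                 - pulp.lpSum(x[tid] * q for tid, b, s, sc, q, _ in trades
                              if b == p and sc == sec)
                 <= holdings.get((p, sec), 0))

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({tid: int(var.value()) for tid, var in x.items()})   # 1 = settled
```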

A major challenge for implementing a new solution was to get the support of the whole industry and persuade more than 100 direct market players (including HSBC, Bank of America and Citibank) to change their operating procedures for handling transactions.

Usually, the handling of financial transactions relies on data management and information technology approaches, which do not take into account critical


factors such as the order in which settlement instructions are executed. Considering this, the Indeval team applied operations research techniques that proved to be critical for the success of the project.

The team used business process modeling to redesign all business

processes; designed and tested a new rule-based SSS to improve the quality, reliability and safety of the service; and developed a simulation model to evaluate the performance of the new SSS in liquidity usage and settlement time.

While the enhancement and strengthening of the Mexican financial infrastructure was the most important benefit of the implementation of O.R. in Indeval’s SSS, other benefits include a safe mechanism to transfer money from the Central Bank to meet obligations

in other payment systems and an effective tool for depositors to close their risk positions swiftly for better risk management.

Edelman finalists boast international flavor

Including Indeval, this year's Edelman competition drew an international list of competitors working on a wide variety of projects. The other five finalists and their Edelman presentations included:

The Delaware River Basin Commission: "Improving Water Release Policies on the Delaware River through Operations Research"
Conflict between stakeholders drove the push to begin an O.R. program for the Delaware River Basin Commission. Although California has developed a reputation as a state where different interests play a figurative tug-of-war over water rights, New York also has groups at odds over access to and control of water. "Breaking the Deadlock: Improving Water Release Policies on the Delaware River with Operations Research" was commissioned to resolve claims among three groups: New York City, whose government seeks high water levels at its reservoir in the area to prevent rationing in the metropolitan area; the residents of the Delaware River basin, who have experienced flooding; and environmentalists, who are concerned for the Delaware River fish. Operations research pointed the way to a resolution that had been decades in coming.

Deutsche Post DHL: "Managing Global Brand Investments at DHL"
Going from government monopoly to the private economy proved to be a branding challenge for Deutsche Post DHL. In "Managing Global Brand Investments at DHL," the company developed brand recognition in the private sector for what was once Germany's government postal service. Using a global brand assessment tool, the company segmented potential customers into six categories, ranging from those with simple brand awareness to those using DHL as their sole provider of package delivery service. With the insights from this research and the subsequent marketing rollout, DHL last year climbed into the BrandZ Top 100 ranking of the most valuable global brands.

New Brunswick (Canada) Department of Transportation: "Achieving Transportation Asset Management via Operations Research"
The New Brunswick Department of Transportation (NBDoT), like the state of New York, also used O.R. to reconcile public policy disputes. A couple of years ago, U.S. transportation officials were alarmed when a bridge in Minneapolis collapsed, killing drivers and slowing the daily commute of thousands of people. In "Taking the Politics Out of Paving: Achieving Transportation Asset Management Excellence via Operations Research," the NBDoT wrestled with issues of public safety and convenience. With more than 11,000 miles of highway and roads, 2,900 bridges and numerous ferry crossings, the province's officials had to mount a powerful case if they were to secure substantial funding from Ottawa to keep the province's infrastructure in good repair. Using O.R. tools, they reconciled their needs with the Canadian national government.

Procter & Gamble: "Inventory Optimization at Procter & Gamble: Achieving Real Benefits Through User Adoption of Inventory Tools"
Making it to the top 20 on the Forbes 500 list was no accident for Procter & Gamble, which handles tough business decisions with the help of a long-standing quantitative analysis department. In "Inventory Optimization at Procter & Gamble: Achieving Real Benefits Through User Adoption of Inventory Tools," the team builds upon its insightful distribution requirements planning systems of the 1980s. The goal was making inventory even leaner than in the past. The team began applying multi-echelon technology from Optiant to its beauty care supply unit. As a result, P&G resolved tactical and strategic problems, reducing inventories and, in some cases, delivering cost reductions of as much as 25 percent.

Sasol: "Innovative Decision Support in a Petrochemical Production Environment"
Achieving corporate karma in the face of declining fuel reserves, stricter clean fuel regulations and a global recession proved quite a trick for Sasol, a South African energy and chemical company. In "Innovative Decision Support in a Petrochemical Production Environment," the Sasol team moved away from inexact industry measures through improved O.R. planning tools. Recognizing the shortsightedness of the prevailing industry approach, the Sasol team developed stochastic simulation models to address the variability in its production environment. The stochastic simulation models now guide gas and liquid production facility changes, responses to changing market needs, product composition, operational efficiency, operating philosophies, schedules and proposed projects.

Technical papers from all six Edelman finalists describing their work in detail will appear in an upcoming issue of Interfaces.

Perhaps the best testimony to Indeval's achievement with Dali came from Donald Donahue, chairman and CEO of The Depository Trust & Clearing Corporation (Indeval's U.S. counterpart), who said the following via video as part of Indeval's presentation:

"The severe global financial turmoil of the past few years has made clear how very critical the capabilities of the financial system's operational structure can be. Being sure that you can properly clear and settle securities trades and the related real-time payments, along with efficiently accounting for and servicing the underlying securities assets, is an absolutely essential part of ensuring stability and integrity in the global financial markets. The Dali implementation recently successfully completed by SD Indeval is an outstanding example of a financial infrastructure organization's success in meeting these challenges. Indeval's achievement is especially impressive since it involved a market-wide implementation encompassing Mexico's domestic and global equities, government and corporate bonds and money market instruments. … As Indeval's U.S. equivalent … we appreciate very well just how much an achievement this represents." ❙

Peter Horner ([email protected]) is the editor of OR/MS Today and Analytics magazines. Barry List ([email protected]), the director of communications for INFORMS, contributed to this article.

The 'Super Bowl of O.R.'

Named in honor of a pioneer of operations research practice at the RCA Corporation, the Franz Edelman Award for Achievement in Operations Research is considered the "Super Bowl of O.R." because it honors the best applications of operations research in the world. The nearly eight-month competition begins with a call for nominees in the fall. The nominees are asked to provide a two-page summary of a practical application of O.R. that has had a significant, positive impact on the company's operations and bottom line. A team of verifiers is then sent forth to make on-site visits to learn more about the nominated work and verify claims made in the application process.

The field of nominees is gradually narrowed down until six finalists are invited to present their cases before a panel of judges at the INFORMS Practice Conference in the spring.

2010 Edelman Committee Chair Srinivas Bollapragada of General Electric served as one of the judges, along with Layek Abdel-Malek of the New Jersey Institute of Technology, Cynthia Barnhart of MIT, Tony Brigandi of AT&T Labs, Ananth Iyer of Purdue University, Grace Lin of World Resource Optimization, R. John Milne of IBM, Doug Samuelson of InfoLogix Inc., Donald (Bob) Smith of Monmouth University, Michael Trick of Carnegie Mellon University and Andres Weintraub of the University of Chile.


UNLIKE THE EDELMAN AWARD, which each year honors a specific example of operations research practice, the INFORMS Prize salutes organizations for "sustained integration of operations research." The INFORMS Prize Committee looks for a variety of applications of O.R. in a single organization that provides the organization with a competitive advantage through high-impact work. The committee is particularly impressed with organizations that "repeatedly apply O.R. in pioneering, varied, novel and lasting ways."

In awarding the 2010 INFORMS Prize to Jeppesen (a Boeing company), INFORMS took note of the world-class capabilities and contributions of the Jeppesen operations research department. Committee Chair Jeff Camm presented the 2010 INFORMS Prize at the awards gala held in conjunction with the INFORMS Practice Conference in Orlando, Fla.

"Jeppesen uses O.R. to address both the strategic and tactical problems we encounter on a continual basis," says Mark Van Tine, Jeppesen president and CEO, who was on hand to accept the award along with Stefan Karish and Marilyn Aragon. "INFORMS has bestowed a significant honor upon us. It independently confirms the impact O.R. has at Jeppesen, while highlighting our commitment to leveraging O.R. for the benefit of our customers and the success of our company. We simply could not do much of the highly sophisticated work we do without this fundamental capability."

Jeppesen helps pilots, mariners and others get safely and efficiently to their destinations by providing them with mission-critical information, including flight manuals that contain navigational and other safety-of-flight information. Worldwide, more than 650 airlines, 1 million pilots, 6,500 commercial ships, 1.5 million boaters and some of the largest railroad operators in Europe rely on Jeppesen.

INFORMS first took notice of Jeppesen in 2000, when judges for the Institute's Franz Edelman Award competition gave the prize to Jeppesen after a turning point in the company's history. In 1997, Jeppesen saw its service deteriorate when a growing line of more than 100,000 aviation charts overwhelmed its production system. The company responded by establishing a small O.R. group to analyze its production problems. In just two years, Jeppesen was able to eliminate late delivery of product orders, reducing costs by nearly $3 million annually.

Since then, the expanding O.R. department has consistently applied a wide variety of O.R. techniques throughout the enterprise to improve operations and to support decision-making related to resource utilization, inventory optimization, capital investments, market strategy, product development and pricing.

Today, more than 75 professionals with an O.R. background are helping Jeppesen apply analytics to make better decisions, build better products and offer better services.

Past recipients of the INFORMS Prize include Intel, UPS, HP, IBM, Procter & Gamble and GE Research.

INFORMS Prize Chair Jeff Camm (far left) congratulates Stefan Karish, Marilyn Aragon and Mark Van Tine of Jeppesen.


Daniel H. Wagner Associates is a consulting firm that develops mathematical models and software implementations of those models to aid a wide range of clients in solving challenging operational problems. The Department of Defense is at the top of the firm's client list. Other important client sectors include the financial, health, transportation, and oil and gas industries.

Headquartered in Malvern, Pa., the firm has branch offices in Hampton and Vienna, Va. Wagner Associates is an employee-owned company. Approximately half of the technical staff holds Ph.D.s in the mathematical sciences. The employees are the single greatest asset of Wagner Associates, and the company is structured to provide maximum benefits to the staff. Technical and career growth is encouraged and supported both conceptually and financially. In addition to tuition assistance, the company provides professional leave of up to seven days a year to encourage professional activities, such as writing journal articles, participating in professional societies, refereeing papers and attending conferences.

The firm was founded by Daniel H. Wagner in 1963 with the corporate goal of combining the power of mathematical theory with operational experience to address the increasingly complex problems encountered in naval operational analysis. Dan Wagner was a pioneer in naval operational analysis and brought an innovative philosophy to his fledgling company: hire the best mathematical talent possible and let them learn the applied side on the job. The company continues this tradition to this day, with proven success. In honor of his significant accomplishments and contributions to the field of operations research, CPMS: The Practice Section of the Institute for Operations Research and the Management Sciences (INFORMS) offers the Wagner Prize for Excellence in Operations Research Practice.

A major strength of the company is in the area of search theory, the optimal allocation of search effort and resources when attempting to locate or detect an object. Over the years, Wagner Associates has advanced this field in both the theoretical and the applied realm. Famous examples of the application of search theory involving Wagner Associates' participation include: the 1966 search for an H-bomb lost by the U.S. Air Force near Palomares, Spain; the 1968 search for the sunken nuclear attack submarine USS Scorpion (SSN-589); the search and recovery operation after the space shuttle Challenger accident; and the search for the SS Central America, an 1857 treasure ship that sank off the Carolinas in a hurricane and whose discovery returned more than $400 million in gold.

Of course, there are other important applications of search theory. For example, Wagner Associates used search theory to develop the first computer-assisted search planning tool (CASP), which was used by the U.S. Coast Guard in planning and conducting search and rescue (SAR) efforts. One interesting feature of SAR is that in some operations the goal becomes minimizing the time to locate the object (e.g., a man overboard in frigid waters) rather than maximizing the probability of finding it.
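To make the idea concrete, here is a minimal, textbook-style sketch of Bayesian search allocation in Python. It is a generic illustration of the principles described above, not a Wagner Associates tool: the cells, prior probabilities and detection probabilities are invented for the example. Each look goes to the cell with the highest immediate chance of detection, and each empty-handed look shifts belief toward the unsearched cells.

# A textbook-style sketch of Bayesian search allocation (illustrative only).
# Assumptions: three cells, a prior on where the lost object is, and a
# per-look detection probability if we search the correct cell.

prior = {"A": 0.5, "B": 0.3, "C": 0.2}        # P(object is in cell)
p_detect = {"A": 0.4, "B": 0.8, "C": 0.9}     # P(detect | object present, one look)

def next_cell(belief):
    """Greedy rule: search the cell with the highest chance of detection now."""
    return max(belief, key=lambda c: belief[c] * p_detect[c])

def update_after_miss(belief, searched):
    """Bayes update when a look at `searched` finds nothing."""
    miss = 1.0 - p_detect[searched]
    unnorm = {c: (p * miss if c == searched else p) for c, p in belief.items()}
    total = sum(unnorm.values())
    return {c: p / total for c, p in unnorm.items()}

belief = dict(prior)
for look in range(1, 6):
    cell = next_cell(belief)
    rounded = {c: round(p, 3) for c, p in belief.items()}
    print(f"Look {look}: search cell {cell}; current belief {rounded}")
    belief = update_after_miss(belief, cell)   # assume the look came up empty

A time-critical SAR case such as the man-overboard example would keep the same posterior update but change the objective from maximizing detection probability to minimizing expected time to detection.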

In the military arena, search theory comes into play when attempting to determine, or maintain knowledge of, enemy location and status. This applies whether one is trying to optimally schedule the use of radar energy to detect incoming ballistic missiles as early as possible or placing sonobuoys in the water to assure that an enemy submarine cannot get within torpedo range of a carrier battle group undetected. When the object under search is actively seeking to avoid detection, a game-theoretic approach may provide the best solution. Wagner Associates developed a genetic-algorithm-based tool, the Operational Route Planner (ORP), that is used in the U.S. Navy Undersea Warfare Decision Support System (USW-DSS) to plan search routes for anti-submarine warfare (ASW) and takes the target's reaction to search operations into account.

COMPANY NAME: DANIEL H. WAGNER ASSOCIATES, INC.

Headquarters: Malvern, Pa.

Business: Consulting and software development firm that applies mathematics (operations research) and computer science to a wide range of operational problems.

Web site: www.wagner.com

Data fusion is another area of research related to search theory. Humans are easily able to integrate their own organic sensory information in order to obtain an accurate picture of the world around them. Automatically fusing the data from operational sensing systems (radar, passive and active sonar, cameras, seismic sensors, etc.) to achieve situational or tactical awareness of the surroundings poses a difficult challenge even with today's powerful computers. Wagner Associates has been involved in the tracking of military targets since its inception. Even the simple problem of tracking a target with radar falls under the heading of data fusion. One must correlate the radar detections from one scan of data to the next, and then extract from the set of correlated detections as much information about the target state as possible. The Kalman filter is the classic method for estimating the kinematic state from sensor data. Additional knowledge can be inferred from the observed trajectory over time (e.g., civilian airliners don't make 3-G turns).
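As a concrete illustration of that predict/update cycle, here is a minimal one-dimensional constant-velocity Kalman filter in Python. It is a generic textbook sketch, not code from any Wagner system; the motion model, noise levels and simulated measurements are all assumed for the example.

# Minimal 1-D constant-velocity Kalman filter (illustrative only).
# State x = [position, velocity]; we observe noisy position once per time step.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition (constant velocity)
H = np.array([[1.0, 0.0]])                # we measure position only
Q = 0.01 * np.eye(2)                      # assumed process noise
R = np.array([[4.0]])                     # assumed measurement noise variance

x = np.array([[0.0], [1.0]])              # initial state estimate
P = 10.0 * np.eye(2)                      # initial uncertainty

def kalman_step(x, P, z):
    # Predict forward one step
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with measurement z
    y = z - H @ x_pred                    # innovation
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

rng = np.random.default_rng(0)
for t in range(1, 11):
    true_pos = 1.0 * t                    # the target really moves 1 unit per step
    z = np.array([[true_pos + rng.normal(0.0, 2.0)]])
    x, P = kalman_step(x, P, z)
    print(f"t={t:2d}  measured={z[0, 0]:6.2f}  est. pos={x[0, 0]:6.2f}  est. vel={x[1, 0]:5.2f}")

The "airliners don't make 3-G turns" point corresponds to constraining or augmenting the motion model F and process noise Q with knowledge of plausible target behavior.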

When multiple heterogeneous sensors need to be fused, and when the goal is to infer from the data a higher level of knowledge (relationships, intent), data fusion begins to intersect with a wealth of mathematical fields. Probability and stochastic processes play a fundamental role in the basic kinematic estimation. For example, Bayesian statistics, often implemented as Bayesian networks, can be used to estimate target classification/identity, and graph theory can be used for associating data across multiple sensor frames. In addition, a variety of new analytical tools can also come into play: neural networks, fuzzy logic, evidential reasoning, support vector machines, etc.
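A small hedged sketch of the classification side of that problem: a simple Bayesian update of a target-identity belief as independent sensor reports arrive. The classes, priors and likelihoods below are invented placeholders, not values from any fielded system.

# Bayesian identity estimation from independent sensor reports (illustrative only).
priors = {"fishing vessel": 0.6, "merchant": 0.3, "submarine": 0.1}

# Assumed P(report | true class) for two hypothetical report types
likelihood = {
    "low_speed":          {"fishing vessel": 0.7, "merchant": 0.2, "submarine": 0.5},
    "no_radar_emissions": {"fishing vessel": 0.3, "merchant": 0.1, "submarine": 0.9},
}

def bayes_update(belief, report):
    """Multiply the prior by the report likelihood and renormalize."""
    unnorm = {c: belief[c] * likelihood[report][c] for c in belief}
    total = sum(unnorm.values())
    return {c: p / total for c, p in unnorm.items()}

belief = dict(priors)
for report in ["low_speed", "no_radar_emissions"]:
    belief = bayes_update(belief, report)
    print(report, {c: round(p, 3) for c, p in belief.items()})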

Although much of the work performed at Wagner Associates is basic research, many of our projects lead to operational systems that prove successful in the field. Oftentimes, theoretical advances made in a number of projects over a period of years combine to produce a single, highly advanced software product or module. A perfect example of this process is the acoustic mission planner (AMP), developed for the Navy's MH-60R Seahawk Multi-Mission Helicopter under a multimillion-dollar subcontract to Lockheed Martin Systems Integration-Owego.

One of the primary missions for the MH-60R is to locate submarines, and it uses both passive and active sonobuoys and a dipping airborne low frequency sonar (ALFS) to accomplish this task. AMP assists the MH-60R crew in planning its anti-submarine missions, recommending ALFS dipping sonar times and locations and optimal passive and active sonobuoy patterns. AMP is embedded in the MH-60R avionics software and also in the shipboard mission planning station (MPS).

There are two keys to AMP's high level of performance. The first is the use of Wagner's non-Gaussian tracking engine (NGTE) to provide the best possible probabilistic estimate of the submarine's location. The NGTE is a non-Gaussian tracker that uses Monte Carlo target motion models and Bayesian statistical models to generate a space-time target probability distribution that is updated in real time for both "positive" contact reports and "negative" search information from non-detection of the submarine. NGTE also uses estimates of target tactics and the presence of obstacles (such as land in the case of locating a submarine) to accurately project target location into the future based on the fusion of all available data. The second key is a search optimization algorithm that takes the target location information generated by NGTE, combines this data with in-situ sensor performance estimates, and then optimizes the employment of the dipping sonar and passive and active sonobuoys using a global optimization scheme based on Brown's algorithm, along with a local heuristic for flight path selection.
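The following toy sketch illustrates the general Monte Carlo idea behind such a tracker: candidate target positions are carried as weighted particles, and a sensor look that reports no detection down-weights particles near that sensor. It is not the NGTE; the search box, motion model and detection curve are assumptions made purely for the example.

# Toy "particle" tracker with negative-information updates (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
N = 5000
particles = rng.uniform(0.0, 20.0, size=(N, 2))   # assumed 20 x 20 nm search box
weights = np.full(N, 1.0 / N)

def motion_update(p, dt=1.0, speed_sigma=0.5):
    """Assumed random-walk target motion model."""
    return p + rng.normal(0.0, speed_sigma * dt, size=p.shape)

def p_detect(dist, sweep=3.0):
    """Assumed detection curve: high close to the sensor, falling off with range."""
    return np.exp(-(dist / sweep) ** 2)

def negative_update(p, w, sensor_xy):
    """Bayes update for 'searched here, saw nothing'."""
    dist = np.linalg.norm(p - np.asarray(sensor_xy), axis=1)
    w = w * (1.0 - p_detect(dist))
    return w / w.sum()

for t, buoy in enumerate([(5.0, 5.0), (10.0, 8.0), (14.0, 12.0)], start=1):
    particles = motion_update(particles)
    weights = negative_update(particles, weights, buoy)
    mean = (weights[:, None] * particles).sum(axis=0)
    print(f"after look {t} at {buoy}: posterior mean position ~ {mean.round(2)}")

A search optimizer in the spirit of the second key would then place the next sensor where this evolving probability map, combined with predicted sensor performance, gives the largest expected payoff.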

In operational use, the AMP optimizer computes a complete route with sensor locations, depths and operating modes for the helicopter at the beginning of the mission. Each deployment of the dipping sonar, or of an expendable bathythermograph (XBT), returns environmental data that is used to improve the accuracy of the sensor performance estimate. Based on this new data, the embedded system reruns the optimization algorithm, improving overall mission performance in the latter portion of the search.

One of the recurring surprises of mathematics is how seemingly unrelated problems have mathematically related solutions. For example, the same class of stochastic differential equations used to model the physical motion of vehicles for the DoD can be used to model the "motion" of prices of various financial instruments. Wagner Associates uses these and other mathematical finance methods to develop trading systems that exploit statistical arbitrage opportunities for the benefit of our client investment firms.
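A small sketch of that "same equations, different domain" point: the code below simulates a geometric Brownian motion, a standard stochastic differential equation for asset prices and a close cousin of the diffusion models used for physical motion. The drift, volatility and horizon are arbitrary illustration values, not parameters from any trading system.

# Simulate one year of a geometric Brownian motion price path (illustrative only).
# dS = mu*S*dt + sigma*S*dW, simulated exactly via its log-normal solution.
import numpy as np

rng = np.random.default_rng(42)
S0, mu, sigma = 100.0, 0.05, 0.20     # assumed initial price, drift, volatility
T, steps = 1.0, 252                   # one year of daily steps
dt = T / steps

z = rng.standard_normal(steps)
log_returns = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
path = S0 * np.exp(np.cumsum(log_returns))

print(f"simulated year-end price: {path[-1]:.2f}")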

Out of our long history of finance consulting experiences, Wagner Associates has developed numerous computer models for quantifying financial risk as well as stand-alone mathematical finance software products. The Retirement Spending Planner (RSP) tool uses Monte Carlo techniques (similar to those used in NGTE) to analyze and recommend retirement planning strategies for individuals. A classic example of Sam Savage's "Flaw of Averages" is the retirement plan that on average results in a comfortable lifestyle until one's presumed demise at age 95, but that has a 25 percent chance of going broke before age 85. RSP, used by both individuals and certified financial planners, permits a probabilistic analysis that can accurately take into account future uncertainties. Another product, M-V Optimizer, uses mean-variance optimization methods to construct investment portfolios that maximize expected return subject to a user-specified constraint on risk.
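A hedged Monte Carlo sketch of the Flaw of Averages point above, in the spirit of (but unrelated to) the RSP tool: simulate many possible market futures for a retirement plan and estimate the probability of running out of money early. Every figure here (starting balance, spending, return distribution) is invented for the illustration.

# Monte Carlo estimate of the chance a retirement plan goes broke early
# (illustrative only; all parameters are made up).
import numpy as np

rng = np.random.default_rng(7)
trials, years = 20_000, 30            # retire at 65, project to 95
start_balance = 1_000_000.0
annual_spend = 55_000.0
mean_ret, vol = 0.06, 0.12            # assumed portfolio return distribution

ruin_by_85 = 0
for _ in range(trials):
    bal, broke_early = start_balance, False
    for yr in range(1, years + 1):
        bal = bal * (1.0 + rng.normal(mean_ret, vol)) - annual_spend
        if bal <= 0.0:
            if yr <= 20:              # ran out before age 85
                broke_early = True
            break
    ruin_by_85 += broke_early

print(f"Estimated P(broke before 85): {ruin_by_85 / trials:.1%}")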

A primary mission for the MH-60R helicopter: locate submarines.

In summary, the Wagner Associates company handbook states, "Our staff is decidedly our most important asset. Therefore, a primary corporate goal of the firm is to build and maintain a highly talented and motivated staff and to provide them opportunities to grow professionally." We accomplish this goal by choosing to work on the most challenging problems, by seeking partnerships with universities and government labs, by encouraging innovative thinking and academically oriented activities, and by maintaining the high standards of research excellence set by our founder, Daniel H. Wagner. ❙

C. Allen Butler ([email protected]) is the president of Daniel H. Wagner Associates based in Hampton, Va.



Gain insights from experts on how math, analytics, and operations research affect organizations like yours in these succinct 20- to 30-minute podcasts conducted by INFORMS Director of Communications Barry List.

Mary Benner, Wharton School
Do Financial Analysts Sabotage New Product Innovation?
Recorded June 11, 2010

David Alderson, Naval Postgraduate School
Danger: Infrastructure Under Attack
Recorded May 28, 2010

Peter Kolesar, Columbia University
Ending the Water War Surrounding New York's Reservoir
Recorded May 14, 2010

Eva K. Lee, Georgia Tech
Analytics Goes to War - Against Cancer
Recorded April 30, 2010

Warren Lieberman & Michael Raskin
Forecasting Consumer Behavior
Recorded April 8, 2010

Michael Schrage, MIT
Experimenting at the Workplace
Recorded March 19, 2010

Arnold Barnett, MIT
How safe are our airports?
Recorded March 4, 2010

Brian Lewis, Vanguard Software
Advice to Execs on Working with Operations Researchers
Recorded Feb. 18, 2010

Mary Grace Crissey, SAS
The Nitty-Gritty of Working with O.R. Providers
Recorded Feb. 5, 2010

Dina Mayzlin, Yale University & David Godes, University of Maryland
Word of Mouth Marketing
Recorded Jan. 22, 2010

Pinar Keskinocak & Julie Swann, Georgia Tech
Haiti: Humanitarian Logistics
Recorded Jan. 14, 2010

Doug Samuelson, InfoLogix
Winning Elections with O.R.
Recorded Jan. 12, 2010

Ron Howard, Stanford University
Master Decider
Recorded Dec. 18, 2009

John Sterman, MIT
Climate Change: On to Copenhagen
Recorded Dec. 4, 2009

Anna Nagurney, University of Massachusetts, Amherst
Supernetworks: Building Better Real and Virtual Highways
Recorded Nov. 6, 2009

Sam L. Savage, Stanford University
The Flaw of Averages
Recorded Nov. 6, 2009

James J. Cochran, Louisiana Tech
Running the Numbers in Time for the World Series
Recorded Oct. 23, 2009

Tom Davenport, Babson College
Competing on Analytics
Recorded Sept. 22, 2009

Sheldon H. Jacobson, University of Illinois, Urbana-Champaign
Emergency! Pandemic
Recorded Sept. 5, 2009

Rob Pratt, Ivan Oliveira, SAS; Chuck Delaney, Wake County Public School System
The perils of success: one school district's answer in the numbers
Recorded Aug. 14, 2009

Lawrence Wein, Stanford University
Troublemaker or Trusted Advisor?
Recorded Aug. 7, 2009

Justin Cohen, The Clinton Foundation
Using Analytics to Battle AIDS: A Lesson from the Clinton Foundation
Recorded July 24, 2009

Prof. ManMohan Sodhi, City University, London
Economic Calamity as a Supply Chain Problem
Recorded July 10, 2009

Karl Kempf, Intel
Intel's Chief Numbers Cruncher
Recorded June 19, 2009

Professor Michael W. Carter, University of Toronto
How Can You Squeeze 30% Out of Healthcare Costs?
Recorded June 19, 2009


View these on-demand presentations, complete with slides, from INFORMS' renowned meetings and conferences. They will inspire and help you navigate changes in the marketplace, develop strategies for excellence in operations research and analytics, and manage your practice or career.

Plenary Presentation:
Richard O'Neill, Federal Energy Regulatory Commission
Better Smarter Electricity Markets: Efficiently Capturing Wind, Rain and Fire

Keynote Presentation:
Christopher S. Tang, UCLA Anderson School of Management
Supply Chain Risk Management: Developing a Research Agenda

2009 Wagner Prize Presentations:
Hugo Simao, Princeton University, and Ted Gifford, Schneider National
Approximate Dynamic Programming Captures Fleet Operations for Schneider National

Horst Zisgen, IBM Deutschland Research and Development GmbH, and Steven M. Brown, IBM Systems and Technology Group
A Queuing Model-Based System for Semiconductor Production Planning at IBM

Michael Gorman, University of Dayton
Hub Group Implements a Suite of OR Tools to Improve its Operations

Karl Kempf, Intel Corporation, and S. David Wu, Lehigh University
Extending Bass for Improved New Product Forecasting at Intel

Hernan Abeledo, The George Washington University
Optimizing Helicopter Transport of Oil Rig Crews at Petrobras

Hector Anaya, INDEVAL, Jaime Villasenor, INDEVAL, Francisco Solis, Banco de Mexico, Miguel de Lascurain, ITAM, and Arturo Palacios, INDEVAL
INDEVAL Develops a New Operating and Settlement System Using Operations Research

Robert Tudor, Delaware River Basin Commission, James Serio, The Delaware River Foundation, and Peter Kolesar, Columbia University
Delaware River Basin Commission – Breaking the Deadlock: Improving Water Release Policies on the Delaware River through Operations Research

Lori Folts, Deutsche Post DHL, Marc Fischer, University of Passau, and Tjark Freundt, McKinsey & Company
Deutsche Post DHL – Managing Global Brand Investments at DHL

Dale Wilson, New Brunswick DoT, Kim Mathisen, New Brunswick DoT, Ugo Feunekes, Remsoft, and John MacNaughton, New Brunswick DoT
New Brunswick (Canada) Dept. of Transportation – Taking the Politics out of Paving: Achieving Transportation Asset Management through O.R.

Daniel Myers, Procter & Gamble, Glenn Wegryn, Procter & Gamble, William Tarlton, Procter & Gamble, and Sean Willems, Boston University
Inventory Optimization at Procter & Gamble: Achieving Real Benefits Through User Adoption of Inventory Tools

Hylton Robinson, Sasol Technology, Marlize Meyer, Sasol Technology, Michele Fisher, Sasol Technology, and Willem Louw, Sasol Technology
Sasol – Innovative Decision Support in a Petrochemical Production Environment

Pictured: Richard O'Neill, Christopher Tang, Karl Kempf, Hector Anaya.


Getting lost while hiking in the wilderness is a dangerous situation to find yourself in, and making your way back to civilization is a difficult task that quickly uses up resources. What you decide to take with you on the journey back can mean the difference between life and death.

Table 1 shows all of the items available to aid you in your hike out of the wilderness. Containers of food and water will give you energy, shelter will protect you from the elements, and defense will protect you from wild animals. Each item has a weight, indicated by the red number, and survival points, indicated by the green number. You must take one item from each of the four categories (food, water, shelter, defense). Unfortunately, your backpack has a maximum capacity of 25 kg. Your chance for survival is calculated by adding together the survival points of the items you choose to take with you.

Question: What is the maximum chance for survival you can achieve?
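For readers who prefer to let the computer do the packing, here is a brute-force sketch of the underlying optimization. Because Table 1 is not reproduced in this text, the weights and survival points below are placeholder values; substitute the numbers from the table before trusting the answer.

# Brute-force "one item per category" knapsack (placeholder data, illustrative only).
from itertools import product

items = {
    "food":    [("rations", 5, 8), ("canned food", 9, 12)],   # (name, kg, points)
    "water":   [("canteen", 4, 6), ("water jug", 10, 14)],
    "shelter": [("tarp", 3, 5), ("tent", 12, 15)],
    "defense": [("knife", 1, 3), ("rifle", 8, 10)],
}

MAX_WEIGHT = 25  # backpack capacity in kg

best_points, best_pick = -1, None
for pick in product(*items.values()):                 # one item from each category
    weight = sum(kg for _, kg, _ in pick)
    points = sum(pts for _, _, pts in pick)
    if weight <= MAX_WEIGHT and points > best_points:
        best_points, best_pick = points, pick

print("best feasible choice:", [name for name, _, _ in best_pick], "->", best_points, "points")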

Send your answer to [email protected] by Aug. 15. The winner, chosen randomly from correct answers, will receive an "O.R. The Science of Better" T-shirt. Congratulations to Bill Lesso for correctly solving February's Planet PuzzlOR. Past questions can be found at puzzlor.com. ❙

John Toczek ([email protected]) is the senior decision support analyst for ARAMARK Corporation in the Global Risk Management group. He earned his B.Sc. in chemical engineering at Drexel University (1996) and his M.Sc. in operations research from Virginia Commonwealth University (2005).

Table 1: Choose items carefully.