
US software

Special report

The group of companies that comprise CLSA are affiliates of Credit Agricole Securities (USA) Inc. For important disclosure information please refer to page 192.

Ed Maguire [email protected]

(1) 212 261 3997

Dominic White, CFA (1) 212 261 7759

13 September 2010

USA Technology

2020 foresight Tech views of the future


Contents

Executive summary .............................................................................3

Tapping into accelerating change........................................................4

Transparent IT ..................................................................................12

Intelligent systems ...........................................................................16

Convergence .....................................................................................22

2020 interviews

Jan Baan, Cordys ............................ 28

Willem van Biljon, Nimbula .................. 34

David Cohen, EMC ............................ 40

Simon Crosby, Citrix ........................ 47

Jill Dyche, Baseline Consulting ............. 58

Andrew Feldman, SeaMicro .................... 65

Promod Haque, Norwest Venture ............... 70

Timo Hannay, Nature Publishing .............. 75

Parker Harris, Salesforce.com ............... 83

Dave Kellogg, MarkLogic ..................... 91

Gary Kovacs, Sybase ......................... 101

Andy Lawrence, The 451 Group ................ 111

Seth Levine, Foundry Group .................. 118

Glen Mella, Control4 ........................ 122

Geoffrey Moore, TCG Advisors ................ 131

Lew Moorman, Rackspace ...................... 139

Sanjay Poonen, SAP .......................... 145

Keith Schaefer, BPL Global .................. 150

Stratton Sclavos, Radar Partners ............ 157

Michael Skok, North Bridge .................. 164

Michael Tiemann, Red Hat .................... 171

Ray Wang, Altimeter Group ................... 181

Stephen Wolfram, Wolfram Research ........... 186

Insightful, differentiated research

Ed Maguire [email protected] (1) 212 261 3997

Dominic White, CFA (1) 212 261 7759


Executive summary

2020 foresight
We sought the views of over 20 people in and related to the technology industry, including venture capitalists (VCs), technologists, software-company executives, authors and industry analysts, to identify key themes to help investors frame their decisions over the next several years. This report includes transcripts of conversations touching on the world of technology in 2020. We explore key areas including cloud computing, software as a service (SaaS), information management, enterprise applications, open source, mobility, energy information technology (IT), social enterprise and collaboration. We identify three “meta-themes”: the rise of transparent IT, intelligent systems and convergence.

Technologists and investors tend to project the future in stepwise terms, but innovations and paradigm shifts occur at an accelerating, often exponential pace. Over the next decade, hardware, storage and computing capabilities will improve at exponential rates, while progress in software may prove the sole limitation. Increasingly rapid paradigm shifts (Facebook and the iPad, for instance) reinforce that change is accelerating and will continue to surprise.

Computing will be increasingly embedded into daily life, more intuitive and pervasive as a result of increasingly powerful and flexible software, rapid growth of endpoint devices, availability of “instant-on” connectivity and declining costs of hardware, bandwidth and storage. The trend of “consumerization” really reflects complexity that is giving way to simplicity.

Intelligence will increasingly be embedded into “closed-loop” and point-of-control systems. Solutions will benefit from growing predictive powers of software, standards-enabled integration, pervasive connectivity and increasing availability of sensors and remote-controlled devices. Intelligence will extend beyond ecommerce and business to embrace physical systems, including home networking, smart grids and location-based services.

For technology users and information consumers, the experience is paramount. Over the next decade, the distinction between discrete software, hardware, services and content vendors will become blurred as leading vendors both diversify and vertically integrate through mergers and acquisitions. Scale and specialization will define competitive differentiation.

Paradigm shifts are accelerating
[Chart: years to mass adoption for the telephone, radio, television, PC, mobile phone, the web and Facebook, 1860-2020, log scale]
Source: Ray Kurzweil, KurzweilAI.net

Exploring tech views of the future

Tapping into accelerating change

Transparent IT - Simplicity rules, complexity fades into the background

Intelligent systems - Turning data into a smarter world

Convergence - Cross-breeding software, hardware, services and content

Change in technology will come at an increasingly quicker pace


Section 1: Tapping into accelerating change

Anticipating the direction of the future is the charter of professional investors. By gathering data and context from the past and present, we attempt to extrapolate future trends. The near-term dynamics of markets and the technology industry demand that investors focus on issues at hand; however, it is helpful to step back from time to time to take stock of the trends and themes that will shape the dynamics of the future.

Tomorrow will give us something to think about.

Marcus Tullius Cicero

We sought the views of people in and related to the technology industry, including VCs, technologists, software-company executives, authors and industry analysts. Our intent is not to accumulate a codex of future predictions; rather, our goal is to identify key themes that may help investors frame their decisions over the next several years. A review of demographic and geopolitical trends, while certainly relevant, lies beyond the scope of this project.

Imagination is linear, progress is exponential
The amount of change that can occur in a decade is accelerating as technology evolves at an exponential pace. The world of technology in 2020 will see innovations widely adopted that may now be in the earliest stages of conception, or may not be conceived and realized for several years. Our conversations have provided a few surprises, revealed common threads, confirmed some prevailing views and challenged others, but overall have provided us with a framework to guide our ongoing efforts to anticipate the course of technological change and the investment opportunities that follow.

If we have learned one thing from the history of invention and discovery, it is that, in the long run - and often in the short one - the most daring prophecies seem laughably conservative.

Arthur C Clarke

We believe there is ample reason for optimism. The emergence of cloud computing, the mobile internet, non-traditional user interfaces and advances in programming science and artificial intelligence, together with falling costs of computing, networking and storage, place unprecedented power in the hands of everyone, from a child with a cell phone to entrepreneurs to researchers seeking to solve challenges of medicine. The barriers to innovation have never been lower. Resources at hand for free or nominal cost, combined with the global availability of information, communications and collaboration tools, provide a springboard for imagination, experimentation and risk taking.

For investors, the challenge always remains a combination of timing and careful selection. As we look to the next decade for investment opportunities, the key themes we have identified - transparent IT, intelligent systems and convergence - provide a framework that we hope will help investors identify emerging opportunities, disruptive forces and secular drivers along the full continuum of industry maturity cycles.

Our conversations revealed a number of key themes and predictions about the future of technology. While hardware, bandwidth and storage will continue to increase in performance and decline in price, software remains a key domain where there is no Moore’s Law-type paradigm of exponential improvement.

We sought the views of people in and related to the technology industry

Investors inherently must anticipate the future

The amount of change that can occur in a decade is accelerating

Barriers to innovation have never been lower

The challenge always remains a combination of timing and careful selection


Three “meta-themes” provide a framework
We have identified three “meta-themes” that provide a framework for anticipating the changes in technology and software over the next decade:

Transparent IT - Complexity is giving way to simplicity. Powerful capabilities become more easily accessible and pervasive, while advanced technologies become increasingly embedded in systems and the environment.

Intelligent systems - Software and technology systems are increasingly gaining the ability to drive intelligent automation, decision enhancement, operational optimization and risk management in self-directed, recursive systems. Advancements in analytics and artificial intelligence techniques, such as machine-learning and neural nets, leverage the exponential growth of data from the proliferation of users, devices, sensors, applications and systems.

Convergence - Software, hardware, services, content and business processes increasingly straddle formerly discrete definitional and categorical barriers. We expect increasing integration, both vertical and horizontal, a steady pace of cross-disciplinary development and M&A and a growing emphasis on holistic solutions.

We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next 10.

Bill Gates

What changed from 2000 to 2010
It is helpful to take brief measure of the changes that have occurred over the past decade. The year 2000 saw the height of the internet bubble, with the Dow and Nasdaq hitting all-time highs, companies raising money on little more than an idea (often spurious) and the investment community at large applying “best case” scenarios to the transformative power of the internet business model.

Through the crash of 2000-2001 and the creative destruction that followed, the software industry in particular has followed a course few might have predicted. In hindsight, it is not difficult to imagine that the parabolic tech stock gains of the late 1990s would be followed by a crash. However, the lackluster aggregate returns of established tech giants, such as Microsoft, Cisco and Oracle, and the steady multiple compression across the sector were far less widely anticipated.

Figure 1
Largest technology companies by market cap in 2000
Company     Market cap (US$bn)   PS (x)   PE NTM (x)   TTM sales (US$bn)
Cisco       448.4                28.8     97.8         15.5
Microsoft   422.6                19.8     72.5         21.4
Intel       202.1                6.0      32.7         33.7
Oracle      201.8                21.0     97.0         9.6
IBM         149.8                1.6      26.9         95.8
Source: Bloomberg

Simplicity wins

Things get smarter

Solutions are neatly wrapped and delivered

That parabolic tech stock gains of the 1990s would be followed by a crash is not difficult to imagine

The largest tech stocks of 2000 had stratospheric valuations . . .


Figure 2
The declining valuations of the former top five technology companies (2010)
Company     Market cap (US$bn)   PS (x)   PE NTM (x)   TTM sales (US$bn)
Cisco       132.2                3.5      15.8         38.1
Microsoft   199.5                3.3      15.5         60.0
Intel       112.3                3.2      13.5         35.1
Oracle      113.4                4.5      14.7         25.3
IBM         170.9                1.9      12.0         88.4
Source: Bloomberg

By contrast, the rise of companies such as Google and Salesforce.com from startup origins, as well as the mainstream acceptance of virtualization and cloud computing, are dramatic developments that have reshaped the industry landscape. However, these developments sprang from ideas that had germinated well before the new Millennium.

Similarly, the advent of the mobile internet and many of its manifestations, such as smartphones and mobile applications, were clearly envisioned in the late 1990s. That the ideas were ahead of their time resulted in significant loss of venture investments, but steady progress over the decade has led to a definitive paradigm shift at the beginning of the 2010s.

The explosive rise of social-networking technologies, such as Facebook and Twitter, has occurred in a far more compressed time frame. With the number of Facebook users approaching half a billion and Twitter nearing 200 million, this is remarkably rapid adoption for applications that are only a few years old. As investors seek to extrapolate future trends from historical perspective, it is helpful to qualify any expectations in the context of exponential change.

Nobody, 20 years ago, forecast the internet.

Bryan Appleyard

In his book The Singularity is Near, Ray Kurzweil makes a compelling case that we should expect continuing exponential change in technology and society at large, so that over the next century we will see not 100 years of progress but something more like 20,000 years of progress at today’s rate.

The idea that progress in technology occurs at exponential rates is best illustrated by comparing the time to mass adoption of inventions over the past 150 years. One only has to look at the rapid growth of Facebook and the vision of tablet computing the iPad has catalyzed to see these accelerating paradigms. The ramifications are significant: over the next decade, we can anticipate that successful new innovations will see adoption at an increasingly rapid pace.

Ideas behind Google, virtualization and SaaS sprang before the new Millennium

Adoption of social networking technologies occurred in a far more compressed time frame

The rate of change is accelerating

Adoption of new technologies is exponentially faster

. . . but have reverted to market multiples over time


Figure 3
Adoption paradigms are accelerating
[Chart: years to mass adoption for the telephone, radio, television, PC, mobile phone, the web and Facebook, 1860-2020, log scale]
Source: Ray Kurzweil and KurzweilAI.net

The principle of Moore’s Law - that processor performance roughly doubles every 18 months - has held fast since the 1970s, while the price performance of DRAM continues to improve along a similar trajectory.

Figure 4
MIPS growth since 1970
[Chart: processor performance in MIPS from the Intel 4004 to the Core i7 Extreme (i980EE), 1970-2010, log scale]

Figure 5
DRAM price/performance since 1970
[Chart: DRAM bits per US$, 1970-2010, log scale]

MIPS = Millions of instructions per second. Source: Ray Kurzweil and KurzweilAI.net

The power of wireless handheld devices similarly reflects an accelerating rate of performance improvement, while the bandwidth capacity of the internet is growing exponentially.

The rate at which new inventions reach widespread adoption is accelerating

Moore’s Law has held fast since the 1970s

Wireless handheld devices showing accelerating rate of performance


Figure 6
Smartphone/PDA processing power since 1994
[Chart: handheld-device processing power in MIPS, 1995-2011]

Figure 7
Internet backbone BPS since 1965
[Chart: internet backbone capacity in bits per second, 1965-2010, log scale]

MIPS = Millions of instructions per second; BPS = Bits per second. Source: Ray Kurzweil and KurzweilAI.net

In fact, the dynamic of exponential cost and performance improvement is occurring across a broad range of technologies. While improvement occurs at different rates, the consistent historical trend remains a common dynamic across different hardware technologies.

Figure 8
Time to double (or halve)
Metric                                             Time to double (or halve)
Dynamic RAM memory “half pitch” feature size       5.4 years
Dynamic RAM memory (bits per dollar)               1.5 years
Average transistor price                           1.6 years
Microprocessor cost per transistor cycle           1.1 years
Total bits shipped                                 1.1 years
Processor performance in MIPS                      1.8 years
Transistors in Intel microprocessors               2.0 years
Microprocessor clock speed                         2.7 years
Source: Ray Kurzweil and KurzweilAI.net
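To put these doubling times in perspective, a rough compounding calculation (the doubling times are those quoted in Figure 8) shows what they imply over a decade:

# Compound the Figure 8 doubling times over a 10-year horizon.
doubling_times_years = {
    "Dynamic RAM memory (bits per dollar)": 1.5,
    "Processor performance in MIPS": 1.8,
    "Transistors in Intel microprocessors": 2.0,
}
horizon = 10  # years
for metric, t_double in doubling_times_years.items():
    factor = 2 ** (horizon / t_double)
    print(f"{metric}: ~{factor:,.0f}x over {horizon} years")
# A 1.5-year doubling time works out to roughly 100x per decade; 2.0 years to roughly 32x.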

I never think of the future. It comes soon enough.

Albert Einstein

We highlight a few key predictions for the progress of technology, data and connectivity over the next 10 years:

The National Science Foundation predicts the number of internet users will reach almost 5 billion by 2020, an increase from 1.7 billion users in 2010 and 360 million in 2000. Vast numbers of people in developing countries will gain access to the web as a result of declining costs and exponential technology improvement.

The worldwide mobile subscriber base is forecast to increase from 4.6 billion at end of 2009 to 7.5 billion by end of 2020. (Source: Portio Research 2009)

Worldwide mobile penetration will touch 94% by end of 2020 from 64% in 2009. (Source: Portio Research 2009)

The first commercial quantum computer will be available by mid-2020. (Source: Cisco IBSG 2009)

By 2020, the number of internet users will reach almost 5 billion

Worldwide mobile subscribers will reach 7.5 billion by end-2020

Exponential cost and performance improvement occurs across technologies


In the next 10 years, we will see a 20x increase in home-networking speeds. (Source: Cisco IBSG 2009)

The adoption of Internet Protocol version 6 (IPv6) will dramatically increase the availability of unique IP addresses, and hence the number of unique devices that can be connected to the internet. IPv6 addresses are 128 bits long, whereas IPv4 addresses (the prior standard) are 32 bits. While the IPv4 address space contains roughly 4.3×10^9 (4.3 billion) addresses, IPv6 has room for about 3.4×10^38 (340 trillion trillion trillion) unique addresses; the arithmetic is sanity-checked in the short sketch following this list.

By 2020, a US$1,000 personal computer will have the raw processing power of a human brain. (Sources: Hans Moravec, Robotics Institute, Carnegie Mellon University 1998; Cisco IBSG 2006-2009)

The world’s data will increase sixfold in each of the next two years, while corporate data will grow fiftyfold. (Source: Technorati)

By 2020 worldwide, the average person will maintain 130 terabytes of personal data, up from ~128 gigabytes today. (Source: Cisco IBSG, 2009)

Extrapolating the exponential progression in microchip processing power suggests we will be able to reach 10^14 to 10^16 calculations per second, possibly by the end of this decade. Justin Rattner of Intel believes that 3D chips will take off where standard silicon chips leave off. Through the transition, Moore's Law will continue.

Worldwide volume of all digital data will grow from 1.2 million petabytes (or 1.2 zettabytes) in 2010 to 35 zettabytes in 2020. (Source: IDC 2010) Note: 1 zettabyte equals 10^21 bytes, or 1 sextillion bytes (roughly 2^70 bytes in binary terms). This equates to a billion terabytes (a terabyte is a trillion bytes, or a million megabytes).
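Two of the figures above lend themselves to a quick arithmetic check (illustrative only; the inputs are taken from the bullets above):

# 1) IPv4 vs IPv6 address space: an n-bit address space holds 2^n addresses.
ipv4 = 2 ** 32
ipv6 = 2 ** 128
print(f"IPv4 addresses: {ipv4:.1e}")   # ~4.3e9  (4.3 billion)
print(f"IPv6 addresses: {ipv6:.1e}")   # ~3.4e38 (340 trillion trillion trillion)

# 2) Implied growth rate of worldwide digital data, 1.2 ZB (2010) to 35 ZB (2020).
start_zb, end_zb, years = 1.2, 35.0, 10
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Total growth: {end_zb / start_zb:.0f}x, implied CAGR: {cagr:.0%}")   # ~29x, ~40% a year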

Looking forward, we identify a few common threads from our conversations:

Simplicity rules - Consumers and businesses want a straightforward experience and will care less about the pieces that make up a solution. Complex technology systems will continuously be broken down into components and exposed as services to create a simpler experience for users.

Solutions will predominate - The convergence of hardware, software and services will result in pre-integrated solutions productized and delivered as “application appliances.” Recent M&A illustrates this direction, with IBM’s software acquisitions, Oracle’s acquisition of Sun and Intel’s acquisition of McAfee.

Scale wins the platforms, innovation fragments apps - Scale and scope stake out defensible high ground in the cloud. Cloud services will be about cost efficiencies and flexibility, and providers with distribution networks and the capital resources will be best positioned. Google, Amazon, Microsoft, Salesforce.com and Rackspace are early leaders, but we expect growing presence from telecom carriers, such as AT&T and Verizon, as well as integrators, such as IBM and Fujitsu. This will alter the competitive landscape as software vendors compete directly with hardware vendors (SAP vs IBM, Oracle vs HP, Symantec vs Dell, and Cisco vs Microsoft).

In the next 10 years, we will see a 20x increase in home-networking speeds

Worldwide digital data will grow from 1.2 million petabytes in 2010 to 35 zettabytes in 2020

Consumers and businesses want a straightforward experience

Expect pre-integrated solutions productized and delivered as “application appliances”


Internet time arrives - 10 years later. The accelerating pace of paradigm shifts creates a dynamic where competitive advantages can arise and crumble more rapidly than ever. Though it is not a stretch to assume companies like Microsoft, IBM, SAP, HP, Google and Oracle will remain powerful forces in the tech industry, it will be incumbent on their management teams to ensure they remain as relevant in 2020 as they are today.

Who benefits, who faces disruption?
We identify vendors that appear well positioned to benefit from accelerating change over the next decade. This list is far from comprehensive, but it provides a general framework for evaluating secular beneficiaries and companies potentially under threat:

Cloud-service providers - Scale is important for infrastructure as a service, and there is a rush underway to build out datacenter capacity to accommodate the expected demand. Key players include Microsoft, Amazon.com, Google, Rackspace and IBM. Other important infrastructure- and platform-as-a-service providers include AT&T, Fujitsu, Savvis, Equinix, Terremark, OpSource, Joyent, GoGrid, NaviSite, NetSuite, Intuit and enabling services, such as Akamai.

Startups - The growing availability and falling cost of cloud-computing services lowers barriers for entrepreneurs. The environment favors new providers of mobile and vertical applications, developers of algorithmic services and ecommerce/personal data brokers.

We identify vendors who stand to benefit from the disruptive technologies that create new markets:

Integration and middleware - Vendors that provide integration platforms for the cloud, connected devices and smart grids are well positioned to enable new environments of networked services. These include Informatica, IBM, SAP/Sybase, Pervasive, Oracle, Syniverse, private vendors Boomi, Iris Wireless, Motricity, mBlox and others. Traditional infrastructure-management vendors, such as BMC, CA, Symantec and others, have important roles in managing complexity, but must balance challenges from legacy businesses.

Cloud-infrastructure vendors - At least for the near term, vendors that specialize in networking infrastructure for cloud computing (both public and private clouds) are well positioned for a capital-spending buildout. These vendors include Cisco, Juniper Networks, F5 Networks, Riverbed Technology, Huawei and others. Public software vendors that provide and manage cloud-computing infrastructure include VMware, Citrix, Microsoft and Red Hat. There is a vibrant ecosystem of cloud-focused startups including Univa UD, Platform Computing and Elastra.

Sensors, home networking and smart devices - There is opportunity for providers of a new generation of connected devices that will enable intelligent systems. These include smart meters and industrial control systems (Itron, GE, Honeywell, Siemens, APC/Schneider Electric, Emerson, Eaton) as well as software vendors that will enable smart-grid and networked-home solutions (Control4, BPL Global, Silver Spring Networks, GridPoint).

Cloud-services and infrastructure vendors are positioned well

Integration, networking, and smart-device vendors are positioned well

Connected devices create opportunities for new and old industries


Providers of mobile and location-based services - These include technology, internet, media and communications-services companies. Key vendors include Google, Facebook, Nokia, Microsoft, Apple, Yelp, foursquare, Twitter, Verizon, AT&T, Sprint, DoCoMo, BT and many others, both public and private.

We identify which vendors could face disruption to their businesses:

Legacy packaged-software applications - This includes on-premises applications from the likes of Oracle, SAP, Sage, Lawson, Epicor, Kenexa, JDA Software, Infor, Kronos and others.

Relational database vendors - On-premises database vendors could face some disruption from an increasingly fragmented technology landscape. These vendors include Oracle, IBM, Microsoft, Teradata and others.

Non-integrated hardware vendors - Commodity server, hardware, storage and networking technologies will face ongoing pricing erosion and increasing concentration in the customer base. We expect hardware vendors to acquire for scale, expand into higher-margin services and software for differentiation.

Legacy application, database and hardware vendors are at risk


Section 2: Transparent IT

We believe innovation over the next decade will be ruled by a consistent trend toward “transparent IT” - technology that is so simple that the underlying complexity becomes invisible to the user. In other words, the difficult stuff will be increasingly distant from end users.

Any sufficiently advanced technology is indistinguishable from magic.

Arthur C Clarke

Over the next decade, computing will become increasingly embedded in the daily life of consumers, businesses and other organizations. Computing will become more intuitive and pervasive with the evolution of more powerful software, rapid growth of endpoint devices, availability of “instant-on” connectivity and declining costs of hardware, bandwidth and storage. We see a continuous elevation of simplicity in the user experience as logic controls the underlying systems, processes and infrastructure with increasing power.

Technology happens. It's not good, it's not bad. Is steel good or bad?

Andy Grove

Four critical vectors are likely to drive accelerating innovation of software and systems toward transparent IT:

Cloud computing - A paradigm shift in computing
The ramifications of the shift to cloud computing cannot be overstated. Most of our conversations referenced the transformative impact that cloud computing will have on the availability of flexible and increasingly cheap computing power. The first generation of public clouds from Amazon, Google, Rackspace, Salesforce.com and Microsoft Azure provides field validation of the viability of the model.

Virtualization has been an enabling technology for cloud computing. The ability to run multiple workloads on a shared server to improve server utilization, to run a single workload across multiple servers for scale and to quickly scale up or down server images provides compelling efficiencies in terms of cost, power and flexibility.
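A back-of-the-envelope consolidation example helps show why; the utilization figures below are hypothetical assumptions for illustration, not data from the report.

import math

# Hypothetical consolidation of lightly loaded physical servers onto virtualized hosts.
physical_servers = 100
avg_utilization = 0.10      # assumed average utilization before virtualization
target_utilization = 0.60   # assumed safe utilization target per virtualized host

workload = physical_servers * avg_utilization            # in "fully loaded server" equivalents
hosts_needed = math.ceil(workload / target_utilization)
print(f"Virtualized hosts needed: {hosts_needed}")                        # 17
print(f"Consolidation ratio: ~{physical_servers / hosts_needed:.0f}:1")   # ~6:1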

Commitment to the cloud model is evidenced by the rapid pace of investment in datacenter infrastructure by corporations, colocation providers, systems integrators, telecom carriers and others. We are seeing aggressive allocation of R&D resources toward cloud-infrastructure initiatives by software companies of all sizes. Notably, Microsoft (which operates the largest R&D organization in the software industry) disclosed that 70% of its developers are focused on the cloud, and this would soon move to over 90%.

Lew Moorman, CTO of Rackspace, comments on the laws of accelerating returns:

‘There are laws of accelerating returns on these technologies at play. Going from mainframe to mini computer was a small advance . . . the PC becomes a transformative advance and I think this next step is again another exponential leap. With cloud computing, we now have ubiquitous computing. Not only does everyone have computing power at their fingertips, they have the power of a datacenter at their fingertips. They have the ability to manipulate and access all types of data; to connect with and do things with that data; to create and store new data.

Over the next decade, computing will become increasingly embedded in daily life

Complexity gets pushed down, logic moves to higher levels

The ramifications of the shift to cloud computing cannot be overstated

Commitment to the cloud model is evidenced by rapid investment in datacenter infrastructure


We are just starting to understand the potential of cloud computing. It is just starting to transform companies and individuals’ lives.’

Importantly, the availability of cloud-based resources is lowering barriers for startups, which in turn should fuel accelerating innovation.

Stratton Sclavos of Radar Partners comments on lower barriers to startups:

‘Eighty percent of the entrepreneurs that come into our firm looking for seed capital or A-round capital already have an application up and running, if not hosted on Amazon then hosted somewhere like Rackspace . . . The capital efficiency that can be applied to developing, introducing and then iterating on these new applications is phenomenal to us and we don’t think that is going to change. In fact, we think it is going to get more and more like that. More interesting is that the capital efficiency for a venture firm trying to launch these new companies is much better.’

The mobile internet - “Any device, always on, anywhere”
The rapid adoption of smartphones and the growing availability of wireless internet are key vectors for realizing the vision of pervasive computing and a wealth of related applications, including micropayments, content streaming, multiplayer gaming, location-based services and enterprise applications.

The introduction of high-speed mobile networks based on technologies including high-speed packet access (HSPA), worldwide interoperability for microwave access (WiMAX) and long-term evolution (LTE) will encourage adoption of data-based applications. Over the next 10 years, we expect the move to 5G wireless to make significant progress toward the “always-on” high-speed internet connection.

More advanced mobile networks will support a new range of applications, including content, shopping, HDTV, collaboration, social networking, video conferencing, robust gaming and additional personalized offerings. This in turn will expand the range of devices beyond phones, PDAs, smartphones and laptops to embrace additional audio, video, sensors, industrial devices and appliances. The availability of greater bandwidth will facilitate adoption of advanced applications, which in turn should drive further growth of data traffic.

Figure 9
Smartphone penetration growth forecast
[Chart: smartphone share of global handset shipments and YoY smartphone growth (%), 2007-2013CL]
Source: Credit Agricole Securities (USA)

Availability of cloud-based resources is lowering barriers for startups

More advanced mobile networks will support a new range of applications

Adoption of smartphones and wireless internet helps realize the vision of pervasive computing

We expect smartphones to continue to take share of global handset shipments


The proliferation of mobile applications is representative of the variety of innovations enabled by smartphones and the mobile internet, and this will be reflected both in consumer and enterprise adoption. Examples of the types of mobile applications expected to see healthy growth include money transfer by short message service (SMS), mobile search and browsing, location-based services, mobile music and video services, near-field communications services, mobile health monitoring and many others. The ability to combine location awareness with content streaming will enable applications that can deliver content to users based on their specific location and preferences (for marketing, entertainment or educational purposes).

David Cohen of EMC comments on the impact of mass connectivity:

‘Another jump in terms of the numbers, just in sheer distribution the number of people with cell phones is at least an order of magnitude more than people who have highly connected PCs. It is probably much higher than that. If you look at sensor networks, wireless point-of-sale devices and RFID data collectors, there is also a massive expansion from where we were with tethered devices in previous generations. We are talking about connectivity that is literally on a planetary scale.’

Gary Kovacs of Sybase sees mobility as a paradigm shift, not just technology:

‘I think many people who have been around mobility for a long time don’t talk about mobility as a phone anymore. We talk about it much more as a method of interaction . . . basically accessing cloud services through devices. What I love about that is I can go to my PC, I can go to my iPad, and I can go to my iPhone or my Blackberry and I can get a file from my online service and it just saves. In 10 years, we will have machine to machine and we will have devices that are always on. I think we will have much less in our lives that take a boot-up cycle.’

User interfaces - “See me, feel me, touch me - think me?”
The development of new types of interfaces promises to expand the experience of computing beyond the traditional keyboard/mouse, touch and speech-based interaction currently available. New touch and haptic interfaces promise to enable new classes of applications - for gaming, enabling the disabled, medical procedures, industrial processes, training, simulation and therapy. Haptic interfaces have applications in virtual reality (by enabling real touch to operate in artificial environments) and in teleoperation (using real touch to operate in real environments via computer).

Motion-control interfaces are becoming mainstream, particularly in video gaming. Microsoft’s Kinect enhancement for the Xbox integrates speech recognition, 3D sensing and motion sensing in an integrated, controller-less user experience, while Nintendo’s Wii and Sony’s Move technologies exploit motion-control capabilities. Startups, such as Oblong, are exploring non-physical interfaces for computing.

There is also growing progress on brain computer interface (BCI) technology. Currently, research is focused on physical implants (mostly to benefit the disabled through physical mobility and prosthetics), but there is also growing progress in non-invasive brain interfaces that track brain activity to control computing and physical devices.

Mobility is a paradigm shift, not just technology

New types of interfaces promise to expand the experience of computing

There is growing progress on brain computer interface technology

Innovations and connectivity enable a new class of mobile applications to emerge


Over the next decade, we expect advancements in non-traditional computing interfaces to be accompanied by new software applications that map each tactile or sensory interface to an appropriate function or automation. This is one of the areas where we expect innovation to be quite surprising and to have a paradigm-shifting impact on the experience of transparent IT.

Development tools & standards - Simpler, more powerful
The evolution of higher-level languages that bring development closer to the business process puts increasing power in the hands of business users and veils the underlying complexity of code. The open-source programming model has created, in aggregate, over 1 billion lines of freely available open-source code that developers and business users can build upon to create applications and new businesses. Standards, such as HTML 5, promise to enable a new class of rich, interactive and mobile applications, while markup languages (ie, variants of XML and BPML for business process) help enhance data interchange and interoperability in increasingly distributed environments.

Jan Baan, chairman & chief innovation officer of Cordys, believes new languages will change the role of developers from coding to process orchestrators:

‘The new generation of development language is 5GL. This increasingly takes the developer out of the process. Instead, we have the business driving the use of the commodity type of components that are decoupled with services that have been established.’

Michael Tiemann, VP of public affairs for Red Hat, highlights the transformative role that the open-source model plays in software development:

‘I believe the open-source model for software development and intellectual development has been quite disruptive over the last decade . . . cultivating these communities of innovation and enabling a rise of incredibly pervasive powerful technology that is really powering a new transition to cloud-based computing.

Open source has wonderful economics compared to proprietary software in terms of cost model, but the fact that this open-source model can run at such a high level of quality in such a robust manner, with so much interoperability, I think that’s the reason you see companies like Google, other companies who are putting open-source infrastructures together like Amazon.com able to achieve levels of scale and start making forward progress by simply remediating the existing software.’

Figure 10
Open-source project defect densities
2004 - 985 defects found in 5.7 MLOC of Linux kernel source code
2005 - Linux kernel grew 4.7% while defect density decreased 2.2%; 100% of all "serious" defects identified were fixed within six months
2006 - Survey expanded to the entire LAMP stack and 32 OSS programs; no correlation found between size and defect density
2008 - Survey expanded to 250 OSS projects consisting of more than 55 MLOC (funded by the Department of Homeland Security); defect density reduced an additional 16% since 2006
2009 - According to Coverity, overall integrity, quality and security of open-source software is improving; products with near-zero defects (Rung 2 projects; Rung 1 have zero defects) increased from 11% to 36%, and project involvement has increased more than 50% since 2008
MLOC = Million lines of code. Source: Opensource.org

Standards such as HTML 5 promise to enable a new class of rich, interactive and mobile applications

Open source has wonderful economics, but also a high level of quality

New languages will change the role of developers from coding to process orchestrators

Open-source project defect densities compare favorably


Section 3: Intelligent systems

Over the next decade, we expect to see intelligence increasingly embedded into systems, benefiting from the growing predictive powers of software, standards-enabled integration, “always-on” connectivity and the increasing availability of sensors and remote-controlled devices. The concepts behind intelligent systems have been around for decades, but the gating factors have been the limitations of connectivity, throughput and computational power.

The Star Trek computer doesn't seem that interesting. They ask it random questions, it thinks for a while. I think we can do better than that.

Larry Page

The proliferation of data from corporate applications provides unprecedented visibility into operations and business. New generation sensors and radio frequency identification (RFID) tracking generate huge amounts of data which can be used to improve energy efficiency and optimize broad functional aspects of the supply chain. Technologies related to data warehousing, business intelligence and predictive analytics have matured and are easier and faster than ever to deploy.

Applications and systems will leverage the power of predictive analytics and advanced techniques to optimize business processes, improve collaboration, target information flow and reduce risk. We have seen the mainstream acceptance of business intelligence and data warehousing over the past three decades, while leading independent BI vendors Hyperion, Cognos and Business Objects were absorbed by Oracle, IBM and SAP, respectively, in 2007. This wave of consolidation effectively mainstreamed business-intelligence software within the enterprise IT ecosystem, and these technologies have continued to rank as high priorities since.

Prediction and optimization embedded in the walls
The value of predictive analytics has continued to grow with increases in computational power. If business intelligence is largely backward-looking, predictive analytics look to the future - to improve marketing accuracy and operational efficiencies and to reduce risk of all types. In the past, predictive capabilities have been employed in a number of specific scenarios:

online search and advertising, to improve query results and better target ads,

all types of marketing, to improve effectiveness and revenue “lift” from campaigns,

manufacturing, to anticipate potential quality assurance issues,

financial-risk management, to help mitigate portfolio or credit risk (credit scores are a prominent example).

The holy grail of analytics is to “close the loop” from data to insight, to prediction, to action. In many cases, such as marketing, the value of human interpretation of predictive analytics is critical to optimize decisions.

We expect to see real-time intelligence increasingly embedded into systems

There is more data available and analytic technologies have matured

Value of predictive analytics continues to grow with increases in computational power

The holy grail of analytics is to “close the loop” from data to insight, to prediction, to action

Analytics will be increasingly embedded into real-time operational and transactional systems. We already see this dynamic playing out in ecommerce, with marketing and merchandizing optimization attempting to anticipate what the user may be interested in. Amazon, Netflix, Pandora, Google and others target consumer interests in books, movies, music and search. Operational systems increasingly incorporate analytics into domain-specific tasks: pricing and inventory optimization, IT performance management, and QA in manufacturing. Integration of predictive analytics throughout the value chain of products and services, from production to end user, will enable the realization of “mass customization.”
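As a toy illustration of the kind of logic such systems embed - anticipating what a user may want next from past behavior - the following minimal co-occurrence recommender is a sketch only; the products and purchase data are hypothetical, and this is not any particular vendor's algorithm.

from collections import Counter
from itertools import combinations

# Hypothetical purchase histories; each inner list is one customer's basket.
baskets = [
    ["camera", "sd_card", "tripod"],
    ["camera", "sd_card"],
    ["laptop", "sd_card"],
    ["camera", "tripod"],
]

# Count how often each pair of items appears in the same basket.
co_occurrence = Counter()
for basket in baskets:
    for a, b in combinations(sorted(set(basket)), 2):
        co_occurrence[(a, b)] += 1

def recommend(item, k=2):
    """Return the k items most often bought alongside `item`."""
    scores = Counter()
    for (a, b), n in co_occurrence.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return [other for other, _ in scores.most_common(k)]

print(recommend("camera"))   # e.g. ['sd_card', 'tripod']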

There will be growing integration between the physical and the logical, with systems that incorporate data from sensors and physical devices and apply predictive analytics recursively to improve experience, operational or other efficiencies. This is a consequential trend in its nascent stages. Some of the most visible manifestations include the growing prevalence of embedded intelligence in passenger autos, such as self-diagnosing systems and location-aware navigation systems that route around traffic jams. Examples of the types of solutions that will incorporate physical/logical intelligence include smart grid (demand/response energy management, municipal traffic management, critical infrastructure management and networked home-control solutions), self-diagnosing and self-correcting datacenter management, networked video surveillance and security systems, and location-aware services, such as smart billboards.

Figure 11

An example of location-aware services

Source: IBM

Several underlying trends enable the rise of intelligent systems:

Big data. This refers to large sets of data of all types (created by enterprise and internet applications, the proliferation of audio, video and social-networking data) that have historically proven unwieldy and even impossible to manage and analyze. Leading internet companies, such as Yahoo, Facebook, LinkedIn, Twitter and AOL, commonly generate large amounts of data, often over 100 terabytes of data each year. Preparing this data for analysis can double or even triple the size. This is driving new approaches to structure the data to find new ways to analyze and visualize effectively.

There will be growing integration between physical and logical systems

Big data drives new approaches to structure data to find new ways to analyze and visualize

IBM is developing intelligent billboards


Improvements in hardware and processing architectures will allow far greater amounts of data to be manipulated and analyzed. Sanjay Poonen of SAP discusses this:

‘Behind the scenes, these intelligent systems will have voluminous data-handling capacity because much of what ends up getting processed today in disk-based structures can be handled in in-memory structures, which may be either physical RAM or flash. Industries that build their reputation on voluminous amounts of data - retail, consumer packaged goods, financial services, utilities, even healthcare and the public sector - will be the first consumers of these devices, appliances and software that run in this form factor, allowing operations that took seconds to take milliseconds, and operations that took minutes to take seconds.’

Importantly, there is an increasing range of technologies to organize and store data that go beyond the traditional relational database promoted by Oracle, IBM, Microsoft and others. Technologies that deal with unstructured data (ie, Hadoop, Cassandra, MarkLogic) and analytics-focused columnar databases (ie, Vertica, Kognitio, Sybase IQ and others) represent the emergence of an increasing array of database types to store and leverage data. There is increasing convergence between structured and unstructured data as vendors (ie, MarkLogic, Attivio and others) bridge existing distinctions between data stored in relational databases (structured data) and everything else, which includes flat files, Word documents, spreadsheets, multimedia files, etc.
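To illustrate the map/shuffle/reduce processing model that Hadoop popularized for this kind of data, here is a minimal, single-machine sketch in plain Python; it is purely illustrative and does not use Hadoop's own APIs.

from collections import defaultdict

documents = ["the quick brown fox", "the lazy dog", "the quick dog"]  # stand-in for unstructured text

# Map phase: emit (word, 1) pairs from every document.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group emitted values by key.
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce phase: aggregate the counts for each word.
word_counts = {word: sum(counts) for word, counts in grouped.items()}
print(word_counts)   # {'the': 3, 'quick': 2, 'brown': 1, ...}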

Jill Dyche of Baseline Consulting discusses the emerging concept of data virtualization:

‘A lot of people are talking about software virtualization, but we are going to start to hear a lot of buzz around data virtualization. We simply have to assume that the days of the big behemoth mega data warehouses are over and there will always be new data introduced by both internal and external sources. Data like social-media interactions become important to companies. This “data is everywhere” mindset that everyone is embracing - by the time you are able to load all that data into a big database, it could be irrelevant.’

Proliferation of sensors and networked devices. The growth of devices that will be able to generate data for analysis and receive instructions remotely will give rise to new solutions. Sensors that can measure energy use, temperature and other conditions are being built into smart meters, appliances, building control systems and other devices. The ability to receive instructions (for instance, to run appliances to capitalize on low utility rates or avert demand spikes) is critical to realizing the vision of “closed loop” intelligent systems. The adoption of IPv6 will allow for billions more devices to be connected to the internet, each with a unique IP address. The “Internet of Things” will enable far more pervasive analytics.

Better analytics. Predictive mathematical and statistical algorithms improve the more they can be tuned for effectiveness. The more data and the more iterations, the better the predictive accuracy. The exponential improvement in cost/performance of computation, DRAM and storage enables greater predictive precision. The power of prediction improves as experts in the problem domain deploy the appropriate, finely tuned algorithm. Advances in machine learning, neural nets and other artificial-intelligence techniques continue to empower users with a growing arsenal of analysis tools.

There is an increasing range of technologies to organize and store data

‘We are going to start to hear a lot of buzz around data virtualization’

The “Internet of Things” will enable far more pervasive analytics

Algorithms improve the more they can be tuned with more data and iterations


Stephen Wolfram’s team is working on software that mines the universe of possible programs to find the optimal software program itself:

‘One of the things that has come out of a bunch of science that I have done is this idea that there is this computational universe of possible programs. Even quite simple programs are useful for things, which means that it becomes feasible to search this computational universe for programs that are useful for your purpose. Whether your purpose is cleaning up images or encryption or doing routing or linguistics, it becomes possible to not have the engineer build the program step by step, but instead have the program be mined from this computational universe of possibilities.

That sounds very futuristic, but it is, in our own work in Mathematica and Wolfram Alpha, a methodology we increasingly use. So there is an increasing number of algorithms that no human built. We found it. We searched a trillion possible programs and one of them was the one that was the best with respect to certain criteria and that ends up being the one we use for such and such a function. There are a lot of places where this type of mined algorithm will become more and more prevalent.’
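A toy version of this idea - enumerating a space of candidate programs and keeping the one that best satisfies a criterion - might look like the sketch below. It is purely illustrative (the "programs" are just truth tables over two boolean inputs and the criterion is matching XOR); it is not Wolfram's method.

from itertools import product

inputs = list(product([0, 1], repeat=2))   # the four possible (a, b) input pairs
target = [0, 1, 1, 0]                      # desired outputs (XOR) used as the search criterion

best_program, best_score = None, -1
# Each candidate "program" is a truth table: one output bit per input pair.
for candidate in product([0, 1], repeat=len(inputs)):
    score = sum(1 for out, want in zip(candidate, target) if out == want)
    if score > best_score:
        best_program, best_score = candidate, score

print(f"Best program (truth table): {best_program}, matches {best_score}/4 of the criterion")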

The rise of social intelligence. Social networking and collaboration technologies foster business agility, collaboration and innovation by allowing for more fluid communications across the organization. These technologies give rise to new sources of data for analysis as well as new avenues to insight through the processes of “crowdsourcing.” Crowdsourcing refers to the practice of taking problems that are typically solved by employees and extending them to a group of individuals in order to arrive at a collective solution. Crowdsourcing is the paradigm underlying open-source software development, designing organizational algorithms, solving complex collaborative questions as well as optimizing search-engine results.

Technologies, such as Facebook and Twitter, allow problems and data to be distributed across groups of users in real-time. There is increasing interest in using “social intelligence” - harnessing the collective wisdom of large groups of users via technology - to solve global problems, such as energy usage and climate change. We expect to see an increasing range of organizations incorporate the power of social-intelligence techniques to enhance broader analytic capabilities.

Several domains are fostering the evolution to intelligent systems:

Real-time performance management. With the mainstream acceptance of methodologies, such as Total Quality Management (TQM), Six Sigma and Balanced Scorecard, software technologies evolved to help management and line workers adopt metrics-driven strategies through dashboards and visualization tools. There is increasing interest in enabling continuous planning processes with more frequent refreshes or real-time data updates. Leading software vendors IBM, Microsoft, and SAP are leading the evolution toward real-time performance management.

The key goal for performance management lies in improving the decision-making process. We expect to see increased use of real-time analytics to drive optimized decision making. While companies such as Fair Isaac have focused on enabling enterprise decision management through a combination of analytics, domain knowledge and data management, this has historically required a costly, highly service-oriented approach. The use of analytics to optimize internet ad networks’ placement and targeting in real time is a key opportunity, and vendors, such as Google, Akamai, Adobe and Splunk, are positioned to chart new ground in this area.

Performance management applies methodologies to specific practices

Performance management will increasingly focus on real-time decisions

‘It becomes possible to have programs be mined from this computational universe of possibilities’

Social technologies allow problems and data to be distributed across groups of users

Marketing optimization. The use of analytics is a long-standing part of marketing and campaign management, where vendors, such as SAS and IBM/SPSS, have provided the analytic horsepower to power effective campaigns. Vendors, such as SaaS leader Eloqua, Responsys, Aprimo and IBM/Unica, continue to drive incremental value through the use of scoring and predictive analysis to complement functions related to lead generation, campaign design and analysis of effectiveness.

Risk management, churn analysis and fraud prevention. In the wake of the financial crisis, financial-services firms have been compelled to deploy enhanced credit-risk management solutions. Solving the risk management problem requires pulling together multiple sources of data and applying risk analytics to detect potentially consequential changes. Financial fraud remains a significant challenge for consumers, businesses, banks and credit card companies, and there is a consistent need for more effective analytics to detect potential fraud with minimal false positives. Anti-money-laundering requirements continue to drive the need for more sophisticated solutions to detect and prevent illegal financial activities. Vendors, such as RiskMetrics, Actimize, IBM/SPSS, SAS and many others, are focusing on the evolving challenges associated with detecting electronic fraud.

Smart grid, smart buildings, smart datacenters. One of the most promising areas for innovation lies in the use of software to manage energy usage, aligning the needs of customers and utilities. The “smart grid” is envisioned as a system that employs two-way technologies to control appliances and HVAC systems at a home or building, allowing customers to automatically reduce energy usage during periods when spot energy prices are high, and allowing utilities to manage their own resources to avoid “brownouts” from overload without investing in costly spinning capacity. “Demand response” technologies employ meters at the customer’s site that can be controlled by the power utility.

Predictive technologies allow the utility to anticipate when demand may be approaching “peak load” or when the spot price of energy is about to exceed a certain threshold. In response, the utility can issue remote instructions to the customer’s systems to reduce demand, for instance by raising the air-conditioning temperature or turning off a pool heater. Companies focused on this problem include Comverge, EnerNOC and private companies SpringSource, GridPoint and BPL Global, among others.
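A simplified sketch of the demand-response loop described above follows; the site names, thresholds and actions are hypothetical, and production systems add forecasting, customer opt-outs and utility-grade telemetry:

    # Toy demand-response controller: when the spot price (or forecast load)
    # crosses a threshold, issue load-reduction instructions to enrolled sites.
    # Site names, thresholds and actions are hypothetical.

    PRICE_THRESHOLD = 120.0   # $/MWh above which curtailment is requested

    SITES = {
        "office_tower_17": {"hvac_setpoint_f": 72},
        "hotel_45":        {"hvac_setpoint_f": 70, "pool_heater": "on"},
    }

    def curtailment_actions(site_config):
        actions = []
        if "hvac_setpoint_f" in site_config:
            actions.append(("raise_setpoint", site_config["hvac_setpoint_f"] + 4))
        if site_config.get("pool_heater") == "on":
            actions.append(("pool_heater", "off"))
        return actions

    def dispatch(spot_price):
        if spot_price <= PRICE_THRESHOLD:
            return {}
        return {site: curtailment_actions(cfg) for site, cfg in SITES.items()}

    print(dispatch(95.0))    # normal prices: no instructions
    print(dispatch(180.0))   # peak prices: curtailment instructions per site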

While there is significant promise, Andy Lawrence of 451 Group believes it will take time to realize these visions:

‘I think we are going to get to a point where the power consumption of corporations scales up and down as power consumption drops naturally, without human intervention at the end of the workday, with intelligent policies making decisions about what to turn off. It may be an energy-management system or some kind of network of building-management system and network-management system.

Risk management continues to gain importance in financial markets

Smart-grid technologies employ predictive techniques to conserve energy

Analytics enable marketers to improve effectiveness of campaigns

It will take time to realize the eco-efficient IT vision


I think it is going to take a couple of decades. It is not always going to be clear that it is worth doing the retrofit and embracing the complexity. People don’t rip out building-management systems so they can get a better view of their energy footprint. They will do it over time. This is a long-term project.’

Software-enabled home networking also shows significant promise as the wireless ZigBee standard enables compatible appliances and systems to be controlled automatically from a central system. Control4 is perhaps the most advanced vendor in this area and is focusing on enhancing its demand-response capabilities to allow homes and hotels to automatically schedule certain tasks (ie, running a dishwasher) or adjust climate control settings in order to make the most cost-effective energy choices. Over the next decade, we expect home networking and monitoring to extend to areas, such as infant and elder care, which promise significant improvements in quality of life.

Glen Mella of Control4 describes some of the uses of monitoring technology:

‘A partner of ours, CloseBy Networks, sells a software system on top of a Control4 deployment which essentially enables children to assist aging parents. They can monitor things like, “Well, it’s 10am and grandma hasn’t gone into the master bathroom, there may be something wrong because she usually gets up at 8.” Another one is actually helping seniors get the movie set up and play from a remote location or setting lighting scenes to help them settle down for the evening. People are starting to experiment with ZigBee monitoring devices - you can have a pad next to the bed, so when they step out of the bed a signal is given. There are even ZigBee garments that can monitor vital signs.’
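The kind of rule Mella describes can be expressed very simply. The sketch below assumes a feed of timestamped motion events from ZigBee sensors and a hypothetical notify() hook; it is illustrative only, not CloseBy Networks' actual logic:

    # Toy elder-care check: alert family if no motion has been seen in a given
    # room by a cutoff time. Sensor feed and notify() are hypothetical; an
    # alert is printed only when the check runs after the cutoff time.
    from datetime import datetime, time

    def notify(message):
        print("ALERT:", message)

    def check_morning_routine(events, room="master bathroom",
                              cutoff=time(10, 0), usual=time(8, 0)):
        """events: list of (datetime, room) motion readings for today."""
        now = datetime.now()
        if now.time() < cutoff:
            return                       # too early to worry
        seen = any(r == room and ts.time() <= cutoff for ts, r in events)
        if not seen:
            notify(f"No motion in {room} by {cutoff.strftime('%H:%M')} "
                   f"(usually up by {usual.strftime('%H:%M')})")

    today = datetime.now().date()
    events = [(datetime.combine(today, time(7, 45)), "bedroom")]
    check_morning_routine(events)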

Intelligence helps deliver personalized medicine. There is increasing use of analytics and business-intelligence tools to improve both the delivery of care for individual patients as well as fundamental research. Healthcare providers are increasingly able to respond to demands of pandemic prevention by using business-intelligence tools to sort clinical data to identify which patients might be most at risk, enabling the providers to target and prioritize vaccination programs.

Another focus is helping doctors make more accurate diagnoses and recommend the appropriate treatments. Systems that integrate sample analysis into decision-support applications can help pinpoint details that may elude physicians. Public health initiatives, such as the Cancer Biomedical Informatics Grid and UCSF’s Athena Breast Health Network, aggregate shared data and increasingly enable researchers and physicians to improve their research, diagnosis and treatment efforts.

RFID and the supply chain. RFID technology experienced significant hype in 2004-05, which created some skepticism, but steady adoption of the technology is paving the way for new applications and services. RFID employs two components: a reader (or interrogator) and a tag (or label). There are many applications for the technology, which is becoming increasingly prevalent as the cost of the tags decreases. RFID tags are used in conjunction with mobile phones and credit cards for payments, for automated toll collection and similar uses. It is in asset management, retail sales and product tracking that RFID shows significant promise, enabling manufacturers, distributors and retailers to gain granular insight into both the supply chain and demand patterns. The tremendous amount of data generated by RFID applications has long been anticipated as a demand driver for data warehouse technologies from the likes of Teradata, IBM, SAP/Sybase, Oracle and other vendors.
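To give a sense of why RFID drives demand for data warehousing, the sketch below deduplicates raw tag reads (a single pallet is typically read many times) and rolls them up into per-SKU counts by location; the field names and data are invented for illustration:

    # Toy RFID roll-up: deduplicate tag reads and aggregate to SKU x location
    # counts, the kind of summary that feeds a data warehouse. Illustrative only.
    from collections import defaultdict

    raw_reads = [
        {"tag": "EPC001", "sku": "SKU-A", "reader": "dock-3"},
        {"tag": "EPC001", "sku": "SKU-A", "reader": "dock-3"},  # duplicate read
        {"tag": "EPC002", "sku": "SKU-A", "reader": "dock-3"},
        {"tag": "EPC003", "sku": "SKU-B", "reader": "store-12"},
    ]

    def rollup(reads):
        seen_tags = set()
        counts = defaultdict(int)
        for r in reads:
            key = (r["tag"], r["reader"])
            if key in seen_tags:          # ignore repeat reads of the same tag
                continue
            seen_tags.add(key)
            counts[(r["sku"], r["reader"])] += 1
        return dict(counts)

    print(rollup(raw_reads))
    # {('SKU-A', 'dock-3'): 2, ('SKU-B', 'store-12'): 1}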

Home-networking technologies increasingly focus on managing energy consumption

RFID applications promote demand for data warehousing and predictive analytics

Business intelligence increasingly helps to improve patient care

Elder care is a promising vision for home-networking technologies


Convergence
Users and consumers of information technology are ultimately concerned with the experience and functions of chosen solutions. Increasingly, providers bundle components of hardware, software, services, connectivity and content as a complete solution. This has resulted in a blurring of the lines as offerings and vendors straddle multiple categories. The key characteristics that have driven adoption include accelerated time to deploy, reduced cost of integration and implementation services and increased efficacy of the overall solution.

Information technology and business are becoming inextricably interwoven. I don't think anybody can talk meaningfully about one without talking about the other.

Bill Gates

The distinction between discrete software, hardware, services and content will become blurred into solutions. Vendors will be at once more diversified and vertically integrated. Over the next decade, we expect convergence to be a theme that will drive both organic development and active M&A in the technology industry. Business models will evolve as value creation accrues at different points in the delivery chain.

Apple and Microsoft, which have long embraced hardware as part of the business model (via Macs, iPods/iPhones and the Xbox), have expanded into content-based businesses (iTunes and Xbox Live) as a way to reinforce the broader value proposition of their integrated offerings. We expect increasing convergence over the next decade across the technology and media sectors as software, hardware and content providers seek to bolster customer stickiness, drive incremental revenues and create higher competitive barriers.

Figure 12
S&P 500 net debt per share (US$), Q4 00 to Q4 09
Source: Bloomberg

We expect convergence to play out along several vectors over the next decade:

Everything as a Service. The mainstream acceptance of Software as a Service over the past decade has paved the way for services to reach lower down the stack. Platform as a Service and Infrastructure as a Service hide the underlying complexity of compute and storage infrastructure and allow users, service providers and application developers to access resources in a holistic fashion. To those that consume these services, the experience (functionality or service levels) is what counts, and the environment will lead to new competitive dynamics. Salesforce.com, Rackspace, Amazon.com, Google and Microsoft all compete directly for emerging opportunities, but have dramatically distinct origins.

Software offerings increasingly combine hardware, software, connectivity and content

Business models will evolve as value creation accrues at different points in the delivery chain

Corporate cash balances and low interest rates are spurring M&A

To those that consume these services, the experience is what counts

Commoditization of IT infrastructure. A corollary to the trend toward “Everything as a Service” and the trend toward transparent IT is the continual commoditization of compute, memory, bandwidth and storage. We expect to see continuing convergence of hardware, storage and networking capabilities. Cisco’s move into servers and HP’s initiatives around its ProCurve networking equipment are only the leading edge of what we expect to be an increasing emphasis on integration of commoditized capabilities. As hardware vendors continue to compete on price, performance and supply-chain efficiencies, there will continue to be a focus on value and differentiation from software.

Content integrates into the IT ecosystem. This is a longer-term trend, as proprietary content and information services become increasingly integrated into solutions. So far this has been indirect as Apple, Amazon and Barnes & Noble have used specialized hardware and ecommerce to deliver digital content to end users. Microsoft’s Xbox Live service builds on the Xbox hardware console in a broader effort to deliver a complete home entertainment solution. We expect the next phase of evolution to increasingly stress the role of proprietary content as publishers and content producers seek ways to monetize their intellectual assets and software vendors look to use content to enhance value and differentiation to their user bases.

Physical and logical control systems converge. There is a slow but steady integration between physical facilities control systems and IT systems. One area that has seen significant interest has been energy management in the datacenter. Part of the challenge is managing energy usage according to the device load. This is being addressed by companies, such as APC, Emerson (which acquired Avocent to extend its control of the datacenter) and privately held Modius. A number of startups have sprung up around energy usage measurement, with several companies focusing specifically on PC power consumption (notably Power Assure and Faronics).

We expect to see increasing innovations around the integration between physical systems and IT systems control. Beyond energy management, we believe one of the key areas of promise is tying together physical and IT security. For banks and high-security areas of government, this would involve the ability to tie together physical and logical access control (ie, allowing someone to log onto a system only when their security badge has been scanned into the building).
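A minimal sketch of the badge-then-login check described above, assuming a hypothetical feed of badge events; a real deployment would integrate the physical access control system with the corporate directory rather than a simple event list:

    # Toy converged access check: permit a logical login only if the same user
    # has badged into the building and has not badged out. Event data is
    # hypothetical; real systems would query the physical access control system.

    badge_events = [
        ("alice", "badge_in",  "hq-lobby"),
        ("bob",   "badge_in",  "hq-lobby"),
        ("bob",   "badge_out", "hq-lobby"),
    ]

    def is_on_site(user, events):
        state = False
        for who, action, _door in events:
            if who == user:
                state = (action == "badge_in")
        return state

    def allow_login(user, password_ok, events):
        return password_ok and is_on_site(user, events)

    print(allow_login("alice", True, badge_events))   # True: badged in
    print(allow_login("bob", True, badge_events))     # False: badged out
    print(allow_login("carol", True, badge_events))   # False: never badged in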

How will convergence play out? We identify several likely directions for convergence over the next decade for software, hardware, services and content vendors. We believe the most likely directions are:

Software vendors move into hardware and content

Hardware vendors move into software and services

Services vendors move into software and content

Content vendors move into hardware and software

Physical facilities and logical systems controls are converging

Expect the continual commoditization of compute, memory, bandwidth and storage

One of the key areas of promise is tying together physical and IT security


Figure 13
Converging IT ecosystem: software, hardware, connectivity and content converge into solutions
Source: Credit Agricole Securities (USA)

Software vendors move into hardware and content
The rise of appliances that integrate both hardware and software has been a significant trend over the past decade and we expect this to continue over the next decade. We expect continued interest from software vendors in delivering software in appliances as organizations seek greater ROI and deployment efficiencies from the combined solutions. In particular, we expect application appliances to become more common along with hybrid offerings that combine an on-premise hardware presence with cloud-based services. Oracle’s acquisition of Sun and Check Point’s acquisition of the Nokia Enterprise business are examples of the types of transactions we expect to continue.

The hardware appliance approach has become increasingly mainstreamed and adoption is spreading across numerous categories. We expect application appliances to gain increasing ground over the next decade, following Oracle’s lead in targeting high-end opportunities for converged systems that combine software, networking, hardware and storage.

Figure 14

Software/hardware appliances by category

Category: Representative vendors
Collaboration: Bull
CRM: Sage SalesLogix CRM Appliance
Data integration: Cast Iron (IBM)
Data warehousing: Teradata, Oracle Exadata, Netezza, SAP, IBM
e-Discovery: Clearwell, StoredIQ, FTI, Kazeon (EMC)
ERP: SAP (ERP in a box with Novell, Intel)
Firewall, VPN: Check Point, McAfee, Palo Alto Networks
IT management: ScienceLogic
Mail security: Barracuda, Cisco/IronPort, McAfee, Trend Micro
Messaging: Tibco
Search: Google, Thunderstone, Index Engines
Secure web content: Websense, McAfee
Storage - deduplication, backup: EMC, NetApp, HDS, 3Par, ExaGrid, Data Domain (EMC)
Unified threat management: Fortinet, Barracuda, Check Point
Video security: NICE Systems
WAN acceleration: Certeon, Citrix NetScaler, F5 Networks

Source: Credit Agricole Securities (USA)

The appliance approach is increasingly mainstream, proliferating across categories

We expect application appliances to become more common along with hybrid cloud offerings

Software, hardware, connectivity and content will converge into holistic solutions

Appliances combining hardware, software and sometimes content and services are proliferating


We expect to see continued diversification of revenue streams both among larger and smaller software vendors. Traditional perpetual license vendors have been moving toward additional subscription-based businesses and hardware over the past several years. Some of this has been through acquisition, some through partnership, but the key objective is to enhance the overall strategic value of the vendor through a broader portfolio and reduce reliance on a single business model.

Ray Wang of Altimeter believes software companies will increasingly focus on content:

‘Software companies are emerging to become information brokers. They are not just delivering a solution. What the cloud is doing is allowing them to aggregate data for prediction, for trends, and for benchmarking purposes. Those trends drive another point where companies must consider if they even need any of these solution providers.

We are seeing companies buy out key software assets so they can deliver services directly to the customers without having to use a packaged application. One of the great examples of this occurred when Roper Industries bought a software company that just did supplier networks.’

Hardware vendors move into software and services
For hardware vendors, the appeal of software is undeniable and we expect continuing development and M&A. This trend has been so consistent that we expect consolidation in the software sector to continue to be driven by hardware vendors:

IBM: Rational, Cognos, SPSS, Unica

HP: Mercury Interactive, Opsware, Tower (software), EDS (services)

Cisco: WebEx, ScanSafe, Ironport (Software as a Service)

Dell: 3Par attempted (appliances), Perot Systems (services)

EMC: VMWare, Documentum, RSA, Kazeon, Greenplum (software)

Intel: McAfee, Wind River (software)

Promod Haque of Norwest Venture Partners highlighted the hardware commoditization trend:

‘We are seeing hardware get commoditized rapidly, not just in the server space, in the compute space. You are starting to see networking vendors, under tremendous pressure from the HuaWeis and the ZTEs from China. The general commoditization of that space as well, at least the lower end of that space. Over a period of time, that will happen with the high end, but that lower end is getting commoditized.

There is a lot of emphasis on the part of companies like Alcatel Lucent, and whether it is Extreme Networks or Brocade, and equipment providers including Cisco and Juniper. You see Oracle now in the hardware business. HP acquired 3Com. One of the more interesting things that we saw recently was the IBM announcement that the entire hardware group will be reporting to Steve Mills, who previously ran Software only.

Expect vendors to blend software, hardware, content and services

Software companies will increasingly focus on content

We expect continuing development and M&A


You are starting to see some very interesting trends in that hardware, not just compute, but compute, storage and networking are coming together.’

Services vendors move into software and content
IT services vendors, particularly the likes of Accenture, CSC and others, commonly specialize in customized solutions that incorporate proprietary IP, packaged software and hardware. Given the ongoing pricing pressure from offshoring competition, we would expect North American IT-services vendors to look to acquire or incorporate more internally developed proprietary software and content in order to differentiate their franchises.

There have not been many large acquisitions as of yet. The Canadian IT services provider CGI Group acquired AMS, a vendor of government software offerings, which is a notable example from 2003. Part of the challenge for traditional services vendors is that their valuations (on a per employee basis and on price/sales metrics) tend to be far lower than other areas of technology.

There have been a few examples of services vendors moving into hardware (both IBM and Fujitsu retain legacy hardware businesses). Though not a traditional services vendor, Google’s unsuccessful launch into hardware with the Nexus One smartphone demonstrates that convergence is not without risks.

Content vendors move into hardware and software
Although the indications of this trend are early, we think there are growing signs of increasing efforts to converge solutions on the part of media companies. As media companies seek new ways to monetize content (to offset pressures in advertising and traditional book and periodical sales) and technology firms seek new and recurring sources of revenue, we expect increasing cooperation, partnerships and potentially M&A between traditional media and publishing companies and technology firms.

Amazon’s Kindle and the Barnes & Noble Nook represent efforts by channel players to provide a hardware-based vehicle to monetize digital content, and of course Apple’s iPad appears to be revitalizing the business of magazines and other periodicals in digital form. Publishers similarly are seeking ways to monetize their own content and are developing applications to offset declines in their traditional businesses. Reed Elsevier has sought to offset declining sales of its hardbound medical texts and reference books by offering specialized software applications, such as MD Consult, that allow access to the company’s proprietary data on a subscription basis.

Expect increasing cooperation between media and tech firms

‘Not just compute, but compute, storage and networking are coming together’

Pricing pressure coming from offshoring competition


2020 interviews

Jan Baan, Cordys.............................................................................. 28

Willem van Biljon, Nimbula .............................................................. 34

David Cohen, EMC............................................................................. 40

Simon Crosby, Citrix......................................................................... 47

Jill Dyche, Baseline Consulting ......................................................... 58

Andrew Feldman, SeaMicro .............................................................. 65

Promod Haque, Norwest Venture ..................................................... 70

Timo Hannay, Nature Publishing Group............................................ 75

Parker Harris, Salesforce.com .......................................................... 83

Dave Kellogg, MarkLogic .................................................................. 91

Gary Kovacs, Sybase ...................................................................... 101

Andy Lawrence, The 451 Group...................................................... 111

Seth Levine, Foundry Group ........................................................... 118

Glen Mella, Control4 ....................................................................... 122

Geoffrey Moore, TCG Advisors ........................................................ 131

Lew Moorman, Rackspace .............................................................. 139

Sanjay Poonen, SAP ....................................................................... 145

Keith Schaefer, BPL Global ............................................................. 150

Stratton Sclavos, Radar Partners ................................................... 157

Michael Skok, North Bridge ............................................................ 164

Michael Tiemann, Red Hat .............................................................. 171

Ray Wang, Altimeter Group ............................................................ 181

Stephen Wolfram, Wolfram Research............................................. 186


Jan Baan, Cordys
Jan Baan has over 25 years of entrepreneurial and business leadership experience in the software industry. Internationally recognized for his key role in creating the worldwide market for enterprise resource planning (ERP) software, he founded Cordys in 2001, together with Theodoor van Donge, the key architect behind Baan Company's pioneering ERP solution.

Jan founded Baan Company in 1978, soon after attending Business College. With the development of his first software package in 1979, Jan started what was to become a pioneering career in the ERP industry. Under Jan's stewardship, Baan Company grew from a US$35m company in the early 1990s to US$680m in 1998. Baan became the No.2 ERP player in the software industry. In the late 1990s, Jan became a highly successful venture capitalist by focusing on software innovation.

The importance of the business process layer
Today, with cloud computing in the enterprise gaining traction, businesses must re-evaluate their existing IT portfolios and transition towards the cloud to remain competitive. Businesses are increasingly demanding a shift from rigid business processes to more flexible and dynamic ones where a business process layer can link all silos under a service-oriented architecture (SOA). SOAs offer a loosely integrated suite of services that can be built upon existing IT infrastructure, offering the flexibility and functionality of the cloud without foregoing the benefits of the investment in on-premise IT infrastructure.

A feature of this shift is that differentiation within software will occur not in manufacturing, but in assembly. Users will have the ability to dynamically change the components of a system by making systems into customizable web services to be delivered over the web. Ultimately, complexity will be hidden underneath a layer of processes, a phenomenon Gartner calls the ‘meta of the meta’ layer. Using the meta, or ‘data about data,’ commodity components will enable projects to be created with no new lines of JAVA code because best practices codified in legacy systems all the way down to the databases will be utilized.

Business operations platform process layer

Source: Cordys

The business process layer promises greater flexibility for IT to serve businesses

Differentiation within software will occur not in manufacturing, but in assembly

The business process layer hides complexity, creating more transparent IT


Commoditizing complexity, simplifying process
Our discussion with Jan Baan focused on his visions of the evolution of the business process, the commoditization of applications into best-practice components and the promise that flexibility will bring. Baan envisions traditional enterprise-software applications becoming discrete components that are manipulated by higher-level business process orchestration. The applications of the future will weave together these components (comprised of formerly distinct lower-level application functions) into a cohesive solution framework. Baan effectively used the analogy of the auto and the airplane in describing his vision for software over the next 10 years. The ability to “componentize” functions and applications allows business users unprecedented flexibility in matching software to the needs of the business, not vice versa.

Key points
In the future, a business process layer will link system “silos,” with integration via SOA.

Business Process as a Service (BPaaS) will provide the logical overlay on top of Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and SaaS to enable true multi-silo integration of software systems from the cloud.

Today, data and systems reside in silos (ERP, CRM, HRM, etc), and integration between silos (if any) is difficult, especially as new processes are introduced to the system(s).

Legacy systems (and new code going forward) are redesigned as web services, which will be integrated into the business process layer, delivered over the web with customizable interfaces.

Complexity will be commoditized down to simple elements representing best practices.

These functions can be decoupled and then recombined with other components, enabling mass customization via enterprise “mash-ups.”

The analogy for software is that we will see differentiation in the assembly, not the manufacturing.

The result of this evolution is that corporations will become more agile and efficient.

As processes are commoditized and IT becomes transparent to users, large system integrators and consultants could find they have less demand for their products.

The key differentiator for applications will not be the logic, but increasingly content from the unstructured data world.

Business processes will see the commoditization of applications into best-practice components

Baan envisions Business Process as a Service integrating existing software “silos”


Jan Baan - Interview summary
We spoke to Jan Baan, chairman and chief innovation officer of Cordys, on 17 June 2010 about his vision for the evolution of software over the next 10 years.

Jan Baan became an entrepreneur in 1978. The original vision was to tackle the “mother of all complexity,” enterprise resource problems. This led to the rise of Baan Software, which became one of the top ERP vendors in the late 1990s.

In the ERP era, “data was king.” These enterprise transactional applications were based on UNIX and relational databases, which are designed to handle structured data. This generation of applications was never built for the internet.

These have led to extraordinarily high costs. Baan cited a company that has been in business for 150 years and is happy that its IT budget is under US$1bn. This typical company spent US$1bn to implement an SAP system. After buying the software for US$150m, the company needed to spend US$850m more to customize it. This model is not only costly, but at the end, all the user has is the application.

The computer is now mobile. We can have all the relevant data around us along with the relevant processes. We see examples of this with many of the new iPad applications. We have a new generation of beautiful emerging technologies based on the internet that includes the iPad, smartphones, etc that address only unstructured data. Increasingly, there is value for large legacy systems (more so in healthcare than in manufacturing) to connect with all these new devices.

The dream for Cordys 10 years ago had been to go beyond ERP, and it has taken 10 years to build a business process platform. This is focused on a completely mobile platform that is able to leverage both structured and unstructured data. In this new approach, data is no longer king . . . business process is.

In tomorrow’s world, we will no longer be employees, but knowledge workers in the supply chain. This will take the form of a business process layer that can link all of the system silos, with integration via SOA. A wealth of possibilities exists as a result. Why not have an order-to-cash collection process and link it with unstructured data rules (like Facebook in a CRM environment), while keeping it all compliant? We need to find ways to link collaborative and social technologies to process - linking something like a Google Wave to a business process, for example, using both structured and unstructured data. This is the promise of using technology to mirror how people naturally interact.

Today there is a lot of discussion about the transition to SaaS, PaaS and IaaS. On top, we will have Business Process as a Service. We need to follow the commoditization of services. One supposes you could open legacy systems to view the processes, but you end up having one view with one process. In the future, we will access these systems from the cloud.

We realize enterprises have to combine different technology waves. Suppose we take the analogy of the automobile. In the auto industry, there have been different technology waves, but there are still wheels, an engine, etc. Here the idea of combining an old engine with a new dashboard comes together. It is the same for the cloud, where we see the cloud and on-premise worlds combining.

ERP applications were never built for the internet

The business process layer will link together underlying applications as services

We review our conversation with Jan Baan


Currently, we still have an on-premise world that needs to be combined across silos. Technology systems are still very business centric, and the goal is to put business in the driver’s seat, controlling process. At the enterprise, business processes have been standardized for compliance, but this is not flexible for tomorrow’s business needs.

In a typical business entity, there are many different silos. These silos incorporate structured data in multiple areas: ERP, CRM, product lifecycle management (PLM), financial systems, logistics and more. The business process layer can link together all the silos, using SOA to integrate - one rule in a business process.

Tomorrow’s world is no longer about employees doing work directly related to a company. This is about knowledge workers participating in a supply chain. Workers will have their own devices which have access to their corporate resources and will be able to access data from mobile internet connections.

For business processes, it is critical to follow steps in the appropriate order. Process requires linear steps. Workflows, such as email, do not follow the same pattern. A critical language enabling the rise of the cloud is business process language. Business Process Modeling Language (BPML) has been standardized for 10 years. It used to be that the human element was missing in the BPM layer. We can now use BPML not only for compliance and system-to-system communications, but also for human-to-human communications. The evolution of software will allow technology to mirror behavior rather than vice versa.

Mass customization is the next wave of software
The goal is to preserve existing investments, while providing a layer of flexibility on top of them. Like a mobile phone, the complexity underneath will be hidden as complex processes are combined into “mass custom apps”. For example, the iPhone external case may change with each version, but the underlying functional components are consistent.

We are still driving our software “cars” based on what is under the hood. Though the body and the dashboard may change, a car may have the same wheels and engine as 10 years ago. A similar case will hold in software. We need to look at SAP, Oracle and other applications to respect and preserve the value of their complex capabilities.

In the past we had logic which was stored and concrete, but in the future knowledge workers will be able to harness this logic in a better way - as services. We can build a web service and weave it together with others to combine commodity functions at the process layer.

The crucial element in IT is complexity, which creates costs and inefficiencies. What we need to be able to do is commoditize complexity down to simple elements that can be decoupled and recombined with other components. Customer auditing is a decoupling point.

Think of creating software like manufacturing an airplane. Airplanes have been manufactured in a similar way for decades. Commodity components are built from all over and the plane is assembled in one place in the same way, while the individual configuration is different.

Traditional enterprise applications have been discrete “silos”

Goal is to preserve existing best practices while adding flexibility

Software will see differentiation in the assembly, not the manufacturing


The analogy for software is that we will see differentiation in the assembly not the manufacturing. Not only will organizations build Software as a Service, they will build the business itself. Tomorrow’s world will be one of mass customization, where users will have the ability to dynamically change components of a system. Essentially you will be able to make “mash-ups” out of complex processes.

The way you achieve mass customization is to take vanilla, traditional systems and make them into web services, delivered over the web with customizable interfaces. Baan believes Cordys was early to the market, but that people are getting it now.

Commoditization preserves best practices
Commoditization means there is respect for best practices and for best-practice components. What is meant by commoditizing software is to bring together all of these underlying application silos as a process for the information worker. In this way, the complexity is hidden underneath a layer of process. Gartner calls this the ‘meta of the meta’ layer.

This approach is really the only way to simplify everything. In this model, the underlying information is hidden and the process is using information in a way that the commoditization allows it to. Since they are commoditized as components, there is no need to change the underlying applications, the user doesn’t have to worry about how it comes together and there is no need to have engineers customize when systems are built on best practices.

The challenge with the cloud is to use the inherent strengths and underlying processes of legacy systems, make them multi-tenant and cloud-enable them. Basically the vision is to make web services out of legacy systems and integrate them into the business process layer. Commodity components will enable projects to be created with no new lines of Java code, with the processes engineered in the “meta-layer,” leveraging best practices codified in legacy systems all the way down to the databases.
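A schematic sketch of the idea: legacy functions are wrapped as callable services and a process definition strings them together without touching their code. The service names and steps are invented and this is not Cordys' implementation:

    # Toy "business process layer": legacy functions are wrapped as services and
    # a process definition strings them together without touching their code.
    # Service names and steps are hypothetical.

    def check_credit(order):          # imagine this wraps a legacy ERP call
        order["credit_ok"] = order["amount"] < 10_000
        return order

    def reserve_stock(order):         # imagine this wraps a warehouse system
        order["reserved"] = order["credit_ok"]
        return order

    def send_invoice(order):          # imagine this wraps a billing system
        order["invoiced"] = order["reserved"]
        return order

    ORDER_TO_CASH = [check_credit, reserve_stock, send_invoice]   # the "process"

    def run_process(steps, payload):
        for step in steps:
            payload = step(payload)   # each silo is invoked as a service
        return payload

    print(run_process(ORDER_TO_CASH, {"id": 17, "amount": 4_250}))

Changing the business process then means reordering or swapping the steps, not rewriting the underlying applications.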

Business Process as a Service
For the knowledge worker, this new model requires that the business user/developer only needs to manipulate the business process layer. This allows the user to build upon the existing investments in IP and orchestrate a higher level of processes that are transparent to the end user.

At Baan Software in the 1990s, there were 2000 engineers working on various versions of the software. Programmers of yesterday made “spaghetti” with all of the customized code. This is analogous to a car where friends come in and change everything. In the cloud, there are new efficiencies because there is only one version of the software.

There is no longer room to build big, monolithic enterprise applications. The differentiator nowadays is content from the unstructured data world. It is not functionality that is the driver, content is now the driver, and this includes both unstructured and structured data. Content itself is stored under the umbrella of the business process layer. In the past, it was logic that was stored and concrete - now logic becomes used as a component.

Commoditization in software preserves best practice and enables re-use

In the future, business users will be able to manipulate the process layer, not underlying code


There are still many complex common best practices. In the past, it was about functionality. Now we like to incorporate unstructured content into the process. For the first time, we can bring unstructured rules and content into the process layer.

The new generation of development languages is 5GL, which increasingly takes the developer out of the process. Instead, the business drives the use of commodity components, decoupled as services that have already been established. The challenge is how to make legacy on-premise systems multi-tenant to bring them into the cloud.

The carrier for this new wave of enablement is the internet. In the 1960s, data was king. Today it is the “meta-meta model” that enables collaborative process. With the internet as the delivery mechanism, the end device will be whatever the user likes. The crucial element is how to decouple different technology components. Collaborative workspaces will bring together different models under one view. With the XML standard, software itself can become increasingly based on components.

IT is now participating in business decisions. Previously, IT was regarded as a “necessary evil” for the business, but with increased flexibility from the cloud to orchestrate and integrate processes, IT is becoming an increasingly strategic part of the business.

In coming years, the sun will set on mainframe technology. However, client-server technology can continue to be useful for the next 20 years if we can decouple processes and maintain on-premise and in-the-cloud models.

5GL development languages allow businesses greater input in the process

The sun will set on mainframe technology


Willem van Biljon, Nimbula
Willem van Biljon is a senior technology executive and entrepreneur who started his career building a UNIX-based operating system for mini-computers and the first retail debit-card payment system for one of the largest retailers in South Africa. Building on that expertise, he co-founded Mosaic Software to build the first high-end payment transaction switch for commodity hardware and operating systems. Mosaic became one of the world's leading electronic funds transfer companies with operations in more than 30 countries and was successfully sold to S1 Corp in 2004. Willem then joined Amazon to develop the Elastic Compute Cloud (EC2) service business plan and to drive product management and marketing for the service.

Willem is a graduate of the University of Cape Town.

The next layer of cloud automation
Willem was part of the team that built the Amazon EC2 cloud, and his new company, Nimbula, is focused on bringing a new layer of automation to a broad vision of cloud-based infrastructure. Our discussion focused on the next generation of cloud-infrastructure management as organizations increasingly seek to apply policy and provisioning in a dynamic fashion across public and private clouds.

Key points
There will be a change in how people purchase computing resources, driven by the desire for flexible use of hardware, cost drivers, how applications are deployed, underutilization of servers and the desire to change how to pay for software.

A new breed of cloud-infrastructure software will be able to automate the orchestration of cloud resources, policy management and federation between clouds, resulting in lower cost of implementation and management of datacenters.

Automated and federated systems that orchestrate cloud services will reduce the cost and complexity of IT, making it more transparent.

The continuum of new infrastructure and payment models allows for a stream of innovation that previously would have been blocked by investment hurdles and management requirements.

Projects or investments that might have been unthinkable in the past because of resource constraints will suddenly become trivial.

‘Previously, IT was task based, and hardware was purchased to solve a particular task or problem. If the task didn’t warrant the cost of buying five to 10 servers or whatever the costs would be, then it never got done. If it now becomes trivially easy to do it, you are going to find pent-up demand for all of these tasks that really make sense to do now.’

Summary of interview on 2 August 2010.

Full transcript follows

Nimbula is focused on creating a new layer of cloud infrastructure automation

Automated and federated systems that orchestrate cloud services will reduce IT cost and complexity


Willem van Biljon transcript
Ed: We are very interested to get your views on the evolution of software, networking, technology, and understand a little bit about what has been behind the founding of Nimbula and your vision there.

Willem: When I speak to people I talk to them about what I see as an inflection point in the way computing is happening at the moment. That inflection point is being driven by a whole bunch of influences. The one is just generally the cost of compute that enterprises face and the possibilities of significantly lowering that cost. That is the major thing that EC2 has shown by providing cost on a per-hour basis and at a pretty low point. That has shown that the cost of compute has some way to go down. That is one driver.

The other driver is the tremendous underutilization of datacenters today. That is a problem that virtualization has tackled and successfully so, but there is still some way to go. The reason is that workloads tend to be incredibly static within datacenters, even virtualized datacenters. You still manage the virtual machines as if they were real machines. You connect them and you get them up and running and then they stay there.

The third element is just the way that operating systems as well as datacenter environments have been built over the last decade or so and maybe even longer. Generally, people purchase hardware for a purpose, for an application. When you want to run an exchange server, you buy a machine or machines for it. When you want to run your financials, you buy a machine for Oracle financials. You tend to buy hardware for the application, instead of if you think back to the invention of time-sharing systems, machines were there for any variety of applications. The reasons why that happened are that hardware was cheap enough, applications consumed all of the hardware and sometimes didn’t. You had to make sure there was scalability and there were some security concerns, etc.

As we go forward and as we have virtualization to help us, I think there is going to be a change where people want to buy hardware that they can multipurpose for any number of things that change over time. There is the cost driver, there is the how applications are deployed driver, there is the underutilization driver, and there is the how people want to pay for software driver.

If we think back to the 1990s or the early 2000s, it was still very much “buy a license and pay 20-odd percent per annum for maintenance” on what tends to be a perpetual license. That has changed over the last decade to a model where there is much more of a recurring license fee or a subscription fee. In some cases it is even pure software as a service. I think there is this need to pay for software on a per-use basis rather than this huge capital outlay and use it forevermore.

All of these things conspire to change the way people want to compute and are going to compute in the future. That is a driver that we see in building Nimbula, to say ‘hang on, there is a need to build an infrastructure that sits on top of lots of machines that make it easy to spin out workloads as needed and spin them down when you don’t need them anymore.’ You get better utilization of the infrastructure, you get applications that are used when they are needed in a temporal way and they are spun down when you don’t need them, giving rise to better utilization of the datacenter.
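A toy version of the spin-up/spin-down behavior van Biljon describes - sizing a pool of instances to demand so capacity is used when needed and released when it is not - might look like the following; the capacity figures and scaling rule are invented for illustration:

    # Toy autoscaler: size an instance pool to demand so capacity is used when
    # needed and released when it is not. Capacity figures are hypothetical.

    CAPACITY_PER_INSTANCE = 100      # requests/sec one instance can serve
    MIN_INSTANCES, MAX_INSTANCES = 1, 50

    def desired_instances(load_rps):
        needed = -(-load_rps // CAPACITY_PER_INSTANCE)   # ceiling division
        return int(min(MAX_INSTANCES, max(MIN_INSTANCES, needed)))

    running = 1
    for load in [80, 450, 1200, 300, 40]:                # demand over time
        target = desired_instances(load)
        action = "scale up" if target > running else \
                 "scale down" if target < running else "hold"
        print(f"load={load:5} rps -> {target:2} instances ({action})")
        running = target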

Ed: Obviously there are economies of scale for datacenters in a cloud computing environment, but do you see critical mass in terms of size that may act as a barrier to entry for cloud-service providers or enterprises to adopt some of these technologies?

Willem: Yes, economies of scale do matter. They also improve the bigger you are, but that is true for many things. That doesn’t mean to say even if you are small, you can’t benefit from doing things better. With a small datacenter, you may not be able to get the unit costs as low as people who have large datacenters. But you can still do it at a cost point that is much better than you are doing it today. The reason is cloud infrastructure (even for the enterprise in the private-cloud environment) provides far better operations.

We focus a great deal on making sure we can manage the scale of the infrastructure, but also making sure we drive the automation of the infrastructure. Automation drives costs down and makes it possible to get these workloads up when you need them and take them down when you no longer need them. We see that as really reducing the cost for everybody.

Ed: Willem, when you talk about workloads, I would love to get your perspective on whether there is a need to optimize an infrastructure to handle different types of workloads. There is a range of types of applications that run on the cloud, from social-networking applications to traditional enterprise applications. I am curious of your view to how the datacenter of the future might adapt to the characteristics of those workloads.

Willem: That I think is one of those crystal-ball questions, but let me give it my best shot as what I think the trends are. Today there still are applications that belong to the traditional legacy kind of datacenter, the big financial processing applications for example.


Prior to Amazon, I had a stint in the payments business, so the kind of thing that processes lots of Visa or MasterCard transactions. These are applications that are decades old and have sat on the same hardware for a long time.

Today, and for some years to come, they probably will live in that kind of hardware. I do see them eventually moving, and moving into this much more agile and flexible infrastructure where the unit of compute is maybe a smaller server than the very big guy used, but you have many more of them. Of course, application re-architecture is required to do that. I don’t think there are significant, specific barriers to saying certain applications can never move there, but there is an enormous amount of inertia for applications to go there.

I don’t think applications that are being built today are being built in the same way. I think they are being built for the new environment. They are being built for this scale-out philosophy, with the idea that today I might have five machines to run on, tomorrow I might have 50, and the next day I might have 10 just as the demand and needs for work to be done change. I think there is a trend in that direction. I don’t think there is anything absolutely specific in the datacenter that needs to be done, but I think there is inertia.

What in the datacenter is required for specific workloads? I think there are some that have some specific needs; some workloads need to be very close to storage whether it be fiber channel, whatever it may be. Others need to be co-located on very fast networks. You might think of the very fast computing world. I see those as challenges, not necessarily for specifically how the datacenter should be built. You can build datacenters to take account of those things, but they are challenges in how the application placement algorithms will work in the future.

How do I specify when I want to fire up an application or a workload to say, this workload needs the following things: it needs to be near the fast storage, it needs to be close to certain other application, and it needs not to be close to others because it is a high resilience situation; and the placement algorithms take care of that. For us, for Nimbula, that is key and the place where we are doing a great deal of work because we think it is an important part.
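To make the placement problem concrete, a minimal filter-and-score sketch under assumed host attributes (fast storage, running workloads) follows; it illustrates the general idea only and is not Nimbula's algorithm:

    # Toy workload placement: filter hosts on hard constraints (e.g. fast storage,
    # anti-affinity with named workloads), then prefer the least-loaded host.
    # Host attributes and constraint names are hypothetical.

    HOSTS = [
        {"name": "h1", "fast_storage": True,  "load": 0.7, "running": {"db1"}},
        {"name": "h2", "fast_storage": True,  "load": 0.3, "running": set()},
        {"name": "h3", "fast_storage": False, "load": 0.1, "running": {"web1"}},
    ]

    def place(workload, hosts):
        candidates = []
        for h in hosts:
            if workload.get("needs_fast_storage") and not h["fast_storage"]:
                continue                                   # hard constraint
            if h["running"] & set(workload.get("anti_affinity", [])):
                continue                                   # keep replicas apart
            candidates.append(h)
        if not candidates:
            return None
        return min(candidates, key=lambda h: h["load"])    # soft preference

    wl = {"name": "db2", "needs_fast_storage": True, "anti_affinity": ["db1"]}
    print(place(wl, HOSTS)["name"])    # -> "h2"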

Ed: The ability to orchestrate a context of environment in terms of where and how that application stack is deployed across a network of cloud resources is really the challenge that you are addressing?

Willem: I think the operating system analogy works well here. If you think about the traditional operating system, Linux, Windows or even a mainframe OS, it has a number of jobs. One is to share the resources and to make sure that applications that need resources can get those resources in an efficient and rapid way. When you are one machine, that has some challenges; but when you have many machines with slightly different capabilities on each actual individual box, then that becomes a more challenging task because now where you actually execute that application has a significant impact on whether that application can get hold of the resources it needs.

Ed: The idea of a cloud operating system is not necessarily as tied to the applications or the programming languages of each application, but is more closely tied to the specific resource and performance requirements of each workload.

Willem: Absolutely and we don’t see that necessarily in what the application is written or how it is architected. It is fundamental to this. Of course, applications that run well in this environment are generally architected for this scalable, agile environment, as I pointed out earlier.

If you have a very large, monolithic application that was built for scale up, then it is not ideally suited for multiple servers that grow and shrink as you need more or less CPU power. However, applications that are built that way can be built in any language with any kind of distributed processing paradigm in mind.

You are absolutely right. There is an orchestration level control plane that sits above the hardware and above the hypervisors that manages all of that resource. To take the operating system analogy further, the other thing an operating system does of course is that it makes sure that the applications have the permissions and authority to get at those resources and get at the data they want and get at the networks they want. Some of these things need to be built for this cloud operating system to make sure that it manages the releasing of access to the various pieces of resource in the cloud.

Ed: This is becoming a much more critical issue with this distributed architecture or architecture of network resources. The need to ensure that the data is housed or curated in the appropriate domain with the appropriate controls and access controls when you move beyond a pure physical architecture into a more distributed approach.

Willem: Absolutely, which is why for a long time coming, we will see hybrid approaches. The public cloud clearly is a very important change in the way computing happens. EC2 has been tremendously successful and provides a fantastic service, but there are some workloads and some data that companies might just feel they can't put out there yet.


The reasons are many. It might be regulation, it might be data privacy issues, it might be comfort level, it might be internal policy. But whatever the reasons might be, there will be those things that companies feel uncomfortable with putting in the public cloud and want to keep locally. Even when they run it locally as I have argued in this discussion, they can still benefit from using cloud architecture there.

The question becomes, well . . . where you run the architecture is no longer a question of is it in the public cloud or is it in the private cloud - it is just who owns the hardware. That becomes a moot point. It becomes a policy question. Does my policy allow me to be in place X or does my policy allow me to be in place Y, depending on what the particular constraints are that I have around this application.
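The "policy question" van Biljon describes can be sketched as a rule check over workload attributes and candidate locations; the tags, locations and rules below are assumptions for illustration only:

    # Toy placement policy: decide whether a workload may run in a given location
    # based on data-classification and residency rules. Tags are hypothetical.

    LOCATIONS = {
        "private_dc_frankfurt": {"owner": "internal", "region": "EU"},
        "public_cloud_us_east": {"owner": "provider", "region": "US"},
    }

    POLICIES = [
        # (predicate on workload, predicate on location, allow?)
        (lambda w: w.get("pii"),     lambda loc: loc["owner"] == "internal", True),
        (lambda w: w.get("pii"),     lambda loc: loc["owner"] != "internal", False),
        (lambda w: w.get("eu_data"), lambda loc: loc["region"] != "EU",      False),
    ]

    def allowed(workload, location_name):
        loc = LOCATIONS[location_name]
        for w_pred, l_pred, allow in POLICIES:
            if w_pred(workload) and l_pred(loc):
                return allow
        return True            # no rule matched: allowed by default

    wl = {"name": "payroll", "pii": True, "eu_data": True}
    print(allowed(wl, "private_dc_frankfurt"))   # True
    print(allowed(wl, "public_cloud_us_east"))   # False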

Ed: The physical aspects of the cloud are going to be critical to policy and orchestration of workloads. Do you see other factors such as energy resource intensity or the availability of low-cost power or low-cost IT overhead becoming either more or less important than governance in locating?

Willem: To an extent, all of the above because certainly governance will be a key issue and regulatory elements will be a key issue. We already know that various countries and conglomerates of countries like the EU have specific data privacy regulation that make some of those decisions for you, but there are many other elements. Simply from a compute perspective, we know that compute needs to be close to the data.

There are some great papers that argue very convincingly that you really cannot effectively compute if your data is removed with any great distance from where the compute happens. Latency is going to be an issue. Latency to get to your data and there you are up against the speed of light, so we have a fairly hard stop on what the capabilities there are. There will be a number of factors. Where is the data generated? How big is the data? Regulatory things we have mentioned and then of course for the larger datacenters, the other things you mentioned. Access to low-cost power and so on will be influential, but I think they will be influential in the big centers.

Ed: Are there specific types of innovation that this evolution of cloud infrastructure could be particularly conducive to encourage? We visited Amazon’s Web Services Day for start-ups and entrepreneurs here in New York and when you hear more and more stories of companies that are now launching their own initiatives or applications with incredibly low costs, it seems that there is a potential ripple effect throughout technology and the economy from the availability of cloud computing resources.

Willem: I agree. Could I characterize which are the ones that are going to be great for innovation? I don’t think so. I think we will be surprised, as one always is, at which of the great ideas get traction and which of the ones you thought were great don’t. What I do think is very significant here is the massive reduction in cost and the removal of the capital-expenditure barrier. It is very true and very visible for the startups in the public cloud, but the exact same phenomenon happens inside organizations once they get the friction of getting compute out of the way - when it becomes really easy to get a new application up and running and the cost of doing so is low. I think jobs or compute tasks that previously did not make sense to do are going to start seeing the light of day.

Previously, you generally bought hardware for tasks. If the task you had just didn’t warrant going out to buy five servers, rigging them up, getting the operational people around and so on, then it never got done. It might actually be a reasonably important task that just doesn’t obviously warrant that effort. If it now becomes trivially easy to do, you are going to find all of these pent-up demand tasks that really make sense to do now. I think the companies that start doing them are going to find competitive advantage when they run these analysis jobs, search jobs, or whatever they may be, that previously didn’t make sense to do.

Ed: A lot of these are going to be either highly compute-intensive or data-intensive tasks that may not have been practical in the past because of constraints on compute availability, the size of the datasets, or a combination of both.

Willem: Potentially - I think there is a range. I would be loath to predict whether it is one or the other or a combination. From the discussions we have had, and from the behavior people describe - as you pointed out just now - in how startups behave on EC2, I think there is definitely a trend toward things that just didn’t make sense two to four years ago suddenly making sense. That is opening up innovation in directions that previously weren’t possible.

Ed: It is hard to predict the future. Nobody 10 years ago could have anticipated the rise of a Facebook or a Twitter. One of the common themes we have been hearing in our conversations is that this evolution is really a wave toward cloud-enabled computing, and it is having some really fundamental ramifications for what types of applications can be created, and for the scale and speed with which applications can be put together. One of the examples Dave Cohen cited was an application designed to find earthquake survivors in Haiti using their cell phones. It was put together in less than three days and had global reach, which is pretty amazing.


Willem: The mind boggles at the kinds of things that suddenly find a home and find a way of getting done. To spin back to your earlier question, it all comes back to a very fundamental change in how people view compute, or computing infrastructure. Today it is viewed as an event-based system. There is an event - I want this application - and therefore there is an event that makes me go out and buy a bunch of servers. There are controls around that which make it very expensive to do, both in corporate time and in capital outlay.

When you make this a continuum, when you are paying by the drip or by the use, whether that payment is internal or external, it becomes much more of a continuum. You get this stream of innovation that previously would have been stopped at this wall of ‘I have to get this event going by buying some servers.’

Ed: The analogy that just came to mind is the industrial revolution and the increase in industrial output that was enabled by electric utilities. If you think about a mill producing wheat flour, in the past you had to be located next to a river and the water wheel would turn and grind the flour. As soon as you have machines you can plug into the wall, with electricity coming from a utility, that has a fairly dramatic impact on the ability to create value.

Willem: Great analogy.

Ed: I think that is what you guys are seeing. I would be curious to get just a little more insight on what Nimbula is doing and where you guys are seeing the opportunity for your technology.

Willem: If you take the trends that I mentioned and put them together, the need we see is to build this infrastructure, or this cloud-based operating system if you will, that sits on top of the servers in the datacenter and enables them in a way that is more agile, easier to use and more rapidly deployable for the enterprise.

A number of functions are key to this. We want this to achieve very rapid scale. It should be able to start small, but it should be able to scale to a very large number of nodes in the datacenter. Together with that comes a huge amount of automation, because automation reduces costs, takes human intervention out of the loop and increases agility again.

The third element is to make sure that we put in place all of the permissions and the policy-management systems. Is user A allowed to run this application on this piece of hardware? Is user A allowed to run this application on a remote piece of hardware?

This gets into the fourth element, which is federation. We really do believe that it is going to be a hybrid model going forward, so we are also building federation into public clouds like EC2. If you are a user of the Nimbula software in your own datacenter, not only can you say ‘take this virtual machine and launch it here in my datacenter’. You can also say ‘no, launch it in EC2, but before you do, please check all the policy and make sure that the user making that request has the authorization to do that.’ The organization retains policy control and we achieve that goal of policy becoming the driving force for where things can run, as opposed to who owns the hardware.
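
As a hedged illustration of that policy idea - who may run which application, and in which location - the check could reduce to something like the following Python sketch. The users, applications and rules here are invented for illustration, not Nimbula's actual policy model.

```python
# Illustrative policy table: which users may run which applications, and where.
# In a real system these rules would come from a policy store and an identity
# provider rather than a hard-coded dictionary.
POLICY = {
    ("alice", "payroll"):   {"local-dc"},         # sensitive workload stays on-premise
    ("alice", "web-app"):   {"local-dc", "ec2"},  # stateless tier may burst to EC2
    ("bob",   "analytics"): {"ec2"},
}

def may_launch(user, app, location):
    """Return True if policy allows `user` to run `app` in `location`."""
    return location in POLICY.get((user, app), set())

# The orchestrator asks the policy question before placing the workload.
assert may_launch("alice", "web-app", "ec2")
assert not may_launch("alice", "payroll", "ec2")  # policy keeps payroll local
```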

Ed: That is a vision of a dynamic resource regardless of where it is located. Your vision is to span that public and private cloud in order to offer the ability to apply that policy regardless of the underlying cloud layer.

Willem: Exactly, and to get there, of course, we need to enable the internal infrastructure as a cloud system, so you have the same kind of capabilities in your internal datacenter that you have with EC2 today. You should be able to launch and spin up workloads, very rapidly spin them down, and create networks in very dynamic ways, as opposed to the very static way they are done with VLANs today. Our product does that today and we are currently in beta with it. As we move into the future, we will move into federation with the other clouds.

Ed: It sounds exciting. Your view is based on an API that interacts with the virtual datacenter. Your API really works in between the applications themselves that are being deployed and the underlying cloud infrastructure.

Willem: Yes, the users of the datacenter - the developers or the administrators that deploy applications into the datacenter - would use our API, either in its raw, programmable, RESTful form to send commands, or through the GUI that uses the same API underneath, or through a command-line interface that we provide that also uses that same API.

You would use one of those three mechanisms to say, ‘I have this workload; please fire up 10 of these on 10 machines in the datacenter. Or take two of these, three of those and four of these, and make sure that these two run on smaller CPUs with a little bit of memory and those five run on very large machines with lots of memory, and deploy them end to end.’ We will find the appropriate resources within the datacenter, connect the correct networks together, connect them to storage, and start them up.
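
Nimbula's actual endpoints and request format are not given in this conversation, so the following is only a hedged sketch of what such a launch request might look like through the RESTful mechanism Willem describes: a JSON description of a workload posted to a hypothetical orchestration API. The URL, field names and token are invented for illustration.

```python
import json
import urllib.request

# Hypothetical orchestration endpoint and credentials (illustrative only).
API_URL = "https://cloud.example.com/api/launchplan"
TOKEN = "user-a-session-token"

# Describe the workload: a mix of small and large instance shapes, mirroring
# the "two on smaller CPUs, five on very large machines" idea above.
plan = {
    "instances": [
        {"image": "web-frontend", "count": 2, "shape": "small"},   # little memory
        {"image": "analytics",    "count": 5, "shape": "xlarge"},  # lots of memory
    ],
    "network": "app-tier",   # the orchestrator wires the instances together
    "placement": "policy",   # let policy decide local datacenter vs. federated EC2
}

request = urllib.request.Request(
    API_URL,
    data=json.dumps(plan).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer " + TOKEN},
)
print(urllib.request.urlopen(request).read().decode("utf-8"))
```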


Ed: Willem, I would love to hear if you think there are any obstacles or risks as we look ahead a couple of years to realizing this vision and promise of the cloud architecture and what might be the challenges that users or technology providers may face.

Willem: One of the challenges is what to do with all of those large datacenter applications that were built, very expensively, over a long time - a big finance application or transaction-processing application or whatever. Many of those are naturally coming to end of life. You can keep them up for a while, but at some point the cost-benefit case forces a decision, and there is re-architecting required for many of those old mainframe applications. The rate at which that happens is still uncertain, and that is going to be a question as to how quickly those parts of enterprise datacenters can move.

I think the new parts of enterprises’ datacenters are moving, and the new applications being developed make use of the cloud infrastructure. I am not concerned about the risks there going forward; it is a great growing market. I do wonder how long we are going to live with those legacy applications.


David Cohen, EMC

Dave Cohen is responsible for the overall architectural-design direction for EMC's cloud infrastructure division. With 2008 revenues of US$14.9bn and approximately 40,000 employees, EMC is the world's leading developer and provider of information-infrastructure technology and solutions that enable organizations of all sizes to transform the way they compete and create value from their information.

With a career spanning over 25 years, Dave recently joined EMC's Cloud Infrastructure Group after spending the past several years working in the financial-services industry, most recently at Goldman Sachs, where he was a senior storage strategist. Prior to this, he was employed by Merrill Lynch, where he focused on low-latency and utility computing.

Leveraging the growth of cloud infrastructure

The term “cloud computing” includes offerings that have been in place for some time. Online advertising, ecommerce and HR services are examples of well-established offerings delivered over the cloud. Gartner estimates the current market is already approximately US$46bn using this definition. However, most tend not to think of these services as cloud computing. IDC estimates the market for cloud services, excluding online advertising and ecommerce, will grow to US$42bn in 2012.

In basic terms, cloud computing can be thought of as a pool of resources, dynamically allocated and accessible through a common application programming interface (API) - essentially a set of software “hooks” for users to connect to. Hardware commoditization and virtualization have converged with grid computing and SOA to form the basis of cloud computing. Multi-tenant architecture enables a scalable infrastructure that may be accessed as a service. SaaS, where the economies of scale are exploited, is possible because cloud services are elastic, available on demand and measurable.
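
To make the idea of "a pool of resources dynamically allocated through an API" concrete, here is a minimal, hedged sketch using the boto library's EC2 interface; the AMI ID is a placeholder and AWS credentials are assumed to be set in the environment.

```python
import boto

# Credentials are read from AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY.
conn = boto.connect_ec2()

# Ask the shared pool for one small server; the AMI ID below is a placeholder.
reservation = conn.run_instances("ami-12345678",
                                 min_count=1,
                                 max_count=1,
                                 instance_type="m1.small")
instance = reservation.instances[0]
print(instance.id, instance.state)  # the resource exists within moments and is billed by use
```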

Figure: Microsoft’s “Datacenter in a box” - the Azure-platform appliance employs an Akamai-like vision for the cloud. (Source: Microsoft)


Driving intelligence to the edge of ubiquity

David Cohen brings a wealth of insight and long-term perspective to his views. Since we met when he was at Merrill Lynch, David has always had a deft sense for important new technologies and a deep sense of the broader societal ramifications of new innovations. Our conversation centered on critical innovations in cloud computing and agility that are transforming how computing is architected and applications are delivered.

Key points

- The biggest change in the next 10 years will be the continued evolution, expansion and coverage of the internet itself.
- The growth of connectivity drives equal growth in the number of perimeter devices, which drives growth in content and bandwidth demand.
- There will be massive expansion of connectivity to the edge of the network.
- More datacenters will be located at the edge of networks to mediate the balance of bandwidth utilization.
- As connectivity becomes planetary in scale, new opportunities will arise. It becomes easier to create collaborative applications on the fly.
- Large datasets enable greater predictive power through more statistically significant inference capabilities.
- There will be fewer companies capable of scaling platforms, such as Amazon, Google and Microsoft, but there will be more opportunity to innovate on top of the platforms and leverage their scale.
- At some point, digital content becomes a commodity as people reach a state where they expect to connect anywhere at any time and have reasonable service, just like water or electricity.
- At that point, the nuts and bolts of the infrastructure and the providers look a lot more like Schlumberger in terms of a more normal commodity environment.

Summary of interview on 21 July 2010. Full transcript follows.


David Cohen transcript

Ed: You are focused long term on the implications of the datacenter, the role for outsourcing and the potential for disruptive technology to change the way corporate computing is provisioned, maintained and accessed. In your view, what are some of the big changes we might see in the next decade, what are the key transitional elements for this transition to cloud architecture and what are the implications for users, organizations and potentially society at large?

David: The biggest change is the continued evolution, expansion and coverage of the internet itself.

Ed: The expansion beyond PCs and mobile phones to devices and sensors?

David: That’s right. As an example, the United States has committed US$7.2bn as a part of the stimulus package to the expansion of broadband around the country. In parallel, for the last several years and probably for the next several years, there is a huge capital investment in 4G networks and WiMax, etc. In the United States and around the globe, there is a massive expansion of connectivity out to the edge.

In parallel to this, certainly in the case of end-user input-output devices, this has transformed to wireless, untethered devices like the iPhone, Blackberry or netbooks. Things like this are very, very mobile. In the industrial space, these similar technologies are being packaged and tied together with sensor networks, point-of-sale devices, or some combination thereof. You have all of this data being collected, produced or consumed.

Another jump is in the sheer numbers: the distribution of people with cell phones is at least an order of magnitude greater than the number of people who have highly connected PCs. It is probably much higher than that. If you look at sensor networks, wireless point-of-sale devices and RFID data collectors, there is also a massive expansion from where we were with tethered devices in previous generations. We are talking about connectivity that is literally on a planetary scale.

In the United States, and I am sure this is true more broadly, there has been a huge decline in the number of doctors and nurses in rural communities. It is a real struggle. One of the rationales for investment in broadband was simply to put highly connected devices - medical instrumentation - out in these rural areas to enable a medical practitioner to perform the functions traditionally done by doctors and nurses. The idea is that the instruments could be maintained physically by a medical practitioner, but the data can be gathered and made available more broadly. Doctors, in a more regionally centralized position, could help practitioners give the appropriate care.

Ed: I think of it as analogous to the managed networking or managed security services model. You might have a domain expert whose capacity to analyze situations, diagnose particular problems and provide care is amplified by the connectivity at the endpoints.

David: That’s exactly right. You could imagine this being applied in other disciplines as well. Certainly there are examples of this already in the armed forces. People are using social-networking constructs to bring to bear analysts and people back in the US with soldiers on the battlefield. They are interacting in real time and there is quite a bit of data sharing going on within this collaborative context. If you think of all of these narratives, what you see is that there is no way for a single datacenter centralized in the United States to support all of this connectivity. What is happening is more and more datacenters are being pushed towards the edge where these devices are to better mediate the balance of bandwidth, bandwidth utilization and response times for the other end of the connection.

This is driving the massive spending that is going on in just servers for co-location facilities around the world. This is driven by anticipation of this additional connectivity. It is very, very clear that this spending is in the colo facilities arena.

Ed: So you really need to handle different types of architecture, distributed content delivery and distributed data structures. I am thinking about Hadoop, MapReduce, Cassandra and some of these larger databases, as well as the way Akamai has approached the problem of internet content delivery by having presences in so many local networks with cached content.

David: That’s right. If you think of Akamai as an illustration, they were very forward thinking at the time they came out. They have been very successful, but fundamentally they were a caching layer leveraging HTTP. They co-located their service in colo facilities that were highly connected, tier-one or tier-two connected datacenters around the world. I think they are in 40,000 points of presence today, but have nowhere near that many physical nodes in their infrastructure. It is an example of a service provider that is not actually the datacenter provider. They built their service by federating across many different datacenter providers, but they operate it as if it were a single virtual datacenter spanning the globe.

There is a trend here to replicate this. Microsoft announced last week a partnership with Fujitsu. They are going to be deploying their Azure appliances in Fujitsu datacenters. This is a different perspective on the same distribution problem. Microsoft doesn’t have enough datacenters to compete with Google and it is very difficult to build datacenters at a rate that can close the gap, so instead they are partnering with other datacenter providers. This gets them closer to that edge, where the content is, much faster. You could imagine others going after this same strategy.

Ed: Are there potential barriers for other entrants or service providers trying to scale if they rely on their own proprietary designs, or do open-source solutions lend themselves better to that kind of scale? Microsoft can certainly do that, and I did see the Microsoft-Fujitsu announcement. It seems that Fujitsu has about 90 datacenters globally. The idea of using a standardized framework like Azure to get some level of homogeneity that allows you to scale these datacenters with a modicum of replicability and economies of scale is an interesting concept.

David: Again, I think the Microsoft approach is incremental on the business model that Akamai has been so successful with. Think of that in the broader economic sense of the vertical integration that has happened since Cisco entered the server market. HP is now fully vertically integrated, Oracle is fully vertically integrated. Maybe not so obvious, but Microsoft and Google are both fully integrated because they do all the builds for their pod appliances.

You have five or six vertically integrated platform vendors that can deliver PaaS, SaaS or infrastructure as a service at the same scale as Akamai delivers content distribution. Once that starts to happen it is more and more difficult for others; it is a big barrier to entry to get that level of scale. You have to go up the stack - look at Cloudera as an illustration, or some kind of specialized algorithmic or data provider that couples itself to a particular data class, perhaps in the medical or genomic-sequencing arenas - and innovate in that way. That looks a lot more like an oil-and-gas environment, or a more traditional commodity environment, at that point.

Ed: You made a really intriguing analogy the other day when you described Oracle over the long term taking a Schlumberger-like approach to the business of corporate IT.

David: I think that’s what happens when you get to global scale and digital content becomes literally a commodity. People just expect to turn on anywhere anytime and get reasonable service, just like water or electricity. At that point, the nuts and bolts of the infrastructure and the providers who do that stuff look a lot more like Schlumberger and that supply chain that would supply into those kinds of markets.

Ed: What does this do ultimately to the vision of the enterprise application as either a monolithic packaged offering or a customized application? Historically we have had these large ERP applications that have been a gold mine for service providers and integrators to customize and maintain, but that model is challenged by the disintegration you get with a distributed or on-demand infrastructure.

It seems there is a lot of opportunity to create these composite applications that are just a conglomeration of services. This disintegration could create some opportunities for those that can help orchestrate them, but will also undermine the account control that some of the established vendors have been able to rely on.

David: I think if you look at Oracle and their Exadata appliance, Oracle is arguably one of the more successful large packaged-software vendors and they were able to transform themselves into a vertically integrated provider to address exactly that dynamic. If you look at how Facebook runs under the hood it is very much reliant on MySQL. Yes, Oracle owns MySQL, but how much revenue are they really getting from Facebook? It is probably not a huge amount. If SQL went the open-source way and Facebook way, Oracle wouldn’t be hurt. Neither would IBM DB2.

I think both of those companies and others will be delivering data warehouse and SQL services as physical appliances that you could deploy in service providers or in your own datacenter. Again, that makes it very difficult for those that don’t do that. But at the same time, I am old enough to remember when the mainframe was always going to go away and COBOL was going to go away. There are still people I know making very good livings working in those arenas. Enterprise environments change much, much more slowly. I do think there is a tiering effect happening, where you have these large distributed co-location facilities with huge amounts of innovation that is matching up with the innovation in the consumer and industrial untethered-device space.

Conversely, you have this operational tier of brick-and-mortar companies with huge investment. They have a different tolerance for risk and they move at a very different pace. I think there is a huge, healthy business that has been established with processes and relationships that is going to go on for a long time. They are going to coexist, but the issue is that the operational side - that second business - is a mature business now. It is not a high-growth business.

If you look at the cloud colo-facility area - and that article the other morning in the Journal was very clear - the growth in that segment of the market is huge. But is the growth in the traditional brick-and-mortar datacenter the same, or is there a differentiation in the growth rates between the cloud colo-facility infrastructures and those in the traditional datacenter? I think you would find there is a pretty stark difference in the purchase rates.

Ed: After reading the Wall Street Journal article (Spending Soars on Internet's Plumbing, 22 July 2010), I think the opportunity for innovation among these specialists who are providing those distributed datacenters is pretty exciting. Then you talked about some of the work Tilera and Quanta are doing with massive parallelism. The ability to bring high-performance computing concepts and resources to more users, with lower barriers to entry, could be hugely transformative to so many different areas of research and the economy.

David: Literally, when you look at the input devices and the growth in connectivity for cell phones, for cars, whatever, you have to see an equal growth rate in the core count in perimeter datacenters. It is not that different from when people started to use browsers and move content around. Think about the number of processors deployed in datacenters between 1995 and 2000. There was a huge jump and it was directly related to the browser and the change in how networked people accessed content. That trend is just continuing and it is what is driving the core count. So what enables something like the Tilera, Quanta or SeaMicro platforms to go in and compete head to head with traditional x86 servers? It is hard to believe you can call an x86 a traditional server, but it is true. That is the datacenter marquee product today.

Ed: It seems there is always a new round of disruption that creates a better cheaper, faster model in technology.

David: That’s right.

Ed: One of the other comments you made the other day that was intriguing was the emergence of the ability to put together these new applications in remarkably compressed time. The application to locate missing people in Haiti took just three days to put together using social networks and Smartphone interactions.

David: Yes, it is stunning. Traditional enterprise approaches to software development would have taken months to even get something of that scale and criticality off the ground. In the Haiti disaster, a few technology volunteers put together a global scale missing-person facility in a matter of days with wild success. It is definitely a dramatic departure from the norm.

Ed: Well, that just creates more evidence that there is going to be acceleration in innovation. If you think about how transformative the iPad has been in shifting the perspective of what a mobile device can bring to the enterprise in just a short amount of time. I don’t think I have ever seen anything that has had this quick and profound of an impact. It is still evolving and over the next decade we will probably see this play out several times.

David: Definitely, in my lifetime that has happened over and over again, and there is no reason it won’t continue.

Ed: You talked about the math that Google is using, and I guess this goes back to Shannon's information theory and the potential repercussions of the mathematics they are currently using. It is fascinating that there is such power in the ability to track, understand and replicate human behavior, and the idea of moving toward the singularity in 20 years is really compelling. What are the opportunities or the risks as we get closer to a world where all aspects of your life are turned into data that someone can access, model or use for either invading your privacy or marketing to you?

David: The intersect with Shannon is really around the notion that communication is a function of encoding and that that encoding is dependent on computation. That seems very abstract, but when you think about what the internet is - the basis in Ethernet, TCP/IP and other transports layered on top of that, and the applications, functionality and value that have grown out of it - there are really two things at work. One is the value derived by the end user from accessing information, between the user and a browser. From a smartphone, think of untethered devices or other types of devices that are now able to communicate or exchange data and then operate on that data at much faster rates and much lower costs. That trend has continued for many years with massive value creation.

If you look at what is happening now, that model is even morphing, taking it to another level with the notion of taking HTTP logs and deriving value. Google’s whole business model is mining HTTP logs and indexing web pages, and in that context Facebook and others follow the same business model. If you think of that approach, it really is two different areas of value. The second area - mining derivative data from the usage of content and then inferring behavior based on it - is really based in the context of Shannon and others, and the math is very much around statistical inference. That ability to make inferences is enormously enhanced by large volumes of data. The larger the sample, the better the inferences become.


Literally, the more connected we are, the more information is available statistically, and the better the inferences models can make about our behaviors. It’s not just our behavior: the relationships between documents we have left behind, our car, or whatever else has attributes that have been captured digitally can be used to make inferences about us or about others around us. That definitely is a trend that will continue to unfold.
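
As a toy illustration of why larger samples sharpen inference, the sketch below (assuming NumPy is available) estimates a behavioral rate from simulated event logs; the standard error of the estimate shrinks roughly as one over the square root of the sample size.

```python
import numpy as np

rng = np.random.default_rng(0)
true_rate = 0.03  # the underlying behavior we are trying to infer

for n in (1_000, 100_000, 10_000_000):
    events = rng.binomial(1, true_rate, size=n)       # simulated usage log
    estimate = events.mean()
    std_err = np.sqrt(estimate * (1 - estimate) / n)  # shrinks ~ 1/sqrt(n)
    print(f"n={n:>10,}  estimate={estimate:.4f}  std error={std_err:.5f}")
```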

Ed: Do you see any physical limitations or barriers that are going to need to be overcome over the next decade, whether it is energy usage, storage capacity or IO? Are there physical constraints that we may run up against as we start to see this massive build out?

David: I’d be remiss if I didn’t point out that there are always physical constraints. Obviously, power and energy is a huge constraint and a huge issue; it is going to become more and more of a scarce resource that is in huge demand, and that will play out in a big way. In terms of physical or technical infrastructure that needs to be built, the move in Ethernet from 1 gig to 10 gigs to 40-100 gigs is really a function of chasing this opportunity for massive content creation. Storing all of this information that is being created, for a variety of reasons, also has its limitations, and there are already very clear examples where the rate at which we are creating content far exceeds the ability to provision capacity. These are all things that will definitely be focused on, and I think there will be a lot of innovation in these areas.

Ed: It already seems that the interest in the distributed data management around Hadoop is certainly driving a healthy ecosystem of startups around the data warehouse world which had been quiet after a lot of the consolidation.

David: I think a lot of that data warehousing, especially the analytical facilities around those big data pools, is an area that is going to continue to see a lot of innovation.
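
For readers unfamiliar with the Hadoop/MapReduce model referenced above, this is a minimal single-process sketch of the programming pattern (map, shuffle, reduce) in plain Python; a real Hadoop job would distribute the same two functions across many nodes and far larger data.

```python
from collections import defaultdict

def map_phase(record):
    # Emit (key, value) pairs - here, one count per word in a log line.
    for word in record.split():
        yield word, 1

def reduce_phase(key, values):
    # Combine all values that share a key.
    return key, sum(values)

records = ["error disk full", "error network timeout", "ok"]

# Shuffle: group intermediate pairs by key (the framework does this between phases).
groups = defaultdict(list)
for record in records:
    for key, value in map_phase(record):
        groups[key].append(value)

print(dict(reduce_phase(k, v) for k, v in groups.items()))
# {'error': 2, 'disk': 1, 'full': 1, 'network': 1, 'timeout': 1, 'ok': 1}
```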

Ed: You made a comment a while back that the model of the Wall Street developer was linear. You had to hire more programmers in a linear fashion to scale the business. Outsourcing only got you part of the way, but the need to provide tools that would create a much more transparent and less technically demanding environment for development of new tools would really need to evolve. I would love to get your thoughts on the state of application development and how any of the innovations we may see might create the opportunity to democratize application development.

David: It’s a great area of a lot of interest for me. If you look across all of the different industries, the most valuable people in the organization in terms of their contribution to the top line - not just on Wall Street but literally across all industries - are more and more frequently the people who can analyze data quantitatively and take advantage of this massive uptake in information. Making those people more productive and being able to leverage them more broadly is really key. When we worked on Wall Street, it was increasingly the quant in the derivatives and algo-trading world who was the trader. To make that quant effective, you had to buy more and more development resources to support them and get them to scale. That is true in all industries where these talented people work. It is not a sustainable business model. Transforming the environment so they can do things more effectively themselves, or with tools around them, is an area where there is going to be huge innovation.

Ed: Another area I think is also transformative is education. Dave Gelernter, in Time to Start Taking the Internet Seriously, made an allusion to education as one of the next areas where we will see significant disruption. The traditional university will cease to exist in its current form; instead, a truly online university or learning experience may become the predominant model.

David: I definitely think that is true. There was an article that I read about education in the context of access to manuals and other training information. This was for soldiers, but you could see how this would apply broadly. Literally using smartphones and streaming video to allow a soldier access to training and self help in the field is a massive benefit for somebody who is remote who may be in duress. The ability to innovate in education and the importance of that is huge. There are going to be huge amounts of change there.

Ed: This ties into another theme that I have discussed a few times with Dave Kellogg at MarkLogic. Many of the media companies that have specialized in technical journals - content that exists in hard copies with very limited availability - are the companies that curate, cultivate and produce this content, and they are increasingly creating information services or subscriptions that provide access to it. What that also does is devalue the scarcity, because the content is delivered over the internet, and it democratizes that information. Maybe we will all end up getting smarter.

The one thing I worry about is Nick Carr’s argument that the internet, the interruptions of email and text messaging, and all of the modern appurtenances of our connected lifestyle have fragmented our attention spans and impacted our cognitive ability to remain focused and engaged in deep thinking. It is certainly an area of debate. I know we all fight the numerous interruptions to our day, but maybe working in finance you have to have a little bit of attention deficit disorder to begin with.

David: Watching my kids and how they use technology, I definitely wonder about it. At the same time when we sit down for dinner and have debates or talks, they actually amaze me in the depth of their thinking and in the way they approach topics. I think it is a fairly debatable topic.

Ed: It is native to the era and the environment they have grown up with, just like I think a lot of the innovation is going to be native to the people who have grown up thinking about mobile applications being always on and having that ubiquitous connectivity.

Dave, I really appreciate your time. Your insights are always much appreciated and I very much value this. After we spoke last time, I read some of Gelernter’s writings and his argument in Time to Start Taking the Internet Seriously about these cyber streams, which is really intriguing. I don’t know if that vision will ever truly play out, but it is pretty compelling.

David: He is an amazing guy.

Ed: What struck me about Gelernter’s work with his company Mirror Worlds was his fundamental premise that the way information has been organized on PCs and the internet mirrors some arbitrary decisions made by UNIX programmers in the 80s and 90s and that we need to organize a more narrative approach to organize information because that’s how our minds work. I think we may get closer to that with social media and the idea of being so connected to people on Facebook, for example. There was always that dynamic of losing touch with people after a few years of moving on from school, jobs, etc. There was that natural process of dispersing and you had to make an effort to stay in touch with people. Now we are connected to people in a way that is far more persistent than anything that we have experienced before.


Simon Crosby, Citrix

Simon Crosby was founder and CTO of XenSource prior to Citrix’s acquisition of XenSource. Previously, Simon was a principal engineer at Intel where he led strategic research in distributed autonomic computing, platform security and trust. He was the founder of CPlane, a network optimization software vendor, where he held a variety of executive roles. Prior to CPlane, Simon was a tenured faculty member at the University of Cambridge, UK, where he led research on network performance and control and multimedia operating systems. He is the author of over 35 research papers and patents on a number of datacenter and networking topics, including security, network and server virtualization, resource optimization and performance. In 2007, Simon was awarded a coveted spot as one of InfoWorld's Top 25 CTOs.

Virtualization - The foundation of cloud computing

The precursor to true cloud computing, virtualization allows for the pooling of server resources, independent of the underlying hardware. This vastly improves resource utilization and flexibility, but also increases the complexity of managing an increasingly heterogeneous infrastructure.

Virtualization refers simply to the abstraction and delivery of a resource; in practice it separates the logical layer of software from the underlying hardware. There can be many types, including server, desktop, application, presentation and storage virtualization. Server and desktop virtualization are enabled by the hypervisor, a piece of software that allows multiple operating systems to run concurrently on a host computer. The overall market opportunity for virtualization is expected to be one of the most robust in IT over the next several years, and management is expected to be the largest segment of the market.
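
To make the hypervisor idea concrete, here is a hedged sketch using the libvirt Python bindings (assuming libvirt is installed on the host and a guest named 'guest1' has already been defined); it lists the operating systems currently sharing the machine and boots another one alongside them.

```python
import libvirt

# Connect to the local hypervisor (Xen, KVM and others sit behind the same API).
conn = libvirt.open("qemu:///system")

# Operating systems currently running side by side on this one host.
for dom_id in conn.listDomainsID():
    print(conn.lookupByID(dom_id).name())

# Boot a previously defined but inactive guest alongside them.
guest = conn.lookupByName("guest1")
if not guest.isActive():
    guest.create()  # starts the virtual machine
```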

x86 Virtualization market forecast

(US$m)                                 2008    2009    2010    2011    2012    2013   2008-13 Cagr (%)
Hosted virtual desktops                  38      79     152     301     504     796             83.9
Server virtualization infrastructure    844     718     559     392     247     143            (29.9)
Server virtualization management        902   1,048   1,392   1,893   2,539   3,303             29.6
Total x86                             1,783   1,845   2,104   2,586   3,290   4,241             18.9
ICA/RDP                               1,305   1,401   1,499   1,604   1,715   1,844              7.1
Total virtualization                  3,088   3,246   3,602   4,189   5,005   6,085             14.5

Source: Gartner



Abstracting the concept of a desktop

Our conversation with Simon Crosby focused on the implications of virtualization technology and the adoption of cloud computing. Simon is a leading thinker, as evidenced by his experience as the founder and CTO of XenSource. He believes the notion of “the desktop” is about to change as advancements in virtualization decouple the components of Windows, freeing applications and users from their physical bounds. Simon discussed many thought-provoking topics and provided insights into the evolving landscape of hardware and IT administration.

Key points

- Today, the concept of a desktop is associated with a physical, locked-down device that is limiting in terms of the user’s mobility and invested personality.
- The rapid growth of connected devices and user personalization will continue to strain this concept of the desktop as the management burden intensifies.
- In the future, a desktop will no longer be associated with a device or an operating system. It will be an abstraction, establishing an identity and security context and delivering applications irrespective of operating system or device.
- Virtualization breaks Windows into its components and removes the need to bind applications to a single OS, but there remains ample opportunity to service the millions of Windows-based applications that will be in use for a long time to come.
- Over time, the notion of user identity will evolve to be federated among clouds by personal credentials, removing the need for IT to manage policies among multiple systems and devices.
- Storage and server vendors face significant challenges in the new computing paradigm as Amazon Web Services and Microsoft Azure become de facto vendors, delivering commoditized compute and storage as services.
- The IT organization’s vested interest will act as a resistance to cloud adoption, but companies are finding innovative ways to remove the burdens of IT at the margin.
- Open source is a moving target. It is constantly at the edge of innovation, moving at the behest of developers from multiple organizations. Its continual evolution will result in open source becoming more relevant over time.

Summary of interview on 24 August 2010. Full transcript follows.


Simon Crosby transcript

Simon: Let’s divide the world into two categories. For simplicity, there are the back-end applications that run the enterprise and then there are end users who consume applications to do productive work for the enterprise. Let’s start with the latter.

Today, most people sit with a Windows-based desktop and they work on applications. Most of those applications that they work on today are already old. There is a profound change that is going to occur both in the applications and in the desktop. Today, the concept of a desktop is associated with a physical device. That is what people see. They walk into their office and they have a PC on their desk. That is going to change radically.

The concept of a desktop is extremely limiting because it is in conflict with desired work behaviours. It is in conflict with the work patterns that companies expect. People are much more mobile. They work from home some of the time, and they might have to work at home when sick. They travel around the world and so on. What we need to do is transform the way people work by making them more effective wherever they can be effective, and we need to be able to allow them to be free. The constant in everybody’s life is access to computing.

Another major change is this idea of having a home PC and a work PC, which is kind of blurry too. A major trend that is happening (courtesy of the profusion of smartphones and other smarter devices) is that the user has an invested personality and invested state in a particular device. It might be their own and yet it will also be used for corporate activities. At the edge there is a vast profusion of different devices; it will no longer be just a Windows world, with very substantial numbers of smartphones and lots of newcomers entering the desktop market and the physical-device market in general. Apple and so on is illustrative of this. There will be new device types - iPads and various other tablet-type devices. Really, what is happening at the edge is this profusion of ways to get hold of computing.

The constants are those devices. Whether owned by the corporation or the employee, we need to give the employee secure access to corporate materials from anywhere at any time, so the corporation can be sure that its data is secure. The employees also need to be able to deal with their lives. I am inseparable from my computer because I use it to do everything. I plan where my kids are going next and buy movie tickets and whatever else on my computer. We need to be able to liberate users to also use computing for their own personal means. Arbitrarily locking down their work personality or work device and forcing the user to carry two doesn’t make sense either. That means we have to get into the business of being able to deliver enterprise applications to any device, independent of what the operating system is.

We need to give a consistent user experience, consistent security and policy for managing the security of documents and various other things the corporation works on. We need to be able to give the employee access to any application on any device and at the same time support the user in their desire to use that device for their own purposes. That is a very profound change. One of the significant concepts that comes along with that is that the employee will increasingly own the device. I work on my own iPhone and my own iPad and when I go on short trips I take all with me. Once you raise this possibility it raises enormous opportunities and challenges for existing IT practice and existing IT functions because those devices need no longer to be on the corporation’s books but more importantly they need no longer be managed by the corporation. They can therefore dramatically reduce their management costs.

Computers become consumer objects. Consumer objects are every kind of widget which happens to have enough pixels to support the function the user wants to be engaged in. All of those need to be able to access the corporation. That is a very central challenge, and the traditional ways of guaranteeing - or at least supposedly guaranteeing - accessibility, in other words being connected to a particular physical wire in a particular branch office or something, are certainly no longer valid. They were never secure to begin with, so the challenge becomes delivering to all of these devices on the assumption that you cannot trust the environment to which you are delivering the application. Security is a prime concern there.

The concept of a desktop changes too, because the applications the user is using will change substantially. Today the majority would be Windows based, but once I separate my Windows environment from the physical device and move to a hosted virtual desktop, the question changes. Do I really need those pixels on the desktop called Windows, or is Windows merely a runtime platform for the applications that need Windows? Do I also get applications from elsewhere? I do. I need access to the Windows-based applications running my corporation and I need access to SaaS-based applications that my corporation purchases from other providers. Here comes another very profound change, even as we deal with end users.

IT at the backend now has to support the concept of a desktop which is not a physical operating system or device, but rather is basically the conjoining, in a session when the user is logged on, of all of the applications to which the user is entitled or subscribed - Windows or otherwise - with the appropriate permissions and identities managed so that the user, when logged in, can gain access seamlessly and does not need to go outside the corporation to get to particular applications. It includes aspects of identity management, authorization and security management, and the contextual binding of all of those pieces together into a thing which you could logically consider a desktop, but which is really merely a session-based combination of access to different applications. With that comes the requirement that if I revoke the employee’s employment, all of those permissions have to disappear simultaneously.

There are substantial challenges, obviously: when data is in flight or resident on any client device, it must be encrypted and protected. Moreover, given the substantial trend towards more and more laptops, and therefore local state on those systems, continually backing them up as a service and encrypting everything at rest is absolutely a mandatory requirement, so the corporation is not vulnerable to the loss or compromise of a device. There are profound changes, then, in the employee’s interface to the corporation.
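
As a hedged sketch of "encrypting everything at rest", the example below uses the cryptography library's Fernet interface to encrypt a document before it lands on a laptop disk or backup; in practice the key would live in a corporate key-management service so access can be revoked when a device or employee is deprovisioned.

```python
from cryptography.fernet import Fernet

# In practice the key comes from a corporate key-management service rather
# than being generated locally, so access can be revoked centrally.
key = Fernet.generate_key()
cipher = Fernet(key)

document = b"quarterly forecast - internal only"
stored = cipher.encrypt(document)          # what actually lands on disk or in the backup
assert cipher.decrypt(stored) == document  # readable only while the key is still granted
```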

Then, let’s look at the backend. The backend would be the transformation of what today is a highly manual IT process into something which is service based and self-service based. Much is being said about this process called “ITIL” [Ed: The information technology infrastructure library (ITIL) is a set of concepts and practices for information technology services management (ITSM), information technology (IT) development and IT operations] which is an investment in a more streamlined or methodical IT process. I happen to be not a huge proponent of ITIL because I see the whole thing becoming hopefully less necessary as we move to much more of a self-service based IT environment.

Bear in mind that the next-generation employee is quite comfortable with computers. They are quite comfortable with the notion of self-service. They know how to go and get what they need, so having an IT process and support system which assumes that folks who try to do things with computers are as illiterate at it as our generation is will be the wrong thing to do. What next-generation employees will expect are self-service interfaces and the ability to educate themselves and find feedback about their job performance and everything else all along the line. This will be true of IT application owners and everything else.

Where those applications will run is also going to change. There will be increasing acceptance of cloud. Cloud computing is kind of blurry right now and will remain blurry for a substantial period of time, but certain components of the enterprise workload can be run outside the boundaries of the enterprise itself. More and more you will see the growth of service providers of one form or another - whether outsourcing providers, hosting providers or cloud providers - taking on a greater portion of the infrastructure operations for the enterprise and running it as a service.

The enterprise then will become more of a consumer of - let’s not call it consumerized IT functions, but certainly formularized IT functions - which are well known and well understood and which give them a way to easily consume IT resources and be charged for them on the basis of consumption. Bundle all of that up in cloud, and let’s just say that the trend is inexorable toward greater automation and a more dynamic infrastructure.

Ed: It seems that the role of IT organizations is really set to undergo a transformational change - from an integration, provisioning, deployment and technical role to focusing on governance and becoming more of a service provider for business segments.

Simon: Yes, but this is one of the worries I have about the adoption of cloud. I think there is embedded in every human a sort of time constant, or maybe a set of constants. We all grow old with our music, we grow old with our GUIs and we grow old with our practices. I do worry that this is something of a generational effect. There is no way to take your average IT techie today and turn him into a cloud guy. In fact, he hates it. He is suspicious of it. It is trying to get rid of him, because we are trying to make manual things more automated, and automation spells the end of life for a particular manual class of labourer. These people are technologists, but they are manual technologists. They are doing a job that can be highly automated by computers.

Cloud actually stands for a big battle against an established class of workers called the IT admin.

I think that the transformation will therefore not occur as rapidly as people project, because at the very least there will be an IT skill set issue. IT folks will be resistant to the change that cloud ushers in. You will see greater adoption of cloud as next generations of IT workers join the workforce and join the corporation, but it is not going to be the case that your average enterprise storage administrator suddenly becomes a cloud administrator and he is happy with the task.

Page 51: 2020 foresight - Tech Views of the Future - Ed Maguire

Simon Crosby, Citrix US software

September 2010 [email protected] 51

Ed: It is a new skill set to really think of IT as a set of resources to be delivered on a dynamic basis.

Simon: Some CIOs that I talk to tell me that they are aware of this challenge, which is that the existing IT practice is their enemy and so they are moving towards having . . . even as their need for compute grows, they are not acquiring any new resources internally and aren’t growing their labour force but moving people on to contract terms as they renew terms with IT employees. Incremental resources, hardware and so on are all being purchased from service providers. In other words, the existing deployments stay the way they are, and as people move on or whatever you change the cost base by moving people into contract-based roles and slowly migrate towards a more cloud-based infrastructure.

Ed: Do you think that the IT skill set of the future will really just become a core component of business skills rather than a specialty?

Simon: I think that there will be specialty folks engaged in the delivery of service to large corporations, but large corporations will not need very large numbers of fixed IT people. So, you will find organizations that specialize.

Ed: The idea is that a CIO’s role will evolve from being an architect who is involved with procuring, delivering and integrating technologies into more of an agent for aligning resources and providing tools dynamically to serve the business on a much more closely aligned basis. It would seem that new skills need to be developed; teaching the old dogs is certainly one factor of disruption. On the other hand, as you have alluded to, the new generation of workers that are familiar with this new technology and with these ideas may have a new blend of skills.

Simon: I am sure they do. If you look, just by way of example, at those who play around with cloud computing, they are very happy that somewhere out there on the internet is this thing with an API, and if I write a program that talks to that API I can get some way cool things done. That is what cloud kind of stands for. That's what Amazon EC2 is and all of these great things. If you say that to your average IT guy, it is an absolutely meaningless concept. If you tell him to go and bring up a Microsoft Exchange server, he wouldn't have the faintest clue about how to do it. By the way, that is nothing more than a manual labour practice being challenged by a change in technology. The interesting thing is that the rate of change in technology in our case is so much different from the rate of change in the automobile industry. Cars are still bolted together with bolts as far as I am aware, and that hasn't changed for a long while, but Moore's Law is still super healthy. The rate of change here is phenomenal.
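To make the contrast concrete, the sketch below shows what "talking to an API" looks like in practice. The endpoint, token and parameters are hypothetical stand-ins for illustration, not any real provider's interface.

```python
# Illustrative only: a hypothetical REST endpoint standing in for a cloud
# provider's "launch me a server" API (EC2-style, but not a real Amazon call).
import json
import urllib.request

API = "https://cloud.example.com/v1/servers"   # hypothetical endpoint
TOKEN = "my-api-token"                          # hypothetical credential

def launch_server(cpu_count: int, ram_gb: int) -> dict:
    """Ask the (hypothetical) cloud API for a new server and return its description."""
    payload = json.dumps({"cpus": cpu_count, "ram_gb": ram_gb}).encode()
    req = urllib.request.Request(
        API,
        data=payload,
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# A few lines of code replace the manual work of racking, cabling and imaging a box:
# server = launch_server(cpu_count=4, ram_gb=16)
```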

Ed: The potential disruption for hardware is significant, and the implications are wide reaching as well. One of the visions or predictions that we have heard is that we are going to see more convergence between hardware, compute and storage. Then, when you look at the endpoints that have been so tied to an operating system, the desktop is a perfect example: that differentiation stops and identity itself becomes fragmented.

Simon: The concept of a desktop is no longer associated with a device or an operating system. It is the stuff I need to be productive. By the way, that includes my identity and my ability to take information from one source and put it into another and manipulate it and everything else. It is fundamentally about establishing an identity, establishing a security context and the right to access certain applications that I need to be productive. It has nothing to do with whether it is Mac or Windows or anything else. That is a very profound change. The funny thing is that Citrix feels like we have been telling the story for years, because that is kind of what we have always been up to.

From a hardware perspective there are two major trends going on. One of them is greater vertical integration on the part of certain providers. You will see it just as you saw it with Microsoft and its big monster Azure hardware platform: drop off a container in your parking lot. You will see other major vendors doing exactly the same, Oracle and so on. The challenge that introduces is of course . . . once you make this huge appliance with enormous capacity the new unit, the thing itself is largely self-managing, so it introduces a larger-scale unit that has to be managed through its life cycle, and therefore the costs should drop.

The vertical integration has its own challenges, namely that every vertically integrated system will ultimately become a trap. The funny thing is that we have gone from a vertically integrated world, which was the IBM mainframe, where the applications, the operating system and everything down to the terminals all came from one place. It got blown apart by the x86 and the PC into its constituent parts and then aggregated horizontally into these layers of chips and hardware and so on, and now we are going back into a vertically integrated world again. This cycle will probably repeat itself yet again. The challenge of course is that greater vertical integration, whilst it drives towards a more completely packaged service for the function, can ultimately become a trap for the acquirer.

Ed: Right, and of course the concentration of compute resources into cloud-service providers who compete on scale in many respects results in a contraction in the number of customers for the hardware vendors, which will create greater pressures on them.

Simon: The hardware vendors are a funny bunch anyway. They have always been squashed between Intel and Microsoft and they have survived on very thin margins for doing not an awful lot. They have very substantial changes ahead. The next generation of server vendors are people called Google and Amazon. They buy the chips from Intel, they make the server, and they just happen to sell it to you by the hour with all the software on it that you need.

The notion that I am going to go off to someone like Dell and buy myself a server and have to plug it in the wall and plug in all the cables is rather arcane. Why would I? Moreover, anything that I could do to maintain that thing throughout its lifecycle is bound to be far more expensive than getting it into the hands of somebody that does it on the scale of hundreds of thousands or millions. Even from a power and cooling perspective.

The next generation of server vendors are called clouds and they will also be the OEMs of certain kinds of technology. You can go and rent Windows and everything else from up there. They are also vendors of various abstractions, including infrastructure as a service and platform as a service and even software as a service in the case of Google and so on. As a server vendor, they can go off and innovate very substantially around that and depending on their size they will deliver something between a complete application or something more like an infrastructural abstraction which can be used by the end user to deliver their own applications on top of it. But you know the HPs and Dells and so ons of this world are going to find life harder and harder.

Ed: The impact on application vendors as well is fairly significant when you look at this shift in how applications and functionality are delivered and consumed.

Simon: I think that the application vendors will have their heads on and they will make the changes. It would probably be reasonable to say that within eight years or so, you won't find application vendors, or people who want to make money by making software, delivering software that is highly OS-specific or tied to Windows or anything else anymore. The challenge for the enterprise is that the skill set it uses to deliver the applications powering the enterprise that are not purchased from elsewhere is just the same as the average IT skill set. It is going to sit there and write programs in Visual Basic and .NET and whatever else until the cows come home. You will see an extremely long tail of what you can think of as legacy technology building tomorrow's applications.

Ed: The idea of turning some of these prior investments into components that can be service enabled and then orchestrated into a broader process.

Simon: Right, that is kind of the opportunity that Citrix pursues. In that context, we see a very long life ahead for Windows and Windows-based apps. There are millions and millions of apps in the enterprise that are never going to be re-written because enterprises can't afford to do that, or all the people have left or something. One of the things we got wrong when we were naming things in computing was hardware and software, because software is hard and hardware is easy. We swap out the hardware relatively frequently, but the software never gets changed out.

If you want an example of that, there is a very large airline in the US, one of the largest carriers in the world, where all of the software from the time you walk into the airport to the time you take off was written for Windows 3.1, which was about when I was in college. The good thing is that all the viruses have moved on; the bad thing is that it is extraordinarily brittle and fragile and they can't re-write it. They have tried eight times.

Ed: There was a saying that what Intel giveth, Microsoft taketh away; similarly, Wirth's law is the converse of Moore's Law. It suggests that for every increase in the power of computing, there is a consequent decrease in the speed of software.

Simon: I am not too sure about that, because I think we place enormous demands on software nowadays in terms of the sophistication of user interfaces, security and a bunch of other things. I don't think we necessarily serve ourselves well, because the larger these programs get, the larger the security footprint they have, that is, the more vulnerable they are. I think we will see things returning to a balance.

Ed: One point that you highlighted was this idea of an identity independent of a machine when you were talking about the desktop. In the future, if your desktop or personal identity is not bound to a specific machine what is it going to require for the appropriate security and permission to allow a relatively transparent . . .

Simon: The notion of identity expands beyond the enterprise. Once I authenticate myself and I have an identity, that identity has to be able to leave the boundary of my enterprise and allow me to use SaaS-based apps to which my enterprise is a subscriber. For example, if my enterprise is using Salesforce.com, once I have signed on to my enterprise desktop, I should automatically be signed on to Salesforce.com, for a bunch of reasons.


First, if I leave the company, revoking my one corporate ID should revoke access to all applications that I have access to. I shouldn’t be able to quit, run out the door and then log onto Salesforce.com and grab all of my stuff. This notion of identity becomes a big deal and there will be a major trend that emerges in the next few years for there to be a much more rigorous and much more robust notion of identity across the internet. Partially related to issues of security but partially because the increasing use of web-based technologies is just going to demand it. My identity has to be a set of tokens that I present to any application on the planet and they need to be processed for that application to query my identity provider or the broker of my identity to check that I am who I claim I am.

The technology for that, security assertion markup language (SAML) and so on, is fairly well advanced in standards bodies and starting to roll out. It is interesting too that you can start to think about identity being independent of the enterprise. I don't have to have an identity assigned to me by the enterprise on the first day of work. I could walk in the door and say here is my Gmail address and that is my identity, because all that is required is a way to uniquely identify me to something, and then for there to be a trust relationship between the identity managers. Just as we see a trend toward bringing your own computer, there may well be a trend toward bringing your own ID. That is, I walk in with a credential that uniquely identifies me, which could be a Gmail account or an account at eBay, but it is me, and as long as we can verify that, my identity is known by the enterprise and my certificates automatically work.
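The sketch below illustrates the broker-verified identity idea in miniature. It uses a hand-rolled HMAC-signed token purely for illustration; a real deployment would rely on SAML or OpenID Connect rather than this scheme.

```python
# A minimal sketch of broker-verified identity (illustrative only): tokens are
# HMAC-signed JSON claims; real deployments would use SAML or OpenID Connect.
import base64
import hashlib
import hmac
import json
import time
from typing import Optional

class IdentityBroker:
    def __init__(self, secret: bytes):
        self._secret = secret
        self._revoked = set()   # user IDs revoked when someone leaves

    def issue(self, user_id: str, ttl_seconds: int = 3600) -> str:
        claims = json.dumps({"sub": user_id, "exp": time.time() + ttl_seconds}).encode()
        sig = hmac.new(self._secret, claims, hashlib.sha256).digest()
        return base64.b64encode(claims).decode() + "." + base64.b64encode(sig).decode()

    def revoke(self, user_id: str) -> None:
        self._revoked.add(user_id)   # one revocation cuts off every relying application

    def verify(self, token: str) -> Optional[str]:
        claims_b64, sig_b64 = token.split(".")
        claims = base64.b64decode(claims_b64)
        expected = hmac.new(self._secret, claims, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, base64.b64decode(sig_b64)):
            return None
        data = json.loads(claims)
        if data["exp"] < time.time() or data["sub"] in self._revoked:
            return None
        return data["sub"]

broker = IdentityBroker(secret=b"known-only-to-the-broker")
token = broker.issue("jdoe")         # could equally be a Gmail or eBay credential
assert broker.verify(token) == "jdoe"
broker.revoke("jdoe")                # employee quits
assert broker.verify(token) is None  # SaaS access dies with the single corporate ID
```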

Ed: Microsoft has tried a lot of that with Passport and Windows Live.

Simon: And they have, actually. In terms of an architecture for managing identity, the Microsoft boys have got their heads on perfectly. The way they are building this into Azure is superb. The rest of Azure, Azure as a service or the PaaS platform, is a little bit like galactic glue as far as I am concerned, but they have thought through the identity-management piece very well.

Ed: Over the next decade, what do you see as the biggest potential sources of conflict as we make this transition toward a transparent desktop experience with portable identity and a services-oriented environment? You have already alluded to some of the challenges on the organizational side, but are there any other hurdles or seeds of conflict that we may see over the next decade?

Simon: The challenge comes from the fact that the vast majority of applications today, written for Windows, were not designed to be delivered to a user in any context other than the user looking at it from Windows and being the only user logged on to a Windows machine running right in front of them. In general, as you pick apart Windows into its constituent parts, there is the operating system, there are the applications the user uses, and there are the user's registry hive, personalizations and all of their documents and settings and so on. All of these pieces are basically components of the desktop, but that doesn't have to be Windows.

If you take the notion of a registry in general, it is a set of personalizations that I want to be present no matter where I choose to manifest this thing called my desktop in the future. Therefore, to the extent that the Windows registry contains a bunch of tokens and values, I could ostensibly have a bunch of tokens and values for my iPad somewhere, which then express policies about how applications show up and are manipulated on my iPad.

Let’s start with the Windows world. The Windows world is basically being ripped apart into its constituent sections and it was never designed to do this. What is happening is that once the desktop is no longer tied to a device and once we insert virtualization between the device and the operating system, between the operating system and the applications, between the resulting system and the end user, we end up with some enormous opportunities and some challenges.

The opportunities are that we can manage each layer independently of all the others. Instead, therefore, of having say 10,000 users, each of whom has their own vertically integrated OS, app and personality stack, independently out there trying to attract malware, I can say everybody gets the golden image, because I can assemble their run time on the fly as they log in and I build their desktop context by combining that golden image with the apps and all of that stuff. What I get is a reduction in the number of instances of each layer that I have to manage, arguably down to one. I also get the opportunity to dynamically combine OS, apps, user context, user security, credentials and everything else into this thing called a desktop, and to decide where to run it based on the most effective way to deliver to the end user given where they are, who they are and how they are logging in. Each one of the layers is separately managed; it is just one copy of the app and everybody is using the same copy of the app, and I can back up the user's state and so on.
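A minimal sketch of that layered-desktop idea follows; the image, app-layer and profile names are invented for illustration, not taken from any actual product.

```python
# Illustrative sketch: assemble a desktop at logon from independently managed
# layers (one shared "golden" OS image, shared app layers, a per-user profile).
GOLDEN_IMAGE = {"os": "corp-win7-golden", "patch_level": "2010-09"}   # one copy for everyone

APP_LAYERS = {
    "finance":   ["excel", "erp-client"],
    "marketing": ["powerpoint", "crm-client"],
}

USER_PROFILES = {
    "alice": {"group": "finance",   "wallpaper": "blue",  "mapped_drives": ["F:"]},
    "bob":   {"group": "marketing", "wallpaper": "green", "mapped_drives": ["M:"]},
}

def build_desktop(user: str, device: str) -> dict:
    """Combine OS image, app layer and user personality into a runtime desktop."""
    profile = USER_PROFILES[user]
    return {
        "device": device,                      # could be a PC, a thin client or an iPad
        "image": GOLDEN_IMAGE,                 # only one instance to patch and secure
        "apps": APP_LAYERS[profile["group"]],  # one shared copy of each app
        "personality": profile,                # the only truly per-user layer to back up
    }

print(build_desktop("alice", device="ipad-3g"))
```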

All that is great, but the challenge we have is that the applications today were not built to do this. Windows was not built to do this, so as we rip off different layers of Windows and separate them, there is actually a very substantial challenge in making this stuff all work and in delivering it all to the right place at the right time. It is non-trivial to do, but it is going to get done.

The result will be something like Win32 becoming a compatibility API for apps, a user profile or user personality being present everywhere, and the OS itself having vanishingly little influence over the user's selection of what they are trying to do, over the user's preferred way of getting hold of applications, or in terms of creating any net end-user value. I think that is a substantial challenge for the traditional OS vendors. It is a challenge for those of us in the ecosystem too, because essentially as we construct these worlds on the fly out of the constituent parts of Windows, we are stretching the technology Microsoft built to its very limits, and nobody ever thought you would do this to Windows.

It becomes harder and harder to reason about correctness and indeed to guarantee that everything is built just right. You get potential issues with application compatibility or various other issues that can pop up. The broader the set of use cases we address and the more personal we allow the desktop to become, the harder it is to guarantee that everything works with high fidelity.

The trend towards greater use of virtualization on the endpoint is inexorable. Client-side hypervisors, for example, which will be delivered this year, will make a profound difference to the security of client systems, laptops and so on, and will dramatically assist there. But again, it is nothing that Microsoft ever conceived of when it first built the Windows operating system.

Ed: It is a really new way of looking at the whole process of delivering compute and software intelligence to users.

Simon: It is entirely different.

Ed: There is certainly a lot of opportunity to deliver value.

Simon: Oh sure. From the technology side, when you have something like Moore's Law ripping through the industry . . . by the way, it is a doubling curve and it is incredibly powerful. You know that any idea that is out there is about to get turned on its head. A good example would be the security industry. We recently went through the 1-billion to 2-billion transistors-per-die doubling from Moore's Law. In that time, it has become the case that it is no longer possible to detect malware. That is, rapidly self-changing malware can change at a rate such that you can now prove it is impossible to decide whether any piece of code is malware or not. It is now no longer possible to detect malware on your system if it is carefully written. That tells you that the whole business case of McAfee, Symantec and anybody else in the find-the-bad-guy security business is done. They don't have any value anymore because they can no longer find the bad guys. The security world has to dramatically change to be able to say what it can about the system.

You will find a change towards positive notions of security, towards attestation based on known circumstances, and an increasing importance placed on whitelisting. I will want a more and more granular inventory of the system that is being used to run the application, of the application code and of everything else. I won't be able to say that it is invulnerable, but I will be able to say what state it is in. Then I can at least reason about a security posture. For example, how big is the security business of McAfee, Symantec and so on? Those guys are done, and by the way I think Intel buying them was kind of a smart idea. It is a profound change that is sitting there for them.
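A minimal sketch of the whitelisting-plus-attestation idea, with placeholder digests standing in for an enterprise's approved-code inventory:

```python
# Sketch of the "positive security" idea (illustrative, not any vendor's product):
# only code whose hash appears on a whitelist may run, and the exact inventory of
# the system can be attested to rather than scanned for known-bad signatures.
import hashlib
from pathlib import Path

APPROVED_DIGESTS = {
    "replace-with-sha256-of-an-approved-binary",   # placeholder whitelist entries
}

def digest(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def may_execute(path: Path) -> bool:
    """Allow only known-good code instead of hunting for known-bad code."""
    return digest(path) in APPROVED_DIGESTS

def attest(paths: list) -> dict:
    """Report exactly what is present so a relying party can reason about its state."""
    return {str(p): digest(p) for p in paths}
```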

Ed: It is going to be quite an evolution for them. It seems like there is enough of a legacy business to sustain cashflow, but the key is how to fund that innovation, because it is a human-created problem that sits on top of this massive exponential change in compute, storage and the price of DRAM.

Simon: Exactly and I think the storage industry is also going to get hammered. Here is why. First, you would think that it is a great business to be in because there is more and more stuff being created, but that is a commodity business. Sure, every consumer is going to have massive storage problems in a little while, so there is huge opportunity for backup and cloud services.

In the enterprise segment, virtualization completely hammered storage the first time around. It basically changed every . . . instead of going to the storage guy and pleading for weeks to get a LUN, you just turned every machine into a file. Now it is about to get killed again, basically because of Moore's Law on a server. With Moore's Law on a single Intel server, the number of VMs I can pack onto a single server will kill any storage array on the planet from an IO perspective.

Let's be very clear: even with desktop virtualization, we can fit maybe 150 VMs on 2-socket servers and 250 VMs on 4-socket servers easily. That number of VMs just doing standard IO will kill even the highest-end array that you can find. By the way, I want to have 100,000 virtual desktops, and so the IO rates that are happening and the changes there are so dramatic that there is no way that any storage array can keep up with us.


What does that mean? That means that we have to have a profound change in the way we organize storage. Moreover, the average disk can deal with about eight VMs. We have had great increases in the capacity per spindle but not in the data rates of the heads, so we can't get enough data through a head. We can only put about eight VMs per spindle, and we can't live with that. I cannot afford to have 40 spindles for a single server. It doesn't make sense.
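The back-of-envelope arithmetic behind the "eight VMs per spindle" remark might look like the following; the per-VM and per-spindle IOPS figures are illustrative assumptions rather than measurements.

```python
# Rough IO arithmetic for virtual desktops on spinning disks (figures assumed).
import math

IOPS_PER_VM = 20        # assumed steady-state IO per virtual desktop
IOPS_PER_SPINDLE = 160  # assumed for a single enterprise disk head

def spindles_needed(vm_count: int) -> int:
    return math.ceil(vm_count * IOPS_PER_VM / IOPS_PER_SPINDLE)

print(IOPS_PER_SPINDLE // IOPS_PER_VM)   # ~8 VMs per spindle
print(spindles_needed(150))              # ~19 spindles behind one 2-socket server
print(spindles_needed(250))              # ~32 spindles behind one 4-socket server
```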

Flash has dropped down to the point where it is actually affordable as an alternative. Now look at the challenge. If storage becomes flash and part of the server, how does backup work and what is the notion of all of that? Suddenly the concept of storage management, which is all about making sure your precious bits don't get lost somewhere or don't get left somewhere or fail to get backed up, must extend from whatever form of persistent storage you have, across the network and into the server, and start to pertain to the bits resident in the server. Storage becomes a wildly different thing. It moves right up to being . . . within a year, we will have DIMM-addressable flash storage. In other words, you will be talking to flash storage as though it were RAM in a server, directly from the processor. It won't look just like a disk; it will just be part of memory. Right after the cache, we will be going straight to flash. How do we deal with that then? What is the role of that in storage? Computers get radically re-shaped as a result.

Ed: Of course the availability of flash drives is completely changing the capacity for predictive analytics and the architectures of the traditional datawarehouse. They are quite disruptive.

Simon: Yes, it is going to be profoundly disruptive. I can't believe that anybody is still buying EMC Symmetrix, but that is just me. That, by the way, is a labour-practice thing. If you have ever seen an EMC Symmetrix, it is one of these beasts where you actually need a specific device to tell it to do anything. You have to plug a PC into it and a whole bunch of things. There you have a guy who has invested his whole life in being the EMC Symmetrix management guy, and the guy who is selling it to him is telling him why the new EMC Symmetrix is so cool and why it will give him job security. That is the enterprise's problem: you have a bunch of vendors selling to an established work practice and not necessarily selling the best thing for the enterprise.

The funny thing is, I hear all of these crazy things about how you can't trust clouds and clouds are insecure and so on. In the average cloud, the entire business depends on being secure and being highly automated. In general, there is no way anybody could put their hands on the bits that belong to you; you can encrypt all of your stuff in the cloud, so as a result, in general, clouds are far more secure than any enterprise IT shop. If you really want them super-duper secure you can get them that way. This fear that is circulated by some of the analysts and the press and so on is actually a fear expressed on the part of the IT practice, which doesn't want to let go of control and wants to make the case that the cloud is insecure and that it knows better, because it is afraid of what happens if that stuff did move to the cloud.

Ed: That is the organizational resistance to the paradigm change.

Simon: Correct. I had an interesting conversation with Gartner who, as far as I am concerned, in general speak to that category of worker. They put out a thing about security in the cloud, and I took them on about it: 'The only reason that you are echoing this stuff is because you sell to these same people, so you are part of the problem too.' Well, of course that didn't go down particularly well.

Ed: Those businesses may change as well.

Simon: I think they will change because after a while the credibility will be vested in huge brands. I think there will be a practice in assessing the capabilities of clouds and matching them to what enterprises need. I think that is a useful thing to be investing in and some of the smart guys are doing that already.

Ed: Their role will change to reflect the transformation in the way that IT is delivered as well.

Simon: Right, exactly right.

Ed: This idea of accelerating change has been a theme that we keep coming back to, and I have a chart that Ray Kurzweil put together that talks about the accelerating pace at which innovations are adopted on a mass scale. Really, when you go from the telephone to the automobile to radio to TV to the internet, and now when you start to see the mobile internet, and even just how much change we have seen in the few months since the iPad has been available and the impact it has had on the way people think about computing, we have to brace ourselves for continual acceleration of change.

Simon: Well, just look at the rate at which Apple has produced new products. Just look back to the first iPod, which I am sure you can remember; it was five or six years ago. Apple releases two major, profound new innovations in terms of product design per year.

The funny thing is, I think that we are accustomed to linear processes. I gave a talk at Microsoft recently and I showed them how their release schedule mapped onto a nice straight-line implementation of Moore's Law. They are linear thinkers and think from one release to the next. The problem is that very few people can even begin to imagine what a doubling is going to do. Between any two Microsoft releases you have two cycles of Moore's Law, so there is a quadrupling of the amount of compute capacity from the last release. If you think about Windows XP and XP SP3, which is what the average enterprise is on, that is long enough that we have been through many generations of Moore's Law. The challenge is that the processes that we have for creating products, whether they are technology products or other, are very linear, methodical and process oriented. The curve that is ripping up the industry continually is Moore's Law, which really does nothing more than create huge opportunities for innovation that disrupts established bases. The interesting thing about it is, if you manage to get out ahead fast, Moore's Law is so substantially in your favor right now that you can do tremendous things, which is why things like Amazon are so profound.
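The arithmetic behind "two cycles of Moore's Law between releases" is simple compounding; the 24-month doubling period below is the usual rule-of-thumb assumption rather than a figure from the interview.

```python
# Rough arithmetic behind "two cycles of Moore's Law between releases".
DOUBLING_PERIOD_MONTHS = 24   # assumed doubling period

def capacity_multiple(months_elapsed: float) -> float:
    return 2 ** (months_elapsed / DOUBLING_PERIOD_MONTHS)

print(capacity_multiple(48))   # ~4x between releases four years apart
print(capacity_multiple(108))  # ~23x from Windows XP (2001) to this 2010 conversation
```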

Ed: Simon, I think we are very aligned in that view. Some of the work that we have been doing, particularly looking at the innovations in technology that go beyond pure enterprise IT and into the areas of biotech, medicine, nanotechnology and materials science, suggests the innovations in the next decade are really going to surprise a lot of people.

Simon: I think there are consequences also for development processes. I will try out a theory I have that open source becomes more and more relevant because of continual evolution. A well-organized open-source project is one which is continually moving. It is moving at the behest of developers from multiple organizations, private sometimes but often public, often led by research organizations that are leading the curve in terms of innovation. At any point in time, if I grab the relevant open-source code base, I am going to be there, I am going to be with the rate of technology evolution, I am going to be right up to the edge.

Conversely, if I sit with any proprietary code base, what I have to do is employ all the developers to move it forward. That means I have to assume the entire cost of the development to move a very substantial code base at the rate that Moore's Law is moving. In my view, that becomes insufferably expensive. I am the guy who has been going on about free hypervisors for a long time, but if you look at the size of our engineering team at Citrix for XenServer, which is our competitive product to VMware, the core team is 80 guys. They have more than 500 on the job just to try and keep up with what we do. Why? Because the best people at Intel, Oracle, AMD and countless other organizations around the globe are moving this thing forward every day and we don't have to take on the cost.

Ed: I want to agree, actually, and we have some interesting comments from Michael Tiemann at Red Hat about applying the philosophy or paradigm W Edwards Deming applied to Japanese auto manufacturing. He described the idea that the code quality is so much higher and the revisions become componentized, so each step is a firmer foundation for the next step and you don't have to work around flaws in the software. Now that there are over one billion lines of open-source code that are vetted, field tested and available to anybody who wants to start a project right now, it just creates a massive acceleration for any idea.

Simon: Exactly, the rate at which you can put a thing together is unbelievable and more than that you get transformative capabilities. The whole concept of the infrastructure as a service cloud would not exist without open-source software. Amazon just grabbed Xen and did it and Google just grabbed Linux and did it. Those companies would not exist had they not had the power of innovation there just to grab and go. What it allows the companies to do is just leap ahead, take advantage of a new twist in Moore’s Law and build a whole new business.

If you look at companies like Microsoft, they are really under attack on all fronts, which is reasonable because they were the largest anyway. If you look at recent repeated failures, something like the Kin, for all of these innovations the project starts but runs in linear time, and by the time it gets to market it's too late. It starts late anyway because Microsoft is late in innovation; by the way, that is just a consequence of their scale. There are plenty of smart guys there and they do great work, but they operate at such scale that they just can't go very fast.

You start behind the curve and you can’t go fast enough and then things have changed by the time you get to market. I think it is instructive in terms of observing the kinds of companies who can make big changes and can really profoundly affect the industry. It is not places like Microsoft. They will probably do fine for all time including the big Azure lump of iron, but really they aren’t leading innovation. The innovation is being led from small companies that manage to scale fast.

The other good thing you get with open source, though it's not restricted to open source, is the ability to conquer the cost of sale. One of the things we have been extremely successful with is free downloads. You change the whole go-to-market, the cost of sale and the whole complexity of delivering a product to an end user. You get to manage the life cycle better and you get to have a conversation with the user. These are all examples of things you could do in a proprietary world, but when people are focused on charging a large amount of money for traditional enterprise software, they tend to forget about these things that are really much more Apple-like, in the sense that from the moment you open the box you get a great experience.

Ed: You go to a more transparent experience, transparent IT. The value is in the solution.

Simon: Right, then there are vendors that I love because they completely turn the whole model on its head. Spiceworks has about a million IT folks using their software and all of the software is free. The value proposition is that all the software is free, it's well integrated and it's good management software. What they do is manage the careers of the IT guys. They feed them highly personalized and highly crafted value-added information relevant to their jobs, and the software is absolutely free, but then they get to deliver ads to those IT people. Additionally, if you don't want ads delivered to your IT people, it is US$20 or something. How do they make money then? They make money by creating the social network for IT admins, delivering highly qualified routing of information to those guys and building a community they want to be a part of, instead of charging for the enterprise software.

Ed: It is all for a company that is really selling content more than anything else.

Simon: And the CIO doesn’t even know this stuff is being used.


Jill Dyche, Baseline Consulting

Jill Dyché is a partner and co-founder of Baseline Consulting. Her role at Baseline is a combination of best-practice expert, industry gadfly, key client advisor and all-around thought leader. She is responsible for key client strategies and market analysis in the areas of data governance, business intelligence, master-data management and customer-relationship management.

Jill is the author of three books on the business value of IT and her work has been featured in major publications, such as Computerworld, Information Week, CIO Magazine, the Wall Street Journal, the Chicago Tribune and Newsweek.com. Jill's latest book, Customer Data Integration (John Wiley and Sons, 2006), was co-authored with Baseline partner Evan Levy and shows the business breakthroughs achieved with integrated customer data.

Additionally, Jill is a featured speaker at industry conferences, university programs and vendor events. She serves as a judge for several IT best-practice awards. She is a member of the Society of Information Management and Women in Technology, a faculty member of The Datawarehousing Institute (TDWI) and serves as a co-chair for the Master Data Management (MDM) Insight conference. Jill is a columnist for DM Review and a blogger for BeyeNETWORK and Baseline Consulting.

The data virtualization paradigm

Jill Dyche has been active in the business-intelligence, data-warehousing and CRM areas for many years. Our conversation touched on several areas, notably the increasingly distributed nature of data management, the potential impact as social-networking paradigms are adopted at the enterprise and the growing potential for conflict over the privacy of personal data. Jill's extensive client relationships provide her with a practical context to gain insights on the practicality of emerging technologies and a strong understanding of the evolving role of corporate IT.

Key points

Collaborative business intelligence and decision making will become pervasive within the enterprise.

A new generation of tools will leverage growing adoption of collaboration software by enterprises. In the business-intelligence world, reports will be shared and modified by groups and there will be less reliance on IT or a single business unit to create and maintain the reports.

Collaboration breaks down the walls of IT and improves efficiency of the enterprise. Business entities within the enterprise will be empowered by SaaS and new means of collaboration, while the roles of the CIO and CTO will evolve, or perhaps diminish.

“Data virtualization” becomes the accepted paradigm for managing data over networks. Enterprise adoption of the cloud, new database technologies, such as Hadoop, and in-memory analytics are capable of scaling larger data sets, both structured and unstructured.

'This isn't your father's SQL database, it may just be some sort of service-oriented architecture on steroids that retrieves customer data in real time and utilizes information in the meta-layer to understand where the correct, up-to-date information resides.'

Summary of interview on 12 August 2010.

Full transcript follows


Consumers will take ownership of their personal information as established means of monetizing consumers' personal information shift in consumers' favor. New personal-data brokering services will emerge that will enable consumers to monitor advertising media's tracking and sharing of their personal information.

Personal-data services will be new means of monetizing consumer-information brokerage, and consumers will ultimately be able to monetize their personal information.


Jill Dyche transcript

Jill: I see the future of BI and what BI will look like in 2020 broken down into two areas. I think we should consider inside the four walls of the corporation and then outside the four walls of the corporation as the two areas.

If we start inside those four walls, that basically has to do with the infrastructure that companies are setting up to continue to deploy BI nimbly. From an internal perspective, we are seeing more deliberate decisions about what companies want to host internally and what they want to outsource. This is going to require a very realistic assessment of IT and business competencies. Organizations are just now trying to figure out what is core and what is context.

At a lot of our clients right now, we are starting to hear buzz around things like collaborative decision making and the ability for people in different organizations to come together for the purposes of developing policies or consensus on different information needs. If you look, there are little startups cropping up right now to that end. The enterprise app players are going to start adopting this capability pretty widely. There are a couple of small companies letting their customers use what are essentially social-media functions to collaborate on building these new BI reports, defining metadata for those reports and ensuring the re-use of complex analytics.

These are collaboration platforms that have to do with decision management. Somebody in marketing and somebody in finance can both be looking at an electronic report and decide together whether they want it to reflect billed or booked revenue, rather than these crazy heterogeneous reports that land on executives' desktops and everybody says, 'These don't match and we have multiple GLs and nothing is rolling up and oh my gosh.' I think those days are thankfully coming to a close, and this internal collaboration will not only streamline decision making, but will have some downstream implications on things like time to market and customer profitability. That is one of the first things we are going to see a lot more of, as well as a lot more adoption in the enterprise.

The other thing inside the walls of the corporation in a few years is the concept of data virtualization, which is really going to become the standard. A lot of people are talking about software virtualization, but we are going to start to hear a lot of buzz around data virtualization. We simply have to assume that the days of the big behemoth mega data warehouses are over and that there will always be new data introduced by both internal and external sources. As we have all heard, the rate of that data is just going to increase. Data replaces itself every 18 months; we have all heard that statistic.

We are already seeing this happen at companies like retailers and healthcare providers, which are good examples as they transform to an end-consumer business model. Data like social-media interactions becomes important to companies. With this "data is everywhere" mindset that people are embracing, by the time you are able to load all that data into a big database, it could be irrelevant. To the point of data virtualization, if you look at the growth rate of companies that offer this type of functionality, like Composite Software, they are growing at five times the rest of the data-integration market. Coming up with a way to access and use that data from wherever it is will be a key future trend. Again, this is something that our clients are already starting to look at and ask questions about.

Hand in hand with that, still within the four walls of the company, this really applies to all types of data. I mentioned social-media data, but we are also talking about everything from complex imaging data, to video and voice data, to bunches of unstructured information. Companies are starting to realize that they will either have to get serious about gathering and storing this information or they are going to have to go to specialized hosted solutions in order to get it right.

The last thought from the inside-the-company perspective is really an organizational one. I noticed you asked some of your other experts what this means for IT. In a few years the IT organization is going to look very different. It will still deploy and maintain IT infrastructure and perhaps even manage relationships with Software-as-a-Service providers, but fundamentally the big technology deployment center is going to go away in favor of smaller, more line-of-business or even business-process-specific IT groups. Those groups will be more networked as opposed to siloed. Just because there is an IT group within the marketing line of business doesn't mean they won't share work and/or platforms with the finance IT group. To that end, I think the role of the CIO will become more business focused than it is now and the CTO role might actually disappear altogether.

Ed: Jill, you have touched on a theme that has come up a couple of times where you have IT in a sense becoming far more a part of the business rather than specialty or a subset.

Jill: Yes, and that is already occurring at a lot of our clients. The organization, and particularly C-level executives, has less and less of an appetite for the ownership battles going on between these smaller, more focused IT groups and the large infrastructure providers. As these smaller groups write their business cases and prove ROI for IT, that marginalizes the big central IT groups.


Ed: As you go to more of a cloud-services approach you have a lot of the previous technical barriers to having control of resources falling away. That changes the dynamics there.

Jill: Not only that, but one of the things we underestimate about Software-as-a-Service providers is their ability to innovate. There has been a lot of cynicism about the old-guard IT organization's ability to think outside of the box. If you look at these smaller hosted solutions and how creative their solutions, platforms and infrastructures are, and how they are not readily replicable by very risk-averse organizations today, I think we will see a change in the fundamental IT model.

Ed: I want to go back to one of your earlier points about data virtualization and this idea of social decision making and the idea of having to build this infrastructure that federates all of this data that resides and is sourced and created in so many different places. How are we going to manage that when you think about organizations and individuals? The idea of the centralized data warehouse has become so unwieldy, but when you move into the distributed world, how do you avoid the sprawl and the silos and the fragmentation that could happen?

Jill: I think it is going to be a lot less about where the data is and more about bringing the data together as a service. This isn't your father's SQL database; it may just be some sort of service-oriented architecture on steroids that goes and says, 'Give me the customer's data in real time' and, through a metadata layer, just knows where the right data and the right version of that data is. It is less about infrastructure and more about metadata and the definitions of the right time and place for key business data, and bringing that together in real time. The core technologies are already out there to do that kind of stuff; it is just a matter of adoption and tuning.
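A minimal sketch of such a metadata-routed data-virtualization layer follows; the catalog entries and connectors are invented for illustration, not drawn from any specific product.

```python
# Sketch of a "service-oriented architecture on steroids": a metadata registry
# decides, per entity, which source currently holds the right version, and callers
# never need to know where the data physically lives. All names are invented.
SOURCE_CATALOG = {
    # entity -> (system of record, freshness window in minutes) kept in the metadata layer
    "customer":    ("crm_saas",        5),
    "order":       ("erp_warehouse",  60),
    "social_buzz": ("hadoop_cluster", 15),
}

CONNECTORS = {
    "crm_saas":       lambda key: {"id": key, "source": "crm_saas"},
    "erp_warehouse":  lambda key: {"id": key, "source": "erp_warehouse"},
    "hadoop_cluster": lambda key: {"id": key, "source": "hadoop_cluster"},
}

def fetch(entity: str, key: str) -> dict:
    """'Give me the customer's data in real time', routed by metadata rather than location."""
    system, _freshness = SOURCE_CATALOG[entity]
    return CONNECTORS[system](key)

print(fetch("customer", "C-1001"))
```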

Ed: That brings up the fascinating conversation or topic about the accuracy or the ability to tag and organize data effectively and in context.

Jill: That is a great point, and what we are finding is that, while the technology and the solutions may be evolving, the need to understand fundamental data requirements and what data can do for the business still needs to be there. While we may see changes in the IT model, and we may see changes in some of these underlying infrastructure solutions, what we won't see going away is the role of the business analyst and the role of the data steward. They will be more important than ever.

Ed: That is the other thing: as you think outside the four walls of the enterprise, this theme of big data and the advent of the importance of analysis, and even the quants, to apply predictive analytics and observations to solve business problems just opens up a completely new horizon for innovation.

Jill: Absolutely, and as we leave the four walls, I think what we are seeing is a migration from that pull model, where business people need to ask for the data, to more of a push model, where information is provisioned to the consumer in a timely and relevant way. Take the example from the movie Minority Report where the guy walks by the billboard and the advertisements change to target him directly; I don't think that is too far afield. I think we will start to see a lot of the larger enterprise app vendors looking into technologies like complex event processing so they can get the triggers to enable that kind of work.
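A toy sketch of a complex-event-processing style trigger behind that push model; the events, rule and offer are invented for illustration.

```python
# Illustrative CEP-style rule: when a stream of events shows a known customer near a
# location that matches an interest they have expressed, emit a targeted offer.
def offers(events):
    interests = {}                        # customer -> set of expressed interests
    for event in events:
        if event["type"] == "interest":
            interests.setdefault(event["customer"], set()).add(event["topic"])
        elif event["type"] == "location":
            topic = event.get("context")  # e.g. a trailhead implies "hiking"
            if topic in interests.get(event["customer"], set()):
                yield {"customer": event["customer"],
                       "offer": f"coupon:{topic}",
                       "channel": "mobile"}

stream = [
    {"type": "interest", "customer": "c1", "topic": "hiking"},
    {"type": "location", "customer": "c1", "context": "hiking"},
]
print(list(offers(stream)))   # one targeted, real-time offer instead of a barrage
```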

Ed: One conflict that is likely to arise is the boundaries between what data is personal, what data is public, and how that data is being used.

Jill: It is huge. I don't know if you caught Google CEO Eric Schmidt, but he was on CNBC saying things like internet anonymity is dangerous and privacy has its limits. I think that the markets are going to respond to that and I think we need to get ready for the forthcoming backlash on some of that stuff. One of my outside-the-walls predictions is that consumers are going to reclaim their personal information. The end consumer will end up owning his or her own personal information and deciding where to give it away. It is sort of a privacy conversation, obviously, but it is more than that. If you are trying to sell me something and you want my personal information, there needs to be a quid pro quo before I share it with you.

We will not only start to see a lot more web-based privacy-protection products coming on to the market, things like ReputationDefender and SafetyWeb for children, but also new tools like Bynamite, which was profiled in the New York Times. These will actually tell you as a consumer which websites are collecting your information and will report that back to you, so you can tweak your own version of your information and put it back out there. Consumers will start claiming rights to their own information, and even deciding where they will participate and where they won't and what that information should be worth to the companies that use it. We might even see money changing hands by 2020.

Ed: That has profound implications for the internet search and advertising business. Obviously, it is a multibillion-dollar business, and if that value chain is disrupted somehow, that is going to create opportunities. But it could also be a real challenge for some of the businesses and organizations that have built their revenue streams on this.

Jill: You are right, and when you say that I am thinking of the Dun & Bradstreets and the Acxioms of the world, who have that master version of consumer information or business information. Now if I reclaim ownership of that information and I have my own version of who I am and what I want to share, how relevant is the data coming from those third-party providers? That may be an at-risk business model. Having said that, some of those guys are doing some interesting things in the cloud with data integration and matching functions, so I wouldn't count them out just yet.

Ed: It's really a profound cultural difference between the US and some of the European countries and Japan, where there are very strict limits on how people's data can be used. In the US, we have had this "Wild West" where, in experiments, if you ask somebody for their data on the street, people willingly give up information that can be so valuable. Part of it may be cultural.

Jill: I think some of it is cultural, but I think some of it is that those countries wisely pre-empted the debate through their privacy legislation. There are a lot more checks and balances on what companies can actually use, whereas we are still having the debate. Eric Schmidt goes public and says that if you don't want anybody to know what you are doing on the internet, don't do it on the internet. A lot of consumers are past that: I should be able to do what I want on the internet, and the Googles and Yahoos of the world should not know and should not be able to track or share that. We are still having that dialogue on both sides, whereas in Europe and some of the Asian countries they have pre-empted it through proactive privacy legislation. Rather than legislation here in the US, what we are going to see are tools to put that data back in the hands of the consumer, whether corporate America likes it or not.

Ed: That is a great point. I think it could create some significant battles between companies and individuals.

Jill: Absolutely, as well as some strategic decisions that need to be made by some software companies. Are we going to play in this privacy game where we want everybody to be transparent about their information or are we going to get on the bandwagon that we will be part of the solution for protecting people’s personal information?

Ed: Users have gotten so used to getting so many types of services free that are supported by an advertising model. What happens if that business is undermined? For example, AVG, the free anti-virus company, gives away free software as long as users allow ad networks to target them with ads. AVG is careful not to disclose the identity of its customers; it provides access to them through its downloaded user base. It seems that a lot of software companies that are struggling now with subscription models, at least on the consumer side, and trying to find business models that work where they had previously been advertising supported, could really have to rethink their whole philosophy.

Jill: That is an excellent point. There was actually something written about that recently in the McKinsey Quarterly where they talk about the "freemium" business models, where you get the free services as long as the company gets advertisers. There are other companies that are actually banking on what they call exhaust data. They use Visa or MasterCard as an example: because of what their core business is, they are able to summarize information and sell it back to their corporate customers as new behavior and segmentation data. It is very interesting how these business models are going to affect the data that these companies actually have, use and share.

Ed: As you look outside the firewall, what do you see as the potential evolution of intelligence and analytics in the public sphere when you go beyond traditional business analytics?

Jill: We are back to that push model where, just based on who I am and what I am doing, the information will come to me based on my behavior. Say I am a customer of REI and I just happen to be at a trailhead, and all of a sudden there is a coupon dangling from the bulletin board; that is a real-life story. Depending on how I spend my time, I will have very relevant offers in front of me as opposed to going and searching for them, or just having the barrage of irrelevant offers sent to me all the time. That will be a direct extension of business intelligence. It is just a matter of: a) the real-time nature of some of those new offers, and b) the channel, what is going to be the right channel. As companies get smarter they will be able to drive me to the channel they want me to use. The smarter I get about whom I want to do business with, the smarter they will get as well, and hopefully it will be a complementary closed loop.

Ed: Do you think there will be opportunities for users or consumers to create their own profiles? Perhaps in a way that is transparent: we become the smart concierge that will collect the behavior that you want us to collect, and then we end up being the guardians of what you need to keep private but allow you to enhance your broader experience. That ends up being a personalization broker, as it were.

Jill: Exactly, and not only that, I can specify my interests and can retain the interests that I don't want everybody to know about just for very specific vendors. It is almost like a multitiered customer profile. I don't have a problem that everybody knows some of my Facebook information, but not all of my Facebook information. The same thing goes for companies. If I am looking for a car, I might want to add that to my public profile and get relevant offers from car companies for a period of time, but not forever. I think there is going to be a lot of flexibility in how we define and how we maintain our own profiles moving forward.
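A minimal sketch of such a multitiered, consumer-owned profile; the attribute names and grant periods are invented for illustration.

```python
# Illustrative sketch: the consumer decides which attribute each vendor may see,
# and for how long, instead of the vendor holding the master profile.
import time

class PersonalProfile:
    def __init__(self):
        self._attributes = {}   # attribute name -> value
        self._grants = {}       # (vendor, attribute) -> expiry timestamp

    def set(self, name, value):
        self._attributes[name] = value

    def share(self, vendor, name, days):
        """Grant one vendor access to one attribute for a limited period."""
        self._grants[(vendor, name)] = time.time() + days * 86400

    def view_for(self, vendor):
        now = time.time()
        return {name: value for name, value in self._attributes.items()
                if self._grants.get((vendor, name), 0) > now}

me = PersonalProfile()
me.set("in_market_for", "new car")
me.set("salary_band", "do not share")
me.share("car_dealer", "in_market_for", days=30)   # visible to one vendor, then it expires
print(me.view_for("car_dealer"))   # {'in_market_for': 'new car'}
print(me.view_for("ad_network"))   # {} - nothing was ever granted
```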

Ed: Clearly that is going to be part of this broader or ubiquitous and pervasive BI. Do you see that that vision is really valid or do we have other hurdles or obstacles that will have to be overcome to allow for more effective intelligence for corporations and individuals?

Jill: I think there is a readiness factor when it comes to consumers. We are actually seeing pervasive BI more on the corporate side. An example is that we are working with a retailer right now, a top-five retailer in the US. One of the things they are trying to do right now is to move their clienteling software on to mobile devices. If I have a shopper in one of my high-end department stores and she is browsing, as a sales rep I can go up to her with my mobile device and say, 'I see you were looking at teal green jackets, and the last time you were here you bought a scarf with teal in it, so that would be a good match. By the way, here is the outfit you bought last time in the store; are there any accessories you would like to buy for that outfit?' We not only have last-activity information, we have it out on the floor with the customer as she is shopping, as opposed to waiting for her behind the cash register.

We have unstructured data and some imaging data on that hand-held device, as well as the standard customer profile and next-best-purchase sort of thing. They have measured the sales uplift of stores that are beta testing those mobile devices versus stores that are not, and they have an upsell rate of something like 23% in the stores that are using the mobile devices. Not only does that allow for more flexible sales capabilities among the sales associates, but it also makes for much more relevant conversations with the shopper.

Ed: That is pretty compelling. It has been the vision of true multichannel CRM for years and years, but that is just like any predictive algorithm, you need more data.

Jill: It is a different sort of mechanism through which to engage the customer, because sometimes you are only as good as where you are standing in retail. It is a real breakthrough for this retailer.

Ed: What are your thoughts on some of the fragmentation of different types of databases and the rise of these distributed data warehouses like Hadoop? Does that fit into the concept of data virtualization where you have a lot of federated data sources that end up being controlled or accessed by a meta layer?

Jill: Tools like Hadoop are interesting because, on one end, they do provide more flexibility: use and deploy the data where it is, or fragment the data based on where our skills are. On the other end of that spectrum, it still relies on a fairly sophisticated development team to pull it off. At our clients, the development organizations need to be pretty smart in order to deploy that stuff, because it is untraditional and it does demand some environment-specific knowledge. Definitely we are seeing a lot more flexibility in those kinds of environments, particularly as executives come to terms with the fact that big data is here to stay. That doesn't necessarily always equate to super high costs.

Ed: As we look forward, are there any potential risks, dangers, or challenges both to users and society at large and also thinking about the disruption we might see with some of these key topics, the socialization of decision-making, virtualization of data, and taking back the control of our data which we talked about? What could be some risks out there and who could be disrupted?

Jill: I think that inside the four walls, the big risk is that we throw the baby out with the bathwater. Because there are so many hosted options and because we can use data where it lies, there may be a tendency to shed some very key skill sets. I alluded to this earlier, but we are seeing it now with the adoption of agile BI methods: we can do it fast and in bite-size chunks, but it is sort of sloppy and really doesn’t solve the complete problem.

There is this tendency to say, ‘Because we don’t need this big infrastructure, we don’t need expensive skills anymore.’ I think executives, especially in IT, make those decisions at their peril. If anything, line-of-business subject-matter expertise is going to be more important than ever. Rules of engagement between IT and the business are going to be more important than ever. We need to re-emphasize, and in some cases over-emphasize, specialized skill sets when it comes not only to BI but to advanced analytics and data integration. We are finding that the data-specific skills people have are becoming invaluable. Even as companies try to outsource a lot of IT work, that data knowledge is really something to keep in-house and cultivate. That is the big risk inside the company.


In terms of outside the company, I come back to the coming backlash over the use of personal information. Again, as executives at these big media companies go on record that the train has left the station when it comes to the use of your personal data, I think that will really motivate people. These are good days for the Googles, Yahoos and Facebooks of the world, but there will be a backlash, if you will, when it comes to the data they have, the data they share and the data they analyze internally. We will see consumers mobilizing, and in a few years, perhaps even before 2020, there will be new tools on the market to help us manage some of this.


Andrew Feldman, SeaMicro
Prior to co-founding SeaMicro, Andrew Feldman was an entrepreneur in residence at Crosslink Capital and at US Venture Partners. Before that, he was vice president of marketing and product management at Force10 Networks, where he was responsible for marketing, product management and business development. Prior to joining Force10, Andrew was vice president of corporate marketing and corporate development for Riverstone Networks (acquired by Alcatel) from inception through IPO. Prior to Riverstone, Andrew served as senior director for worldwide product marketing and product management at Cabletron Systems. Andrew joined Cabletron through its acquisition of YAGO Systems. At YAGO, Andrew wrote the business plan and then, as senior director of marketing and business development, led marketing and strategic alliances. Andrew holds an MBA and BA from Stanford University.

Physicalization - Scaling infrastructure for the social internet
Our conversation with Andrew focused on the disruptive aspect of an emerging trend called “physicalization.” SeaMicro is a private company, focused on building servers that address the workloads characteristic of internet applications, such as Facebook, Twitter, etc. These workloads typically handle large numbers of concurrent users, but are not computationally intensive. Andrew believes that this type of workload will increasingly dominate in datacenters, and his company has created servers that employ large numbers of low-power processors in order to provide linear scalability and energy efficiency.

The company’s initial successes are promising. We believe that its approach could be the harbinger of a broader re-thinking of how datacenters are constructed over the next decade as social networking and other internet applications become more prevalent.

Key points
SeaMicro is focused on building servers that draw one-fourth the power and occupy one-fourth the space of traditional datacenter servers.

Two-thirds of power use in typical servers is related to overhead (disks, fans, motherboards, etc), so it is nearly impossible to decrease power use by 75% if only reducing power consumption from the CPU.

Virtualization and the perceived need for virtualization is a symptom of purchasing servers with CPUs that are too big.

“Physicalization” represents a re-thinking of the prevailing approach to datacenter workloads.

Compact, low-power servers are designed to allocate a single low-power CPU per workload, which could change the dynamics for the traditional server supply chain as well as virtualization vendors.

Andrew believes the use of low-power CPUs (such as Atom and ARM processors) could ultimately disrupt incumbent approaches that rely on virtualizing workloads to drive high utilization of high-end servers.

Summary of interview on 26 July 2010.

Full transcript follows

Two-thirds of power use in typical servers is related to overhead

“Physicalization” employs a large number of low-power chips to match internet workloads


Andrew Feldman transcript
Ed: Andrew, could you please provide some color on your vision around physicalization and how this could impact server infrastructure for cloud services?

Andrew: Our vision was pretty straightforward. We set out to build a server that draws one-fourth the power and takes one-fourth the space of the best-in-class server on the market.

We saw that the business drivers for the internet (eg, search and view advertising) were creating a specific new type of work for servers. These business models, the ones that require that you give compute away for free in return for the low-probability event that a user clicks on an advertisement, are poorly addressed by today’s computer architecture. This mismatch between the way servers are built today and what the internet’s workload looks like causes the power issue in the datacenter for companies, such as Facebook or Twitter, that are growing at hundreds of thousands or millions of users a year. Demand for this type of workload is the cause of a set of issues in the datacenter, namely power and space.

We asked ourselves, ‘If you were to design a computer optimized for handling this specific internet workload, how would you build it?’ The result is a system that draws a quarter of the power and takes a quarter of the space of the best-in-class server on the market today. We will begin shipping product later this month and we have 50 of the world’s top 70 datacenters in queue for trial.

Ed: Does this require any rethinking of the current emphasis in datacenters around virtualizing servers? Is there an implication for how the software is architected?

Andrew: There are really two questions there: The first is around software and the second around virtualization.

There are no changes to your existing software for it to run on our system. It does not need special drivers, or to be recompiled. Software runs unchanged on our systems.

Second question: Virtualization and the perceived need for virtualization is a symptom of purchasing servers with the wrong-sized CPUs. The virtualization model is: You buy a big, expensive, power hungry CPU, in an expensive server. You then buy expensive software which allows you to carve up this CPU so that it behaves as though it were lots of small CPUs.

Then you ask yourself, ‘Why have you gone through these steps when you could buy a machine made up of lots of small CPUs?’ The largest datacenter owners do not use virtualization. The largest and most sophisticated players just don’t use it. They stack users. They have the size and scope not to need it. The pathology of buying expensive PC server equipment, then buying more expensive software to make it behave as though it were small independent machines, when you can go out and buy those small independent machines for a fraction of the cost, serves to help the chip guys, but it is still costly.

Ed: This benefits a few other people in the chain as well.

Andrew: Yes, the makers of virtualization software do well. You must ask why sophisticated users are buying servers they don’t need to begin with. There are environments where virtualization can provide some advantages, particularly around management, but for the most part you are buying the wrong-size shoe if you have to put in inserts. Our vision is to provide the right-size shoe from the start.

Ed: So what you have done is to employ clusters of the Atom chips which are more energy efficient. Because these are more efficient processors, is there less overhead from leaving nodes idle?

Andrew: I would begin on the other side of the equation. In today’s server, only a third of the power is drawn by the CPU. Two-thirds is drawn by a collection of stuff on the motherboard that you don’t really need, don’t want and barely know is there. Some of that stuff you might have heard of (eg, disk and Ethernet), but a lot of it you have never heard of (eg, BIOS, battery backups and all sorts of stuff that ends up sucking down two-thirds of the power in a server). What that means is that if you have two-thirds of the power going to overhead and one-third going to the CPU, no matter what CPU you use, you can’t reduce power by 75%.
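To make Andrew's arithmetic concrete, a minimal sketch (with invented wattages) of why optimizing the CPU alone cannot get anywhere near a 75% reduction:

    # Illustrative figures only - not SeaMicro data. Assume a 300W server where
    # the CPU draws one-third of total power and overhead draws two-thirds.
    total_power_w = 300.0
    cpu_share, overhead_share = 1.0 / 3.0, 2.0 / 3.0

    # Best case if you only optimize the CPU: drop its draw all the way to zero.
    best_case_cpu_only = total_power_w * overhead_share       # 200W still remains
    saving_cpu_only = 1 - best_case_cpu_only / total_power_w  # ~0.33, only a third saved

    # Reaching a 75% cut means attacking the shared overhead itself,
    # eg, by amortizing fans, power supplies and boards across many CPUs.
    target_power_w = total_power_w * 0.25                     # 75W target
    print(round(saving_cpu_only, 2), target_power_w)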

Ed: Because there is an inherent redundancy if you are using multiple servers, right?

Andrew: Exactly. What if you apply technology that reduces the overhead, or what we call ‘the non-CPU power draw’? When you reduce the non-CPU power draw, it opens up a wide spectrum of CPUs to choose from. That could be an Atom CPU or another flavor, an ARM CPU or a PowerPC CPU - all of those CPUs that were historically foreclosed from the server market.

We chose the Atom processor because for the internet workload it is the most power-efficient processor on the market bar none. We were able to use it and no one else is able to use it because of the technology we invented.

Page 67: 2020 foresight - Tech Views of the Future - Ed Maguire

Andrew Feldman, SeaMicro US software

September 2010 [email protected] 67

If you just put an Atom in a regular server architecture, it is less efficient than a Xeon. If we put an Atom in our architecture, the result is a 75% reduction in power draw. That is what our technology does. It amplifies the benefits of low-power CPUs in a way that others simply cannot.

Ed: This would seem to have pretty meaningful implications if you extrapolate this across the entire network of datacenters, especially in terms of the types of workloads you would be able to support from existing infrastructure.

Andrew: Correct. Today, we are focused on a particular type of workload that dominates the internet datacenter. However, the same SeaMicro architecture with a different CPU could transform the supercomputer world. The same architecture with yet a different CPU could transform the application tier of the enterprise. The architecture we have laid out is transformative across multiple distinct markets. We have chosen to go after the largest and most sophisticated buyer first.

Ed: The evolution of computing paradigms like MapReduce gives you this potential to tap into different types of workloads like analytic work loads as well.

Andrew: That is exactly right. We focused on highly partitioned workloads because they underpin the internet business model. And since then, it seems like the whole world has started coming towards us. MapReduce and Hadoop are software methods of taking non-parallel, large, chunky workloads and breaking them into highly parallelized, small workloads that are optimal for systems like ours.
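As a minimal sketch of the pattern Andrew is describing - a big job split into many small, independent pieces - the canonical word-count example is shown below in plain Python; the map/reduce functions stand in for a framework such as Hadoop rather than reproducing its API.

    from collections import defaultdict
    from itertools import chain

    def map_phase(document):
        # Each document is processed independently, so this step spreads
        # naturally across many small, low-power CPUs.
        return [(word, 1) for word in document.split()]

    def reduce_phase(pairs):
        # Group the intermediate pairs by key and sum the counts.
        counts = defaultdict(int)
        for word, n in pairs:
            counts[word] += n
        return dict(counts)

    docs = ["the quick brown fox", "the lazy dog", "the fox"]
    print(reduce_phase(chain.from_iterable(map_phase(d) for d in docs)))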

On the business front, companies that are growing at an exponential rate, whether it is Twitter, Facebook or any number of Silicon Valley companies, are creating workloads that are optimal for us. Admob and these companies are by definition able to grow exponentially because they have scaled out, not scaled up. In other words, when they add a thousand new users they can buy a handful of new servers and keep scaling at the same ratio. That is the very essence of what our system is built for.

Ed: What are the potential ramifications assuming that this becomes accepted as a more dominant approach toward datacenter computing? What would the implications be for the parties with the vested interest, the Dells, HPs and Intels? Ultimately, would they have to try to adapt, or do they become far less relevant and perhaps focused on different types of workloads and markets?

Andrew: It has tremendous ramifications. Every single technology in the computer world has been eaten from below. That is the definition of our industry. It is through architecture like ours that we could see a transformation in the server world. The server market has been dominated by basically the Intel architecture, with a small share going to AMD. Intel has walked away with 90% market share and 65% gross margins. ARM could potentially disrupt this through technology like ours. That is a huge change. Today, Intel’s Atom processor is best in class for this workload and has tremendous advantages over the other players, but they need to deliver parts, next year and the year after, that will allow them to maintain their position. Whether or not they can do it is up to them.

The real issue on the table is that no one knows how to make CPUs both faster and cheaper at the same rate as we have seen over the last 20 years. They don’t know how to take the next step to dramatically increase performance. What they have done instead is a form of sleight of hand - instead of getting faster, they add more of the same-performing “cores” on a die. That is where multicore came from.

The result of putting more cores on the same die has brought with it a whole set of unfortunate consequences. Without getting too technical, big chips are disproportionately more expensive, they consume more power, they require higher-speed memory interfaces and associated components, and they can fail more frequently.

There are some curious ramifications. As these chips get bigger, with all these cores on the same die fighting to get off the chip to reach memory, you must run the chips faster and you have to run the memory faster. When you run things faster, you lose more power - not in proportion to how much faster, but in proportion to the square of how much faster. Moreover, when you run things faster, you fail more often; the mean time between failures drops. You become less efficient and less reliable as you do all of these things to use lots of cores on the same die.
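Taking Andrew's rule of thumb at face value (power rising with the square of clock speed), a toy comparison illustrates why several slower cores can beat one faster core on power for the same aggregate throughput; the ratios below are illustrative only.

    # Toy numbers only, applying the square-of-clock-speed rule of thumb.
    def relative_power(clock_ratio):
        return clock_ratio ** 2

    one_core_at_2x = relative_power(2.0)        # one core clocked 2x: ~4x the power
    two_cores_at_1x = 2 * relative_power(1.0)   # two slower cores, same throughput: ~2x
    print(one_core_at_2x, two_cores_at_1x)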

For some particular workloads, particularly complex, dependent workloads, the benefits of multicore outweigh the costs and it is the best approach. Unfortunately, the type of work best suited for multicore processors is not growing very quickly, and the work not suited for multicore - the internet workload - is the fastest-growing workload in the world. And multicore processors are not designed for it.


Ed: The traditional workloads that would depend on a high-performance chip would be something like graphics or design applications.

Andrew: Big database applications, CAD/CAM design, thermal modeling, fluid-dynamics modeling, some supercomputer work, etc. Those are not industries that have been experiencing the kind of growth you see in internet applications. In fact, the workloads in these industries are the opposite of the internet workloads, which are growing at exponential rates.

Ed: Your point is well taken. These are highly distributed, a lot of individuals doing relatively small tasks.

Andrew: It has to be that way on the internet. It is not an accident. If I am going to buy a server and give you chunks of compute and memory for free with a 1-in-1,000 chance that you click on an ad from which I make a dollar, the chunks I give you cannot be very large - I cannot give you a lot of compute. I have to give you a tiny little bit, like a thousandth of the machine’s capability in a minute. That way I can have 1,000 different users on my machine at the same time, with users coming on and off every few minutes. Your revenue potential is the number of servers, times the number of users on a server, times the probability that a user clicks on an ad, so you divide the day into three-minute blocks. Your cost is servers. So clearly there is tremendous pressure to drive up the number of users on a server - or, said differently, to make the “chunk of compute” that you give each user as small as possible. And that small chunk is what our technology is optimized to handle.

In each three-minute block you have a thousand users. Each user has a 1-in-1,000 chance of making you a dollar. That is the internet. That is Google, that is Yahoo, that is every one of the click-through models. You can’t go online and find a fluid-dynamics model for free, because it takes 30 machines 10 hours and no one can make money on a dollar click-through. That is the very underpinning of this world of cool free services online, whether it’s Mzinga or Facebook or anybody else that is growing online faster than we can imagine businesses growing: they have to generate small workloads so they can get many people on a single machine, so that the revenue stream can become larger than the cost of the infrastructure.
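A hedged back-of-the-envelope version of this economics, with every figure either taken from Andrew's example or invented to match it, shows how revenue per server scales with the number of small user sessions it can host:

    # All figures hypothetical, chosen only to mirror the example above.
    users_per_block = 1000        # concurrent users on a server in each block
    block_minutes = 3             # each user occupies a ~3-minute slice
    click_probability = 1 / 1000  # chance a session produces a paid click
    revenue_per_click = 1.00      # dollars

    blocks_per_day = 24 * 60 / block_minutes           # 480 blocks per day
    sessions_per_day = users_per_block * blocks_per_day
    revenue_per_server_per_day = sessions_per_day * click_probability * revenue_per_click
    print(revenue_per_server_per_day)                  # ~480 dollars/day in this toy case
    # The smaller each user's chunk of compute, the more sessions fit on a server
    # and the sooner daily revenue overtakes the amortized cost of that server.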

Ed: That paradigm will start to pervade enterprise computing. We are already starting to see that with Salesforce.com, which has essentially built a massive shared architecture running a single instance of an application. If you think about parcelling out workloads using something like MapReduce, or building other applications on this type of architecture with the cloud-service providers providing the infrastructure, we are talking about a very different conception of, or approach to, applications.

Andrew: I think we are. There is no question about it.

Ed: Do you see or have you been in conversations with the cloud-service providers, the people providing the compute and storage on demand? This approach should offer similar economic benefits in their models.

Andrew: I think for them there are some real pros to virtualization. For example, you can sell a quarter of a CPU five times over and, as long as you are lucky, nobody notices. That is the same model airlines use with their reservation systems. How can you overbook a flight? Because they don’t set the reservation systems to stop at 100%, they set them to stop at 108%. In the same way, a cloud provider can sell five one-quarter-CPU “slices” for every CPU. You oversubscribe that CPU: all five of those people are paying, but the cloud provider gambles that they won’t all be using their maximum at the same time. As a result, cloud providers won’t guarantee performance or security, because today they can’t. They can’t guarantee performance because they don’t know what the other customer sharing that CPU with you will be doing. They can’t guarantee you access to your memory because they don’t know whether, at the exact minute you need it, other customers on the same CPU will be blocking your access. What is more, they cannot even guarantee you security - that is to say, they cannot guarantee that the other person you share the CPU with isn’t malicious.
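A small sketch of the oversubscription gamble Andrew describes - selling more CPU "slices" than physically exist and betting that tenants rarely peak together. The peak-load probability is an assumption chosen purely for illustration.

    from math import comb

    slices_sold = 5      # five one-quarter-CPU slices sold
    slots_available = 4  # but only four quarter-CPU slots physically exist
    p_peak = 0.5         # assumed chance a tenant is at full load at any instant

    # Probability that demand exceeds capacity (all five tenants peak at once).
    p_contention = sum(
        comb(slices_sold, k) * p_peak**k * (1 - p_peak) ** (slices_sold - k)
        for k in range(slots_available + 1, slices_sold + 1)
    )
    print(p_contention)  # ~0.03 with these toy numbers - the provider's gamble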

Enter physicalization. Don’t share a CPU; have your own CPU. That says, alright, here is an opportunity for those who want better than best effort to have their own CPU. What is more, the SeaMicro architecture is designed to guarantee performance and security. You always have access to your CPU and get the same secure performance from it. For some customers the “best effort” approach will be sufficient, but we believe that for broad-scale adoption of the cloud you are going to see adoption of a model in which you can actually give someone their own CPU. That is physicalization.

Ed: This seems to be a more economic and energy-efficient approach than essentially dedicated servers - the traditional outsourcing model.

Andrew: The problem with dedicated servers is they don’t share any overhead.

Ed: You are back to that same problem, that fundamental problem.

Andrew: Here is an analogy our CTO likes to use. A big CPU is like a 100-room home with six bathrooms. It works just fine until more than six people need to go to the bathroom at once. That is what happens when you and your four friends are renting the same CPU and all want to go to memory at the same time.

What we build is apartment buildings filled with one bedroom and studio apartments. What you want to share is the roof. What you want to share are hallways and what you want to share are the maintenance costs. What you don’t want to share are bedrooms and you certainly don’t want to share bathrooms.

Continuing the metaphor, you want to always be sure you have a place to sleep, a place to be private, and a way to get out. That is physicalization. At the system level, what you want to share is power supplies, fans and other “infrastructure.” You don’t want 30 systems, each with its own power supply and fans, ensuring that each one is particularly inefficient. You want to have a set of power supplies and a set of fans to cool the system. As a result, large apartment buildings have far more efficient air conditioning than single-family homes. You have far more efficient heaters. That is the right thing to share: share the infrastructure. Don’t share the bedrooms and bathrooms. Don’t share the processing and memory. That is at the very essence of physicalization.

Ed: Why hasn’t anyone thought of this before or tried to do this before?

Andrew: They have, but it was targeted at the supercomputer world. Our approach is similar to the one the fastest supercomputers in the world use - IBM’s Blue Gene architecture. Instead of optimizing the architecture to build the fastest computers, we optimized for the most important workload - the internet workload. Why haven’t other major server providers done this? Because it requires custom chips, and they don’t do custom chips in their server businesses.

What we did, coming out of the networking world, is we were unafraid to do custom silicon. We had done far more complicated ASICs in previous companies in the networking industry. And we were unafraid to spend tens of millions of dollars doing engineering. That allowed us to think differently about how to solve this problem. We brought a networking style approach to a market that had failed to innovate for many years.

Ed: That in turn should be another enabling factor in driving further innovation.

Andrew: It sure will.

Ed: Like all disruptive innovations, it is incredibly simple but has some pretty profound ramifications.

Andrew: It does.

Ed: It sounds like you have the IP and the operations in building mode, and right now it is more about execution and sales - building it out and proving it on a large scale.

Andrew: Yes, I think as one of my board members likes to say, we have a shot at becoming one of the fastest-growing systems companies in the history of Silicon Valley.

Ed: Well, that’s a good place to start out. It sounds like we are at the cusp of something and I will be very intrigued to see what comes out of this and the potential disruption for some of the incumbents who will obviously have to respond. The potential for the innovation that occurs around what you guys are doing and on top of it in the stack is intriguing.


Promod Haque, Norwest Venture Partners
Promod Haque has 20 years of experience in the venture-capital industry and currently serves as managing partner at Norwest Venture Partners. He joined the firm in 1990. He has been ranked as a top investor on the annual Forbes Midas List for the past nine years, and in 2004, Forbes named him as the No.1 venture capitalist based on performance over the last decade. In 2006, Promod was presented with a Global Leadership award from NASSCOM.

Promod focuses on investments in semiconductor and components, systems, software and services. He was an early investor and a board member of Cerent (acquired by Cisco), Cast Iron Systems (acquired by IBM), Siara Systems (acquired by Redback Networks), OnDisplay (acquired by Vignette), Winphoria Networks (acquired by Motorola) and Extreme Networks (Nasdaq: EXTR).

Convergence, commoditization and IT in India and China
Promod Haque brings a wealth of experience to his role as a VC at Norwest Venture Partners. His successes with companies at every stage, from startups to turnarounds of public companies, mark an unusual versatility and talent for identifying value-creation opportunities. In our conversation, Promod addresses several secular themes, such as the adoption of infrastructure as a service, commoditization of hardware, convergence of networking, compute and storage and the distinct role that the US plays for innovation in the software industry.

Key points
Commoditization of hardware (networking, compute, storage) is forcing value to accrete to the top of the stack: software.

Standardization, Moore’s Law and competition continue to drive down the cost of compute, storage and networking.

Differentiation in this space has become more difficult, so the lower levels (networking layers 1-3) have converged.

The commoditization of hardware results in more rapid innovation at the software layer, particularly closer to the “pain points” of business IT.

We will see the emergence of new types of automation: Specialized infrastructure applications will emerge to monitor the performance of applications and performance of the underlying infrastructure. These applications will, on a real-time basis, allocate resources and manage performance to finely tuned enterprise policies.

This has resulted, and will continue to result, in convergence into higher-value solutions. This will change IT customer expectations - ‘You manage my IT and I’ll pay by the month.’ Software will increasingly compete with hardware, and integrators will compete with service providers.

There is no single software-product company in India or China that sells internationally, largely because they are too far away from where the early adopters are. Ten years from now, there will be slightly different dynamics with wage escalation.

Innovation will remain closer to the pain points of IT, typically in the markets where IT spending is concentrated, still the US and Western Europe.

However, as labor costs in developing economies are arbitraged away, companies in those regions will increasingly need to adopt technology to improve employee efficiency, otherwise they will become disadvantaged competitively.

Summary of interview on 3 August 2010.

Full transcript follows

Commoditization of hardware accelerates innovation at the software layer

Proximity to early adopters is critical for product companies to succeed globally


Promod Haque transcript
Promod: Let me preface my remarks by discussing the areas and the geographies we focus on. We are making investments here in the US. We are making some investments in Israel, which is primarily a source of deals because Israel to a very large extent is not a very large market, as you know. Then, we make investments in India and some investments, though not as much, also in China. If you look at the areas we invest in, they end up being somewhat diverse because of the somewhat diverse geographies. These countries are all in different stages of development and have different economies. I will just get that out of the way first.

Investments in India and so on are non-traditional investments for us, in the sense that we don’t do a lot of those kinds of deals here in the US. What we continue to do in the US, and have been doing in the US, is focus on the information-technology space in a broad sense. We are also adding some healthcare investing, which we have done in the past, though not in the recent past.

In the high-tech IT space, we work across the food chain and across the value stack all the way from components and semiconductors to companies that actually build systems. These are systems for enterprises. These are systems for service providers or wireless carriers, datacenters and enterprises. We do a lot of stuff in the software space which is stuff that sort of sits on top of these systems. These are platforms, application platforms and basic infrastructure platforms for infrastructure management, all the way to business applications, whether they are delivered in-house or through subscription and SaaS models or in the cloud.

Then, on top of that, we also do analytics on the inner stuff. That was part of what we were doing at SPSS. Our belief is that lots of enterprises, especially large enterprises and some of the high end of the mid-tier market, all have ERP and transaction systems, but the trend is ‘Can they do predictive analytics? Can they learn from what they are doing to be better in the future?’ There is a lot of interest in analytics and you are seeing that expanded. Over a period of time even that will become something that a lot of people deploy.

We also do a fair amount in the consumer internet space - consumer internet technologies, infrastructure and applications. That is the general picture, and then there are some technology-enabled businesses. These would be solutions that are verticalized for various markets - not a lot of deep technology, but utilizing existing technology. These companies are offering solutions that are applicable to various vertical-market constituents. That is our focus. One of the things we are noticing in the last couple of years, and more so now, especially in the mid-tier market and not so much in the high-end market, is a definite move on the application side to SaaS-oriented applications.

We are starting to see in that same market space a lot of interest in not only SaaS, but also the basic infrastructure being offered as a managed service. Not so much of the cloud yet. You are starting to see a combination of that and people are saying, ‘You know what? I won’t even have all of this stuff in-house in the future, I’ll just go to a cloud model.’ Then, most companies have a legacy in place, so they have to manage that. It is sunk cost. It is money already spent on hardware.

What they are doing is starting to use managed-service providers to manage all that stuff for them and do it as a hybrid. So I have some stuff in-house that I will have someone manage and then some of the new stuff I am going to put in the cloud so that it is also managed so that it alleviates some of the things I have to do to manage the ever-increasing complexity on the infrastructure side. Those are the trends we are seeing.

We are also seeing hardware get commoditized rapidly. We are seeing that not just in the server and compute space, which obviously has been going on for some time, but also with networking vendors - with the bankruptcy of Nortel, and with Alcatel-Lucent, which is under tremendous pressure from the Huaweis and ZTEs of China. There is general commoditization of that space as well, at least at the lower end.

Over a period of time that will happen at the high end too, but the lower end is getting commoditized. There is a lot of emphasis on the part of companies like Alcatel-Lucent, Extreme Networks or Brocade, and equipment providers including Cisco and Juniper, to offer total solutions, not just an element of the solution, which creates some pretty interesting trends. You see a lot of consolidation in that space. You see Oracle now in the hardware business.

One of the more interesting things we saw recently was the IBM announcement that the entire hardware group is going to be reporting to Steve Mills, who previously ran software only. Steve Mills has taken over the hardware group as well as the software group he has always run. HP acquired 3Com. You are starting to see some very interesting trends where hardware - not just compute, but compute, storage and networking - is coming together.

You are starting to see more and more buyers going to a SaaS model, a cloud model or a managed-services model. You are starting to see compute, storage and networking come together. You will actually see a lot of this as infrastructure as a service, or managed services for the legacy stuff. There will also be continued focus on software innovation.

I recently met with an equipment manufacturer, broadly speaking in the networking space. One of their executives said to me that with all the commoditization that is happening, with what Huawei and ZTE are doing, ‘our belief is that going forward we are going to have to differentiate on the basis of software and solutions. We will have to innovate rapidly because that is the only way we are going to stay ahead of the Huaweis and ZTEs.’ These are pretty interesting trends in the marketplace.

As to your question about what the world will look like in 10 years, my view is that you will see a lot of this compute, storage and networking come together. You will see a lot of it getting commoditized. You will see software as a major tool that differentiates. You will see these hardware and software groups come together because the delivery model is changing.

When it becomes a cloud model or an infrastructure as a service or an infrastructure, platform and application as a service, you are not going to have all of these various groups running around, each trying to sell their stuff. The customer is going to want to talk to one sales guy and he wants to buy a total solution, including all the hardware, software, networking, and he wants to pay a monthly fee and doesn’t have time to talk to four different sales people. There is a major change coming in the way that IT is going to be constituted and delivered in the marketplace.

Ed: It is interesting the way you have characterized it because potentially this will set up some new competitive dynamics where historically you have had some specialists that would compete for their place in the stack. There were always a handful of pieces that you would be able to pick best of breed from the networking, storage, infrastructure, software and the applications.

By moving into more of a vertically integrated solution, I guess it raises two questions. One is whether scale is going to become a true competitive advantage; but as we all know, the larger an organization gets, the more barriers there are to the speed of innovation you get from the smaller, more nimble companies that end up being specialists. I would be curious as to your thoughts on where value creation will migrate: as you have pointed out, as hardware gets commoditized you will have more of these resources delivered as a service. It seems like the value-creation opportunity is really moving up to the software layer. The question is how high up the stack will that need to innovate go? Do you have to get into very vertically integrated solutions to really compete over the long term?

Promod: I think the action will be in solutions. That doesn’t mean the underlying infrastructure doesn’t matter, but what you will see is the action being on total solutions. You will see some pretty interesting technology getting developed. When you really think about it - you mentioned a term earlier which struck a chord here: best of breed.

You have the best of breed and large companies can afford to do that . . . large enterprises rather, where they can go buy the best-of-breed technologies and stitch them together. You go to a system integrator or whatever the internal staff is and stitch the best of breeds together and hope that the interaction between these best of breeds is such that you can get truly what you are looking for in terms of competitive performance. It is very difficult to do that manually. It is very expensive. Only the very, very large companies can afford that kind of stuff.

My belief is that going forward a lot of that stuff will get automated. You will see specialized infrastructure applications emerge which monitor the performance of applications. They also monitor the performance of the underlying infrastructure. They actually, on a real-time basis, allocate resources and do all kinds of interesting stuff such that the performance management of that application on a policy basis is fine tuned to what you require as an enterprise.

I think you are going to see a lot of this stuff get automated . . . and the underlying infrastructure will just be commoditized, because the value will accrete to the top layers - and not even to the application itself. Given that I have the underlying infrastructure and whatever apps I am running, the value will really accrete to the top, where I have a very specialized application that will fine-tune my application, without human intervention, on a real-time basis. That’s where the action will go, because then you can take commodity hardware and commodity software and yet fine-tune everything based on what is almost predictive analytics - predictive analytics on a real-time basis, on the underlying hardware and software, to maximize what you want, which is performance.
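A minimal sketch, under invented metric names and thresholds, of the kind of policy-driven control loop Promod is describing - software that watches application performance and reallocates commodity capacity without human intervention:

    import time

    # Hypothetical policy: keep 95th-percentile latency under 200ms by scaling
    # a pool of commodity workers up or down.
    POLICY = {"p95_latency_ms": 200, "min_workers": 2, "max_workers": 64}

    def measure_p95_latency_ms():
        # Stand-in for real telemetry from the application and infrastructure.
        return 180.0

    def control_loop(workers=4, cycles=3):
        for _ in range(cycles):                  # bounded here; a real loop runs continuously
            latency = measure_p95_latency_ms()
            if latency > POLICY["p95_latency_ms"] and workers < POLICY["max_workers"]:
                workers += 1                     # allocate more commodity capacity
            elif latency < 0.5 * POLICY["p95_latency_ms"] and workers > POLICY["min_workers"]:
                workers -= 1                     # release capacity to cut cost
            time.sleep(1)                        # re-evaluate on a rolling basis
        return workers

    print(control_loop())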

Ed: It is almost self-diagnosing and self-correcting performance management.

Promod: Yes, which to some extent happens inside a given system. But now we are talking about that art moving up, so not only are you doing that within a subsystem; what you are really able to do is take that whole concept and optimize across five different systems that are totally independent.

Ed: It really seems that this - the elevation of logic continually up a layer to be able to manage policy, performance and to match resources very dynamically to applications - ultimately is going to free up a lot more of the customers’ and application developers’ time toward innovation.

One of the thoughts I have been developing is that with the rise of applications, platform as a service and infrastructure as a service, the barriers to entry are very low for startups, with the availability of close to a billion lines of open-source code that anybody can build on to start a service. It seems that we are on the cusp of an opportunity for some tremendous innovation on the application front.

Particularly when you extrapolate this to some of the markets you have been working in - India, Israel and China. What I am interested to see is higher-level intellectual-property output and innovation coming from these economies that have essentially been very competitive on costs for commodity businesses. We are starting to see brands like HTC emerge from being pure manufacturing and outsourcing companies to actually having branding and differentiation, and some of the Taiwanese manufacturers have done the same with the concept of a netbook. How long will it take before we see some truly world-branded, world-known application and software companies coming out of some of the newer markets?

Promod: I think the challenges of the newer markets are what you are talking about.

Ed: China and India, where businesses have been heavily focused on manufacturing, services and development, but not as much on applications.

Promod: Yes. I think the challenge is that that will happen, but I don’t think it happens in the next 10 years. The reason for that is very simple. Their total aggregate spending on IT is far too low compared to the US and Europe. Labor is still cheap in countries like India and China and so the incentive to do a lot of this development we just talked about is not there.

In contrast, in the US and Europe the problem is acute: you have tons and tons of people and operations using technology. They need better value propositions. They need to lower their cost of maintaining and managing all of this stuff. The pain points are very real, and therefore I think that innovation is going to be addressed by local companies.

It is very difficult for a company in Bangalore or Shanghai because it is too far away from where the early adopters are. The early adopters are still in the US and Europe, especially the US. It is very difficult for some of these companies that are not based here to have the insights to develop the next generation of products. That is why you haven’t really seen any product companies coming out of China and India, with the exception of Huawei and ZTE.

Not a single product or software company that you can think of in China sells internationally. There is not a single product company in India on the software side that sells its product internationally. They have all been services and manufacturing companies in China. They are too far away from where the early adopters are.

I think 10 years from now there will be slightly different dynamics. We have spent a lot of time thinking and talking about this. Wage escalation continues in these countries - I don’t know about China, but I can tell you about India, since I sit on a couple of boards of IT companies there; not pure IT, but companies that use IT.

Annual raises are 10-15%. The challenge is that you can’t continue like that forever. You do that three years in a row and pretty soon your operating costs just continue to increase very rapidly. The only way you as a company can survive in that environment is either to pass that cost on to your customers, which is very hard, or to increase the productivity of your existing workforce. As salaries go up, you want productivity to go up at a faster rate.
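The compounding Promod alludes to is easy to quantify; a quick illustration assuming a 12% annual raise, in the 10-15% range he cites:

    base_cost = 100.0     # index a unit of labor cost at 100
    annual_raise = 0.12   # assumed raise within the range mentioned above
    for year in (1, 2, 3):
        base_cost *= 1 + annual_raise
        print(year, round(base_cost, 1))   # year 3 -> ~140.5, roughly 40% higher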

The only thing we know of to increase the productivity of human beings is technology. I think you will see very rapid adoption of technology by enterprises in these countries. This is good news for the IBMs, the HPs and the Oracles of this world, and they are seeing this anyway. A lot of their revenue comes from outside the US, and I think that is going to increase in the next 10 years. More and more companies will start to deploy technology to maintain productivity gains.

Then, as that happens, some of these countries start to become early adopters of this technology. Then the entrepreneurs don’t have to fly 10,000 miles to meet a customer who is going to use their technology; it is right there in their cities. Right now, I can tell you it is very difficult for a product company in Bangalore or Shanghai to understand what the market needs when it is developing technology. Services are a different story. Products for the local market are a different story. For international products, they are too far away from the market.

Ed: Another interesting point, following up on your comments on the convergence of software, hardware and services, is the role that content may play in software business models. When we have looked at how Apple created value around iTunes, the availability of content and having a content delivery system have been really integral to the solution as a whole. When we look at what Microsoft is doing with Xbox Live and Salesforce.com buying Jigsaw to have some proprietary content, one of the questions I have had is whether this is meaningful enough to constitute a longer-term trend where software companies get into the business of offering proprietary data and analysis services. Katie had mentioned that you guys are investors in 1010 Data. What Microsoft is doing with Project Dallas is pretty intriguing and would love to hear your thoughts on that aspect of longer-term convergence.

Promod: Obviously you have to worry about a whole host of privacy-related issues. But leaving that aside for a second, yes, I think there is a use for industry data, not necessarily individual data. There is a way to use that data to do predictive analytics. I think you will see more of that, but again there are a whole bunch of issues around privacy that we have to be very careful about, because we can get into a lot of trouble.

Ed: There was an article in the Wall Street Journal about the privacy issues on the internet and it was pretty intriguing. When we get into the area of location-based services and context-based services, this seems to raise a new level of demands and concerns around where data is stored.

Promod: And how it is being used.

Ed: It is clearly one of the seeds of some fairly significant conflicts over the next several years.

Promod: Yes, it certainly is an issue. Very definitely.

Ed: Are there any macro or longer-term concerns that you have, beyond the disruptive impact of commoditization, competition and maturing industries - which ultimately do benefit the end users of technology by offering them greater functionality and greater value for the money they spend? Are there any other factors that concern you longer term that could create some risks or conflict?

Promod: None really, no.

Ed: I’m struggling to find something that would knock this trend off course. It is pretty inexorable.

Promod: Yes.

Ed: I guess one of the challenges for realizing some of the vision here is having ubiquitous bandwidth and enough network capacity for all of these always-on connections to fulfill the vision of some of these services. Those all seem like relatively practical problems that can be overcome.

Promod: I think so, things that can be solved.


Timo Hannay, Nature Publishing Group
Timo Hannay is head of web publishing at Nature Publishing Group. Before that, he ran Nature's local-language Asian websites from Tokyo and established its New Technology team in London. He has previously worked for McKinsey & Company and The Economist.

Timo holds a doctorate in neurophysiology from the University of Oxford and a degree in biochemistry from Imperial College London.

Accelerating innovation in the sciences
Our conversation with Timo Hannay of Nature Publishing focused on technological innovations within the sciences, as well as the evolution of the scientific-publishing business into a technology-enabled research-enhancement platform. Timo is a scientist at the core, and his experience as a consultant and in the publishing industry informs a holistic view of science and the process of discovery. We discussed how content and technology are becoming intertwined and necessary for successful companies of the future. Timo provided his views on cultural and technological barriers to scientific collaboration and how they will change over the next 10 years. The rise of cloud computing empowers researchers with more IT resources than ever before, but there is still a need for collaboration and information-analysis tools to provide an end-to-end platform to accelerate innovation.

Key points
Content companies, such as Nature, are morphing into technology and information companies.

‘Information without technology is not of much use, but equally, we live in an age that even information on its own without associated functionality is of limited use as well.’

Nature Publishing has experimented with this new concept by merging content with cite-able, validated databases for subscribers, which facilitates dynamic content and breaks down the historical barriers between journals and databases.

Scientific research is proprietary by design, but the process is slowly evolving to become more open and collaborative.

The open-source software-development model was itself inspired by the classic academic approach to problem solving, which is open and collaborative, which itself came out of the birth of modern science in the 17th century.

What is missing at the moment is the software layer between the emerging cloud-services infrastructure and the tools and services scientists themselves need.

New information and workflow management systems are needed for scientific discovery, and there are opportunities for open source or crowdsourcing scientific-discovery software for labs and lab automation.

Content and technology are morphing into services

The proprietary nature of scientific discovery presents technological barriers

Software innovation in the sciences improves as the community becomes more collaborative

Summary of interview on 31 August 2010.

Full transcript follows


Timo Hannay transcript
Timo: To give some context as to where I am coming from, I am a scientist by training. I was a neurophysiologist originally, but I have also worked in business; I was at McKinsey for a couple of years, so I have worked in business and in consulting, but my heart is really in science and scientific research. I started out as a bench researcher and have ended up working in the scientific-information industry. I feel much more like a scientist who happens to work on science communication and scientific information than a publisher who works on scientific content. That covers my perspective.

At Nature, which is part of Macmillan Publishers, we are very interested in scientific information broadly. Nature clearly is a content company. It has made its reputation, and most of its business comes from its activities in publishing, but we have also tried to pioneer a bunch of other activities that take forward that dream of disseminating scientific information in different sorts of ways now that we are in the online sphere. Nature’s founding principles, which were actually published in the second issue of Nature back in 1869, are a twofold mission, basically.

To summarize briefly: it is to help scientists communicate with one another, and to help scientists communicate with society at large. It is a twofold mission, and I think there are ways we can use the online environment to do that much more effectively than we have been able to with print publications. As for the kinds of things we have tried to pursue: for example, compared to most scientific publishers we embraced the use of online audio and video material early on.

That is a good way to engage people outside of the professional scientific sphere, so that helps meet part of our mission to communicate science to society at large. Video and audio can be much more accessible than the written material we publish. It also helps with professional communication as well. For example, if you are trying to communicate a scientific protocol, often it is much better to have video than it is to have the written protocol. We have tried to pursue that kind of angle as well.

A couple of other ideas we have pursued: One of them has been database publishing, which is trying to break down the barriers between the two big domains of scientific information online. One is journals and the other is databases. You can see how they are rather different from one another. Journals have rather slow publication processes. It can take months or even years to get stuff out in a journal and once the information is there, it is static. Largely, this is by design because you want to make that a kind of unambiguous, citable entity but also that was a necessary constraint in the print world.

Databases are rather different. They tend not to place such a heavy emphasis on peer review or curation. Some of them do these days but certainly originally they didn’t. They were much more frequently updated but they didn’t pay a lot of attention to things like archive-ability or cite-ability and yet they are much more flexible. You can query them in multiple different ways and get results back in multiple different ways. What we have tried to see is: Can you actually have the best of both worlds? Can you create a publication that is highly structured and query-able and updated on a continuous or frequent basis and yet is peer reviewed and cite-able and archive-able with all the essential features of a peer-reviewed scientific publication?

We have conducted a number of projects and experiments using collaboration with various research groups. The first in that area was something called The Molecule Pages, which is in essence a review journal on the proteins involved in intracellular signaling. If you look at it it looks and feels and behaves a lot like a database except that the database is populated by expert authors selected by our editors and the contents are subject to peer review and are cite-able. It has all the attributes of scientific publications. We are trying to break down the barriers between databases and journals.

The third area we have been active in is how to provide tools for scientists. Can we provide the means for them to communicate or the means for them to manage their own information? Not just provide a publication or a forum or a means to disseminate content.

For example, how can you encourage social interaction among scientists? We have a number of projects in that area but the most obvious would be Nature Network. It is often called the “Facebook for scientists.” I don’t particularly like that description but that gives you some idea about what it is about. It is enabling scientists to communicate directly with one another, individually and in groups and we are providing a forum for enabling that, rather than a publication through which it happens.

We have also had experiments with information tools for individual personal needs. One that we did many years ago is in essence Delicious for academics. It was inspired by Delicious and took the lessons from what Delicious was doing back in the very early days of that website and added features that we thought would be particularly interesting for academics. For example, if you bookmark a research paper it will automatically extract the associated metadata like the author’s name and the publication date and the journal name so you can retrieve the information on that basis going forward not just using tags.
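A tiny sketch of the bookmarking idea Timo describes - storing automatically extracted metadata alongside user tags so a paper can later be retrieved by author, journal or date rather than by tag alone. The field names and the extraction stub are hypothetical, not the actual service's design.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class PaperBookmark:
        url: str
        tags: List[str] = field(default_factory=list)
        authors: List[str] = field(default_factory=list)  # extracted automatically
        journal: str = ""
        published: str = ""                                # ISO date string

    def extract_metadata(url: str) -> dict:
        # Stand-in for resolving a DOI or scraping the landing page; values invented.
        return {"authors": ["A. Researcher"], "journal": "Nature", "published": "2010-09-13"}

    meta = extract_metadata("https://example.org/paper")
    bm = PaperBookmark(url="https://example.org/paper", tags=["neuroscience"], **meta)
    print(bm.journal, bm.authors)   # retrievable by journal or author, not just by tag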


As you can tell, what we have tried to do is say, ‘Well look, we are not just journal publishers but are in the scientific-information business and now that we have the web environment as a different kind of information-communication medium from the ones we were used to when Nature was originally established, what more value can we add and what other tools and services can we provide in order to facilitate the flow of scientific information?’

The tools to enable researchers to do a good job in managing their own information and share it effectively with one another don’t really exist. We have a long way to go in terms of having the right tools. There are some good tools out there but we aren’t close to servicing all the different information needs of researchers in an adequate way. I think consumers, for example, are much better served by the software that is out there at the moment, and researchers by comparison have a relatively poor set of tools at their disposal, generally speaking.

Ed: Timo, what you are doing ties into a couple of the themes that we have discussed in some of our other conversations. One is that there has been an evolution among publishers and content providers to adapt to an online or internet world and find new ways to leverage proprietary content through new types of software tools, in some sense converging with traditional software offerings. A converse theme, cited by Ray Wang of Altimeter, is that software companies are increasingly going to become information brokers, where the value of proprietary information for solving a particular problem becomes an inherent competency of what we now think of as traditional platform software vendors.

Timo: That is a really interesting observation. I haven’t come across it from the other side that Ray articulated, and I think it speaks to the fact that the real value comes when you combine technology with content or information. Information technology without the information is not much use, but equally we live in an age where even information on its own, without associated functionality, is of limited use. From the publishing perspective, as soon as you start providing content online you immediately have an imperative to provide certain types of functionality around that. It doesn’t take too many steps until you get to the point where the functionality is as important as, or more important than, the content.

In some of the projects that we have been working on, admittedly somewhat experimental in the scheme of things, functionality is really all there is. The content is almost wholly provided by the users themselves and we are not in that case a purveyor of content. I think what this means from the point of view of a publishing company is that we move very close to becoming a technology company. Certainly if you are in the information business you have to master information technology. That seems obvious to me but it is something that has taken many publishing companies time not just to appreciate but to fully embrace: the mindset of needing to become a technology company.

That doesn’t mean that you need to become a Microsoft or a Google, but it does mean that you need to have certain competencies in common with those types of organizations and you need to be very comfortable with information technology. I think you need to be comfortable enough with information technology that you can sell technology products as well as content products as it were. Whether you choose to do that is up to the individual businesses and their tactical priorities.

Ray’s observation, added to the perspective that I have developed working in publishing, speaks to the value coming from the combination of information and information technology, because one without the other is of limited value. It is very interesting to hear that software and technology companies are moving in the direction of wanting valuable information, just as publishers want to master information technology.

Ed: That is part of our broader theme here of what I will call convergence that we are seeing in the industry between hardware, software, services and content where users need a solution and don’t necessarily care about the components. You had alluded to this idea of almost creating a social network or an intelligent network for collaboration. This ties into another theme that we have been focused on, which is the rise of the importance of intelligent systems. This is much more tied into the vision of social-networking technologies or online, real-time collaboration capabilities being embedded in the business process.

In our conversations, Salesforce.com’s Parker Harris talked about their new product Chatter, which incorporates the Facebook or Twitter paradigm for an organization. They also ended up buying a database called Jigsaw, which was essentially a proprietary information source for salespeople. Extending that paradigm back to what your team at Nature and others are working on, it is really about harnessing all of these technologies together to enable your constituents to solve problems or spark innovation. Ultimately, the most exciting thing over the next several years is that this should hopefully accelerate innovation in the sciences.

Timo: That is what those of us at Nature are really interested in doing. Many of us, including myself, are former scientists and consider ourselves to be scientists still, and that is our ultimate goal: to help accelerate the pace of discovery and the pace at which those discoveries are exploited. There is a lot of interesting work going on, but there are a lot of threads that need pulling together. There is major macro change going on in the information-technology industries as a whole. We see big trends like cloud computing, the rise of mobile and tablet devices, and smaller trends like HTML 5 that have more to do with the implementation of things.

At the same time you have other forces at work. Within science itself you have a large number of innovative projects and products that are whipped up by individuals in their labs who are scratching their own itches, as the saying goes. Those kinds of products generally aren’t seeing the light of day for most scientists. There is not really a good path for the small-scale innovation that goes on in a particular lab or with a particular researcher to move into a polished or commercial product. There have been some exceptions to that but, by and large, those things tend to sit within that one lab or get used by that one researcher or a small group of their colleagues.

The other thing that is going on in science, and this is actually a source of immense frustration for people like me who would like the way that scientists work to change more rapidly, is that there has always been a very established culture about how things are done, how things are monitored and how you get credit. Communication is traditionally done through peer-reviewed journals. Now, I work for a company that has a substantial business in publishing peer-reviewed journals, so far be it from me to talk down their importance. I think they are, and will continue to be for the foreseeable future, a cornerstone of scientific communication. That being said, there are other ways of communicating and other ways of contributing to the scientific process.

For example, you might have a nice piece of software that other scientists could make use of. How do you offer that to the scientific community in the way that you might offer a set of results and ideas that come out of an experiment and could be published in a scientific paper? There isn’t a good way to do that, even though technically we could provide one. Software developers have good ways of sharing code with one another, good ways of providing updates to that code and good ways of sharing the fruits of their labor. That kind of process and that kind of behavior hasn’t really embedded itself into science.

The currency of science, the unit of contribution to science, is the peer-reviewed scientific publication. It is not the data set, not the piece of software and not any number of other things that you might be able to provide to the discovery process. There is actually a bit of a mismatch there, because what is happening in science, particularly in biology, is that things are moving away from what I would categorize as a cottage industry, where you have a small group in one lab doing everything from soup to nuts. They design the experiment, conduct it, collect and analyze the data, draw out the insights, write the paper, publish it and repeat the whole process.

Where we are moving towards in many areas, particularly in biology, is much more industrial scale where you have large multi-lab groups and you have a high degree of specialization. There is a group or person who is collecting the data, another group running the analysis, another group drawing out the insights and these people are not formally part of the same group except at some macro kind of level. How do you enable that kind of specialization and the kind of sharing and trading of information that that implies?

In many ways the challenges in science are not the technical problems but the social ones. How do you organize yourselves to carry out research in that way and how do you recognize the different contributions? That is really underdeveloped in science, although there are some people, notably some of the funding bodies, who are thinking about that kind of stuff and are trying to address it.

Ed: That raises some interesting parallels with the IT industry. Certainly with the rise of different types of collaboration tools and a number of open-source approaches to research in AI or in medical research, the issue of credit becomes a far more diluted concept. We have seen an acceleration of innovation in software employing the open-source model; by one estimate there are over 1 billion lines of open-source code now available to anybody who wants to start a project. Whether we see a similar parallel in specific scientific projects, I would guess, really depends on the specific problem that scientists are looking to research.

Timo: For me the irony is, and maybe I am just taking a science-centric view, but the open-source software-development model was itself inspired by the classic academic approach to problem solving, which is open and collaborative, which itself came out of the birth of modern science in the 17th century.

Instead of being alchemists who hide our results from one another and try to get one up on one another by being the one to discover how to turn base metal into gold, we will start opening up and sharing our results. That’s the point when alchemists became chemists and modern science was born, the point at which the practitioners started sharing their results with one another, and we made much better progress that way. That was also where the modern scientific journal was born, as one of the forums in which you share your results and, in return, get the recognition and admiration of your peers; that is how you developed your career. That ethos is what drove, either explicitly or implicitly, the early practitioners of open-source software development, who wanted that open, sharing and collaborative ethos of academia to be brought to play in software development.

The irony is that these days science, certainly many areas of science, has much more to learn from the open-source software people about how to be open and collaborative than the other way around. Many areas of science are very proprietary, and I don’t just mean the directly commercial areas of science. There are exceptions. Physicists tend to be very open. Biologists, particularly cellular and molecular biologists, tend to be much more proprietary. There are some economic reasons for that, because biology data and information tends to have more commercial potential than physics, generally speaking, but there are also cultural biases that as far as I can see are just historical accidents.

If you say to many scientists, ‘Hey, why don’t you put the data online in case anyone finds that information useful, or why don’t you put early versions of your manuscripts online so people can give you feedback before you submit them to a journal?’, they would think that you are crazy - people have said that to me. They think, ‘Why would I ever do that? I am just giving information to my competitors and I won’t get any credit for it. I want to be the first to publish on this topic or I want to publish in a top journal and let everyone else publish subsequently in lower-ranking journals.’

It is a competitive and proprietary mindset that I hope we can break out of through a combination of different approaches. Part of it is giving scientists much better tools to share more effectively. Part of it is baking into those tools and approaches the ability to track, and therefore by extension reward, that kind of good behavior. And part of it is just buying into what I, as a Brit, would term the US West Coast open-source-software ethos, which is that we can all do well by collaborating, and that we move forward much more quickly and effectively through a judicious mixture of collaboration and competition.

The competition must not be all about hoarding your results and hiding ideas; it must be first and foremost about sharing with one another. This is particularly true if you are a scientist funded by the taxpayers. This is not a view held universally by scientists, partly because it is such a competitive realm and partly because they don’t get rewarded for everything that could contribute to the scientific process. They get rewarded first and foremost for their publications, so that is what they concentrate on.

Ed: You get a chance to review a lot of research and I am sure in your career you have seen the evolution of the pace of research, but as you look out into the future with the access to cloud-computing resources, mobility and some of these other enabling technologies, what might be some of the areas where you see the most compelling leverage of this increasing access to computing power and resources?

Timo: The two trends that you mentioned, which were cloud computing and mobile, have a big part to play in the way that science is done going forward. There is a third one that I will add in a minute and I think it is a bit less obvious to those outside of science.

If you look at cloud computing, it has enormous potential for science because scientists in many areas need to do huge amounts of number crunching. If you are running a climate simulation or a simulation of subatomic particles, you typically need huge computing power. That has typically been dealt with by having supercomputers and supercomputing centers, which are necessarily exclusive and perhaps not always the best way to provide and manage computing resources.

Cloud computing both democratizes the availability of computing and potentially allows it to be used in a much more efficient way. Consider, for example, the idea that you can buy compute cycles from Amazon overnight, at times when other people don’t need them and they are cheaper, and run your simulations then. Those types of approaches have an enormous amount to offer science.
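
As a concrete illustration of that spot-market idea, the sketch below uses today's boto3 SDK to bid for spare EC2 capacity for a batch of overnight simulation workers. It is purely illustrative: the AMI ID, bid price and instance count are placeholders, and this is not a tool that Amazon or any publisher offered at the time of the interview.

```python
# Hedged sketch: bidding for spare EC2 capacity ("spot" instances) to run a
# batch of simulations overnight when prices are low.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.request_spot_instances(
    SpotPrice="0.05",          # maximum hourly bid in USD (placeholder)
    InstanceCount=20,          # number of simulation worker nodes (placeholder)
    Type="one-time",
    LaunchSpecification={
        "ImageId": "ami-0123456789abcdef0",  # hypothetical image with the simulation code
        "InstanceType": "c5.large",
    },
)

for req in response["SpotInstanceRequests"]:
    print(req["SpotInstanceRequestId"], req["State"])
```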

What is missing at the moment, for science specifically, is the software layer between this amazing emerging infrastructure from the likes of Microsoft, Amazon and Google and the tools and services that scientists themselves need. Most scientists aren’t going to go onto EC2 or any other part of the Amazon offering, fire up their own instance and start running things. They will need a software layer for being able to do what they need to do; they are not computer scientists. I am confident those things will emerge over the coming years. It is going to take longer than I hope, but it will be with us in the next 10 years: We will have a nice layer for researchers to use. I would also include in the cloud-computing category, rightly or wrongly, the idea of being able to use or aggregate human effort.

There is the crowdsourcing mentality and idea that is wonderfully implemented by Amazon in their Mechanical Turk, but also implemented in a number of science projects. Most famously, this was used in Galaxy Zoo, where members of the general public can come in and classify pictures of galaxies taken by the Hubble Space Telescope, giving the project far more classification capacity than the research group could have achieved on its own.

There are things like Foldit, which is a quasi-computer game that actually solves real protein-folding problems for molecular biologists. Another example is what Clay Shirky writes about now in terms of the cognitive surplus of people out there - harnessing not just the computing capacity of machines but of human beings as well. That is an incredibly important part of it.

On the mobile side, and I would include in that smartphones and tablet devices like the iPad and others, I think the interesting thing from a scientific perspective is whether that will result in an end-to-end digital workflow for the scientific lab. At the moment we have these analog breaks: People keep their lab notebooks on paper, for example. I would like to think that over the next 10 years the norm will become an end-to-end digital workflow within any scientific lab, with nothing kept just on paper anymore. I think that makes a qualitative difference, because it changes the way you do things when everything is digital.

You are going to have a wider scale adoption of lab information-management systems whereby people’s information is automatically backed up and stored in a way that makes sharing or retrieving information easier. It will be a way that is tidier than trying to find someone’s notebook or a printout of a graph in the physical world. It makes things much more feasible and enables things to be much more efficiently handled and managed within a research lab if you do have an end-to-end digital workflow. The reality is that today we don’t, and I think that will come to fruition now that we have tablet devices.

Ed: I would also make the observation that in the next 10 years a new generation of scientists with a greater familiarity with, and openness to, working in different ways will certainly be much more open to that digital workflow.

Timo: That is a common hope and one that I share, although there is some evidence suggesting that young Facebook-using or Wikipedia-reading graduate students come into the lab and, within a very short period of time, are acting like their professors in being more circumspect about using new technology and being open. This is because as soon as they finish their PhDs they are thrown into a very competitive world of postdocs living pretty much hand-to-mouth while looking for positions and grants. It is often not in their interest to be very open. By contrast, you get some professors who are very open because they have tenure, don’t need to win the Nobel prize and don’t need to prove themselves anymore. You often find that the most proprietary, and at some level the most resistant to new approaches and technologies, are actually in that middle ranking because they are in the most competitive environment. There is a generational effect, don’t get me wrong, but I think it is layered on top of other effects, which means that the incentives don’t always work in favor of openness. I am generally an optimist, so I believe we will overcome those challenges and the incentives will be better aligned with the interests of science as a whole, but I realize these things take many, many years.

The other thing I was going to raise, which these days is less spoken about but I think is just as important, particularly in the scientific context, is what you might call the semantic web. I don’t really mean the Semantic Web with a capital S and a capital W, in terms of the specific technologies pushed by relatively hardcore people who say that all information on the web needs to be structured - RDF and those kinds of technologies, very rigid ontologies and those sorts of things. I wrote about this in The Fourth Paradigm article, which I think you mentioned you had had a look at.

What is important here is that there is now widespread understanding, at least by people who work on the web or work on building products or services for it, that semantic links are just as important as navigational links, being able to link together two pieces of information or being able to employ a common format or a common vocabulary for things in order that multiple sources of similar types of information can be aggregated much more cleanly than otherwise possible.

Those types of things have seeped into everyday understanding of what good practice is on the web. The hardcore vision of the semantic web, where every webpage has RDF and everything is tied to an ontology, is something I don’t think we are seeing and probably will never see. I think we are seeing a light version of that, which is light in its implementation and its approaches but not in its ultimate results.

With things like tags and HTML micro-formats, those kinds of lightweight approaches that are easy to adopt, I think we are seeing more and more information being linked together semantically, not just navigationally. As I wrote in that piece, over a period of time, certainly over the next 10 years, this will have a profound impact. We are also seeing some people and organizations who have traditionally not been so interested in the semantic web side of things getting more and more interested.
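
A small sketch of what that lightweight approach can look like in practice: many article pages already expose machine-readable metadata in plain meta tags (the citation_* and Dublin Core conventions used by scholarly indexes), which a harvester can aggregate without any formal ontology. The tag names and sample HTML below are illustrative conventions, not a guaranteed standard.

```python
# Harvest lightweight semantic metadata embedded in ordinary <meta> tags so
# that similar information from different publishers can be aggregated.
from bs4 import BeautifulSoup

def extract_citation_meta(html: str) -> dict:
    soup = BeautifulSoup(html, "html.parser")
    record = {}
    for tag in soup.find_all("meta"):
        name = tag.get("name", "")
        if name.startswith("citation_") or name.startswith("DC."):
            record.setdefault(name, []).append(tag.get("content", ""))
    return record

sample = """
<html><head>
  <meta name="citation_title" content="An example article">
  <meta name="citation_author" content="A. Researcher">
  <meta name="citation_journal_title" content="Example Journal">
</head><body></body></html>
"""
print(extract_citation_meta(sample))
```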

For example, the purchase by Google of Metaweb is a really interesting development, possibly partly stimulated by Wolfram Research’s development of Wolfram Alpha. I think you spoke to Stephen Wolfram previously as well. There is interest in being able to do intelligent querying and the reality is at some level for some queries you need information organized intelligently in order to be able to do that.

At least in my mind, Google has been somewhat dismissive of the hardcore semantic web approach in years gone by, not without some justification. I think you see in moves like the acquisition of Metaweb a shift in the direction of saying, well, actually, we realize that having structured information is really helpful; you can’t just derive everything from the mass of links on the web, you sometimes need to put more effort in upfront. I don’t want to particularly pick on Google, but I think that is one example of the broader awareness that semantic linking of various kinds is useful.

We are moving towards a world where having information that isn’t semantically linked to other information will, I hope, be as unthinkable as having a webpage that isn’t linked to or from other web pages. It will just be a natural way of thinking of things. It has taken a while for us to evolve in that direction because it is obviously less tangible: An ordinary hyperlink is a very tangible thing, while a semantic link is something people have a harder time appreciating.

Ed: It is much more challenging to implement and to get people to agree on taxonomies, and applying computational techniques to natural language is always a bit of a challenge.

Timo: The reality is that you are never going to get everyone to agree. This is why the whole process is incredibly messy. You are not going to get things to line up beautifully or get everyone to use the same taxonomy, ontology or data standard. It is going to be messy, just like the web is messy. You can’t get people on the web to agree on exactly how to link. There is a certain baseline or foundation of how HTML works, but even with HTML, different browsers behave in different ways. Hopefully, we will have a more consistent set of tags and rules when HTML 5 matures, but we are not there yet.

The way that people set up websites, link websites, structure websites and so forth is massively different. The web is an incredibly messy place but it is still an incredibly valuable place. I think we will see similar trends with semantic linking and semantic information as well. There is not going to be one true way of doing it; much as that would be nice in a perfect world, it is not going to happen. We have to put up with the messiness, with information loss and with noisiness in the signal, but that doesn’t mean we can’t pull out useful information.

I think it is going to be a combination of best-practice approaches, increasing awareness of these kinds of issues and building up the network effects that create incentives for participating in that semantic linking and tagging. At the same time, and you were alluding to this earlier, it means developing better algorithms to extract useful information from natural language. Natural language isn’t going to go away any time soon.

Human beings are going to continue to use language to communicate with each other that computers find very, very difficult to parse. I think there is going to be a combination of structuring data better and having better algorithms for extracting information from that structured or semi-structured information, or just from free text and the other forms of language human beings use to communicate with each other. Again, we are interested in that area: How do you use a computer to extract useful scientific information from information that wasn’t designed to be used by a computer but to be consumed by a human being?

These days we have much more scientific information out there, even in the peer-reviewed scientific journals. If you go beyond that to the repositories and the gray literature and all the other kinds of information out there, there is much more than human beings can scan unaided. How do you get computers to help with that? We are going to need better algorithms as well as better marking up and structuring of the information. Those things are both happening and will continue to happen, and I think they will result in qualitatively different ways of engaging with scientific information in 10 years’ time than we have at the moment. In other words, we will make much more use of software to help identify, summarize and aggregate information - not just search engines but more specialized tools.
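
A toy sketch of the kind of machine-assisted triage described here: scan free-text abstracts for a small controlled vocabulary of terms and aggregate the counts, so a researcher can decide what to read first. The vocabulary and abstracts are invented for illustration; real systems would use far richer entity recognition.

```python
# Count mentions of a controlled vocabulary across free-text abstracts and
# aggregate the results for triage.
import re
from collections import Counter

VOCABULARY = {"p53", "apoptosis", "kinase", "inhibitor"}  # illustrative terms

def term_counts(abstracts):
    counts = Counter()
    for text in abstracts:
        tokens = re.findall(r"[a-z0-9]+", text.lower())
        counts.update(t for t in tokens if t in VOCABULARY)
    return counts

abstracts = [
    "Loss of p53 function promotes resistance to apoptosis in tumour cells.",
    "We describe a kinase inhibitor that restores p53-dependent apoptosis.",
]
print(term_counts(abstracts).most_common())
```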

Ed: It is the vision of some of the AI researchers that we will essentially have software programs capable of conducting their own research and recursively improving and optimizing their own methods over time. Tying that back to this idea of cognitive surplus or intelligence optimization, it opens up really compelling opportunities in many areas, but particularly in science: If we start to see more acceleration in biotech, energy research, physics and basic materials research, that can create some pretty compelling innovations.

Timo: I think that is right, and people have been doing work, and continue to do work, on taking this as far as literally having robot scientists that design and conduct their own experiments. That whole area is still very much in its infancy. For my own part, I hope they never go as far as taking away too much of the work of drawing out the insights and making the discoveries, because that is the fun part of science.

I think at some level it will always require human beings, not least because that is the whole point of science: that human beings end up understanding more than they did at the beginning. If you left all of that up to the computer, you would kind of miss the point. I think there is a lot of opportunity for taking the drudgery out of science. First, let’s get machines to do the things that graduate students are usually forced to do, some of the mundane aspects of doing science. Secondly, machines can do some of the things that are just not possible now, such as scanning very large volumes of information, whether it is text or other types of data, to draw out insights or potentially interesting leads.

Again, there is some very good work going on in a number of different areas, and I am optimistic that even though it is a very difficult problem to crack, we are making progress day by day. Over a time period of a decade or so, those kinds of incremental progressions on a number of fronts will really make a material difference to the way that scientists work in the future. It will be as natural to use specialized tools for drawing insights out of scientific information as it is for us to use word processors, spreadsheets and web search engines today.

Ed: Coming back to your point about human intervention, I go back to a quote I found from Albert Einstein: ‘The only real valuable thing is intuition.’ I think that really sums it up. However powerful the tools get, that is not a capacity that can be automated or even isolated at this point.

Timo: I think it is what drives scientists to do the work that they do. The aim should be to remove the drudgery and leave the scientists to draw out the insights. That is the important bit, the human bit. It is also the fun bit.

Parker Harris, Salesforce.com

Parker Harris co-founded Salesforce.com with Marc Benioff, Dave Moellenhoff and Frank Dominguez in the spring of 1999. As executive vice president, Parker oversees the development of all software for Salesforce.com.

Prior to founding Salesforce.com, Harris developed web-application and salesforce-automation expertise at Left Coast Software, a private consulting company he co-founded, as well as at Metropolis software, an early pioneer in field-salesforce automation, subsequently acquired by Clarify.

Parker graduated from Middlebury College with a bachelor's degree in English literature.

Social networking, an accelerating paradigm shift Social networking is a phenomenon that has taken the last decade by storm. Facebook, Twitter, MySpace, LinkedIn and a host of other sites, now household names, were barely conceptual inklings 10 years ago. These platforms offer connectivity and interactions that were previously unimaginable. With users in the hundreds of millions, and millions more added every month, social networking is closely aligned with the Millennial generation and is now an inseparable part of contemporary life.

For the enterprise, this has deep implications. As the social-networking generation enters the workforce, workers are frustrated by the rigidity and restrictions legacy systems impose. ‘Why can’t business applications be as intuitive as Facebook?’ is a common question that is already being addressed. Future workers are immersed at home in a socially networked and connected world. In an age where user-created content, the ability to connect with contacts, follow postings and engage in interactive commentary are commonplace, the enterprise must evolve to offer similarly collaborative solutions in order to maximize worker output.

Salesforce.com Chatter (Source: Salesforce.com)

The enterprise goes social

As a co-founder of Salesforce.com, Parker Harris has been a driving force behind the realization of the vision of Software as a Service, the extension to Platform as a Service and now the introduction of the social-networking paradigm to enterprise technology. As the defining vendor in the SaaS sector, Salesforce.com continues to lead the way in expanding into new areas, such as enhancing SaaS with proprietary content (through the acquisition of Jigsaw), an application exchange that predated the iTunes App Store and now the launch of the Chatter technology, which represents the next meaningful shift in ideas around collaboration, communications and content for businesses. Our discussion centered on the consumerization of IT, the impact of cloud computing on the organization and the growing need for innovations around security to address the increasingly open environment.

Key points

The combination of “Facebook-ization” and the “consumerization” of IT will result in task-oriented collaboration and applications that communicate live, actionable information to users via corporate syndication networks.

The concept of applications will evolve from monolithic systems toward smaller components focused on the task at hand.

With the consumerization of the enterprise, there is going to be a huge need for new innovation around security. As the walls around enterprise IT systems fall away, the consumer internet and the business internet converge, and the resulting effect will be more open systems that are more exposed.

The scale of the cloud and the data that resides in the cloud will provide economies of scale that will enable unprecedented sharing and insights from data.

As data becomes more communicable and collaborative within an organization, it becomes more interesting.

The integration of social-networking technologies (such as Chatter) will enable the rise of more task-based computing, as teams come together for a single project or users assemble services to address a specific need. The scale of platforms combined with collaboration-based applications will enable more event-driven communications between data and team members.

The adoption of cloud computing will result in power shifts from the CTO and IT to the business users of IT. There will be a need for IT to be nimble and self-service, and ultimately IT organizations are going to get a lot smaller.

Summary of interview on 2 August 2010.

Full transcript follows

Parker Harris transcript

Parker: The influence on Salesforce has been heavily from the consumer web and consumer technology. That is what influenced us when we started the company. We were influenced by companies like Amazon and Google. Now, a decade later, we are being influenced by companies like Facebook and Twitter. We are also being influenced by analysts like Mary Meeker and reports such as hers on the mobile internet and the big trends in that arena.

One of the trends I see over the next decade is even more consumerization of the enterprise, in terms of the technology. That is going to drive more power into the employees’ hands, and they will be looking for a lot of choices. You are seeing it now when people are demanding various devices and the capabilities they might have in a Google email system.

They want that in the enterprise or they want Facebook in the enterprise. They want it on a really cool mobile device, maybe it is their favorite one, and not just the one IT has agreed to. There will be a need for IT to be nimble, to be self service, and IT is going to get smaller. There is going to be a lot of change in back-end systems. Where are manufacturing, ERP, accounting - where is that going? That is going to get relegated more and more to being the engines of the corporation. It will probably be one of the last bastions of on-premise software. It will go to the cloud eventually. In terms of employees interfacing with these systems, it will happen less and less. There will be back-end systems that no one ever sees that run accounting and manufacturing.

The systems that users interact with will be closer to customer-facing systems. There will be messaging that reaches them from these back-end systems that tell them what they need to do or what they need to take action on. Messages will then go back to the back-end machinery to actually take care of it. A lot of technology will be very task oriented and not application oriented.

You aren’t going to think about applications, but you are going to think about what task you need to do at that moment. I would love to see, and my hope is that, task oriented also means smaller technology, that you build technology for the task. It does the task and then it has served its purpose and it is really great technology. It is not monolithic. It’s not heavy. It doesn’t take forever to deploy and you can also iterate it. That task-based solution isn’t working? You come up with another one built by whatever.

One huge thing that is going to happen is around networking, connectivity and security. Everything is going towards the consumerization of the enterprise. Someone in our company said that his vision is that the firewall disappears, that the corporate phone disappears and that the corporate email kind of disappears. You use your cell phone; maybe it is too much to say that the email system goes away, but you are no longer tightly bound to all of these systems.

Whether they are housed in your corporation or not, you feel like you are in this walled garden of the corporation. Those walls are going to go down so the consumer internet and the business internet converge. The business internet of my businesses to your business, to my customers’ internet, to my partners, those will all go down and it will all be open. When it is open like that, there is going to be a huge need for new innovation around security.

We look at what happened with some of those firms like Google, Adobe and others that were in this latest Aurora attack from, we believe, China. It certainly originated there. And when you look at what happened there with malware and very sophisticated attacks, how do you move into this future world of the consumerization of the enterprise and have the right controls to basically protect IP and conversations, and basically businesses, from those attacks? They are kind of at odds with each other a little bit.

There will be more firms, whether it is at the networking level where you can create better secure tunnels, or virus protection, or behavioral analysis. There is a cool startup firm in Palo Alto that is doing log analysis at cloud scale to detect fraud or malicious behavior for people. It is getting to the point where that will happen more and more in real time, not after the fact. You don’t want to be saying, ‘I think someone just broke into your bank,’ after it has already happened.

It will happen more at the front end, where the systems say, ‘You are doing something really weird and I am going to not let you do that.’ I think over the next 10 years there is probably going to be a huge flourishing of technology in that area. Enterprises are going to be reluctant to open up more and more. Their employees, partners and everybody else are going to push them for that openness, and there will be vendors that come in to solve that conflict.
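
A minimal sketch of that front-end, behavioral style of check, assuming a simple stream of login events: flag a user in real time once failed attempts in a short window cross a threshold, rather than discovering the break-in after the fact. The thresholds, event format and user name are invented for illustration.

```python
# Sliding-window check on a stream of login events: block a user once too
# many failures pile up inside the window.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60              # length of the sliding window
MAX_FAILURES = 5                 # failed attempts tolerated inside the window
recent_failures = defaultdict(deque)   # user -> timestamps of recent failures

def on_login_event(user, success, ts=None):
    """Return True if this event should be blocked as suspicious."""
    ts = time.time() if ts is None else ts
    window = recent_failures[user]
    # Drop failures that have fallen outside the sliding window.
    while window and ts - window[0] > WINDOW_SECONDS:
        window.popleft()
    if not success:
        window.append(ts)
    return len(window) > MAX_FAILURES

# Example: the sixth rapid failure in a minute trips the check.
for i in range(7):
    verdict = on_login_event("alice", success=False, ts=1000.0 + i)
    print("attempt", i + 1, "blocked" if verdict else "allowed")
```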

Ed: There has always been that conflict between availability and security. It has been a hindrance, but when you move into these more distributed architectures and service-based delivery models, the nature of where the data is stored and the relationships of person to machine, machine to machine and person to person become dynamic, and the systems a lot more complex.

Parker: That is absolutely right, and that is why for Salesforce one of the top values in our brand is trust and protecting the information we have in our service. At rest within our service, it is very well protected. Over time, we are going to add more things and look for partnerships to help when that data is being accessed. We have a lot of tools in our service that our customers can take advantage of to protect themselves. We are going to provide some of that, but I think there will also be vendors that come up with some very creative things, maybe even in the plumbing of the internet, to help companies feel that they have a more secure connection.

There is this wide open wild internet which will be like, ‘Ok, I am out in the public world and then I am in my office building, but it won’t be as much of this tight network and behind my firewall I have to authenticate five times.’ It will be more open.

Ed: That has been a critical issue when you try to extend communication and collaboration over a distributed network or a network of networks. The ability to tie context into policy without creating really obtrusive security and authentication models is an issue even for on-premise networks.

Parker: Yes, and there is a false sense of security that on-premise solutions are secure. They aren’t really secure if they are accessing the internet. Every company needs to be on the internet, so I believe where we are going is that firms like Salesforce and others are going to get more and more secure, because that needs to be part of our brand and part of our expertise.

We are an amalgamation of every customer and partner we work with and their best practices. That is why it is going to push cloud computing even further. Companies are going to want to be more and more out there on the internet. What is going on with my company on Twitter or on Facebook? How do I talk to my customers, sell to them and support them? How do I work with my manufacturers on the back end? All of that is going to be about communicating and collaborating. You are not able to do that if you are plugging it all together yourself; you can’t match the capabilities that larger vendors dedicated to the problem are going to bring.

Integration has been this problem where people have been creating middleware over the years to solve it. Every company has problems with it because it is a lot of work and it is all custom every time you do it. For every corporation, how do I plug all my systems together? In the future, I think there are going to be a lot more services.

That vision of a service-oriented architecture, or SOA, is the promise behind the firewall. I think some companies will get that, but the return on the expense to get there is probably not going to be there. Whereas vendors that specialize in it can work with other vendors who are out there. We can integrate with Oracle or we can integrate with . . . if you are using some Microsoft services on the internet, you can also use us and they work together.

We do it once because we know what the interfaces are between the two services on the internet and we work together as partners. More and more, it is going to be . . . there used to be this debate of ‘Is it going to be best of breed or suite?’ It used to be this discussion. In the future, it is going to be best of breed because we can finally solve the plugging together of disparate systems, but do it as services on the internet with defined interfaces that do not change.

Ed: The point you made about trust is pretty important, and I think it ties back to that ability to ensure that every one of the links in the chain is a trusted component. It would seem there is an argument that scale, for a cloud-service provider that acts as an intermediary or provides that centralized framework for trust, whether that be a Salesforce or a Google or a Microsoft, will become increasingly important. In turn that would create potential barriers to competition.

Parker: Yes, it definitely will. The larger these firms get, the more defensible their positions are, especially when they leverage what we call cloud scale. Certainly, earning the trust to provide those things is huge, not only in terms of expense but also in terms of the expertise in how you built the technology and how you operate it. When you talk about scale, you don’t install Google, for example, and say, ‘Oh, I am going to index the web too. I am just going to buy that box and put it in my business because I want to index the web for some reason.’ You don’t even think about that. It’s not even a question. Google has one of the biggest distributed computers in the world. That is cloud scale.

We have cloud scale for our databases and our analytics and the data that people put in it. In the next decade, we will get to the point that you can’t do that behind your firewall. You can’t get that level of performance because we have the economies of scale of all of our customers that we can invest, but you can tap that. You saw with utilities, electric utilities that used to be small little things and maybe you could have the generator or they could have the generator. Then it got to the point where you aren’t going to install the nuclear power plant. That is where I see utilities going. There is going to be really interesting things that are possible because of that scale. A lot of it is going to be around analytics. Around insight into your business, into your data, into what is going on that you can’t get elsewhere.

Also, assuming companies over the years want to open up some of that information and share it with each other in aggregate, you could start to get analytics on best processes. What companies are really good at selling? What are the reasons they are doing that? What kind of analytics can you provide to tell people that? We don’t have access to our customers’ data to do that, but we acquired a company called Jigsaw, which is about providing data in the cloud, in a public cloud, and about a community-based way to manage the database. It is starting out now as a contact database for companies to get your email address and your phone number so that they have the right data. That data is already out there, but how can we provide more data and more analytics around that data, in a way you can really only do on the internet?

Look at how eBay got to scale: it is because of the community that eBay is so defensible. You don’t just say, ‘I am going to build an auction site and that would be cool. You can trade stuff.’ It is not just because of the technology, but because of the people that are on it. As these companies grow, really interesting things start to happen that you can start to give back to the businesses out there.

Ed: It is interesting that you bring up Jigsaw, because one of the hypotheses we have had for a while is that the role of content as a differentiating asset for technology providers is going to increase over time. I was actually pleasantly surprised to see that Salesforce was heading in that direction with the acquisition of Jigsaw. There are a couple of points here: First, the ability to automate a business process is becoming increasingly verticalized and narrow, as is the ability to add value in a custom application on top of a platform like Force.com. This opens up a whole opportunity that is essentially democratizing innovation. Second, this ability to create business models around content breathes new life into opportunities for data providers, data-service providers and enterprises of all sizes to create new opportunities. I don’t know if you have seen what Microsoft is doing around Project Dallas, but I think that is right at the cusp of that.

Parker: Yes, it is the same thing. What is really interesting is that when we started Salesforce in 1999, we and the world thought that the differentiation of being on the internet was going to be content. I don’t know if you remember a little startup called iSyndicate, but it was a tiny company out here in Silicon Valley that was around syndication of content, providing content from all of these different sources.

Ed: There were several of those.

Parker: They went out of business, as did others. We were bullish on that and we were looking at ways to bring content in. It turned out that 10 years ago people were just looking for a better solution to these enterprise systems, one that leveraged consumer-type technologies and felt like a website. It had a full text search engine. It had a very easy-to-use interface. We just built something that was easier, more usable and friendlier. That caused the adoption, and I think that is where we have been for the last 10 years.

What is happening now is that, instead of being a system for browsing usable data, it is becoming a live system where data is pushed to you, kind of like Facebook. Also, that data is becoming interesting again. Take what we are doing with Chatter, which is, to simplify, our Facebook-like social networking for the enterprise. When you combine that with data, you want a syndication network to tell people about that data, and Chatter is the syndication network.

Ed: That has some fairly profound implications for communication and collaboration within an organization. Commentary I have gotten from some of the early users of Chatter is how this can flatten organizational structures and provide an answer to some of the collaboration and search problems that have been bedeviling enterprises for the last two decades.

Parker: Yes, it is definitely a flattening of the enterprise. Even if you look at what we are doing internally in running the development organization, we use a process called agile, or scrum. It is a very bottom-up, team-based process. I think you are going to see a flattening of the organization where organizational hierarchy has a whole new meaning. Certainly you need command and control at the end of the day, so it is clear who is going to make a final decision, but if you look at how collaboration has happened over email, it has been ineffective. At least here we use a ton of email. Conversations happen and decisions get made on email, and who gets pulled into that email is part of the decision.

It is not a hierarchical decision of who is on that email, but it is ineffective because some people might be left off that should have been part of that. Instead of directing it to specific people and maybe making a mistake of who is supposed to be included, the Chatter system includes people through the metaphor of following. You get included because you’ve shown interest and you discovered these things because you are told about it. It creates that collaboration model that really doesn’t look at the hierarchy of the enterprise, but looks at team. Teams change week to week depending on what you are working on.

That is back to what we were talking about earlier in the conversation when I was talking about task-based technology. If you think about Chatter and virtual teams that are forming, they are forming around specific tasks in the enterprise to do something. It might be to close a big deal. It might be to solve a support case. It might be to build a product. It might be to acquire a company. All of those are examples of projects or initiatives or tasks and people are coming together to solve them. A lot of that, by the way, will be done on mobile devices. You look at what is happening with computing.

Apple has done a phenomenal job in changing mobile technology and bringing in touch interfaces and unlocking that whole market through their technology and their partnership with AT&T, which we could talk about at length. They have really changed that game and everyone is following them. It is happening really, really quickly. I tried to use my iPad instead of my laptop and it is just not there. I tried to use it going on a trip and still there are things that were lacking there, but there are going to be different ways to use computers. I saw in Wired magazine there was some cool new computer that had two touch screens that opened up like a clamshell and you could have two different things happening on it. You could be looking at a spreadsheet on one and something else on another. Maybe you are watching a movie on one and you are doing your work on another.

Ed: It is like a slide-able dual-screen laptop. I think Sanyo might have it available in Japan.

Parker: That was ‘Wow, that is a different way to think about computing and interfaces!’ I was surprised to see Microsoft not do more with its surface technology that I thought was pretty cool. It didn’t really go anywhere. There are going to be changes in how people interact, and it has been a long time that the graphical user interface and mouse and menu have been around.

We saw a shift to internet UIs based on consumer websites, but now we are going to see another shift that moves that to paradigms that involve touch and different ways to interact. I don’t know if you saw a company called Flipboard. It is just a new startup, but it is a pretty cool UI on the iPad for looking at content. If you look at the newspaper industry, they created the UI for viewing a lot of content in a newspaper. Headlines and how do they put the articles on the page. It is not by accident that they are designed that way. Maybe there is something there, if you combined it with touch, the computer and browsing, what does that look like? There are people that are experimenting with that. How people interact with computers will change again during the next decade.

Ed: We have been talking with folks about touch and even non-touch interfaces, with some of the work around gaming technology and motion sensing. There is even a TED Talk from a company called Emotiv showing how you can wear a headset and train your brain’s theta waves to issue commands to a computer screen. It is pretty amazing if you think about the potential for people who are physically disabled or immobilized in one fashion or another. Of course, you have speech technologies that continue to evolve. The idea of having ubiquitous bandwidth, access to the mobile device and access to these cloud-based services is a theme that has come up in several of our conversations. I definitely look at the cloud-service providers like Salesforce, the application vendors and the people developing more services to play a big role in that.

Parker: Yes, I totally agree. Look at mobile specifically. Years ago when we started Salesforce, Blackberry was getting popular, but it was about how you put all the computing on the device. Connectivity wasn’t really there, so you basically had to have the entire app on the device. Now, connectivity for mobile is essentially there with 3G. You can argue that in third-world countries connectivity isn’t there, but over time it will get better. You don’t have to have the entire app on the device, and you can have a very rich experience leveraging computing power in the cloud to deliver an incredible application without needing a massive CPU and a ton of memory and storage on the device, which would ruin the form factor.

I remember Jim Balsillie was at one of our events years ago. He got on stage and talked about how difficult it was to figure out the battery life of a Blackberry and people talked about packing it full of power. He said it is basically like building a bomb. How can you pack a ton of energy in a very tight small space? That is really not going to go away. There will be more and more computing power in these devices, but I think we should assume there will be computing power in the cloud that will always exceed that and will make that experience even better.

Ed: The opportunity for innovation in dedicated applications running off a platform as a service or an infrastructure as a service is one of the themes that we keep coming back to. Your vision around Force.com, the potential transition in business models that it is creating and the potential opportunities for service providers or systems integrators, formerly VARs, is amazing.

Companies that are very close to the ground and understand business processes in very specific domains could drive a real explosion of these - call them micro-vertical applications. We would love to get your thoughts on innovation in the cloud and on platforms such as Force.com.

Parker: Ironically, we started out before Apple and we had something that we called the AppStore. We gave that up when Apple launched its store. If you look at Apple as an example, the apps that are on there are all pretty small, tight and purpose built. There are a lot of games, but there are other specific things and it is a huge success. You look at the pricing of them and it is a no brainer to buy an app for US$1 or US$2, but these companies are making massive amounts of money on small transactions.

If you look at the AppExchange, it is slightly different, but I see it evolving towards that model, and I think mobility and Chatter will also affect that. On the AppExchange, we have the Force.com platform, on which people are building applications for specific tasks. Some of these are coming from the world of client-server computing and the question of what solutions we need for sales people. A lot of what is on there is in the sales area right now, but it will start evolving to where these applications are even tighter and more task built.

They will also be mobility enabled because our platform will allow these apps to be mobile and socially enabled, so the data, documents and people are connected in these apps because they are built on our platform, which has Chatter built into it. There will be a lot of enterprise systems built on our service, on Force.com and the AppExchange, but part of it for me is that customers want to see that it is a very open system. They want to see that these applications are open and interact really nicely with other services on the internet, and that we work with other vendors on technologies like OpenID and OAuth, which handle identification and authorization for APIs, making it really easy to code up integration between two services.
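To make the OpenID/OAuth point concrete, here is a minimal, hypothetical sketch (in Python) of one service calling another with an OAuth-style bearer token. The endpoint URL and token value are invented for illustration; a real integration would obtain the token through the provider’s authorization flow.

# Hypothetical sketch: calling another service's REST API with an
# OAuth-style bearer token. URL and token are placeholders, not a real API.
import json
import urllib.request

ACCESS_TOKEN = "token-obtained-from-the-oauth-flow"   # assumption
API_URL = "https://api.example.com/v1/contacts"       # hypothetical endpoint

def fetch_contacts():
    req = urllib.request.Request(
        API_URL,
        headers={"Authorization": "Bearer " + ACCESS_TOKEN,
                 "Accept": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    for contact in fetch_contacts():
        print(contact.get("name"))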

You are seeing it with Facebook . . . Facebook Connect is very open. People are putting buttons on their websites that make it very easy to throw something on Facebook. You see applications on Facebook that are embedded in there. You are going to see that more and more, and I do think you are going to start to see user applications. Facebook is all about applications for the user. AppExchange is about applications for the corporation and the corporation decides to buy them or not.

I think you are going to see applications that users are going to buy for themselves, for their jobs. Certainly multiple users might start using them, but it will be initiated by the user. They will grab an application and it will start working for them. It will work within Chatter, and then maybe other people are virally added to it, just as you see on Facebook.

A lot of it is around gaming. You start using FarmVille, you buy your tractors and other things, and you invite other people to come onto your farm; many of these other applications work the same way. I see people grabbing collaboration and productivity applications to do their jobs in the enterprise, other people joining, and that driving the marketplace.

I think it will drive adoption even faster than the typical IT model, which is ‘Let’s figure out what problems we need to solve. Let’s do the selection. Let’s implement the technology.’ You go through this long process and by then the users are beyond you. What is going to happen is that the users are going to try it and either it works or it doesn’t. If it starts working, great, more people will use it, and the best, most adopted technology will win. That contrasts with the model we are very much in today, which is direct sales to the top brass of the corporation.

Ed: There will be a lot of innovation over the next decade. I would have to concur. I think the question that I have for everybody is what are the big risks, challenges, or even black swans, out there that could create challenges or havoc over the next decade? Could it be privacy or security? Would love to get your thoughts on some of the challenges to realizing the promise over the next decade.

Parker: As we have talked about: trust, privacy, security and compliance. All of that is something we look at every day and invest heavily in. It will be an obstacle that must be overcome. There will also be legacy vendors that don’t want to see this change happen. If you look at Oracle, they are investing heavily in Fusion, which is basically putting a new skin on all their apps - albeit the UI is not bad; it is a more modern UI, good for them. They are stitching together all of these acquisitions so your company can run all this software behind the firewall. They don’t really, in their hearts, want to see everything going to the cloud, because they sell a bunch of software and a bunch of databases, and now they are going to sell hardware with the Sun acquisition. They will essentially be competitors with IBM now. They don’t want to see this happen.

IBM is kind of interesting. They are very services oriented - selling services, though not cloud services. They are moving a lot into the cloud. They are moving Lotus into the cloud, and a lot of that is around a unified-messaging play. Cisco is doing that as well. I think there are going to be some vendors that want to stop this from happening, or steer the market where they want it to go - just like when you saw the mainframe give way to client-server, when Oracle was born. There were companies in the mainframe world that didn’t want to see that happen. We are still in the middle of this change.


The “Facebook-ization” of the enterprise, that influence and mobile are just part of the evolution, and will probably become established over the next decade. After trust, privacy and security, I think the obstacle is going to be the many vendors that don’t want to see this happen and have a lot vested there, and the CIOs and IT organizations that feel like they are losing something. Certainly the Oracle corporations of the world are going to influence them and make them feel like they are losing something as they move to these services. They perceive that they are giving up power to their employees, which is kind of crazy - you want the employees to be more successful - but you are going to see power struggles in the enterprise.

Employees want more technology and want to be served in a better way. IT sometimes may not be able to see what is in it for them, when in fact it could be huge for them. We are seeing a lot of companies move to that, but sometimes it is a struggle. As you get to the large corporations, they have a lot invested there and they don’t see the opportunities. You will see that conflict happen within the enterprise and I think that is a challenge that has to be overcome, both for us and other cloud vendors as well as for the enterprises themselves. I think it is the employees. At the end of the day, that will drive this. They are going to be pushing for this to happen.

Ed: We have really seen a shift in power over the last decade and I think it is only going to change in the next 10 years.

Parker: Definitely, although when you saw the economy drop there for a little bit, power shifted back to the CIO and IT. When things became purely economic decisions, sometimes they would make decisions that might not be the right ones, but they would do it for economic reasons.

Ed: It will be interesting to predict who the next Amdahl or Wang Computer will be for the 2010s, but certainly we have some creative destruction going on and some surging new ways of solving problems that are emerging. There is a lot of opportunity for value creation.

Parker: Yes.

Ed: That is really helpful, Parker. One of the interesting analogies that has come up is that when you look at cloud IT resources, you often hear this analogy of a utility. Innovation is a personal passion of mine, to understand how it evolves. Before the industrial revolution, if you wanted to make bread, you had to have a mill next to the water with a wheel that would generate the power to grind the flour. As soon as you got electricity that could be generated by a utility and accessed by plugging into a wall, you had this incredible increase in productivity, output and value creation across all sectors of the economy.

I think the analogy is that in many ways we have had this build-it-yourself approach over the last three to four decades. Now, with the ability to plug into the cloud and get those resources without having to worry about all of the individual components, I think the potential for value creation over time is similarly promising. That is a bit of the vision that we share and are becoming more convinced of. We are excited that you guys are going to be a part of it and appreciate your thoughts on all of these topics.


Dave Kellogg, MarkLogic

Dave Kellogg is CEO of MarkLogic, responsible for the company’s worldwide business and setting the overall direction of the company. He is passionate about the database and analytics-software industry where he has spent the past 20 years in marketing, operations, strategy and product lifecycle management roles.

Prior to joining MarkLogic, Dave was senior VP of worldwide marketing at Business Objects, a leading business-intelligence software company. Before that, he was VP of marketing at Versant Object Technology, a provider of object database management systems. He also held a number of technical and product marketing positions at Ingres, a provider of relational database systems. Dave sits on the board of the big data analytics company Aster Data.

Dave has a dual bachelor’s degree in geophysics and applied mathematics from the University of California at Berkeley, and a master’s degree in business administration from Saint Mary’s College. Dave is an active blogger and author of the award-winning Kellblog.

Capturing structure in the “Wild West” of unstructured data

Data growth has been exponential, and managing the storage processes related to the capture and movement of data has become more complex. Various industry analysts concur that the growth of data over the next few years is likely to be enormous. Gartner estimates that 40 exabytes of new unstructured information was generated worldwide in 2009 and forecasts 650% growth in enterprise data over the next five years. An August 2008 study by Forrester Research quantified the following: online transactional data and repositories growing 50% annually, 27% of enterprises surveyed running production databases of 50 or more terabytes, and 20% of datacenter-infrastructure spending directed to supporting data growth.

As mobile devices penetrate societies around the globe and demand for entertainment and information increases, data is being produced at an unimaginable pace, a phenomenon resulting in “big data.” In the future, every device, every machine and every checkpoint in the physical and online world will be connected and always on. The resulting “Internet of Things” will create logistical and technological challenges which must be addressed. The coming age of big data requires significant advances in storage and accessibility solutions so analytics can be quickly and accurately performed in real time, in order for the benefits of a truly connected world to be realized.

Figure: Data growth continues unabated - total bits shipped, 1970-2010 (log scale). Source: Ray Kurzweil and KurzweilAI.net


Showdown at the unstructured-data corral

Our conversation with Dave Kellogg covered the “Wild West of data”: the world of unstructured data, which includes spreadsheets, word-processing documents, images and video - essentially any type of file, document or data that doesn’t fit in the rows and tables of a traditional relational database. Dave has been involved in business intelligence and unstructured data throughout his career, and his observations are opinionated and always insightful. His firm MarkLogic is essentially a specialized database for XML data. Our topics ranged from how information is organized to the future of the traditional software vendors.

Key points

Unstructured information will be “database-ized” through platforms that help to organize data.

Tagging information will become more automated and intelligent than today.

Text mining (analytics on unstructured data) will work with increasing accuracy. This will generate advancements in automatically generated metadata (“data about data”) for unstructured information. This will make it easier to identify and extract information, effectively “liberating” information stored in disparate systems.

“Big data” will disrupt the traditional database market as cloud adoption catalyzes a new paradigm in computing. Traditional IT, database and BI vendors are at risk of losing out in the new paradigm of cloud service and big data.

Another “big bang” is coming in data and analytics, and it will be driven by the cloud, resetting the whole IT stack. While there aren’t a great number of successful SaaS companies today, this will change over the next 10 years.

There will be new methods for analyzing data in-memory and in-database, both structured and unstructured, at scale and in more intelligent, predictive ways.

‘As data continues to grow, you can’t pull data out of the database and plug it into an enterprise-data warehouse or SAS statistical software, you will have to plug SAS into the database.’

A new breed of information applications will leverage the cloud for scale and apply analytics to existing datasets, creating opportunities for content providers to gain vertical share.

‘Salesforce.com and Netsuite are empty things. The same can be said of QuickBooks. New models are forming that sell data with the applications.’

IT has transitioned from a role of developer to integrator, and IT will continue to transition to a role as a services integrator focused on cloud services.

Summary of interview on 14 July 2010.

Full transcript follows



Dave Kellogg transcript

Ed: You have talked about unstructured data being the Wild West, and we continue to see creation of data and information increasing at an exponential rate. Ten years from now, is it going to become so unmanageable that we need to find new ways to filter, access and focus data? How do these management challenges, and potentially the use cases, change over time?

Dave: I’ll start close to home, in unstructured data and databases. This ties right to why I joined this company: I came to MarkLogic because I wanted to do unstructured data, having spent almost 25 to 30 years “platform-izing” databases - basically separating the database from the application. The big thing that happened with the whole relational-database movement, which I will call “platform-izing,” was getting the data into a database and separating it from the application; the same needs to happen for unstructured information and document-oriented applications, enabling analytics on top of it and enabling control and dynamic reuse.

I’ll give you a simple example. Most marketing departments today may have a SharePoint portal, and post the latest version of the company’s sales deck and the latest data sheet, but the portal that has all the sales data isn’t hooked to anything that has anything to do with how RFPs get made in the field. It’s not hooked to how presentations get made in the field. And it’s not hooked to the web content, so you have all this redundant, inconsistent information. We used to have this in databases, right?

Some little companies run on a bunch of disconnected Excel spreadsheets, and eventually somebody comes along and says, ‘That’s crazy. You can’t run your finances that way. You need to put that in a database and have a master copy and have data integrity and reporting and analytics.’ Well, just picking on documents for now - in legal or any other department - all the content is strewn all over. It’s redundant and inconsistent. I think in the next 10 years, that’s going to change.

The unstructured information is going to get “database-ized,” and we are going to get a platform for unstructured information, such as an ECM system or SharePoint. SharePoint doesn’t solve web-content delivery, or custom publishing, or creating RFPs dynamically, etc, so I think you are going to see a lot of change in how people view documents. It’s really the Wild West to me; this stuff is everywhere. That’s the first thing that I think is going to happen.
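As a rough illustration of what “database-izing” documents might look like, here is a small Python sketch: each document lives once in a queryable store with metadata, instead of being strewn across portals and file shares. Table and field names are invented for the example.

# Illustrative sketch only: a single store of documents with queryable
# metadata, standing in for a "database-ized" content platform.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE documents (
        id       INTEGER PRIMARY KEY,
        title    TEXT,
        doc_type TEXT,   -- eg, 'sales deck', 'data sheet', 'RFP answer'
        owner    TEXT,
        body     TEXT)""")

docs = [
    ("Q3 sales deck", "sales deck", "marketing", "latest positioning"),
    ("Product data sheet", "data sheet", "marketing", "specifications"),
    ("RFP answer: security", "RFP answer", "field", "reusable boilerplate"),
]
conn.executemany(
    "INSERT INTO documents (title, doc_type, owner, body) VALUES (?, ?, ?, ?)",
    docs)

# One master copy means the portal, RFP tools and website can share a query.
for title, owner in conn.execute(
        "SELECT title, owner FROM documents WHERE doc_type = 'RFP answer'"):
    print(title, "-", owner)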

The second thing that I am reasonably excited about is text mining, for lack of a better phrase. It almost works today, and I think in 10 years it is actually going to work. Good text-mining algorithms today, if they are properly trained on the right set, can get 80% accuracy. That’s not 100%, but they can get up into the 80s, which is a lot better than the 50s and 40s of others. I believe that is only going to continue to improve, and we are going to have more automatically generated metadata about unstructured information, since that’s what text-mining tools produce.

You will actually be able to run queries so that, all of a sudden, something like reputation management as an app will be a lot easier - even silly, simple things like looking for slides in the corporate corpus about the company Apple. You know Apple is a company because a text-mining tool found it, and you’re not finding all the slides that talk about apple the fruit. Twenty years ago, if you typed a word wrong in a word-processing program, you would have been shocked if it suggested a spelling improvement.

Today nobody uses the smart tags in Office because they are really irritating and don’t seem to add any value. But I believe 10 years from now, we will expect more than spell checking. When you type “orange,” it’s going to look at the context and understand whether you meant the company; with a little click - or maybe just by continuing to type if it got it right - you are going to be marking up all this content automatically. You can see the future today: Office has it today, and it’s called smart tags. It doesn’t work and everybody hates it, but it is going to work in the future, because we will view mid-level language processing - identifying people, places, things, travel facts, credit card numbers - the way we view spell checking today; we will take it for granted. We will think, ‘Of course it spell checks, of course it finds entities, of course it identifies facts, of course it’s putting metadata in.’
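A toy sketch of that “mid-level language processing” idea - marking up entities automatically so metadata is generated as people type - might look like the following. The entity dictionary here is a stand-in assumption; real systems rely on trained statistical models rather than lookups.

# Toy illustration only: tag known entities so metadata is generated
# automatically. KNOWN_ENTITIES is an invented dictionary for the example.
import re

KNOWN_ENTITIES = {
    "Apple": "company",
    "Oracle": "company",
    "New York": "place",
}

def tag_entities(text):
    """Return (entity, type, position) tuples found in the text."""
    found = []
    for name, etype in KNOWN_ENTITIES.items():
        for match in re.finditer(re.escape(name), text):
            found.append((name, etype, match.start()))
    return sorted(found, key=lambda hit: hit[2])

print(tag_entities("Apple and Oracle both opened offices in New York."))
# [('Apple', 'company', 0), ('Oracle', 'company', 10), ('New York', 'place', 40)]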

Word has had document properties for decades, but it’s awful. If you go to SharePoint and look at the properties associated with a document, they are wrong all the time. It’s the wrong author, half of them are blank; it just doesn’t work. What will work is automatic identification and extraction - I am positive about that, the technology just isn’t quite there yet. This leads into the whole semantic-web question: where does it stop? We are capable of handling what I call one level up from spell check. Can we really get up two levels, where we know Bandit is a dog, dogs bark, therefore Bandit barks? Being able to take the first two facts and derive new knowledge (ie, infer the third), I don’t think that’s going to happen, but it’s something people talk about happening in 10 years. I think they’re dreaming. I think in 10 years we can actually get the tagging right, and then you will be able to run better searches and queries to find information, but I’m skeptical about the inference capabilities.


While there is a lot of semantic-web work that I agree with - representing information as triples and searching triples is nice - to me it’s basically an 80%-to-the-fourth problem. What I mean by that is 0.8 x 0.8 x 0.8 x 0.8 ≈ 0.4, so when you take four operations that each work at the 80% level and stack them on top of each other, you are working at roughly the 40% level.

That’s what I think the problem is, because the entity extraction only works 80% and then the fact identification only works 80% and the inferencing only works 80%, and you pile that all up and you just have garbage. I am not a huge believer going forward in the derivation of new knowledge from existing content, but up until that boundary I am actually on board. I think we are going to be able to harness and liberate the information we have.
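The compounding-accuracy argument is easy to check in a couple of lines; the sketch below simply multiplies out the per-stage accuracy Dave assumes (80%) across a growing pipeline.

# Each stage is ~80% accurate; stacking stages multiplies the error.
def pipeline_accuracy(stage_accuracy, stages):
    return stage_accuracy ** stages

for n in range(1, 5):
    print(n, "stages at 80% ->", round(pipeline_accuracy(0.8, n), 2))
# 1 stages at 80% -> 0.8
# 2 stages at 80% -> 0.64
# 3 stages at 80% -> 0.51
# 4 stages at 80% -> 0.41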

Ed: In many respects, it’s your classic data quality problem, just on a semantic level.

Dave: Yes, on a vast scale, on a semantic level, and with automatically generated data. I think we have kind of proven that companies are not going to hire librarians to build tags, except for certain special applications. Your average company will not hire a librarian to go around, find all the content, put it in a taxonomy, write abstracts for it and apply keywords. That’s just never going to happen. Users, unless they have great automatic tools, aren’t going to provide any good metadata for content (ie, nobody fills in the properties tab).

That leaves you right where we are today. If you are an optimist, I think we’ll end up with automatically generated metadata, where the user’s job is reduced to saying yes when the application guesses the metadata properties. That’s the issue. It’s kind of like data quality - 0.8 squared - because you have the same problem as with data quality, except the data is being generated, not just collected.

Ed: To paraphrase, what you are seeing will be interactive where the data will be automatically generated, but you have an opt-out or a potential way to correct so that it is a relatively non-intrusive work flow for users. It is not asking people to create their own metadata, which is just an additional step which no one wants to do anyway?

Dave: Yes, that’s exactly what I am saying. The other reason I jumped from numbers to words is that words are much harder than numbers. You know 10 is always less than 11 and always greater than 9, but sometimes you want fruit to match apple, and sometimes you don’t. If you are searching for all fruit tart recipes, you want to get apple tart even though the word fruit doesn’t appear in apple tart. It’s that kind of subtlety that makes words hard and fun, and makes interpretation of semantics hard.
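One way to picture the fruit/apple subtlety is query expansion against a taxonomy; the sketch below is illustrative only, with a made-up taxonomy and recipe list.

# Illustrative only: whether "fruit" matches "apple" depends on a taxonomy
# and on whether the searcher wants the broader sense. Data is made up.
TAXONOMY = {"fruit": ["apple", "pear", "plum"]}
RECIPES = ["apple tart", "plum pudding", "cheese souffle"]

def search(query, expand=True):
    terms = {query}
    if expand:
        terms.update(TAXONOMY.get(query, []))
    return [recipe for recipe in RECIPES if any(t in recipe for t in terms)]

print(search("fruit"))                 # ['apple tart', 'plum pudding']
print(search("fruit", expand=False))   # [] - a literal match finds nothing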

I think we will make some progress with this semantic-web vision, but some people go off the reservation and take the full vision, where you are inferring new facts and then storing those facts: we have determined that Bandit is a dog because only dogs bark and Bandit barks, therefore Bandit is a dog - and then you shove this derived knowledge into the database so it becomes a fact, and you do more inferencing on top of that. To me, that’s just like, whoa!

Ed: Each one of those false assumptions might hold up until you put some weight on it.

Dave: It’s just garbage, totally. I’m a big believer in part of the semantic web vision. That’s the nice thing about the 10-year timeline. Maybe somebody figures it out, but not in 10 years.

Ed: It’s not about the limitations of processing, and it’s not necessarily the physical limitations of how much storage we have. By then Moore’s Law will have kicked in and we won’t be bounded as much by pure technology; it comes down to software and being able to write the logic on top of it so we can parse it.

Dave: They are algorithms, and they are hard. More computing power will help. The problems today are not a result of computer power limitations. It’s the limitations of algorithms.

Some of the other things that I think of when I think 10 years out . . . you know I love the whole IT-as-a-utility, cloud-computing and API-based stuff. I think that’s all very, very real. I think organizations in general don’t like their IT departments, and SaaS apps such as Salesforce.com, cloud-based applications and cloud-based APIs are going to help drive what you could call utility IT, or whatever you want to call it. I think that’s going to be very real, and I think lots of computing power is going to move out of companies into cloud providers.

I think people love the dynamicity of cloud. There will probably be some privacy disasters, but they love the dynamicity, they love the efficiency, and they basically love the fact that it lets them focus on their business. I can tell you already, the simplest example in our current day-to-day life is when our customers want to do a proof of concept: we show up and say we can do this with our stuff. We get some guy who owns a business - because that’s who we sell to, particularly in media - who wants to try our software.


We tell them, ‘Look, we can build you a POC,’ and he thinks, ‘Oh my gosh, I will have to go to IT, get rack space, get approval, have a committee, and blah blah.’ Or, we can just throw up the proof of concept on Amazon’s EC2, and he thinks ‘yes,’ because that lets him bypass his IT department, which is overburdened, buried in work and probably, depending on the company, bureaucratic and just not agile. Throwing it on EC2 means dealing with a business that treats him like a customer, and this is the original reason that we moved to Salesforce.com. I told them, and this is years ago, ‘IT, you are not treating us like a customer, and I am going to find somebody who is.’ That’s going to happen over and over again at the infrastructure level, whether it’s disk, box or CPU cycles.

That’s one of the beautiful things about the cloud: you can buy it at so many different levels. If you want disk, if you want a database, you can get it. If you want a full-blown application, you can get it. If you want an API on top of that, and so on. To me, it’s actually quite exciting, because the idea has been around for quite a while, Ed. This goes all the way back to the CORBA vision of service orientation - that you’d have a bus out there, and that bus is now the internet, where you can access services, call them, pass them information, and they will do it all properly for you. That vision is going to happen, and I think IT will transition to a role as an integrator. In the old days IT used to be the developer, and then IT became the integrator.

I think what happens in 10 years is that IT becomes the services integrator - weaving together cloud services, apps and the APIs to those apps, provided as cloud services. I think it all makes sense, because who wants to be setting up computers and buying disk space? Nobody. Who wants to be configuring operating systems and maybe even databases? Nobody. Who wants to install their own salesforce-automation system? Nobody. I think you are going to see a lot of change in those areas.

By the way, I do believe that while it hasn’t really happened yet, it will eventually disrupt some of the oligopolies. IT today is starting to look a little too much like it did when I joined IT in the mid 80s. IBM ruled the world, you had to buy a mainframe, it was incredibly expensive. You bought the whole stack, you bought everything, you bought the disks, the computer, the OS, the application, the database; you bought it all from IBM. It cost a fortune, and it’s like déjà vu all over again.

Now you go to Oracle and buy your database machine with the hardware and the storage and the app. I feel like I joined right after the client-server big bang, when everybody said, ‘Screw you, I’m going to buy my own computer.’ Now it’s like, ‘Screw you, I’m going to go put it up on EC2.’ It’s déjà vu. ‘Forget you, IBM. Your margins are way too high. I’m going to go buy a thing from this little company called Oracle.’

Now it’s going to be ‘Screw you, Oracle. I’m going to go run it on Cloudera, and I’m going to go use Hadoop in the cloud. Forget about you. I don’t need your tables. Your tables don’t work very well for representing the information I have today, because I’m not just trying to store numbers, I’m trying to store everything. You’ve got a database that can only hold 10% of my information, so I’m going to go put it, hopefully, in MarkLogic, or if not, in one of the alternative databases out there, such as Cassandra or Hadoop - all that stuff.’

I’m a big believer in what I call database alternatives and alternative databases. I view things like Hadoop and Cassandra as database alternatives because they don’t quite meet my definition of a database system, but nevertheless you can use them; they are great infrastructure for certain apps. I view things like MarkLogic or Aster Data as alternative databases: they really are databases, but they are either not relational or in some way different from Oracle. In Aster’s case, it’s super-high-end analytics.

I think we are going to see one of these big bangs again, and it’s going to be driven by the cloud. Just as the move from mainframe to mini reset the whole stack, I honestly believe the cloud is going to reset the whole stack. It just hasn’t happened yet - there aren’t that many successful SaaS companies - but it didn’t happen overnight last time either.

Ed: I found it intriguing how Reed Elsevier and other proprietary media companies are looking at the cloud as a way to offset the decline in sales of their traditional businesses of hardbound books, periodicals and content. Now, with the rise of the iPad and tablet computing - what we will call edge devices - there might be a second life, or at least an offset to these declining secular business models, through cloud services.

Dave: Absolutely. The fun thing about looking at the media space right now is that it’s breaking down so many walls. It was probably 25 years ago when it dawned on me - maybe this was obvious to everybody - that I’m in the movie business: I go and raise a lot of venture capital, I hire a bunch of developers, I go and build a product, and it costs US$20 or US$30 million. Then I launch, and once I launch it costs me a nickel to make the copies. It’s the same economics as software (ie, these kinds of zero-marginal-cost businesses). When I realized that - remember they used to call us software publishers in the old days. It wasn’t actually bad; it seems quaint now, but it wasn’t a bad metaphor. Instead of publishing words, we were publishing programs.

Ed: Well, it’s still digital bits and bytes. The difference 10 years ago was that for a music CD it didn’t matter whether you spent US$2 million or US$2,000 to create it; it still retailed for US$15.99. But if you wrote a CD for enterprise accounting, you could charge six figures for it.

Dave: And the media companies used to sell hardbound books. These guys are not dumb - I mean, some of them are, but lots of them aren’t - and they have realized that they need to sell what I just call cloud-information services. You can’t see where the software stops and the content starts. When you buy a product like Pathology Consultant from Reed Elsevier, you aren’t buying books anymore. You are buying an online service. It is a cloud service by definition.

You log into it the same way you log into Salesforce.com or anything else, but instead of your sales pipeline, it’s got slides and stains of different pathologies and tumors, and people sit there doing differential diagnosis using this thing. That’s what we sometimes call a content application, or information application. It’s part software, part content and part cloud.

Once LexisNexis sells you a bunch of legal cases, they will sell you tools to automate your law office. These guys aren’t dumb. They say, ‘We’ve got a time tracking system for partners as long as you are buying our content,’ so the media players are actually interesting attackers on certain professional segments.

If you are selling to people who prepare taxes all day, a big part of their life is friendly access to the tax code. If you are selling tax software but not tax content, somebody who is selling both, well integrated, has a really good chance of beating you. I think you will find that in more sectors, whether it’s tax-preparation software with the tax code, or software that helps a pathologist diagnose a slide along with all the content from 50 pathology books. I think we are going to see more and more of those information apps, because the initial cloud apps are empty.

You license Salesforce.com or NetSuite and there is nothing there. It’s just an empty thing. The same can be said for installing QuickBooks: there’s no data, and you build up your data over time. For many of these types of applications, you want the data to come with the app, so I think the information providers and media companies are already in this type of business to some degree, and we are going to see more of it in addition to the already well-known cloud startups. I think we are going to see more models like LexisNexis and Reed Elsevier saying, ‘You know what, we are selling software too and we aren’t bad at it.’ Bloomberg fits this definition perfectly. Bloomberg is one of the best technology companies I have met, period. It’s not just Google that can hire smart engineers; Bloomberg can too, and Bloomberg has content, and plenty of it.

As you know, I’m also on the board of Aster Data, and the reason I joined that board was the big-data problem. This is something you talked about earlier: at some point, when the data gets so big, what breaks? To me, the answer is your ability to do complex analytics on the data. Let’s just pick SAS as an example. The typical way people use SAS is they pull a lot of data out of the database, they load it into SAS, they tell SAS to go grind on it, and SAS comes back with really interesting observations, analysis or predictions. The bigger the data gets, the more you just can’t do that anymore.

We have already watched big data disrupt the database market. That’s what the whole Hadoop, Cassandra and HBase thing is about; it’s basically saying that sharding MySQL is labor intensive and doesn’t work. And by the way, you don’t need all the benefits of a database management system for a lot of these apps. When scaling matters a lot, just use the MapReduce paradigm and automatically distribute the work across a bunch of nodes. MapReduce is very much a cloud technology - the whole notion of dynamically spinning up 1,000 servers, bang. You need that from a hardware perspective, but you need software that can say, ‘Go run on 1,000 servers.’

I mean, if your software can only run on one server, it doesn’t matter if you have access to a thousand in the blink of an eye. The MapReduce paradigm of computing asks: how many computers do I have? OK, let’s partition the work across that many - and that might be 10, or it might be 100, and the beautiful thing about MapReduce scalability is it might be 100,000 if you have that big a problem. I think big data drives MapReduce, which basically drives database disruption, and I think we will see more of that.
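A minimal sketch of that partition-the-work idea, with Python’s multiprocessing standing in for a cluster (the data and the four-way split are invented for illustration): the map step runs independently on each partition and a reduce step merges the results, so the same code works whether there are 4 partitions or 100,000.

# Illustrative MapReduce-style word count; multiprocessing stands in for
# a cluster of nodes. Everything here is made up for the example.
from collections import Counter
from multiprocessing import Pool

def map_count(lines):
    """Map step: count words within one partition of the data."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def reduce_counts(partials):
    """Reduce step: merge the per-partition counts."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    lines = ["big data big cloud", "cloud data", "big big data"] * 1000
    partitions = [lines[i::4] for i in range(4)]     # pretend: four nodes
    with Pool(4) as pool:
        print(reduce_counts(pool.map(map_count, partitions)).most_common(3))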

The other thing you will see is this concept that Aster is pioneering, and I think it is going to have to take off, which is in-database analytics. The argument is that as the data gets bigger and bigger, you can’t pull the data out and move it to SAS. You need to take SAS and plug it into the database. People have talked about this for a long time. We have anticipated this day coming in database land for 20 or 30 years and it’s never really come, but it’s starting to now.

Databases are getting big enough, and that’s why SAS is partnering with people like Aster and the other high-end analytics vendors, because they know it. You can’t bring the data to the analytic tool; you need to bring the analytic tool to the data, because there is just too much of it. That will change a lot of things. I think it’s obvious it’s going to change a lot because it’s a whole different paradigm of computing. That says a lot of interesting things about the analytics vendors - what’s going to happen to SAS, does SAS become a database vendor?

Ed: How closely is this related to the in-memory analytics that SAP is talking about?

Dave: I wanted to talk about that too. It’s exactly the topic I was about to get to. The argument on in-memory is really interesting. I will tell you the one thing I am sure of and we will figure out what it means. The thing I am sure of, as a guy that’s been in data and databases for 25-plus years, is that a vast majority of the optimization and work in database systems goes towards data integrity. Basically, it’s about disks. You’ve got to make sure the data is written to disks - that’s the only place it’s safe. Or, you have to make sure if the disk blows up, the system keeps running. Or if you are going to process a query and do it quickly, you have to avoid disk IO as much as humanly possible.

I used to joke that query optimizers were basically IO avoiders, because IO is the most expensive thing; it’s two orders of magnitude slower than memory access. Think about all the money Oracle has invested in its query optimizer to avoid IO, but now all of a sudden IO costs a tenth as much with SSDs, which I think are really going to change the world, because for the first time there are two different things going on. Memory is 100 times faster than disk, but SSDs are 10 times faster than disk at a price/performance point that sits very nicely in the middle. Some folks are going to go all the way to memory and try to load these databases into memory, but don’t forget about SSDs.
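Taking the ratios Dave cites at face value (memory roughly 100x faster than disk, SSD roughly 10x faster), a back-of-the-envelope sketch shows why the tier matters for a query that touches a lot of records; the absolute access times below are assumed purely for illustration.

# Assumed access times, chosen only to respect the 1:10:100 ratio above.
ACCESS_TIME_US = {"memory": 1, "ssd": 10, "disk": 100}   # microseconds

records_touched = 1_000_000
for tier, microseconds in ACCESS_TIME_US.items():
    seconds = records_touched * microseconds / 1_000_000
    print(f"{tier:>6}: ~{seconds:.0f} s to touch {records_touched:,} records")
# memory: ~1 s    ssd: ~10 s    disk: ~100 s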

I’ll just tell you what the in-memory database argument is; there are two different arguments. There is an analytics argument that says a huge percentage of the size of a data warehouse is pre-computed analytics. That was the trick of the last 20 years to make analytics go fast: don’t count at query time - or, put more precisely, count at night. Go count at night how many of this unit we sold versus that unit, in New York, in Chicago, so that when you ask, the database system can answer with one lookup. Pre-computation (ie, what effectively made Arbor famous with OLAP) was the trick of the last 20 years. But pre-computation takes space. These things get massive because you are pre-computing this entire cube of possibilities, and then people come along and say, ‘Hey, wait a minute, if you could just put the source data itself in memory, you could do whatever calculation you want so flipping fast that you don’t even need to worry about pre-computing,’ and I think it’s an excellent insight.

Ed: Well, it’s a little bit like the ROLAP vs OLAP, but now we are able to get ROLAP at OLAP speed.

Dave: Even ROLAP pre-computed; it just stored the answer in a relational database. ROLAP kind of flattens that hypercube into tables, but it was still doing the pre-computation trick. This goes one better: you don’t pre-compute anymore, it’s in memory. If you want to count how many sales guys sold product X in Cleveland on Tuesdays, when the entire database fits in memory you can do that really fast. The problem was always when the entire database sat on disk and you had to go to the detailed records of every order done by every sales guy in the month of February; the disks would have to grind forever. I think that the in-memory crowd, from an analytics perspective, has one easy, good point, which is that to the extent your database fits in memory, you don’t need to do pre-computation to do fast analytics.
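The trade-off reads clearly in a few lines: pre-compute every cell of the cube at night, or keep the source rows in memory and aggregate on demand. The toy data and field names below are invented for illustration.

# Illustrative only: pre-computed cube lookup vs on-the-fly aggregation
# over source rows that fit in memory. Data is made up.
from collections import defaultdict

sales = [  # (product, city, units)
    ("X", "Cleveland", 3), ("Y", "Cleveland", 1),
    ("X", "Chicago", 5), ("X", "Cleveland", 2),
]

# OLAP-era approach: count "at night" and store one answer per cell.
cube = defaultdict(int)
for product, city, units in sales:
    cube[(product, city)] += units
print(cube[("X", "Cleveland")])   # pre-computed lookup -> 5

# In-memory approach: just scan the rows when the question is asked.
print(sum(u for p, c, u in sales if p == "X" and c == "Cleveland"))   # -> 5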

The other argument - I don’t know if you are following Dr Stonebraker’s latest thing, the whole VoltDB argument - is that if your database can fit into memory and can be replicated a couple of times, you never need to write to disk, ever, because you have replicated, partitioned tables. For example, you are IBM, you take the employee table and you have several hundred thousand employees. We are going to put different rows on different computers, and then we are going to mirror each set of rows on different computers in different locations. All of a sudden, through replication and mirroring, I never write to disk again, and I can do phenomenally fast transaction volumes. Dr Stonebraker makes a whole bunch of other suggestions, but if you haven’t looked at VoltDB, you should. It’s all part of his general theme of specialization of infrastructure software.
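A toy sketch of that replication-instead-of-disk idea: rows are hashed to a primary node and mirrored on another node, so losing any one node loses no data even though nothing was written to disk. The node count and placement scheme are assumptions for illustration, not how VoltDB itself is implemented.

# Toy illustration of partitioned, mirrored in-memory tables. The hash
# scheme and node count are invented; this is not VoltDB's actual design.
NUM_NODES = 4

def place_row(row_id):
    """Pick a primary node and a distinct mirror node for a row."""
    primary = row_id % NUM_NODES
    mirror = (primary + 1) % NUM_NODES
    return primary, mirror

nodes = {n: [] for n in range(NUM_NODES)}
for employee_id in range(10):            # pretend employee table
    primary, mirror = place_row(employee_id)
    nodes[primary].append(employee_id)
    nodes[mirror].append(employee_id)

failed_node = 2                          # simulate losing one node
surviving = {row for n, rows in nodes.items() if n != failed_node for row in rows}
print(sorted(surviving))                 # all 10 row ids are still present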

General-purpose infrastructure, basically, is a jack of all trades. Oracle is pretty good at this and pretty good at that. It’s pretty good at data warehousing, it’s pretty good at OLTP and it’s pretty good at analytics, but it’s great at none of them. The argument that Dr Stonebraker makes, which I generally agree with, is that if you try to do any one of those things really fast, you can build a custom system that goes 100 times faster. The way you do it is by such things as keeping everything in memory. I think memory-resident databases have two trends fighting each other.


One is the data explosion, which says you can’t get everything in memory - particularly in unstructured land, you are never going to get everything in memory. In structured land, for most companies, I think it’s a good point: in-memory analytic databases are going to go fast. The thing that people tend to forget is SSDs. SSDs are going to change everything in database land because now there is a third option. For the last 30 years, it has been either in memory or on disk, with a 100x difference between the two, so choose your math: how much do you want to pay? I think you are going to end up seeing a three-tier model: do you want it in memory, on SSDs or on disk?

Ed: Somehow, the incumbents are going to need to embrace that.

Dave: They had better. If you are an incumbent, you have two things going on. One, a lot of your R&D for the last 20 years that you thought was a giant barrier to entry is now moot, because you have all this proprietary query-optimization technology that avoids disk IO, and the customer can just go buy a ton of SSDs and not care about IO anymore. All of a sudden a barrier to entry has gone away because a hardware assumption has changed. That’s the threat side. On the opportunity side, obviously there are going to be companies out there optimizing for SSDs. If the incumbents don’t embrace it, they face a big risk.

Ed: That could be quite disruptive over the next several years. Do you believe we will actually see a convergence of predictive analytics, structured and unstructured, and then the ability to intelligently or appropriately create software agents that can return intelligent queries or could even anticipate the type of information that a user is going to need?

Dave: Yes, I think so. I definitely believe there is going to be a convergence of structured and unstructured data. The simple fact of the matter is that the reason your text needs to be indexed in a search engine and your numbers need to be indexed in Oracle is a historical artifact. When I first started at Ingres, you could have 2,000 bytes in a row, which was the maximum width of a table - you weren’t going to put a PDF in there. We have this dichotomy between data and documents, or data and content, and when you come at it computer-up, or bottom-up from the technology, you think, ‘Well, of course.’ But people don’t come at it that way anymore. If you look at the coming generation of programmers, it’s very interesting.

If you asked me why I learned to program, I did it because I liked computers and wanted to learn how they worked. I just interviewed and hired a Berkeley MBA who had an undergraduate CS degree. I asked him why he learned how to program, and he answered that he wanted to build websites. That made him learn HTML, and then he wanted to make them dynamic so he had to learn JavaScript. Then he wanted to make them database driven, so he had to learn databases. It was completely backwards from how I came to learn to program.

That kid doesn’t see any distinction between data and content. It gets imposed on him: if it fits in Oracle, put it there, and if it doesn’t, drop it in the file system and index it - but that’s an implementation detail to him. He’s starting at the other end of the telescope, saying, ‘I want to build a dynamic, data-driven website.’ There are going to be more and more people, obviously, as time passes who see the world that way. They are going to start out saying, ‘I want to build a website that does x, y and z,’ and cobble together what they need to do that. They are not going to start out saying, ‘Because we bought an Oracle license, I’m going to use Oracle.’

A lot of object-oriented programming languages can handle both structured and unstructured data; Java represents documents perfectly fine. So they are going to look at this and wonder what’s wrong with their infrastructure: how come the top layers of the stack don’t force them to split data from documents, but the lower levels do? They don’t like that. Basically, kids don’t see the data-content dichotomy because it’s artificial and a historical artifact. The more the kids get in charge, the less they are going to accept it and the more they are going to want to use technologies that hide it.

One of the most popular technologies in databases in the last five years is Ruby on Rails. What does Ruby on Rails do? It hides the database, because the programmers don’t want to see it. It’s an abstraction layer that hides the database. You will see people build layers that hide some of this stuff or unify it, and they will do very well. There is no reason to accept the historical artifact. The only reason we accept it is because we grew up with it.
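Rails’ ActiveRecord makes that point in Ruby; purely as an illustration of the same “hide the database” idea, here is a tiny hand-rolled mapper in Python. The class and table are invented for the example, and a real project would use a proper ORM.

# Illustrative only: the caller works with objects and never writes SQL.
import sqlite3

class Contact:
    _conn = sqlite3.connect(":memory:")
    _conn.execute("CREATE TABLE contacts (id INTEGER PRIMARY KEY, name TEXT)")

    def __init__(self, name):
        self.name = name

    def save(self):
        cursor = Contact._conn.execute(
            "INSERT INTO contacts (name) VALUES (?)", (self.name,))
        self.id = cursor.lastrowid
        return self

    @classmethod
    def all(cls):
        return [row[0] for row in cls._conn.execute("SELECT name FROM contacts")]

Contact("Ada").save()
Contact("Grace").save()
print(Contact.all())    # ['Ada', 'Grace'] - the SQL stays behind the layer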

Ed: That ties into a theme we discussed with Jan Baan about building an additional layer of process orchestration on top of services. I think ultimately, as you were saying, IT is going to become programmers of services or orchestrators of services, rather than people that put these components together.

Dave: Yes, absolutely. They are not going to be people who buy computers, they are not going to be people who set up computers, they are not going to be people who develop applications from scratch. They are going to be people who tie together services. I totally agree.

Ed: What happens to the industry? You alluded to the oligopoly the industry has become with this recent wave of consolidation, and now we are starting to see more vertical integration taking us back to the mainframe paradigm. On the other hand, there is this incredible flowering of disruptive open-source technologies and alternative ways to get around these big stacks. How do you see the industry evolving in the next 10 years?

Dave: One of the issues of enterprise software is there is a strong increasing returns effect. The more people in the 80s who used Oracle instead of Sybase or Informix, the more you probably should use Oracle too. A customer who bought Informix in 1990 is probably a lot less happy than the customer that bought Oracle because they probably had to move off Informix. Because of the increasing returns effect in enterprise software, you have this “winner takes all” thing that happens, and you end up very quickly with hyper-dominant players in markets, whether it be Oracle in database or SAP in apps or Microsoft in the Office suite - arguably natural monopolies.

When the natural monopoly becomes real - Oracle has a monopoly on databases and SAP has a monopoly on ERP - you cannot disrupt it frontally. That’s the thing I have liked: the orthogonal disruption, the way you disrupt with a flanking attack. Nobody can beat Microsoft in operating systems, so what do you do? You back Linux and you give it away. It’s basically business-model disruption instead of technology disruption. Once the winner-takes-all effect has happened in a category, the die is cast. I don’t know of anyone who can dislodge that winner.

Ed: Look what Android is doing to Apple right now. It hasn’t happened yet, but it is clearly a way to disrupt things for Apple.

Dave: Absolutely. I would argue that in mobile OSs, the game isn’t over yet. It’s the equivalent of relational databases in the mid 80s. The market is struggling to figure out who the winner is going to be, and if you are building a bunch of apps, you obviously want to build on the winner. If you had built all of your apps on OS/2 in the 80s, you were hating life, because you either die with your platform or you bear the cost of running on N platforms.

I think there will always be this increasing-returns effect: once somebody starts to establish clear leadership, following them just makes too much sense. If right now I were to quit MarkLogic and go build a mobile app, would I build it on Android, BlackBerry or the iPhone? Personally, I’d build it on the iPhone right now. I’m not going to build it on BlackBerry, and I’m going to scratch my head on Android. As we know, the more people do that, the more it drives towards leadership.

I think Google is a good disrupter, particularly because they have deep pockets. Deep pockets and high commitment are a good combination. Once the monopoly is built, it’s very hard to disrupt it. I believe, as a believer in Adam Smith, that they can disrupt it somehow.

I believe that Oracle can't continue making that much money - 52% operating margins - without consequence. Of course, MySQL was a good consequence in database land, and of course, now they have bought that. I think Linux was a beautiful challenge to Microsoft in server OSs.

But what does it all mean? I go back to what I talked about earlier, a big-bang effect. I think two things happen. One I’ll just call orthogonal disruption, where you are disrupting on a different dimension, such as giving away a technology and selling services, or renting it by the hour in a SaaS form. For the last 10 years, Silicon Valley has been all about business-model disruption. Most of the companies are either doing new things like Facebook, creating new internet services, or, in terms of software, selling the appliance, SaaS or open source - not a frontal attack with the same business model and different technology, but a flanking attack with different technology and a different business model. My belief is that it works; it takes a while, but it works.

My other belief is that technical innovation in Silicon Valley has suffered in the last 10 years. In my opinion, MarkLogic is one of the very few software companies that is innovating on technology, not business model. Most companies I know are either analytics plays, SaaS analytics or SaaS BI, or open-source BI or open-source databases. Does anybody try to sell better software?

Ed: A lot of the innovation has happened on the consumer side where it’s being driven by a very traditional model which is advertising.

Dave: That’s absolutely true. That’s something we haven’t talked about. The whole consumerization-of-IT trend will continue, absolutely. The consumer side is going to set the bar for what’s expected on the corporate side, whether that be the whole social thing, which we haven’t talked about either - I’m a big believer in social computing. I don’t think it’s magical; I just think it makes a lot of sense and it’s a great opportunity. Why not see what the sales force thinks the best tool is inside a company, so they can vote on it rather than have marketing tell them? I think social computing is going to be a big part of the document sharing we talked about earlier.

I think industry wide, there is a big bang coming; basically, it has already started. Google hasn’t won yet, but Google Docs is a great threat to Microsoft. It’s the first credible one I have seen - a lot more credible than StarOffice. There will be consequences, because you can’t be a 50%-operating-margin monopoly without a whole lot of people trying for a really long time to figure out how to screw it up, and that’s one of the things I love about open source. In some ways, open source is to software as blogs are to media companies.

Consider Business Insider, which probably has five total employees and is trying to disrupt BusinessWeek, which probably has 200 employees, huge fixed costs and New York real estate. If you just strip away everything that was superfluous, you can have a pretty good, if gossipy, business blog site in Business Insider, and the same thing is happening with open source. I think MySQL was doing US$65 million a year in revenue when it got bought, but it was probably disrupting well over US$1 billion, maybe US$2 billion. How much revenue they disrupted is, from a market perspective, as interesting a question as how much revenue they actually had. I think you will see more of that, and it’s not because information wants to be free or any such garbage. I believe in Economics 101: someone is going to figure out how to attack that.

The thing that I always find interesting when I talk to most people about this is they say Oracle is unassailable, that Oracle is all powerful. Yes, but it doesn’t mean that people aren’t going to try. Eventually somebody is going to get it right. I don’t know who, I don’t know when, but lots of people are going to attack that.

Ed: I agree, and the outcome of this relates to a point that has been made by Red Hat and Eric von Hippel about innovation: when you reduce the cost for users to deploy technology, you allow them to spend a lot more time innovating and essentially transfer that value to innovation.

Dave: Well, that’s a really good point.


Gary Kovacs, Sybase

Gary Kovacs is the senior vice president of markets, solutions and products for Sybase. Gary's unit is responsible for defining and developing Sybase’s core markets, including the necessary solutions and products, as well as overall revenue and market-share plans companywide.

Prior to joining Sybase, Gary was at Adobe Systems, most recently as general manager and vice president of mobile and device solutions, where he led the evolution of Adobe into the growing mobile and consumer-electronics market. During his tenure, Adobe developed innovative tools, run-time platforms and global partnerships that resulted in deployments of Adobe technology on over 1 billion mobile and consumer-electronics devices worldwide. Prior to Adobe, Gary was president and COO of Zi Corporation, a global mobile-communications company. He built Zi from inception to a Nasdaq-listed public company providing embedded software on over 400 million devices worldwide. Gary also spent 10 years at IBM, building businesses in a variety of senior leadership positions, spanning product development, sales, marketing and operations.

Rise of the mobile internet

The introduction of high-speed 3G and 4G data networks will accelerate growth in mobile data. In its Visual Networking Index (VNI) Market Forecast for 2009-2014, Cisco has projected that mobile IP traffic will reach 2.5 exabytes (an exabyte is 1,000 petabytes, or one billion gigabytes) by 2014. The vast majority of this traffic will be video. Significantly, much of the traffic will come from mobile broadband-enabled laptop computers, in addition to high-end handsets.

The mobile internet will evolve beyond handset and laptop connectivity as new form factors and smarter devices come to market. End points will become more connected and intelligent as communication networks are extended. Smarter applications, leveraging connectivity through the cloud, will usher in a new era of always-on, always-connected communication.

Figure: Mobile IP traffic, 2009-2014 (TB/month), split by device type - non-smartphones, broadband gateways, smartphones, and portables/netbooks/tablets. TB = terabytes. Source: Cisco VNI Global Mobile Data Forecast for 2009-2014
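As an editorial aside on units: the chart’s axis is in terabytes per month while the text quotes exabytes, so a minimal conversion sketch (Python, purely illustrative; the 3,500,000 figure is simply read off the chart’s upper range) may help keep the two consistent:

# Unit check (illustrative only): the chart is in terabytes (TB) per month,
# while the text quotes exabytes (EB). 1 EB = 1,000 PB = 1,000,000 TB.
TB_PER_EB = 1_000_000

def tb_per_month_to_eb(tb_per_month):
    """Convert a monthly traffic figure from terabytes to exabytes."""
    return tb_per_month / TB_PER_EB

# A reading of 3,500,000 TB/month near the chart's upper range:
print(tb_per_month_to_eb(3_500_000))  # -> 3.5 (exabytes per month)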


Mobility - Entering phase two of the fourth platform shift

Gary Kovacs has a long and distinguished career in technology, focused on developing the mobile opportunity. Sybase has been a leader in enterprise mobile enablement, and our conversation centered on the evolution and potential of mobile technologies. Gary sees the current adoption of the mobile internet as a paradigm shift that will lead to new types of applications created by those who do not have to adapt prior reference points. The promise of mobility is to unlock developers’ creativity when addressing uses that will be inherent to the generation that has grown up around mobile.

Key points

Big technology shifts (of which mobility is the fourth after mainframe, client/server and internet) come in two phases. In the first phase, prior ways of doing things are adapted to the new approach, and in the second phase, entirely new and transformative innovations come into play.

We are now in the second phase of the mobile-platform shift, which was ushered in by the iPhone, App Store and a new generation of designers who have always known wireless in their lives and do not want to be constrained by the old ways of doing things.

Network and bandwidth constraints are gating factors for adoption of new types of services, but this should improve incrementally over the next decade.

Machine-to-machine computing married with the scale of the cloud and stored information will give rise to smarter applications and devices.

Information will be stored in the cloud and accessed through devices that are always on and always connected. This omnipresence of information will unlock the imagination of tomorrow’s developers, leading to new paradigms of mobile connectivity and applications.

Service providers will focus on scale of bandwidth distribution and information-service providers will leverage the channels to deliver content that is more relevant to endpoint devices.

Summary of interview on 9 July 2010.

Full transcript follows


Gary Kovacs transcript

Gary: I have been in mobile since 1998, which makes me pretty much a lifer. All the mistakes you can possibly make, I have made - I guarantee it. And that includes both good projections and bad.

I am both a technologist and a business person. I started with IBM and then left to start a company in 1998 to do predictive text as well as the algorithms for the first SMS applications. Somebody had approached me with a business opportunity at the time, which turned out to be prescient. I was at IBM in Armonk running a global product group - not to do specifically with mobile. We worked specifically with the fall plan, which for IBM is how we make our investments for the coming year. One of the line items in the fall plan was whether there are any alternative platforms outside of the clients that we knew at the time, which were the minicomputers, the mainframe terminals and PCs. And there was a little bit of a discussion on mobile devices.

I will never forget this moment because I was debating whether to leave IBM and start this company. Somebody had approached me 10 months earlier with a patent idea and said that we should start a company around it. The patent idea was to do disintermediation of input for Chinese characters. I looked at him and said, ‘Well, that applies to WordPerfect and all of the other applications that you use on a PC today that are hard to use with the Chinese character set. So it’s really taking a QWERTY keyboard and mapping it.’

With that in the back of my mind, I looked at this fall plan item, and it said mobile devices. I literally pulled out my mobile phone (at the time just a candy-bar style) and asked myself, how will people ever input onto this mobile phone? I thought, ‘That’s exactly what this stuff is for.’ SMS was just barely starting; people were just starting to send text messages. There were dozens sent a month and that was it. At that point it was December, and I just said I’ve got to go start this company, we are going to aim it at mobile and we are going to solve the problem of usability. We created a pitch that was just usability - basically, how do you input text into mobile devices? The company went public, was subsequently acquired and is now owned by Nuance.

I immersed myself in mobile by travelling the world, talking to all of these handset manufacturers. I explained that you are going to do more on this device than voice and you have to solve this problem of input. This was long before touch screens or any other input method; even the joystick five-way type of function wasn’t available. They all bought it. We had 600 million devices out there and 250 OEMs around the world licensed it, so it was quite a nice 4.5-year run. And they are still going: I think they are up to a billion installed and there is good money in that business. It then turned into SMS, which then turned into becoming a transport for data and applications.

Then I went to Macromedia, where they were saying Flash and the presentation layer on the desktop are sort of figured out, but there is this whole new class of devices called mobile that we haven’t figured out. That is what I went there to run. We subsequently got acquired by Adobe, and for nearly five years I was the general manager of all mobile at Adobe, from tooling all the way through runtime. I don’t know when Flash will be on the iPhone, so we can just save that question - although I have been yelled at by Steve more than I care to think about.

We turned the problem from an input problem into a visual problem: how do you even design and develop meaningful applications that aren’t just text based? That’s when we created Flash Lite and the whole tooling that was part of CS2, CS3, now CS5, and all of the emulator tools. All of that happened over five years and US$300 million of revenue. I think when I left, we were on 1.3 billion devices. We would love to have 1.3 and 1.

We created the Open Screen Project to pull together the top 25 influential mobile companies, as we called them, towards creating this standard for displaying and presenting content and video. There was a lot of evolution, and a lot of mistakes happened along the way.

Then just over a year ago, I came to Sybase because Sybase has such core mobile assets and our CEO’s charter was ‘How do we join those and then take them to market in a meaningful suite that moves mobile at an enterprise level to the next stage?’

Ed: That was actually very helpful, gives us some perspective.

Gary: Along the way I have had a good tie-in to operators, right through to content creators, so I want to tease out some of the questions that we will get into. I have to smile because it has changed so dramatically. Looking forward from a market point of view, starting at IBM and moving right through to today, the way I see it is there have been four tectonic platform shifts in computing.

By computing, I mean going from manual processes to where we are today. It started, of course, from manual process to the mainframe. That was really about taking our manual processes and automating them largely for speed and eventually for cost. At each one of these platform shifts there have been two steps to it.

The first shift: it’s logical that we in computing, or in usability, start with the things we know. The first phase of the mainframe shift was automating what we already knew - the business processes, workflows and apps. The second phase was that, rather than just automating what we already know, we now have an opportunity to redesign the way we do business. Whether it’s the way we go to market, the way we connect to our customers or the way we process orders, we have the opportunity to redesign it starting with this new computing platform at the core. It’s in that second phase of each of these shifts that the greater productivity gains - by orders of magnitude - have happened, both from an individual and an enterprise point of view.

The second shift, which was in the early 1980s, was client/server. It pushed computing and decision making at an enterprise level down to departments rather than keeping it in one centralized place. In the first stage of client/server, history just repeated itself: we just extended what we already did, and most people just used terminals connected to the mainframe. The second phase was to build entirely new businesses and new ways of doing business. That’s what spun Microsoft and all of the other companies out to prominence. They led that wave of transformative client-server applications and ways to do business.

The third shift was the internet, a dramatic shift. The internet was interesting because it took us global overnight, but the actual usability was less than in the client-server phase. I mean it wasn’t as rich, it wasn’t as interactive, and it certainly didn’t allow me to process a lot on the terminal itself. It was largely a window into a mainframe. Phase one of the internet was again presenting information that lived somewhere else; it wasn’t about doing transformative things. That lasted for six or seven years, and then the transformative internet happened in the late 1990s and it went crazy. Lots of companies led that wave, including Google and others. Wealth was created and, more importantly, productivity gains on both a personal and enterprise level happened.

Those are the three shifts preceding today. In the early 2000s or the late 1990s, mobile started to happen. Mobile is the fourth platform shift. It is a fundamental platform shift, in my view. What happened is the same history lesson is occurring. Phase one of mobile was extending things that we already do. Email is the classic example. Everybody had email, we just pushed it a little closer to us personally. It was always on, always in our pockets. Then came simple work flows, simple approvals and things of this nature that weren’t monumental. We tried to get the internet to work for years in the early 2000s. It kind of did with WAP, but people don’t want to create two sites. The presentation wasn’t that dynamic either, so it sort of went, but it sort of didn’t go. Some basic apps and a little bit of gaming but nothing really monumental.

Without question, we are in the second phase of the mobile evolution - the fourth platform shift - which is: rather than doing things that we already do, both at an enterprise and a personal level, the world is focused on building these transformative ways of doing business or doing things we do in our personal lives. That was really ushered in by the iPhone and the ability to crack the presentation nut and usability nut.

There are really many other factors, such as bandwidth speed and processing power, how we dissipate heat and all the rest of the things that go into a mobile device, that enable those things to happen. We are right at the knee of the curve, and in the next year or two it’s going to be astounding what happens. This is how we create entirely new ways of doing business and entirely new ways of living our lives, starting with mobile first. By the way, each one of these platform shifts doesn’t replace the others; it builds upon them. So that’s kind of the frame of how I’m looking at the world and where I think we are.

If I think out over the next 10 years, it is going to be these transformative experiences that happen, and the investments made to enable them, that are going to change things. I believe it is going to change our lives. I personally believe it will be a path of innovation beyond anything we have seen before. Even the Wild West of the internet days in the 90s won’t touch what is about to happen. For clarity, when I talk about mobile, I talk about iPads and iPhones and . . .

Ed: The mobile internet.

Gary: Yes, there’s a new edge. There are several things I think are going to be really important.

First of all, without question, the most important thing that’s happening now is that there aren’t going to be just one or two companies able to define or develop the things that appeal to the mass of humanity.

You think about the iPad. I said, ‘I don’t think anything about the iPad at all.’ That was just my controversial statement and they said, ‘What do you mean? How can you not, it’s a great device.’ I said, ‘Because the iPad is just a housing for the incredible innovation that is going to come from the developer community that now understands how to build.’

If we can build things that get out of the way of the brilliant minds that create stuff, it is unbelievable what’s going to happen. What the iPhone and Apple have done better than anybody else is create that path from my brain as a developer to somebody’s device and somebody’s wallet. That path has been simplified to the point where it is meaningful. Now we see the creativity come out.


When I hear things like this device or that device, or Symbian or Kin - I mean, I don’t actually care what the Kin or any of those devices is now . . . the Kin is defunct, but I don’t actually care what it does or its social focus. What I actually care about is how it enables a community to build. That might just be my Adobe roots, where I have seen the power: if you give people a great tool, magic happens. It is magic that a hundred of the smartest people in the world sitting in a room could never imagine or predict would happen.

It’s just because you have unlocked the creativity that is humanity. We are right on the edge of that happening. It’s unfortunate that there are a couple of things getting in the way. It’s going to be a bit of a platform war for a while. It will be between Android and the toolset there, and I think RIM is going to continue to try and push their agenda, as they should, and Symbian and Microsoft aren’t going to go away anytime soon even though I don’t know how much impact they will have. Finally, of course, there is Apple, which is not a mass-market product; it’s a pretty narrow-focus product. We do have to unlock the community in the simplest way, but it is getting closer. It’s not 10 platforms now, it’s three.

Ed: Right, getting at this idea of unlocking innovation by providing transparent technology. It becomes an extension of imagination or consciousness enhancement. Clay Shirky has a good term for it: cognitive surplus. That’s veering more towards the Singularity in a way.

I think of being a musician. You are never going to get around the fact that you need to learn to play scales. You need to be able to play arpeggios. You need to be able to make a sound out of a bow or by plucking a string, and there are the physical realities of the human body against an instrument that essentially is moving air. Now, when you have technology - some of the stuff that Tod Machover at MIT has done with MS patients, where they are able to compose using just an interface or a paintbrush held in their teeth - you are unlocking this innovation in the human spirit using technology, and it’s unbelievably liberating.

To your point, it allows that innovation to flow freely and a lot of implications that come with that. You no longer have this control or domination of the channels of distribution. Old media is broken down because it’s not three TV networks, five major labels, and a handful of book publishers that end up being the arbiters or the curators of everything, of the information or the art flow. Anyway, that was expounding on your point and that’s a lot of the things we’ve been thinking of.

Gary: Everything I believe in life and business boils down to really elemental substance. We bought this massive easel for my seven-year-old daughter. She’s sort of like her dad in this regard, which is a really cool thing: if a small piece of paper is good, a massive piece of paper must be better. She just happens to love art, and we bought her these really spectacular crayons during a trip to Germany. They are almost like paint, but they are crayons so they don’t make the mess, and you don’t have to press hard to make an image. She wanted this massive easel, so I ordered one online. I mean it is massive, like six feet by four feet. We put it in her room, and my wife said ‘Ok.’ The first thing my wife did was carve out a corner, put a black line and say ‘why don’t you try something here?’ I ripped off that first page and said, ‘No, just use the whole canvas.’

We closed the door, and when we came back, she had taken it off the easel and put it on the ground. Literally, she had a crayon between each of her toes and one in each of her hands, and she was just scribbling. We sat back, I looked at it and said, ‘Wow.’ I never would have thought to do that. We aren’t going to sell it on some art site for a million dollars or even one cent, but I just looked at it. Out of the brain of a seven-year-old: we lifted the constraint of the canvas size, we lifted the constraint of how she could use the different tools, we lifted the constraint of where she should even start on the canvas, and sort of magic happened. Now add to that the ability to understand and perfect the mechanics, the strokes - that’s how magic happens.

When I think about mobile, the constraints we had were the device, the usability, the form factor, the size and the pipe, meaning the speed at which we could deliver information to it. We tried for years to solve these problems with things we already knew, like QWERTY keyboards. Things like touchscreens never came into prominence because we just didn’t change our paradigm of thinking.

We didn’t get out of the room and let the child immerse herself in the art, versus immersing herself in how we do things today and applying that to another platform. Even the tools back at Adobe and here at Sybase are largely tools used to develop internet apps and client/server apps. That paradigm just has to change, and it is changing, and that’s what’s unlocking this whole new generation of people who are going to start with mobile first.

It is really quite a profound shift that’s happening here in the marketplace, with applications like Ocarina and all these other things. Who would have thought of that? Why would you think of that? But somebody just dispensed with what was real, and now those types of applications are making their way into business and asking how can we better cooperate, how can we better share? If it’s doing one thing, it’s providing an opportunity to think entirely differently about how we do work and how we do play. That is the very, very cool part of this thing.

There is still some innovation that has to happen at some very pedantic levels of this market. Moving towards the 10-year outlook, bandwidth is a problem that is being solved. Latency is a problem that lags the bandwidth problem a little bit. When I press a button, there is a return time or a transmit time that is often unacceptable.

We still have some of those basic network problems. Those problems don’t get solved in a year or two; they get solved in five-year blocks. Over the next 10 years, one of the most fundamental things that’s going to change is that information can be stored anywhere and accessed instantaneously. It doesn’t really matter if it’s 4G or 5G; it’s the way those networks are implemented, where access will be immediate, just in time, and direct to the information. Back to my analogy, it opens the canvas up to the artist. The artist is the developer, so that’s one constraint that will change dramatically.

The other will be that I don’t believe there is one form factor or size that will win. I have an iPad, and in the first two weeks I had the iPad, I used it 70% of the time and the iPhone 30% of the time. I have now switched back and the reason is because the iPhone is always with me. Unless I get a bigger pocket, it’s not going to go with me. It has a place in my life, but it is not as immediate. I don’t walk downtown in San Francisco and carry my iPad with me. I have my phone with me.

With that said, there will be people who advocate for one vs the other and I don’t believe that. It’s not going to necessarily expand the physical size. We will make the physical size more usable for sure. It will be the other sort of pedantic things that have to change as processing power continues to grow. Battery life and heat dissipation and all of those sorts of physics problems . . . I was asked if they will get solved. I said, ‘No, they will never get solved.’

It is an iterative problem because every time you solve one iteration, you create another need to solve a different iteration. We will get much more efficient and effective at that. We will get much more efficient and effective at graphics. The GPUs that are in the chipsets today will get much more effective. Two- and three-day battery life for a high-powered visually expressive device will become the norm. All of these things in and of themselves . . . ok, so if I get two days of battery life, it’s better than half a day, but it in and of itself isn’t going to create any more meaning on the device for me.

It’s the ability of that change to enable developers to build cooler things that are much more processor intensive that will change my life. I have no idea what those are, but I will speculate in a second. A lot of these things are being worked on. They are often not talked about, but they are really critical. Today we understand pinch and zoom, touchscreens and swipes. Things like that will be outdated in three years, and there will be a whole new way of human motion that guides how we interact with these devices, much like the Wii.

Ed: One of the interesting developments we have discussed with some other folks is the evolution of motion-based interfaces and some of the stuff Microsoft is doing with Kinect. It is essentially a motion sensor, but I expect it is going to work its way into the enterprise and certainly into mobility. I don’t know if you have seen the Sixth Sense project that they are working on over at the MIT Media Lab.

Gary: Yes, I’ve seen that one, and I’ve actually seen Project Natal (Kinect) prototype and some of the motion-based work that’s being done at some of the other labs. There isn’t one that I’m more impressed with than the other. My mind is numb at all of them. If you start looking at the signposts, you can buy a Nikon camera that is three inches by one inch and you can project your images and slide shows onto a wall because there is a special type of projection mechanism. You start thinking about the device as an edge, but not the only part of the edge that it enables. It really changes.

I think voice is a little bit challenging because it is an intrusive input method. I have to speak in places I might not want to speak, public places and so forth. It will have a place and I think we will get more effective there. All of those enablers will change the canvas, meaning it will lift some of the constraints of the canvas. It will be similar to going from a QWERTY keyboard to a five-way nav on a device and from a five-way nav to a touch screen. These shifts changed that canvas entirely. All of these things we see today, I always say don’t get too attached to them because they are going to be dramatically different in a year to two to three years from now.

The unfortunate part - and quite frankly this is really very surprising to me - is that the only real innovation I see comes out of one or two companies; everybody else plays catch-up. I really do wonder where the innovation has gone. We are chasing a business model, and the business model is ‘if I can out-iPhone the iPhone, I can make money.’ While that’s true, there has got to be more innovation somewhere, where people are really thinking about the next phase of all this stuff. I just don’t see enough of that and it’s disappointing.


Ed: I wonder if it is because historically mobile devices, or the mobile ecosystem, have been so vertically integrated and controlled by carriers. With HTML, TCP/IP, Java and JavaScript, you are able to have coding and transport on standards-based languages, while mobile carriers have controlled much more around the delivery.

A few years ago, when ringtones and personalization were big, it was all about having that proprietary walled garden, and there has been enough critical mass that we don’t have a monoculture, we have fragmentation. I just wonder how much of an obstacle that has been - until the iPhone. Apple has a completely closed ecosystem, but it now has critical mass, so it’s attracting people who are writing creative applications and services. Still, there’s that barrier: how do you introduce a new service if it conflicts with something that Apple wants on the phone?

Gary: I’m going to say something that will be very controversial, I’m sure. I think there is real high value placed on a closed ecosystem at the start of a tectonic shift. The reason is that open source is a wonderful thing, but open source in and of itself doesn’t drive innovation. So much time is spent on the mechanics of pulling all of the different pieces together that it gets in the way of the creativity, both from a time and an ability point of view. As a consumer, and certainly as a business user, I don’t care at all, not at all, about the underlying technology. Whether it’s open source or proprietary, Flash or non-Flash or HTML, I just don’t care.

What I do care about is that it is a delicious and amazing experience. The people who care are the people who are in the ecosystem, who are either playing in it or getting boxed out. That doesn’t matter to me at all. In fact, I don’t care if all those companies go out of business - I really don’t. What I do care about is that the innovation happens to enable businesses to do things dramatically differently. Just think about what could happen if I could structure my workers to cooperate in real time over these mobile devices in front of a customer in a way that has just never existed before - and I don’t care what technology that’s based on.

As I look at these things and I hear all the rhetoric, I’m like, I don’t care. I like the fact that Apple is closed. I pick up this thing for US$299 on my upgrade plan and man, it turns me on every day. So I’m feeling like ‘man, I don’t care.’ That’s your problem, developers. Developers seem to like it. That’s the ecosystem fighting it out. That’s the operators fighting it out and the Adobe’s fighting it out. As a consumer and an enterprise worker, I just don’t care.

Ed: There is an interesting post on the Google blog about open versus closed systems and innovation, and how open-source systems take a longer time to develop but often end up creating a more sustainable ecosystem. To your point, I think a closed ecosystem is absolutely an advantage in the early stages of a market because you solve so many of those initial problems of trying to make things work together.

Gary: Exactly right. I think the best living example that we have in technology is Windows. As much as we all hated Windows and I fought it for years at IBM and then at every other company, it worked. It was on 96% or 98% of computers and people could develop to it and enterprises could develop to it. Exchange Server came along and it worked, it worked fantastically.

Who suffered in that? Lots of people, I’m sure, in the ecosystem suffered and therefore we can argue that innovation suffered as a result of it. However, there were a whole lot of wins too. I’m not an advocate of Microsoft, don’t get me wrong. It’s just these types of things we tend to overblow because, well partially, we sit in Silicon Valley, so we fight the fight.

Back to the 10-year view, I think the canvas will continue to get less constrained. It will continue to get much simpler to paint on and much more exciting to consume on, for all the pedantic examples I used before. That relates to one question I saw in here, which was basically asking what the ecosystem will look like.

I think the areas of specialty - the parts of the ecosystem that are the best at what they do - will get strengthened, and this desire to play outside of our area of specialty will get weakened, for the simple fact that it is very hard to play in other people’s spaces. An example is an operator. They have tried to be content providers. They have tried to wall the garden. They provide a pipe, a billing service, support, and great care and feeding of the end consumer. That’s what they do.

Are any of these operators going to be the next one that enables a great application platform? Absolutely not. I’m not even saying it’s possible. They have tried it and it’s just not possible. Innovation will happen at the discrete level, just like it does in every other platform shift. It will take some time, but the most important pieces will ideally come from developers - whether enterprise developers or consumer app developers - and that level of innovation, I think, is probably the least developed today.

Carriers are really going to focus on how to make the pipe big and meaningful, low latency and reliable. We have a lot of work to do there - a lot of work to do there. It will happen, and the seamless shift between 3G or 5G or 8G or WiFi or whatever WiFi will become, and these city hotspots and all these other things that will materialize in the next five years, will do one thing: it will enable that always-on connection. Again, as a consumer, I don’t care how it happens, I just care that it happens, and at a meaningful price point.

I look at the cable model. We pay US$130 a month for cable, or something ridiculous like that, and I still have to watch television commercials. It’s just absurd. As a percentage of GDP, that model has scaled higher than gasoline. I don’t think in mobile we can afford US$30 a month for the iPad plus US$120 a month for my mobile device - and that’s if I don’t go over the 1,500 text messages. Then I have another device where my car is connected, so I pay whatever it is, US$9.95 for real-time traffic for each car, and I pay for Sirius or XM radio for my boat. There is a finite amount of money to be spent on that, and I think operators and those service providers have a material role to play in that.

In 10 years, our lives will be connected everywhere and we will have some sort of rate tariff that is meaningful but not constrictive. The other thing is, right now, it’s largely server to device communications and therefore any applications or services that are provided are within that constraint. It will be device to device, any device to any device. It will not be just person to person on the devices but machine to machine. We will recognize state, we will recognize presence, and we will recognize location.

I will be able to tell where my car is, and all these things will start to communicate in my world. We see applications like Foursquare and FindaFriend and so forth. Today they are trivial, and they will seem even more trivial in the near future, but all the devices in my world - and there will be hundreds in our personal lives - will be connected to the cloud somehow. They will start to be the endpoints of our existence, whether that’s for work or for the consumer parts of our lives. My home computing system will know when my lights are left on in Tahoe, or when my daughter is going too fast on the boat. Those are constrictive examples, but we will be able to control all those different aspects. That will be a huge shift for us and it will be important.
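To make the machine-to-machine idea concrete, here is a minimal sketch (Python; the device names, fields and thresholds are hypothetical illustrations, not anything Gary or Sybase describes) of endpoints reporting state to the cloud and simple rules acting on that telemetry:

from dataclasses import dataclass

@dataclass
class DeviceState:
    device_id: str        # e.g. "tahoe-house" or "boat-gps" (hypothetical names)
    metrics: dict         # e.g. {"lights_on": True} or {"speed_knots": 32.0}

def alerts_for(state):
    """Evaluate simple, user-defined rules against reported device state."""
    alerts = []
    if state.metrics.get("lights_on"):
        alerts.append(state.device_id + ": lights left on")
    if state.metrics.get("speed_knots", 0) > 25:
        alerts.append(state.device_id + ": boat exceeding the speed limit")
    return alerts

print(alerts_for(DeviceState("boat-gps", {"speed_knots": 32.0})))
# -> ['boat-gps: boat exceeding the speed limit']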

There will be innovation around that, and I think there will certainly be innovation around the software-development layer. What I said earlier is that that is probably the least developed - what I mean is probably the least mature. We are just now reaching the schools, and that matters: design and computer-science schools are now getting into mobile development. Previously, to go back to one of my earlier comments, we tried to solve mobile problems through the lens of what we already knew.

The first days of the internet on mobile were a great example, where we said, ‘Let’s just take a web page and shrink it, or reformat it into some WAP base and shrink it.’ That didn’t work because we either had to pan back and forth or we couldn’t see it, so our answer was to leave the page full size and window-pane it, so you could only see a 2x2 square of it at any one time. It just doesn’t work. That’s a necessary step, but it’s a step that results from looking at something through the lens of what we already know.

The kids coming out of schools today, design schools and otherwise, don’t know all that stuff, and they don’t want to know all that stuff. What they want to know is: I am starting with the mobile device first. And they will be able to unlock it in new ways. I think the investment in that education, the investment in the tools to enable that creation, the investment in the services that enable the back ends to happen - that is by far where the biggest innovation will come from. That will in some cases lead the technology and hardware innovation, whereas today it is being led by tech and hardware innovation. So that inversion will happen. I don’t know when, but I’m hopeful sometime in the next three to five years, and it will explode.

Ed: Do you think we will need to find new types of development languages or development methods? I had an interesting conversation recently about how the dev ops cycle is a huge roadblock in deploying datacenters for the cloud. If there are new ways to accelerate or create new cloud-based development services, do you think those could potentially be enabling factors or catalysts?

Gary: Absolutely, no question about it, and HTML5 is an example. They have been working on it for a couple of years and it’s another example based on what we already know. As I was just saying, the people coming out of colleges in the next generation aren’t going to be constrained by ‘hey, I know Java,’ or ‘I know HTML’ or ‘I know this web-based file format.’ They are going to be constrained only by the target platform and their ability to interact with it.

The languages of every platform shift have been optimized quite dramatically for that platform shift. We are starting to see some experimentation in mobile today. When I say mobile, these devices are just going to be windows into our cloud - that’s all they will be. There will always be some local processing needed, and there will be some data that needs to be stored on these devices, but they will be windows into the cloud. There aren’t really consistent approaches today to developing in an efficient manner for that paradigm.
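A minimal sketch of the ‘window into the cloud’ pattern Gary describes - a small local cache on the device for immediacy, with the cloud copy as the system of record. The class and method names are hypothetical, assuming a generic key-value backend:

class CloudWindow:
    """Device-side view of data whose system of record lives in the cloud."""
    def __init__(self, cloud_store):
        self.cloud = cloud_store    # hypothetical backend exposing get()/put()
        self.local_cache = {}       # the small amount of data kept on-device

    def read(self, key):
        if key in self.local_cache:        # serve instantly from the device
            return self.local_cache[key]
        value = self.cloud.get(key)        # otherwise go through the network
        self.local_cache[key] = value
        return value

    def write(self, key, value):
        self.local_cache[key] = value      # update locally for immediacy...
        self.cloud.put(key, value)         # ...and write through to the cloud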


There’s lots of experimentation, there’s lots of innovative thinking going on here, but it’s nothing that’s really prescribed yet. That’s exactly what I was referring to when I said that will start to lead the technology and hardware. The reason is interesting and it gets into a philosophical discussion.

The hardware and physical layer is limited by physical constraints. We have a natural barrier to how we can think: a form factor, a size, a heat-dissipation profile and some physics characteristics. Software isn’t limited that way; we don’t have that same constraint. That’s why software will, and historically has, led the innovation cycle after the first two or three generations. That’s probably the most exciting piece. When I look at the horizon today, I don’t yet see it in a material way - I see experimentation. I know Google is certainly investing a lot of effort there, and others like Apple, for sure. That will be the fun part.

I guess the other piece is the whole collision of cloud, SaaS, whatever buzzwords we want to use, with mobility. I think many people who have been around mobility for a long time don’t talk about mobility as a phone anymore. We talk about it much more as a method of interaction - basically, accessing cloud services through devices. What I love about that is I can go to my PC, my iPad, my iPhone or my BlackBerry, get a file from my online service and it just saves. I can change it on my iPad and it saves. I come into the office, go to my file folder and it’s there.

Think about that - such a trivial example of a change that three or four years ago was not even contemplated as possible. We had shared services, but they were really ‘here’s my corporate share’; on the personal side, we were just getting to it, if you had the wherewithal to know how to set up one of your home devices as a server. Now there are thousands of online file services that I can save my work to in a secure manner. That ability is the preface to how I can now put data everywhere, or I don’t even need to share data - I can just leave it where it was originally created and accumulate it at an application level. Paradigms are developing before our very eyes.

In 10 years, we will have machine to machine and we will have devices that are always on. We will have much less in our lives that takes a boot-up cycle. The biggest difference between an iPad and a PC is that the iPad is on. My email is instant - it really is instant. I have my desktop on my iPad in a secure VPN fashion and it’s instant. Time is an amazing driver of innovation.

The car navigation system - there is no reason that I shouldn’t be able to get that in my life today. So we will see a lot of that. And I think that wearable computing is an absolute; it will just be. There is no doubt about that. Video will be an immense driver of collaboration. We have just scratched the surface of video today - it is a fundamental game-changing technology. Some of the innovation Cisco is doing around video, and the way it is being included in the iPhone 4, are all interesting preludes to what will be a much bigger shift to real-time, always-on visual communication. When you take all these factors and give them to the artists, what can they create? Again, your mind explodes with possibilities.

Ed: It’s pretty exciting.

Gary: It’s amazing. The hard part is individually when you look at it, you ask where should I spend my time?

Ed: Exactly, nobody knows which horse to ride yet and there’s more fragmentation probably than there was at the last shift. I don’t know that we will see empires crumble, but we will definitely see a lot of disruptions.

Gary: Look at the empires that have crumbled. I mean, software companies have crumbled entirely. Hardware companies have crumbled. I don’t want to try to predict the future, but Symbian was a tremendous driver of smartphones, and even though they still have significant share in parts of the world, what becomes of that?

The Palm devices that ushered in a lot of the pen and productivity gains, what becomes of that? What becomes of the PC in the enterprise world in 10 years? There are just these game-changing things that happen. The coolest part of mobile and sort of the final prediction that I have is not what it can be, it is what it is unlocking. The IT world, for 10 years, has been unarguably pretty stagnant and has been all about cost reduction, cost optimization and stack integration.

Mobile now is giving people, CIOs, IT developers, gaming people, creative artists, people that build consumer apps, an opportunity and an invitation and indeed an excuse to innovate again. That is what we as a society in the IT world have been looking for. We are moving from this how do I optimize the stack to run my business to how do I transform the way I do business? That is what any new platform should create and that’s upon us now.

Ed: It is truly a disruptive shift. I’ll share a couple of my views of what we will be seeing at least a little more near term. I really appreciate your insights. It’s just tremendously well thought out and valuable.

We have been focusing a lot on enterprise computing and the idea that we have this commoditization of base functionality through open source. The potential of cloud services and SaaS is essentially to cut out the less value-added components of cost in realizing applications. The way I look at technology, the top of the stack - the whole reason for being - is the application and how you consume it; everything below it, as a business user, you don’t care about.

It could be chewing gum and duct tape or the most advanced technology, but if it works you really don’t care. We are moving away from this do-it-yourself environment, where you have a lot of pieces and you pay someone a lot of money to put them together, towards what I call tech-enabled services - more broadly, software and hardware that become part of an appliance or, in the instance of the iPhone, a conduit for services, which can be anything from the applications to the connected applications to the content itself.

So many of these businesses - your traditional software businesses, your traditional media businesses - are looking at these new enabling technologies, and this new type of connectivity is going to be disruptive. I see a lot of convergence as well in how companies make the money to fund the innovation. We will also see a lot of fragmentation.

I think we will see the rise of the mom-and-pop application shop, where the barriers to creating a very specific application become so low. We are already seeing it in the mobile world. I think we will see that trickle down into the enterprise world, once you have more standardization of services so they can interoperate. You will have the big platform and stack guys battling it out among themselves to maintain their value - their integrated value networks.

Gary: Just to punctuate your point - and I think it’s a great point - the cost model to a consumer or an enterprise user of consuming this type of computing has changed dramatically. That has been a big driver. I can get my head around US$0.99, US$1.99, US$2.99, versus really having to have Office as an integral part of your world to justify spending US$299 for all those features. As a student you would buy it, as a corporate worker you would buy it, but a lot of those things I had on my PC 10 years ago I have no interest in. They are US$49 for this particular thing, and the problem is that particular thing . . . I would go one step further: it’s not so much about an application, it’s more about a workflow or a “fun flow” that I want to have in my life. The application enables that.

I don’t need 27,000 features in a spreadsheet - I personally don’t - so why should I have to consume that at US$199 for Excel? Why can’t I just get this for US$19.99 - read a spreadsheet, add a column, bold a field - and then, if I want extra features, buy them on an ad hoc or a la carte basis? That is the fundamental shift that’s happening, and it is a little bit of the Sybase and SAP story.

You may think about SAP as this broad, multimillion-dollar set of applications, but we are now mobilizing these as a partner - and this is before any pre-integration. What people are saying is, ‘I want to mobilize this particular workflow in our mobile sales application and it had better not cost US$10 million,’ and we are talking hundreds of dollars a user, not millions of dollars. They are looking at mobilizing these trivial pieces that are very important to their overall workflow, but on a very different consumption model. Now, our hope is that enterprises over five years are going to spend about the same amount of money, but how they bite into that is very, very different.

Now it changes our investment profile at Sybase. Developing big applications, big processes and big toolkits that you can sell for thousands or millions of dollars to people who create things is taking a back seat to the quick applications, quick processes and quick things that we can get out there with a shelf life of six months. The innovation cycle has to speed up and the way we develop has to speed up. How we do business has changed dramatically. It’s a good example of the point you just made, and it’s an important one too.

Ed: Well, I think it will be an adventure for Sybase to be associated with SAP because it connects you with an enormous ecosystem of companies and users that have automated their processes over the last two big shifts. I definitely give credit to SAP for seeing in Sybase the value of that trend.

Gary: Yes, what they see in us is ‘how do you enable this new edge?’ You have to have mobile DNA to do that. It goes back to the whole new canvas. Ah, we will see - we will check back in five years.


Andy Lawrence, The 451 Group

Andy Lawrence has a particular interest in how the energy use and carbon footprint of IT can be reduced: how IT can become more sustainable; the role of IT in improving energy distribution and management; and the role of markets and governments in enforcing or encouraging sustainable practices.

Andy has advised many businesses on their green IT strategies, both suppliers and end-user organizations. He also covers datacenter resource efficiency projects and is the program director for the Uptime Symposium on Datacenter Efficiency and Green Enterprise IT. He represents The 451 Group at the Green Grid, an international body devoted to improving datacenter efficiency. Andy also leads eco-efficiency research for the European Union Optimis project. This is a multiyear European Union project designed to enable businesses to select and deploy cloud services using criteria such as cost, trust and eco-efficiency.

Before joining The 451 Group, Andy was an influential editor and analyst, raising venture capital and co-founding London-based Infoconomy. He was previously a director of ComputerWire, a global IT information service. Andy was the founding editorial director of London-based Computer Business Review magazine, and later the US magazine Global Technology Business, where he pioneered critical, in-depth business coverage of the IT industry.

The datacenter energy footprint

Datacenters account for 2-4% of regional carbon emissions and cost more than US$3.5 billion annually in the United States. The secular growth of connected devices, computational power and data will fuel demand for datacenter expansion. IT organizations have awakened to the inefficiencies inherent in datacenter design and are driving change through new technologies and infrastructure design. There are many ways in which the industry is addressing the eco-efficiency problem, including smart grids, datacenter power and cooling design, systems-utilization management and microprocessors.

Figure: The breakdown of datacenter electricity usage - cooling 38%, processor 15%, other server 15%, server & power supply 14%, UPS 5%, storage 4%, IO 4%, building switchgear/transformer 3%, PDU 1%, lighting 1%. Cooling accounts for the greatest portion of datacenter energy usage. Source: Emerson white paper
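One illustrative derivation from the Emerson breakdown above: summing the IT-equipment slices (processor, other server, server & power supply, storage, IO) gives roughly 52% of total electricity, which under the standard power-usage-effectiveness (PUE) definition - a metric the report itself does not cite - implies a facility PUE of about 1.9:

# Shares of total electricity from the Emerson breakdown above (percent)
it_load = {"processor": 15, "other_server": 15, "server_power_supply": 14,
           "storage": 4, "io": 4}
overhead = {"cooling": 38, "ups": 5, "switchgear_transformer": 3,
            "pdu": 1, "lighting": 1}

it_share = sum(it_load.values())                  # 52
total = it_share + sum(overhead.values())         # 100
print(round(total / it_share, 2))                 # -> 1.92, the implied PUE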


The era of energy-aware computing

Our conversation with Andy Lawrence of The 451 Group covered the nascent eco-efficient IT industry and the opportunities for vendors in the coming decade. Andy believes eco-efficient investment opportunities will persist for a long time due to the nature of fixed investment decisions, but the current opportunities are centered on device metering, reporting and analytics. Andy’s experience leading the eco-efficiency research for the European Union Optimis project provides a good perspective on the challenges and opportunities.

Key points

Eco-efficient IT is not a clearly defined sector, and while the concept has been around for five years, it is still in its infancy.

IT accounts for up to 4% of energy use and carbon emissions in the US, and its relative share is expected to continue to grow, which has gotten the attention of governments and IT managers.

Most electronic devices, including datacenter devices, do not measure their energy use and are incapable of reporting energy use for analysis.

We are now entering the instrumentation phase, and the metering of energy will enable the next phase of investment, which will be in analytics and management of devices within datacenters.

These investment phases are creating opportunities for companies, such as Honeywell, Johnson Controls and Schneider Electric, to leverage their industrial and automation expertise.

The endgame is to enable the movement of workloads across systems and clouds as a result of the data provided by the real-time metering and analytics.

The evolution will be one of stops and starts, and it will take decades to unfold.

Summary of interview on 20 August 2010.

Full transcript follows


Andy Lawrence transcript

Andy: One of the interesting things about eco-efficient IT, as we like to call it, or green IT, which is the more popular term for it, is that it has never been and probably never will be a clearly defined sector. It is an attitude of mind as much as anything else: trying to reduce the energy consumption and carbon footprint of business generally, not just IT, by marshalling resources in the best and most sustainable way possible.

From a technology point of view, we have seen this play right across the waterfront, whether it’s on the desktop with thin clients or power management of devices. We have seen it in the datacenter, we see it in systems management, we see it in processes, we see it in server design, we see it in storage hierarchies, we are now seeing it more in smart grid, we see it in carbon management and accounting software, so it really is a very broad area.

I think that has been part of the challenge for the IT industry. Companies that have tried to approach it by setting up some small division or appointing somebody to be in charge have actually found it to be quite a difficult process because it is all over the place. Once that is understood, it makes it easier to approach the subject, because we are actually going to see things happen in all of the areas we mentioned. I think the effects will be quite dramatic, so that has been an interesting observation about how things have gone for the last two to three years.

Another observation I would make is that in the early stages of eco IT, it was a kind of phony market, in the sense that there was a lot more noise in the press and amongst certain vendors - who, in practice, weren’t really doing much - than there were real companies creating technology and bringing products to market. The fact is that the real drivers for implementing eco-efficient IT are primarily financial and economic, with a little bit of corporate social responsibility, some operational efficiency, and also compliance, carbon trading and so on. It is fair to say that none of those drivers has yet fully bitten.

One of the first observations I would make is that those key drivers - the ones that will drive investment at a corporate level, by suppliers and ultimately by the financial markets - are probably only just starting to kick in. I think all of those drivers for development, for an increasing market, for new products, for some exciting things to happen, are all ahead of us for the next 10 years, not behind us. So, even though we are probably four to five years into the eco-efficient IT era, the real action is only starting about now.

I think, for the purpose of this discussion, it is at least sensible to maybe divide it into how IT is dealing with its own energy consumption and carbon footprint and then how IT is maybe helping business generally deal with its energy consumption and carbon footprint. Obviously the two overlap, but they aren’t necessarily the same thing.

If we look at IT itself, we have all seen many figures that say IT accounts for 2% of carbon emissions, and some people say 4%; certainly when it comes to energy, I am sure 4% of electrical power is probably not an unrealistic picture. There is clearly a very strong trend of increasing power consumption by electronic devices. It is also very clear that a huge amount of that consumption is wasted.

It is wasted at almost every stage: in power distribution to the door of the corporation or the datacenter, in the datacenter itself, in transformation and inefficient distribution, in the cooling overhead, in low-utilization devices, and in devices being left on all the time.

It is also fair to say that if you look across most electronic devices today, they either do not measure their own power consumption or, if they do, they are unable to report it. Effectively, even though we are dealing with very intelligent networks of equipment, in terms of power consumption it is completely dumb. There is very little reporting back of what is actually going on. One of the phases we are just entering is instrumentation. Effectively, it means that servers - and this is just starting to happen - are coming onto the market that can tell how much energy they are using at any given moment, certainly in relation to the jobs they are doing.

We are starting to see the same with networked devices. Importantly, if they are able to collect that information, which the majority of devices cannot, you must collect it into something and do something with it. There is a very clear trend now - we are starting to see it in datacenters and I expect it to spread across the whole of IT - towards what I call energy-aware computing, which is where IT actually becomes aware of the energy it is using. Unless it becomes aware of the energy it is using, it can’t actually do anything about it.

In terms of a vendor opportunity, one of the challenges is to collect that information, which is by no means a trivial task; to aggregate, order and normalize it into one format so it can be properly analyzed and fed into analytics systems; and then you can start to set policies . . . turning certain things on and off, trend analysis and reporting, similar to the things we see in financial and other real-time systems today.
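To make the aggregation and normalization step concrete, here is a minimal Python sketch. The field names, the watt-based readings and the 10% utilization threshold are illustrative assumptions, not any particular vendor's schema.

```python
from statistics import mean

# Hypothetical raw readings from different meters/agents, each in its own format.
raw_readings = [
    {"host": "srv-01", "power_w": 412, "cpu_util": 0.04},       # DCIM agent (watts)
    {"host": "srv-02", "power_kw": 0.390, "cpu_util": 0.62},    # PDU export (kilowatts)
    {"host": "srv-03", "power_w": 405, "cpu_util": 0.03},
]

def normalize(reading):
    """Convert any reading into a single canonical format (watts)."""
    watts = reading.get("power_w", reading.get("power_kw", 0) * 1000)
    return {"host": reading["host"], "watts": watts, "cpu_util": reading["cpu_util"]}

normalized = [normalize(r) for r in raw_readings]

# A simple policy: flag servers burning power at very low utilization
# as candidates for consolidation or power-down.
IDLE_THRESHOLD = 0.10
candidates = [r["host"] for r in normalized if r["cpu_util"] < IDLE_THRESHOLD]

print(f"Average draw: {mean(r['watts'] for r in normalized):.0f} W")
print(f"Consolidation candidates: {candidates}")
```

The point of the sketch is simply that once readings share one format, policy and reporting become ordinary analytics problems.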

In datacenters, there are probably 10 to 20 companies that have developed, or are in the process of developing, software to enable datacenters to record the way they use power and to start to make decisions around that power. I would expect that software to perform very well over the next two decades. Eventually it will increasingly be built into containerized or pre-modularized systems, at which point it may no longer be a distinct market. But over the next 10 years it will be, and we are seeing everybody from big companies like IBM, Computer Associates and Schneider Electric right down to small companies like Power Assure and OSIsoft engaging in that area. That is a very important trend and it is happening now. It is not in any sense futuristic.

Ed: Andy, how difficult is it to apply measurement to the systems that are already deployed and in play? Is there a lack of granularity? Do you need new equipment to measure the power draw for servers that have been designed without any thought toward being able to manage the power consumption?

Andy: Yes, you do. It is quite an interesting area. The most common basic setup is that power is measured as it comes into the datacenter and is managed at the UPS level, the uninterruptable power supply, which is the centralized level. It is sometimes the case that people are able to measure at the PDU level, which is maybe a group of racks or rows together, but it is not that common that people can get down to the rack, the row or certainly the individual server. Therefore it becomes quite difficult, if you have thousands or even hundreds of servers, to identify which ones are on and which ones are off.

You may have a logical view of what work is done on a certain server, but you won't necessarily know what power it is actually drawing. It is quite a process to try to read that information, and, as you were asking, how do you do it if the server itself doesn't already have some metering capability, which is very rare? The answer is that you have to put a meter, or multiple meters, in somewhere. If you want them to be accurate and resilient, they can be quite expensive, hundreds of dollars apiece. Instrumenting an entire datacenter with thousands of servers can be pretty expensive. If, at a later stage, you want to use that data to bill customers in a colocation situation, those meters need to be accurate to within 2%, which pushes the price further north still.
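As a rough illustration of why full instrumentation gets expensive, here is a back-of-the-envelope calculation; the 2,000-server count, US$300 meter cost and 50% revenue-grade premium are assumptions for illustration only.

```python
servers = 2_000          # assumed datacenter size
meter_cost = 300         # assumed cost per resilient meter, in US$
billing_premium = 0.5    # assumed premium for revenue-grade (2%-accurate) meters

base = servers * meter_cost
billing_grade = base * (1 + billing_premium)
print(f"Basic instrumentation: ~US${base:,}")               # ~US$600,000
print(f"Revenue-grade metering: ~US${billing_grade:,.0f}")  # ~US$900,000
```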

Ed: If you think about the logical progression here, and I don't want to knock you too far off track, but the idea would be that we would ultimately be able to track the power consumption of a particular application or workload, which would provide a truly granular view of the true cost of running your CRM system or your email system, or of hosting a workload for a cloud provider.

Andy: Absolutely. You are interested in what will happen within 10 years, and I don't think this will be common within 10 years, but the endgame is that for systems like CRM or email you can actually know exactly how much energy, or more precisely how much carbon generation, is associated with any business activity you do.

If you want to track that across the cloud from a cloud supplier, down through the virtualization layer, and have some knowledge of what energy and carbon is being used, and also the power to use that information to make decisions, such as moving a workload somewhere else . . . that is a big project, but one that I think the leading IT vendors understand is the next stage in this whole game.

It may take years, but I think this has to happen. There are projects going on in Europe. We at The 451 Group are involved in a project with the European Commission that is getting into aspects of this, but I know all of the leading vendors, IBM, Microsoft, Cisco, VMware, Schneider, have all thought quite a lot about this, as have a number of startups. They are not ready to productize anything, but they are certainly aware that this is where they would like to go.

Basically, the challenge is to make the underlying physical infrastructure of IT fully aware of the energy it uses. Once you have that information, you can start doing things with it. That is probably the next big trend. Let me give you an example. Suppose you have 10 servers, each running an application at very low utilization, which is not at all uncommon, maybe less than 5% of total processor capacity, and yet all of those servers are running at 70% or 80% of their maximum power consumption. It clearly makes sense, using virtualization, to move the workloads onto one or two servers and shut the other eight down. If that could be done across a datacenter, a corporation, a cloud or the industry generally, you would see huge increases in the efficiency of IT. This again is somewhat futuristic.
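A back-of-the-envelope version of the consolidation example Andy describes, in Python; the 400 W server size, 75% idle draw and 90% loaded draw are assumed figures, not numbers from the interview.

```python
MAX_POWER_W = 400        # assumed per-server nameplate draw
IDLE_FRACTION = 0.75     # servers drawing 70-80% of max even when nearly idle
HOURS_PER_YEAR = 24 * 365

before = 10 * MAX_POWER_W * IDLE_FRACTION        # all ten lightly loaded servers left on
after = 2 * MAX_POWER_W * 0.90                   # workloads packed onto two busier servers
saved_kwh = (before - after) * HOURS_PER_YEAR / 1000

print(f"Before: {before:.0f} W, after: {after:.0f} W")
print(f"Energy saved: ~{saved_kwh:,.0f} kWh per year")   # roughly 20 MWh/yr for ten servers
```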

There are products out there that do this in a fairly rudimentary way, but they are still in their early phases. A couple of commercial products are available; one is from a company called Power Assure in California, but it has been very difficult for them to prove in commercial situations that this works. I would also say that datacenter operators and managers are heavily committed and incentivized to ensure that their systems never fail. If they start to take risks with availability in order to make power savings and it goes wrong, they won't be thanked for it.

There is a very conservative mindset across the datacenter industry, and the overwhelming attitude is never to turn the server off. The idea that we are going to move to a situation where automated systems move workloads around, shut servers off, work out where the demand is, and fire servers back up all sounds technically very feasible. While not simple, it is understood technically; there is almost zero commercial demand right now, but I expect that to change in the next five to 10 years.

Ed: We have already seen some approaches on the storage side. I believe COPAN was a company that was working on systems that would essentially spin down the storage systems and then fire them back up, or try to bring them back online quickly when there was a need to access that data.

Andy: COPAN, and the fact that they are no longer with us might say something about that technology, had the notion of sleepy disks. Effectively, it was rather like putting a server into a semi-comatose state to save power. In storage, we are seeing a similar development where you spin things down, put them in low-power states and bring them back up as fast as you can. The trouble with storage is that there is an almost completely opposite development to bring everything into near-line, very fast solid-state memory, so you are probably seeing two conflicting trends there. Given that storage at the moment is no more than 15% of datacenter power use, it isn't really a huge focus for the green movement or the green R&D side of things, although there is work going on, and as storage builds it will become more important. I don't see it getting to be a real focal point. More of the work is going on in servers, where the real power consumption is.

Another interesting area is datacenters generally. It is becoming increasingly understood that datacenters have been a little over-engineered and that there has been a lot of very precious handling of operating envelopes and so on. We are now seeing datacenters use a lot more free outside-air cooling, for example. We are seeing datacenters run a lot warmer. We are seeing people no longer trying to monitor humidity or static quite as much. There is a general feeling that if we can run datacenters in a bit more of a rudimentary fashion, and maybe make the IT hardware a little more rugged, then the 50% cooling overhead that uses up half the power in datacenters will no longer be necessary. That is a very clear trend happening across the datacenter industry.

We are starting to see datacenter efficiency figures that would have been thought impossible two to three years ago. Allied to that, there is a notion in datacenters that you have to protect availability almost at whatever cost. The industry has ended up with a very good tier system, with tier-four, tier-three, tier-two and tier-one datacenters, which enables people to design according to the needs of the business, to design in availability, redundancy and so on. Of course, the more redundancy you build in, the more energy you use.

I think one of the interesting developments, and a big software opportunity, is that if, through virtualization, cloud and sophisticated systems management, you can actually move workloads around, you don't need that highly available physical infrastructure underneath. We are already starting to see fairly major corporations say, 'I am not sure I want to build a tier-4 or tier-3 highly available datacenter, which costs tens of millions of dollars, when we can build simpler ones and just move the workloads around if there is a problem.' Now, I am not saying this is commonplace, but we are already seeing the likes of Microsoft and Google doing it, and I think it is going to spread across the industry. We may start to see simpler datacenters of the future that don't have all this engineered high-availability complexity in them. A lot of the complexity will actually move up into the software layer, which will in effect move the value-add into the software layer. You will hear a lot from VMware, IBM, CA, Google, Microsoft and those companies, all very interested in how you resolve those problems.

Eventually, we may even see a situation where you can buy relatively simple, pre-configured mini datacenters, such as are available now; I think they might become a little more commonplace. People may use them for some of their local computing needs, possibly without all of the sophisticated power supplies, external chillers and those types of things, and couple that with services they get from the cloud. We will also see a lot of absolutely huge, highly energy-efficient datacenters, often brokering their cloud services with other cloud providers to effectively create a virtual and highly energy-efficient mass datacenter infrastructure. That will be coupled with smaller, almost local containers that do some of the local work. It is not clear how it will develop, but it could be quite interesting. One thing is quite clear: the amount of work that is done for the amount of energy consumed will continue to improve dramatically, at something like a Moore's Law rate of progress, I would imagine.

Ed: Microsoft has been demonstrating its new ITPAC, its Azure platform in a box, and that seems to be the harbinger of a lot of these trends that may ultimately become mainstream.


Andy: I think it fits into a theme that I know you are interested in, transparent computing where a lot of the complexity is hidden away. One of the problems with datacenters is that the complexity and cost have started to frighten organizations. If it can be delivered in some kind of in-built stack, whether in the cloud or locally, people find that very reassuring. Those are some of the developments.

Another thing, just to focus on some of the basic systems: one of the huge problems IT has faced up to now is that the moment you turn on a server, its power consumption goes straight up to 60% or 70% of maximum, sometimes more. The truth is that most devices never go much beyond 60-70% utilization, so the moment you turn a box on it is consuming 70% of its maximum power, and it will likely stay there for the three to five years until it is replaced, regardless of whether it is doing no work or a lot of work, but almost never the maximum amount of work.

I think another very interesting area of R&D, from the chipmakers and also some software companies, is how you get dynamic scalability, where power consumption reflects the amount of work being done. Just as pressing the accelerator of a car harder uses more gas, we really need to see the same with power consumption in a server. Intel, in its most recent processors, especially multicore processors, has gradually built in more and more capability, and AMD too, to enable, for example, areas of the chip to be shut down when they are not being used, or, in a multicore processor, for the processor to recognize that certain cores aren't needed and shut them down. They are providing hooks to control voltage and frequency.

What then happens is that software suppliers have the opportunity to hook into that and actually turn devices up and down according to how the server is being used. I think we are going to get to a phase, maybe in five to 10 years, where many of these power-management issues begin to go away: when a device isn't being used, it falls to a very low level of consumption. That alone will save vast amounts of power. That is another interesting area of development.
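A minimal sketch of the gap between today's behavior and energy-proportional hardware. The 70% idle floor mirrors the figure Andy cites; the 10% floor for the proportional case and the 400 W size are purely illustrative assumptions.

```python
def power_today(util, max_w=400, idle_floor=0.70):
    """Typical 2010-era server: draws ~70% of max even when idle."""
    return max_w * (idle_floor + (1 - idle_floor) * util)

def power_proportional(util, max_w=400, idle_floor=0.10):
    """Energy-proportional goal: draw tracks utilization, with a small idle floor."""
    return max_w * (idle_floor + (1 - idle_floor) * util)

for util in (0.0, 0.05, 0.5, 1.0):
    print(f"util={util:>4.0%}  today={power_today(util):5.0f} W  "
          f"proportional={power_proportional(util):5.0f} W")
```

At 5% utilization the conventional model draws roughly 286 W versus about 58 W for the proportional one, which is the gap the chip and software hooks described above are trying to close.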

Ed: It would be helpful if you could extrapolate some of these technologies more broadly, from the datacenter into more general energy management for physical systems and building systems. Where do you see these types of applications ultimately being able to impact energy usage, and being able to retrofit these technologies to existing systems? How difficult will it be to implement some of these more advanced technologies on existing investments?

Andy: The share of a business's power consumption that comes from IT varies widely: in some businesses it is only 5%, while for a datacenter operator it gets up to 80% or 90%. Clearly, it makes sense to bring all of the power consumption of your equipment, whether it is IT or your buildings, into one place. We are starting to hear people talk about end-to-end systems management. Cisco, for one, thinks that every device in a corporation will be addressable and intelligent, will use the IP protocol, will sit on an internal network or a mini-internet, and that it will be possible to control and measure its power consumption and to turn things on and off based on demand.

That makes a lot of sense, and I think we are going to get to a point where the power consumption of corporations scales up and down much as it does with a server. Power consumption drops naturally, without human intervention, at the end of the workday, with the decisions about what to turn off, where and when, taken in the background by an intelligent system, which may be an energy-management system or some combination of building-management and network-management systems.
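A sketch of the kind of background policy Andy describes, expressed as a simple rule table; the device names, cut-off hours and scheduling logic are all hypothetical illustrations rather than any real product's behavior.

```python
from datetime import time

# Hypothetical end-of-day policies: what may be powered down, and when.
POLICIES = [
    {"device": "floor-3-displays",   "off_after": time(19, 0), "critical": False},
    {"device": "print-room-copiers", "off_after": time(20, 0), "critical": False},
    {"device": "comms-room-switch",  "off_after": None,        "critical": True},  # never touched
]

def actions(now):
    """Return the devices an energy-management system would switch off at 'now'."""
    return [p["device"]
            for p in POLICIES
            if not p["critical"] and p["off_after"] and now >= p["off_after"]]

print(actions(time(21, 30)))   # ['floor-3-displays', 'print-room-copiers']
print(actions(time(18, 0)))    # []
```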

The first challenge is to get everything instrumented. As we have already alluded to, a lot of it isn't instrumented, so there is a huge opportunity just to go in and start making devices talk to each other, share information, pick up power information, and respond to external commands in a simple way. Part of the challenge right now is that the people who supply datacenters, for example, are not the same people who supply air conditioning or elevators or other systems.

We are starting to see lots of organizations getting very interested in this area. Schneider Electric, Johnson Controls, Honeywell, companies that have traditionally supplied building-management systems, are starting to wonder whether they might be competing against Google or GE or IBM at some point. It is not at all clear how this is going to play out, or whether there will be a clash of standards or architectures.

To be honest, the actual return on investment for the end user, the business, is somewhat fuzzy. Justifying it on energy savings alone may get you there; it may be that you need energy savings plus compliance. It may be that only the newest builds will install the most sophisticated systems. It is not clear how quickly this is going to happen, who the leaders are going to be, or which architectures are going to dominate. I strongly suspect it is going to be an IT-based architecture that uses IP protocols.


Companies like Cisco are going to play a very big role. Schneider Electric, a partner of Cisco, will play a very big role. IBM almost certainly will play a big role. Again, the end goal is ultimately to enable not just businesses but anything in society, whether retailers, traffic systems or anything else, to control its energy use, report it back and use carbon data as well. That is the trend, and clearly there are energy-security issues, a carbon issue and a financial issue around the amount of sustainable energy, which will all drive us in that direction.

Where I might differ from some of the more ambitious, spectacular forecasts from some analysts about smart grid in particular is that I think it is going to be a real stop-start process. I think it is going to take a couple of decades. It is not always going to be clear that it is worth doing the retrofit and embracing the complexity. People don’t rip out building-management systems, for example, so they can get a better view of their energy footprint. They will do it over time. The compliance has been very stop-start and I think it will continue to be. This is a long-term project.

Ultimately, when businesses can start to control their energy use and carbon footprint at something approaching a real-time level, certainly at a dashboard level, then they might be able to start participating more in the smart grid. They might decide that it is worth shutting down this plant, participating in demand response, firing up some generators here, feeding in renewable energy there. They will start dealing with energy in a more sophisticated way, analogous to how large companies deal with money now: they have treasury systems and financial-management systems, they participate in markets, and they do hedging and all of that. I think we are going to start seeing businesses do that with energy as well. Maybe not everybody, but the systems will evolve for that to happen.

Ed: It seems like it will. My take has been that it will end up having to be a very gradual process; the smart-grid project in Boulder, Colorado, for example, ended up costing close to three times the initial projections. Of course, the idea of costly retrofits only makes sense when the savings are really justified through the combination of cost savings and corporate social responsibility, which is hard to quantify. It is certainly a vision for the future. I share your view that it is a real jump ball; there is a lot that is undefined, and a lot of people will be competing who may never have imagined they would be competing against each other. That is what is so fascinating. The convergence of the physical and logical infrastructure, potentially around an IP backbone for energy, may provide connectivity that could also result in a number of far more value-added or intelligent systems over time. That is what seems really interesting.

Andy: Right, there is a school of thought that we will start to see completely new types of companies, that the smart grid is like the third internet and we are going to see new Googles and new eBays and new Amazons. I am sure we will see service companies and those who are building analytics and trading layers on top, but I guess at this stage we aren't quite sure what will happen. I think we will definitely see completely new companies emerging that we had no idea of even a few years ago, or even today. It is all quite interesting, and it won't happen overnight.


Seth Levine, Foundry Group
Seth Levine's career spans venture-capital investing as well as operational, transactional and advisory roles at both public and private companies. Prior to co-founding Foundry Group, Seth began his venture-capital career at Mobius Venture Capital. He currently serves on the boards of AdMeld, Lijit, Mandelbrot Project, Medialets, StockTwits and Trada for Foundry Group. Prior to Mobius Venture Capital, Seth joined a restart of the data-communications company FirstWorld Communications. Seth was the corporate-finance half of the CFO position, where he started and led the finance, mergers and acquisitions and investor-relations groups. Seth then led the process for FirstWorld's IPO in 2000. When the company decided to focus on its datacenter and hosting business, Seth assumed operational control of the remainder of the business and led the internet service provider, telephony and network-integration portions of the company. He managed the sale of these lines of business in 2001.

Seth is a summa cum laude graduate of Macalester College in St Paul, Minnesota.

Moving toward intuitive computing
The dramatic shift away from the traditional graphical user interface (GUI), driven largely by the exploding smartphone market led by Apple's touch-driven user interface first introduced on the iPhone, raises the question: what's next? As a general category, everything outside the realm of the GUI falls under the umbrella of non-traditional user interfaces. We are beginning to see manifestations of these "other" interfaces in many sectors, from gaming to home automation to healthcare. Microsoft is leading the transition in gaming with Kinect, which incorporates speech and motion detection into the user experience.

Emotiv is another example of a company pushing the bounds of user interfaces. The firm specializes in producing headgear capable of reading the brain's theta waves and translating them into computer inputs. With the motto 'you think, therefore you can,' the brain-computer interface technology being explored offers life-changing applications for disabled patients, like controlling a wheelchair, typing on a mind-controlled keyboard or playing a hands-free game. It is also being applied to gaming, but future applications are limited only by the imagination.

Figure: Microsoft's Kinect (Source: Microsoft); Emotiv's Project Epoc headset (Source: Emotiv)

New types of interfaces find use in areas from gaming to home automation to healthcare

Brain-computer interfaces can read brain waves to control computing


The glue that binds us
We spoke to Seth Levine, managing director of Foundry Group, on 30 June 2010 about his firm's investing activity, which is largely driven by an approach focused on themes that drive cycles of innovation over a period of five to 15 years. Foundry Group's investments are currently organized around six main themes, including human-computer interaction, implicit web, email, "glue" and "digital life." "Distributed" is a common denominator across all of these themes: the notion of a single point of information has been eroded by the web.

Key points
The web has eroded the notion of a single point of information. As connected devices proliferate, each with its own service and storage, technologies that stitch them all together into a cohesive view will become core to the internet.

Today, people have multiple points of communication, multiple IDs for media sites, blogs, news sources, media consumption and storage. The end result is an increasingly fragmented identity. This is an area ripe for innovation.

New web-services hubs will empower web-application designers to build and deliver lightweight applications across the web to different endpoint locations.

The propagation of lightweight web applications and the glue that binds them together may result in faster innovation cycles and benefit hardware and appliance vendors as they become more accessible through the cloud. There is the potential that the importance of a ubiquitous OS is marginalized if services are readily available.

It isn’t always optimal to interface with computable devices using 20-year-old keyboard and mouse technology. New means of interaction will evolve as form factors continue to change.

Advancements in software and hardware design will lower the cost of developing new computing interfaces, freeing developers from the constraints of today's form factors. Personal clouds invert the notion of the external cloud.

The scale of the cloud and large communities of users have opened the door to new business models which can scale rapidly and at relatively low cost.

New networks will be born from existing third-party networks, leveraging their installed bases before reaching critical mass and branching out on their own.

New web-services hubs will enable new types of lightweight applications

Advancements in software and hardware design accelerate new computing interfaces

Foundry Group's interests span a broad range of themes


Seth Levine interview summary
Foundry Group's team has been investing in email for as long as 15 years, and the innovation continues to date, as evidenced by the most recent investment in SendGrid. Email is pervasive and its role in the enterprise is well ingrained. The problem is that not much innovation has happened since the integration of email with contacts and calendaring. Seth believes this is ripe for change, especially in terms of mining existing email infrastructure for "the enterprise social graph" and tools for knowledge management.

Seth believes the world is ripe for a series of major shifts in user interface paradigms. He believes the keyboard and mouse, while disruptive technologies at one time, are not optimal means for humans to interact with machinery. The underlying premise to the investment theme is there are billions of computable devices, and it isn’t always optimal to interact with them using the traditional keyboard and mouse.

Recent advancements in computing interfaces include Apple's iPhone pinch and motion gesturing, Nintendo's Wii motion controller, and Microsoft's Kinect motion and voice recognition. Seth's firm has invested in Oblong, whose technology recognizes hand motions, akin to scenes in the movie Minority Report. While the technology currently requires gloves to be worn, the firm expects they won't be necessary within the next 12 to 18 months. Further, the price point is expected to eventually scale down to appeal to the desktop-PC market and become very affordable. Seth believes that as chip and software design evolve, there will be more innovation in this area.

Foundry Group’s investment in Sling Media was a successful learning experience for the firm in the digital-life, digital-home theme. The group realized that investments in this construct needed to be compatible with existing entertainment infrastructure, simple to use and work wherever a user may be.

Existing home and personal-media infrastructure is antiquated in terms of connectivity in and beyond the living room. Seth described two phases of "connecting" the home and believes we are in the middle of phase one, which is enabling devices to communicate. He described phase two, truly empowered connected home-entertainment networks, as the point when media can flow freely between disparate devices within home networks and beyond.

Seth highlighted Pogoplug (Cloud Engines Inc) as another interesting investment in the digital-life, digital-home theme. The product inverts the notion of the cloud: instead of the cloud residing outside the home, Pogoplug is an appliance within the home that makes all storage available by folder or media as a personal cloud. Media can be pushed from one location to another seamlessly, rather than the user worrying about which device the media is stored on. The key to the technology is how it reduces the complexity of moving and storing media between personal devices.

Seth believes we will witness an increasing distribution of where data resides. There are increasing numbers of connected devices, each with its own storage. This presents an interesting problem of how to access and manage all the data on those devices (personal media, in the Pogoplug example). Today, people have multiple points of communication and multiple IDs for media sites, blogs, news sources, media consumption and storage. The end result is an increasingly fragmented identity.

The world is ripe for a series of major shifts in user-interface paradigms

We will witness an increasing distribution of where data resides

Existing home and personal-media infrastructure is antiquated in connectivity


The distributed workforce is growing in size, and the complexity of managing it is amplified by its expanding geographic presence. The underlying physical infrastructure needed to run an outsourced business is completely abstracted. This is possible, in Seth's opinion, because the technology has evolved from colocation (renting boxes) to virtual colocation, to Amazon Web Services (hosted on multiple machines, completely abstracted and distributed across regions).

Seth discussed successful business models that leverage third-party networks to foster their own communities within them before branching out. The concentration of users on Facebook (currently more than 500 million) for example, provides for an interesting opportunity to develop a platform capable of harnessing the user base and its underlying social-media principles. Zynga is a Foundry Group investment that has been successful in this regard. Another example is StockTwits, which leverages the Twitter user base.

The rise of virtual currency is interesting to Seth, and he believes it has been a relatively successful category. Microsoft, for example, has been able to sell more than US$100m in virtual goods on the Xbox Live platform. Gaming, in Seth’s opinion is only one example of successful applications of the model. Seth’s firm has invested in BigDoor Media as a means of participating in the space.

Glue is the term Seth's firm uses to describe the web-infrastructure layer that facilitates connections between web services for content companies. In other words, glue is the technology that stitches the distributed web together. Over time, as content on the web becomes more distributed, glue will be the foundation for delivering lightweight applications across the web to different endpoint locations.

Foundry Group believes web-enabling technologies are going through a phase similar to what the enterprise experienced in the 1990s with enterprise application integration (EAI). The firm had several successful investments then including Dante Group, DataPower and Cyanea, which were all acquired. The firm has most of its investment portfolio based in the glue theme.

Gnip is an example of a current glue investment. Gnip helps web-services designers connect to all other web services - the TIBCO of web services. Seth claims it is far easier to ping a central repository of web services than it is to track multiple services available from multiple locations. Informatica is an example of a company that solved the enterprise equivalent of this problem in the past.
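A rough sketch of the "glue" idea: rather than every application polling every service separately, each publishes into one hub that normalizes events for all subscribers. The hub interface, service names and event shapes below are hypothetical illustrations, not Gnip's actual API.

```python
from typing import Callable, Dict, List

class WebServiceHub:
    """Toy aggregation hub: services push events in their own shapes,
    subscribers receive them in one normalized format."""

    def __init__(self):
        self.normalizers: Dict[str, Callable[[dict], dict]] = {}
        self.subscribers: List[Callable[[dict], None]] = []

    def register_service(self, name, normalizer):
        self.normalizers[name] = normalizer

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, service, raw_event):
        event = self.normalizers[service](raw_event)
        for callback in self.subscribers:
            callback(event)

hub = WebServiceHub()
hub.register_service("microblog", lambda e: {"user": e["handle"], "text": e["status"]})
hub.register_service("photos",    lambda e: {"user": e["owner"],  "text": e["caption"]})
hub.subscribe(lambda event: print("normalized:", event))

hub.publish("microblog", {"handle": "alice", "status": "hello"})
hub.publish("photos",    {"owner": "bob", "caption": "sunset"})
```

The design point is the same one made about Informatica and EAI: one integration layer absorbs the variety so every consumer deals with a single, consistent feed.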

Social networks foster development of new types of applications

Glue describes the web-infrastructure layer between web services for content companies

Web-enabling technologies undergoing evolution similar to EAI in the 1990s


Glen Mella, Control4
Glen Mella currently serves as president and chief operating officer of Control4, overseeing the company's internal operations. Glen brings more than 20 years of industry management experience to the company. He has held senior-level management positions within large, publicly traded software and consumer-product companies, as well as early stage technology ventures.

Prior to joining Control4, Glen served as senior vice president of marketing and sales for Triton PCS, an award-winning wireless carrier providing service in the Southeast. Glen also served as division president for CRS Retail Systems, a leading provider of point-of-sale (POS) and multichannel software to the retail industry. Prior to this position, Glen was president and CEO of Found, Inc, a software-solution provider for the integrated retail enterprise that was eventually acquired by CRS. Glen has also held executive and management positions at TenFold, Novell, WordPerfect, Dial Corporation and Frito-Lay/PepsiCo.

Glen holds an MBA from Northwestern University and a bachelor’s degree from Brigham Young University.

Toward the intelligent home
Homes today mostly comprise disparate systems with minimal integration. In the US alone, nearly US$860bn is spent annually on TVs and audio & video equipment, which often cannot stream media easily between systems, let alone rooms. While universal remote-control devices have been a staple in many homes for decades, they rarely interface with home heating & cooling, lighting and security systems.

New platforms based on open standards are being delivered to the market. Home automation, once applicable only to newer custom homes, is broadening as home-improvement and entertainment companies partner with software-platform providers to deliver systems that can be managed centrally and intelligently. Consumers are now able to "plug and play" new home devices, which opens up the addressable market for home-controller vendors.

Figure: Control4 user interface (Source: Control4)

In the US alone, nearly US$860bn is spent annually on TVs and audio & video equipment

New platforms leveraging open standards are being delivered to the market


The smarter "home suite home"
Glen Mella is the COO of Control4, a private vendor that aspires to become the "operating system for the home" through a network of partnerships with appliance and home-equipment manufacturers. In the past, the domain of network home automation has been the province of very high-end custom homes, but Control4's vision is to extend these capabilities to the mass markets. Our conversation explored the dynamics of this market as well as some of the growing potential for new applications, such as senior care and integration with smart-grid and energy-efficiency applications. The killer apps for home automation are energy management, elderly care and mobile monitoring.

Key points
Adoption of open standards based on the ZigBee protocol (802.15.4) enables integration of disparate home products and technologies. This will give rise to home systems that are networked and controllable from anywhere.

The advantage of a ZigBee-based mesh network, where messages “hop” from one device to another, is a level of resilience and redundancy for home systems. Home-control systems can also communicate wirelessly on an IP backbone.

Remote monitoring of the home is the “killer app” for home automation. The ability to monitor an elderly relative could increase quality of life and extend the age that seniors can live unassisted.

Remote control of security, climate and lighting will create greater flexibility and cost savings for users.

There will be new opportunities for developers of system control, measurement, and security applications around the networked home.

‘Metcalfe’s Law holds that the value of the network increases proportionately with the Nth new device. In his world, that was true about PCs and corporate networks, but in the home we believe the same is true.’

Summary of interview on 15 July 2010.

Full transcript follows

The ZigBee protocol (802.15.4) enables integration of home products and technologies

Remote monitoring of the home is the "killer app"


Glen Mella transcript
Glen: By way of background, Control4 is in our 7th year as a technology provider, although we really began shipping product in May 2005, so just over five years ago. Our first two years were the R&D phase for the company. Whereas some technology startups invent a single product, we introduced 24 SKUs in May 2005, because home automation and integrated systems like ours are not one product, they are a system of products. And today, if you count every size, color and form factor, the number of SKUs is in the hundreds.

Control4 is now selling our products in 58 countries. We have delivered about 140,000 home controllers, which is the core device in our ecosystem, and about 1.4 million ZigBee devices in those markets around the world. Our delivery channel today comprises 1,600 resellers in the US and Canada, where we sell directly through custom integrators. We refer to them as CI dealers for short, and they typically work with low-voltage, home-automation/home-theater, custom audio and security systems.

Our CI dealers are not typically electrical contractors, although they hire electricians when it comes to doing lighting or working with high-voltage wires. And then we also sell through specialty electronics retailers, such as Magnolia, and a number of regional electronics retailers around the country. In the international markets, in South America, EMEA and Asia Pacific, we go through master distributors who have country-wide, multiyear distribution agreements with us. They are exclusive and they represent us; they aren’t wholly-owned offices but they are our distribution partners. They perform the three core tasks that we do in North America: recruit, train and support dealers, and of course they warehouse and deliver our products.

So I just wanted to give you that as background. It still feels very much like a company that is new in this world of the IP-based home or the connected home.

Ed: Could you talk about the state of the home network and where it has been, just touch on that very briefly, and if we fast forward, what are the missing pieces till we get to the vision?

Glen: Ok, so where have we come, where are we and where are we going?

For the last decade and a half or so, there have been home-automation solutions on the market. They have been targeted almost exclusively at the high-end, new-construction market. As a benchmark, there are roughly 120 million households in the US, and at its height in 2005, new housing starts were about 1.8 million. That number is substantially lower now; I think the annualized number today is about 600,000. At 1.8 million, it was just under 2%, so in any given year in this country, somewhere between 1% and 2% of all homes are new homes under construction. Of that 1-2%, a small percentage carry a price tag north of a million dollars. So if you are only selling to multimillion-dollar new homes, you can imagine what a small niche market that is.

The companies like Crestron, AMX, Lutron lighting, the higher-end providers, were really targeting this high-end niche, so therefore it was not a well-known or well-understood category. We are all three degrees separated from somebody who is related to somebody who has a rich uncle in Park City or Tahoe that’s built a nice cabin with touch screens, etc. That has been the market and the Custom Electronics Design and Installation Association (CEDIA) channel has been the delivery environment for companies who sell expensive products, with high margins, that typically do a lot of unique custom work (custom programming, custom design). That has been the state of the industry.

We came along in 2005 and said, 'To make this a breakout category and to deliver the benefits of integrated technology systems to a much broader market, you have to do three things: you have to make it affordable, you have to make it easy to set up and use, and you have to make it retrofittable to existing homes.' If you could really do those things, not just give lip service to them, but really make integration technology affordable, easy and retrofittable, then you have the potential for a breakout market.

Ed: So the role of IP protocol and ZigBee for devices is really providing that standards framework that’s allowing for this?

Glen: That’s correct. By home automation, what we really mean is integrated systems on an IP backbone. We all have from one degree to another, the following technologies in our home: some kind of entertainment system, (99% of homes or more have a TV), and that has evolved to some type of home theater, a flat-screen TV with a surround-sound speaker system, whether it’s a TV room or a dedicated theater room or whatever.

We all have lighting control, temperature control and a growing percentage of us have some type of security or home monitoring system. So if you just took those four subsystems and said what would be the advantages of them behaving in a unified fashion rather than individually? So you’d have comfort, convenience, energy management, safety, security and peace of mind.

I could throw out 10 attributes immediately. In fact, we have a little phrase, 'Life's better when everything works together,' as opposed to the alternative, which is, 'I have all these disparate technologies.' I didn't even mention communication, so add your land line and your mobile network to the picture as well, and you start to see confusion, frustration, inconvenience, cost and high energy consumption. You start to see a lot of issues that can be solved and addressed over time, once you add standards and the adoption that makes integration possible.

So what home automation really means is integrated technologies within the home, and we believe we are one provider of a platform for doing that now, even though the entry point today for many people is purchasing some hardware to integrate those things (ie, a home controller or some type of touch screen or some type of smart remote). To your question about where the value really resides, we are quick to position that we are a software platform provider.

Today the revenue model is very much driven by hardware that is enabled by software. But that's just a point in time, a sneak peek under the hood. We are beginning, in very large numbers, to license our software to other manufacturers. I'll take my brand out of it for a second: you will see control-enabled consumer electronics on the market as soon as this year, and in pretty decent volumes in 2011 and 2012, where you don't need another remote or controller. The TV or the AV receiver or the BluRay player or the set-top box will be able to serve as the ZigBee server and the content-delivery system. They will serve as the digital encoder and transmitter to deliver the benefits of integrated systems without any new hardware.

Ed: Right.

Glen: And the business model there starts to feel more like an app-store model, where you buy a major manufacturer's TV and, besides serving as a television, by downloading a software application right on the TV through its interface, it can do things like stream audio around the home and deliver other applications for productivity, social networking, command and control, comfort and convenience, and even safety and security monitoring, all through the television.

Because it is ideally going over a standards-based network (ie, ZigBee [802.15.4]), it can talk to other devices to enable other value propositions such as energy management, so a refrigerator or other white goods (ie, appliances, dryers, ranges, hot-water heaters) can talk and share a lot of valuable information.

Devices around the home will be able to share status information, current-consumption information, and information that marries up to the new smart meters. These will manage how to handle peak and off-peak pricing and how to handle demand-response events from the utilities. All of this will allow consumers basically to put their homes in energy-conservation mode, or what we like to call energy cruise control - which has a direct economic benefit.
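A minimal sketch of the "energy cruise control" idea: when a hypothetical utility demand-response signal arrives or prices spike, deferrable appliances drop into a low-power mode while essential loads are left alone. The device names, price threshold and modes are illustrative assumptions, not Control4's implementation.

```python
PEAK_PRICE_PER_KWH = 0.30   # assumed threshold above which the home sheds load

DEVICES = {
    "dryer":        {"deferrable": True,  "mode": "normal"},
    "water_heater": {"deferrable": True,  "mode": "normal"},
    "refrigerator": {"deferrable": False, "mode": "normal"},  # never deferred
}

def on_utility_signal(price_per_kwh, demand_response_event):
    """Apply 'energy cruise control' when prices spike or the utility calls an event."""
    if demand_response_event or price_per_kwh > PEAK_PRICE_PER_KWH:
        for name, device in DEVICES.items():
            if device["deferrable"]:
                device["mode"] = "low_power"
    return {name: d["mode"] for name, d in DEVICES.items()}

print(on_utility_signal(price_per_kwh=0.42, demand_response_event=False))
# {'dryer': 'low_power', 'water_heater': 'low_power', 'refrigerator': 'normal'}
```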

So basically the reason we became early believers in and early adopters of ZigBee is that we have a vision of these disparate systems coming together. We were founding members of the ZigBee Alliance and in fact very involved in our early years, 2005-06. Control4 was one of the few companies with actual production ZigBee out there, selling in real homes around the world. Other than the smart meters, which in unit volume will eclipse us pretty soon given that 30-40 million meters are on order, we believe that with 1.5 million ZigBee nodes we are the leader in home mesh networks around the world.

We very much have a vision of these disparate systems over time coming together. Some of it will be driven by the manufacturers themselves, and some of it takes a neutral third party‘s platform - companies like us, that will be the glue, so to speak, to bring different systems together.

Ed: So for the device manufacturers there is ultimately a tradeoff between the value of having a proprietary integrated system and the ability to cross networks and integrate with devices from different manufacturers, where the value is really a network effect.

Glen: Yeah, it goes back to Metcalfe's Law, which holds that the value of the network increases proportionately with the "Nth" new device. In his world, that was about PCs and corporate networks, but in the home we believe the same is true. In the past few decades, if you went to a product manager at one of the major lighting-control manufacturers, Vantage, Lutron, Lighttouch, Leviton, any of those, and said, 'Hey, on your product-requirements document for next year's light switch or dimmer switch, do you have a requirement that says the light switch needs to talk to the TV or refrigerator?', I'm 99.9% confident that the answer today is no. They just aren't oriented to think that way, and therefore it takes a third-party integration platform to enable that.

I think in the future that will change, and we are already seeing it where many of those companies are now adopting ZigBee. As I mentioned, the white goods . . . you don't have to go far to talk to the GEs, Sears and Whirlpools of the world and ask where ZigBee, energy management and smart grid play in their road maps. They all have a strategy.

We believe that in the mainstream market, not just the luxury high-end market, we will see the benefits of, and the continued trend towards, integrated systems for entertainment, lighting, temperature control, safety, security and energy management. The train is leaving the station, and big retailers like Best Buy and big manufacturers will all be introducing strategies. If you listen to Brian Dunn (CEO of Best Buy), all he talks about is the connected home. That is their big vision, and they want to be one of the big enablers, so that speaks right to what we've believed since the inception of our company.

Ed: Now Glen, as we move down this path, you have companies that are vying to take control of this convergence of the PC, the TV and the home-entertainment system. Microsoft, Google TV and Sony all want to be the center of the living room. Additionally, you have the carriers and cable providers who want to control those set-top boxes. How do you see the relationship playing out between the content providers, the pipes coming into the home, and those providing connectivity, working with the traditional software platform? I should include Cisco in there as well, because they are making a lot of wireless networking equipment. As Control4, who is providing that platform, along with some of your competitors, how do you see those dynamics playing out? Where are the integration points likely to occur, and will there be conflicts down the road?

Glen: Well, you are right. Most of those companies have had initiatives for quite some time. Microsoft launched Microsoft At Home sometime around 1994. One of the challenges has been how you define the digital home - for many of the companies you mentioned the digital home still today begins and ends with photos, music and movies. Those are great and relevant to everybody, but many companies are ignoring the traditional controls and automation markets for lighting, HVAC, multiroom audio and security systems. All of these have traditionally been more niche markets but are becoming increasingly relevant.

We believe it is all about an ecosystem, that no one company can possibly develop or dominate all of the devices, features and functions that the digital home of the future and the consumer will demand. We believe that standards are really important and that they matter a lot. Now, it also has to be the right tool for the right job. ZigBee is great for low-cost, energy-efficient delivery of small packets of data, such as command-and-control signals: when you tell a light switch to go from 60% to 100%, that's a small packet of data. Streaming a song or a movie requires broadband, so you need WiFi or Ethernet or something else. But for the benefits of mesh networks and content delivery, we certainly believe that all companies are best served by participating in and influencing the standards body.

The utilities are certainly adopting ZigBee as the de facto standard for smart-grid meters. Early on, we used to jokingly refer to the Z wars, Z-Wave vs ZigBee. Today, some of those decisions seem to be becoming clearer, and a lot of mainstream large companies are lining up behind the ZigBee Alliance because it is a multicompany IEEE standard, as opposed to a proprietary technology dominated by a single company. Let's use a recent example to make a point, from the hospitality industry. Back in December, a major hotel complex in Las Vegas opened, the MGM CityCenter; if either of you have stayed at Aria . . .

Ed: I’ve stayed at Aria.

Glen: I was going to suggest you do it sometime. We automated 4,005 rooms there, but really it's an ecosystem. You have a Control4 controller talking ZigBee to an Access doorbell kiosk, a SafeLock deadbolt, a Bartech minibar, and MechoShade curtains and sheer drapes, and you have Control4 doing the AV, the lighting and the thermostats. That's five manufacturers on a ZigBee Pro, ZigBee 1.1 platform in 4,005 hotel rooms. As you can attest, that's not just the luxury suites; that's the standard US$99- or US$149-a-night rooms. That's a nice microcosm of what we see in the home of the future: collaborative technologies on an industry-standard platform as the backbone, whether it's IP or whether it's mesh networks like ZigBee.

There will be others; I don't mean to imply that Control4 is a ZigBee company. We are a lifestyle and automation company. It's just that today ZigBee seems to be a robust, affordable, available technology. There are all sorts of questions when you go to markets where it's all concrete and block construction. In Latin America, Europe and the Middle East, you don't have the luxury of being able to snake Romex and CAT-5 through existing homes. If there isn't any conduit, then you have to rely on wireless. Everybody wondered whether ZigBee was going to work or whether it would coexist with other wireless signals. We have been just delighted in essentially every region of the world. We just don't see problems.

There needs to be some thought given to how the mesh network is designed. ZigBee likes to have lots of distributed devices around the home because it is not a point-to-point wireless concept; it is a hopping concept. How do I hop to the next light switch or the next thermostat that's 6 or 8 feet away? It is not 'how do I get a signal' in a WiFi paradigm, where all you are used to thinking about is 'my router is downstairs in the basement and my PC is on the third floor in the study; can I get a signal?' That's our paradigm in a WiFi world, but in a mesh network it's all about signal hopping.


Ed: So there are built-in redundancies in the mesh?

Glen: Yes, self-healing and always on. It's robust for the consumer: even if you lose power to some nodes, the mesh hops around in milliseconds and finds another path. The consumer wants a robust, always-on, self-healing type of environment, so when they want to change something, a setting or an alert in their home, they know it is going to work even if they are halfway around the world.
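A toy illustration of why a mesh is self-healing: if one hop fails, a breadth-first search over the remaining nodes finds another path. The room layout and node names are made up for the sketch; real ZigBee routing is more sophisticated than this.

```python
from collections import deque

# Hypothetical home mesh: each node lists the neighbors it can hop to.
MESH = {
    "controller": ["hall_switch", "kitchen_dimmer"],
    "hall_switch": ["controller", "thermostat"],
    "kitchen_dimmer": ["controller", "thermostat"],
    "thermostat": ["hall_switch", "kitchen_dimmer", "door_lock"],
    "door_lock": ["thermostat"],
}

def find_path(mesh, src, dst, failed=frozenset()):
    """Breadth-first search for a hop path, skipping failed nodes."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in mesh[path[-1]]:
            if nxt not in seen and nxt not in failed:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(find_path(MESH, "controller", "door_lock"))
# ['controller', 'hall_switch', 'thermostat', 'door_lock']
print(find_path(MESH, "controller", "door_lock", failed={"hall_switch"}))
# ['controller', 'kitchen_dimmer', 'thermostat', 'door_lock']
```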

The other thing we see happening: a lot of this was driven in the last 10 years by the adoption of entertainment, and we believe that an equally strong motivator in the coming 10 years will be energy management. Today it is still relatively young, driven by the socially conscious. We used to just have the Prius, and now every manufacturer has a hybrid. You are seeing that phenomenon beginning to emerge, and for the first time ever, retailers like Best Buy are having customers walk up to sales associates and ask how much current that 60-inch TV they are trying to sell draws. Then the room goes quiet, because the salesperson is ill-equipped to answer that question. In the world of Energy Star-rated appliances, they have been talking that lexicon for some time, because everyone knows refrigerators draw a lot of current, but in the world of entertainment devices it's a fairly new phenomenon, and yet it is happening.

Manufacturers are learning to make their hardware devices not only energy efficient, with functions like standby and power-down modes, but on their roadmaps they are also beginning to design ways for these devices to report their energy usage and to make them more controllable. In the entertainment world, you have all these control and connection protocols. Just think if you have wired your own home theater: composite, component, S-video, HDMI, USB, Ethernet, IR, etc. We jokingly call them the "goes-intas" and "goes-outas" of the electronics industry. That's why every receiver has this plethora of connections and ports on the back.

Wouldn't it be nice if all you needed was USB or Ethernet? That's not the real world, at least not today, and so as we move forward, these new manufacturers coming into the ecosystem, like the white-goods manufacturers, IP-camera and security-system makers, and even makers of wireless door locks like Black and Decker, are coming to companies like us and saying, 'What should we use?' We can say, 'Well, if you use ZigBee for this type of purpose, it works great.' There are well-defined protocols and multiple chip manufacturers, so there is at least a way for companies like that to bring their technologies into a connected ecosystem. Black and Decker is a really good example: Kwikset and now Baldwin have come out with ZigBee wireless-controllable door locks.

It was a natural adjacency. If you could reach over to the nightstand and push one button and turn off all the lights in the house, why not push another button and lock all the doors? Or better yet, have them lock at a set time. That’s a real-life example of people bringing another piece of technology and tying it into an IP/ZigBee-based infrastructure to enable new kinds of lifestyle.

Ed: When you think about security, traditionally you have had home-monitoring systems and networked IP-based cameras that are getting less and less expensive to outfit in a home. When you look at the rise of motion-detection capabilities, such as Microsoft's Kinect, do you see an opportunity over the next 10 years for these technologies to converge to the point where somebody can walk into a room, facial recognition identifies them, and they can use gestures to turn on lights or voice commands to control the environment around them?

Glen: We outfitted a demo suite with a large networking vendor recently at a fairly high-profile initiative around smart-connected cities in Asia. When you walk into this demo suite, there are no devices, switches or interfaces visible on the wall. They are actually capacitive and they are behind the surface of the wall with about a 6-inch throw out in front. When you walk up to the area where the light switch would be and put your hand in front of it, the lights come on. That’s all today’s technology, no smoke and mirrors or demo-ware, so we are beginning to deliver solutions like you just described already.

Certainly I believe 10 years from now there will be broader-based adoption of the technology. One way to look at it is that the word "control" connotes that I do something and I get a response: I push a button or I set an alert. The word "automation" we like to use in the context of a home that is already set to know what my needs and preferences are, so I don't need to push buttons and controllers. The home will already know.

An example would be, in my foyer, when you come in from the garage, if it's after dark, the main light in the foyer comes on to 50% for three minutes and then goes off by itself. That's simply so my wife or I, if we have our hands full, won't trip on a soccer ball or a longboard, which is likely to occur in the Mella household. The scene I just described is enabled by a motion sensor and a programmable dimmer. The time it took me to describe it is about as long as it took me to program it.

In a graphical environment, using our software tool called Composer, you click on the little icon for the light switch, ramp it to 50%, set it to three minutes, and you're done. The reason I said after dark instead of a set time is that we have a built-in astronomical clock based on my zip code, which understands what time darkness falls in different seasons of the year. That's not expensive, it's both a safety and a security feature to some degree, and it's very simple to enable.
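As a rough sketch of the kind of rule Glen describes (illustrative Python only, not Control4's Composer tool; the Dimmer class and the sunset_for helper are hypothetical stand-ins for the devices and the astronomical-clock lookup):

    import datetime

    class Dimmer:
        # Stand-in for a ZigBee-controllable dimmer; a real deployment would use a driver object.
        def ramp_to(self, level):
            print(f"foyer light -> {level}%")
        def schedule_off(self, after_minutes):
            print(f"foyer light off in {after_minutes} minutes")

    def sunset_for(zip_code, on_date):
        # Hypothetical astronomical-clock lookup: a real system derives sunset from
        # the latitude/longitude of the zip code and the calendar date.
        return datetime.datetime.combine(on_date, datetime.time(18, 45))

    def on_foyer_motion(dimmer, zip_code="00000", now=None):
        # Motion sensor fired: after dark, bring the light to 50% for three minutes.
        now = now or datetime.datetime.now()
        if now > sunset_for(zip_code, now.date()):
            dimmer.ramp_to(level=50)
            dimmer.schedule_off(after_minutes=3)

    on_foyer_motion(Dimmer())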

Imagine the examples that have to do with security, for instance. These programmable, wireless-controlled door locks also have a keypad, and the keypads have unique codes for members of the family, so you could get an email or a text message when your son or daughter comes home from school and disables the alarm to walk into the house, either by using the PIN on the alarm system or the keypad on the door lock. There are lots of lifestyle benefits enabled when different technologies can work together.
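A hedged sketch of that alert logic (the names, codes and send_text helper are hypothetical illustrations, not a Control4 or lock-vendor API):

    FAMILY_CODES = {"1234": "Sam", "5678": "Emma"}   # per-person keypad codes

    def send_text(to, message):
        # Hypothetical notification hook; a real deployment might use email or SMS.
        print(f"text to {to}: {message}")

    def on_door_unlock(code, parent_phone="+1-555-0100"):
        person = FAMILY_CODES.get(code)
        if person:
            send_text(parent_phone, f"{person} just unlocked the front door")

    on_door_unlock("1234")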

Ed: What becomes even more intriguing is the rise of the mobile internet and the wireless devices that get embedded in cars.

Glen: When it comes to the home, we believe remote monitoring of your home is a killer app for home automation. It is one thing to do all these things when you are in the home, and that's great, although you get all the joking that we are all going to become couch potatoes and never have to get up from the sofa. But there are many other opportunities. I live in a three-story house - as the baby boomers reach retirement age and start to settle into senior-living environments, getting up three flights of stairs to turn off lights at night becomes more than trivial. So lighting control, which has been a niche market, is going mainstream because it's affordable, easy to do and has huge implications for our aging society.

Ed: Absolutely, when you have demographics of aging populations not just in the US, but particularly in Western Europe and Japan.

Glen: One other thought about content. More and more, the paradigm will be media that is ubiquitous - content that's available and accessible at any time, streamed to me as opposed to content I own. The question that used to be in vogue a few years ago was 'What's on your iPod?' Now we want to know what your affiliations are and what groups matter to you. What is the content relevant to them that can be delivered to your mobile device? If you have a distributed audio system in your home, we can deliver Rhapsody inside a Control4 system, so you can access any one of 5 million songs at any time from any room. Brands aside, just being able to say 'I want to consume this content, whether it's sports news, stock quotes, entertainment content, audio or video, right now, in this locale, for this purpose' - we definitely think there is a lot of value in what you called the delivery channel. I call it the infrastructure: when the infrastructure is in place in your home, you can easily access and consume the content. That really makes all the difference. Finally, again, it's got to be affordable and it's got to be easy to do.

Ed: This is a broader move toward content becoming more of a service. We have looked at digitization of content and software as well, we are seeing this convergence.

Glen: For software, I think the application paradigm is here to stay. An example really hit home for me about a year ago, when I downloaded an app called MotionX GPS Drive for US$1.99. I asked what I needed to do to get voice command. Well, it's US$3/month or US$24/year. So an annual subscription is the US$2 for the app and US$24 for the voice command - call it US$26 - and I promise you, if you put it on your dashboard side by side with your US$200 Garmin Nuvi, it is 100% the same functionality as you drive down the road. Same voice: 'turn here, go here and here.' Now if we aren't seeing US$26 vs US$200 as disruptive, we aren't paying attention.

When was the last time you paid more than US$10 in the app store for any app? The availability and creativity of combining four devices in a smartphone - if you think about it, what’s an iPhone? It’s a camera, it’s a GPS, it’s a phone, it’s a PDA, and now you have 200,000 apps and 3 billion downloads in the first two years.

Now in the home, we have just launched our 4Store. Think of your TV as the interface, and in the home you have more than four technologies: entertainment, HVAC, lighting and all the things we have been talking about. Think about the endless permutations when you can enable apps, ie, software that takes advantage of all those different technologies, once you have the right platform and infrastructure in place. You will see almost every platform manufacturer, platform provider or software manufacturer have some kind of app metaphor, whether it is delivered to devices like iPads or iPhones, or in our case those devices in addition to the TV itself, because for many people that is the primary interface.

Think of the LCD screen on your refrigerator right there in the kitchen, which is often ground zero for the female head of the household and where a lot of the buzz and activity goes on in the home. Imagine the apps that could be downloaded to that screen: Google calendar, the 10-day weather forecast, all those types of things that we are doing now and will certainly be broad based in 10 years.

Ed: When you extend that into mobility, with always-on connectivity, you have the opportunity for different information services, if you are in the kitchen if you want a recipe streamed to you with a video of instructions of how to make something.

Glen: It doesn’t matter where you are.

Let's say you're provisioning something from Verizon or Comcast. You're finally going to do VOIP in your home. Today, the best Verizon can do for you is 'I need you at your home from 8-5.' You say, 'That's ridiculous.' And they say, 'Well, that's the best I can do.' Imagine that versus getting a call on your iPhone and it's the Verizon technician. He's at the front door and all he has to do is drop the box off. The problem is, I am all the way in Sydney and I won't be back for a week. No problem. I am looking at him through the cameras on my front doorstep. Let me disable the alarm system and turn the deadbolt on my Kwikset door lock; go ahead and set the box inside the door. Thank you, I'm watching you, now back away, close the door and I'll lock the deadbolt and turn the alarm back on.

That is today; none of that is the future. You can just think of the social implications of what we just described. It facilitates portable, mobile, always-on access, monitoring and control of my home.

Ed: That’s a lot of opportunity for custom applications.

Glen: Yes, can you imagine? For everything that has to do with our children, families, profession and the service providers we work with. It is very disruptive and not pie in the sky, because what I am really talking about is broad adoption 10 years from now. So should we talk a little more about elder care?

Ed: Please, that’s a great topic.

Glen: The demographics drive it. It's not a "nice to have", it's a "have to have". Just look at how many more assisted-living facilities will have to be constructed in the coming 20 years.

Today, since we are talking about the home, the question is how do I extend livability for the aging population in the home, and what is the intrinsic value of keeping grandma and grandpa in their home for another x number of months or years? You can literally put a price on it - it's between US$5,000 and US$7,000 per month - but intrinsically, can you put a price on grandma being able to live safely, conveniently and comfortably at home for two more years? It's almost disrespectful to try to put a price on that.

So, things like utilizing your social network: there is a third-party company now, a partner of ours, called CloseBy Networks. They sell a software system on top of a Control4 deployment which essentially enables your social network, in most cases your children, to assist aging parents with these issues. They can monitor things like, 'It's 10am and grandma hasn't gone into the master bathroom; there may be something wrong because she usually gets up at 8,' or 'She hasn't gone into the medicine cabinet,' or 'She's gone in three times already and she's not supposed to,' or whatever the case may be.
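CloseBy Networks' actual product is not shown here; the following is only an illustrative sketch of the kind of rule Glen describes, built on hypothetical sensor events from ZigBee motion and contact sensors.

    import datetime

    # Illustrative activity log from hypothetical sensors in grandma's home.
    events = [
        ("2010-09-13 07:55", "kitchen_motion"),
        ("2010-09-13 08:10", "medicine_cabinet_open"),
        ("2010-09-13 08:40", "medicine_cabinet_open"),
    ]

    def parse(ts):
        return datetime.datetime.strptime(ts, "%Y-%m-%d %H:%M")

    def check_morning_routine(events, now):
        alerts = []
        bathroom = [e for e in events if e[1] == "master_bath_motion"]
        if now.hour >= 10 and not bathroom:
            alerts.append("No motion in the master bathroom by 10am")
        cabinet = [e for e in events if e[1] == "medicine_cabinet_open"]
        if len(cabinet) > 1:
            alerts.append(f"Medicine cabinet opened {len(cabinet)} times this morning")
        return alerts

    print(check_morning_routine(events, parse("2010-09-13 10:05")))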

Another is actually assisting seniors to get a movie set up and playing from a remote location, or setting lighting scenes to help them settle down for the evening. Then there are obviously the healthcare implications, where people are starting to experiment with ZigBee monitoring devices. You can have a pad next to the bed or underneath the mattress so that when they step out of the bed a signal is given. There are even ZigBee garments being experimented with that can monitor vital signs. I don't want to go too far down that path, but in our world, things like comfort, convenience, safety, security and remote monitoring - checking in, so to speak - are enabled by the standard technology that exists.

Ed: And I would think that that’s also relevant for baby care as well.

Glen: Oh sure, the nursery. In fact, probably more than you are even thinking, not just monitoring the baby but helping the baby fall asleep with the right types of content.

Ed: Glen, do you have thoughts you would like to leave us with?

Glen: The things I wanted to hit on are that our lifestyles will be dramatically enhanced by the integration of disparate technologies on an IP backbone. That's the high level: life's better when everything works together. Along with others, we are providing a platform for the digital home that helps bring those devices together in a way that is affordable, convenient, easy to set up and easy to use, and that can be operated in both wired and wireless modes so it is equally applicable to the new-home and existing-home markets.

The other big drivers will be energy management, elder care and certainly mobile, as you described. Those three things are the killer apps, or killer value propositions, for integrated homes. Probably the most exciting breakthrough paradigm has been the app-store paradigm, where we see this working beautifully on TVs, receivers, refrigerators and other types of interfaces beyond your phone and iPad. These are breakthrough lifestyle enhancements that many thought were way out of reach.

I remember watching one of the Star Trek movies, when they came back to Earth to save the whales, and we all laughed at how Scotty walked up and picked up a mouse like a microphone, as if he was going to talk into it, because he didn't know what it was. The more profound thing he did was walk up to a screen and, from 100 years in the future, intuitively take it over and start using it.

It reminds me of when I rent a car. Detroit figured out user interfaces years ago. They understood that you have to be able to figure it out in 15 seconds or you lose it. The high-tech industry has been slow to do this, but I think where we are going is that in your home, whether you are walking in the front door, into the entertainment area, the kitchen or the bedroom, our homes will anticipate our needs and preferences, just like the hotel room of the future. When you walk in, the lighting will be set to your preference, the music you like will be playing, and the temperature will be exactly right for that time of day. That's not difficult; it is the continuation of the trajectory that we are on. We definitely hope to be a part of that ecosystem.

Geoffrey Moore, TCG Advisors

Geoffrey Moore is a best-selling author, a managing director at TCG Advisors and a venture partner at MDV. Recognized as a leading business consultant to large companies facing formidable strategic challenges, Geoffrey works with established enterprises in his current role at TCG Advisors.

Geoffrey has made the understanding and effective exploitation of disruptive technologies the core of his life's work. His books Crossing the Chasm, Inside the Tornado, The Gorilla Game, and Living on the Fault Line are best sellers and required reading at leading business schools. Earlier in his career, he was a principal and partner at Regis McKenna, Inc, a leading high-tech marketing strategy and communications company, and for the decade prior, a sales and marketing executive in the software industry.

He holds a bachelor's degree in literature from Stanford University and a doctorate in literature from the University of Washington.

Higher level programming skills gaining value

With the commoditization of hardware and a shift in focus towards software, a war for talent is emerging for software programmers and engineers. In recent decades, Wall Street was a magnet for algorithm jocks and sophisticated software engineers. Competition for that traditional career path has arisen out of the huge successes of ecommerce and the increasing demand for those with the know-how to construct the increasingly complicated programs required to serve the needs of the virtual consumer. Google and eBay offer compelling models of the scale required and the huge success available for those who are able to capture a market through programming and development.

The Bureau of Labor Statistics reports in its Occupational Outlook Handbook that computer and software engineering are among the occupations expected to grow most rapidly and add the most new jobs over 2008-18, at 32% over the decade, a rate drastically higher than the average for all occupations. Interestingly, computer programmers are expected to see a 3% decline in employment opportunities over the same period, largely because the work lends itself to outsourcing. This negative trend is not indicative of overall demand for the service; rather, it reflects that programming is becoming a commodity service that can be done more cheaply elsewhere.

Projected occupational outlook

Occupational title                              Employment 2008   Projected employment 2018   Change 2008-18 (number)   Change 2008-18 (%)
Computer software engineers and programmers     1,336,300         1,619,300                   283,000                   21
Computer programmers                            426,700           414,400                     (12,300)                  (3)
Computer software engineers                     909,600           1,204,800                   295,200                   32
Computer software engineers, applications       514,800           689,900                     175,100                   34
Computer software engineers, systems software   394,800           515,000                     120,200                   30

Source: Bureau of Labor Statistics

A war for talent is emerging for software programmers and engineers

Computer and software engineering jobs are expected to grow 32% through 2018

Higher level skills will be increasingly in demand

Building systems of engagement

Geoffrey Moore is inarguably one of the most widely read authors on the subject of technological innovation and adoption. His books Crossing the Chasm and Inside the Tornado have codified strategy for countless technology firms. Geoffrey's activities as a VC and board member of several companies (including Akamai) give him perspective as a long-term investor as well as insight into the operational realities of a public company. Our conversation focused on a number of longer-term trends including globalization and the impact of cloud computing on enterprise architecture.

Key points

Globalization will drive commoditization of what was formerly protected.

In order to protect margins, companies will outsource more, which drives globalization and commoditization. The flexible culture of the US positions the country better for adaptive change than other societies such as Japan and England.

On a micro level, enterprises will move from a focus of building “systems of record” to building “systems of engagement.” Enterprises will adopt communication and collaboration technologies from the consumer world that accommodate security and regulations, while enabling unparalleled flexibility and customer responsiveness.

The transition to the cloud is resulting in the reengineering of the stack. Cloud scale is available to all which drives the importance of outsourcing functions that are not core to the business. Clouds are displacing datacenters at the margin.

The scale of the cloud introduces both the need and the possibility for real-time predictive analytics, superseding traditional business-intelligence applications.

New companies with predictive analytics capabilities will continue to gain traction. There is a real war for talent around algorithm jocks. The quants used to go to Wall Street and now they go to Main Street. This is because the next phase of economic invention is global ecommerce.

Summary of interview on 20 July 2010.

Full transcript follows

Enterprises will move from building “systems of record” to “systems of engagement”

Predictive analytics will continue to gain traction

Geoffrey Moore transcript

Ed: Geoffrey, with your understanding of content and how it is increasingly becoming mobile and enabling new types of services and applications, it would be exciting to hear your thoughts and views on coming advances.

Geoffrey: We have been doing a bunch of work in our firm around what we call the future of enterprise IT. The notion that the economic and social consequences of the internet are still unfolding, and will continue to unfold over this decade and the next, is supremely important. For us, it represents a type of virtuous cycle that started with outsourcing - outsourcing to developing economies, most notably China and India. The effect of this was the transfer of wealth from developed to developing economies, which is something we have tried to do for generations.

My parents’ generation tried to do it with foreign aid, and it did not work. Outsourcing does work. Amazingly and astonishingly, these two economies have come onto the scene at a pace that would just have been unimaginable in my parents’ generation. It is because the work was outsource-able and the internet was a critical enabling ingredient, facilitating the explosive growth. The result is a global economy; it creates competition for developed economies. The competition is between low-cost developing vendors and higher-cost, presumably potentially higher-value vendors.

It also creates new markets, a phenomenon we are watching today. In the 19th and 20th centuries the economic sun crossed the Atlantic Ocean. I think we are going to watch the economic sun cross the Pacific in the next decade and a half. The great growth economies of my children's generation will be China, India and others. I think Brazil has challenges and I think Russia is impossible. In that context, the way developing economies enter the world economy is always from the bottom up.

Firms must play the low-cost commodity alternative at the beginning of their entry into any of these markets and at the beginning of their journey as a country. As that journey continues, they want to work their way up the value chain. As they do, they pose increasing challenges to companies in developed economies who have higher cost structures and typically higher social burdens.

What has happened in the US economy, which is where I spend all my time, is that globalization has resulted in the commoditization of things that were historically more protected. That puts developed-economy companies on their mettle to step up their game. What they have to do is increasingly differentiate so they can charge the premium necessary to support a higher standard of living and higher cost structure. To do this, they outsource more: outsourcing drives globalization, globalization drives commoditization, commoditization drives differentiation, and differentiation drives specialization. The need to specialize leads to the next round of outsourcing.

The result is this virtuous circle, which I think is going to be continually emulated. For example, as companies in China and India start looking a little more like developed-economy companies, they are going to do the same thing. In the first round of service outsourcing, the chief place to take it was India. However, if you look at Wipro, Tata or Cognizant, they are doing two-tiered things: they are doing the higher-value outsourcing themselves, but passing the commodity stuff through to a second tier of outsourcing.

I can’t see any reason for this not to continue, unless there is a breakdown in the global communication system. I mean a world war or a catastrophic reason, but it’s hard for me to think of an organic reason why it would stop. And I’m very enthusiastic about this idea because my belief is wherever you can move wealth, you can create a middle class. Wherever you can create a middle class, you can uproot and essentially diminish the impact of terrorism. It is sort of an economic goodness argument.

My view of the coming decade is that the key themes are the rise of India and China as the great growth markets and the need for America and Europe to understand how to offload what they can no longer do effectively. They must also be able to reinvent themselves to find the next generation of invention. I think America is still the invention engine of the world. Part of that is because of the rule of law, and part of it is the culture of being willing to accept fast failure through mistake-making, that iterative, course-correcting approach toward a goal, which is anathema in a place like Japan and very, very rare even in a place like England.

So that is a macro view of change. In that context, my colleagues and I look at the technology sector and we talk about the systems of record. For us, it is analogous to laying down the interstate highway system. Until you lay out the interstate highway system you need not worry about hotels, motels and gas stations. However, once you lay it down you have a super infrastructure to build the next generation of capabilities upon.

In our view, the systems-of-record infrastructure is done and we are now moving on to what we are calling the systems of engagement. It's a second layer on top of the systems of record, particularly for enterprise and business-to-business interaction. The business-to-consumer systems of engagement didn't need to be laid upon systems of record, because except for the transactional ecommerce sites, it was a media proposition. As long as the advertiser could figure out a way to connect the engagement of the consumer with some economically profitable thing to do - presumably presenting them a relevant ad - then you can build this connection. There is a connection between the media model and the enterprise, but it is at one level of abstraction, through some kind of behavioral-targeting and advertising-reconciliation mechanism. So the consumer experience has not had to construct itself around the limitations of systems of record. It has been able to freewheel, because basically it was all free.

In the enterprise, you can't do that. In the enterprise, you have to live with the constraint of the system of record. The question is, if you could take the dynamics and productivity-improvement capability of the system of engagement and somehow port that into the enterprise, who would care and why? We believe you would care because systems of record changed the life of the front-line worker and the executive suite, but didn't really change much for the middle of the organization. In this new outsourced world, the middle of the organization is the cartilage that holds these outsourced value chains together. It continually troubleshoots and renegotiates them; it keeps them alive. That is a really tough exercise in communication and collaboration, and it turns out a lot of those consumer technologies are all about communication and collaboration.

Ed: This has been the challenge of the enterprise since the rise of these systems of record. If you go back to the 90s, the vision behind enterprise portals and collaboration technologies has been piecemeal and very difficult to realize. With the advance of collaborative technologies and the increasing integration facilitated by interoperability standards, things like XML allow standardized information interchange which does provide a kind of WD-40 to grease the wheels.

Geoffrey: It is interesting to see how this goes. We are doing a very interesting project right now with the Association for Information and Image Management (AIIM), the enterprise-content-management guys. They grew up in the era of systems of record. All of their document management is tied to the conventions of systems of record; there is an enormous concern about security, reliability and protection.

Now the world is saying that is good for our contracts, but portals are not good for interacting, and the truth is that I don't know what I am looking for. Even if I did know what I was looking for, I am not sure I could even find it on your portal. I need another mechanism, and by the way, I found it in the consumer world. When I go shopping, or when I look for a restaurant, or when I'm trying to figure out if there is a movie on, or what my friends are doing, all of a sudden these magical facilities show up and lead me through it. Nobody sent me a manual and I didn't have to get trained on it. They just work. Can't we do that inside the enterprise? Of course, the content guys are going, 'Oh my gosh, what about security? Oh my gosh, what about privacy? Oh my gosh, what about this?' It is a little bit like saying, 'Well, of course you can date, as long as her parents and my parents come along and sit in the back seat.' Well, that's not going to create much romance.

So they are trying to figure out which conventions of systems of engagement they can embrace and how they can facilitate these interactions of engagement. Right now what they look like is a bunch of Soviet designers trying to design an iPhone. It's like, 'Yes, we can design an iPhone. It only weighs 72 pounds and you don't actually use your fingers, you use your feet. But yes, it's an iPhone, of course.' So this nuance of trying to capture the spontaneity, the fast cycle time of the consumer system of engagement and apply it to tracking down a problem in the supply chain, expediting an order or dealing with an irate customer - the application is real business, but it's interactive, not just transactional. This is a huge problem for anybody who is an incumbent of the systems-of-record era. To reinvent themselves, they have to let go of so much.

Ed: There is so much workflow and process that has been codified in these systems. In a sense it provides a static obstacle to a more extensible view of processes. Additionally, one of the liberating characteristics of the consumer world is that there is this ocean of data that you can pull from. What I believe you were getting at is that there is a predictive model of prompting information to users.

Geoffrey: Actually, my experience is that a lot of this actually comes from people. For example, when you go to the Dell customer-support site, there are 40,000 non-Dell employees who log onto the site to help answer your questions, because they are people helping other people. So much of what people are trying to do in business is in the moment and isn't in the database; it's 'you have got to call Harry and talk to Harry.' You and I grew up in an era of computing, and we have talked about IT as a computing capability. I think that will still continue, but I think it's now context, not core. Communication, collaboration and next-generation content are another matter; the original content management was in alliance with computing.

So the AIIM guys, with enterprise-content management, linked content to computing. Then they linked the document to the systems of record. Interestingly, media went in a completely different direction. Media went to communication with content that was free, open and completely unfettered. They did this even to the point that they lost their ability to charge for it. I mean with Napster and all those challenges. Now they are starting to recoup their ability to monetize it, but it went way over the line as far as the enterprise is concerned. Interestingly, I think they discovered collaboration in the middle by accident. I don't think anybody really set out to create Twitter or Facebook as collaborative institutions. They just emerged. Texting also became a collaborative art.

Now we have these two things, which were born under two different stars, at different astrological times. I think there is going to be this very interesting attempt to say, 'How can I capture the dynamic of communication and collaboration as it has unfolded outside the enterprise, but still bring it back into the rule of law?' As an enterprise I cannot distance myself from that responsibility; I have to accept it. So it is a very interesting challenge. The tendency is you have to authenticate the user. 'Yea, I know, but every time I have to authenticate the user I lose half of them.'

Ed: There are these obstacles to governance, which create barriers and add an intrusive extra step to the workflow.

Geoffrey: Exactly, and people didn't want to do it, and basically anybody who was any good didn't do it. So whatever knowledge you did capture was from people who probably weren't the people you wanted it from, but were the only ones willing to cooperate with you. We have been working towards this, and it is not like people haven't cared about it or tried hard. If you watch what's going on in the consumer world with behavioral targeting, you start seeing how much we can infer from people's behavioral tracks. Therefore, you can begin to build knowledge bases independent of a contributor. You have the difference between authored and emerged content. It is still early days, but I guess the security guys are pretty darn good at this already. People do try to modify the ads you see based on the clues they can pick up from what sites you have been to, etc.
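A toy illustration of that inference (the site-to-topic mapping and the browsing trail are made up; real behavioral targeting is far more elaborate): a profile "emerges" from behavior alone, with no form ever filled in.

    from collections import Counter

    # Hypothetical mapping from visited sites to interest categories.
    SITE_TOPICS = {
        "caranddriver.com": "autos",
        "edmunds.com": "autos",
        "espn.com": "sports",
        "zillow.com": "real estate",
    }

    def infer_interests(browsing_trail):
        # Build an "emerged" profile from the behavioral track.
        counts = Counter(SITE_TOPICS[s] for s in browsing_trail if s in SITE_TOPICS)
        return counts.most_common()

    trail = ["edmunds.com", "espn.com", "caranddriver.com", "edmunds.com"]
    print(infer_interests(trail))   # [('autos', 3), ('sports', 1)] -> show a car ad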

Ed: So really what this is trying to replicate is the intuition or the expertise of somebody who has a well-honed ability to read people.

Geoffrey: I think the first choice is, could you put me in touch with that person? My first choice is to actually interact directly with a person. For business to business, I think that is the preferred choice and where all the systems are going to go. For business to consumer, it does not scale. There is some of it in B2C, but to the degree that it is not scalable, you must have algorithms that have essentially inferred the gut checks or abstracted them away. That of course leads to the cloud. This is only possible in a cloud-computing environment. You could not imagine doing this in a classically database-driven way.

When I learned about databases they were a very discrete thing. You had "goes-intos" and "goes-out-ofs." You had a very clear schema and that was the be-all and end-all. Obviously, those things are at the heart of systems of record, but they are not at the heart of Google and they aren't at the heart of Yahoo. The whole data-management paradigm around Hadoop is completely different from the relational table I grew up with. It is a completely different game and one that you could not play inside a datacenter - you couldn't afford to play it inside a datacenter. So I think a lot of this is reinforcing itself and people are saying, 'What is cloud computing about?' This is a big part of what cloud computing is about.
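To make the contrast concrete, here is a minimal sketch of the same aggregation expressed in the relational mindset versus a map/reduce-style computation (illustrative Python only; Hadoop itself shards this work across many machines and much larger, looser data):

    from collections import defaultdict

    # Relational mindset: a fixed schema and a declarative query, e.g.
    #   SELECT term, COUNT(*) FROM search_log GROUP BY term;

    # Map/reduce mindset: write map() and reduce() functions and let the
    # framework spread the raw records across the cluster.
    def map_phase(record):
        for term in record["query"].split():
            yield term, 1

    def reduce_phase(pairs):
        totals = defaultdict(int)
        for term, count in pairs:
            totals[term] += count
        return dict(totals)

    log = [{"query": "cheap flights"}, {"query": "cheap hotels"}]
    pairs = [kv for record in log for kv in map_phase(record)]
    print(reduce_phase(pairs))   # {'cheap': 2, 'flights': 1, 'hotels': 1}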

Ed: It really exposes the behavior of the interactions of the enterprise to all of the stakeholders in an electronic way. Before you had these personal networks or interlinked relationships and now you have this ability to potentially harvest the electronic trail of breadcrumbs that follows people around. The question is, who is in a good position to sniff out these breadcrumbs and figure out really how to structure the next wave of services?

Geoffrey: So I think that people well positioned are those who live at nexuses. Google is in a good position. So is Yahoo, so are eBay and Amazon for a different type of nexus. Akamai is well positioned for yet a different type of nexus. You do not want to get on the wrong side of the social contract.

Facebook is playing this game of chicken with the world to try and figure out how far it can go. In a business-to-business setting, that's not something you want to do. In a business-to-business world, you want to measure twice and cut once. Mark, Sheryl and their crew aren't doing this. In their world, they have to be careful, because every time they do this they put their brand at risk. However, I think in general the consumer world is the place to run these experiments.

Ed: The paradigm shift that Facebook has established is significant and the risks they are playing with relate to the sensitivity of consumers.

Geoffrey: Yes, how much can they push that envelope?

Ed: Consumer applications are far less sticky than enterprise applications because of dramatically lower switching costs and lots of alternatives. But the idea that you can have this collaborative stream of consciousness that . . .

Geoffrey: Yea, it’s a stream of collective consciousness.

Ed: Exactly, Twitter fits that definition perfectly. The ability to collect this stream of consciousness and filter it allows you to tease out the data that is most relevant. It seems that this is one of the biggest challenges going forward: how do you find the right types of filters or curators for these massive streams of information?

Geoffrey: So part of what's going on is a real war for talent around algorithm jocks. The quants used to go to Wall Street, and now they are going to go to Main Street. This is because the next generation of economic invention, though there are many venues for it, includes global ecommerce as one venue.

Google has taught us that this game must be played at a remarkable scale. Their success is in large part because they figured out the level of scale before anybody else. We follow in their footsteps in that regard and probably rightfully so. The guys in the security agencies have been trying to play a version of this game for some time. And their nexus might be the cellular traffic, capturing it off the airwaves. I think each sort of pool of information will have its own venue that it will be collected from.

Ed: I guess you have these domains of information and there are those who have the skills to parse information and turn data into insight.

Geoffrey: Yea, all the things we used to say.

Ed: It’s the data-warehouse paradigm being replayed on a much broader scale with unstructured and all different types of data. It all does come down to the specific problem that is being solved. Ultimately these become applications. The tools are there and the problem you solve is where the value is created.

Geoffrey: In any technology adoption life cycle, the technology precedes the application. If you are in venture capital, you are a solution looking for a problem, because you never existed before. The monetization doesn't get exciting until the problem shows up, because the problem brings the money.

Ed: It is the proverbial hammer looking for the nail.

Geoffrey: Except it is like you are a stapler looking for a staple and nobody knew you even had to have staples.

Ed: I mean that's obviously the bet that you and your colleagues are focused on because you have to look five, seven, 10 years down the road. Interestingly, now we have all these enabling technologies and services like the massive pool of available open-source software that has never been . . .

Geoffrey: Totally a game changer.

Ed: What happens to the incumbents in the enterprise world that have created the systems of record? Are we going to see a disruption in the stack similar to what happened in the transition from the mainframe to the client-server world?

Geoffrey: We think there is a huge transition. In our view, the entire stack is getting reengineered. The old stack could absorb any one of these changes; however, the fact is that they are all changing, and at some point we are going to have a new stack. If you start at the top, we have the Windows-based desktop, but we are adding the mobile world, and that is either an extension or a new design center. That's the question: whether you stay with the old stack or go with the new stack. Is the mobile client an extension of the desktop or is it a new design center? For the US, it's probably an extension, and for China and India it's probably a new design center.

Transaction applications are transitioning into interactive applications. You can interact with a transaction application, that’s what OLTP is, right? But that’s not what we mean. We believe there will be a much more community-oriented, communication-oriented relationship. There will be real-time analytics as opposed to business intelligence. The business-intelligence guys say we do analytics. They do, but that’s not the analytics you and I have been talking about. What we are talking about is real-time analytics that can determine you are in the market for a car and can recommend a Ford. So it isn’t about SharePoint as a document-storage device, it’s about Outlook or SharePoint plus whatever can do video conferencing on Skype, etc.

Ed: The collaboration platform, the gathering place.

Geoffrey: We keep going. It is not databases, it's data grids. It is this private/public cloud environment. Now, everything is becoming virtual; it is becoming virtual systems management. We are going to see the cloud displace the datacenter over time - not this decade, but this decade will certainly be a decade of public/private cloud hybrids.

I would feel very odd if I were a CIO and I went to my board of directors and asked for more money to build a new datacenter. I think I would get my head handed to me. The good news is that virtualization has freed up a lot of space, so I can actually grow my apps quite a bit in my existing datacenter. I have to believe that if I want to grow them past this extra space, I have to grow them outside. I have to find a way to have them hosted, or maybe my composite apps never get put in a datacenter; they are born in the cloud. Then how do I manage security and all the things you worry about? How do we make them work? There are answers, but they aren't perfect at this time. I can remember when nobody would ever put client information on a UNIX computer; it didn't have SNA. Where is IBM's SNA networking today? This is where it feels like it's heading, at least from this neck of the woods.

Ed: Michael Tiemann was talking about applications being a momentary collection of resources.

Geoffrey: Yes, that is basically the vision of services oriented architecture. I think at the atomic level everything is a Lego. Whether you are making the Leaning Tower of Pisa or the Empire State Building, they can all be made out of Legos. It does reach a level of complexity where the application should be a persistent application, because it’s too much to reassemble.

Ed: It gets much more complex as you start to orchestrate a number of services. This idea of a higher level of logic is going to give rise to the application world becoming fragmented and verticalized, analogous to what we are seeing in the iPhone world with over 170,000 apps. This will have implications for potential investors. Occasionally a hit will emerge from these creations, but the addressable opportunity of these applications will be narrow and focused. This may give rise to what I call the mom-and-pop application renaissance: the corner-store application shop that addresses your very, very specific problem. We can have a Geoffrey app and we can have an Ed app. Interestingly, the value may ultimately come from the platform providers - the people providing the tools or, to your point, the algorithms that provide the intelligence.

Geoffrey: There is another thing about the app-store model that hasn't quite sunk in yet but will. If you are an IT manager and you need to deploy a client, perhaps your future client is deployed in an app store. The application becomes the client in your future cloud-server architecture. This will be another example of trying to take the friction-free successes of the consumer space and co-opt them in support of a more productive enterprise use of IT.

Ed: The explosive adoption of the iPad over the last three to four months is anecdotal evidence that the iPad has caused this paradigm shift in how users expect their mobile experience to give them access to resources anywhere in a very intuitive interface.

Geoffrey: One of the ironies of this is that Nokia is looking for a new CEO. The irony may be that the iPad makes the smartphone obsolete and then what people are going to want to have is a great phone again. You aren’t going to make a phone call on an iPad. Well, you might if it’s videoconferencing, but other than that you might want a dedicated phone.

Ed: Well, maybe one that makes calls without dropping them.

Geoffrey: The iPhone is a horrible phone, but for a lot of people it is an indispensable device.

Ed: The potential role of the carriers in edge computing is also pretty intriguing. For instance, the role Akamai has played in delivering content to endpoints and enabling applications in the future.

Geoffrey: It is interesting. I think it was actually last quarter, not this quarter, that was the first quarter in which Akamai's value-added enterprise services exceeded its content-delivery services in revenue. Even with the rise of HDTV and enormous traffic increases on the media side, this shift is a reflection of two things that both indicate the cloud is increasing in importance. The first service is called dynamic site acceleration, and it is used by retailers to keep the response time to a consumer down to seconds, because of the data we have about consumers losing their attention span. The second is called application-performance services, which lowers the response times of client-server applications across the internet to acceptable user limits.

Those two services, running client-server apps over the public internet and accelerating retail site performance to consumers, exceeded all of the movie, TV, and software delivery business for the first time ever last quarter. What it says is the cloud is becoming the highway vehicle of choice. Obviously it is still nascent, most enterprise apps are delivered over a dedicated VPN, but you can see where it is heading.

Ed: What is intriguing here is whether there is an opportunity for those that provide the pipes and the bandwidth to add intelligence or value or whether we start to see more and more pure specialization. On one hand, you have fully integrated platforms like Blackberry, which have served the enterprise well. On the other, you have this relatively fragmented ecosystem around Android which is fairly carrier neutral.

Some clear battle lines are being drawn that may result in the combination of delivery infrastructure, content as well as technology around the edge device. Ultimately I think those become less relevant and the applications themselves become the most important. It seems like we are working through so many of these initial problems trying to . . .

Geoffrey: It is interesting this time around that the industry is so self-aware of platform power. Any time anybody starts to develop platform power, the industry immediately tries to compensate for it. If Google starts to get platform power, the industry adjusts to try and curtail Google's power. If Apple starts to develop platform power - and by the way, the record guys didn't get out of the way fast enough, so they are under Apple's thumb for the foreseeable future - the industry adjusts to try and curtail Apple's power. You can feel for the app guys now.

This whole thing with Adobe - and now RIM is going to go 100% behind Adobe - a lot of this behavior is about an ecosystem being hypersensitive to shifts of power and playing chess three moves ahead of the other guy. So from the point of view of an industry watcher, this is particularly interesting. A lot of the strategies we proposed in the 1990s, like Crossing the Chasm, Inside the Tornado and The Gorilla Game, are now like openings in chess. You know, that's the King's Indian, or that's the whatever. I don't play chess that way, but the industries are already trying to psych out the next move.

Lew Moorman, Rackspace

Lew Moorman is instrumental in driving strategic planning, product development and new business initiatives for Rackspace. He joined the company in April 2000 and has served in a variety of strategy and marketing roles throughout the company's growth. Before joining Rackspace, Moorman held several positions at the management-consulting firm McKinsey & Company, advising high-technology clients on critical strategic issues.

As Rackspace Hosting's chief strategy officer, Lew drives strategic planning, product development and new business initiatives across the company. He also serves as president of Rackspace's cloud business, leading the company's fastest-growing business unit. He speaks frequently at industry events on cloud computing, hosting and the rapidly evolving world of IT.

Lew received a BA from Duke University and a JD from Stanford Law School.

The impact of cloud computing on the IT organization

Lew Moorman's work at Rackspace has helped the company stake out a position at the forefront of driving practical adoption of cloud-computing services for businesses and organizations of all sizes. Our conversation focused on the transformational impact of cloud computing on the IT organization, the evolution of the cloud-services business landscape and the potential growth opportunities for new types of services. Rackspace's OpenStack initiative is providing a framework to allow for more open standards, which should lower barriers to adoption over time.

Key points

Cloud computing is a very different model and a consequential shift in mindset. Employees will be able to figure out the best way to solve their problems and create a community by which they can share. Ideas can go viral, so everyone's work can get better.

IT departments no longer hold the keys to the kingdom. They will evolve into the role of creating frameworks and mechanisms that enable innovation.

There are potential risks from cloud computing enhancing "silos." When people outside IT departments can do so much on their own, this can lead to a lack of control across the organization.

CIOs and CTOs will need to become more embedded within the business. Business acumen and a need to be much more in sync with what’s going on in the business and how to enhance it are required as well.

The vast majority of IT systems that will exist in companies will be “productized” systems that are repeatable with some ability to customize.

Cloud computing will not be homogenous or “one size fits all.” There will be many clouds that specialize in providing given types of services (HIPAA compliant clouds, trading clouds, low cost compute clouds, etc).

Developer talent will continue to be in great demand and we will need more people to help us be creative with data. There will be a huge need for new schools to teach software development.

Summary of interview on 6 August 2010.

Full transcript follows

IT departments will evolve into the role of creating frameworks that enable innovation

Cloud computing will not be homogenous or “one size fits all”

Lew Moorman transcript

Lew: The way that we at Rackspace think about cloud computing is that it represents a massive increase in the supply of computing power. It's the latest stage in a transition that has happened from the original mainframe days, through the minicomputer, through the desktop. What we've had is a gradual democratization of access to this very powerful thing called computing.

In the mainframe era, the smartest guys at the largest companies could perform some detailed analysis or accounting on the mainframe. Then Microsoft said, ‘Let’s put a PC on every desktop,’ and everyone had a lot more power to boost productivity.

What the web has done is to take us to the next level. There are laws of accelerating returns on these technologies. Going from mainframe to mini computer was a very small advance. As you started to get into the PC, it became a transformative advance. I think this next step to cloud computing is another exponential leap.

With cloud computing, we now have ubiquitous computing. Not only does everyone have computing power at their fingertips, they have the power of a datacenter at their fingertips. They have the ability to manipulate and access all types of data, to connect with and do things with that data, to create and store new data.

We are just starting to understand the potential of cloud computing. It is just starting to transform companies and individuals’ lives.

The web has been around for a while, so almost everyone has had access to the web. But not everyone has had access to the computing that powers the web. Now, this computing is extremely cheap as well as widely accessible.

You see this in the changes that are starting to hit IT departments. One of the ways we built our business at Rackspace is that we had corporate departments that would come to us in the early days - primarily marketing departments and entrepreneurial divisions - who wanted to do things on the web and couldn’t get those things done through their IT departments. They went out and contracted with us, got the computing power they needed, got the project done and started to innovate.

Now anyone can go online with a credit card, spin up cloud servers, and get an application or a SaaS application. This is happening all over departments within companies. It is very exciting and is creating a tremendous amount of innovation. It is making companies move faster. It is empowering individuals. It is making companies more effective and more efficient.
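As a hedged sketch of what "spin up cloud servers with a credit card" looks like in practice, here is an outline using the Apache libcloud library; the credentials are placeholders and parameter details vary by provider and library version.

    from libcloud.compute.types import Provider
    from libcloud.compute.providers import get_driver

    # Authenticate against a cloud provider (placeholder credentials).
    Driver = get_driver(Provider.RACKSPACE)
    conn = Driver("my_username", "my_api_key")

    # Pick an OS image and a server size, then create the server.
    image = conn.list_images()[0]
    size = conn.list_sizes()[0]
    node = conn.create_node(name="dept-web-01", image=image, size=size)
    print(node)   # minutes later: a live server, no datacenter required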

I would also say it is creating problems, and in some cases creating chaos. Cloud computing probably enhances silos within companies, because now people outside of IT departments can do so many things on their own. As silos start to crop up, there is a lack of control over what is going on. What happens to the data, and to security? All of these dynamics have to be responded to and adjusted to.

More and more companies are responding by embracing the cloud model. They are realizing that, yes, there is some downside, but it is still an opportunity. How do you empower individuals to take advantage of these tools responsibly and not put the company’s data and intellectual property at risk?

Ed: You mentioned an interesting point, which is a change in the role of internal IT, because of this new ability to access dynamic and very powerful computing power. How do you see that changing the role of the CIO and the roles that IT has traditionally performed within organizations?

Lew: I think there is a real change. It’s a move from centralized to decentralized. Even 10 years ago, for any project or application, the whole client/server model depended on a central department that installed software and that was always accessed from client machines.

The IT departments held the keys to the kingdom. Nothing could be done in the enterprise without going to IT. They built software, they ran software, they controlled and administered it - they managed everything. IT departments became these enormous organizations and drove a huge amount of productivity.

When you start to think about a cloud model, the client/server paradigm breaks down and you have individuals operating beyond the control of IT. This is changing the organization. I think IT has to assume the role of putting secure guardrails and parameters in place, while creating mechanisms that enable innovation to happen.

IT also assumes the responsibility to set up vendor relationships and the enterprise architecture. IT becomes an authority that sets a framework by which the organization can innovate. That is a very different role from saying, ‘Here is how you do your job.’

I will give you a good example we have internally at Rackspace. We have built custom software that we use to deliver customer service. Exceptional service is what we built our company on. We have all sorts of employees who are building little add-ons, whether they are browser add-ons or whether they are using APIs to get data to change their workload. We are creating a way that employees can determine 'here is the way I want to use the data so I can better serve the customer.' This is a much more efficient model. This lets employees figure out the best way to solve their problems and create a community by which they can share ideas, which then go viral so everyone's work can get better.

This stands in stark contrast to IT in the ivory tower saying, ‘We think if we implement this change in workflow, everyone will work better.’ The cloud approach is a very different model and requires a shift in mindset.

Ed: The interesting takeaway here is that the skill set of a CIO is going to incorporate governance, risk management and policy goals for the broader organization rather than being a strictly technical job.

Lew: That is one piece of it. One of the dangers of IT is over the last 10 years it has become too much of a “keep the lights on” type of organization. “Make sure my PC is patched and the systems work and they are up and running.” Everyone came to expect not a lot of innovation from IT. I think CIOs and CTOs need to become more embedded within the business, to help advance the business. There is now business acumen required of them, and a need to be much more in sync with what’s going on in the business and how to enhance it.

Ed: You made a great point about IT that 70% to 80% of IT budgets are about keeping the lights on. It would seem that one of the great promises of the cloud is that all of a sudden you start to free up these resources for innovation and break down the barriers so that smaller companies have access to much more powerful compute capabilities and analytic capabilities. This ability to unleash innovation could potentially create some significant power shifts within the organizations that are able to embrace the cloud quickly and innovate, versus older companies that may be stuck in maintaining systems that have been established, and where there may be some organizational issues that pose obstacles to adopting cloud computing.

Lew: There is no question about that. In most companies of some size today, there are turf wars between traditional IT and the developers for the organization. By the way, some of those developers might be sales or support people who are developing at night time because they know how to put subscriptions together. They are working to help change the company. They’re not just waiting for IT.

That is the issue: they don't have to wait for IT anymore. As this is happening, the smart IT departments are figuring it out and saying, 'Hey, instead of putting walls around everything, how do we put frameworks in place so that innovation can be encouraged, even as we keep the essential controls around security and so forth?'

Ed: One other interesting phenomenon related to the rise of cloud infrastructure as a service is the new companies that are popping up that are building application services using Rackspace or Amazon EC2. In the past, there would have been significant investment involved to buy the infrastructure and bandwidth to deliver these services. I would love to get your perspective from some of the customers or developers you are working with that may be exploring some of the new business models that we could see emerge over the next 10 years.

Lew: There is no question that the costs to build a new company, especially a technology company that is a service delivered over the web, are extraordinarily low compared to where they used to be. This is driving the innovation model to another level. It has changed the model for venture capital. It has changed the way companies are formed. It is changing so much that is going on in the world and it is creating many promising new businesses.

Just think of all the two-person companies making real money on iPhone apps. That is a phenomenon of the cloud. If you think about traditional IT and the examples I gave about our company, it is also a factor in the number of applications running to help people do their jobs. It has grown exponentially.

It used to be you had email, the ERP system and maybe a HR system. Today there are hundreds and hundreds of applications in a company the size of Rackspace being used to make things happen. Some are sanctioned and some are not, but people are building applications like crazy because they can, and they have access. This is driving innovation at big companies as well.

Ed: That is a great point. One question I have is about the democratization of cloud computing. Rackspace has introduced the Open Stack initiative. It seems like that initiative has the potential to get us beyond the threat of vendor lock-in, which has been a real concern for companies that are thinking of adopting cloud computing. Open Stack seems like a way to create standards that will facilitate greater distribution or democracy of resources.

Lew: We are excited about Open Stack, and we've been pleased by the enthusiasm it has generated among our growing stable of partner companies, as well as among developers and customers. There are a couple of things going on. One is the belief in a different kind of business model. We think that operating software and putting service around software is going to be a major part of the change in the way software businesses are run in the future. For us, there is not much value in the core software. There is value in making the core software do things for people - in delivering exceptional customer service. That is how we are going to get paid and how we have gotten paid. That is why we decided to open-source our cloud, and it is a great model. It is going to get people behind it.

Two, we think there need to be a lot of clouds around the world. The idea of having agile computing that is standards-based will accelerate this movement, and accelerate innovation around it. It will be great for us and good for others too.

There are going to be a number of winners in this space. The idea that there need to be hundreds of different ways to deploy clouds doesn’t benefit anybody. Clouds will be building blocks - having tons of different proprietary models will slow the pace. Consider the x86 standard and how it drove the first era of web-based computing. The fundamental building block going forward will be the cloud and an open standard will benefit users of technology. Open Stack gives people a strong set of tools they can use to deploy the technology themselves, as well as a number of service providers who are committed to it. We are committed.

You can find the right public cloud out there to use to fit your needs. This is a shared development model. That is another factor that will lower costs. These efforts will drive innovation and drive the pace of adoption.
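
To make the 'spin up servers yourself' idea concrete, here is a minimal sketch of provisioning a server against an OpenStack-style compute API. It is illustrative only, not Rackspace's or the project's definitive interface: the endpoint URL, tenant, token, image and flavor identifiers are all placeholders.

```python
# Hypothetical sketch: provisioning a server against an OpenStack-style
# compute API. Endpoint, token, image and flavor IDs are placeholders.
import json
import urllib.request

COMPUTE_ENDPOINT = "https://compute.example.com/v2/TENANT_ID"  # placeholder
AUTH_TOKEN = "TOKEN_FROM_IDENTITY_SERVICE"                     # placeholder

def boot_server(name, image_ref, flavor_ref):
    """Request a new server; returns the API's JSON response."""
    body = json.dumps({"server": {"name": name,
                                  "imageRef": image_ref,
                                  "flavorRef": flavor_ref}}).encode()
    req = urllib.request.Request(
        COMPUTE_ENDPOINT + "/servers",
        data=body,
        headers={"X-Auth-Token": AUTH_TOKEN,
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example: a marketing team provisioning capacity without waiting for IT.
# server = boot_server("campaign-web-01", image_ref="IMAGE_ID", flavor_ref="2")
```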

Ed: How do the big hardware or infrastructure vendors stay competitive over the longer term? It seems that with this evolution of products and technology into services, there are forces of commoditization bearing on the server vendors and other pure hardware vendors. This creates challenges for vendors that may be suppliers to you. What strategies could they implement to stay relevant in a world where the cloud-service providers really become that front face of IT?

Lew: If you ask one of the hardware vendors they would say (because they say it on their sales calls and in their advertising) ‘services, services, services.’ Services really matter. They realize that a server or a switch itself is just an input. They’re commodities that we use as inputs to run our cars or whatever. What is interesting about the hardware vendors is that they have moved into complex system integrator roles and custom project services. That is a model that will exist for a long time.

But the vast majority of IT systems that will exist in companies will be productized systems that are repeatable with some ability to customize. That is where we think there is a new sweet spot. Delivering hosted computing or software is a way to power a big percentage of the IT that happens going forward.

There will still be lots of custom projects to figure out how to tie all the pieces together, how to create compliance, and create custom applications. We prefer the productized model. It is a different business.

Traditional software vendors struggled in the transition to software as a service; it is hard to name even one that made that transition well. It is difficult to go from being a box maker and box seller to a productized computing-as-a-service provider. This is a very different model and a very different way of thinking about things.

Ed: It is, and there seem to be different hurdles to success. Scale is important if you are working in an environment where SLAs are important. What will be the characteristics of the winning cloud-service providers 10 years in the future after you have seen some evolution of both the hardware and software vendors above and below in this transformation? What is it going to take to be a winner in this new model where Rackspace is squarely positioned?

Lew: It is a good question. There are a couple of things that are going to drive success. One is that there are obviously scale impacts that are very positive. They go beyond just getting to low cost. There is brand awareness, trust, and all of those characteristics that come with scale that really matter. I think this is going to be an enormous market.

I also think there is not going to be a “one cloud fits all” approach for every type of IT. Amazon has done a really good job with its Elastic Compute Cloud with extensive, low-cost computing. That is their sweet spot. I think they will expand to other services as well. It has created a lot of opportunity.

We are more of a high-service, high-touch compute service provider. We help customers with production and mission-critical applications. That is just a different focus area. Many observers put Rackspace and Amazon in the same bucket. There is some overlap, but you will see that we are focused on one thing while they are focused on another.

I think people will come out with low-latency clouds for securities-trading companies, and super HIPAA-compliant medical-records clouds. There will be a lot of opportunities in specialized niches. It is about being focused and having the scale and brand power to execute as a point of difference. We focus on service, others are going to focus on cost, others will focus on capabilities, and those who execute really well will find markets.

Ed: That is a fascinating point. It ties back to your earlier comments around being able to offer these uniquely tailored and differentiated clouds that may deal with security or data-privacy or regulatory-compliance issues. In certain subscription-based businesses, one of the key characteristics of success has been this focus on customer satisfaction. The successful companies are changing the dynamics of the relationship between provider and the customer because the net present value of all of those renewals is really the value of the customer. Sustaining that cycle becomes far more of a priority than in a traditional product business.

Lew: I think that it is true. There is no question that if you look at churn rates and renewal rates in recurring revenue businesses, these are the major economic drivers. I think people also have to set the right expectations. We set a higher expectation for customer service and we have to exceed that. I think Amazon sets a very good expectation around a low-cost, automated “take it as it is, here is exactly how you can expect it to perform” model and they have done a great job. Salesforce.com has its own set of expectations.
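
Ed's observation that the net present value of renewals is effectively the value of the customer can be made concrete with simple arithmetic. The sketch below uses invented figures for revenue, margin, churn and the discount rate (nothing here is Rackspace data); the point is that cutting churn compounds into a much larger discounted stream of renewals.

```python
# Back-of-the-envelope customer value: discounted sum of expected renewals.
# All inputs are illustrative assumptions, not company data.
def customer_npv(annual_revenue, gross_margin, annual_churn, discount_rate, years=10):
    """Expected NPV of a subscriber over `years`, given a constant churn rate."""
    npv = 0.0
    survival = 1.0  # probability the customer is still subscribed
    for year in range(1, years + 1):
        survival *= (1.0 - annual_churn)
        cash_flow = annual_revenue * gross_margin * survival
        npv += cash_flow / (1.0 + discount_rate) ** year
    return npv

# Lower churn compounds into materially higher customer value:
print(customer_npv(12_000, 0.70, annual_churn=0.20, discount_rate=0.10))  # ~ 21,500
print(customer_npv(12_000, 0.70, annual_churn=0.05, discount_rate=0.10))  # ~ 40,900
```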

I think success is about matching and exceeding expectations. Customers want different things. They don’t always want a high-service experience. Sometimes that is not appropriate. Sometimes they want an automated, low-touch experience. Acquiring customers is expensive, so once you have them, you want to get a perfect fit and make them happier than they hoped to be.

Ed: If you look at the spread of the model and the expansion of your business across the globe, what are your thoughts on the different dynamics of adoption in emerging markets? What impact could cloud computing have on economic development and development of the IT industry beyond the early adopter markets of the US and Western Europe - in China, India and other emerging markets?

Lew: I haven't spent a lot of time in the Asian markets, but my perception is that we in America all fear being leapfrogged. Like a startup, a lot of these emerging economies are open to moving to these new models of computing very quickly. Because there is such great power in these new models, rapid adoption in developing markets presents a risk to mature markets that are slower to adopt cloud computing.

How quickly America and American business respond will certainly matter. At Rackspace, our Hong Kong operation is young. We don't have all of our products there yet, but we plan to get them there over the next year and the demand we are seeing is good. We wish we had more resources to move more quickly because we know there is demand.

There is pull-through and the appetite is great. If you think of a lot of emerging economies, they skipped land-line phones and moved right to mobile. Because of this, they developed mobile applications much faster than we did. Those dynamics can present risks, and some of that could apply here in the case of cloud computing.

Ed: Let’s talk about other challenges and risks over the next decade. You mentioned the risk of fragmentation or silos within the organization. Certainly with data, as you start to deal with federating or distributing data, you have governance, consistency, and master data-management issues. Where might there be opportunities to create new value and what are some of the friction points that you see that could be obstacles?

Lew: There are a couple of areas that come to mind. One is just a skill-set observation. I think developer talent will continue to be in great demand. Developers basically build software that organizes data to perform tasks for you. With the proliferation of data, we need more people to help us be creative with data. I would love to have a national set of schools that help people learn development skills. It would be tremendous.

We could train people to help with development to earn really good livelihoods in this country. It is one of these new economy skill sets that will be in great demand for a long time.

It is ironic that this week Google Wave got cancelled and basically deleted from the Google product portfolio. This is interesting because communication and collaboration within companies is very poor. I don't think this issue has been cracked and I think there is opportunity here. The more distributed IT becomes, the more people create their own little universes inside companies.

It gets more interesting to align people to things. The issue of how companies communicate and collaborate is still something to be cracked. The answer is not any of the solutions that are out there today. It is not email as it exists today. I don’t know what it is exactly, but I suspect the answers will work in much the same way that Facebook has solved these issues in our personal lives, or Twitter has solved them in terms of what is happening in the world at a given moment.

Ed: Salesforce.com launched a product called Chatter that is a first run at this.

Lew: Yes, they are smart. They are hitting on something that is a real need. Obviously, they learned from what Twitter did because it is a similar interface. Collaborating and getting messages out across large groups is very difficult. Particularly when we are all so bombarded with information, it is difficult to break through. We struggle with this all the time at Rackspace, to communicate with employees and get them aligned.

Ed: Projecting out 10 years, what do you imagine could be some of the other significant types of applications, or different types of customers and uses, that Rackspace and others could be powering, where we are only seeing the potential right now? Are there any types of applications or industries where there is some opportunity for unusual or extraordinary change?

Lew: There is no question there are industries and markets that have yet to go through even the first phases of technology transformation. I still think that media, and particularly television, is in an early phase of transformation. You are starting to see this with Hulu and YouTube, but there is a ton of disruption that is going to happen.

I also expect massive proliferation of personal-data sources and computing in our lives - whether it is a personal camera to protect our home, uploading how much we weigh when we get on the scale, or tracking how much water we are consuming. There is so much data that could be valuable in our personal lives that we don't have any way to access or record. I think these types of use cases are very early.

With that data, a lot of interesting applications could emerge. We are going to start to see this much more, in terms of really understanding what people call the Internet of Things. How do we track them and learn from them and use them in our lives? We are just starting to see this and I think it could be fascinating.

Sanjay Poonen, SAP

Sanjay J Poonen is senior vice president and general manager of Performance Optimization Applications at SAP BusinessObjects. Previously, Sanjay served as vice president of Line of Business Operations at Symantec. Reporting to the president and vice chairman, he played a key role in the merger integration of VERITAS and Symantec. Before the merger, Sanjay was vice president of Strategic Operations at VERITAS, with sales responsibility for strategic OEM accounts, and drove new platform deals with hardware partners like HP, IBM and Sun. Prior to joining VERITAS, Sanjay served as an executive officer at Informatica, first as vice president and general manager of the Analytics Business Unit, then as senior vice president of Worldwide Marketing, establishing Informatica's leadership in the data-integration market. Sanjay was a member of the founder's circle at Alphablox (acquired by IBM) and worked at Apple Computer and Taligent (acquired by IBM). He began his career as a software engineer at Microsoft, working on early forms of Microsoft Exchange.

Sanjay holds a master’s degree in business administration from Harvard Business School in Massachusetts, where he graduated as a Baker Scholar; a master’s degree in management science and engineering from Stanford University, in California; and a bachelor's degree in computer science and math and engineering from Dartmouth College in New Hampshire, where he graduated summa cum laude and Phi Beta Kappa.

Universal analytics - Leveraging data everywhere

Our conversation with Sanjay Poonen of SAP focused on the trend of embedded analytics. Sanjay has a distinguished tenure in the field of analytics, and his views are as insightful as they are forward-looking. We discussed the impact of the new social-computing paradigm on the enterprise-software industry, and the implications of embedded analytics in an increasingly connected world.

Key points

The coalescence of the pervasiveness of connected devices and social-collaboration constructs will require the development of new platforms by companies such as Oracle and SAP.

Algorithms, including those used in predictive analytics, are complex and require specialized personnel to create and manage. However, the algorithms will evolve to be more distributed and more transparent, manifesting in nimbler applications.

These intelligent systems will have voluminous data on which to base intelligent decisions and will form the basis for new applications, particularly for data-intensive industries such as consumer-packaged goods (CPG) and financials.

Convergence is a trend that will continue, but it is unlikely any single company will specialize from the top of the stack all the way down to the chips.

Summary of interview on 20 August 2010.

Full transcript follows

Distributed and transparent algorithms will result in nimbler applications

It is unlikely any single company will vertically integrate through the entire stack

Sanjay Poonen transcript

Sanjay: First of all, as we look at the world in 2020 there is a very different set of applications that are much more consumer-like and a very different way in which people interact with their systems than the business software of the 80s, maybe even the 90s and 2000s. It is going to be a lot more consumer-like because there are very fundamental shifts going on in the market. Mobile is the new desktop.

In much the same way, I grew up in India and was a computer-science engineer for most of my background. We skipped the entire mainframe generation there; we were taught client/server types of technologies - C++, C, Java, maybe even Pascal in the early parts of it. That is what this sort of Gen X, and even the baby-boomer generation that learned mainframes in that part of the world, coded to. We think the Gen-Y population that is going to really be in place in 2020 will have an even different paradigm, because they will be developing more consumer-like applications for mobile and disconnected, unwired types of devices. It means those applications need to have a very different look-and-feel interaction model, more nimbleness, quicker development time frames, and availability on channels like an app store.

That contrasts with the traditional buildout - taking three to four years - of the kind of software a large enterprise-software or even consumer-software company such as Microsoft, us or Oracle develops. We have to start thinking now about both the user experience and the type of platforms that will be in play for those applications that will start getting dominant in the next five to 10 years. The user experience will be on the web and more social media-like. These applications will have a lot more collaboration and connectivity, without necessarily having access to all the bowels of an enterprise-software system. You will start to have much more of the social-media types of applications influencing business software. The biggest confluence area where it will show up first is collaboration. Analytics will be just a very embedded part, and we will cover that in the second trend, which is these intelligent systems.

The analytics trend is one where, if you look at some of the most popular consumer websites, things like intelligence and predictive capabilities are buried deep in the back end. It shows up in websites like Netflix, Amazon and Pandora, where the intelligence lets you buy the next best book you would like, or the next best movie, or the next best song - three popular things people buy today: books, movies and songs. Intelligence built into that means your experience endears you to that site and creates stickiness. Those are the types of user applications. I think you will also find more nimble types of applications that can automatically be downloaded from places like an app store. App stores will form around business-software companies. For example, in our world we are putting a lot of emphasis around an eco-hub, with two-million-plus subscribers, and that becomes a conduit for a very different generation of people who might buy software from us on a service marketplace.
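
The 'next best book, movie or song' behavior Sanjay describes is typically built on collaborative filtering. The sketch below is a deliberately tiny illustration of the idea (item-to-item co-purchase counts over made-up data); production systems at the companies he names are far more sophisticated.

```python
# Toy item-to-item collaborative filtering: recommend items that co-occur
# with what a user already bought. Purely illustrative data and logic.
from collections import defaultdict
from itertools import combinations

purchases = {
    "alice": {"book_a", "book_b", "album_x"},
    "bob":   {"book_a", "movie_m"},
    "carol": {"book_b", "album_x", "movie_m"},
}

# Count how often each pair of items is bought by the same person.
co_counts = defaultdict(int)
for items in purchases.values():
    for a, b in combinations(sorted(items), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(user, top_n=3):
    """Rank items the user does not own by co-purchase counts with items they do."""
    owned = purchases[user]
    scores = defaultdict(int)
    for item in owned:
        for (a, b), n in co_counts.items():
            if a == item and b not in owned:
                scores[b] += n
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("bob"))  # e.g. ['album_x', 'book_b'] on this toy data
```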

The second core thing is the aspect of what you called intelligent systems, beyond the way analytics plays out in the consumer-like fashion of the two or three websites I used as examples. First, casual users can experience hardcore or complex analytics. The algorithms behind an Amazon or a Netflix may be very sophisticated, but they are manifested to the user in a very easy-to-use fashion. Behind the scenes, we believe predictive analytics really powers many of those applications in a way that means you don't have to be a PhD. Today, the entire predictive-analytical application space still requires a statistician - a PhD type of skill set - and is used only by those types of users. Those capabilities and algorithms will remain and evolve, but they will be manifested in much more user-intelligent applications. They can go anywhere from the web to your handheld device to your car. There is more software in your car today than in a 1970s spacecraft. You will find that even the devices we drive, maybe even the devices that control our metering and utilities in our homes, get significantly more intelligent and help us make better decisions. It might help us drive faster or better avoid traffic in the car, or it may help us with energy management in our homes for all the electrical or gas-driven devices that we have.

In those intelligent systems, the processing and analyzing of that data and the predictive capabilities get embedded behind the scenes and are manifested in simple applications. Beyond the predictive capabilities, these intelligent systems will have voluminous data-handling capacity, because much of what ends up being processed today in disk-based structures can be handled in in-memory structures, which may be either physical RAM or flash. Both are disruptive technologies to disk-based analytical solutions, and the price of both relative to disk has come down rapidly. They could power anything from the desktop, to smart meters, to certainly datacenters where you can process voluminous data and make decisions. Then, by vertical, the industries that built their reputations on voluminous amounts of data - retail, CPG, financial services, utilities, to some extent even healthcare with healthcare records, and maybe even the public sector - those five or six information-intensive industries will be the first consumers of these devices, appliances and software that run in form factors allowing operations that took seconds to take milliseconds, and operations that took minutes to take seconds. We think we will have a 100x factor of performance that will help us there.
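
The seconds-to-milliseconds claim can be sanity-checked with generic, order-of-magnitude latency figures. The numbers below are textbook approximations rather than SAP benchmarks, and the raw latency ratio they produce is far larger than 100x; end-to-end product gains are smaller once sequential I/O, caching and software overhead are accounted for.

```python
# Order-of-magnitude check: an analytic operation dominated by random reads.
# Latency figures are generic approximations, not vendor benchmarks.
DISK_SEEK_S = 10e-3    # ~10 ms per random disk read
RAM_ACCESS_S = 100e-9  # ~100 ns per random memory access

lookups = 1_000        # random lookups needed by one query (illustrative)

disk_time = lookups * DISK_SEEK_S   # ~10 seconds
ram_time = lookups * RAM_ACCESS_S   # ~0.1 milliseconds

print(f"disk: {disk_time:.0f} s, memory: {ram_time * 1000:.1f} ms, "
      f"raw latency ratio: {disk_time / ram_time:,.0f}x")
```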

Third, of course, there is your trend around the ways in which parts and elements of the stack start converging. While the convergence does allow elements of this to come at a lower cost of ownership, one has to remember that there will always be certain providers of technology that will never be totally vertically integrated. For example, there is a semiconductor business, a motherboard business and a chip business that are dominated by a few key players. I don't see one player going all the way down from services to the chip, because there are always going to be ways that that is done better, faster and cheaper by an Intel, an AMD, a set of Japanese providers, and so on. In elements of the software stack, we are certainly starting to see that convergence can help things happen faster, better and cheaper, and certainly that is happening as we speak.

That is happening in applications; that is happening in elements of middleware; that is happening in elements of the database stack. It is also happening in security, as you can see with the recent moves by some of the hardware vendors like Intel into security. However, what we feel will continue to matter, not just today but over the next 10 years, is that the rigidity of that stack needs to allow openness for elements that an ecosystem can do better than a particular vendor, and complete customer choice if they decide that, for a particular component of the stack, they are better served going with a best-of-breed type of player.

To the extent you can allow optimization of components within the stack that customers expect to get together in a converged stack, while at the same time allowing modularity - the flexibility for people to plug out and plug in new components at particular layers of that stack - there is significant opportunity. If you go all the way from chip to hardware to storage to servers to database, the services layer at the very top of that stack - the one that brings a lot of smiles to the customer - is always going to be the one with enormous flexibility.

Who would have predicted, looking 10-15 years ago, that the Cognizants, the Wipros and the TCSs of the world would be as dominant as they are - or, likewise, some of the more nimble services plays that may never become 5-, 10-, 15- or 20-billion-dollar businesses but can create 100-million-dollar businesses? There is going to be a very large proliferation of service providers that don't necessarily become captive to one stack.

Those would be my preliminary comments on the way in which we view the world in 2020 along those three aspects.

Ed: It is a great point you make about the way analytics get embedded into ecommerce systems around books, movies and songs through consumer websites. That is pretty transparent. What is intriguing is this idea of being able to have more personalized experiences. As you mentioned, analytics has been largely the domain of specialists - a very specialized type of capability - but as we look to empower those who don't necessarily have this expertise, how might this play out?

Sanjay: I think you have to look at every segment, from the consumer back to business by industry, and ask yourself where are the places their life could be made significantly better through the better use of data that manifests itself in analytical decisions. Let's start with the consumer. Clearly, in the consumer-buying experience - whether it is the buying experience in the ecommerce world of books, movies and songs, or other areas - embedded analytics is certainly going to make the buying experience more sticky. And if you look at the share of a home's disposable income that goes to things involving energy management, it is absolutely necessary for the utilities to provide more analytical data that intelligently decides when and where you do things that you don't have to do at a certain time.

For example, to the extent that you can decide when to run your washer and dryer, because you have the flexibility to do that, you can take advantage of flexible rates in utility bills. That is something that is already starting to pervade the market and you will see it become almost dominant - perhaps the flexibility to automatically start a wash/dry cycle based on the lowest rate, or to potentially lower your most expensive use of power for things like air-conditioning, and so on - energy efficiency in general in the home.
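
Choosing when to run a deferrable load against time-of-use rates is a small optimization once the rate schedule is visible to the device. A minimal sketch, assuming the utility publishes hourly prices; the rate table and appliance figures below are invented.

```python
# Pick the cheapest contiguous window for a deferrable load, given
# hourly time-of-use rates. Rates and load figures are illustrative.
hourly_rate = [0.08] * 7 + [0.15] * 10 + [0.30] * 4 + [0.10] * 3  # $/kWh, 24 hours
cycle_hours = 2           # washer + dryer run time
cycle_kwh_per_hour = 1.5  # average draw while running

def cheapest_start(rates, duration):
    """Return (start_hour, cost) of the cheapest window of `duration` hours."""
    best = None
    for start in range(len(rates) - duration + 1):
        cost = sum(rates[start:start + duration]) * cycle_kwh_per_hour
        if best is None or cost < best[1]:
            best = (start, cost)
    return best

start, cost = cheapest_start(hourly_rate, cycle_hours)
print(f"start at {start}:00 for about ${cost:.2f}")  # off-peak, early morning
```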

In all the other aspects of how a consumer operates, driving is probably one of the other experiences - traffic congestion and so on. We are already starting to see intelligence built into systems that allow the planet, or the world, to start to run more efficiently. Everything from the way traffic in congested areas could be routed to various different places, to ways you could drive more efficiently - all of this will have analytics behind it.

Then if you go beyond this to the business world - I started with the consumer because we have to look at these analytical systems first from the standpoint of the consumer, which allows us to build simple systems, rather than looking at it from the standpoint of the business. From the business side, I think you have to look at the industries that are most information-intensive, where the power of information allows them to hold on to their customers more tightly and sell product more effectively.

Service-centric industries care about their customers, and to the extent that they have a better and quicker way to upsell, cross-sell, and maintain customer profitability and loyalty, analytics at the core of that allows them to do it. Certainly financial services have been doing this for a while, but their systems are getting simpler, easier and less manual. There are many other service-centric industries. I think healthcare becomes a key area, where things like patient records and identifying how you can more efficiently serve a patient have huge opportunity for analytics.

For the product-centric companies, the manufacturing-centric companies, the equation is different: certainly you do care about your customers, but it is about the profitability of your operations and product costs - knowing how to source materials better. Today most modern big companies, whether automobile, high tech or manufacturing, are dealing with tens of thousands of suppliers, and every one of them is dealing with companies in emerging countries - Brazil, China, India and so forth; I mean, China is no longer emerging, it is the second-largest economy. To the extent you are able to analyze a supplier's potential risk and understand that this may be the right time not just to look at a supplier's low cost but also at their risk - I might be willing to move off a supplier to a more expensive supplier that is less risky - you are dealing with an analytical decision in the context of procurement. Manufacturing companies have to deal with everything from sourcing, all the way through manufacturing, to delivery - all very analytical decisions. Many of those decisions, not just because companies have a large number of suppliers but also because of fast constructs like in-memory computing, can now be handled at a level of complexity one could never deal with before.

Let me give you another example. Imagine a CPG company could, in real time, decide the profitability of a campaign down to the SKU level - let's just say Nestle chocolate or Kraft, down to the food product they are selling, or Procter and Gamble, down to the various different.... That process, which is typically called sales and operations planning (S&OP), could be done in a much faster fashion, tied back to the trade-promotion management systems, allowing them to do a P&L analysis down to the SKU, allocate things differently, play with different scenarios, and run campaigns in real time. Then you connect the manufacturing company to a retailer and, with the combination of mobile and some of these analytics, as a consumer I go into a store and have a promotion for a particular product sent to my mobile device. Behind the scenes, the retailer has connected my profile to something that would appeal to me, connected all the way to the point-of-sale system, and using my phone - from where I got the promotion to the point of purchase where I am buying this at Safeway or wherever - you have built more loyalty.
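
At its simplest, the SKU-level campaign P&L Sanjay describes is incremental margin minus promotion spend, computed fast enough to reallocate mid-campaign. A toy version with invented figures:

```python
# Toy SKU-level trade-promotion P&L: incremental margin minus promotion spend.
# All figures are invented for illustration.
skus = [
    # (sku, baseline_units, promo_units, price, unit_cost, promo_spend)
    ("choc_bar_100g", 10_000, 16_000, 1.20, 0.70, 2_500),
    ("choc_box_500g",  2_000,  2_300, 6.00, 4.10, 1_800),
]

def campaign_pnl(rows):
    """Per-SKU incremental profit from a promotion, to decide reallocation."""
    report = {}
    for sku, base, promo, price, cost, spend in rows:
        incremental_units = promo - base
        incremental_margin = incremental_units * (price - cost)
        report[sku] = incremental_margin - spend
    return report

for sku, profit in campaign_pnl(skus).items():
    print(f"{sku}: {'keep' if profit > 0 else 'reallocate'} (net {profit:+.0f})")
```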

You can see these all the way from procurement to a retailer; you can build a much more intelligent system that is handling analytics behind the scenes but is connecting live with the consumer with their handheld device. That is an example of a scenario that we think is starting to happen now but it is going to become mainstream certainly in the 2015-2020 time frame.

Ed: That ties into another theme, the idea of mass customization, where you have these predictive or analytic capabilities throughout the entire value-process chain, enabling a level of control at the end-user's point that there has not been before. What do you think are the big problems or big hurdles to overcome to realize this vision?

Sanjay: I think you are absolutely right. In the 1980s and 1990s, these concepts appeared in books. There were books written about 1-to-1 marketing and about customized orders for an individual. The reality is that over the course of the last decade, much of the systems, software and processes have allowed these things to start to become a reality. They will absolutely become a reality, and the people who wrote those books will finally realize their dreams. It is absolutely right that that trend is happening. The 'internet of me' - I want things my way - is really going to be something that is pragmatically possible.

The hardware and a lot of the things happening at the lower levels of the stack are converging, so you don't have to have as many components; you can have a lot more potentially on a motherboard. A good example: when we worked on our high-performance analytical clients, Intel built us a very high-performance motherboard that is the basis of that. HP and IBM then took that and created software with system-software management and so forth - an appliance that has a lot more in it. Ten to 15 years ago, that series of components would probably have cost more, had a much higher footprint, taken a lot more space in a datacenter and been harder to cobble together.

The convergence by hardware vendors that allows software vendors to do what they do well is going to continue. More and more intelligence and more and more power happen at the processor level. Quad-processor and multicore development all continue, so that Moore's Law keeps giving us more computation power in the processor. In-memory capability close to that processor - RAM or flash memory - allows us to do things with voluminous amounts of data, so those devices come in a much more converged, physically smaller state in a datacenter than they ever would have.

One of the key barriers there is obviously energy efficiency: as more of these computationally heavy solutions get deployed in the datacenter, cooling and power consumption become significant concerns. I know hardware vendors are constantly looking for ways to cut the power consumption of these devices that have more computational capability, so that they don't dissipate as much heat or require huge amounts of cooling.

Secondly, if you move up the stack into the software world, it requires businesses - companies like us - to constantly ask ourselves where we can reduce cost of ownership where today people have to do things in a manual fashion through services. That is not a threat to the services companies; it just means they do higher-value things. An ERP system in the 1970s was all custom code, and today there is software like ours - we created that industry and lead it. Analytical software, even today, is a lot of spreadsheets; the more of that that can be automated, the more costs are saved. As you move up the stack to the services layer, the services vendors get more intelligent at doing things that add value to a business. They become more industry-specific specialists, and the folks doing the body-shop work get very efficient at doing things in low-cost locations that may not be just India and China.

Finally, on the device side as you get to the consumer, one of the key things that I think needs to happen is battery life. This is a problem across the board: the most powerful devices - much like the most powerful appliances in databases - shouldn't have to be charged every six or 12 hours; the battery life should let them last much longer. I think there are going to be significant innovations that allow that to happen in the next 10 years.

Ed: That is certainly a linear and iterative process, and we have the exponential improvements in processing power all working in our favor. The one final question I would ask is this: as we envision this increasingly networked and converged world, the amount of data is going to get bigger, and there are new constructs for managing data that is arriving in larger volumes and is more distributed. From the standpoint of a user or a business trying to make sense of these massive blooms - algae blooms of data, as it were - the risk is how we maintain discipline, keep this data from choking off clarity of purpose, and still find the data that is important. We have had some discussions about the idea of data curators, as content on the information side has gone from being centralized - through networks and newspapers - to the internet, where it is a bit of a Wild West. There has been some thought that we are going to see editors, curators and central repositories of content become more important over time and integrate themselves into business processes as content-enabled, value-added services embedded into the supply, value and process chain. What are your thoughts about these things?

Sanjay: I think you are absolutely right. Think of what the Economist called the "data deluge" in its February issue. The human brain is certainly capable of handling a lot: if you look at us as human beings, we can multitask and have intelligent things going on, and we have a way of processing priorities. You have everything from a mother who can multitask with kids to a CEO who can multitask across various different priorities, with a human brain that is capable of doing that. Increasingly, systems that enable that to happen, automated behind the scenes, are going to be a fact of life.

If you take some of the examples that you talked about in the world of unstructured data - we have talked a lot about analytics, which is structured data, but we haven't talked much about unstructured data - clearly this is becoming more and more solved. The expertise by which a search-based solution can intelligently determine what you want to see has revolutionized search-based advertising. Google has made a huge business as a result and is continuing to refine it. That same type of value-add allows me and you and all of us business professionals to filter through which elements of content we want to read and which we don't. That is only going to get better, so that people are able to hone in on the unstructured content - news feeds, whatever it is - that they care about.

Beyond that, I think the aggregation of content from a variety of different syndication sources will allow some kind of federated way by which it can be shared. Let me give you an example. Dun & Bradstreet and all these guys have started doing this, but the model is still fairly expensive on a price-per-record basis. I think you are going to see a much greater proliferation of syndicated content, where you get a best-practice set of customers who all say, 'I want to pool my information here.' As a result, I don't necessarily need to know your data, but benchmarking myself against the average of 50 allows me to know how good I am. In public markets that happens automatically, because we disclose revenue and appropriate metrics about our business on a quarterly basis, and folks like you who are research analysts can do a market basket. Imagine if you could get that same level of discipline around everything from buying behaviors to a variety of other things that happen in CPG and retail. Today companies try to do it, but I think there is going to be a significant wave in which that content can be purchased and distributed, maybe even almost down to free levels. There is going to be a new age of content providers - the AC Nielsens and Dun & Bradstreets of the future - that deal with this in a very different, internet-connected world.
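
The pooled-benchmarking idea - knowing how you compare with the average of a peer group without seeing anyone's raw records - can be sketched in a few lines. In practice a trusted intermediary such as the syndicators Sanjay mentions would do the aggregation; the metric and figures below are invented.

```python
# Toy pooled benchmarking: each participant submits only an aggregate,
# and gets back the peer average. All figures are invented.
peer_submissions = {            # metric: e.g. days sales outstanding
    "company_a": 42.0,
    "company_b": 55.0,
    "company_c": 38.0,
    # ... up to the "average of 50" in the example
}

def benchmark(own_name, submissions):
    """Compare one participant against the average of the others."""
    others = [v for k, v in submissions.items() if k != own_name]
    peer_avg = sum(others) / len(others)
    return submissions[own_name], peer_avg

mine, peers = benchmark("company_a", peer_submissions)
print(f"us: {mine:.1f}, peer average: {peers:.1f}")  # us: 42.0, peer average: 46.5
```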

Keith Schaefer, BPL Global

Keith Schaefer, Co-founder, Chief Executive Officer and President of BPL Global, Ltd., plans, directs and manages all aspects of the company's strategy, business development, operations, strategic partnerships and financing.

Over the past 24 years, Keith has held senior management positions with companies ranging from hyper-growth start-ups to Fortune 100s and a Global 500 company. He has co-founded or led more than ten start-ups and two buyouts, and was instrumental in completing two initial public offerings and nine trade sales.

Mr. Schaefer earned a Bachelor of Science degree from The University of Pittsburgh.

The smart grid emerges from theory

Since 1982, growth in peak demand for electricity - driven by population growth and consumer electronics - has exceeded transmission growth by almost 25% every year. In contrast, R&D spending in electric utilities is among the lowest of any industry. Additionally, the industry experiences large spikes in peak demand, making it even more difficult for utilities to predict the appropriate amount of supply. These problems illustrate clearly why smart-grid technology has attracted far more interest recently.

In theory, a smart grid delivers electricity from suppliers to consumers more efficiently, supported by an intelligent monitoring system and renewable distributed energy resources in the distribution grid. The smart grid is becoming more plausible with today's availability of connected devices over pervasive communication networks. For a smart-grid system to be effective, sensing, measurement and control devices must be deployed across the grid, with a communications system that brings the data back to the utility operations center, where software-based analysis and control systems automate the management of the grid. This connected network would allow dynamic control of the entire electricity-provisioning process by utilities and a significant smoothing of the demand curve, reducing peak demand and allowing a more stable and consistent electricity production and delivery process.
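
At its core, the architecture described above is a monitoring-and-control loop: field sensors report load back to the operations center, and software decides what to dispatch in response. The sketch below is a deliberately simplified illustration; the feeder names, threshold and demand-response action are hypothetical.

```python
# Simplified smart-grid control loop: read feeder load from sensors,
# and shed deferrable load when a peak threshold is crossed.
# Device names, thresholds and actions are hypothetical.
PEAK_THRESHOLD_MW = 9.0

def read_feeder_load():
    """Stand-in for telemetry arriving from grid sensors (MW per feeder)."""
    return {"feeder_12": 8.4, "feeder_17": 9.6}

def shed_deferrable_load(feeder, target_mw):
    """Stand-in for a demand-response command sent back over the network."""
    print(f"{feeder}: requesting {target_mw:.1f} MW of demand response")

def control_cycle():
    for feeder, load_mw in read_feeder_load().items():
        if load_mw > PEAK_THRESHOLD_MW:
            shed_deferrable_load(feeder, load_mw - PEAK_THRESHOLD_MW)

control_cycle()  # feeder_17: requesting 0.6 MW of demand response
```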

Figure: The smarter grid (Source: Wikipedia)

Embedded sensors and intelligence breed smarter grids

Regulated utilities have historically been disincentivized to invest in R&D

New regulatory regimes and technologies have resulted in new investment opportunities

The grid gains intelligence

As the CEO of BPL Global, a smart-grid software and solutions company, Keith Schaefer has focused on the promise of the intelligent grid. Our conversation focused on the drivers, obstacles and enabling factors for smart-grid technology, both for the home and for commercial buildings. In our view, the smart grid is at its core powered by a software system connected to sensors throughout the grid - over time this will drive evolution of the grid to an energy internet, incorporating different types of sources and inputs.

Key points

The adoption of smart-grid technologies will usher in an era of smarter, connected utility grids and improve energy efficiency. Consumers benefit from smarter energy use and grid operators gain efficiencies in managing their respective infrastructure.

Standards-based smart-grid systems are increasingly being integrated into the existing infrastructure. As an intermediary between the digital back office and analog distribution systems, the smart grid adds actionable intelligence.

Governments are regulating utilities to change the composition of their energy assets and production, requiring renewable energy resources and a focus on efficiency.

Both governments and industry groups are beginning to agree on technology standards, which are necessary to avoid the risks and costs associated with new technology adoption.

New opportunities for innovation will be possible in areas of measuring and optimizing energy use, collecting and analyzing data from sensors, as well as integrating management of distributed energy resources.

The estimates from leading consulting groups provide a compelling glimpse of how dramatic the change in carbon emissions could be if smart-grid adoption occurs - ranging from 7.5% to 50%+ depending on the level of adoption globally. Ultimately, it will be a several-trillion-dollar market opportunity.

Both operators and consumers of services stand to benefit from smart-grid adoption

Summary of interview on 3 August 2010.

Full transcript follows

The smart-grid market opportunity may reach into the trillions of dollars

Keith Schaefer transcript

Keith: Collectively, we all agree that the grid is not intelligent today but will become very smart over the next nine to 10 years. The reason we feel so strongly about this is that demand for energy on a global basis is growing. Depending on whose research you look at, whether it is Boston Consulting Group or McKinsey, roughly 30% growth in electricity consumption is projected over the next decade. However, the ability to create new sources of energy to keep up with that demand is falling far short. For example, in China, if nothing changes, GDP will be affected in a negative way because China won't be able to build generation and distribution infrastructure fast enough to keep up with demand. Therefore, a disruptive and transformative event is taking place on a global basis. Utilities, governments and consumers are converging to what we call a tipping point. They are embracing a smart grid, an intelligent grid. Let me walk you through some of the drivers.

One key driver is what I just described - that demand is growing faster than the ability to cost effectively generate electricity.

Second, there are large initiatives around the globe requiring utilities to reduce carbon emissions for a cleaner environment. This is being led at regulatory and political levels across the EU and in forward-thinking states like California, Pennsylvania, New Jersey, Texas, Colorado and Ohio, to name a few. The mandate is for utilities to reduce their emissions for a cleaner environment to address the issue of global warming. Utilities are not just being asked; they are being regulated to change and penalized if they don't. The EU has mandated reductions of carbon emissions by 2020. The Chinese have volunteered to lead the global effort in reducing carbon emissions. At the US federal level, bills like cap and trade are being bandied around. A cap-and-trade bill probably won't pass this year, but certainly some form of the legislation will pass eventually. That is why you see so many utilities at the state level in California and Pennsylvania following state regulatory legislation that asks them for a reduction in energy consumption. In California I think the goal is 10% and in Pennsylvania it is 4.5%. This trend is rapidly crossing the country. And as I said, the EU and China are already there. Reducing carbon emissions is a driver.

Another driver is rising fuel costs. Think about our own country's dependence on oil supplied by foreign countries. These oil-exporting countries are sometimes unstable or aren't as friendly as we would like them to be. Think about the Middle East: why are we always in battles over there? Think about Venezuela, with a regime that is unfriendly to the US. There is increasing concern in the world about finding alternative sources of energy that are cleaner, safer, more affordable, and not dependent on the parts of the world where there is instability.

A fourth driver is consumer behaviour. We are in a consumer world today that is becoming greener, while demand for energy is going up because of the increased use of everything and a growing population. Consumers are being influenced to be greener in the educational system. It is happening when you see public-service announcements about being greener, cleaner and more ecologically aware. If you look at a certain population, say 25 and younger - people that will drive consumer behaviour over the next 10 years - they are demanding a cleaner and greener environment. More importantly, they are willing to change behaviour to be green, from recycling or buying a different kind of a house or condominium, to living in sustainable communities. I don't know if you have been reading recently that more people are involved in co-ops. They have a common garden or a common area that is more energy efficient. That is happening on a global basis. Quite frankly, America is not leading that effort; the Europeans are far ahead. We are trying to catch up.

A very important driver is technology innovation. Basically, electric distribution hasn't changed for 100 years. Up until now the utility's "customer" has been the regulator that approves rate increases - not the consumer that buys electricity. When you talk to executives in the industry, at least in North America, they are thinking more along the lines of public utility commissions (PUCs) or boards of public utilities (BPUs) and the regulatory environment. That is how they get their money, through rate increases. This singular focus is changing because now, with decoupling, there are multiple companies competing for the consumer. Now, all of a sudden, utilities have to be consumer aware. This change in focus will catalyze an infusion of new technology in the industry.

Another driver is an influx of really high-tech, young, savvy workers. They want to be able to manage their environments from their desktop, iPhone, or a remote location. That reminds me of cleaner cars. Electric cars are coming in small numbers now but will arrive in significant numbers in 10 years and will have a major impact on utilities. High-tech workers will be open to options about when they charge their car batteries and when they allow a utility to use their car battery as a distributed resource. A smart grid will affect how tech-savvy workers use electricity at work or at home. They want a smart dashboard, if you will, to be able to control the heat or air in their apartment or house. They want to be smarter about appliances. You see a lot of appliances that are becoming smart appliances. People are becoming aware of when it is okay to use energy and when it is not: when it is on peak, it isn't, and when it is off peak, it is. Managing energy consumption at this level of granularity is something we could only dream about a few years ago. Smart energy is moving from pilot to commercial-scale deployments and will become pervasive over the next decade.

The use of energy storage in combination with distributed renewables is an emerging trend. Today about 2% of electric generation comes from renewables, but the goal is to get to as much as 20%. Some say a more conservative goal of 15% from renewable energy is more realistic. With the proliferation of solar on rooftops in areas that have extensive sunlight, like the deserts of California and Arizona, or wind farms in areas with high wind velocity, like Colorado, Alaska and New England, a need for energy storage is developing. The key to broad market adoption is making the renewable solution as reliable and cost effective as coal or hydro. Renewables can really benefit when combined with energy storage. I think there is going to be a rush from VCs, R&D, and university labs into energy storage. When the wind comes in the middle of the night, the energy could be stored in a large battery, as ice on the rooftop, or in flywheels. That stored energy can then be used in the middle of the day during peak demand. Utilities will not have to build another coal- or gas-fired power plant. Renewables and storage in combination with load management have the potential to save the utility money and significantly reduce the carbon footprint. Integrated management of distributed resources will help wind and solar become affordable and replace 15-20% of what I call dirty energy.
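
The economics of pairing overnight wind with storage reduce to simple arithmetic: energy banked at off-peak value is released when it displaces peak generation, less round-trip losses. A rough illustration with invented prices, volumes and efficiency:

```python
# Rough arithmetic for shifting off-peak wind into the peak via storage.
# All prices, volumes and efficiency figures are invented for illustration.
night_wind_mwh = 100          # energy stored overnight
round_trip_efficiency = 0.80  # battery/flywheel/ice losses
off_peak_value = 30           # $/MWh the energy is worth at night
on_peak_value = 120           # $/MWh of peak generation it displaces

delivered_mwh = night_wind_mwh * round_trip_efficiency
value_shifted = delivered_mwh * on_peak_value - night_wind_mwh * off_peak_value
print(f"delivered {delivered_mwh:.0f} MWh, net value of shifting: ${value_shifted:,.0f}")
# delivered 80 MWh, net value of shifting: $6,600
```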

If you think about the problem - demand growing faster than generation - three things are changing that will create a multitrillion-dollar smart grid industry.

One, governments are rallying around a common belief and desire to do something about our carbon footprint and global warming. What has not yet been agreed to is how we get there. There are strong opinions in North America that cap and trade would kill jobs. We have to be smarter and it is going to require time, 2011 or 2012, to get it right. We will eventually partner with the Europeans, Indians and Chinese to have a common global agreement. If you layer down inside the US, for example, state by state the regulatory environment is changing and forcing this transformative, or what I would call disruptive, event to build a smart grid.

Second, utilities are now saying that they need an intelligent grid that can allow them to monitor, manage, and operate more efficiently from inside substations all the way through customer premises. Utilities are looking at connecting everything from smart meters and home area networks to distribution automation for energy efficiency built around enterprise software, much like the internet.

Third, new players are working with utilities to achieve this objective, whether it is Google, Microsoft, Cisco or somebody in Silicon Valley thinking about the newest way to create a home area network (HAN). Some company will capture our imagination like Google did for search engines and Microsoft for computing. There is money being poured into the smart grid industry by investors. When the consumer sees the benefits - that they can be green, save on energy costs, and reduce dependence on foreign oil - it will empower them to embrace smart grid technology. We see statistics that as many as 40% of consumers are ready to follow now. The remaining 60% are waiting to be educated and there will be an education process over the next decade.

I believe there will be branded services to help people manage everything from their washer, to their dryer, to their refrigerator, to their thermostat for cooling or heating, to their battery-operated car, to more energy-efficient lighting. I think this can save the average consumer anywhere from 10 to 20% on electric and water bills even though the cost of a kilowatt hour or the rate of a cubic foot of water will go up. There will be such a saving from energy efficiency that the savings will be greater than the rate increases.
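To see how the savings can outrun rate increases, here is a rough worked example with assumed numbers (they are illustrative only, not figures from the interview): if the rate per kilowatt hour rises 10% while efficiency measures cut consumption by 25%, the bill still falls:

\[
\text{new bill} = (1 + 0.10)\times(1 - 0.25)\times\text{old bill} \approx 0.83\times\text{old bill},
\]

a net saving of roughly 17%, within the 10-20% range cited above.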

It is a win-win-win. The win will be for the utility because it will have increased rates. The win will be for the consumer because they are playing in an energy-efficient world and lowering their monthly bill. Perhaps most importantly, the win will be for the environment because of reduced carbon emissions. Big think tanks are coming in and the numbers are pretty amazing. Boston Consulting Group and McKinsey are the two I am going to quote. Boston Consulting Group thinks that global energy-related carbon emissions could be reduced by 7.5% almost immediately. McKinsey thinks we can cut carbon emissions in half without reducing the benefits of energy that end users enjoy. That is a pretty wide range. If it is 7.5%, it is a several-trillion-dollar global market. And if it is 50%, it becomes transformative, disruptive, and a game changer.

Ed: It is interesting because we have had a lot of focus on the production of energy, but to your point, you really need to tie it to consumption. We have never really had truly focused efforts to help optimize that. In some commercial buildings there is some automation to control lighting, heating and other systems, but the broader extension of this ties into some other thoughts about having these internet-connected devices. Craig Mundie has called it ‘the internet of things’, where you are able to apply layers of software intelligence and optimization. That is a really compelling vision.

Keith: You are right and I think that is why BPL Global is in the sweet spot of the smart grid industry. As you know, we are global, not just domestic. If America is only 25% of the opportunity, how can you really solve a global problem if you just build products for North America? BPL Global builds an enterprise-class software platform called Power SG®. This product is slowly, but incrementally, becoming the de facto standard in the industry as a platform that manages connected devices, whether it is a sensor, controller, monitor, or a smart meter. We happen to make sensors, monitors and controllers, but work exceptionally well with smart meters from Landis+Gyr, Elster, or Echelon. Our solution reaches across the meter, managing HVAC systems and leveraging home area networks.

Our software ties all of the information from connected devices together using analytics for optimization. Software is the intelligent part of the grid. Hypothetically, it is 2:00 in the afternoon and 98 degrees in New Jersey. The humidity is at 90%. Electric demand exceeds generation and distribution capacity and there is going to be an outage. What do you do? You create a load-reduction event; you shed load. With BPL Global software and sensors at FirstEnergy, we can immediately do that and the utility avoids the major problem of an outage. With an outage the utility loses revenue because there is no electricity being used and consumers are disrupted. In some parts of India we see 7 to 10 outages a day. It is hard to become a competitive nation in the global economy if you are having multiple disruptive events every day. The smart grid will improve system reliability in addition to energy efficiency.
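To make the mechanics concrete, below is a minimal sketch of the kind of load-reduction logic described here; the function name, the load names and the megawatt figures are hypothetical illustrations, not BPL Global's Power SG software or FirstEnergy's actual rules.

# Hypothetical demand-response sketch: shed interruptible load when forecast
# demand exceeds available capacity. Names and numbers are illustrative only.

def plan_load_reduction(forecast_demand_mw, capacity_mw, interruptible_loads):
    """Return the loads to curtail, largest first, to close the gap."""
    gap_mw = forecast_demand_mw - capacity_mw
    if gap_mw <= 0:
        return []                      # no load-reduction event needed
    shed, remaining = [], gap_mw
    for name, mw in sorted(interruptible_loads.items(),
                           key=lambda kv: kv[1], reverse=True):
        if remaining <= 0:
            break
        shed.append(name)
        remaining -= mw
    return shed

# Example: a hot afternoon where demand outruns capacity by 40MW.
loads = {"office HVAC setback": 25, "ice storage discharge": 20,
         "industrial pumps": 15}
print(plan_load_reduction(1040, 1000, loads))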

Our solution integrating software, communications and smart devices is our differentiator. We have created a platform that is open architected, Java based, and that works with anyone’s smart grid application. We integrate with the legacy systems of the utility, whether it is SAP or Oracle. We work with all of the big iron companies, whether it is Siemens, ABB, Areva, or GE. We also have our own smart grid applications from substations through customer premises. Our platform is focused on integrating the smart grid within a utility and connecting to customer loads. We have chosen not to participate in home area networks. It remains to be seen who will be the leaders in that category. I think Google, Microsoft, and Cisco have a play in the HAN. Those are the big names. The big telecommunications companies would also be a natural in this area - AT&T, Verizon, Vodafone, or China Telecom. Solutions that garner the imagination of the consumer could even come from somebody that is 25 years old from Berkeley or Stanford, maybe Pitt or Carnegie Mellon. It’s the entrepreneur with a plan that is totally different that catches us all by surprise, just like Google did with search.

Ed: Well that is the amazing thing about the pace of innovation over the last decade. Ray Kurzweil made this point that it took books 400 years to reach a quarter of the population. It took the telephone 50 years. It took cell phones about 5 years and it took Facebook 3 years. With the adoption of the iPad and tablet computing just taking off at that exponential rate, you have these shifts that seem to be more impactful and happening more quickly. I would really agree with you in that regard.

Keith: Standards are critically important to the pace of technology adoption just like all of the business drivers we talked about. Without standards, the smart grid will only live up to half of its promise. If you think about it, why did the cell phone business take off? Standards. Why did the computing business take off? Standards. We have to have standards. There are good people in the U.S. Department of Energy working on smart grid standards. We have our own lobbyists and investors working to affect these standards. The Obama administration and Secretary Chu are very well aware of the importance of standards in accelerating the pace of smart grid adoption. I also believe the standards are not going to be geographic but global. It will be very important to align standards in the U.S., EU, China and India. Developing standards is an ongoing process. We should see substantial progress in 2011, if we are lucky, and by 2012 at the latest. We will certainly have well defined standards before 2020.

Ed: It seems that we have the basis for a lot of compelling innovations that will harness the power of optimization, collect all of this data about usage and potentially provide bi-directional instructions. What are some of the obstacles over the next several years? I think we alluded to standards, but you also have the role of the very large companies, because historically technology companies, and especially software companies, have constantly had to balance the appeal of a closed and proprietary system, which delivers a lot of value and certainly accrues a lot to the vendor, with the desire and need for openness, which is one of the factors that stimulates adoption.

Keith: I think the key factors for success, or those that could slow it down between now and 2020, are exactly what you said.

First and foremost, standards are a key to success.

Second, accelerating market adoption is very important. Because of the regulatory environment, utilities are notoriously slow to change and adapt. The significant level of market drivers makes the pace of change faster than ever before, but it still takes time.

The third factor is open versus closed systems. If the smart grid is a closed system, then we have failed. We are strong believers at BPL Global in open systems that collaborate across multiple smart grid applications from multiple vendors. The requirement for an open system is borne out again and again when we talk with utility executives. I have talked to probably 50 executives over 3 years on this topic and they want an open-architected platform. The smart grid platform must integrate seamlessly with a utility’s legacy systems, representing existing investments of millions, even hundreds of millions, of dollars. You see Current and GridPoint abandoning their closed systems and adopting open-architected platforms like ours at BPL Global. In addition to being open, systems must be secure and scalable. Security is going to be increasingly important. Before the smart grid, the worst thing that could happen in a utility grid was that an operator pulled a lever in a substation and two blocks went black. However, it is different if you can hack a system and shut down a city the size of New York. Scalability is still a question to be answered. Smart grid deployments have been demonstrated for 10,000 or 50,000 customers. We can even model millions of customers. But no one has deployed a fully integrated smart grid to millions or tens of millions of customers.

The fourth success factor is return on investment. Utilities must have a solid ROI to move forward with smart grid investments. My concern is the ROI for a meter alone is not very good. When you can tie smart grid investments to a rate case you can get a return on investment like we are doing at FirstEnergy. We have to be smart as an industry that we are selling something with a viable business case.

A fifth success factor is consumer education. Consumers play a vital role in this transformation. I think that they are learning, but to cross the chasm you are going to have to get broad consumer buy-in. They must see the value added for themselves. It will take massive awareness campaigns from big-brand companies to achieve broad-based consumer buy-in.

A final factor is adequate investment capital, venture capital to drive innovation. You see this with some of the big Cleantech funds but the industry would benefit from more access to capital.

Ed: The cost savings and the efficiencies that accrue to the utilities - avoiding incremental investment in spinning capacity, for instance - represent a far more efficient use of capital and should benefit the vendors that are taking that software approach.

Keith: I absolutely agree.

Ed: It’s exciting. One of the themes that we have been exploring in this project has been this opportunity for new types of innovation, new types of applications that are enabled by the availability of ubiquitous mobile access and now by the increasing connectedness of devices. What you need are these control platforms that allow you to build these applications and these interfaces on top of them. It is compelling when you combine this with location-based services: if you have a system that knows when you are getting within 5 miles of your house, it can turn the air-conditioning on.

Keith: Remote access, whether it is from your car, your laptop or your iPhone, will happen. Consumers will have easy tools to better manage their energy use. It may also come in the form of surprising kinds of services. For example, think about GE selling appliances and now having electronic access to what each appliance is doing. There could be routine computer-based diagnostics similar to when your car is serviced. All major appliances will always be “docked” and connected to consumers, manufacturers and utilities. Consumers will be able to manage their appliances for efficiency and lower energy costs. Manufacturers will gain new insights into their products, the problems they are having, the energy they are using and user behavior patterns. Utilities will be able to use appliances as distributed resources. The smart grid will generate a wealth of data. It will be totally different.

Stratton Sclavos, Radar Partners

Stratton Sclavos serves as partner of the investment firm Radar Partners. He formerly served as chairman and chief executive officer of VeriSign (Nasdaq: VRSN), the leading provider of trusted infrastructure services to web sites, enterprises, electronic commerce service providers and individuals.

Stratton also sits on the board of directors of Juniper Networks and Salesforce.com. He was recognized by the Silicon Valley Business Journal as the Entrepreneur of the Year in 1998 in the emerging companies category. He was vice president of worldwide marketing and sales for Taligent, a joint venture of Apple, IBM and Hewlett Packard. Stratton served as vice president of worldwide sales and business development for GO Corporation, a mobile computing company, from 1992 to 1993.

Stratton holds a bachelor's degree in electrical and computer engineering from the University of California at Davis.

How to invest when innovation requires less capital

The need for smaller investments in startups is driven by the lower costs firms incur in the launch process today. With cloud computing and the general commoditization of hardware, what was formerly a capital-intensive and arduous process of gathering and integrating hardware resources now only involves a credit-card fee for rental space on a virtual server like Amazon’s EC2 or Rackspace. Though not all of these new firms have ideas as inventive and groundbreaking as previous startups, the capital they require is simply not an amount justifiable for a large VC.
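As a present-day sketch of how little friction remains (it uses the boto3 library, which postdates this report; the AMI ID and instance type are placeholders, and it assumes an AWS account with credentials already configured), renting a virtual server is a single API call rather than a hardware procurement project:

# Illustrative only: launching a pay-as-you-go virtual server on EC2.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image
    InstanceType="t3.micro",           # small, hourly-billed instance
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])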

Micro venture-capital funds operate in the same way as larger traditional venture firms except the size and returns of their investments are smaller. They are traditionally focused on the earliest stages of the startups, typically making pre-seed or seed investments. Though relatively small in number, these early-stage micro VCs are making an impact and are growing in number.

Figure: Total venture capital software investments, 2001-09 - total funding (US$m) and number of deals (RHS). Source: Annual figures based on quarterly data from The MoneyTree™ Report by PricewaterhouseCoopers, the National Venture Capital Association, Thomson Reuters

Disruptive technologies such as cloud computing lower the barriers to innovation

Micro venture-capital funds are becoming more relevant

It’s possible that venture deal frequency will rise while deal size will fall

Content and intelligence

Stratton Sclavos is best known for his tenure at VeriSign, where his strategies sought to effect the convergence between internet, telecom and mobile infrastructure services. In his role at Radar Partners, Stratton is focusing on early-stage investments. Our conversation touched on new enhancements in content tagging for multimedia data, the ongoing power shifts from the center to the edge and back to the center of the network, and the increasing capital efficiencies of the VC industry.

Key points

Content from multimedia, such as video, will increasingly have rich metadata tagged to provide opportunities for advertising tie-ins to products shown. This will open up new opportunities for providers of content or news.

As information and entertainment sources become more fragmented, the role of editors will become more important for content. This mirrors a pattern seen in technology, which often moves in a sine wave. The processing, computing, and thinking power shift from the edge to center, edge to center. When the edge gets more capable you move more processing to the edge. Then as the problems get more complex, like video, you want to move it back to more centralized processing. This same phenomenon is playing out in the internet as media has fragmented into internet sources and blogs, and the role of editors will centralize more power again.

The development of the cloud has led to the democratization of innovation and access to resources. In turn, the pace of innovation and ability to bring it to market will accelerate as the size of the required capital investment declines. This may give rise to the micro VC. The VC market will bifurcate to smaller, nimbler VCs who invest in early stage startups and the larger, traditional VCs who invest in later-stage or more capital-intensive industries.

Mobile applications will be bigger than what happened in the late 1990s with the internet and the web browser, bigger than what happened with the first generation of mobile and the iPhone. They are just starting in demos, such as an iPad controlling your house and using smart-grid technology.

The social aspects of the phone, and the iPad in particular, are becoming the tools and the accelerators of work/life blend as opposed to obstacles to good communication and happy family situations.

Summary of interview on 21 July 2010.

Full transcript follows

Content curation will play an increasingly important role as the internet continues to grow

The cloud has democratized innovation and access to resources, quickening innovation

Stratton Sclavos transcript

Stratton: Things that should happen in one year tend to happen in ten but occur on a scale much grander than ever envisioned. We are definitely seeing that in terms of interactivity between mobile and broadcast and programming.

It’s really interesting to have teenagers. I have a 19-year-old and a 21-year-old and I am not sure my son has watched a network television show on the television in the last six months. He gets everything off of his laptop. The experience is much better for him. He can do it when he wants and he can do it where he wants. It is obviously an HD broadcast on a high-resolution screen, and to him that’s now the way he wants to watch programming.

My daughter is a Facebook nut. Getting her notifications not just on her laptop but over her mobile makes her happy every day. This immersive experience that people can now partake in, whether it’s over a screen in the living room or through a laptop or over mobile - both the entertainment and the interactivity parts of it - is just exploding more than when we first started talking about three screens and being able to start and stop a video. It is context and a notion of interactivity.

I just looked at a company that has a novel idea around putting ecommerce activity right into the broadcast signal in a way that you wouldn’t have to stop your broadcast experience. For example, you could bookmark it and then come back later and search that video clip and find what you bookmarked. It could be the shoes that Fergie is wearing or some kind of profile of the entertainer. All that type of stuff can be integrated into that type of immersive background experience.

Ed: Necessarily you would have to have a far more dynamic type of metadata generation and management that would be able to be applied to both user-generated content as well as content that comes from established providers.

Stratton: Right, and in this particular model the way you do it is you do have a tagging system. The tagging system is a post-processing activity, after the programming has been done. The goal is to get that down to one or two days, so after post-processing of a TV series or a movie, it could all be there, tagged for use anytime and sold to advertisers.

I was watching one of their demos here and it’s a live broadcast of a concert. Whatever the star happens to be wearing probably hasn’t been sponsored by a particular advertiser or product company yet. If you go to these people and say, ‘Hey, this entertainer is likely to be wearing your stuff and we can tag that. Would you like us to tag it and link it directly to your ecommerce site?’ It actually creates a much bigger ecommerce potential, not just what has been thought of pre-filming but also what can be added in post processing, depending on what other things are happening on the screen that could be tagged in a post-processing way.
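A rough sketch of what such post-processed tag metadata might look like follows; the structure, field names and URLs are illustrative assumptions, not the format used by the company being described.

# Illustrative only: time-coded tags added in post-production, each linking
# an on-screen item to an advertiser's ecommerce page.
video_tags = [
    {"start_sec": 312, "end_sec": 330, "item": "shoes",
     "description": "shoes worn by the performer",
     "link": "https://example-retailer.test/product/123"},
    {"start_sec": 845, "end_sec": 860, "item": "performer profile",
     "link": "https://example-network.test/artist-bio"},
]

def tags_at(timestamp_sec, tags=video_tags):
    """Tags the viewer would see if they tapped the screen at this moment."""
    return [t for t in tags if t["start_sec"] <= timestamp_sec <= t["end_sec"]]

print(tags_at(320))   # -> the shoes tag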

Ed: This opens up a number of interesting opportunities for the providers of content or news. Obviously, the traditional media companies, music companies, and entertainment models have been struggling to adapt their business models to a connected world.

Stratton: A good friend of mine is the President of ABC networks and he hates the Hulu model. He thinks those economics are horrible and that it cuts into the value of his stuff by allowing free distribution on that model. Now Hulu has gone to Hulu Plus, trying to create a subscription service. Those are hard models. I showed him this thing. You could say ABC probably has one of the best apps on the iPad right now. If on their iPad experience they used this new tagging system, he could actually sell advertising much more broadly and to a new group of folks because the content in the programming could be repurposed for ecommerce or for simple sponsorship. It is those new models where you think about how to bridge the old business model that is unsustainable in the new world with something in the new world that gives him more inventory or potentially new advertisers to sell to.

Ed: Having the intelligence about user behavior and ultimately the ability to almost micro target that message is going to be critical.

Stratton: That’s right. You are right when you say the tagging system itself is of a new breed and I am a big believer that you cannot interrupt the entertainment stream for the passive viewer. The real reason that you see advertising still working generally in a broadcast model is because people do not want to stop every time they see something, and they don’t want superfluous information going up on the screen trying to get them to buy something at the same time that they are enjoying entertainment.

This is why so many people TiVo shows and don’t start watching them until 22 minutes into the hour-long show. They know they can skip all the commercials if they wait. This gives you a way to enjoy the show, but if there is something in there that I want to target, I just click a button or tap on the screen and all of a sudden all of the tagged pieces of content are shown to me. I can also bookmark it and go back and say, ‘What were those shoes she was wearing?’ It is a simple idea but it is one of the more interesting ones I have seen lately about how you bring interactivity and marketing into these new models.

Ed: This sparked my thinking here because simultaneously you have this incredible fragmentation of what used to be the established delivery channels for content, entertainment, and information, for that matter. Now we are incredibly fragmented. In a way it’s incredibly democratic to start a blog and propagate a blog and information, but it also seems that there is increasing value in being a curator or filter or editor of a lot of this information. It is only going to increase and it just completely overwhelms the end user.

How do you bring sense out of this deluge of information?

Stratton: You are pointing out one of my beliefs around technology and information delivery. These things, for lack of a better term, go in sine waves. I used to talk about this when it was around DNS and whether the intelligence should be in the center of the network or out at the edge. Whether it’s mainframes to minis or PCs to client servers and now to networks, the power shifts.

The processing, computing, and thinking power shift from the edge to center, edge to center. When the edge gets more capable you move more processing and thinking there. Then as the problems get more complex, like video and the rest, you want to move it back to more centralized processing where you can throw much more intelligence and much more processing power at it. This editorial thing is exactly another one of these examples where information was tightly controlled by the publishers and by the broadcasters for a very, very long time in a private model. To access it you either had to own a TV or a subscription to a cable or satellite broadcaster. Then the internet comes along and the model goes completely the other way, where it is always open.

As you say, people can start a blog and you get this open society where anybody can publish anything and it appears to be from a professional publisher because it’s so easy to make it look that way. People have been deluged with this and are longing for editorial quality, insight, and a honing down of the actual channels of access and insights. Whether a sine wave is the right way to think about it I am not sure, but I am struck by the way we go from closed, to open, to tailored, time and time again in all of these technology paradigms.

Ed: There is an analogy when you talk about these enabling technologies that are democratizing innovation and access to resources. You are involved with Salesforce.com and you look at the vision of Force.com as well as what they are doing with Azure and Amazon Web Services.

In the old days you had to buy a bunch of servers, stick them in the garage, load them up with an OS, and write applications. Now, when you have an idea, the barriers to realizing those ideas have fallen away. All you really need is a credit card and an Amazon account to spin up an application.

When I think about value creation over the next ten years, the ability to create a scalable application is going to become somewhat challenged. There is this huge opportunity for value creation almost on a vertical basis and one of the emerging themes has been the idea of the “mom and pop application provider.” There will be incredible fragmentation but an almost completely personalizable experience for the end users, the small business, or even the larger businesses because of this. There are implications for where value ultimately is created for investors and where the business will emerge.

On the other hand there are 170,000, probably 200,000, applications in the iStore. Does that become a similar model in the enterprise? Does that make the SAPs and the Oracles adapt to a far more fragmented approach for dealing with customers?

Stratton: This is exactly the next area I wanted to dive into with you. There are a couple of things we have noticed over the last few years. I would say 80% of the entrepreneurs that come into our firm looking for seed capital or A-round capital already have an application up and running, if not hosted on Amazon, hosted somewhere like Rackspace or somewhere else.

The capital efficiency that can be applied to developing, introducing and then iterating on these new applications is phenomenal to us and we don’t think that is going to change. In fact, we think it is going to get more and more like that. The more interesting thing is that the capital efficiency for a venture firm trying to launch these new companies is much better. It makes firms like ours, which are smaller and more agile, without LPs to have to worry about, able to put US$250,000, US$500,000, or US$1,000,000 to work very easily across several different companies. In contrast, a Benchmark or an Accel really wants to spend US$5,000,000 or US$10,000,000. The efficiency of the entrepreneur is currently in conflict with the business models of the large funds.

Ed: You hit on another critical point. If you don’t need to spend more than a few hundred thousand dollars to seed a company, where are the opportunities for the big VCs in IT?

You can go into highly capital-intensive areas like clean tech and manufacturing but maybe that model becomes outmoded. Perhaps the micro VC model becomes more prevalent, like the TechStars or Y Combinators of the world, where you have these small investments but try to get economies of scale and offer expertise, cultivation, and mentoring.

Stratton: There are two ways this model can go. The first is that the large VCs will invest in the cloud-infrastructure companies. Those will still be capital intensive; in fact, many of those will be more capital intensive because you have to build it “before they come.”

Secondly, you will see the VCs move to later stage. The larger funds are going to wait for someone like Radar to do the seed and A round, percolate the company to generate US$5,000,000 to US$10,000,000 of revenue, and then they make a US$5,000,000 or US$10,000,000 investment to try and scale the company. I think you will see it be bimodal in that sense. We talk to Sequoia, Kleiner, and those guys all the time.

In fact, Kleiner came in on a second round on something we had already done, an open-source network-management tools company. We are starting to see that bifurcate, but it is going to be a struggle for those firms to find the early-stage companies who need the amount of capital that they really have to put in place. Otherwise their LPs are going to be pissed off because they are not putting the fund to work, and they are not paying them to hold cash.

Ed: It is disruptive that you don’t need as much investment upfront. Innovation is far less capital intensive than it used to be.

Stratton: Let’s go to the other side of this, which are enterprise applications. About a year ago I had lunch with Marc Benioff before a board meeting and I said, ‘Can we see some new products or some new features or something at the board meeting because all we have been talking about is governance and comp committee reports and audit committee reports?’ And he said, ‘Oh, that’s a good idea, in fact we have been working on this thing that’s kind of like Facebook for the enterprise.’ The next day he showed us this thing, and I got to tell you Ed, I sat there and said this is more transformative than what you did when you started Salesforce. Have you seen Chatter?

Ed: Absolutely. The idea of bringing that paradigm of information streams and collaboration to the enterprise could be so profound because it solves so many of these problems that management systems were trying to solve ten years ago. You have the bi-directional nature of email, where you send messages back and forth, but to be able to have these threaded comments and collaboration is totally disruptive.

Stratton: And so a couple of things in that sense. First of all, what Marc said was that there are almost 500 million people who already know how to use the Facebook interface and it is going to a billion. Those are the people we are going to be hiring in the next 10 to 20 years. That is an incredibly powerful notion when you think about it: the next working-generation workforce is a set of social-networking experts and that is the interface they are using. They are growing up with it and it is going to transition into the work force. Additionally, I do believe what Marc said, that it is going to cut email in half. Again, my kids don’t use email and they are not much on voice calling; they are text messaging, Facebook chatting and Facebook notifying.

The second thing that gives Chatter such a phenomenal tool set is that it is not just a Facebook-style context of people notifying each other of what they are doing, or applications notifying. Now all the data systems in a company can Chatter at you as well. For example, you are a salesperson and your top customer opened up a level-one trouble ticket. You are going to see that. It’s going to notify you and you aren’t going to have to wait for somebody to tell you or have to go find it out. If one of your other customers is 90 days overdue on an invoice, the system is going to Chatter at you. When I really started to put this together, this notion that the whole world is going to Chatter at me and make it easier to collaborate and get my job done, I think the productivity gains are going to be phenomenal. Literally, in the first 60 days of Chatter being out in the customer base, customers are telling us they are pretty sure they are going to double their seat penetration within 18 months.
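As a conceptual sketch of systems “Chattering” at a salesperson, the snippet below shows business systems pushing record-triggered events into a person's feed; it is a generic illustration of the idea, not Salesforce's actual Chatter API, and all names are hypothetical.

# Generic illustration: business systems post events into a feed instead of
# waiting for someone to send an email. Not Salesforce's API.
from dataclasses import dataclass, field

@dataclass
class Feed:
    posts: list = field(default_factory=list)

    def notify(self, source, message):
        self.posts.append(f"[{source}] {message}")

sales_rep_feed = Feed()

# Data systems posting to the feed as records change:
sales_rep_feed.notify("Support", "Top customer Acme opened a severity-1 ticket")
sales_rep_feed.notify("Finance", "Acme invoice #1042 is 90 days overdue")

for post in sales_rep_feed.posts:
    print(post)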

Ed: That’s kind of hard to conceive at an enterprise level, that you can have that level of acceleration.

Stratton: The Dell CIO sent Marc and the board the notification that he was sending out to the 30,000 users at Dell on this thing. Within days, he said, people were so excited about seeing this stuff, it literally is like wildfire, because the notifications are just coming at you all the time.

Ed: I talked to George Hu and a few of the other Salesforce.com folks at an event they did here in New York and the social impact of flattening the organization is significant. For example if Marc would post something, you would have a response or an idea from someone lower down in the organization who might never email the CEO with an idea. The ability to create this flatter organization with more transparent communications across the organization could ultimately have a profound impact.

Stratton: Right. I think it’s funny if you think about the little paradigm we talked about. The enterprise is going social, while entertainment, content, and editorial are going seamless, flowing across all the device types and the rest; at the same time it’s also trying to find its quality level. It is going back to where editorial quality is becoming important.

The other thing I believe is the capability. These are not just smartphones but are in fact smart computers and they are becoming the remote controls for our lives. Just the simplest application on my iPhone is a DIRECTV one where I can program something on the DVR right from the phone. If a new show is coming up, a live sports event, or if I am out of the house or travelling, and I don’t want to miss it, in about 90 seconds I have it fixed on the DVR and it is going to record. The joy of that versus the frustration of missing the capability to record it is phenomenal. That is just a little remote control.

We have seen these iPad demos for controlling your house and using smart-grid technology to do what you are doing. That stuff is going to be like wildfire. The mobility applications we think are just starting in their ability to delight the consumer. We are in for as much of a change in what these applications are able to do and how quickly they can be written as we ever have been. It is going to be bigger than what happened in the late ’90s with the internet and the web browser. It is going to be bigger than what happened with the first generation of mobile and the iPhone.

We have a small investment with a company called Sencha. They are open source, JavaScript, and now have mobile tool sets where literally you can write an HTML5 app and deploy it across iPhone, Android, RIM or anybody else. In the first two weeks we had somewhere close to 50,000 or 60,000 downloads from real developers who want to build these things. It’s just phenomenal. The whole openness of WebKit, CSS, and HTML5 is going to bring the next generation of innovation around mobile apps and really will make them remote controls of our lives. We carry these gadgets all day long. We couldn’t live without them. My wife used to forbid them at the dinner table and now routinely asks me to Google something during dinner because we are having a discussion and want to know who’s right about something.

Ed: I wonder if it is possible to quantify how many dinner table arguments have been solved by having a smartphone in someone’s hand.

Stratton: Well, I am telling you it is millions and it’s going to be tens of millions in the coming years.

Ed: It’s interesting that you tie together the smart home and the smart grid. We have been talking to people about just those topics and the tie-in with the edge. An interesting point that was made on an earlier call was that we are moving past the stage of evolution in mobile where you are adapting the applications from the last technology wave or the last architectural paradigm to this new format, and now we are seeing these truly native applications and these new native ideas evolving because the people who are developing them live in this world. That’s related to Marc’s comment about the generational shift. I think that’s quite prescient because it’s a completely different perspective when you grow up just having these devices around you.

Stratton: No, I think that’s right. I used to talk about this when VeriSign did Jamba and the ringtone things. I wouldn’t have known what a ringtone was if I wasn’t watching my kids. When we bought the intermessaging gateway, I wouldn’t have known what text messaging was, or I knew what it was, but I didn’t realize that’s what everybody is using, without watching my kids.

If you think about that generation at 21 and 19, they grew up always with cell phones. They grew up most of their lives with the network always being on and having broadband access versus dial-up. They are now starting to come into their work lives where social is the paradigm that they are most comfortable with and it is going to be available for them to use in their work lives. I sit here and I say, ‘What are their kids going to grow up with, take advantage of, and evolve into with their work lives?’ That’s why we do this. Technology is so fun when it comes to the life changes it makes generation to generation.

Ed: I just heard Clay Shirky talk about the cognitive surplus that has given us Wikipedia. He showed this graphic that compares the number of hours people watch TV versus the number of hours it has taken to develop the content on Wikipedia. It is just a fraction. I think about my 7-year-old son who was contributing to Wikipedia because he is an expert on SpongeBob Squarepants.

You have mentioned open source with some of your investments. We had a great conversation with Mike Tiemann and he talked about this incredible wealth of open-source IP, a billion lines of open-source code, that is accessible to a developer starting out today and provides essentially an easily accessible foundation around which to build any sort of IT application or solution that you want.

When we start to see that accelerate, what I am really curious about and interested in is where we are going to start seeing those inflections in innovation. Tom Siebel made the argument that we have already seen that 30-year innovation cycle in IT crest into a plateau. Bill Joy has also made a similar argument that the next stage of innovation is going to be focusing back on the big problems of the next generation: energy, the hard sciences, material science and personalized medicine. When you think about it, all of these technologies may be setting the stage for another chapter in evolution.

Stratton: Right, that is dead on. We shy away from all the green stuff. For a partnership like ours, the capital intensity is going to be way too hard. On the other hand I would love to see some breakthrough there. I think efficiency rates in solar and all that stuff are still a generation from what is really practical and feasible to compete with fossil fuel. I am glad somebody is working on fuel cells. I am glad somebody is working on new solar and technology at the silicon level. All this stuff, I am glad somebody is working on it. I think we will make big shifts, but they are ten- and twenty-year shifts. Good for us; as a society this can only be good.

Ed: The key takeaways of what you have argued here are that content again has further stages of evolution. The idea of being able to tag and create that interactive immersive experience is really compelling and the ubiquity of mobile devices and mobile communication around us may give rise to a lot of opportunities.

Stratton: The social aspect of the phone, and the iPad in particular, is amazing. Rather than being disruptions to your family life and to your nonwork hours, they are actually accelerators that help you do things in your family life, but also help you be more productive at work so you have more time to do those other things. I have always talked about work/life blend versus work/life balance and we are arriving at a time when the devices are becoming the tools and the accelerators of work/life blend as opposed to obstacles to good communication and happy family situations.

Ed: I have not heard that expressed that way, Stratton. I would like to see this trend happen - we certainly have the bleed-over for any knowledge worker; we are always connected - but what is amazing to me is how in just the last 3 months the iPad has transformed ideas about mobility, applications, how you can access and package content and how you can build business models around this type of device. If that’s how quickly things can change and it’s only the first year of this new decade, I am bracing for what we will see over the next nine.

Stratton: No, I am too. I am very glad, being 48, that life expectancy rates are going way up because I would like to enjoy the next couple of decades of this change.

Michael Skok, North Bridge

Michael Skok joined North Bridge Venture Partners in 2002. Prior to this, Michael had been an entrepreneur and CEO in the software industry for over 20 years. Michael founded, led and raised over US$100m in private equity for his companies in the CAD/CAM, document management, workflow, imaging, collaboration, security and analytics markets - spanning the minicomputer, workstation, PC, client-server and internet eras.

Six of these companies were successfully acquired by Siebel, Platinum, FileNET, Banyan and IBM, and two went public, including Symantec, for which he built Symantec UK into its most profitable international business. Most recently Michael founded Alphablox, where he envisioned and defined the market for analytical applications, now a multibillion-dollar market segment. IBM acquired Alphablox and markets the product as DB2 Alphablox, integrating it into its WebSphere and Rational product lines.

Michael has served on many private and public company boards as well as supported various software industry groups such as the Software Publishers Association where he was chairman for a number of years in Europe.

Michael is a graduate of Nottingham University in the United Kingdom (awarded joint honors in management sciences and engineering).

Three vectors for innovation

Michael Skok has been a VC with North Bridge for a number of years and brings a wealth of operational experience and a theme-oriented approach to his venture investing. Our conversation touched on several of the priority themes for his own practice, which include Software as a Service, disruptive business models and the rise of analytics. His role as an early stage investor demands a longer-term view of where opportunities may lie and our conversation provided a number of valuable insights.

Key points

Three key themes: Software as a Service, everything that supports SaaS and cloud infrastructure, and disruptive business models.

Innovations within infrastructure design are leading to new scale-out architectures as opposed to traditional scale-up architectures. Scale-out architectures commoditize IT further and enable opportunities for automation; the result will be disruption from the bottom of the IT stack all the way up to end-user applications.

Open-source development will continue to flourish, enabling innovation and application development at a scale previously unfathomable. Open-source communities are better able to reach scale and commit to best practices, resulting in cleaner code and development speeds far beyond the capabilities of any single IT organization.

The growth of connectivity will dramatically change the penetration of the internet, with potentially billions of new users connecting in the future.

Software as a service (SaaS) evolves into everything as a service (EaaS). Why not have people as a service (or crowdsourcing)?

Summary of interview on 8 July 2010.

Full transcript follows

Scale-out architectures will commoditize hardware

Open source enables development at scale

The rapid growth of connected devices and their recorded media will result in demand for real-time predictive data analytics. Applications become “smarter” as the scale of the cloud and the amount of data provide for better predictive analytics.

IT within organizations becomes smaller and more business centric. Smaller, more agile development firms will be able to achieve IT scale at par with larger organizations, resulting in increased business and faster innovation cycles. Larger, slower companies could find their competitive “moats” drained.

‘In the business world today, everybody is trying to do more with less. In order to do that what they have to have is focus on core competencies. By implication, what that causes people to do is to outsource everything that is not a core competency.’

Connectedness and distributed data married with analytics will give rise to smarter devices

Incumbent organizations will need to shed noncore operations to remain competitive

Michael Skok transcript

Michael: About 20-plus years ago, way back before mp3s or anything else, I was asked what key technology trends would influence homes. I said that everything would go digital. You would have what I called a DigiCenter in the home and you would stream everything, to get all your music and videos and everything else. Of course, at the time, it sounded like lunacy, but I thought why would you use anything other than digital when you know it can be digitized? It turns out that it took a lot longer than I predicted, but I think Apple has finally figured out how to do it all in the last few years. But the thing I got totally wrong, which makes me laugh, is that I thought that everything would be online about 10 years sooner, and in fact it took a hell of a lot longer for that to happen. So I may be similarly wrong here because I am going to make some similar predictions for the next 10 years that continue on one of the core tenets of that original premise. Everything that can be digital will go digital, and everything that can be connected, will be, including not just bits but atoms and people and places.

As a VC I have to focus, so there are three themes that I track right now: SaaS, cloud and open source. SaaS is at the top of the technology stack, and at the bottom of the stack I look at everything that supports SaaS, such as cloud and infrastructure. Finally, I track disruptive business models such as open source because they drive investment opportunities. In each case I’ll talk about current investments and what they represent today and then project where these areas are headed.

Starting at the top of the stack with software as a service, we started with fairly basic examples of software as a service, such as email with pioneers like Hotmail, and now we have advanced to things like CRM on demand with Salesforce.com. But a few years back, we made an investment in a company called Demandware, which delivers ecommerce on demand. At the time people thought it would be a tough sell to rely on a third-party service for something so mission critical. Now, a few years later, Demandware’s ecommerce is the front door for many major brands and handles even billion-dollar retailers.

We have already gone from a place where people said, ‘SaaS will be only for SMB and non-mission-critical applications’ to now handling enterprise-critical applications. We were told in diligence, ‘People will never give up their storefront and their very presence online to somebody else.’ And, of course, we have seen that proven wrong. Demandware will do about US$2.8bn of transactions this year and we are nearing a US$100m backlog for that business. It is clear to me that Software as a Service has become mainstream today, so I will fast-forward to where it is going.

I think SaaS is going to evolve into what I coined a couple of years ago as EaaS, which despite being another crazy acronym is simply “Everything as a Service.” What do I mean by everything? Well let’s talk about it. We get Software as a Service today. We get search as a service from Google, we get data as a service from Thomson Reuters. We get media as a service from Netflix. We are already getting gaming as a service with things like WoW. We are getting voice as a service from many providers. These are all more straightforward digital services.

But in the future, why wouldn’t we be getting people as a service? That’s really crowdsourcing. I think we are just beginning to see the trends of that in things like crowdsourced design, all the way to Amazon’s Mechanical Turk. Then that leads to things like manufacturing as a service, and, in fact, everything as a service. There is a huge business driver behind this that is going to cause more and more need for things to be consumed, both digitally and even physically, as a service. It may be hard for people to get their heads around what, ten years from now, you could physically consume as a service, but in the business world today, everybody is trying to do more with less. In order to do that what they have to have is focus on core competencies. By implication, that causes people to outsource everything that is not a core competency. That means that if you are excellent in the business of manufacturing then the last thing you want to do is try to handle logistics when there are many experts at that, and vice versa.

To go back to our ecommerce example, if you’re a very high-quality retailer like Gucci, then the last thing you want to be in is the IT services business. What you want is a better ability to focus on your core competencies of merchandising and marketing and branding, which is where you differentiate and actually justify your positioning as a premium price brand. Therefore, you are totally willing to outsource your ecommerce platform and the entire stack that has to be managed for you to be able to put up a storefront. That’s not only smart but it’s actually enabling you to spend more time on your core competencies. This example will be replicated all over the world in so many different businesses, creating many different services, or one day Everything as a Service. EaaS!

If you look at this as a macro trend, it’s also part of the connected world and globalization. For example, people have long outsourced things like basic data entry offshore and basic data full research as a service. The more the world is accessible in this connected form, the more it will be possible to outsource and connect to the people, places and services you need from wherever they are economical. I believe you will see this as a mega trend: the core competencies, the outsourcing, the globalization of reach to get more and more of everything you need as a service.

Ed: One of the interesting impacts of this outsourcing is that you end up getting more vertical specialization among those who are delivering services.

Michael: Absolutely, in fact it is one of the areas that continues to be very interesting to us. We have made many vertical investments in the past. When we made an investment in Phase Forward, which was doing clinical trials as a service, everybody thought it was an unusual niche. Did it make sense to build a service business to outsource that? But of course it makes total sense now that we have fast-forwarded 10 years. It became a big public company, and remained as such until Oracle just bought them. If you look 10 years forward you’ll say there are lots of things that are obvious when you have the ability to connect everybody and everything globally and reach things in this “always on” way.

Enter cloud computing to support all that.

In cloud computing you have the very early stage of the infrastructure being built to support EaaS. The virtualization has happened and the organization of that into grids and clouds has happened, but the actual infrastructure that scales out to support that cost effectively is still early. In fact, as is typical, we have disruption happening from the bottom of the stack upwards. You might say the compute piece is somewhat done with multicore, but the ability to connect the compute infrastructure with virtual networking is still up for grabs and there are lots of pieces that are still not solid. There are companies we have invested in like Xsigo that are solving that problem.

The next layer up, storage, is definitely not done. The inefficiencies of storage are just ridiculous. I’ll give you the stat; I’m sure you know this, but in the enterprise today if you store a single piece of data once, it unfortunately ends up stored 13x over, everywhere between your backup, your disaster recovery, your high availability, your redundancy, etc. While that’s an apparently ridiculous statistic, it’s a fact. We think that there’s huge disruption still to happen in storage virtualization and in the infrastructure for storage, compute and networking. We’ve invested in a company called Actifio to address this head on.

If you go up one level to the kinds of application services needed to support scale-out, we still have not gotten true scale-out, as opposed to scale-up, in things like databases. Oracle RAC clearly doesn't cut it for the cloud world. Scale-out is being solved with some workarounds today, like Memcached and Hadoop, which are great. We have made an investment in NorthScale, for example, around the Memcached project, but it really doesn't address the transactional integrity that is needed. Because of this we have made an investment in a company called Akiban, which is doing cloud-scale databases in a pure scale-out way. Today there are huge management challenges in doing this; Akiban is making it effortless, with minimal administrative cost, to enable horizontal scale-out in a linear fashion. We think that is still up for grabs. And it is fascinating to see how much the database world has come back into focus as an investment opportunity. This is just one example of the lack of true ability to manage scale-out, not scale-up, in the layer of application services that is necessary to support EaaS.
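To make the scale-up versus scale-out distinction concrete, here is a minimal sketch of the pattern Skok alludes to: a pool of Memcached nodes fronting a single database of record. The host list, table and key names are hypothetical, and the snippet assumes the widely used python-memcached and sqlite3 libraries rather than any NorthScale or Akiban product.

```python
import sqlite3
import memcache  # python-memcached client; it hashes keys across the node pool

# Hypothetical pool of cache nodes. Adding hosts here is "scale-out":
# capacity grows by adding machines, not by buying a bigger one (scale-up).
CACHE_NODES = ["10.0.0.1:11211", "10.0.0.2:11211", "10.0.0.3:11211"]
cache = memcache.Client(CACHE_NODES)

db = sqlite3.connect("orders.db")  # stand-in for the transactional database

def get_order_total(order_id):
    """Read-through cache: serve hot reads from the cache tier and fall back
    to the database of record on a miss."""
    key = "order_total:%d" % order_id
    total = cache.get(key)
    if total is None:
        row = db.execute(
            "SELECT SUM(quantity * price) FROM order_items WHERE order_id = ?",
            (order_id,),
        ).fetchone()
        total = row[0] or 0.0
        cache.set(key, total, time=300)  # cache the answer for five minutes
    return total
```

The cache tier scales reads horizontally but, as Skok notes, it does nothing for transactional integrity on writes - which is exactly the gap scale-out databases are trying to close.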

Actually, we see the management of cloud infrastructure as an important challenge to resolve in the immediate future. For example, look at virtual desktops. It makes sense to virtualize the desktop, but when you look at the cost of actually deploying it, it makes no sense at all to keep every single desktop - with everything from the OS to apps and user information - stored thousands or tens of thousands of times in these big institutions. So I've made another investment in a company called Unidesk to do management of virtual desktops. Unidesk approaches this problem by splitting out the OS, app, data and user layers, thereby avoiding the replication of every single copy of Windows and every single application in an organization.

Disruptions all the way out to the desktop, from the very bottom of the stack right up to the end user, are coming in the infrastructure to support these software-as-a-service-type applications, or what I would describe as edge-based services.

This is just one example of how early we are in managing the disruption up the stack of scale-out, very early. There is lots of room for innovation.

So let’s turn to disruptive business models and my current focus on open source. I think that you know the state of the nation today is that we have got one public open-source company that has really proven its business model and that is Red Hat. I think that’s surprising. You know we believe that there are a lot of applications that just cannot be built effectively without open source. I put my money where my mouth is when we seeded one right inside of North Bridge. I spent almost 6 months getting this company off the ground, but not because of the technology - that was already there - but because we wanted to innovate on the business model.

Ed: Yes, I think I know the company - that’s Acquia, commercializing Drupal right?

Michael: Exactly it is Acquia and the unique way we’re commercializing Drupal is really the lead-in for discussing the future of business models.


First, a little color: I enjoyed building a relationship with the Belgian founder, Dries Buytaert, and am happy to have seen him move to join us physically in Boston some 2.5 years later. In that time Drupal has grown by orders of magnitude. In March of this year we had a Drupalcon in San Francisco and 3,000-plus people showed up, with double that number online. That's a bigger following than, for example, MySQL or JBoss ever achieved. Why? It's an example of a new class of application - social publishing - that is simply so compelling. Following my theme of connecting people and things, this is connecting people and content and creating communities, often even including commerce. The 3Cs there are key to the future of the web: content, community, commerce. Of course, there are other examples like Facebook that speak to this trend, but Drupal is unique as an open platform that enables people to build websites for every purpose, from private-sector leaders like Sony to the public sector and government - the White House site and many other major agencies run on Drupal. It's an example of an application you simply couldn't build any other way than open source. And why is that? Because the needs are so diverse: enabling integration of the Drupal platform with everything from Facebook to Amazon to Salesforce and, most recently, Foursquare. There are in fact now over 5,000 modules built on top of Drupal to connect it with all the major web services and applications out there.

To achieve this it’s estimated that over 100,000 people have contributed and there is no way you could have an R&D team that could handle all those diverse needs. You just could never staff that with a traditional R&D team. Yet all of that is available to you in open source, and instantly as a Drupal developer or partner.

Now regarding the business model: I wouldn't have backed it if it were simply a support business model. So we focused on building digital services like the Acquia Drupal Network, with Search as a Service, remote monitoring, etc. That was the first layer. Then we looked to build Drupal as a Service with Drupal Gardens, and recently we've introduced Applications as a Service with things like Drupal Commons, an out-of-the-box experience for enterprise communities. Each of these is a different business model that builds on Drupal as a platform - a far-reaching platform - and provides a high-margin multiplier to the business, while serving customers a richer and richer experience of Drupal itself. Great business models of the future will provide three things: reach, richness and a high-margin multiplier. Reach will often enable things like the long tail, popularized by things like Amazon's book catalog; richness comes with things like the depth of applications we're seeing from Acquia built on the open-source modules; and high-margin multipliers come from open-source development and the electronic services built thereon.

So it is no surprise to me that Acquia is not only breaking every record in terms of growth in our portfolio, but is also showing the way to the future of multilayered business models.

Ed: Yes, absolutely, I see all of these as enabling factors or enabling technologies. Open source as well as this incredible scalability and flexibility afforded to people that have an idea that ultimately can give rise to far lower barriers to innovation going forward and interesting business models.

Michael: One of the things that will be fun here will be to look at how business models evolve in the next 10 years. Whatever they are, I think people will end up developing multilayered business models to deliver their value. What do I mean by that? Well, people tend to overgeneralize today and say a company has one business model - eg, Google is advertising-driven. I just don't think this is the future. I think companies will simultaneously run multiple business models, as exemplified by Acquia. I already try to get my companies to think ahead in this way.

Ed: I’ve met with DemandWare, whom you were kind enough to connect me with, and that model where the company becomes the infrastructure partner of their clients, taking a percent of the transaction, totally aligns the interests of the two.

Michael: Exactly, and that’s happening more and more. That’s a perfect example. It’s a win-win model. When we enable our customers to grow, we benefit directly. I think businesses are going to evolve with at least two or three different layers of value and business models in the technology industry that support their success. It will be a part of how they will avoid “the innovator’s dilemma.”

Ed: It's interesting because we have recently done quite a bit of work on Microsoft. As much criticism as Microsoft gets, some of which is justifiable, I really think they have put together across their portfolio the perfect example of so many of these themes: the layered business models, the cloud infrastructure, and the antivirus business. Here is a business that has had subscriptions; people have paid for the technology. Now you're moving into a world where companies like AVG are offering free technology and generating revenue through paid search to their user base, or offering a premium version to subscribers of an ISP, which ends up being a premium service. Somebody ultimately has to pay for the technology, and the vendors must be paid as well. That's starting to play out and I would be interested in your views on it. The reception among traditional software vendors has not been that open, but I believe content is increasingly going to become a part of the value stream or the innovation stream. What I believe you will increasingly see is content-enabled service companies that incorporate content in their offerings to capitalize on user penetration. Salesforce.com just bought Jigsaw, which is a proprietary content database that layers on top of the service they already offer. If users are already using your applications but the seat-licensing opportunities are saturated, how do you continue to offer value that goes beyond pure feature function? I think that ties into the analytics, scalability and many of the other themes you have highlighted.

Michael: I couldn't agree more. I think you are right that Microsoft has done a better job than they get credit for in terms of thinking that through. Unfortunately, they are such a big organization that it's actually hard for most people to fathom the significance of what they have done in so many different ways.

Ed: I recently spoke with a game developer who was really impressed with what they have done around Xbox Live. There was just some news that came out that they have done over a billion dollars in revenue through that. That’s their ecommerce content delivery platform.

So this captures your three areas of focus as a VC, but are there any other important trends you see?

Michael: Sure, there are a few important supporting trends I’d highlight, such as connectivity, big data and the increasing importance of analytics.

You are going to have to have “always on” connectivity reaching much further across the globe in order for all this to play out to its full extent. In the US we take for granted that we can be always on, all the time, at high speed. We are a long way from that for the other three billion people in the world. Actually, we have made an investment in a company literally named for this - O3b, the other three billion. They are going to put up the satellites that will reach the other three billion in a low-cost way, and we think that is huge. The ability to get the connectivity so the rest of the world can be always on, all the time, is going to be just critical, especially if you are going to really enable these services to be consumed in the way we have just talked about.

I think there will be some really interesting opportunities that emerge and so another major theme for us, just to push on everything being available as a service, is you are going to want it available anywhere any time. Where are you going to use that? It’s going to be on edge-based devices, we know them today as iPhone, iPad, and Androids, but who knows what they will be in the future. I mean it’ll be connected into your car, in your appliances at home, machinery and equipment, in the factories, in the supply chain, and so forth. The point is every person today is going to want to be connected and to every device.

In ten years' time it'll not only be every device and every machine, but every checkpoint, both in the physical and the online world. And they will need to be reliably always on, all the time. That is a huge challenge to enable in technological terms - everything from bandwidth to the management of it - but we think that is a huge trend to continue to bet on. It is natural that people will want to do it. For example, one of the drivers is that people want to measure everything. Why? Because of the old adage that you can't manage what you can't measure. People are going to want to measure the flow of their goods through supply chains and the processing of every patient through a hospital. They are going to want to manage the connectivity of every one of the field workers doing field automation, field service and so forth. Every person and every thing being always on is going to generate a huge amount of data. That is another big theme that comes out of this: big data.

You are going to have big data like we have never imagined before. We are already seeing this. The digital world will break through the zettabyte this year and it’s obvious it’s not going to slow down. It’s extraordinary to think about how much data is being generated today, let alone how it will grow as just described.

Ed: Right, Petabytes, Zettabytes and Yottabytes.

Michael: Exactly, and the compression of how quickly it will be connected will also be unbelievable. You are going to have to get much more real-time, in-line analytics, and one of the shifts is going to be towards much more in the way of predictive analytics. This is going to be self-serving for a second, but that's why we invested in Revolution Analytics. It is another open-source company, bringing the R language to market to take the next leap there.

Ed: Well, it's a very concentrated market with SAS and IBM, but SAS more than anyone else. That's a tremendous amount of concentration in the industry and it is certainly ripe for fragmentation and new entrants.

Michael: Yes, exactly. We also are lucky enough to have attracted the founder of SPSS, Norman Nie, to come and start the company with us. He knows what it is like to scale something from zero dollars through a billion dollars. We are quite excited about that. The point is the big scheme of where it’s going, to answer your 10-years’-out question, is clearly more data and the need to analyze it faster. You need to be ahead of trends whether it is consumer or economic trends. This is going to drive the need for predictive and real-time analytics. That’s something that is a consistent theme, I think I told you ten years ago, and I just don’t think it’s going to go away.
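As a toy illustration of what "predictive" means in practice - not Revolution Analytics' R stack, just a hedged sketch in Python using scikit-learn on synthetic data - the idea is to fit a model on historical observations and score the future before it arrives, rather than reporting on the past:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic history: 500 hourly readings of some business metric (made up).
rng = np.random.default_rng(0)
hours = np.arange(500).reshape(-1, 1)
load = 0.8 * hours.ravel() + rng.normal(0, 25, size=500)

model = LinearRegression().fit(hours, load)    # learn from history
next_day = np.arange(500, 524).reshape(-1, 1)  # the next 24 hours
forecast = model.predict(next_day)             # predictive, not reactive

print(f"Expected level 24 hours out: {forecast[-1]:.0f}")
```

Real-time, in-line analytics then amounts to running this scoring step inside the data path as events stream in, instead of in a nightly batch.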

Ed: Well, that's right. The data only grows, and as more of these devices are connected via IP and protocols like ZigBee to network the home, the cost declines. You get more and more visibility into all of the business processes, utilities and the need to manage energy. Applying the intelligence and domain knowledge to every problem in the world is compelling. I'm very much in agreement with you there.

Michael: One last boring but very important thought then: centralized IT in the data centers of the future, in the cloud, will become an increasing challenge in all sorts of areas. Most notably, in a very basic way, things like heat, power and energy management, and in a more substantial way, the management of scale and of the pace of delivery of all this.

For example, the whole development and deployment cycle is going to see a very big shift. It has to, to enable the scale and speed we're implying here. It's probably no surprise to you to hear this, but 70% of the cost of running cloud infrastructure and datacenters is operating expense, and nearly half of that operating expense is the cost of manually managing the constant upgrading, updating and lifecycle management. It must change. It can't be that it takes us months to develop applications and many months more to deploy them.

And when people need updates or security patches they need them in real time.

So, the biggest change is going to be in the development and operations piece, and we've got some important companies there. For example, rPath is really taking off as the next generation beyond BladeLogic or OpsWare, which were script-based. Scripts just don't scale. The whole approach is still highly dependent on manual processes and is very labor intensive. We think that will have to become automated to scale, and in the next ten years it will grow to include genuinely self-healing kinds of characteristics, with things like the model-driven management of dependencies that rPath provides.
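The contrast between script-based and model-driven management can be sketched in a few lines. This is a conceptual toy under assumed names - a hypothetical desired-state model and a reconcile step, not rPath's actual product: instead of encoding the steps, you declare the end state and let the system derive the steps, which is what makes it repeatable across thousands of nodes.

```python
from dataclasses import dataclass

@dataclass
class NodeModel:
    """Desired state of a node: what *should* be installed, not how to install it."""
    packages: dict  # package name -> required version

desired = NodeModel(packages={"nginx": "1.18", "openssl": "1.1.1"})

def observe(node):
    """Stand-in for an inventory call that reports what is actually installed."""
    return {"nginx": "1.16", "openssl": "1.1.1"}

def reconcile(node, model):
    """Derive the actions needed to converge the node on the model."""
    actual = observe(node)
    return [
        f"upgrade {pkg} to {version} on {node}"
        for pkg, version in model.packages.items()
        if actual.get(pkg) != version
    ]

print(reconcile("web-01", desired))  # -> ['upgrade nginx to 1.18 on web-01']
```

A script would hard-code the upgrade commands; the model lets the same reconcile loop handle drift, rollbacks and dependencies as the fleet scales out.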

Without that, as datacenters and the cloud scale out, everything would become more and more fragile. We must have some fundamental breakthroughs in management and deployment across the entire lifecycle, to take the fragility out of IT, in order for EaaS and cloud to play out. But enough predicting from me! The only thing I can be sure of is that I'll be wrong on much of this! So in 10 years' time, please call me up again and I'll simply bet we won't have solved much of this as definitively as I would hope.


Michael Tiemann, Red Hat

Michael Tiemann is president of the Open Source Initiative as well as vice president of Open Source Affairs at Red Hat Inc. Michael is a true open-source software pioneer. He made his first major open-source contribution over two decades ago by writing the GNU C++ compiler, the first native-code C++ compiler and debugger. His early work created world-leading technologies and also informed the first open-source business model.

In 1989, Michael’s technical expertise and entrepreneurial spirit led him to co-found Cygnus Solutions, the first company to provide commercial support for open-source software. During his 10 years at Cygnus, Michael contributed in a number of roles from president to hacker, helping to lead the company from a fledgling startup to an admired open-source leader. Michael also provides financial support to organizations that further the goals of software freedom, including the Free Software Foundation, the Electronic Frontier Foundation, and the GNOME Foundation.

Open source - Lean, mean and versatile

With Linux nearing its 20th anniversary and other leading open-source projects well into their second decade or more, the resilience and usability of open-source technologies have become established and vetted by broad adoption. With steady evolution driven by a broad community of developers and users, features and capabilities are increasingly on par with commercial, closed-source offerings. We believe improving usability and management tools have helped extend the appeal beyond the highly tech-savvy to broader IT adoption.

The advantage of the crowdsourced development model is that features can be analyzed, tested, vetted and improved by peer review as part of a continuously iterative process. The result is that changes tend to be incremental and evolutionary rather than disruptive in nature. For example, the Linux kernel is updated every two or three months, on average, with approximately 10,000 patches per release. The total number of lines of code in the Linux kernel has grown nearly 75% since 2005 as more developers make more changes per release. Lines of code added per release have tripled, according to The Linux Foundation.
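A quick back-of-the-envelope check makes that cadence concrete; the release length used here is an assumption within the "every two or three months" range cited:

```python
# Rough arithmetic on the patch cadence cited above (assumed round numbers).
patches_per_release = 10_000
days_per_release = 75  # roughly "every two or three months"

per_day = patches_per_release / days_per_release
print(f"~{per_day:.0f} patches merged per day, ~{per_day / 24:.1f} per hour")
# -> roughly 133 per day, or about 5.6 every hour, around the clock
```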

Figure: Estimated value of Linux kernel development, per release (US$m, LHS) and aggregate (US$m, RHS), May 2005 to May 2009. Source: Credit Agricole Securities (USA), FLOSSMetrics, The Linux Foundation

Under an open-source model, Linux kernel development has accelerated

“All bugs are shallow” posits that communities of developers are better at fixing flaws

Open-source ‘crowdsourcing’ can outpace proprietary development efforts


Free works - How open source creates better software

Michael Tiemann is one of the most prominent evangelists of the open-source movement. Our conversation focused on the dramatic gains in overall quality (or lack of flaws) that the open-source development model has been able to achieve. He draws parallels with the work of W Edwards Deming, a WWII veteran who took his management principles to Japan and helped revolutionize auto manufacturing there after Detroit rejected his help. Interestingly, there are roughly 1 billion lines of open-source code currently available for anyone to use or to build new code on top of. This great resource of open-source software provides a significant basis for accelerating innovation in the next decade, in our view.

Key points

The open-source ethos of improving quality constantly and forever produces software that is measurably better than proprietary systems and, when combined with its openness, speeds innovation and cultivates sustainable intellectual development.

Open-source development has resulted in fewer code bugs and faster development times, and hence faster innovation cycles. The “defect density” for traditional proprietary, closed-source code averages 20-30 flaws per 1,000 lines of code; the most recent analysis by the vendor Coverity indicated that the vast majority of open-source software had an average defect density well below 1 flaw per 1,000 lines of code.

Open source becomes increasingly relevant because there has been no Moore’s Law for software. Niklaus Wirth’s Law states, ‘Software is getting slower more rapidly than hardware is getting faster,’ or ‘what Intel giveth, Microsoft taketh away.’

In 10 years, the concept of an application will evolve into ‘a momentary coordination of resources built solely for solving the problem at hand.’

Developers will increasingly build platforms where applications are not managed as monolithic entities. This opens the door to building vertical, specialized applications quickly and solves immediate problems at hand.

Providers of monolithic applications face an uphill battle as they work to make their systems more open and manageable.

Summary of interview on 15 July 2010.

Full transcript follows

As open source matures, its value becomes more apparent

A primary advantage of open source is the flexibility to make changes and modifications at will


Michael Tiemann transcript Ed: I believe the open-source model for software development and intellectual development has been quite disruptive over the last decade, and Red Hat has certainly been at the center of cultivating these communities of innovation and enabling a rise in incredibly pervasive powerful technology that is really powering a new transition to cloud-based computing. The idea of allowing the value and innovation to accrue toward proprietary ideas or domain-specific ideas rather than having to reinvent the wheel on basic infrastructure seems to be so critical to the Red Hat value proposition.

Michael, what is your vision of where open source and where technology and the software industry goes over the next ten years and also any implications beyond software, looking into broader technology and life as well?

Michael: So let me start small and just talk about open source. If you know a little about my biography, you know that what brought me into the world of open source, to start the world's first open-source business, was the dramatic experience I had - the contrast that I saw between the capabilities of proprietary software and the capabilities of open source. When I started my professional career at age 21, I did not have a whole lot of life experience, and I could not square how dramatically different the capabilities were that I saw emerging from a single programmer at MIT, Richard Stallman - how it was possible that software he was giving away, and that was being modified by other people, was accelerating at a rate far greater than anything I saw in the commercial software world. I watched that for a couple of years before I decided to bring myself to start a business based on open source.

Perhaps one of the most dramatic examples that really made me think there was a business opportunity was when I was visited by some scientists from Los Alamos National Laboratories. They had just gone off to build a virtual nuclear weapon. Their idea was to build a computer model and blow up a virtual bomb to test America's arsenal, instead of using up real uranium bombs deep underground. They had spent about US$100 million buying Sun workstations, and they had built one of the first supercomputing clusters using the commodity technologies of the time: Sun workstations running SunOS.

I was very proud of a lot of things that I had been able to do with the GNU compiler, so I asked the scientists from Los Alamos what the most important mathematical subroutines they were running were, and whether my compiler could do a better job. They replied that it generally involved a lot of linear algebra utilizing LINPACK. I had used LINPACK and knew it well because my GNU compiler generated the code for it. After they revealed they were using Sun's proprietary floating-point acceleration board, I asked if they had an instruction-set manual. While they did not have a manual, I was able to infer the instruction set from the output. Four hours later, I was generating code on the GNU compiler that was running 10% faster on the benchmark than the Sun compiler's code. That meant, by notional value, I had delivered US$10 million worth of value to the US government in four hours' time.

I thought to myself: this is one person, in one afternoon, solving one problem, and able to deliver so much value. And because open source puts no limit on innovation, I wondered what the world would be like if every programmer in the world were empowered to solve any problem that came their way, from any source. That made me think about the scalability of the open-source model. We are still in relatively early days of that. When I started as the CTO of Red Hat in January of 2000, Linux commanded about 5% of the top-500 supercomputing sites; today it is 95%. When you slice it every other way - if you look at Intel vs AMD, it's 60-40 or 70-30; IBM vs HP is 60-40 or 70-30.

Ed: I think that one of the distinctive value-creation characteristics of a subscription-based business such as Red Hat’s is this alignment of value creation with customers being satisfied, because when you look at your business, what are the switching costs? Very low - you can download the software for free, but you are building a consistent growth of value that customers are deriving and it’s being reflected in revenue growth. I see that dynamic playing out among the software as service vendors, many of the cloud vendors.

Michael: One of the reasons that I moved from CTO of Red Hat to being VP of Open Source Affairs is that, as much as I love technology, I really felt that open source is more of a people and process problem than a technology problem.

There is the technology development continuum, which is very exciting and complex, but there is also the people and process side. The question is how people value technology, how people value vendors, and how people value innovation. I was giving a talk just yesterday at a government agency, and they are struggling on the one hand to fulfill their mission, on the other hand to do good science, and on the third hand to do more with less and all these other complicated things. I brought with me a lot of stories from across both our public sector and private sector about how folks do this, but what it really comes down to is that old-school ways of thinking militate against innovation, and against the kind of improvement which, in a perfect world, we would all strive to accomplish.

One of the stories that I have been telling a lot lately is about the history of W Edwards Deming, who, after distinguishing himself with his service during WWII, went to the American executives of all the major automotive manufacturers and said, 'I can help you build a better product,' only to be told to get lost. The executives believed they had won the war, and that Deming, a mere soldier, had nothing to teach them. Of course, they were deaf to the fact that they built some of the most unreliable vehicles in the field. And, of course, we know that when Deming was utterly unsuccessful in convincing any American executive of the need to fundamentally re-conceive how to think about the nature of their business and the relationships with their employees, their labor force etc, Douglas MacArthur took pity on Deming and suggested he find an audience in Japan.

Lo and behold, Deming revolutionized the entire industry by demonstrating successful principles of management in manufacturing in Japan, and what I have really come to appreciate is that when you look at the 14 points, which are articulated in his book Out of the Crisis, what you notice is not only how obviously aligned they are with open source, but how open source actually enables one to take them to a logical extreme that I am sure Deming himself could not have imagined when he proposed the 14 points. The concept of breaking down barriers so that people can work cooperatively, that is what the open-source licensing regime is all about.

Another little story from my bio is when I met Bjarne Stroustrup, the person who invented the C++ programming language. I met him in 1987 at the world’s first UNIX C++ conference. I walked up to him and I said, ‘I’m writing a C++ compiler.’ He responded, ‘Oh, that’s nice. I’ve already done that.’ So, I said, ‘Well, actually, I’m writing a native code C++ compiler.’ He told me that was stupid and that I couldn’t possibly make the compiler work on every platform. Stroustrup had written a translator from the C++ language to C, and then he relied on other vendors’ C compilers to translate his output into the native code. Well, obviously I was undaunted, and the good news is that the C++ compiler is the industry standard and it does work exactly as I had proposed in 1987.

When I proposed that we create the world's first open-source company, people in Silicon Valley said that it was a stupid idea because it was impossible and no one would pay for free software. I have heard a lot of people talk about impossibilities. I feel a kindred spirit with Deming because people talked about the impossibility of what he proposed: valuing relationships based on loyalty and trust instead of price, or improving the product continuously and forever.

In some talks I discuss the code-quality metrics that were published in 2004 by a software company in Silicon Valley called Coverity. In 2004, they reported that average proprietary software has 20 to 30 defects per 1,000 lines of code, which means that a 5.7-million-line Linux kernel was expected to contain somewhere between 114,000 and 171,000 software defects that their tool would find. They ran their tool and found 985, every single one of which was triaged; 600-plus were deemed serious and 100% of those deemed serious were fixed within six months.
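The arithmetic behind those figures is worth making explicit; here is a minimal check using only the numbers quoted in the paragraph above:

```python
# Back-of-the-envelope check on the Coverity figures quoted above.
kernel_lines = 5_700_000      # ~5.7m-line Linux kernel (2004)
proprietary_rate = (20, 30)   # defects per 1,000 lines, industry average

low, high = (kernel_lines / 1_000 * r for r in proprietary_rate)
print(f"Expected at industry rates: {low:,.0f} - {high:,.0f} defects")

found = 985                   # defects Coverity's tool actually flagged
print(f"Observed density: {found / (kernel_lines / 1_000):.2f} per 1,000 lines")
# -> 114,000 - 171,000 expected, versus roughly 0.17 per 1,000 lines observed
```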

Ed: Fixing that many critical flaws must be impossible to do with a closed-source model.

Michael: Think of another company who had a large operating system that maybe did not have this superior level of quality, but only average quality. Think of what their release process looked like from a timeline perspective, from when they said Longhorn would be ready to the time they pulled out, literally, every single meaningful feature that they had promised the market, over a seven-year slip.

It became known as “Short Horn.” I kid you not, you can look it up. This is the other thing I kid you not about, every feature they promised had to be removed before it shipped, and the only feature that nobody ever asked for, which was a content management feature, is what they put in. But needless to say, when Vista finally did launch, it was not known for its quality. Or rather, it was not known in a positive way for its quality.

Back to Coverity: they performed a measurement and found that instead of 20 to 30 defects per 1,000 lines of code, the Linux kernel averaged 0.17. In 2005 the Linux kernel grew to over 6 million lines of source code and the defect density decreased to 0.16. In 2006 folks began to think it was not fair to look at Linux alone - that maybe Linus Torvalds and his disciples were outliers, better ignored as statistical anomalies than used to compose a new theory to deal with them.

Coverity measured 32 different open-source programs, including the GNU C compiler that I used to work on, the famous Apache web server, etc, and what they found was that the average defect density was less than 0.5 per 1,000 lines of code. In the LAMP stack - Linux, Apache, MySQL, PHP - the worst performer was PHP, with 0.434 defects per 1,000 lines of code. That was still only 40 to 60 times better than the industry average; Deming said you should improve product quality continuously and forever. Now imagine a commercial CEO wakes up in the morning and says, 'We are 60 times better than average. Should we continue to improve quality, or should we start paying out bonuses?' What do you think the average company is going to do?

In 2008, the Department of Homeland Security got the idea that quality could contribute to national security, so they funded Coverity to open the aperture from 32 programs to 250, covering 55 million lines of source code. Not only did they find that the defect density was still less than 1 defect per 1,000 lines of code, but PHP was one of 11 projects with zero automatically detectable bugs. In 2009 Coverity opened the aperture even wider, and there were three times as many programs that were “perfect” with respect to this diagnostic device.

I was making a good living travelling the world on behalf of Red Hat, talking about the great quality story of Japan and holding up the icon of Toyota (until that story went bad last year, because of their supposed accelerator problems and so forth. Toyota ultimately said, 'We were not prepared to become number one, because we have always been looking forward, and once one becomes the leader, one starts looking backward, thinking how to stay ahead instead of how to keep improving.') I would say that the open-source community has demonstrated, with respect to observable quality, that this ethos of improving quality constantly and forever is alive and well in what we see coming out of the programmers - and, more importantly, in the mechanisms by which this is achieved, the open-source participatory model. In my own experience, this is what explains how we achieve, in many cases, software quality that is 100 times better than average.

In the world of open source, when something unexpected happens and the developer chases the bug down and discovers that it lies not in their own software but in a module beyond, the ability to fix that problem where it lies, without polluting one's own code, is the reason that we are not putting so many bugs into our own software. With proprietary software, when the bug lies elsewhere, you can't go to your manager and say, 'I can't trust this database, so let's just give up on the project.'

What you end up doing is putting in a workaround, which maybe over time turns into some legacy functionality on which people start depending. If you think about how the whole enterprise stack is composed - 100 different inter-operating elements, all putting in workarounds because of the bad behavior of the other elements in the system - you get this network amplification of workarounds that creates unmanageable code.

When we look at where we are today, yes, open source is free to acquire by downloading, yes open source has wonderful economics compared to proprietary software in terms of cost model, but the fact that this open-source model can run at such a high level of quality in such a robust manner, with so much interoperability, I think that’s the reason you see companies like Google, other companies who are putting open-source infrastructures together like Amazon.com, able to achieve levels of scale and start making forward progress by simply remediating the existing software because we have developed this model that would have made Deming smile.

Ed: It’s really the quantitative or the empirical proof that all bugs are shallow in that model.

Michael: Yes. The important thing is not only that all bugs are shallow, but the cumulative effect of maintaining a zero inventory of bugs, lean manufacturing, is a groovy thing not just because it saves you money, it allows you to pivot. If lean manufacturing was only about reducing the amount of cash you needed to run a business, all of those books would be really short. Lean manufacturing leads to a comprehensive set of new capabilities such as agility, flexibility, increased opportunities for innovation, etc.

This is the lesson Deming was trying to preach: if you build quality in the first place, every other thing you do contributes exponentially to creating value for the business. What I believe we are seeing in the world of open-source software, is that we have created a platform that has been so reliable for people to then go and do their own innovation, at a rate that makes sense to them, as opposed to constantly slapping their heads, and asking themselves why they, we, ever do it this way, or it will never work, or we can’t move to the new version because of X and we can’t stay on the old version because of Y, all those ranges of impossibility in the realm of proprietary software have been eliminated. The result is that every unit of energy applied to the platform is moving things forward. All bugs being shallow is nice to have, but that’s not the only strategic fact.

Ed: I think that's a corollary to one of the points that you made in your talk about the value in innovation for the customers or the users of open-source software - allowing the value from innovation to accrue higher up.

Michael: Exactly. One of the things some have said is that the threat to an open-source software company such as Red Hat is that, as companies become more familiar with open source, they are more likely to take that capability internally rather than buy more services. My experience has been the opposite, and the reason is precisely the one you just mentioned: the customers' ability to innovate on customer-specific issues is the scarce resource of every company.

When a company has an opportunity to engage in innovation with the cooperation of their downstream customer, the bottom line is that the innovation generates value, and so any time something can be standardized to the point that it can then basically go to a commodity supplier, that's a good trade. You want to trade commodity work for proprietary value. If you observed some of the folks that you met at the Red Hat Summit, you would notice that Red Hat has some fairly smart people, in pretty good quantity, working really hard to ensure the kind of continuity that leads to the stability of the platform that we are providing. And while it is theoretically possible, and while we have had customers who have decided that they are going to go off and build their own platform, those have been a relative minority in number and they often come back after some period of time.

Ed: That makes a huge amount of sense for a company to focus on the business of the business rather than having to become experts in unrelated domains.

Michael: Yes. On my first trip to India, I met with the senior IT manager of a company called the Life Insurance Corporation of India - it's sort of a government, public-private entity, which basically manages pensions for Indians. He told me that a few years before I came to his office, they had downloaded the free Red Hat Linux version 6, got it running and it worked perfectly; they got all their applications on it and those worked perfectly too. I asked why I was there. He responded that he had 2,040 offices across India, and he had proposed that the company should standardize on this version. He said, 'What I discovered was that there were dozens of people who had done virtually the same thing. Except some chose one version, some chose another, some standardized on this component, some standardized on that component, and at the end of the day not one of us can stand up and justify why our platform should be the reference platform. Not only that, if one of us tried to do it, then unfortunately, the way things are, it would be undermined.

We need a neutral third party that we can trust, who will give us the right answer, and who will have a long-term solution that will, you know, provide us the continuity as well as the adaptability and give us that single reference point that will be unquestioned for all our offices’. I said ‘Ah, now I understand why I am here.’

Ed: I think of Red Hat in a role almost as a curator of all this intellectual property, and acting as the central arbiter.

Michael: If he were just a random insurance broker with one little office, 200 people and 10 creaky servers running in one office building, he would not have needed Red Hat nearly as much. When he said, 'We need a reference platform,' I wanted to give you not just a logical story but an anecdotal one as well, to address the question of how the value proposition actually scales with adoption: it becomes more valuable the more people use it, not less.

On to the future: 10 years ago I was part of the Red Hat analyst tour where we basically gave our 10-year predictions to Gartner, IDC and the Forresters of the world. What we predicted 10 years ago was that the operating-system horse race would narrow down to only two horses: Microsoft and Red Hat. And for all intents and purposes, that's what has happened.

What I have come to believe over the last couple of years is that 10 years from now the dominant question will be, 'What is the future of the cloud?' In my view, there are going to be two kinds of clouds 10 years from now: public clouds and private clouds. I believe companies like Google are going to be the masters of the public clouds, and people are going to basically say, 'I need my Google subscription and I don't care that all my data is up in the cloud; I'm just one person, nobody cares about me. Google already knows everything about me, so paying the money for the subscription is no skin off my nose.' In the case of the private cloud, I think people are going to be looking for the most neutral, the most authentic and the most functional components from which to compose their private clouds. I think you saw a glimpse in Boston during the Red Hat Summit of how Red Hat is aiming its business to be the preferred supplier of cloud foundations for what will be all of the private clouds that are run.

Ed: The cloud has a lot of the similar characteristics to the role of open source, and the cloud itself has a number of really disruptive characteristics. What are your thoughts on how the open-source paradigm may extend beyond traditional infrastructure?

Michael: I believe that in 10 years the fundamental concept of an application will be a momentary coordination of resources that exists for the purpose of giving the answer to the question asked. If you underline the word momentary: when you are done asking the question, the application actually goes away.

For example, when our sales VP is running his daily, weekly or monthly reports and asks, 'How am I performing?', what comes from the database is actually a momentary collection of resources that gives him the answer, which allows him to tell our CEO what the forecast is that day. It is not a monolithic, purpose-built application that must be maintained as an application by the IT department.
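A minimal sketch of that idea, assuming a hypothetical bookings table in SQLite and a made-up question: the 'application' is a throwaway composition of a query, a computation and a formatter, assembled on demand and discarded, rather than a maintained artifact.

```python
import sqlite3
from datetime import date

def answer(question_sql, params=()):
    """Momentarily coordinate the resources needed to answer one question,
    then let them go - nothing here is installed, versioned or maintained."""
    con = sqlite3.connect("sales.db")   # hypothetical database of record
    try:
        return con.execute(question_sql, params).fetchall()
    finally:
        con.close()                     # the 'application' disappears here

# "How am I performing this month?" - composed on demand, then thrown away.
rows = answer(
    "SELECT rep, SUM(amount) FROM bookings WHERE month = ? GROUP BY rep",
    (date.today().strftime("%Y-%m"),),
)
for rep, total in rows:
    print(f"{rep}: {total:,.0f}")
```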

Ed: That is hugely disruptive.

Michael: Let me give you an example of where I have seen this. There is an article in the Harvard Business Review, published, I believe, in April or May 2008, with a title that says it all: Radically Simple IT. It was co-authored by Dr David Upton, who is now Chair of Operations Management at Oxford University, poached from a 10-plus-year career at Harvard Business School. The article is about how Shinsei Bank, formed in the aftermath of the collapse of the Long-Term Credit Bank of Japan and taken over by Ripplewood Holdings, wanted to get the bank back on its feet and in working order as quickly as possible.


The conventional logic on restarting the bank was to spend US$500 million to buy an enterprise banking platform that would take 4 years to create. By the way, US$500 million is a really good deal because I have spoken with people who have spent much more than that and still don’t have it working. Shinsei chose a different approach; they spent US$55 million and had the system working in 12 months.

Shinsei has basically made an environment in which applications are momentary collections of resources that do the job that needs to be done. For example, when I was visiting them in 2006 or 2007, they were telling me about their zero-inventory approach to applications. They have committed, and held, to a rule that no bug goes unresolved for more than 24 hours in IT; they practice zero-bug inventory management. I asked how they were able to do it, and they told me, 'Oh, it's really simple. Instead of having a version 1.0 and a version 2.0, and putting all the risk and all the different things we are going to do into one big version, any one of which can slow us down on that version 2.0, we practice continuous innovation.'

In fact, the most recent posting that I put on opensource.com, called integral innovation - and it's a pun on the term integral - has to do with this concept. You can think of it like calculus: if you make every innovation infinitely small, do it infinitely fast, and deliver infinitely many of them, then the area under that curve is your full rate of integration at zero risk. The reason that you can achieve zero risk is that if any one of those infinitesimals misbehaves, you can throw it out and you are suddenly working again.
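Read literally, the calculus metaphor can be written out as follows; this is a loose illustration of the analogy, not a formula from the interview:

$$\text{total change delivered over } [0,T] \;=\; \int_0^T r(t)\,dt \;\approx\; \sum_{i=1}^{N} \Delta_i, \qquad \Delta_i \to 0,\; N \to \infty,$$

where $r(t)$ is the rate of change delivery and each increment $\Delta_i$ is so small that discarding a misbehaving one costs almost nothing, so the aggregate keeps flowing while the risk carried by any single change tends to zero.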

Shinsei asked if I wanted to see them launch an application, and I said, 'Let me check my calendar; you tell me when you are going to launch the application and I will try to be here.' And he said, 'We do it all the time, just come down to the accident room.' And I said, 'What is the accident room?' And he said, 'Well, we design the system to assume that it will fail. If it doesn't fail, we will simply inject failure into the system, and that ensures that we are constantly protecting our discipline of having the ability to correct any problem within this short period of time; but should the application actually work, then it's a success.' Using this approach, they achieved number one in customer satisfaction and loyalty in Japan within two years' time.

I brought this story back to Red Hat and we sent our IT guys to Japan. Since then, they have begun to practice this methodology inside of Red Hat. At the Red Hat Summit in Boston, I gave a presentation called Doing More Innovation with Less Risk. I had our CIO, Lee Congdon, up on the panel with me, and I introduced Lee by saying that when I came to Red Hat in 2000, the Red Hat IT people and the Red Hat open source hackers, had agreed to a relationship of mutual disrespect.

Because the hackers are doing what the hackers do, and the IT guys think they have all this important responsibility to keep the company running as it grows and grows and grows, I witnessed that disrespect growing into contempt over time. As our own IT people continued to implement proprietary systems, which frustrated absolutely every user, while insulting every engineer, Lee came into that context in 2007, and he presented in my session a graph showing the number of IT deliverables the year he got to Red Hat, and the answer is zero.

We were spending all of our effort remediating broken systems. He has a graph, costs are high, and deliverables are immeasurable. Then we started to implement this approach and we started to have some deliverables, and costs came down, and then our deliverables started ramping up and our costs were falling. We have an extremely compelling graph that has been de-quantified, it is to scale but it has not been given scale, and the result is that we have achieved that concept of doing more with less.

We have begun to build the concept of a platform where applications are not managed as monolithic entities, but rather they are these momentary co-ordinations of resources, and we have succeeded in beginning to unravel the proprietary stacks that we have been hostage to, and we are implementing our own or other open-source technology in our own IT operations, which previous CIOs had basically declared to be impossible. By now you should know that that word is basically red meat to me.

Ed: That’s a challenge; it’s a dare.

Michael: It's even an invitation to the right way to go. Oh, by the way, one of Lee's comments about injecting failure into the system is that we don't need to invest in injecting failure into the system - we get it for free.

Ed: That flips the whole development paradigm on its head. The hackers become your friend, your best friend.

Michael: That’s right. It also makes it immediately obvious that the bright future of cloud computing is practical at a very nuts and bolts IT level. It’s not just about putting a social service into the cloud for a whole bunch of unknown clients to come and buzz around, it’s actually able to perform business functions that do day-to-day boring stuff.

Let me give you one other reference, a book called The Only Sustainable Edge, by John Hagel and John Seely Brown. I know John Hagel because he is a top-dog consultant at Deloitte. They published this book through Harvard Business School Press in 2005, and my comment to all the executives at Red Hat is that this is the best book you will read on the open-source model that never mentions open source.


The authors talk about a whole bunch of edge-related dynamics across a wide range of industries, and it blows my mind that they didn't mention open source. They do acknowledge that they did all their research in 2003, and they might not have had enough confidence in 2003 to bet on a horse called Red Hat as much as they were willing to give credit to their other prime stories. If you look at Moore's Law, Disk Law and Fiber Law, which should be giving us exponential IT capabilities, it's interesting that there has been no Moore's Law for software.

Our CEO found a quote called Wirth's Law, from Niklaus Wirth: 'Software is getting slower more rapidly than hardware is getting faster.' Another way it used to be said in the Valley: 'What Intel giveth, Microsoft taketh away.' What Hagel and Brown propose in their book, which I think was very prescient, is, number one, that from an economic-inputs perspective, any business that can make use of IT in any way should be able to gain an exponential advantage with respect to that IT if they can do it right.

What the authors believed was that virtualization and service-oriented architectures would finally loosen us from the monopoly, the monolithic model, and deliver the power and the promise of IT. What they basically said was that in the 21st century, you are not going to find a business worth investing in that is not using information technology to substantially leverage and develop its future capabilities. I agree with that statement. IT becomes a strategic capability, much the way a great post player becomes a strategic asset on a basketball team: if you don't have somebody who is going to hold the post, you aren't going to the NBA championship. They have basically declared that IT is no longer an option, but at the same time, until we can practice IT and capture the exponential capability that it offers, we are adrift.

The one other credit that I will give to their analysis: as I mentioned earlier, there was a question of whether the code quality of the Linux kernel should be treated as a statistical outlier and thrown away, or whether it should lead to a greater understanding. The thing that blows me away in that book, published in 2005, is how they put forward their hypothesis about how companies, and especially CEOs, are accelerating the rate of destruction of value in the S&P 500.

To demonstrate 50-, 75- and 100-year trends, they took the financial-services industry as a whole, and they said they were going to treat its performance over the last 10 years as a statistical outlier and ignore it. They did not call it outright fraud, but they basically said they believed they were more right about what they were talking about than the results reported by the financial industry - and on that basis their hypothesis stood. How's that for putting it on the table?

Ed: They identified the outlier, that's for sure. That's a really provocative concept, the idea of the momentary collection of resources.

Michael: Or the coordination of resources.

Ed: That pretty much defines a fluid and agile cloud-based service-oriented vision.

Michael: Yes, absolutely. There are other stories that are starting to come to light, and when I talk with some of Red Hat's own strategic customers, their problem statement is the need to cut the Gordian knot. We have evolved these applications which are so rigid, so problematic and so stuck in their version-XY way of doing things that we have reached the absolute limit of what we can do.

If you think the problem is confined to the commercial world, you ought to look at the government-services world. Obviously, the concept of throwing it all away and doing it all over again is something that these guys have even tried from time to time. Look at how many times the FAA tried to rewrite the air-traffic control system, or how the FBI tried to rewrite its information-sharing system and did not get it ready in time for the next threat. There have been all kinds of problems with the old model, but what I think we are finally beginning to see is this super coordination.

Let me give you some other numbers. In 2008, SAP Research published a paper that measured the observable, available lines of open-source code in the world from 1996 to 2007. Using two different collection methods, one said the code base was doubling every 12.49 months and the other said every 12.51 months. In 2008 it surpassed 1 billion lines of source code. Now, if you take that billion lines of source, and you consider that 100x quality multiplier, and you consider that our model is to build highly modular, interoperable systems, the network effect of all those components is geometrically far advanced from what a comparable monolithic model would imply.
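A quick sanity check on what a 12.5-month doubling time implies; the 2010 figure is a purely illustrative extrapolation of the cited rate, not a measured number:

```python
# Rough arithmetic on the doubling rate cited above (rounded to 12.5 months).
doubling_months = 12.5
base_lines = 1_000_000_000     # ~1 billion lines observed in 2008

annual_factor = 2 ** (12 / doubling_months)
print(f"Implied growth: ~{annual_factor:.2f}x per year")

# Illustrative extrapolation from 2008 to 2010 at that rate:
lines_2010 = base_lines * 2 ** (24 / doubling_months)
print(f"2010 estimate: ~{lines_2010 / 1e9:.1f}bn lines")
```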

People say, 'A billion lines of source, that's impossible!' Well, guess what: Fedora version 12 has 205 million lines of source code. Red Hat's experimental and community environment takes the 20% of the code that 80% of the people are really actively using and maintaining, and we are at 205 million lines of code. The amount of functionality that represents is simply extraordinary. The code is applicable to anything, whether we are looking at bio-informatics, statistical analysis, networking stacks or storage, real-time platforms, etc - everywhere you look.


In fact, it just occurred to me that I have now been living in NC for 10 years, so I am starting to think about what I was doing ten years ago. At the time in 2000, the story was that Microsoft was going to be able to work from their two monopoly positions of operating system and desktop applications, and they were going to monopolize embedded systems and they were going to monopolize broadcasts, all these other monopolies were going to happen. Now I’m reading the stories about Microsoft’s record infant mortality - they launched some new service and some new platform and in 45 days they killed it.

Ed: That was the Kin, their mobile phone.

Michael: The mobile phone has gone to Linux. My GPS is running Linux, and my disk array is running Linux, the little network-attached storage thing; the ATMs are starting to run on Linux. Again, it’s the fine-grained interoperability and that ability to maintain quality while extending into new areas. The people who have to be scratching their heads, I think even more than the proprietary software industry, are the venture capitalists, who have a model that they can, number one, predict and, number two, steer the next proprietary success. I think that we are moving into a hyper-organic world with this field of a billion lines of source code that is curated to some degree or another. How that field can be adapted to the next application is the interesting question.

In 1995, there was an article in Fortune magazine about Moore’s cannibal principle. Gordon said that the fundamental idea of the semiconductor technology industry was to take what were previously discrete electronic components, integrate them into a single new device, and then give them back for free or at least for a lot less than they cost as individual components, thus semiconductor technology eats everything, and those that oppose it get trampled.

Now, if you think about it from a software perspective, what is it that takes what were previously discrete software packages (databases, application servers, network-monitoring software, virtualization, you name it), integrates them into a single new platform, and then gives them out for free, or at least for a lot less than what they cost as individual components? The answer? It’s a Linux distribution. Think about this: whether or not anybody cares at all about the Linux operating system itself, it is a little bit like somebody going to the semiconductor industry and saying, ‘Haven’t you got anything new besides transistors and gates?’

That’s not the point. The point is that when you have a platform that has become so all-encompassing, that platform becomes a way of reaching new markets that is totally different than when all you have to reach them with is individual components.

I think that world view, when you connect the concept of constant incorporation into the ‘billion lines of source’ collection with the high degree of modularity, which is immediately transferable to cloud, is another problem for the proprietary guys who have these monoliths that are so heavy that, if we take the cloud analogy literally, you know, those giant collections of rock and ice will never float, right? What we have done is build all these little components, relatively free of defects, all interoperable, all very light; they actually fit into the cloud in an authentic way, as opposed to a whole bunch of people running around with smoke machines.

Ed: That really is the seed of what I expect to be a very robust decade of innovation and, in a sense, I’m expecting we will see the rise of the mom-and-pop application shop, and we could see the giants disrupted orthogonally, the way Dave Kellogg recently described it. This disruption is going to come from thousands or millions of little ideas that have fewer barriers to entry than ever before.

Michael: Let me leave you with this idea to see if you like it or not. I was reading Wendell Berry, who put forward, back in the 1970s, all of the ideas Michael Pollan made so popular in The Omnivore’s Dilemma; Wendell predicted it and wrote about it very eloquently and beautifully. Michael Pollan himself gives credit that Wendell Berry saw all this; Pollan was simply at the right place at the right time and became so popular. Wendell Berry talked about the nature of farming, in which not just every farm is different, but every hillside of every farm is different from the next, and the responsible farmer is the one who knows the right thing to do in each different place. When you apply the industrial logic of turning the whole thing into a big dustbowl and fertilizing it and cleaning it, etc, you fundamentally destroy the productive capacity of that farm, which has happened basically in India and is happening now in the Midwest.

The way this relates to the mom-and-pop application idea you talked about is that I believe we are coming up to the point where the attempt to create a universal, monolithic application has reached its logical conclusion; the dinosaur has reached the point of extinction collapse. In the future, the ability to recognize all of the individual value-generation opportunities, to create all those edge relationships and to collect the value of those constituent relationships that Hagel and Seely Brown talked about is something which not only does the job better but maintains the fertility into the future. There is not just what works better today; there is also what makes it possible to work better tomorrow, and that is something I have a lot of confidence in about the way we do business in the world of open source - that’s what we are protecting.

Ed: It’s sustainable intellectual development.

Michael: That’s right, as opposed to exploitation and destruction.

Ed: And value creation too.

Michael: And that might explain why Microsoft has largely run out of gas.

Ed: There was a great post on the Google blog back at the end of December talking about their views on how open-source systems take longer to cultivate but are far more sustainable. It is exactly the same view, and I think when we look out 10 years, it’s truly nourishing food for thought.

Michael: That is the perfect coda to this conversation: any viable 10-year prediction must take sustainability into account.

Ray Wang, Altimeter Group
Ray brings enterprise-software experience honed over two decades of product management, management consulting, and marketing roles. Prior to serving as a VP and principal analyst at Forrester, he headed up the customer relationship management (CRM) analyst-relations program for PeopleSoft. At Oracle, Ray served in senior product-management roles for both the ERP and CRM product lines. While at Personify, Ray was the marketing chief for a web-analytics startup valued at US$500m.

Before working for packaged application vendors, Ray developed his management consulting and strategy experience at Cap Gemini Ernst & Young, Deloitte Consulting, Detroit Medical Center, and the Johns Hopkins Hospital. He specialized in SAP implementations, general strategy, program management, change management, mergers and acquisitions, and healthcare operations.

Ray graduated from the Johns Hopkins University with a BA in natural sciences and public health. His graduate training includes a master’s degree from the Johns Hopkins University in health policy and management and health finance and management. He is also certified in SAP FI/CO modules, facilitation, and program management office.

The growing importance of IP and content
Ray Wang has been an industry analyst and active commentator in the software industry since his days at Forrester and his move to Altimeter Group. Ray is known for his insightful research on applications and SaaS, and his blog, A Software Insider’s Point of View, is widely read within the industry. Our conversation focused on several longer-term trends, notably convergence, the focus on system integrators delivering IP and the evolution of software providers into information brokers.

Key points

We will see convergence between software companies, system integrators, hardware companies, and even other types of companies. Hardware and software vendors will acquire each other in an attempt to integrate and optimize solutions, which will be delivered as services.

System integrators are starting to deliver their own intellectual property (IP) in order to reduce their dependency on the Oracles or the IBMs, and to have their freedom. System integrators will deliver their own IP in smaller solutions, possibly running on a third-party cloud.

Software companies will emerge to become information brokers. More companies will ‘‘applicationize” content by delivering services from the cloud around their proprietary content. Data providers or those companies with large client bases will leverage proprietary content as a means of forming barriers to entry.

The lines between hardware, software, and systems-integration vendors have blurred. In the past, these three groups partnered; in the future they will compete against each other.

Within 10 years there will be a big privacy attack or cyber-security event that will cause us to be more cautious with personal information.

By 2020 we will see more personalized, specialized types of businesses that scale in smaller markets, and mega corporations everywhere influencing, changing and commoditizing cost structures.

Summary of interview on 30 July 2010.

Full transcript follows

Convergence will take place in many areas of the traditional stack

Conjoining content with applications and delivering as services will provide competitive edge

Convergence enables smaller, vertical applications and business models

Ray Wang transcript
Ray: Let me start with the way the industry is headed, in terms of the technology solutions company.

The first thing is that we will see a convergence between the software companies, system integrators, hardware companies, and even other types of companies today. One of the factors of convergence is that hardware companies are realizing that they are being relegated to the datacenter and because of that they are going to have to do more. They are either going to have to buy a software company or a large SaaS vendor to deliver those kinds of higher-value solutions or they are going to be relegated to putting components together for everybody else. The Dells, HPs, and Ciscos of the world have no choice but to get into software.

The second thing is that system integrators are starting to deliver their own intellectual property (IP). They are trying to reduce their dependency on the Oracles or the IBMs (well, IBM itself is a system integrator), and on the SAPs and Microsofts. They want to have their freedom and we are going to see more and more of that. Small solutions are going to be delivered by system integrators on their own IP, maybe on their own cloud, maybe on somebody else’s cloud. They are going back to retaining their IP instead of just being a deliverer of somebody else’s services.

The third thing happening is software companies are emerging to become information brokers. They are not just delivering a solution but what the cloud is doing is allowing them to aggregate data for prediction, for trends, and for benchmarking purposes. Those trends drive another point where companies must consider if they even need any of these solution providers. We are seeing companies buy out key software assets so they can deliver services directly to the customers without having to use a packaged application. One of the great examples of this occurred on Tuesday. Roper Industries bought a software company that just did supplier networks.

Ed: I did not see that.

Ray: I think it went under everybody’s radar. Nobody paid any attention to it. I think they paid US$250m for a software vendor to deliver stuff. They have everything from student cards and cafeteria systems, to truck-freight matching, to cold-storage facilities. Basically, they bought software so they could be the front end for the supplier system. It’s SaaS-based software in the food industry.

Ed: You are one of the early people to see the increasing role or convergence between the information and software providers. The software provides the delivery vehicle for information, but as you look at where the value resides in the solution, those that have the data are one component. Those that have the business process and technology vehicles for delivery are starting to converge. I also expect this to continue. I think project Dallas at Microsoft is maybe the first of these initiatives as well.

Ray: They are definitely a part of that wave and Intuit is doing something in that role as well. Every smart company or small business starts with Intuit. It would be smart for Intuit to go out and really broker SMB services, not only the services they have but the information and data.

Ed: I was just going to mention that companies like Moody’s, Dun & Bradstreet, Wolters Kluwer and Macmillan, all of these providers of information relevant to professionals, have been exploring ways to “applicationize” their proprietary content as a way to create new revenue streams.

Ray: There is a buy-side and a sell-side for the information coming from the software vendors and that is what is making it exciting in the marketplace. That is the technology picture in 2020. We are seeing this world of convergence that is happening on the enterprise side.

Then there is all this societal stuff happening. What happens in the world beyond social? What do we do? I think what we are starting to see is that right now everybody is open. People are open to testing out new ideas. They are open to sharing their location information, sharing their spend data, and trading off their ability to get free services for information. We are in the golden age of lots of openness and lots of sharing. Instead of free love we will call it free social. People are more willing to say, ‘Hey I don’t care, you can have my information.’

In 10 years I think something will happen. We will have a big privacy attack or some cyber-security event will occur. Not to be pessimistic, but one of these events will cause us to be more cautious with this information. For example, there are companies who are aggregating location information and figuring out behavioral patterns for where individuals are. When we start to realize that we are building the matrix we are living in, I think we will become a little more guarded. People are going to be more restrictive about what and who they are open with, and about what sharing parameters and privacy protections they expect.

Ed: That is a great point. You are already starting to see awareness about the amount of information that Google has gathered and the type of data Facebook is collecting. This vision of content services where you anticipate what an end user or a consumer of information may need depends to a large extent on the service providers being able to place behavior in context.

The challenge is going to be privacy disclosure issues. How much are individuals going to want the world around them to know about themselves? This was information that was never considered to be private because no one was collecting it before.

Ray: Consumerism or consumer technologies are moving into the enterprise and it is a theme that is going to continue. It is a theme we have seen over and over. From PCs making it into the workforce, instant messenger making it in, internet and web-based solutions coming into the market, to mobility and handheld devices, we are going to continue to see how consumerism permeates the enterprise and really changes how people work.

Then there are the new form factors that are occurring, like the Kinect’s motion sensing and the Wii, and the augmented-reality devices that are happening with mobile and location-based services. All of these things are going to change what we view as business applications and how we use them.

Augmented reality is a great topic to look at. It’s in its infancy but it is the thing that will happen maybe three years from now and will be mainstream in 2020. Just being able to take your mobile device, take the camera, take the geo-positioning and spatial information and be able to identify that, ‘Oh, my gosh, that’s my customer and their electrical outlet and line is out.’ How can we help solve that preventively?

Maybe I am in the middle of New York City and I am trying to track down a friend and I don’t know where they are. They have given me their secure, tokenized, secret location-information code, and I know that they are lost and can track them down. These things are going to be part of our daily lives and we are going to make those assumptions. Take the fact that I walked into a Starbucks 16 times in four days, at 3pm to have meetings and at 4pm to work on my own personal stuff. People are going to track this. The owner might be smart enough to say, ‘You have been here long enough, what can we do to offer you a meeting space? What can we do to offer you catering? Let me reserve a table for you.’ As metadata gets better and as all this convergence of information comes together and people can assimilate it, there will be some benefits. We will be living in a police state if we are not careful with this information. We will have to weigh each one of the benefits and disadvantages.
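Ray’s Starbucks scenario is, at bottom, a frequency query over location metadata. A minimal Python sketch of that kind of pattern detection follows; the check-in data, venue names and the three-visit threshold are invented purely for illustration.

# Minimal sketch of the kind of pattern detection Ray describes: counting
# repeat visits to the same venue from location check-in logs.
# The data and threshold here are invented for illustration.
from collections import Counter
from datetime import datetime

checkins = [
    ("Starbucks 5th Ave", "2010-07-26 15:05"),
    ("Starbucks 5th Ave", "2010-07-27 15:10"),
    ("Starbucks 5th Ave", "2010-07-28 16:02"),
    ("Starbucks 5th Ave", "2010-07-29 15:01"),
    ("Grand Central",     "2010-07-29 18:30"),
]

visits = Counter(venue for venue, _ in checkins)
REGULAR_THRESHOLD = 3   # arbitrary: flag venues visited three or more times

for venue, count in visits.items():
    if count >= REGULAR_THRESHOLD:
        hours = [datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
                 for v, ts in checkins if v == venue]
        typical_hour = max(set(hours), key=hours.count)
        print(f"{venue}: {count} visits, typically around {typical_hour}:00")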

Ed: There is huge potential for innovation unlocked by “the internet of things”: the number of sensors and devices that will increasingly be connected to the internet, and the ability to have these context-based services in which users will be able to harness agents that deliver very specialized and personalized experiences to them.

You have called out a potential area for conflict, and probably some negotiation of what the bounds are, in the creeping power of an internet that increasingly collects all types of data about people’s behavior, not just the digital signatures they leave in the world but other behaviors. The idea of the thoughtful entrepreneur who recognizes you being captured in artificial intelligence does create some ethical issues that need to be addressed.

What are the ethics of the digital concierge, the intelligent system that knows who you are? You are comfortable having other human beings know things about you intuitively, but there are going to be some ethical issues about how much you want the machines to know about you.

Ray: There is a lot going on with the Electronic Frontier Foundation focused on digital rights and digital liberties. We are vastly losing our individual protections and rights. Hopefully, in a more educated fashion, we will be able to protect those rights, even while everyone will be looking for opportunities to say, ‘Hey, look, this is for our own good and safety.’

It is going to be one of those things: ‘Do we have sensing networks and cameras the way the British do, or do we have different ways to do that that are secure?’ With the Patriot Act, anyone can come in and get a warrant to figure out any information. That is scary if there is a certain level of abuse. People will learn to be more careful, and there will also be a business and market opportunity for those protecting individual rights and liberties. That is the ethical side of 2020. We have talked a little bit about the societal side; we talked about consumerism and the enterprise. Then there is this question of what will work look like.

There are really four forces of change that are happening right now. The first part is a number of things we can’t control in the macro condition that are always going to be there. Those are the economic insecurities, loss of faith in capitalism, or loss of faith in communism, depending on where you are. Loss of faith in the markets, all those things are going to happen and they are out of our control.

The second thing that is more interesting is really about new ways to work, better tying into the marketplace. In those new ways to work we are starting to see how distributed work environments work. Take something like JetBlue. They do not have call centers. When you make a phone call, every call gets routed to somebody’s house. You need an IP line and a laptop to be able to do reservations for JetBlue. This is pretty cool. On one hand, as a company, you are not bound to a 90-mile radius of employee talent; you can go around the world. For the employee, if you want to work from 3-5 and then work from 9-7 the next day and then not work the day after that, you have the ability to have more flexible work pools because you are dealing with a global pool of talent. This is also a danger because they can find the lowest-cost individual, but you can go on and on. This is what is happening with distributed talent and the distributed work environment.

We are also walking into work with five generations of workers, from people who still use keyboard cutouts on green screens to people who never send an email or use the phone. That in itself is pretty interesting. This has implications for the types of work training, what kinds of workers you can hire and the type of talent that is out there. It all comes into play. You have different issues going on with knowledge and knowledge management, and that makes it a very interesting place as we think about how and why people learn and what is effective.

Then we have this issue right now about different and emerging business models. Think about it this way: at the height of the financial crisis, a company like Google was worth something like US$160bn or US$170bn in market cap. GM was worth US$4bn. Basically, Google could have bought GM for fun. That in itself is just quite amazing. A company that just does search (and search is a very important piece; I am not dismissing it) could be worth more than GM and actually have assets. So business models are changing. At this moment they are being broken up, being re-examined, and people are doing different things.

We see this with laptops: the winning laptop deals are in the US$100 range, or sometimes even below US$100, because instead of selling laptops people are selling services. Products are excuses to sell services now. What does that mean? We have all these different trends emerging and they change how people view business models, what is important, and how to get things done.

As these models continue into 2020, what we are looking at is much more personalized, much more specialized types of businesses that can scale in smaller markets, and mega corporations everywhere influencing, changing and really commoditizing the cost structures. This will occur in the same way we have seen in almost every business cycle, only now with the leverage of technology. Companies may not hire as many employees as they used to, and what that means is that we will have a lot more people looking to start their own businesses and a lot more independents in the market.

Ed: With the declining costs of access to technology and this increased power at the hands of entrepreneurs, what are the ramifications for innovation and value creation in technology?

Ray: The whole notion of knowledge workers that we talked about 10 to 15 years ago is coming to fruition. This is really based on where you see things like the digital divide occur. Mobile devices in emerging markets are really changing everything from payment mechanisms to access to the information and knowledge people need to make decisions, to really helping people plan and forecast. On that level, yes, people will be better informed.

The question is: can they make more money being better informed, and where will those opportunities be? What we would like to say is that we have all these high-tech solutions out there, but honestly I think it is the low-tech solutions that still have a lot of potential.

Ed: The one thing that you can never outsource or commoditize is the relationship between one person and another. That is too difficult to recombine into bits and bytes.

Ray: Here is the thing I tell people all the time: When people get really excited about technology and all of the implications and what a great world we will have, I think about the time when I was in second grade and people said computers will change the world and that we will be able to work 3-day workweeks and all this stuff will be automated. You sit there and say, ‘Yeah, I am working harder than ever.’

I am not sure it’s happening that way. What I do tell people is that if something can be commoditized, if something can be digitized, your job doesn’t exist. Once you figure that out, people will come back and work out which positions and jobs you can create that actually cannot be outsourced and cannot be sent somewhere else. That is where the value-creation shifts will occur.

Ed: That is an interesting point you make. One of the themes we have been exploring is that enabling technologies like open source and infrastructure as a service have lowered barriers and commoditized a lot of low value-add applications and functions in favor of strategic thinking, architecture, understanding business processes and understanding very specific problems, with the result that value moves up the stack into applications, applications that become far narrower or more vertical in focus. Will we see among small- and medium-business applications a phenomenon analogous to what we have seen in the iTunes ecosystem?

Could we see applications that are highly verticalized and very fragmented, potentially giving rise to a lot of mom-and-pop application providers that might formerly have been consultants or VARs and now focus on a very domain-specific application? The implication is that value creation becomes highly fragmented and distributed among different players. I think at the enterprise level there are different dynamics at play, but this is one of the potential outcomes or scenarios that is increasingly possible with these platform-as-a-service offerings.

Ray: Yes, it’s going to happen that way. Just as we talked about what is going to be differentiated in that market, the other place to look at it is if you are a plumber, electrician, or auto mechanic, you will still be around. That cannot be outsourced that quickly. The physical jobs will still be here. That is the thing to think about. How many of those will be around? When we look at job creation numbers and where things are going to end up, those are the things you worry about. How many of these high-thought, high-tech jobs can we sustain? How many of these physical jobs will we have in the marketplace? What does the world do about that? These are huge issues.

Ed: Hopefully, we will start to see some of the high-octane thinkers and quants move from finance back to the hard sciences and to solving some of the big problems of our era: the clean-energy problems and problems in physics, structural and chemical engineering, drug discovery and precision medicine. These are all challenges that can be addressed now with more compute power, and more predictive power, than we have ever had before. You need the human talent behind that too.

Stephen Wolfram, Wolfram Research
Stephen Wolfram is a distinguished scientist, inventor, author and business leader. He is the creator of Mathematica, the author of A New Kind of Science, the creator of Wolfram|Alpha, and the founder and CEO of Wolfram Research. His career has been characterized by a sequence of original and significant achievements.

Born in London in 1959, Stephen was educated at Eton, Oxford and Caltech. He published his first scientific paper at the age of 15, and had received his PhD in theoretical physics from Caltech by the age of 20. Stephen's early scientific work was mainly in high-energy physics, quantum field theory, and cosmology, and included several now-classic results. Having started to use computers in 1973, Stephen rapidly became a leader in the emerging field of scientific computing, and in 1979 he began the construction of SMP - the first modern computer algebra system - which he released commercially in 1981.

Stephen has been president and CEO of Wolfram Research since its founding in 1987. In addition to his business leadership, Stephen is deeply involved in the development of the company's technology and continues to be personally responsible for overseeing all aspects of the functional design of the core Mathematica system.

Stephen has a lifelong commitment to research and education. In addition to providing software for a generation of scientists and students, Stephen's company maintains some of the web's most visited sites for technical information. He is also increasingly active in defining new directions for education, especially in the science he has created.

Building on Mathematica, A New Kind of Science, and the success of Wolfram Research, Stephen in May 2009 launched Wolfram|Alpha - an ambitious, long-term project to make as much of the world's knowledge as possible computable, and accessible to everyone.

The science of information accelerates
Stephen Wolfram is a physicist, software developer, mathematician, author and businessman known for his work in theoretical particle physics, cosmology, cellular automata, computational complexity theory, computer algebra and the Wolfram|Alpha computational knowledge engine. Our discussion touched on the merger of natural language and programming and the growth of data that can be computed.

Key points

Advancements in linguistic input, programming language, and interactive documents will change how humans interact, interface, and create and manage content in systematic ways.

Development of computable knowledge and system design are coupled to automate how questions are answered, how programs are developed and how the two interface naturally.

Computable knowledge rather than process knowledge will result in development that is more natural in a linguistic way.

Summary of interview on 21 July 2010.

Full transcript follows

The traditional computing interface will undergo dramatic change

Advances in knowledge-based computing and linguistic techniques will allow us to do more interesting things with the torrent of data from new sensors and sources.

Algorithms and programs will be created in an automated way by mining the computational universe of possible programs.

Investment focus will be on business drivers as solving business problems becomes easier and more natural for ordinary personnel.

Those at risk include traditional software developer-tool and platform providers.

‘Wolfram|Alpha is 10 million lines of Mathematica code and it would be simply impossible to build it without the symbolic programming aspect of it.’

The rampant growth of data will present opportunities for intelligent computing

New means of developing programs will evolve

Wolfram believes traditional software tool and platform providers are at risk of losing out

Stephen Wolfram transcript
Stephen: The big point is that lots more stuff can become computable. We are used to looking stuff up in a database and getting a number out. Maybe you have lots of plain text that explains what’s going on, but we don’t have the ability to compute from these things. I think the biggest thing I see coming is this knowledge-based paradigm where you are expecting to start with a collection of computable knowledge.

We are used to having computer languages, tools and so on where you are building programs from little pieces, from the ground up. They don’t already have knowledge. You are not starting from a collection of computable knowledge; you are starting from ‘I want to write a program in raw Java or raw C code.’ The thing that is going to be very prevalent is that one expects to start from a platform with a lot of computable knowledge.

One of the things we are seeing, both in the general public-information sector and in the corporate sector, where we are doing lots of work with companies and other organizations that have lots of internal data, is they say, ‘We have all this internal data, but how do we figure things out from it?’ The issue is how you make it computable. Imagine you had a human expert who pored through all this data and was able to use models and methods and so on to figure things out from it; what could you get from this expert? Now imagine automating whatever it is you could ask that human expert. If that is possible, it is a big direction.

One of the things that has happened through the work we have done on Wolfram|Alpha is that we provided evidence that yes, it is actually possible to make a big chunk of knowledge computable, so that you can get expert-level answers to questions based on what is underneath. So there will be things that happen, like the whole field of data curation.

For example, now there is this job title of data curator. We have lots of them in the Wolfram|Alpha organization, and there are going to be lots more data curators in the world. There is a lot of automation you can apply to data curation, but ultimately human judgment is needed to make things solid enough that you can expect to build and compute on them. There is a hope that the general Wikipedia-style crowdsourcing approach will somehow allow lots of curation, at least for public data, to be done by the vast crowds out there. I am pretty skeptical about that.

There are certainly particular kinds of tasks where crowdsourcing can work: street mapping is one place where it seems to be working quite well, and probably extraction of structured data from plain text, where it is a very well-defined thing you are expecting the crowd to do. When it is the general data-curation problem, I am skeptical that it is going to work.

I think another big issue is how we interact with computers, what are our mechanisms? There are different kinds of interfaces, whether it’s the programming language interface, the menu-driven interface, the form-driven interface, or a natural-language type of interface, that people are used to. As we get more of this knowledge-based computing going on, there is the kind of linguistic interface, the kind of natural-language interface to things, and there is going to be a lot more you can do with that. Some of the most interesting stuff is going to happen on the border between natural language and programming languages.

Right now these are two separate things. In a search engine you can type in natural-language keywords; in Wolfram|Alpha you can type in natural-language questions. In both, natural language is very floppy and undefined. You get to ask one question in natural language; you don’t get to build up a whole tower the way you do with a programming language. What I think will happen is the question of how you bring together programming languages, which have evolved only quite gradually over the years, with linguistic input.

We will see a bunch of very interesting things happening where we can essentially tell our computers what to do using natural language and then be shown the precise programming-language-like specification of what the computer thought we told it to do. Then we can take those pieces of programming-language code that have been constructed from our linguistic input and build from them to construct bigger and bigger programs. I am going to guess this will blur the distinction between who is able to program and who is not: who learned all that stuff to become an expert programmer and who is just typing in plain English? That distinction between the users of software and the programmers is, if anything, further apart today than at times in the past. That is something that is coming.

Interactive documents will finally really arrive. There have been a zillion false starts at that over 30 years: multimedia and this-and-that, and then we have the web, and then we have Flash embedded in the web, but you still need a Flash programmer to create non-trivial interactive documents. We have technology going into this space; the computable document format is the ability to have easy-to-create interactive documents, where you don’t just get to say, ‘Here is a plot of what you expect for this or that,’ but you get to have something that is interactive and modifiable.

The merger of all three of these things I am talking about, namely linguistic input, programming language, and interactive documents, is definitely coming. You are giving linguistic input; that is creating something which is specified in a programming language, but that then is able to give you something interactive that could be part of your document. Those are a few things that are coming.

An area where I don’t know exactly what is going to happen is integration versus dis-integration. We have 50,000 apps on the iPhone, maybe it is 100,000 by today, I don’t know. We also have some of these very broad systems like Wolfram|Alpha and Mathematica, where there is a single place you go to get a bunch of stuff. It is going to be interesting to see whether software is really like books, where there are lots of them, you use them for only a short while, you buy them all the time and then they die away, or whether the real value is in consistent long-term tools.

Maybe what will end up happening is that both of those models will exist. Right now, right in these months, we are right at the cusp of all sorts of things happening with respect to that.

Ed: Stephen I thought the concept of a data curator was very interesting, specifically in terms of navigating information. On a macro basis there is increasing relevance for information curators or editors to make sense of the deluge of information.

Stephen: It is more structured. What is data curation? It is trying to figure out how you clean a bunch of raw data that you have collected somehow that has a bunch of messy pieces to the point where there is enough consistency that you can do something automatic with that data.

For example, say you have a list of countries in the world and the number of sheep in those countries. In the original source of data it is going along: 2 million sheep in one country, 3 million sheep in another, and then there is one with 5 sheep. Now you know there are not 5 sheep in a country; there is something wrong there. Wherever that little piece of data came from, there is something wrong. Maybe they left out the word million, or maybe they coded it incorrectly. Maybe the definition of country is kind of crazy and two things should be combined to represent an ordinary country. For this kind of cleaning activity you can have an automated system go and flag places where something is likely to be wrong. It is something we have put a lot of effort into, building these types of automated systems.
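A toy Python version of the automated flagging Stephen describes, using his sheep-per-country example; the figures and the order-of-magnitude rule are invented for illustration and are not Wolfram Research’s actual curation pipeline.

# Toy version of automated data-curation flagging: values that sit orders of
# magnitude away from the median get queued for a human curator to review.
# The figures and the flagging rule are illustrative, not Wolfram's pipeline.
from statistics import median

sheep_counts = {
    "Countryland": 2_000_000,
    "Grazistan":   3_000_000,
    "Woolvia":     5,          # suspicious: "million" probably dropped at the source
    "Pastoria":    1_500_000,
}

def flag_outliers(records, threshold=1000):
    """Flag values more than `threshold` times smaller or larger than the median."""
    med = median(records.values())
    return [k for k, v in records.items()
            if v * threshold < med or v > med * threshold]

for country in flag_outliers(sheep_counts):
    print(f"flag for human review: {country} = {sheep_counts[country]} sheep")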

Ultimately a human is going to have to look at it and say whether this is something interesting about the world, an anomaly, or something wrong with the data, if you are going to build a lot of other things from it. If you are just going to get the one fact and read it in a web page, that’s one thing, but if you are expecting to draw a conclusion or build up plots from that kind of data, it has to be more solid. In that case you need humans, because you need judgment and those types of things.

What we have done at Wolfram|Alpha is probably pretty far out on the curve on these types of things, because we are putting in expert-level knowledge in thousands of domains. We end up needing not just any data curator; we end up needing the world expert in field X to tell us the best research-grade way to compute something. It’s not something where we are foraging the web and ingesting what we find. It is something where you actually have to go and, with some judgment, figure out what the expert information on this particular thing is.

There are other pieces of data curation. As we move towards more linguistic ways to ask questions, there is a whole world of linguistic curation that has to do with taking language as humans use it and being able to codify how that really works. What are all the ways people say such-and-such a thing and what are the kinds of variants that exist? At a very simple level, you have a particular city. What slang names for that city are used that somebody might enter when they are trying to get related information? There are more sophisticated versions when it comes to describing operations.

For example, in the merger of natural language and programming, one of the challenges is describing what a computer should do. If you were to say, ‘I want my computer to arrange my photographs by geolocation and date,’ it is not obvious what it means to arrange photographs by geolocation. Maybe there is slang that says ‘geo-arrange my photographs.’ You have to capture all of these human details of language if you want to make it so a computer can systematically process these things.
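As a sketch of the natural-language-to-program step Stephen describes, the loose request ‘arrange my photographs by geolocation and date’ can be resolved into an explicit, inspectable program; the Python below is only illustrative, and the photo records and field names are invented.

# Sketch of resolving a loose natural-language request into the precise,
# programming-language-like specification the user would be shown back.
# Photo records and field names are invented for illustration.
photos = [
    {"file": "img1.jpg", "lat": 40.71, "lon": -74.00, "date": "2010-07-04"},
    {"file": "img2.jpg", "lat": 51.50, "lon": -0.12,  "date": "2010-06-01"},
    {"file": "img3.jpg", "lat": 40.71, "lon": -74.00, "date": "2010-05-20"},
]

# "arrange my photographs by geolocation and date" resolved into an explicit sort:
arranged = sorted(photos, key=lambda p: ((p["lat"], p["lon"]), p["date"]))

for p in arranged:
    print(p["file"], (p["lat"], p["lon"]), p["date"])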

Ed: What you articulated is that human expertise enhances the technology. You still need that guiding hand of judgment and expertise.

Stephen: There is a huge amplification factor. When we take, say, some medical computation, get the information from the world experts on it and put that into Wolfram|Alpha, then it is there and anybody can use it. It’s a different economic situation. Otherwise, if you want to know that fact, you have to go find one of the world experts, spend your hour with them and pay their fee or whatever it is. Conversely, once somebody has captured that information and put it into a computable form, it is immediately accessible to lots and lots of people.

In a sense this is an old idea. This is the old AI idea, and later the expert-systems idea, of let’s capture human knowledge and make it automatically computable. The only difference now is that it is actually working, which it never did before. This is an old story from the ’60s.

Ed: It seems like there have been a lot of different approaches, through knowledge-management systems, enterprise portals and so on, that ended up being intrusive in order to capture the intrinsic knowledge within an organization. You need that human input to correlate it, as you are saying; the combination of the programming language and the ability to parse a request from natural language has been the missing glue.

Stephen: Several things have happened. Why is the technology we are building for knowledge-based computing possible today when it wasn’t possible before? There are a bunch of different reasons. Some of it is that in order to get fairly fast response times and to do non-trivial computations you need computers that are fairly fast. There is also no real issue now with storing all the data from lots of different places; it is only terabytes, so to speak, and that is easy to store. Again, this is a fairly recent thing.

The prevalence of the web is not so important in the end in terms of ingesting data, because the web is a very messy source of data; you end up having to go to the primary sources rather than just the web. But the web has given people the expectation that many more places have data, and many more people now think about producing data. The existence of the web creates an expectation that data will be transportable in a way that wasn’t there before. When it comes to linguistic processing, in the past it was, ‘Give me a corpus of ten million pages of technical documents about biomedicine.’ That was hard to get in the past. Now it is trivial.

I think what is going to happen next is lots and lots of sensors, lots more sources. It will become very cheap to ingest lots more data of lots more kinds, whether it is at the personal-analytics level of people recording tons of information about themselves, like every keystroke they type, or at the level of being able to track all the food items in a grocery store or whatever else. The cost of sensor technology is going way down, and it is going to be routine to track all this stuff. Then what happens? Well, you have all this data and you have to say, ‘What are we going to do with all of this?’ What makes the data relevant is humans saying, ‘Well, I wonder if this, this and this,’ and you have to be able to translate that ‘I wonder’ into something a computer can use to go look at the data and figure out what it can. So with the torrent of sensor-based data coming, it is really good that this knowledge-based computing and linguistic input is arriving just at the time we are going to have that torrent, because it is what is going to allow us to do something interesting with the data.

The big change is that when you make things computable, when you have a company or a city and the things in it become computable, there is a qualitative change in the way people do things. With the web and search engines, we went from a world where typical facts and knowledge were quite nontrivial to get (it was a high cost to go to the library, talk to a reference librarian or find an expert, so people didn’t do it) to one where it is absolutely routine: you find out about your company or your organization by typing it right into the web. So I can routinely know a certain amount that was very nontrivial to know before. But it is still the case that there is lots of stuff that isn’t just static information; it’s stuff where maybe the question has never been asked before.

The question is how you get such a question answered. Science has led the way because there are all sorts of questions we can answer using the methods of science but most people are not in a position to get them answered for themselves. An early promise of computers was to be able to automate these kinds of things. It hasn’t really come to the common person until now.

These are big areas and the things that I am mentioning are things that I have been involved in one way or another because those are things you can find out from me more than other people. The other things you can find out from anybody.

Another big thing is how programs and algorithms get created. There has been this methodology of software engineering that says you get an engineer and they build things just like they build a bridge: they start from one piece and keep adding things, and eventually you have a bridge, or eventually you have your algorithm or your program for doing whatever it is you want to do.

One of the things that has come out of a bunch of the science I have done is the idea that there is this computational universe of possible programs. Even quite simple programs are useful for things, and that means it becomes feasible to search this computational universe for programs that are useful for your purpose, whether that purpose is cleaning up images, encryption, routing or linguistics. It becomes possible not to have the engineer build the program step by step but instead to have the program be mined from this computational universe of possibilities.
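A toy Python illustration of this program-mining idea: enumerate all 256 elementary cellular-automaton rules as a stand-in for a space of simple programs, score each against a criterion, and keep the best. The scoring criterion here (balanced, well-mixed output) is invented for illustration and is not Wolfram’s actual search.

# Toy "mining the computational universe": enumerate all 256 elementary
# cellular-automaton rules as candidate programs, run each from a single-cell
# seed, and keep the rule that scores best against an invented criterion.

def run_rule(rule, width=64, steps=64):
    cells = [0] * width
    cells[width // 2] = 1                      # single-cell seed
    for _ in range(steps):
        cells = [(rule >> (cells[(i - 1) % width] * 4 +
                           cells[i] * 2 +
                           cells[(i + 1) % width])) & 1
                 for i in range(width)]
    return cells

def score(cells):
    # Invented criterion: roughly half ones (balance) and many transitions (mixing),
    # a crude stand-in for the kind of property one might want from a hash-like rule.
    ones = sum(cells)
    balance = 1 - abs(ones - len(cells) / 2) / (len(cells) / 2)
    transitions = sum(1 for a, b in zip(cells, cells[1:]) if a != b)
    return balance * transitions

best_rule = max(range(256), key=lambda r: score(run_rule(r)))
print("best-scoring rule under this toy criterion:", best_rule)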

That sounds very futuristic, but in our own work on Mathematica and Wolfram|Alpha it is a methodology we increasingly use. There are an increasing number of algorithms that no human built; we found them. We searched a trillion possible programs and one of them was the best with respect to certain criteria, and that ends up being the one we use for such-and-such a function. I think that will happen in a lot of areas, whether in consumer electronics or web infrastructure and so on. There are a lot of places where this type of mined algorithm will become more and more prevalent.

Ed: Michael Tiemann from Red Hat was saying that currently there are about a billion lines of open-source code that are essentially accessible to anybody looking to incorporate them to solve a problem. I think the methodology you have articulated, which can really take advantage of that, is enormous.

Stephen: Right, this is an even more extreme version of that, because you are looking at programs constructed symbol by symbol. We are not looking at fragments of programs that some human wrote; you just enumerate all programs that have such-and-such a structure.

Ed: So it is almost a Monte Carlo type of simulation.

Stephen: Over programs, and then it is ‘guess what, right here we found this nugget of gold,’ and that is the program we end up using for hashing or some such thing. It is a powerful methodology. Another thing I have been much involved in is the notion of symbolic programming. Programming-language evolution has been surprisingly slow: the languages we have today are more sophisticated than earlier ones from a software-engineering point of view, but at the core, the “for” loops and “do” loops and so on, the evolution has been surprisingly slow.

One of the things I have been involved with for 30 years now is this notion of symbolic programming, where you are dealing with a symbolic language in which, instead of just saying we have numbers, we have strings, we have this, we have that, you have a general notion of symbolic data that you are dealing with.

What is important is that all these different types of data, whether it’s graphics, a program itself, some sort of execution history of something or whatever else, become a uniform kind of thing. It is all symbolic data; it’s all symbolic expressions. That means you can have a programming language with a fairly small number of primitives that cut across all these different areas, and that is something people have been pecking away at.
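A minimal Python sketch of the symbolic-expression idea: one uniform head-plus-arguments form represents arithmetic, graphics and programs alike, so the same code can walk any of them. It loosely imitates the Mathematica convention and is only an illustration.

# Minimal sketch of symbolic expressions: one uniform head-plus-arguments form
# for arithmetic, graphics directives and programs alike, so the same code can
# manipulate any of them. Loosely imitates the Mathematica convention.
from dataclasses import dataclass
from typing import Any, Tuple

@dataclass(frozen=True)
class Expr:
    head: str
    args: Tuple[Any, ...]

    def __repr__(self):
        return f"{self.head}[{', '.join(map(repr, self.args))}]"

plus = Expr("Plus", (1, 2, Expr("Times", (3, 4))))                   # a program fragment
line = Expr("Line", (Expr("Point", (0, 0)), Expr("Point", (1, 1))))  # a "graphics" object

def count_heads(expr, head):
    """Walk any symbolic expression uniformly, whatever it represents."""
    if not isinstance(expr, Expr):
        return 0
    return (expr.head == head) + sum(count_heads(a, head) for a in expr.args)

print(plus)                        # Plus[1, 2, Times[3, 4]]
print(count_heads(line, "Point"))  # 2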

Wolfram|Alpha is 10 million lines of Mathematica code and it would be simply impossible to build this thing without the symbolic-programming aspect of it. Symbolic programming is not as widely understood as it should be, and it is an idea that is actually at least 50 years old. It is something that has been understood only very gradually from the point of view of practical software engineering.

Ed: Have you had any thoughts on Google’s acquisition of Metaweb technologies?

Stephen: Well, I mean, I have been friends with Danny Hillis for 30 years or something. Freebase is pretty much Wikipedia, sort of an attempt to mine entities from Wikipedia. I think the main play right now is: can we have a default way of referring to entities on the web that webmasters can tag their websites with? I think that is a fine thing and it is sort of the semantic-web idea as well.

If it gets pushed in the right way, more people will tag that way and there will be an additional amount of structuring. It makes things easier for a search engine: when you know what the input to the search engine was and what link people clicked on, you can say, ‘Well, that was the link associated with entity ID such-and-such,’ rather than just the link that happens to be on so-and-so’s website. To me this is kind of what you do if you don’t have the hood-space, so to speak, to make everything computable. If you could get to the point where you could ingest all of this knowledge and really make it computable, you would have gotten a lot further than having a consistent tagging of the web. It is still useful to have a more consistent tagging of the web, and that’s what I think is going on there.

Important disclosures Analyst certification I, Ed Maguire, hereby certify that the views expressed in this research report accurately reflect my own personal views about the securities and/or the issuers and that no part of my compensation was, is, or will be directly or indirectly related to the specific recommendation or views contained in this research report. In addition, the analysts included herein attest that they were not in possession of any material, non-public information regarding the subject company at the time of publication of the report.

"CL" in charts and tables stands for Credit Agricole Securities (USA) Inc estimates unless otherwise noted in the Source.

Credit Agricole Securities (USA) Inc does and seeks to do business with companies in its research reports. As a result, investors should be aware that the firm may have a conflict of interest that could affect the objectivity of this report. Investors should consider this report as only a single factor in making their investment decision.

EVA™ is a registered trademark of Stern, Stewart & Co.

RATING RECOMMENDATIONS: Buy = Expected to outperform the local market by >10%; Outperform (O-PF) = Expected to outperform the local market by 0-10%; Underperform (U-PF) = Expected to underperform the local market by 0-10%; Sell = Expected to underperform the local market by >10%. Performance is defined as 12-month total return (including dividends). Our 12-month forecast for the S&P500 is 11.3%. Overall rating distribution for Credit Agricole Securities (USA) Inc. Equity Universe: Buy / Outperform - 60%, Underperform / Sell - 40%, Restricted - 0%. Data as of 30 June 2010. INVESTMENT BANKING CLIENTS as a % of rating category: Buy / Outperform - 7%, Underperform / Sell - 8%, Restricted - 0%. Data for 12-month period ending 30 June 2010. Prior to 25 November 2008, Credit Agricole Securities (USA) Inc used an absolute system (based on anticipated returns over a 12-month period): Buy: above 20%; Add: 10%-20%; Neutral: +/-10%; Reduce: negative 10-20%; Sell, below 20% (including dividends). Please note that when an analyst(s) writes about companies that are not under the analyst's coverage ("not rated"), the material presented should not be construed as research but is offered as factual commentary. It is not intended to, nor should it be used to form an investment opinion about the not rated companies.

FOR A HISTORY of the recommendations and price targets for companies mentioned in this report, please write to: Credit Agricole Securities (USA) Inc, Compliance Department, 1301 Avenue of the Americas, 15th Floor, New York, New York 10019-6022.

CREDIT AGRICOLE SECURITIES (USA) INC POLICY: Credit Agricole Securities (USA) Inc's policy is to only publish research that is impartial, independent, clear, fair, and not misleading. Analysts may not receive compensation from the companies they cover. Neither analysts nor members of their households may have a financial interest in, or be an officer, director or advisory board member of companies covered by the analyst.

ADDITIONAL INFORMATION on the securities mentioned herein is available upon request.


DISCLAIMER: The information and statistical data herein have been obtained from sources we believe to be reliable but in no way are warranted by us as to accuracy or completeness. We do not undertake to advise you as to any change in our views. This is not a solicitation or any offer to buy or sell. We, our affiliates, and any officer director or stockholder, or any member of their families may have a position in, and may from time to time purchase or sell any of the above mentioned or related securities. This material has been prepared for and by Credit Agricole Securities (USA) Inc. This publication is for institutional client distribution only. This report or portions thereof cannot be copied or reproduced without the prior written consent of Credit Agricole Securities (USA) Inc. In the UK, this document is directed only at Investment Professionals who are Market Counterparties or Intermediate Customers (as defined by the FSA). This document is not for distribution to, nor should be relied upon by, Private Customers (as defined by the FSA). This publication/communication is distributed for and on behalf of Credit Agricole Securities (USA) Inc in Australia by CLSA Limited; in Hong Kong by CLSA Research Ltd.; in India by CLSA India Ltd.; in Indonesia by PT CLSA Indonesia; in Japan by Credit Agricole Securities Japan, a member of the JSDA licensed to use the "CLSA" logo in Japan; in Korea by CLSA Securities Korea Ltd.; in Malaysia by CLSA Securities Malaysia Sdn Bhd; in the Philippines by CLSA Philippines Inc.; in Thailand by CLSA Securities (Thailand) Limited; and in Taiwan by CLSA Limited, Taipei Branch. Singapore: This publication/communication is distributed for and on behalf of Credit Agricole Securities (USA) Inc in Singapore through CLSA Singapore Pte Ltd solely to persons who qualify as Institutional, Accredited and Expert Investors only, as defined in s.4A(1) of the Securities and Futures Act. Pursuant to Paragraphs 33, 34, 35 and 36 of the Financial Advisers (Amendment) Regulations 2005 with regards to an Accredited Investor, Expert Investor or Overseas Investor, sections 25, 27 and 36 of the Financial Adviser Act shall not apply to CLSA Singapore Pte Ltd. Please contact CLSA Singapore Pte Ltd in connection with queries on the report. MICA (P) 168/12/2009 File Ref. No.: 931318. This publication/communication is also subject to and incorporates the terms and conditions of use set out on the www.clsa.com website. Neither the publication/ communication nor any portion hereof may be reprinted, sold or redistributed without the written consent of CLSA Limited and Credit Agricole Securities (USA) Inc.

© 2010 Credit Agricole Securities (USA) Inc. All rights reserved.


Important notices

27/08/2010

© 2010 CLSA Asia-Pacific Markets ("CLSA").

This publication/communication is subject to and incorporates the terms and conditions of use set out on the www.clsa.com website. Neither the publication/ communication nor any portion hereof may be reprinted, sold or redistributed without the written consent of CLSA. CLSA has produced this publication/communication for private circulation to professional, institutional and/or wholesale clients only. The information, opinions and estimates herein are not directed at, or intended for distribution to or use by, any person or entity in any jurisdiction where doing so would be contrary to law or regulation or which would subject CLSA to any additional registration or licensing requirement within such jurisdiction. The information and statistical data herein have been obtained from sources we believe to be reliable. Such information has not been independently verified and we make no representation or warranty as to its accuracy, completeness or correctness. Any opinions or estimates herein reflect the judgment of CLSA at the date of this publication/ communication and are subject to change at any time without notice. Where any part of the information, opinions or estimates contained herein reflects the views and opinions of a sales person or a non-analyst, such views and opinions may not correspond to the published view of the CLSA research group. This is not a solicitation or any offer to buy or sell. This publication/ communication is for information purposes only and does not constitute any recommendation, representation or warranty. This is not intended to provide professional, investment or any other type of advice or recommendation and does not take into account the particular investment objectives, financial situation or needs of individual recipients. Before acting on any information in this publication/ communication, you should consider whether it is suitable for your particular circumstances and, if appropriate, seek professional advice, including tax advice. CLSA does not accept any responsibility and cannot be held liable for any person’s use of or reliance on the information and opinions contained herein. To the extent permitted by applicable securities laws and regulations, CLSA accepts no liability whatsoever for any direct or consequential loss arising from the use of this publication/communication or its contents. The analyst/s who compiled this publication/communication hereby state/s and confirm/s that the contents hereof truly reflect his/her/their views and opinions on the subject matter and that the analyst/s has/have not been placed under any undue influence, intervention or pressure by any person/s in compiling such publication/ communication. Subject to any applicable laws and regulations at any given time CLSA, its affiliates or companies or individuals connected with CLSA may have used the information contained herein before publication and may have positions in, may from time to time purchase or sell or have a material interest in any of the securities mentioned or related securities or may currently or in future have or have had a business or financial relationship with, or may provide or have provided investment banking, capital markets and/or other services to, the entities referred to herein, their advisors and/or any other connected parties. As a result, investors should be aware that CLSA and/or such individuals may have one or more conflicts of interests that could affect the objectivity of this report. 
The Hong Kong Securities and Futures Commission requires disclosure of certain relationships and interests with respect to companies covered in CLSA's research reports and the securities of which are listed on The Stock Exchange of Hong Kong Limited and such details are available at http://www.clsa.com/member/research_disclosures/. Disclosures therein include the position of the CLSA Group only and do not reflect those of Credit Agricole Corporate & Investment Bank and/or its affiliates. If investors have any difficulty accessing this website, please contact [email protected] on (852) 2600 8111. If you require disclosure information on previous dates, please contact [email protected].

This publication/communication is distributed for and on behalf of CLSA Limited (for non-US markets research) and /or Credit Agricole Securities (USA) Inc. (for US research) in Australia by CLSA Australia Pty Ltd; in Hong Kong by CLSA Research Ltd.; in India by CLSA India Ltd.; in Indonesia by PT CLSA Indonesia; in Japan by Credit Agricole Securities Asia B.V., Tokyo Branch, a member of the JSDA licensed to use the "CLSA" logo in Japan; in Korea by CLSA Securities Korea Ltd.; in Malaysia by CLSA Securities Malaysia Sdn Bhd; in the Philippines by CLSA Philippines Inc. (a member of Philippine Stock Exchange and Securities Investors Protection Fund); in Thailand by CLSA Securities (Thailand) Limited; and in Taiwan by CLSA Limited, Taipei Branch. United States of America: This research report is distributed into the United States by CLSA solely to persons who qualify as "Major U.S. Institutional Investors" as defined in Rule 15a-6 under the Securities and Exchange Act of 1934 and who deal with Credit Agricole Corporate & Investment Bank. However, the delivery of this research report to any person in the United States shall not be deemed a recommendation to effect any transactions in the securities discussed herein or an endorsement of any opinion expressed herein. Any recipient of this research in the United States wishing to effect a transaction in any security mentioned herein should do so by contacting Credit Agricole Securities (USA) Inc. (a broker-dealer registered with the Securities and Exchange Commission) and an affiliate of CLSA. United Kingdom: Notwithstanding anything to the contrary herein, the following applies where the publication/communication is distributed in and/or into the United Kingdom. This publication/communication is only for distribution and/or is only directed at persons ("permitted recipients") who are (i) persons falling within Article 19 of the Financial Services and Markets Act 2000 (Financial Promotion) Order 2001 (the "FPO") having professional experience in matters relating to investments or high net worth companies, unincorporated associations etc. falling within Article 49 of the FPO, and (ii) where an unregulated collective investment scheme (an "unregulated CIS") is the subject of the publication/ communication, also persons of a kind to whom the unregulated CIS may lawfully be promoted by a person authorised under the Financial Services and Markets Act 2000 ("FSMA") by virtue of Section 238(5) of the FSMA. The investments or services to which this publication/communication relates are available only to permitted recipients and persons of any other description should not rely upon it. This publication/ communication may have been produced in circumstances such that it is not appropriate to categorise it as impartial in accordance with the FSA Rules. Singapore: This publication/communication is distributed for and on behalf of CLSA Limited (for non-US markets research) and /or Credit Agricole Securities (USA) Inc. (for US research) in Singapore through CLSA Singapore Pte Ltd solely to persons who qualify as Institutional, Accredited and Expert Investors only, as defined in s.4A(1) of the Securities and Futures Act. Pursuant to Paragraphs 33, 34, 35 and 36 of the Financial Advisers (Amendment) Regulations 2005 with regards to an Accredited Investor, Expert Investor or Overseas Investor, sections 25, 27 and 36 of the Financial Adviser Act shall not apply to CLSA Singapore Pte Ltd. 
Please contact CLSA Singapore Pte Ltd in connection with queries on the report. MICA (P) 168/12/2009 File Ref. No. 931318. The analysts/contributors to this publication/communication may be employed by a Credit Agricole or a CLSA company which is different from the entity that distributes the publication/communication in the respective jurisdictions.

MSCI-sourced information is the exclusive property of Morgan Stanley Capital International Inc. (MSCI). Without prior written permission of MSCI, this information and any other MSCI intellectual property may not be reproduced, redisseminated or used to create any financial products, including any indices. This information is provided on an "as is" basis. The user assumes the entire risk of any use made of this information. MSCI, its affiliates and any third party involved in, or related to, computing or compiling the information hereby expressly disclaim all warranties of originality, accuracy, completeness, merchantability or fitness for a particular purpose with respect to any of this information. Without limiting any of the foregoing, in no event shall MSCI, any of its affiliates or any third party involved in, or related to, computing or compiling the information have any liability for any damages of any kind. MSCI, Morgan Stanley Capital International and the MSCI indexes are services marks of MSCI and its affiliates. The Global Industry Classification Standard (GICS) was developed by and is the exclusive property of Morgan Stanley Capital International Inc. and Standard & Poor's. GICS is a service mark of MSCI and S&P and has been licensed for use by CLSA Asia-Pacific Markets.


Research & sales offices

Produced by

Australia CLSA Australia Pty Ltd CLSA House Level 15 20 Hunter Street Sydney NSW 2000 Tel: (61) 2 8571 4200 Fax: (61) 2 9221 1188

India CLSA India Ltd 8/F, Dalamal House Nariman Point Mumbai 400021 Tel: (91) 22 6650 5050 Fax: (91) 22 2284 0271

Philippines CLSA Philippines, Inc 19/F, Tower Two The Enterprise Center 6766 Ayala corner Paseo de Roxas Makati City Tel: (63) 2 860 4000 Fax: (63) 2 860 4051

USA - Boston Credit Agricole Securities (USA) Inc 99 Summer Street Suite 220 Boston, MA 02110 Tel: (1) 617 295 0100 Fax: (1) 617 295 0140

China - Beijing CLSA Limited - Beijing Rep Office Unit 10-12, Level 25 China World Trade Centre Tower 2 1 Jian Guo Men Wai Ave Beijing 100004 Tel: (86) 10 5965 2188 Fax: (86) 10 6505 2209

Indonesia PT CLSA Indonesia WISMA GKBI Suite 901 Jl Jendral Sudirman No.28 Jakarta 10210 Tel: (62) 21 2554 8888 Fax: (62) 21 574 6920

Singapore CLSA Singapore Pte Ltd 80 Raffles Place, No.18-01 UOB Plaza 1 Singapore 048624 Tel: (65) 6416 7888 Fax: (65) 6533 8922

USA - Chicago Credit Agricole Securities (USA) Inc 227 W. Monroe Street Suite 3800 Chicago, IL 60606 Tel: (1) 312 278 3604

China - Shanghai CLSA Limited - Shanghai Rep Office Room 910, 9/F 100 Century Avenue Pudong New Area Shanghai 200120 Tel: (86) 21 2020 5888 Fax: (86) 21 2020 5666

Japan Credit Agricole Securities Asia BV, Tokyo Branch 15/F, Shiodome Sumitomo Building 1-9-2, Higashi-Shimbashi Minato-ku, Tokyo 105-0021 Tel: (81) 3 4580 5533 (General) (81) 3 4580 5171 (Trading) Fax: (81) 3 4580 5896

Taiwan CLSA Limited Taiwan Branch 27/F, 95 Tun Hwa South Road Section 2 Taipei Tel: (886) 2 2326 8188 Fax: (886) 2 2326 8166

USA - New York Credit Agricole Securities (USA) Inc 15/F, Credit Agricole Building 1301 Avenue of The Americas New York 10019 Tel: (1) 212 408 5888 Fax: (1) 212 261 2502

China - Shenzhen CLSA Limited - Shenzhen Rep Office Room 3111, Shun Hing Square Di Wang Commercial Centre 5002 Shennan Road East Shenzhen 518008 Tel: (86) 755 8246 1755 Fax: (86) 755 8246 1754

Korea CLSA Securities Korea Ltd 15/F, Sean Building 116, 1-Ka, Shinmun-Ro Chongro-Ku Seoul, 110-061 Tel: (82) 2 397 8400 Fax: (82) 2 771 8583

Thailand CLSA Securities (Thailand) Ltd 16/F, M Thai Tower All Seasons Place 87 Wireless Road, Lumpini Pathumwan, Bangkok 10330 Tel: (66) 2 257 4600 Fax: (66) 2 253 0532

USA - San Francisco Credit Agricole Securities (USA) Inc Suite 850 50 California Street San Francisco, CA 94111 Tel: (1) 415 544 6100 Fax: (1) 415 434 6140

Hong Kong CLSA Limited 18/F, One Pacific Place 88 Queensway Hong Kong Tel: (852) 2600 8888 Fax: (852) 2868 0189

Malaysia CLSA Securities Malaysia Sdn Bhd Suite 20-01, Level 20 Menara Dion 27 Jalan Sultan Ismail 50250 Kuala Lumpur Tel: (60) 3 2056 7888 Fax: (60) 3 2056 7988

United Kingdom CLSA (UK) 12/F, Moor House 120 London Wall London EC2Y 5ET Tel: (44) 207 614 7000 Fax: (44) 207 614 7070

CLSA Sales Trading Team Australia (61) 2 8571 4201 China (Shanghai) (86) 21 2020 5810 Hong Kong (852) 2600 7003 India (91) 22 6622 5000 Indonesia (62) 21 573 9460 Japan (81) 3 4580 5169 Korea (82) 2 397 8512

Malaysia (60) 3 2056 7852 Philippines (63) 2 860 4030 Singapore (65) 6416 7878 Taiwan (886) 2 2326 8124 Thailand (66) 2 257 4611 UK (44) 207 614 7260 US (1) 212 408 5800

© 2010 CLSA Asia-Pacific Markets ("CLSA"). Key to CLSA investment rankings: BUY = Expected to outperform the local market by >10%; O-PF = Expected to outperform the local market by 0-10%; U-PF = Expected to underperform the local market by 0-10%; SELL = Expected to underperform the local market by >10%. Performance is defined as 12-month total return (including dividends). 14/09/2010

CLEAN & GREEN™

At CLSA we support sustainable development.

We print on paper sourced from environmentally conservative factories that only use fibres from plantation forests. Please recycle.

CLSA is certified ISO14001:2004