

Does Speed Matter?

Eli Noam

Columbia University

Congratulations to us. We have made great progress in access networks

for individuals and households. We have moved, in a few years, from the

kilobit stage to the megabit stage. Technical progress for network

throughput has been increasing at the rate of Moore’s Law, as I will

show. And now the question is: what is the next next? Are we

moving to a gigabit speed? Do we need a gigabit speed? Or are we

approaching a level at which extra throughput has only small benefits,

just like extra speed on an automobile?

The first and most obvious answer is that although everybody calls it

‘speed’, this is really a misnomer for data transfer rate, or bit rate per

second. Other people call it ‘bandwidth’, which is equally imprecise. It

uses an analog concept for digital. However, since everybody is using it,

I will also talk about speed as a shorthand.


So the question is, are we reaching a saturation point, at which most

people don’t need much more, just like most people are happy with a

regular car and do not need a big truck to drive around, even if it were

affordable.

This is one school of thought, which believes that we will be reaching

saturation. Let’s call them the “network skeptics.” The other school of

thought believes in “if you build it they will come”, a famous line from

the baseball movie “Field of Dreams”. Let’s call them the “network

visionaries”.

Who is right? History is full of mis-prognostications. There were the

visionary mis-prognostications with the wrong decision taken – a Type

1 error—or a skeptical misprognostication, with the right decision not

taken – a Type 2 error. Type 1 is excessive credulity, Type 2 excessive

skepticism. Type 1 errors were the satellite mobile phone systems

such as Iridium or Teledesic. A Type 2 error was the belief by AT&T and

the McKinsey consultancy that by the year 2000 there would be only 1

million mobile subscribers in the world.


So what is it?

The answer has implications for operators, because they need to plan

which networks with many billions of capex they must push.

The answer has implications for investors, who need to judge which

horse to back in a race between different types of platforms.

And it has implications for regulatory policy, both in terms of national

broadband plans – what to fund and incent, and how to protect

competition and access policy.

There are several ways to peek into the future. The first is the historical

approach. What have been the broad trends?

The trend of technology has been a pronounced annual growth rate.

This chart is exponential, and shows acceleration, not a slowing down.

Its average growth rate after 1940 is 34.7%, and still accelerating. For

the period starting in 1850, it’s 18.9%.
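To put these growth rates in perspective, one can compute the doubling times they imply. A minimal sketch in Python, using only the two rates cited above (the function name is my own, for illustration):

```python
import math

def doubling_time(cagr: float) -> float:
    """Years needed for capacity to double at a given compound annual growth rate."""
    return math.log(2) / math.log(1 + cagr)

# The two growth rates cited in the text
for label, cagr in [("since 1850", 0.189), ("since 1940", 0.347)]:
    print(f"{label}: {cagr:.1%} CAGR -> doubles roughly every {doubling_time(cagr):.1f} years")
```

At 34.7% annual growth, throughput doubles roughly every 2.3 years, which is indeed in Moore's Law territory.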


This is technology of pipes, not applications or economics. We have

airplanes flying at over 2 times the speed of sound, but they are not

economical for tourism. We have flown at the same speed for decades now, plus

the extra time for security. So we have to translate technological rates

into prices and applications, too. If we do that, we see how the price

of bit distribution drops exponentially in the marketplace.


These are broad trends over a century and more, and there is no reason

to assume that they will somehow end or slow. The transmission

technology of fiber, in particular, still has a huge upside. For a single

fiber strand, the theoretical capacity is about 25 terabits per second. In

the laboratory, about 15.5 terabits per second have been achieved. That's 15.5

million megabits per second, and one can easily bundle several fiber strands

together for transmissions in the petabits. Therefore, we will continue to see

bit transport increase in capacity and drop in price.


That’s on the supply side. And what about the demand side? This goes to

the question of applications. As bits become cheaper, people use more

of them. They do so by increasing the time they are connected, and by

increasing the ‘bit-richness’ per time unit. This, too, is a historic trend.

This price decline in transmission means that we can afford to “enrich”

the informational content of media, measured by the bit content, per

second, of media consumption.


As one can see, the bit-richness of media has continuously gone up.

According to my estimation, it has gone up at a CAGR of 6-8%.

And where will it go next?

The skeptics believe that we have enough at maybe 10 Mbps. But they

do not do the math. Applications will continue to rapidly grow in their

needs for speed. Let’s do a back-of-the-envelope calculation. An HD

quality TV picture has 1,080 lines of 1,920 pixels each, i.e. about 2

million pixels. 3 primary colors are required for each pixel at 8

bits/color. 60 frames per second is the TV standard. This means that

such HD TV requires 3 Gbps of speed, plus some for audio. A household

will realistically require a second and third channel for other

simultaneous uses such as TV watching, games, or channel surfing by

other members of the household, or by multi-taskers. This would mean

a transmission speed requirement of about 10 Gbps, not 10 Mbps.

Compression reduces this, of course, maybe by a factor of 100, and one

can reduce the frame rate to 30. This would bring down the required bit

rate to 50 Mbps, but at the expense of quality and latency. (Latency is

important for multi-player games.)
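The arithmetic above can be checked in a few lines. A sketch in Python, using the text's own assumptions (the helper function is mine, for illustration):

```python
def video_bit_rate(width, height, colors=3, bits_per_color=8, fps=60):
    """Uncompressed video bit rate in bits per second."""
    return width * height * colors * bits_per_color * fps

# Uncompressed HD: ~3 Gbps, as in the text
hd = video_bit_rate(1920, 1080)
print(f"Uncompressed HD: {hd / 1e9:.1f} Gbps")

# Three simultaneous channels per household: ~10 Gbps uncompressed
household = 3 * hd
print(f"Household, uncompressed: {household / 1e9:.0f} Gbps")

# With ~100x compression and a halved frame rate (30 fps), as the text sketches
compressed = video_bit_rate(1920, 1080, fps=30) * 3 / 100
print(f"Household, compressed: {compressed / 1e6:.0f} Mbps")  # ~45 Mbps
```

This confirms the ballpark: roughly 3 Gbps per uncompressed HD channel, about 9-10 Gbps per household, and on the order of 50 Mbps after compression and frame-rate reduction.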


And this is not the end for speed requirements. None of the above is

extravagant. But now, let’s look at the visionary scenario. With TV

screens becoming flatter, bigger, and cheaper, the pixel density will have

to grow just to maintain sharpness. The next generation of TV

resolution – 4K – has about 12.7 million pixels. (NHK’s Ultra HDTV

standard has about 16 times as many pixels as regular HDTV, which

would bring it up to 32 million pixels). There are 3 colors per pixel, and

they will require an increase to 16-bits color to deal with the greater

sharpness. The frame rate will be at least 60 frames per second, and

more likely 72 or more. This adds up to 44 Gigabits per second. Or

three times as much for NHK’s standard. To create three-

dimensionality – which is now spreading to television – requires a

doubling, at least. Two-way interactivity – with up-channels, not just

down-channels – doubles this again. Superior audio such as 5.1 or 7.1

and superior sound sampling will also require much more bandwidth.

Adding all this up, back-of-the-envelope style, results in a transmission

requirement of about 200 Gigabits per second. Three such channels per

household would bring it to over half a Terabit. Using NHK’s UHDTV

resolution raises this to 1.8 Terabits.
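The same kind of sketch works for this visionary scenario, reproducing the multipliers given in the text (all figures are the text's own; variable names are mine):

```python
# Next-generation "4K" figures as used in the text
pixels_4k = 12.7e6                       # pixel count cited in the text
per_channel = pixels_4k * 3 * 16 * 72    # 3 colors, 16 bits/color, 72 fps: ~44 Gbps
per_channel *= 2                         # three-dimensionality: at least a doubling
per_channel *= 2                         # 2-way interactivity: doubles it again
household = 3 * per_channel              # three simultaneous channels per household

print(f"One channel:  {per_channel / 1e9:.0f} Gbps")  # ~176 Gbps, "about 200" with audio
print(f"Household:    {household / 1e12:.2f} Tbps")   # over half a Terabit
```

Before the doublings, a single channel comes to about 44 Gbps; with 3D, interactivity, and richer audio it approaches 200 Gbps, and three household channels land above half a Terabit, as stated above.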


Obviously, all of these numbers will be squeezed by compression and

other techniques. But this is the reference point from which engineering

must artfully whittle bits away to fit the narrower channel. Even if we

compress and reduce bandwidth by a huge factor of 1000, it would still

require 600 Mbps per household.

And then there is another next generation beyond 4K. 4K TV

does not include the future TV of immersion. Such TV would permit a

user to participate, like a game player or the user of virtual worlds, in

visual action, and be inside the imagery with its visual and auditory

action. Such a kind of TV is a logical extension of video games, and it is

entirely predictable and inevitable. In terms of bit requirements, it

would be another 16 times expansion. This would bring it,

uncompressed, to 30 Terabits per second.

From today’s perspective there might not be much of an imminent

consumer demand, even for 4K. But that’s what people also said when

color supplanted black-and-white, or when 1080 lines of HD doubled

the 525 lines of NTSC, or when DVDs replaced videotapes, or when cable TV

introduced 12 channels instead of the four or five over-the-air signals.


Users get used to higher quality and quantity almost immediately and

never go back.

So my first conclusion is, there is a huge upside to speed requirements.

Obviously, supply will sometimes be ahead of demand; it’s a chicken-

and-egg situation. But that’s not a long-term issue. Just a short time ago

most internet video ventures collapsed because their speed

requirements were ahead of the user base. Today, Netflix has overtaken

Comcast as the number one video subscriber service in the U.S. and

maybe the world. Online video is everywhere. Try to take it away and

expect a rebellion. Try to deny it to some people and expect a revolt.

Yesterday’s vision becomes today’s commonplace, tomorrow’s

entitlement, and the day after tomorrow’s human right. In other

words, speed is not just a technological and economic issue, it’s also a

political one.

This is important to understand, because it has direct implications for

the various national broadband plans that are being unveiled around

the world. I will address this now, but for reasons of time I will focus on

only one issue – the role of wireless.


Or, more specifically, when it comes to broadband, is wireless a

substitute for wireline? Will 4G become the substitute for broadband?

Many people believe that wireless will solve competition, it will solve

the rural problem, it will solve the budget deficit.

Just about everyone wants the answer to be yes. Governments, because

this makes it possible to claim success in bringing broadband to rural

areas, and in creating competition. Telecom providers, because it will

reduce the regulatory burden on them, and also give them hopefully

more spectrum. Startups and technologists like it, because it provides

opportunities for new things. There are not many skeptical voices out

there.

Last March, the Federal Communications Commission in Washington,

with major fanfare, presented its National Broadband Plan (NBP).

370 pages. Good information. Good insights. We at CITI contributed a

commissioned report to it, on investment plans by companies.

So if I ask the Americans in the audience, what is the central point of the

plan? People will usually tell you, the central point is the ‘100 square’,

100 megabits to 100 million people by 2020.


But no: the key operational aspect is the goal of bringing broadband to

rural areas by wireless.

The idea is to liberate 500 Megahertz of spectrum, mostly from the TV

broadcasters, to auction it off to providers of 4G – presumably mobile

telecom companies, who are cheering this on, because it would more

than double their spectrum – and to use the proceeds to create

broadband connectivity for unserved areas and people. Spinning the

gold of broadband networks out of thin air, at no cost to taxpayers.

Let’s look at the elements of this program. First, will it happen? Second,

will it do the job of spreading broadband? For reasons of time, and

because the issue is very America-specific, I cannot elaborate on the

first question, except to summarize my analysis:

a major struggle with broadcasters in the name of broadband internet

will result in generating only $8.2 billion towards infrastructure, all of it

for 4G wireless, and providing only a modest pipe.

The second question to address is whether the mobile wireless

approach will do the job of creating broadband for rural areas and

people. And here, too, I am skeptical.


My conclusion is not based on lack of knowledge about the wireless

medium. Having been a long-time wireless enthusiast as a licensed radio

amateur, Advanced Class, I’ve operated mobile radio transmitters and

receivers before mobile wireless became a consumer product.

4G wireless would realistically reach speeds of only about 3 Mbps –

ignore all of those ‘up to’ speed projections in the press that are

generated by corporate PR machines. This is only a fraction of

wireline speeds. In comparison, fiber today supports 150 Mbps,

and can easily be upgraded to gigabit speeds as demand emerges.

Cable’s DOCSIS 3.0 modem service runs at over 50 Mbps and can readily

reach 200. Even DSL, using slightly improved telephone networks, can

reach in newer versions over 20 Mbps. In other words, fiber and cable

are 50-100 times as fast, and DSL is about 7 times as fast, and they have

a decent headroom to expand speed.

Second, if millions of people were to stream movies over wireless, the

networks would come to a crawl. Let’s do a simple back-of-the-envelope

calculation. Suppose we succeed in liberating 300 Megahertz of new

spectrum for 4G. Each cell site could use one sixth of these


frequencies without interfering with its neighboring cell site. (The

allocation of non-interfering frequency groups is one of the main

principles of cellular technology). Furthermore, duplex (2-way)

communications would halve the channel in each direction. This would

translate -- using a translation of 2 bits per second per each hertz -- to a

2-way pipe of 50 Megabits per second. This pipe would be shared by

the several companies providing the service in the area of the cell site. It

would also be shared by the users in the same cell site. Therefore, any

time that more than 10 people try to use the cell site at the same time,

the average speeds (combining uploads and downloads) would drop

below 5 Mbps (50 Mbps divided by 10.) And of course more than 10

people would use the cell site if it is the only or main connectivity to the

internet. The only way to counteract this would be by constructing a

large number of additional cell sites, so that the number of pops

(people) per site would drop. But even if there were a cell site for

each single user (less than a pop), the speed, by the above calculation,

would be only 50 Mbps. This is not a matter of better engineering; it’s

physics. Engineering might improve spectrum efficiency and other

elements.
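This cell-capacity estimate can be written out directly. A sketch in Python, using only the figures given above (variable names are mine):

```python
# Back-of-the-envelope cell capacity, with the text's figures
spectrum_mhz = 300   # newly liberated 4G spectrum
reuse_factor = 6     # each cell uses 1/6 of the frequencies to avoid interference
bits_per_hz = 2      # assumed spectral efficiency (2 bits/s per hertz)

# Duplex halves the channel in each direction, leaving a 2-way pipe of 50 Mbps
per_cell_mbps = spectrum_mhz / reuse_factor * bits_per_hz / 2
print(f"Shared 2-way pipe per cell: {per_cell_mbps:.0f} Mbps")

# That pipe is shared by all simultaneous users in the cell
for users in (1, 10, 50):
    print(f"{users:3d} simultaneous users -> {per_cell_mbps / users:.1f} Mbps each")
```

With ten simultaneous users the average drops to 5 Mbps, and even a one-user-per-cell build-out tops out at 50 Mbps, which is the point made above.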


Thus, wireless is not going to catch up with wireline. There are several

reasons for this. First, as much technical progress as wireless is making,

it is not gaining on wireline, which always seems to be roughly two

orders of magnitude faster, and which has recently been growing at an

even faster rate. The growth rate for wireless after 1980 is 51.2%,

which is phenomenal. But it is 34.7% for wireline bandwidth after

1940 – and, as one can see, much faster more recently.

Both numbers are well within a Moore’s Law type progress.


Second, these are engineering numbers, not economic ones. The

problem with wireless is that it has negative economies of speed, while

wireline has positive economies of speed. To double network speed

for wireless, one needs more spectrum and more closely spaced cell sites.

Spectrum becomes more expensive as it becomes harder to vacate, is

more fiercely competed for by companies, moves to less desirable

frequency bands, and requires bigger political and regulatory battles. Cell sites

become more expensive as the easier locations are used, and


landowners become more savvy. On top of the rising costs, these

cell sites now serve fewer people, so average costs rise.

In contrast, adding to the bit rate of fiber wireline mostly requires adding

electronics, and can be done without high transaction costs – just capex.

For both platforms, costs decline with speed at first. They are U-

shaped for wireless: the first units of load are expensive, then cheaper, but

eventually more expensive again. For wireline, in contrast, cost is asymptotic

with speed, at least for a long while.

So for a while we moved down that U-curve of wireless. But I doubt that

this will continue. Spectrum is inherently a limited resource. Not as limited as

people thought, but still limited, and it is a shared resource in which

users collide. I have proposed an economic arrangement – not a property

rights one or a regulatory one – that would raise the price based on

congestion.

It is a waste to use spectrum for fixed applications, outside of low

density areas. Just as it is a waste to use fossil fuel for transportation.

This difference – economies of speed for wireline, and diseconomies of

speed for wireless – is the crucial factor. It means that it makes no


economic sense for wireless to be the substitute for high-speed wireline

when it comes to fixed locations such as homes and offices. It would be a

waste of scarce spectrum. Wireless has its unique uses in mobile and

nomadic applications, or in inaccessible areas, where people would

accept a lower speed. It might also be a very short tail for a wireline

network, more like a cordless phone. But it would not be the alternative

platform to wireline.

As I mentioned, uncompressed HDTV requires a transmission speed of

about 10 Gbps per household. And next generation 4K TV would bring

it up to 0.6 Terabits. This is about 200,000 times as much as the speed of 4G

under normal utilization, and even more if 4G is heavily utilized!

There are other dimensions, too. First, cost to users. Because of the

relative scarcity of spectrum, mobile 4G broadband service would be

more expensive than wireline services, as a way to match demand with

supply. Satellite-based broadband internet, even in its forthcoming next

generation, is still more expensive.

Second, cost to taxpayers. From a taxpayer’s perspective, 4G wireless

coverage would also be more expensive than DSL for large parts of the


country. This is shown by the FCC itself in one of its maps. For the

western and northeastern parts of the country, closing the broadband

gap by means of DSL would be cheaper than with 4G. The basic insight is

that we are not trying to blanket empty areas, but reach households

who mostly already have a wired connectivity of phone or cable. So we

are talking about upgrades, not greenfield construction.

The third problem is the restrictiveness on users. The inherent

limitations of wireless communications mean that their use would be

more restricted and managed by the network operator to keep data

flowing. In other words, the openness of the internet, protected through

rules of net neutrality, which are also important in many countries,

would be harder to sustain in the more limited wireless 4G environment.

Would rural areas accept for long 4G mobile communications as

their broadband platform—at a lower speed, higher price, and with less

openness? At first, it would of course be an improvement for those who

currently have no broadband access at all, and provide competitive

alternatives to others. This would be welcomed with open arms. But

soon, the reality of a second-grade quality of connectivity will sink in.


We should not expect rural areas to sit by and stare into their little

4G wireless laptop screens while their metropolitan brethren enjoy 2-

way, 3D, 4K, 5.1-sound, 6-foot-screen television.

It will lead to alternative market-based or political solutions to upgrade

the service level to match that of metropolitan areas, i.e., to wireline.

Thus, 4G wireless is only a temporary substitute.

Why then not move the national effort to fiber (with possible tails of

coax or fixed wireless), which is future-proof, in contrast to wireless?

The problem is that the Federal budget deficit does not permit the

funding of a national fiber or rural network upgrade initiative. With no

public money to spend, this leaves the government with the fallback to

use an off-budget currency – spectrum allocations— to advance its goals,

and this shapes its preference for the wireless platform.

So we should now engage in debate and constructive dialog about how we

can upgrade rural networks, instead of how we can give them

second-rate connectivity. This does not replace the wireless approach

but adds a strong wireline alternative dimension.


Moving more spectrum to mobile and fixed wireless users is a laudable

goal and deserves support. But it is hardly a national broadband push.

It’s foremost a mobile enhancement. Its main contribution would be to

improve the coverage for every smartphone user in the country to

higher data speeds, to make broadband ubiquitous geographically, and

to create competitive alternatives to the existing cable-telco duopoly.

These are important accomplishments. But they do not solve the rural

broadband problem.

There is no doubt in my mind that within 20 years in the richer

countries virtually all households will use bandwidth well above 200

Mbps. Much of it will be provided on a commercial basis, but some will

have to be generated by a variety of public policies. In 20 years there

will be fiber connectivity pretty much wherever there is copper today,

using the same rights of way, utility poles, and ducts. And people will

then wonder how, 20 years earlier, we thought that 3 Mbps would be

enough. Just as we wonder today how our parents or grandparents got

along on 3 or 4 TV channels.


Is this the skeptical scenario? No. Such skepticism would end up costing

us a high price due to retarded development. Is it the visionary

scenario? Not really. I think that thinking about gigabit networks to the

home is realistic. What is not realistic is to think that one can do so on

the cheap, through wireless.