
A-TeamGroup

An A-TEAMGROUP Publication

Market Data – Methods for Managing Costs in Tough Times

September 2009

Sponsored By:

www.a-teamgroup.com

It’s said that market data is the lifeblood of financial markets. Today, there’s more of it than ever before, and it’s delivered at sub-millisecond rates. It’s an important resource, and one that requires increasingly substantial resources to manage.

But it’s also said to be the second-highest cost item for market participants after payroll. Needless to say, with costs everywhere under scrutiny, even market data can’t escape the attention of the bean-counters.

Market data managers are facing an impossible trade-off: how to do more, with less. Or are they?

This special report examines how to keep the lid on market data costs, even as timely and comprehensive information becomes more important to a market practitioner’s ability to perform.

MarketDataInsight presents

Are you paying too much for market data?

Or paying for content you just don’t need?

Move over to Fidessa’s award-winning market data solution

Powerful. Comprehensive. Affordable.

To find out more about our market data solution and why Fidessa is trusted by more than 24,000 users visit www.fidessa.com or email [email protected].

The Market Data Trade-Off
By Andrew Delaney, President & Editor-in-Chief, A-Team Group

The move toward fast markets is placing renewed emphasis on market data, with algorithmic trading and other applications requiring every tick, delivered in true, low-latency real time. There are more ticks, delivered more quickly than ever, from a wider range of sources than ever before.

Meanwhile, costs – not least the cost of market data – are under scrutiny. Market data – famously, if not in actual fact – is a firm’s largest single cost aside from human capital. Whatever the truth of that claim, market data costs are not trivial, and the current market environment just adds to the urgency of the requirement to control them.

But it’s also true that market data management is a mature art. For many of those charged with managing a firm’s market data resources, this may be the third or fourth recession in which scrutiny has settled on their raw materials. We have been through the era of inventory control, optimization of fee-liable services and the raking over of contracts to wring out every last ounce of market data value.

Is there more that can be done? Apparently so. But no longer can significant savings be wrought from a simple audit. Those kinds of inefficiencies were excised a couple of generations back.

Today’s efficiency maximization techniques involve a real assessment of the markets in which a firm needs to be active, the choice and location of co-location infrastructures, and a true cost/benefit analysis of low-latency connectivity choices. They involve long-term decisions about where data is sourced and hosted, and how it’s delivered. Managed services – long dismissed as outsourcing in disguise – are now taken seriously by firms large and small seeking to reduce the overhead of their market data infrastructures.

The credibility of managed services means that consumers are tapping into new suppliers of market data – from value-added network services providers through to execution management systems vendors, which find themselves managing low-latency market data by necessity and, through their connectivity services, are able to offer services to clients.

The power of the network is being exploited, as multiple services are delivered through the same pipe to a firm’s individual locations. The potential for consolidating network suppliers is growing, helping to reduce the overhead associated with delivering data from a multitude of originators.

At the heart of getting value for money, though, is a concept that hasn’t changed through the generations, even though its application by practitioners has been questionable: ensuring that the consumer gets only the data he, she or it needs. It’s true to say that in many cases, the enterprise is awash with data even if the end-consumers aren’t getting the data they require. Firmly focusing relevant data on the applications that need it is the order of the day. By careful management and sourcing, humans and applications alike can be supported properly without breaking the bank.

Clearly, if it were easy, it would have been done before. Applying markets expertise to data sourcing is a difficult skill, and securing the appropriate mix of delivery methods – through the use of direct and consolidated feeds, and local and hosted services – is a design challenge that requires careful analysis.

One thing is for sure: the current round of market data cost reviews won’t be the last. But it may count as one of the trickiest, with firms having to make their incisions with due care – and to think big picture when looking at new ways of handling important data services.

Today more than ever, market data can claim to be the lifeblood of the financial markets. Handle with care.

Andrew Delaney
Editor-in-Chief

For news, thought leadership and resources on low latency applications, technologies and architectures…

www.low-latency.com
The online resource from A-Team Group

A-TeamGroup


Managing Market Data Costs by Stratifying Services

By Don Finucane, Vice President of Product Management and OTC Data Services, Interactive Data

Market data costs continue to rise rapidly, driven by ever-increasing market data volumes and the communications and infrastructure costs associated with them. To address and contain these rising costs, firms are employing several strategies to make their dollars stretch further. One tactic is to stratify the firm’s market data user base and provide each group with only the data that it needs.

There is a wide and growing spectrum of market data services available to firms that address varying needs across the enterprise, from ultra-low latency direct exchange feeds used to power electronic trading applications, to end-of-day data used for back-office portfolio pricing. The choices include:

• Ultra-low latency direct exchange feeds. The costs of implementing sub-millisecond direct feeds can be significantly reduced by using the vendor-hosted model, which reduces communications, ticker plant maintenance and ticker plant capacity management costs for clients, among other expenses.

• Deployed low-latency consolidated feed. Technological advances in areas such as data compression and multi-core processing help keep communications and processing costs down, and allow firms to use relatively compact and low-cost client-site commodity market data servers to help manage costs.

• Hosted consolidated feed. Delivers compact, bandwidth-efficient, streaming tick-by-tick data for a subset of the consolidated feed data via leased line or secure Internet VPN. With the market data infrastructure hosted by the vendor, and communications costs reduced by user-defined portfolios or watchlists, hosted datafeeds offer a significant opportunity for cost savings for many firms.

• Co-location service. Enables firms to co-locate applications alongside a low-latency consolidated global datafeed at a number of hosting centers in order to reduce latency and save on communications fees.

• Mitigated feeds. For clients who do not need every tick of data, intelligently mitigated feeds are a smart way to receive essential data without “all the noise,” bandwidth and processing costs that full-tick feeds require. Examples include a mitigated options datafeed designed to help reduce bandwidth by over 90 percent compared to the full OPRA feed, or a mitigated equities feed of NYSE data.

• XML data delivery via the Internet. For applications requiring a few hundred requests per second or less, this can be an economical option because it does not require dedicated equipment or circuits and can be integrated quickly.

• Intra-day snapshot data. A flexible portfolio administration tool that can deliver snapshot data every five minutes when streaming real-time data is not a requirement.

• End-of-day data. A data service with a communications interface linking an extensive collection of data to PC and workstation software.

Interactive Data delivers the full range of market data services and has designed them to help reduce firms’ total cost of data usage. The introduction of new hosted market data services, as well as more efficient bandwidth use with a focus on high-performance market data servers and mitigated datafeeds, represents a major step in helping clients to address rising costs.

For instance, Interactive Data’s Client Site Processor (CSP) technology, which was one of the 2008 InfoWorld top 100 IT initiatives, increases performance by 100 percent over the previous version, which itself was significantly faster than the technology of some other providers. The new CSP is also based on relatively compact and low-cost Dell servers to help keep costs under control.

At the same time, Interactive Data uses advanced message encoding techniques, designed to be twice as efficient as the FAST protocol, to deliver its PlusFeedSM consolidated feed data. This technology helps to keep down the costs of delivering data to the customer site.

Meanwhile, an upcoming version of the CSP supports a compact output format that reduces the server’s output for clients by a ratio of about 7:1, which will result in further cost savings for hosted datafeed clients using the firm’s PlusFeed Select and PlusFeed VPN services. Turning to APIs, Interactive Data has recently released support for Java bindings and shortly plans to release an upgraded C++ version. Both releases are designed for easy integration of PlusFeed, helping to reduce long development cycles.

With its market data offerings, Interactive Data’s goal is to offer real-time solutions that are designed to address client requirements and, at the same time, lessen the impact of enormous increases in market data volume and reduce the complexity of implementation to help keep the total cost of ownership down.

Don Finucane is Vice President of Product Management and OTC Data Services at Interactive Data. Interactive Data Corporation (NYSE: IDC) is a leading global provider of financial market data, analytics and related solutions to financial institutions, active traders and individual investors. PlusFeed, Interactive Data’s consolidated low-latency global datafeed, provides data from over 450 sources and exchanges, covering over 6 million instruments. PlusFeed provides global coverage across all asset classes, and extensive Level 2 data for a wide range of global exchanges. www.interactivedata.com

This article is provided for information purposes only. Nothing herein should be construed as legal or other professional advice or be relied upon as such.
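To make the mitigation idea concrete, here is a minimal, generic sketch of interval-based conflation in Python. It is purely illustrative and assumes a simple (timestamp, symbol, bid, ask) tick format; it is not Interactive Data’s implementation, and real mitigated feeds apply far more sophisticated, per-exchange rules.

```python
# Illustrative sketch only: generic interval-based conflation ("mitigation"),
# not any vendor's actual implementation. At most one update per symbol is
# published per interval, trading a little latency for a lot of bandwidth.
from collections import OrderedDict

def conflate(ticks, interval_ms=100):
    """Yield at most one (latest) tick per symbol per interval."""
    bucket_end = None
    latest = OrderedDict()              # symbol -> most recent tick in bucket
    for ts, symbol, bid, ask in ticks:
        if bucket_end is None:
            bucket_end = ts + interval_ms
        if ts >= bucket_end:            # interval elapsed: flush, start anew
            yield from latest.values()
            latest.clear()
            bucket_end = ts + interval_ms
        latest[symbol] = (ts, symbol, bid, ask)
    yield from latest.values()          # flush whatever remains at the end

# Hypothetical ticks for illustration.
raw = [(0, "IBM", 100.0, 100.1), (10, "IBM", 100.1, 100.2),
       (20, "MSFT", 25.0, 25.1), (90, "IBM", 100.2, 100.3),
       (150, "IBM", 100.3, 100.4), (160, "MSFT", 25.1, 25.2)]
published = list(conflate(raw))
print(f"{len(raw)} raw ticks -> {len(published)} published updates")
```

Even this toy scheme shows the principle: the fewer distinct instruments an application follows, and the longer the interval it can tolerate, the larger the bandwidth saving.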

The global markets move fast. Does your data?

Wherever you trade around the world, BT brings you closer and gets you there faster with our suite of low latency connectivity and hosting services.

Closer is Radianz Proximity Solution, BT’s fully managed hosting and connectivity service that places your servers next to exchanges, MTFs and ECNs in New York, London, Chicago, Paris, Frankfurt, Tokyo, Hong Kong, Singapore and Sydney, giving you access to real-time market data from more than 25 trading venues across multiple asset classes in less than 10 milliseconds.

Faster is Radianz Ultra Access, BT’s ultra low latency service utilizing patent-pending technology to connect you to market data and trading in under 1 millisecond for 15 exchanges, MTFs and ECNs in New York, London and Chicago.

BT’s Radianz financial connectivity and hosting services bring together a community of more than 11,000 customer locations around the world, enabling access to hundreds of pre-trade, trade, and post-trade applications covering all asset classes across the entire securities trade cycle.

BT. Bringing it all together


Market Data: Methods for managing costs in tough times – a network-centric view

BT Financial Services Sector Insight

Despite the fact that the technology underpinning the market data sector has changed so dramatically, its way of dealing with that technology has often changed relatively little. Market data vendors and users still tend to use a separate wire to connect to each of the different data sources that they use. If somebody told you that you had to have a separate wire to connect to each of the different people who you wanted to talk to, you’d think that they were insane. The internet is the same – one connection gives you access to the billions of people and services available online.

Not so with a modern dealing room, where you’re still likely to see individual connections to different data sources – maybe to dozens of data sources – each with its own cost. So, if you use a separate cable for each data source that you want to access, there are no economies of scale. Accessing market data therefore costs much more than it needs to.

There are many ways to manage costs, though the aim is generally the same – to bring costs down. In today’s tougher times, where financial institutions have much less money than before, bringing costs down by a serious amount needs to be a bottom-up process, and not just a top-down cost-trimming exercise. The top-down approach is to simply cut down on the data you see on your screen. The bottom-up approach, however, means looking at how the data you use gets delivered.

The rules of economy of scale apply to networks as they do to almost anything else. As in the shipping industry, costs relate to what volume of stuff you want to move and how far you want to move it. The more containers that you can fit onto a ship, the less it costs per container to ship them. In networks, the more megabits the network cable can carry, the less it costs per megabit to move the data.

The solution comes down to a simple concept: sharing. We all share the internet, and by sharing an infrastructure we all gain the benefits of the economies of scale that each of us creates, which makes internet usage cheaper for all of us. We all share telephone networks: when we are not talking to someone, the line is left free for someone else to talk to them. We share the local area network in our office building. So why don’t we share our market data networks?

One classic reason was that you were given no choice but to buy a network connection from each data source. Each data source made you buy into its own network, so you ended up with a separate network connection for each data source that you wanted to access. Lots of reasons were given for this requirement, eg only the data source could understand the technology, guarantee the delivery, understand its proprietary data formats and communications protocols, etc. But as the internet has shown us that data communications is no longer “black magic”, we have become less willing to accept the “if you want it, you do it my way” approach from data sources.

Once a financial institution decides to take strategic ownership of how it accesses market data sources, it can change the shape of its cost structure. Rather than each market data source gaining its own economies of scale by running its own network, the data user gains overall economies of scale by routing all of the market data that it needs down its own infrastructure. Rather than the market data source making a profit margin on its network service, the market data user makes a savings margin on its own network infrastructure. The more data that it puts down its own infrastructure, the greater the percentage that it saves.

Yankee Group recently conducted an analysis of the potential savings financial institutions might realize by accessing multiple services through a single infrastructure. They concluded that overall savings exceeded 50%.

Having multiple services over a single infrastructure is not limited to market data alone. By adding other services – trading, clearing, settlement, market data distribution – the data volumes increase and the economies of scale increase even further as a result.

And this is what is happening around us in major financial institutions in the world of “Unified Communications”. By combining data flows from their front-office and back-office operations, from their retail activities and their institutional activities, and by combining voice and data onto a single communications infrastructure, they are able to generate economies of scale that give them an advantage over their competitors – a reduced cost per data element.

It’s clear that there are massive and unnecessary costs across the market data industry today. The good news is that there is a lot of money to be saved, and unifying communications by consolidating data services onto a single connection is one of the best ways to save.
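To put rough numbers on the economy-of-scale argument, the short Python sketch below compares twenty dedicated circuits against a single consolidated circuit. All prices and capacities are invented for illustration; they are not BT tariffs or Yankee Group figures, but they show how per-megabit pricing that falls with committed capacity translates into an overall saving.

```python
# Back-of-the-envelope sketch with hypothetical prices: one shared
# high-capacity line versus a dedicated line per data source.

def separate_lines(n_feeds, price_per_line):
    """Total monthly cost when every data source has its own circuit."""
    return n_feeds * price_per_line

def shared_line(n_feeds, mbps_per_feed, price_per_mbps_at_scale):
    """Total monthly cost when all feeds ride one consolidated circuit.
    Per-megabit pricing is assumed to fall as committed capacity rises."""
    return n_feeds * mbps_per_feed * price_per_mbps_at_scale

feeds = 20  # assumed number of data sources
dedicated = separate_lines(feeds, price_per_line=2_000)
consolidated = shared_line(feeds, mbps_per_feed=10, price_per_mbps_at_scale=90)

print(f"20 dedicated circuits:  ${dedicated:,}/month")      # $40,000/month
print(f"1 consolidated circuit: ${consolidated:,}/month")   # $18,000/month
print(f"saving: {1 - consolidated / dedicated:.0%}")        # 55%
```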


Roundtable: Methods for Managing Costs in Tough Times

Steve Grob, Director of Strategy, Fidessa
Chris Pickles, Head of Marketing - Financial Markets & Wholesale Banking, BT Global Services
Don Finucane, Vice President of Product Management and OTC Data Services, Interactive Data

How to do more with your market data without hurting performance: our panel of experts offers insights into how to develop a strategy that optimizes the use of market data sources while ensuring high levels of service across the enterprise.

Given today’s stress on high-performance, low-latency delivery, is it realistic for firms to expect to be able to reduce their spending on market data services?

Finucane: It is probably not realistic to expect firms to maintain current levels of service and reduce their spending on market data. That said, there are clearly areas of opportunity to reduce costs, but these are largely offset by rising costs associated with delivering vast quantities of low and ultra-low latency data to electronic trading applications. Firms that are willing to work with alternative providers of market data services – and open to looking beyond the incumbent providers for new and innovative solutions – are likely to find real cost savings.

Pickles: We’re seeing ever-rising market data volumes from existing sources. We’re seeing increased market data volumes as a result of new market regulations requiring increased transparency across more asset classes. We’re seeing increased demand for market data for liquidity management and risk management purposes. And we’re seeing more execution venues as new regulations change the shape of the market and competition across the market. Add onto that the low-latency drive to move all of this data faster, and the result is unlikely to be cheaper if firms continue to approach their market data environment the way that they have traditionally done so. The value of information is being understood more and more as firms are able to process more and more data and turn it into information. For firms to be able to reduce their spending on market data, they need to become much clearer about what they really need, to pay only for what they need, and not to expect to get everything for less and then throw most of it into a garbage can.

Grob: The demands for high-performance, low-latency data delivery are definitely there, but these need to be managed against an overall pressure to contain costs. This provides a real challenge for firms as, at first glance, these objectives would seem to be mutually exclusive. This is one of the reasons why Fidessa has invested so heavily in its global ticker plant technology. What we do is take market data from around 100 different venues and then combine it with news, charting, analysis and other useful data. This is then presented as a single, clean “data surface” to our customers on both the buy-side and the sell-side. The advantage is that they only have to integrate with this surface once and so can enjoy all the benefits of high-performance, low-latency data with the minimum integration costs.

What are the key steps in a review of market data costs?

Finucane: One of the key steps is to have a process to keep track of market data use and also periodically review the need for each specific type of data and service. Firms need to go through the list of data they are receiving and ask, “Do we really need this content or this functionality?” Today’s technology will allow solutions to be tailored to each group.

Pickles: Starting the analysis bottom-up is key – starting at the foundation. Getting your market data environment right when it’s built on an infrastructure that is inappropriate, inefficient and outdated is not really possible, and yet most firms (though not necessarily the largest firms) are still using 1990s approaches to market data today. Twenty years ago, the head of treasury of a major global bank said to a major global data vendor, “I just want you to be two wires out of the wall”, so that his firm could get everything down those two wires. Most firms today are not even trying to get to that point, while larger firms are clearly moving in that direction. They want one infrastructure overall, and not one set of infrastructure per data source. You only achieve that by starting from the bottom up.

Grob: The first step is to segment the different requirements for each business unit. Different types of trading and asset classes will require different levels of minimum acceptable latency and reliability. Different business units, or individual users, may also have different coverage needs. This information can then be used to provide an overall map of requirements. The second step is then to look at how effectively the firm is meeting these needs. Costs are typically broken down into data vending fees (direct from the exchange), transmission fees and connectivity infrastructure. Obviously, one of the easiest ways to reduce costs is to minimise the different communication infrastructure requirements that are involved. In some cases this might involve making some tough decisions between specialised high-performance data vendors and those that offer broader instrument or geographic coverage. Where individual user requirements are defined, the use of a highly flexible, high-performance service, such as that provided by Fidessa, allows companies to tailor each user’s coverage to their specific needs, thereby containing costs.

What types of data incur the highest cost: consolidated real-time exchange data; direct exchange feeds; analytics; news services; research services; other?

Finucane: For firms that do not use a vendor, direct exchange feeds are typically the most expensive to implement. In addition to communications costs, firms must assume the costs of ticker plant maintenance, managing ticker plant capacity, development and quality assurance, building monitoring tools and adding supplemental data.

Pickles: There are two sides, at least, to an answer – one relating to the cost of the content and the other relating to the cost of the infrastructure. In terms of content, the simple answer is that the information that has the highest added value on top of the raw data has the highest cost. That can be research services, for example, at a content level. Consolidated exchange data can be expensive because of the amount of valuable content, but also because of the geographical and technical complexity of achieving that consolidation, and also because of the considerable value-add that comes from rationalising all of that data that gets output originally in different proprietary formats over proprietary protocols. Direct exchange feeds can be low cost in the exchange’s home country and incredibly expensive if you are located in another continent – particularly if an exchange forces its customers to use the exchange’s private network. And today we have to consider the sheer volume of data that increases the cost of the technology that you need to receive it – an example is the recent forecast from OPRA that its direct users will each need capacity of over 500 megabits per second for its feed.

Grob: Obviously, different venues will have different pricing models for their data and, of course, the analytics and news providers have an even more diverse range of pricing options. The real difference, though, seems to occur on a geographic basis. This is especially true in terms of the cost of putting in reliable and effective comms infrastructure. The Middle East and Russia, for example, are two regions where comms lines are notoriously unreliable and, often, expensive. This is, in part, down to the fact that there simply isn’t the range of comms suppliers in these regions to provide the level of competition usually required to drive down costs and improve service levels. Also, maintaining multiple sources and/or vendors will see costs escalate pretty quickly, so it makes commercial sense to consolidate data, particularly if you can do so through a single data supplier.

What are the best techniques for optimizing market data consumption, both for investment professionals and for computer-based applications?

Finucane: Firms should conduct a detailed analysis of user requirements that includes future growth and expected usage, separate the wants from the needs, and design solutions that meet user requirements in a cost-effective way. This approach works for all services, from workstations to datafeeds.

With Interactive Data’s PlusFeedSM Data Distribution Platform (PDDP) – a high-performance, scalable data management system – users can integrate other internal and external datastreams, as well as distribute a normalized datafeed to thousands of end-users and applications through a comprehensive set of APIs. Users can stream full orderbook data tick-by-tick where required, snap pricing data for those applications which need only limited information, and mitigate throughput on a per-exchange basis to help optimize the overall performance of a user’s infrastructure. While some applications require every tick, others would be fine with mitigation. The PDDP is designed to help those with multiple applications.

Pickles: Usage analysis has to be a core technique for this. Now the industry has vendors delivering data usage analytics services and market data cost analysis services, so there is much less of an excuse for firms to say that this can’t be done. Paying for what you need, and not for what is available to you, is basic good business practice. But looking at the overall cost of the market data business operation should be the real starting point. All costs need to be taken into account – hardware, software, network, personnel, space, power, air conditioning, management time, etc – to see what the consumption of data is truly costing the firm using its existing infrastructure. That way the firm can see which data has high cost with high value and which data has high cost with low value. Changing your network infrastructure for receiving market data can save perhaps 50% of a firm’s technology costs for receiving that data.

Grob: There’s no substitute for understanding the user requirement for data and delivering precisely what they need in the most efficient way. As the requirement varies, so too will the method of delivery. Probably one of the most challenging market data feeds to process is the OPRA price feed that delivers pricing for US equity options. Because of the enormous number of instruments (multiply every US stock by call/put, expiry date, strike price, etc), this creates a feed that runs at up to 2 million messages per second. Most traders don’t want to have to receive and handle the full feed, but rather be selective about the instruments/strikes/etc they want and then ensure they get all data relating to them as fast as possible. Consequently, when Fidessa decided to add support for US equity options, we were faced with significant challenges in terms of how to handle this feed, and adopted a request-based model to satisfy these user requirements. This means that the front end can be configured in such a way as to allow the user to select which contracts he requires real-time updates for. Other prices can then be sent out on demand.
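The request-based approach Grob describes can be sketched generically as follows. This toy Python model (the class and method names are invented, and it is not Fidessa’s code) streams updates only for contracts a client has explicitly requested and serves everything else as an on-demand snapshot.

```python
# Illustrative sketch of a request-based market data model: clients subscribe
# only to the contracts they care about and pull anything else on demand,
# rather than consuming the full firehose. Generic toy code, not vendor code.

class RequestBasedFeed:
    def __init__(self):
        self.last_price = {}         # contract -> latest price seen
        self.subscriptions = {}      # client_id -> set of contracts

    def subscribe(self, client_id, contracts):
        """Register the contracts this client wants streamed in real time."""
        self.subscriptions.setdefault(client_id, set()).update(contracts)

    def on_tick(self, contract, price):
        """Fan a tick out only to clients that asked for that contract."""
        self.last_price[contract] = price
        return [cid for cid, wanted in self.subscriptions.items()
                if contract in wanted]

    def snapshot(self, contract):
        """On-demand price for anything not streamed in real time."""
        return self.last_price.get(contract)

feed = RequestBasedFeed()
feed.subscribe("desk-A", {"IBM 2009-12 C100", "IBM 2009-12 P100"})
print(feed.on_tick("IBM 2009-12 C100", 5.40))   # ['desk-A']
print(feed.on_tick("AAPL 2009-12 C200", 3.10))  # [] - nobody streams this one
print(feed.snapshot("AAPL 2009-12 C200"))       # 3.1, available on request
```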

How important are market data inventory systems to managing market data services?

Pickles: Data needs to be identified, and its users and usage need to be tracked to see what is consumed, where and in what way. Market data doesn’t come free and doesn’t come at a fixed low cost. For years in the UK most homes haven’t had to track their usage of water: they get all that they want for a fixed price. In many other western countries, water is metered and you pay for every drop that you use. In the UK you were really supposed to tell the water company if you happened to have a swimming pool: who would do that if they didn’t really have to? The parallel in the market data industry is the “honesty statement” for reporting data usage to data sources. Market data has value to both the user and the supplier. Another factor with data is that the infrastructure costs money, whether you get value out of the data or not. Moving data around without using it and getting value from it just adds to the unnecessary costs. Market data inventory systems are vital in order to be able to manage your data costs, as well as your infrastructure costs, as effectively as possible. These costs are being borne directly by the user firm, and not directly by the firm’s clients, and overhead and infrastructure costs are getting harder and harder to pass on to clients.

Grob: The objective of market data inventory systems is to try and capture all the relevant information associated with a firm’s market data costs so that these can be analysed on a regular basis. This is useful both within the firm, to audit the use of expensive market data, and externally, to demonstrate the appropriate due diligence and monitoring to market data suppliers.

Such systems are not critical in themselves, but are an important part of understanding costs and mitigating the risks associated with over-use of market data. Neither are such systems foolproof: an application might subscribe to market data, and then republish it to a large number of users in a form which allows the source data to be easily reconstituted. Such indirect usage is difficult to monitor effectively.

While undoubtedly important, these systems also represent a significant overhead in terms of implementation and operating costs. One of the major benefits of a managed data service like Fidessa’s is that it effectively includes a fully managed inventory system and can thus provide reports of what each user has for the clients, exchanges, data sources, etc, and handle all the reporting for exchange fees as well.
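As a rough illustration of the record-keeping at the heart of an inventory system, the Python sketch below registers entitlements and rolls up fee-liable costs per service. The field names are invented for the example; commercial inventory platforms track far more, including contracts, invoices, netting rules and indirect redistribution.

```python
# Minimal sketch of inventory record-keeping: who is entitled to what, so
# usage can be audited and exchange-fee reporting reconciled. Illustrative
# only; field names are invented for the example.
from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class Entitlement:
    user: str           # person or application consuming the data
    service: str        # e.g. "NYSE Level 1", "OPRA", "consolidated feed"
    monthly_fee: float  # fee-liable cost attributed to this entitlement

class Inventory:
    def __init__(self):
        self.entitlements = []

    def grant(self, user, service, monthly_fee):
        self.entitlements.append(Entitlement(user, service, monthly_fee))

    def cost_by_service(self):
        """Roll up monthly spend per service for cost reviews."""
        totals = defaultdict(float)
        for e in self.entitlements:
            totals[e.service] += e.monthly_fee
        return dict(totals)

    def users_of(self, service):
        """Who would be affected if this service were cut?"""
        return sorted({e.user for e in self.entitlements if e.service == service})

inv = Inventory()
inv.grant("trader-1", "OPRA", 120.0)
inv.grant("algo-engine", "OPRA", 120.0)
inv.grant("trader-1", "NYSE Level 1", 45.0)
print(inv.cost_by_service())   # {'OPRA': 240.0, 'NYSE Level 1': 45.0}
print(inv.users_of("OPRA"))    # ['algo-engine', 'trader-1']
```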

How is the increasing acceptance of managed/hosted market data delivery impacting data costs?

Finucane: The rapid adoption of hosted services is having an immediate positive impact on costs. Firms that were once reluctant to host services outside their own data centers are now embracing the idea, as communications and security concerns have been addressed. It is allowing firms to make their market data spend go further and to contain costs that would otherwise be rising faster.

Pickles: It’s helping to bring down costs by helping data users to rationalise unnecessary or inappropriate infrastructure. As an example, you could take twenty data feeds directly from twenty exchanges with twenty different networks and software infrastructures, or you could take one consolidated feed of data from twenty exchanges over a single network and software infrastructure. The cost models that underpin those two scenarios are very different. Managed/hosted market data delivery solutions meet many more users’ requirements than in-house solutions. We also have to remember the “long tail” in this market. In Europe some 20 financial institutions are responsible for around 80% of equity trading volume, with some 2,000 financial institutions handling the remaining 20% of equity trading volume. Different firms handling different volumes and types of business and data have very different requirements, and managed/hosted solutions are particularly relevant to a very large number of firms, just as Software as a Service is for many application users.

Grob: We’re seeing increased demand for managed data services precisely because firms are looking to reduce the operational overhead of managing multiple data sources themselves. While historically many firms sought to retain all of their trading infrastructure in-house, they are increasingly willing to acknowledge the substantial cost benefits that a managed service provides.

Because Fidessa is able to spread the costs of providing low-latency data across a large number of customers, we can bring down the individual cost for each firm considerably and, with more than 20 years’ experience in delivering mission-critical services, our track record gives users the confidence they need.

A-TeamGroup

Read the Latest Headlines and Get a FREE Issue Now! www.market-data-insight.com

With a focus on pre-trade market data, analytics and decision-support, Market Data Insight offers the most advanced analysis of the trading room technology marketplace available today. Market Data Insight explores how new technologies, instruments and players are impacting the business, reshaping distributor and integrator relationships and turning traditional business models on their heads.

Get real insight into:

• Real-time market data
• Specialist news, research and analysis
• Trading room infrastructure

Visit Market-Data-Insight.com to read the monthly features and stories, search our full archives, download a free monthly issue and find out how to subscribe for just £350/US$695 per year. Contact [email protected] or call +44(0)20 8090 2055 for more information.

MarketDataInsight
www.market-data-insight.com

www.sungard.com/globaltrading

MARKETMAP

You need accurate and timely financial information. SunGard delivers it to you.

With its suite of MarketMap terminals, SunGard provides high quality market data, news and analysis from over 160 global exchanges and OTC sources. Very intuitive and deployed in seconds, our ASP solution brings you fully scalable, multi-asset and powerful tools to make the most informed decisions.

Control your costs in real time with the integrated permissioning system, unique today in the market.

Ask today for your free 2-week trial: [email protected].

©2009 SunGard
Trademark information: SunGard and the SunGard logo are trademarks or registered trademarks of SunGard Data Systems Inc. or its subsidiaries in the U.S. and other countries. All other trade names are trademarks or registered trademarks of their respective holders.