
Enterprise Data Management Report

APRIL 2008 • emii.com

Moving Data Downstream
Downstream data integration becomes less of an upstream swim.

Managing a Data Overload
Growing data problems drive financial services firms to reevaluate their information strategies.



Enterprise Data Management Report

Table of Contents

5 Managing a Data Overload
By David Lewis
An ever-increasing volume of data, a good portion of which contains errors, is driving financial services firms to reevaluate their information strategies. The biggest challenge, however, may be getting these firms to buy into the fact that this is an ongoing process that needs to be managed properly.

8 Moving Data Downstream
By Stephen Mauzy
Downstream data integration - the goal of which is to get cleansed, reliable data to the right departments - can be harder than most financial firms think, especially for an industry built on stand-alone and legacy back-office systems. But the goal is definitely achievable through investments of time and money and proper governance.

12 Tomorrow's EDM Solutions Today
By Edward McGann
Investors demand real-time, accurate and, in some cases, enhanced data so they can understand their positions and performance in their investment portfolios. To meet that demand, BNY Mellon Asset Servicing employs not only its powerful technology infrastructure but also its product management and business units to develop strategies and devise solutions to clients' requests.

14 A Critical Year for XBRL
By David Lewis
The SEC's effort to make financial reporting interactive through its eXtensible Business Reporting Language program is on the fast track to adoption, despite opposition from some corporations and investors.

16 One Step Ahead in OTC Derivatives
By Gregory Morris
In an industry facing calls for improvements, The Depository Trust & Clearing Corp already is making them.



www.emii.com

A Publication of Institutional Investor, Inc. © Copyright 2008. Institutional Investor, Inc. All rights reserved. New York Publishing offices: 225 Park Avenue South, New York, NY 10003 • 212-224-3800 • www.iinews.com

Copyright notice. No part of this publication may be copied, photocopied or duplicated in any form or by any means without Institutional Investor's prior written consent. Copying of this publication is in violation of the Federal Copyright Law (17 USC 101 et seq.). Violators may be subject to criminal penalties as well as liability for substantial monetary damages, including statutory damages up to $100,000 per infringement, costs and attorney's fees.

The information contained herein is accurate to the best of the publisher's knowledge; however, the publisher can accept no responsibility for the accuracy or completeness of such information or for loss or damage caused by any use thereof.

VINCENT YESENOSKY
Senior Operations Manager
(212) 224-3057

DAVID SILVA
Senior Fulfillment Manager
(212) 224-3573

REPRINTS
DEWEY PALMIERI
Reprints & Permission Manager
(212) [email protected]

CORPORATE
GARY MUELLER
Chairman & CEO

CHRISTOPHER BROWN
President

STEVEN KURTZ
Director of Finance & Operations

ROBERT TONCHUK
Director/Central Operations & Fulfillment

Customer Service: PO Box 5016, Brentwood, TN 37024-5016. Tel: 1-800-715-9195. Fax: 1-615-377-0525. UK: 44 20 7779 8704. Hong Kong: 852 2842 6910. E-mail: [email protected]

Editorial Offices: 225 Park Avenue South, New York, NY 10003. Tel: 1-212-224-3279. Email: [email protected].

EDITORIAL
ERIK KOLB
Editor of Business Publishing

DAVID LEWIS
Contributing Reporter

STEPHEN MAUZY
Contributing Reporter

GREGORY MORRIS
Contributing Reporter

PRODUCTION
AYDAN SAVASER
Art Director

MARIA JODICE
Advertising Production Manager
(212) 224-3267

ADVERTISING/BUSINESS PUBLISHING
JONATHAN WRIGHT
Publisher
(212) 224-3566
[email protected]

PAT BERTUCCI
Associate Publisher
(212) 224-3890

LANCE KISLING
Associate Publisher
(212) 224-3026

LESLIE NG
Advertising Coordinator

PUBLISHING
BRISTOL VOSS
Publisher
(212) 224-3628

MIKE FERGUS
Marketing Director
(212) 224-3266

Editor's Note

Welcome to the 2008 Enterprise Data Management Report, an update on how financial services firms are addressing the issue of accurate, transparent data and its consistent integration into various applications across their businesses.

Enterprise data management is a relatively new business objective that few firms fully understand. That is why the Enterprise Data Management Report begins with an overview of the concept and the issues firms face as they move through the various stages of implementation. For most, that means starting at the beginning and dealing with an ever-increasing volume of data, a good portion of which contains errors. The biggest challenge, however, may be getting these firms to buy into the fact that this is an ongoing process that needs to be managed properly (see story, page 5).

Next, the Report addresses the issue of downstream data integration, the goal of which is to get cleansed, reliable data to the right departments. That can be harder than most financial firms think, especially for an industry built on stand-alone and legacy back-office systems. But the goal is definitely achievable through investments of time and money and proper governance (see story, page 8).

Beyond that, the Report includes an item on the SEC's effort to make financial reporting interactive through its eXtensible Business Reporting Language program (see story, page 14) and sponsored articles from BNY Mellon Asset Servicing and The Depository Trust & Clearing Corp.

Enterprise Data Management Report is the latest in a series of special supplements produced by Institutional Investor News exclusively for our newsletter subscribers. It is part of our commitment to bringing our readers the freshest news and in-depth analysis on important sectors and timely topics within the financial markets.

Enjoy,

Erik Kolb
Editor of Business Publishing
Institutional Investor News


Managing a Data Overload
Growing data problems drive financial services firms to reevaluate information strategies
By David Lewis

WALL STREET HAS A PROBLEM - a big, fat, ugly data problem. While bad information represents a large part of the problem, another significant issue is that the sheer volume of data has surpassed tidal wave proportions. With data growing at exponential rates, particularly at financial services institutions, the tidal wave of data is more like a series of tsunamis.

The reasons are many. Two of them are the plummeting cost and growing volume of storage. According to BearingPoint, storage costs have fallen 99.75% per gigabyte since 1980, while online storage volume is projected to grow 273% between this year and 2011. Another reason is the continuing rise of online commerce, including 30% annual increases in online auto insurance purchases and an expected 27% annual rise in online banking.

Cost used to be a filter, according to Ed Hagan, managing director and global leader of BearingPoint's Information Management Solution Suite. Now, however, there is no filter. "The new filter needs to be, what is valuable information? What is the most important information to the enterprise? Most financial organizations really can't tell you that," he said.

The answer is a concept called enterprise data management (EDM). "The biggest challenge is that there is this roiling sea of data and, if you jump in just anywhere with your tin cup and start bailing, it's a long and not-so-productive process," Hagan said. "Whereas if you can somehow start to classify this sea of information into areas of primary, secondary and tertiary value, you can prioritize your initiatives and start to look at what applications are most critical to your most valuable information. Then you can take a more structured approach to solving the problem and managing the problem on an ongoing basis."

Roots of the Problem

Part of today's swelling data problem is simply the ballooning volumes of information that afflict all U.S. corporations. Part of it, however, can be attributed to the financial services companies themselves. "The financial services industry is an interesting space," Hagan said. "On many elements, they are leading the charge with cutting-edge practices. But they also have some of the biggest problems, so their pain around these issues is pretty substantial."

Scott Dillman, managing director at PricewaterhouseCoopers, relayed a story that explains how financial institutions have brought some of this burden upon themselves. Not long ago, he and his team audited and 'cleaned' the basic information of a major bank. Lo and behold, about 75% of the bank's accounts contained errors.



There was no intent to mislead; rather, the bank had an information technology problem, Dillman noted. That problem caused it to create an account for 'John Doe' at one branch, an account for 'John R. Doe' at another branch and a 'J. Randolph Doe' money market account, before later adding a 'J.R. Doe' loan.

The potential for error is obvious. So is the point that this method makes it nearly impossible to sort all this out from the 'John Doe and Mary K. Doe' joint account.

The bank's problem clearly is that it has no technology capable of creating a single identity for 'John Doe' with multiple attributes. That means it has no way to provide value-added information such as 'John Doe, who has two joint accounts with Mary Doe, who has a new line of credit and likes to use our supermarket-based branches.' It also means that, potential errors aside, the multiple accounts the bank creates and maintains simply clog the system, making it redundant and costly.
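The missing capability is, at bottom, entity resolution: collapsing name variants into one master identity that carries multiple attributes. The sketch below is a deliberately crude, hypothetical illustration of the idea, not any vendor's algorithm; a production master data management system would score many attributes (addresses, tax IDs, account links) to avoid wrongly merging, say, 'Jane Doe' into the same identity.

```python
# Hypothetical sketch of identity consolidation: collapse name variants
# to one matching key, then gather the separate account records under a
# single master identity with multiple attributes.
from collections import defaultdict

def name_key(raw_name: str) -> str:
    """Crude matching key: first initial plus last word of the name."""
    parts = raw_name.replace(".", "").split()
    return (parts[0][0] + "|" + parts[-1]).lower()

accounts = [
    {"holder": "John Doe",        "type": "checking",     "branch": "Main St"},
    {"holder": "John R. Doe",     "type": "savings",      "branch": "Oak Ave"},
    {"holder": "J. Randolph Doe", "type": "money market", "branch": "Main St"},
    {"holder": "J.R. Doe",        "type": "loan",         "branch": "Online"},
]

master = defaultdict(list)
for acct in accounts:
    master[name_key(acct["holder"])].append(acct)

for identity, accts in master.items():
    print(identity, "->", [(a["type"], a["branch"]) for a in accts])
# j|doe -> [('checking', 'Main St'), ('savings', 'Oak Ave'),
#           ('money market', 'Main St'), ('loan', 'Online')]
```

Four separate records collapse into one identity, which is precisely the consolidated view the bank could not produce.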

But the sins of the financial institutions are vastly compounded by questions of what to do with and how to make sense of the burgeoning heap of numbers and text. According to BearingPoint, e-mail at many financial services firms is increasing 40-60% per year, and at some financial companies email volumes are rising at a rate of 100% annually. Meanwhile, trading volumes on the New York Stock Exchange and NASDAQ have increased 19-fold in the past decade.

Prerequisites to EDM

The answer to the data glut is to be found in a cluster of acronyms - IM, MDM and EDM. Respectively, these are information management, master data management and enterprise data management.

To break those down a bit, information management is the umbrella term for the notion that information must be managed before it can be processed. This has led to a new breed of 'C suite' officer known as the chief data officer or something similar. Yahoo! appears to have been the pioneer in creating this post, followed in the financial space by Citigroup and JPMorgan Chase. While the preoccupation of the typical chief information officer has been information processing - the manipulation of data by particular applications - the job of the CDO is data and only data.

The development of the role of chief data officer or its equivalent is one measure of the growing maturity of data management. "One of the challenges of that role, like any kind of C-level role, is that it is pretty immature at the beginning," Hagan said. "As new leadership in this space starts to emerge, we will see a broader business perspective around dealing with information management."

The goal of data management is actionable information. "If you don't have clean data, you're not going to find that you have real information," said Dillman. "If you don't have real information, you really have no way of knowing whether decisions are being made on a solid basis."

Master data management, in effect, is the backbone that underlies enterprise data management. "The way we look at MDM is as a subset of the whole information management space. Master data management is looking at which pieces of data need to be standardized across the organization," Hagan said. "This is really the backbone of how you move information across the organization."

Historically, every application defined its own master data. "If their ability to pass the information across the organization and their referential data is not consistent, it is a big challenge," Hagan noted.

The idea of common hierarchies brings up the problem of the 'silos' of data. "The financial industry is just like every other industry out there in the sense that it developed its business vertically, silo by silo," noted Michael Atkin, managing director of the EDM Council, an industry trade association. "So you have all of these silos within a financial institution by function, by data area, by geography, by a whole host of things."

Then, as the world changed and no longer could just operate in a vertical business framework, companies had to start looking at their organization horizontally. "All of a sudden, they needed to reconcile all of these data stores that existed all over their organization without coordination and without alignment," Atkin said.

Understanding EDM

Operationally, the key concept to unwinding this Gordian knot is the final acronym, EDM. That concept means just what it says: managing data that is transparent, detailed, relational, accurate and appropriately shared across an entire enterprise.

To elaborate, a 2006 Finextra Research study interviewed 17 chief technology officers, chief information officers and other senior data and business managers of major banks and buy-side and sell-side firms in Europe and North America. The analysis found that the meaning of enterprise data management differed by management position, business type and company size.

Yet, there was broad agreement that enterprise data management is a process required to enable disparate applications and parts of a business to share information. It is driven by a need to promote accuracy, transparency and efficiency in the business, to ease regulatory compliance and improve client service and performance.

Ed Hagan, managing director at BearingPoint

Most interviewees said that, for them, EDM meant capturing, managing and analyzing product, customer, counterparty and operational data at a very granular level. They noted that they had begun to, or wanted to, standardize and manage data centrally and share it across the business.

"EDM is not a system, a technology or a process; it is an objective," Atkin said. "If you understand it as an objective with incremental strategies that deal with your individual challenges, then the goal is to achieve enterprise data management."

Enterprise data management thus includes issues such as data quality, management, governance and architecture. "Governance means this needs to become a broader enterprise discipline, as opposed to something we look at as, 'Here's this problem we need to fix,'" Hagan said. "Particularly in organizations like financial services organizations, the same level of discipline that is put around managing financial assets is not there when it comes to managing information assets. It's really a new dimension by which we need to manage our businesses, as opposed to a problem that we need to fix."

The Finextra study also underlined the criticality of EDM. Asked what were the driving forces behind their enterprise's investment in EDM, 59% of respondents said risk management was very important and the primary driving force. That was followed by compliance, 47%; business growth, 41%; and operational efficiency/cost reduction, 35%.

Atkin noted that corporate investment in EDM beats the alternative. "It is more expensive not to be able to manage your business, not to be able to meet your time-to-market objectives and not identify and profit from niches and market opportunities," he said. "It's more expensive to have trades fail, to fail a regulatory audit or to compensate your clients because you're not meeting the terms of your investment agreements." And so on.

The State of the Industry

As to how financial institutions are meeting the data challenge, most observers agreed that few are. Only a handful of players - maybe five percent - are, they noted, citing Morgan Stanley, Goldman Sachs and Barclays Capital as leaders in the sector.

"Financial institutions differ pretty substantially," Hagan said. "The one common element, at least in the larger, global organizations and especially for those involved in the capital markets, is the recognition that this is a challenge, and it is getting more challenging every day. With the exponential growth of information and the risk associated with that information increasing every day, executives are trying to figure out, 'How can I manage the cost side of this equation? How do I manage the risk? And ultimately, how do I get some value from all this data that is piling up across our organization?'"

While financial institutions might be far from getting on top of EDM, they're not ignoring the concept either. "EDM is completely accepted by most of the Tier 1 financial institutions, and it is absolutely above and beyond the 'What is it and why should we care?' phase," Atkin said. "That, in my opinion, has occurred relatively quickly. Four years ago, we couldn't even spell 'reference data.' Now, there are data management programs underway at virtually every financial institution around the world, and data is understood as the asset that it is."

However, there is more variation regarding what actions these financial institutions have taken. Indeed, just as levels of corporate engagement with EDM differ from company to company, EDM governance and strategies also vary widely.

To help, BearingPoint identified a four-part framework for companies to address their EDM strategy:

1. What information is critical to understanding whether the company is executing its business strategy?

2. Who needs what information to make strategic decisions from an operational perspective?

3. What are the critical processes within the organization from the standpoint of performance and quality?

4. What information is critical to regulatory or other external requirements?

"Most financial institutions that we deal with - primarily Tier 1 and 2 - recognize EDM, understand it and know they have a problem that needs to be managed," Atkin said. "Most of them have created data groups, appointed people to be responsible for data activity and are trying to set up appropriate internal governance mechanisms."

That being said, it's still a foreign concept for most financial institutions, Atkin noted. "At the moment, it's almost been relegated to just another task to perform within the financial institution. That is not necessarily good," he said. "The tendency is to have a short-term view and try to fix things tactically. That is not the makings of a good data strategy."

Scott Dillman, managing director at PricewaterhouseCoopers


Moving Data Downstream
Downstream data integration becomes less of an upstream swim
By Stephen Mauzy

ENTERPRISE DATA MANAGEMENT (EDM) appears to have transcended the rhetoric stage, garnering not only lip service but increased attention and money. Most people recognize the value of clean, accurate data that is 'fit for purpose.' Unfortunately, getting 'fit for purpose' data to end-users can be a slow slog.

The first step in moving data from here to there is ensuring quality. Data integrity degrades over time. Consider a simple, but common, scenario: a regulated investment firm transacts with another firm it initially regards as low risk. Over time, however, the counterparty's investments gradually tilt toward high risk because of changes in geographic location, macroeconomic variables, poor management or obsolescence, thus changing the counterparty's risk profile from low to high.

The obvious remedy is to update the data - if only it were that simple. Data that is extracted and aggregated into downstream systems is usually controlled by the managers of the upstream system, and they typically lack sufficient incentive to improve the data's quality beyond their specific application domain.

The solution is to centralize and automate data cleansing to create a sustainable and continuous approach by integrating the process with data gathering. Such a system improves data consistency, removes inaccuracies and, ultimately, improves risk management and overall business performance.

Of course, not all the data can be automatically and centrally cleansed. Logical error types in data structure can be corrected through programmed processes, but there will always be errors that cannot be resolved by logical rules or the value of a particular field. "There is always going to be conflicting data on a price, a name, a date or a corporate action message," said Barry Raskin, president of Telekurs Financial USA. "If you want straight-through processing, you can't get it unless it's close to being fully automated."
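To make that distinction concrete, here is a minimal, hypothetical sketch of a programmed cleansing pass (the record fields and rules are invented for illustration): structural errors are repaired in code, while value conflicts such as two sources quoting two prices are routed to human review, which is exactly why fully automated straight-through processing is so hard to reach.

```python
# Illustrative rule-based cleansing applied as records are gathered.
# Structural problems are fixed automatically; conflicting values are
# flagged, since no logical rule can decide which source is right.
import re

def cleanse(record: dict) -> tuple[dict, list[str]]:
    rec, review = dict(record), []
    # Programmable fix: normalize whitespace and casing.
    rec["currency"] = rec.get("currency", "").strip().upper()
    # Structural rule: ISO currency codes are exactly three letters.
    if not re.fullmatch(r"[A-Z]{3}", rec["currency"]):
        review.append(f"bad currency code: {rec['currency']!r}")
    # Value conflict: two sources, two prices - immune to automation.
    prices = rec.pop("prices", [])
    if len(set(prices)) == 1:
        rec["price"] = prices[0]
    else:
        review.append(f"conflicting prices from sources: {prices}")
    return rec, review

record, issues = cleanse({"currency": " usd ", "prices": [101.25, 101.40]})
print(record)  # {'currency': 'USD'}
print(issues)  # ['conflicting prices from sources: [101.25, 101.4]']
```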

All Aboard

Once an initial data cleansing is completed, the real work begins. Financial institutions still confront a superfluity of stand-alone and legacy back-office systems. "The large investment banks could have 200 systems with several sources of reference data," said Tom Stock, senior vice president of product management at GoldenSource. "The big issue on distribution is what architecture you can put in place to get it to all the downstream systems."

That architecture doesn't come cheap. Boston-based Aite Group estimates downstream connectivity spending will exceed $2.8 billion in the United States and Europe this year. According to Adam Honore, senior analyst at Aite, Tier 1 institutions (bulge-bracket firms and global banks) will spend an average of $9 million each to improve their data connectivity, with some firms spending more than $20 million. Tier 2 firms (mid-sized asset managers and large hedge funds) will spend roughly $4 million to solve their connectivity problems, while Tier 3 firms (small asset managers and hedge funds) will expend between $250,000 and $500,000.

A variety of options exist for moving data downstream. Some firms rely on internal solutions, while others rely on vendor support. Others have adopted outsourced solutions that encompass both business process and technology. Most Tier 1 and Tier 2 firms tend to adopt some combination of each. Whatever the strategy, Aite has found that downstream data integration consumes about 10% of the initial EDM budget.

While hardware and vendors can be vetted through budget analysis, people can't. When prompted for information about the data, business users frequently expect IT to respond with answers. However, IT people are rarely the originators of the data. For that reason, firms have a difficult time gathering accurate requirements for new initiatives. "Getting a full and complete inventory is a must," Raskin said. "But often things like functionality, data elements and data departments get overlooked."

Tom Stock, senior v.p. of product management at GoldenSource

Allocating big money to get everyone aboard is a formidable challenge. EDM is an ambivalent priority for many firms, even though projects higher on the totem pole benefit from downstream connectivity. Honore cited trade processing, settlement and generic integration efforts as initiatives that derive value from effective connectivity strategies.

Risk mitigation is another hot-button issue that can be addressed by reducing discrepancies, according to Raskin. "EDM is sometimes viewed more as an obstacle to getting things done," he said. "But there are consequences for not implementing EDM correctly. In the most benign sense, you might be buying data multiple times. In a more malignant sense, you have one guy quoting a price in a trading room and settlement people having no idea where the price originated."

Managing stakeholder requirements is a critical process that's anchored in understanding workflow, data dependencies and organizational tolerance for operational disruption. Many firms use formal processes such as service level agreements to specify requirements and establish EDM objectives.

A Common Language

Standardized jargon is another invaluable commodity. A common language requires standardized data definitions. During the requirements gathering process, it can be difficult to get people to agree on the purpose of the data. For many institutions, the natural tendency is to allow customization or unique interpretations instead of forcing people to exploit an existing standard.

Several firms advocated creating a solid data dictionary and requiring people to talk in the terms of that dictionary. From that point forward, projects can define tags off the dictionary and map everything to it. "You have to get everyone on the same page," Honore said. "You don't want to send mixed messages by using 'interest rate' or 'coupon' when you are talking about the same thing."
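A minimal sketch of that discipline, with invented field names, might look like the following: project-level tags resolve through a synonym map to one canonical dictionary term, so 'interest rate' and 'coupon' can never label the same value in two different feeds.

```python
# Hypothetical data dictionary: every field name used by a project must
# resolve to exactly one canonical term defined in the dictionary.
DICTIONARY = {
    "coupon_rate": "Annual fixed interest paid by a bond, as a percentage",
    "maturity_date": "Date on which the bond's principal is repaid",
}

# Project-specific tags are defined off the dictionary, never freely.
SYNONYMS = {
    "coupon": "coupon_rate",
    "interest rate": "coupon_rate",
    "redemption date": "maturity_date",
}

def canonical(field: str) -> str:
    key = field.strip().lower()
    if key in DICTIONARY:
        return key
    if key in SYNONYMS:
        return SYNONYMS[key]
    raise KeyError(f"{field!r} is not in the data dictionary; add it first")

print(canonical("Interest Rate"))  # coupon_rate
print(canonical("coupon"))         # coupon_rate
```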

Metadata repositories are popular components of many of the larger EDM integration projects. The side benefits of a deep metadata repository are expansive, from enhancing reporting to risk management to something as simple as an intranet search engine. The people closest to the source are allowed to contribute the definitions and grow the information repository.

Once everyone is referencing the same page, attention can turn to managing consequences. Downstream consumers need to be prepared for changes because reporting and entitlements can be affected by changing data. The problem is one part technical and one part standards. In some situations, data changes need to be reflected at one point in time, and EDM implementations need to accommodate different 'go live' times for a data change.

"Downstream applications go across trade execution, books and records, clearing and settlement, valuation, portfolio management and risk systems," said Michael Atkin, managing director at the EDM Council. "You look at the process and the steps involved, and you realize that people have built up a way of operating under one set of approaches. Now you are trying to change that and you have to understand all those inter-relationships."

There is just one caveat: don't read too much into inter-relationships. It isn't uncommon for a firm to look back and realize that many data fields sought in the project should have been rejected. In this instance, performance implications of the request were the roadblock. Business users do not always need every attribute on every instrument in every instance.

Minimizing Silos

Every chief information officer and risk manager would like all connections running off the most accurate, most controlled data repository. Most firms have bypassed this push by turning their legacy security master producers into data consumers as well. The migration off legacy systems will take years of effort in most instances, but it should be addressed according to business priorities.

"Getting everyone connected can take up to three years," said Stock. "A lot of it depends on the size of the organization and the sophistication of the enterprise in past EDM efforts. The large investment banks have some sort of reference data consolidation projects, but it's still a long process. That's one of the biggest challenges in enterprise data management - speeding up that integration process."

One of the challenges many firms insufficiently address is the idea of a response mechanism from the downstream system. A firm can create the perfect data extract with the cleanest data possible for a particular interface. But how does it ensure the downstream application actually got the data, processed it and produced the results the application expected?
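One common answer - sketched here under invented names, not as any firm's actual protocol - is to send a manifest (record count plus checksum) with each extract and require the downstream system to acknowledge the same figures after loading; a mismatch triggers an investigation or a replay.

```python
# Hypothetical acknowledgment handshake for a downstream data feed.
import hashlib
import json

def manifest(records: list[dict]) -> dict:
    """Publisher side: summarize what was sent."""
    payload = json.dumps(records, sort_keys=True).encode()
    return {"count": len(records), "sha256": hashlib.sha256(payload).hexdigest()}

def acknowledge(loaded: list[dict], sent: dict) -> bool:
    """Downstream side: recompute after loading; False means replay."""
    return manifest(loaded) == sent

extract = [{"id": 1, "price": 101.25}, {"id": 2, "price": 99.10}]
sent = manifest(extract)

print(acknowledge(extract, sent))      # True - all records arrived intact
print(acknowledge(extract[:1], sent))  # False - a record was dropped
```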

Adam Honore, senior analyst at Aite Group

The answer for many firms is a centralized data model that emphasizes understanding market and reference data, legacy systems and how content is intended to be used throughout the organization. But the system only works with tight supervision. "Without tight controls on centralized 'data depots,' separate applications and fixes can creep in at a business unit level and an institution will begin down the path to silo-based data," said Honore.

More cross-product selling and more sophisticated clients increase the need for consistent data across traditional business silos. Customers often receive direct access to data via the Web and online service capabilities, so the data needs to be as accurate and timely as possible without the need for manual intervention - a primary reason why firms have learned the key to continued growth is increased straight-through processing rates.

Good, clean reference data necessitates minimizing data silos. Many firms implement source system controls to ensure data satisfies quality standards at its point of origin. When properly implemented, source quality controls can effectively prevent the proliferation of invalid data. But source system quality controls alone cannot enforce data quality. They cannot, for example, ensure that data quality is maintained throughout the data life cycle, especially when multiple data sources with varying levels of cleanliness are combined in downstream data integration processes.

To further ensure data remains high quality, many firms adopt a flexible data quality strategy that incorporates data quality components directly into the data integration architecture. Successful application of this strategy requires a data integration platform that can implement a broad range of generic and specific business rules and also adhere to a variety of data quality standards.

Proper Governance

Effective data management requires top-down, enterprise-wide guidelines that align the information architecture with the business goals of the financial institution. Effective governance is key, but it can present a formidable obstacle because of the difficulties associated with providing a clear business case on the benefits of data management.

"Data alone has no intrinsic value," said Gary Barr, global head of EDM at Reuters. "It is an enabler of other processes, and the true benefits of effective data management are systematic and intertwined with other processes, which makes it difficult to quantify all the downstream implications or upstream improvements."

Ultimate responsibility for data integration usually falls to the CTO, CIO or COO but, in most cases, they work in conjunction with business heads or dedicated data managers from various trading and operations areas across the enterprise. "The bottom line is that you need C-level buy-in," Raskin said. "You can't go anywhere without that because things can get political real fast and then they can get bogged down."

Difficulty calculating return on investment (ROI) also hampers the buy-in. Benefits aren't as readily transparent or as easily measured for EDM as they are in other projects. Honore suggested designing an environment that provides for metrics to be gathered, which, in turn, allow an institution to measure ROI in specific data operations/business functions. One can calculate the cost of maintaining accurate counterparty data against the revenues from the client and/or the potential losses. Delays in settlement and failed trades can be measured against those caused by data errors.

The goal in downstream data integration is to get cleansed, reliable data to the right departments. "Data utopia is core client data, core securities reference and core trade data all being fed to a centralized data model that then gets distributed," Honore said. "It's not realistic, but that's what you work toward."

CRUNCHING THE DATA

Aite Group interviewed a wide range of people who have successfully completed initial EDM projects and are currently engaged in subsequent downstream activities. Some of the key findings can be found in a white paper titled Navigating the Rapids of Downstream Data Connectivity. They include:

• Spending on downstream connectivity in the United States and Europe will exceed US$2.5 billion this year.

• EDM projects mature an average of three years before diving into downstream connectivity.

• On average, only 10% of an initial EDM budget is allocated for downstream connectivity.

• The average connector takes between five and 22 days of engineering effort.

• Internal IT produces 57% of all connectors.

• More than 60% of new connections go to back-office systems.

• Metadata dictionaries are one of the emerging trends in managing connectivity.

• Global trading and settlement, trade processing and compliance solutions ranked near the top of last year's IT initiatives.

Source: Aite Group


Tomorrow's EDM Solutions Today
A look at expanding technology and data management with BNY Mellon Asset Servicing
By Edward McGann, Managing Director


WITH TODAY'S GLOBAL credit environment symptomatic of an overall global economic slowdown - and the United States seemingly at the nexus of that environment - the need for information is ever more critical. This is particularly true for institutions involved in the investment industry, whether they are managing money, creating investment vehicles or providing services to the buyers and sellers of investments.

That is where service providers like BNY Mellon Asset Servicing come in. Such providers are constantly in the crosshairs of demanding clients and industry participants to provide real-time, accurate and, in some cases, enhanced data so they can understand their positions and performance in their investment portfolios.

To meet that demand, BNY Mellon Asset Servicing employs not only its powerful technology infrastructure but also its product management and business units to develop strategies and devise solutions to clients' requests. These strategies and products provide clients access to the data they need and the tools necessary to analyze and make sense of the wealth of information.

BNY Mellon Asset Servicing is not only in the business of safekeeping and accounting for its clients' assets, it is in the business of providing information on those assets. Complicating this somewhat 'simple' effort is the huge number of derivatives and other complex instruments, where the information may not be complete at the primary level. These instruments in the modern-day portfolio derive their value from underlying instruments that may behave differently, depending on extraneous conditions and events.

The challenge is to provide clients with relevant information in a manner that is useful and supports decision-making systems and management information platforms as varied as the investments themselves. This requires service providers to combine data elements that exist on a variety of internal platforms with those on other vendors' platforms, as well as sometimes combining that with proprietary information resident at the clients' sites.

The Mechanics of EDM

Within BNY Mellon Asset Servicing, how the information is collected, combined, parsed and disseminated to end-users begins within Global Product Management. By collecting inputs from clients, industry participants, internal sources such as BNY Mellon Asset Management and other leading indicators, Global Product Management works closely with the Technology Group to mine the enterprise's platform of systems and data warehouses to identify the information necessary to meet a particular demand or potential new product offering. By employing modern-day programming tools, the Data Management team can extract the information, often in a real-time manner; package the information; and create a report, data file or some other method of output that the end-user can utilize.

When the internal sources do not contain all the required fields of information, outside resources are reviewed. This is often the case when the product requires additional 'enhancements' to allow for a productive, analytic tool. Being able to stress test data and perform 'what if' analysis is often at the forefront of the new products and services provided by firms like BNY Mellon Asset Servicing. The more robust and deep the information, the more valuable and accurate the results are.

SPONSORED ARTICLE


BNY Mellon Asset Servicing, the largest custodian in the world and one of the largest cash processing organizations in the world, has access to a wealth of data and platforms from which to mine information. The trick is putting it in a place that is readily accessible and secure. In addition, the footprint in which we operate places us in every time zone across the globe - the very same places where our clients operate and conduct business.

Being able to regionalize data while simultaneously making it available 24/7 on a global basis is another challenge facing the enterprise data management field. Our clients' need for information may span that of a local office in a location like Singapore to a regional office in London or Paris, and ultimately to a global headquarters located in New York. Investment decisions are often made on that information, with the results captured and reported on.

Credit meltdowns need to be reacted to quickly, and the best decisions are those made with optimal information. Our platforms are positioned and configured in a manner that provides access to the necessary core information and then supplements that with value-added information from non-core sources, embeds analytics and decision tools and delivers it all to clients via FTP, host-to-host connections or Web-based applications like our Workbench or Inform platforms. In some cases, that same information must be translated to a second or third language so the user has it in a format that is useful, and readable, to them. Today, some of our reports are available in as many as 12 languages.

Warehousing the information is critical to ready access, especially when the base information resides in separate locations. Eagle Investment Systems, a BNY Mellon Asset Servicing subsidiary, is a leading global provider of financial technology solutions. Serving many of the world's most prominent financial institutions, Eagle's Web-based solutions integrate and streamline the investment process. Eagle PACE™ is an advanced data-centric platform that is fed information from various sources so firms can execute queries on the data and provide analysis.

Such tools are powerful, as the structure of the underlying data allows for a variety of ways to dissect it, including monitoring the investment performance of the portfolio and anticipating potential changes to the portfolio through our performance products. Eagle Investment Systems' products are an important cornerstone of how BNY Mellon Asset Servicing solves its clients' diverse enterprise data management needs.

Looking to the Future

The challenges ahead will only become more complicated and demanding as the nature of the investment environment evolves and new instruments are added. Consumers of the information - clients and internal parties - will become more demanding as the need to track performance and understand risk becomes even more critical. Firms like BNY Mellon Asset Servicing will be required to harvest the data and information that exist across their diverse, global footprints and provide it in real time to the end-user.

As the world becomes smaller in terms of global communication and interactive markets, success will be measured in fractions of seconds. The firms that meet those demands to provide accurate and complete information, along with providing tools to understand the information, will be the ones who survive and excel.

Data and relevant information are critical to an enterprise's success. BNY Mellon Asset Servicing and our partners in the Technology and Data Management groups are meeting those needs and positioning ourselves for continued success in the future. After all, information is at the core of what we do.

About the Author:
Edward McGann is a managing director and head of product management for financial institutions and international markets within the Global Product Management unit at BNY Mellon Asset Servicing. In that capacity, he and his team are responsible for ensuring the firm's current suite of products and services meets the needs of those important client bases and for developing future product offerings for the asset servicing marketplace. His responsibilities also include strategic and financial planning, overseeing capital plans related to product and technology development and strategic initiatives that ensure clients' satisfaction with The Bank of New York Mellon.

About the Company:
Operating in 34 countries and serving more than 100 markets, The Bank of New York Mellon is a global financial services company focused on helping clients manage and service their financial assets. The company is a leading provider of financial services for institutions, corporations and high-net-worth individuals, providing superior asset management and wealth management, asset servicing, issuer services, clearing services and treasury services through a worldwide client-focused team. It has more than $23 trillion in assets under custody and administration, more than $1.1 trillion in assets under management and services $11 trillion in outstanding debt. Additional information is available at www.bnymellon.com.


A Critical Year for XBRL
The SEC's interactive data program takes key steps toward adoption, despite some corporate opposition
By David Lewis

THIS IS A PIVOTAL year in the SEC's effort to make filings interactive through eXtensible Business Reporting Language (XBRL), the language of interactive financial reporting. Securities and Exchange Commission Chairman Christopher Cox said so earlier this year, and he was right.

Cox was referring to the SEC's program, as well as the interactive financial data picture worldwide. Indeed, Israel, China, Singapore and Japan also are moving to XBRL-based financial reporting. "The global movement to interactive data for financial reporting is truly underway," he said. "Without question, 2008 will be a watershed year for interactive data."

Indeed, this year already has been, although not always in the way Cox meant. In February, the Advisory Committee on Improvements to Financial Reporting recommended slowing the adoption of tagged financial disclosure. The panel's final recommendations are expected this summer, although its advice has been roundly ignored by SEC officials so far.

Big Benefits

When implemented, XBRL would enable users large and small to drill deeply and instantaneously through the body of public filings in that format, to access SEC documents as they are filed in real time and to be able to feed the numbers into a spreadsheet or other modeling applications of their choice. XBRL is designed to be a standards-based way to express financial statements, including sometimes critical but hard-to-pinpoint data such as footnotes to SEC reports. "We look at XBRL as being the ideal data structure for financial reporting," said David Blaszkowsky, director of the SEC's Office of Interactive Disclosure.

That ideal is attainable because each of the thousands of data and text components that compose financial reporting can be described by XBRL 'tags.' The tags in turn refer to files in a taxonomy that defines them, making them machine readable. A myriad of platforms and applications can then slice, dice, analyze and present the data. Because any one tag can call upon any other tag, the numbers and texts can be compared by data categories such as date, company, industry and topic.
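A simplified illustration of the mechanism is sketched below. The XML fragment and element names are invented stand-ins for real XBRL (which carries full namespaces and taxonomy references), but the principle is the same: each number is a tagged, machine-readable fact tied to a context, so software can pull it straight into a model.

```python
# Toy XBRL-like filing: tagged facts with a context. Element names are
# illustrative, not the official US GAAP taxonomy.
import xml.etree.ElementTree as ET

FILING = """
<xbrl>
  <context id="FY2007"><entity>Example Corp</entity><period>2007-12-31</period></context>
  <Assets contextRef="FY2007" unitRef="USD">142000000</Assets>
  <Liabilities contextRef="FY2007" unitRef="USD">98000000</Liabilities>
</xbrl>
"""

root = ET.fromstring(FILING)
# Collect every element that carries a contextRef, i.e. every tagged fact.
facts = {el.tag: int(el.text) for el in root if el.get("contextRef")}
print(facts)  # {'Assets': 142000000, 'Liabilities': 98000000}

# Because the facts are machine readable, a model can use them directly.
print("Equity:", facts["Assets"] - facts["Liabilities"])  # Equity: 44000000
```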

In fact, the SEC already does that in a limited way, so far posting 307 filings from 74 companies through its XBRL Voluntary Filer Program. And in February, the agency unveiled Financial Explorer, an online XBRL tool that demonstrates the system's potential for interactive research and graphics.

According to proponents of the program, another reason the financial world needs XBRL is that the SEC's current database, Edgar, is outmoded. "Edgar is essentially a document collection system," noted Christopher Whalen, co-founder and managing director of Institutional Risk Analytics in Croton-on-Hudson, N.Y. "It doesn't read the documents, and it doesn't validate them. It just collects them, assigns them a unique ID number and off they go."

"With XBRL, what the SEC needs to do is migrate this fairly ancient system over to something that is far more data-centric, as opposed to document-centric," Whalen continued. "That probably includes some level of validation. In other words, when the document hits, they are going to have to look at it and ask, 'Okay, did you follow the rules for the tagging?'"

Building Momentum

The advisory panel's comments aside, XBRL appears to be steaming ahead. Recent milestones on the road to adoption include:

• September 2007 - Cox and the XBRL U.S. project team announce the creation of data tags for all U.S. generally accepted accounting principles (GAAP).

• October 2007 - the commission creates the Office of Interactive Disclosure.

• December 2007 - the XBRL U.S. team releases the first taxonomies for U.S. GAAP, the all-important dictionary of tags.

• February 2008 - the second draft of the GAAP taxonomies is released.

• April 2008 - the comment period for XBRL draws to a close.

With the comment period over, a preliminary ruling is expected soon, followed by a final ruling in autumn. If all goes as scheduled, this ruling will require the top 500 companies by market capitalization to file their 2008 annual reports through XBRL.


Some of the public companies and other filers that anticipate someday having to code their financial statements in XBRL are not so pleased by the prospect. Reaction to the project can be roughly divided this way: favorable among analysts, numbers-crunchers and smaller investors; less favorable among medium-sized and small public companies, mutual funds and others who see an undefined expense looming for little or no gain. In the financial sector, the conversion is eagerly anticipated by medium- and small-sized shops and less eagerly by larger institutions, which already may have built an XBRL-like database on their own dime.

Still, an SEC-sanctioned XBRL data vault over time would be sheer heaven for most analysts. "The long-term potential is that it is definitely going to be a benefit for the analyst community," said Glenn Doggett, CFA and policy analyst for financial reporting at the Charlottesville, Va.-based CFA Institute Centre for Financial Market Integrity. "This is especially true for small and mid-tier investment analysts who, when you go to visit them, still have a stack of 10-Qs and 10-Ks on their desks. They're really going to be the first beneficiaries of that XBRL framework."

For analysts that work at the big investment banks, however, the response may be a bit more muted. "Whether it is on the buy side or the sell side, they already are subscribers to services that provide much of the same information to them," Doggett noted. "For these players, the change to XBRL is not going to be as much a question of how they operate as it is providing them information faster and with less potential for error."

Why reduced error? "Right now, third-party databases are transcribing numbers, whereas with XBRL analysts will be getting company-identified values with company-identified tags," Doggett explained. "As a result, you have a very one-to-one communication between what the company says and what every investment analyst gets."

An Issue of Politics

No one doubts XBRL can work because it already does; the real issue is politics. "The challenge for the SEC is not the technical challenge related to XBRL," Whalen noted. "It is more, how do you align this technology with the business rules and legal responsibilities of the SEC and do it in such a way that you don't piss off all of the filers?"

Whalen is among those who believe a 2008 rulemaking deadline might be pushing it. "In theory, they want to have a rule ready for the commission to consider in September that would set a timetable for adoption," he said. "The reality is that there is a lot of work to be done between now and then, and I'm not sure that they are going to have enough time to get everything aligned correctly in order to hit that deadline."

Still, the 2008 deadline is important to some of the program's key players, namely SEC Chairman Cox. "Let's face it, Chris Cox is done at the end of this year," Whalen said. "One way or the other, you are going to see a new SEC chairman, and I don't know whether or not the future leadership of that agency is going to support something like XBRL as strongly as he has."

Meanwhile, the SEC's Blaszkowsky argued that, at least for larger corporations, the transition to XBRL will not be so difficult. "This is not just a document or a bunch of linear or analog data that happens to be converted to a digital form, this is inherently digital data that can be found and applied across other databases," he contended. "As such, it is inherently digital, it is inherently tagged and it is inherently available for the kind of constructive engagement that enterprise data management systems are developed for."

The main point, according to Blaszkowsky, is that universal adoption of XBRL is inevitable. "This has its own compelling internal logic," he said. "The investment world and the corporate world have worries and concerns and some of them are very legitimate ones, but they will find that those concerns are unwarranted or exaggerated and that the benefits are real. They will want to see XBRL implemented."


Some of the interactive research and graphics available through Financial Explorer.


One Step Ahead in OTC Derivatives
In an industry facing calls for improvements, The Depository Trust & Clearing Corp already is making them
By Gregory Morris

MARCH WAS THE MONTH for getting in step. On the 13th of that month, Treasury Secretary Henry Paulson, Jr. set the pace in his remarks on recommendations from the President's Working Group (PWG) on Financial Markets. He cited a number of the working group's key findings and specifically called for a joint industry response in several financial sectors, including over-the-counter (OTC) derivatives.

Just two weeks later, almost two dozen financial institutions and trade associations sent an open letter to the president of the New York Federal Reserve, Timothy Geithner. That letter cited progress to date and underscored the financial community's support for further improvements. The NY Fed replied the same day with encouragement and suggested several near-term goals.

With all the mutual support and affirmation carrying on in the foreground, it was easy to miss a groundbreaking initiative occurring in the background. Demonstrating that some segments of the market are already on the case, The Depository Trust & Clearing Corp (DTCC), through its Trade Information Warehouse, completed the first automated processing of a credit event for a Canadian printing firm, complying with protocols issued by the International Swaps and Derivatives Association (ISDA).

With this first automated credit event now concluded, DTCC is focused on enhancing this functionality of the Warehouse in preparation for future events. Some of these priorities include allowing for automatic adherence by counterparties and adding index tranches to the products the Warehouse supports for credit events.

A Track Record of Progress

DTCC first launched Deriv/SERV, its automated matching and confirmation platform for OTC derivatives, in late 2003 to help the derivatives community address the operational challenges they faced in a market growing at breakneck speed. The service has been instrumental in allowing market participants to meet their commitment to global regulators to strengthen their infrastructure by increasing the automated processing of OTC derivatives transactions.

Today, more than 95% of credit derivatives transactions are electronically matched and confirmed on the Deriv/SERV platform. Transaction volume for all OTC derivatives products jumped 123% to 5.9 million transactions last year, up from 2.6 million the previous year. More than 1,100 global dealers, asset managers, hedge funds and other end-users in 31 countries automate their OTC derivatives transactions through Deriv/SERV, with more being added each week.

To further support derivatives trading, DTCC launched its Trade Information Warehouse, the first automated global repository for OTC derivatives, in November 2006. Part of DTCC Deriv/SERV's family of automated post-trade processing services for the OTC derivatives community, the Warehouse provides an automated environment where contracts can be tracked and serviced over their lifecycle. Last year, about three million contracts were recorded into the Warehouse, with an average of an additional 10,000 new contracts now being added daily.

SPONSORED ARTICLE


Expanding on the Warehouse's functionality, late last year, DTCC launched a central settlement service for OTC credit derivatives transactions, in conjunction with CLS Bank International of New York. It is the OTC derivatives industry's only automated solution for calculating, netting and settling payments between counterparties to bilateral contracts.

The new service has been designed to enable payments associated with transactions confirmed through Deriv/SERV and residing in the Warehouse's global contract repository to be netted by value date, currency and counterparty. Payments eligible for settlement include initial payments and one-time fees, coupon payments and payments associated with post-trade events. Central settlement greatly reduces operating risks for users by replacing manually processed bilateral payments with automated, netted payments.

The Warehouse generates bilaterally netted payment instructions and sends them to CLS for settlement. CLS automatically notifies its Settlement Members, who effect settlement through CLS. Reports are generated and delivered to counterparties early in the morning on settlement day.

In the second quarterly central settlement cycle for the new service on March 20, 2008, the amount of trading obligations requiring financial settlement was reduced by 93%, from a gross of $18 billion in aggregate U.S. dollar terms to $1.2 billion net. Gross settlements by the 15 participating OTC derivatives dealers were consolidated from 400,000 to 200 net settlements. Payments were made in five currencies: the U.S. dollar, euro, British pound, Japanese yen and Swiss franc. Over time, the number of currencies in which payments can be made will be expanded from the initial five.
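The arithmetic behind those reductions can be sketched in a few lines (the figures and dealer names below are invented, not DTCC data): gross bilateral payments are summed per counterparty pair, currency and value date, and each bucket collapses to a single net instruction.

```python
# Toy bilateral netting: many gross payments become one net instruction
# per (counterparty pair, currency, value date). All figures invented.
from collections import defaultdict

gross_payments = [
    # (payer, receiver, currency, value_date, amount)
    ("Dealer A", "Dealer B", "USD", "2008-03-20", 5_000_000),
    ("Dealer B", "Dealer A", "USD", "2008-03-20", 4_200_000),
    ("Dealer A", "Dealer B", "USD", "2008-03-20", 1_300_000),
]

net = defaultdict(int)
for payer, receiver, ccy, date, amount in gross_payments:
    pair = tuple(sorted((payer, receiver)))
    sign = 1 if payer == pair[0] else -1  # direction relative to the pair
    net[(pair, ccy, date)] += sign * amount

for (pair, ccy, date), amount in net.items():
    if amount == 0:
        continue  # obligations cancel out entirely; nothing to settle
    payer, receiver = pair if amount > 0 else (pair[1], pair[0])
    print(f"{date} {ccy}: {payer} pays {receiver} {abs(amount):,}")
# 2008-03-20 USD: Dealer A pays Dealer B 2,100,000
```

Three gross payments collapse into one net settlement, the same consolidation that took the dealers from 400,000 gross settlements to 200.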

Ahead of Regulators' Recommendations
"Our initial offering for automated processing of OTC derivatives products began to take form in 2003 and went live in 2004, prior to the earliest regulatory calls to address the deficiencies in the system," according to Frank De Maria, managing director and chief operating officer of DTCC's Deriv/SERV. "Since then, we have increased our service offering and are working with both buy-side and sell-side counterparties."

Indeed, DTCC's products and services seem to complement Paulson's March 13th remarks. "We need a dedicated industry cooperative," he said at the time. "Market volume and instrument complexity call for a clear, functional, well-designed infrastructure that can meet the needs of the OTC derivatives markets in the years ahead."

Paulson further commented that such an industry cooperative "must capture all significant processing events over the entire lifecycle of trades. It must have the capability to accommodate all major asset classes and product types. It must be operationally reliable and scalable and use automation to promote standardization that will create efficiency and moderate excessive complexity."

Paulson noted that the PWG specifies that the infrastructure must have a flexible and open architecture for interoperability, upgrades and improvements. "The facility also should enhance counterparty risk management through netting and collateral agreements by promoting portfolio reconciliation and accurate valuation of trades," he added, urging the industry to "incorporate, without delay, cash settlement protocol into standard documentation."

Integrating the Infrastructure
In keeping with the overall thrust of the industry initiative to make the OTC derivatives market more transparent and efficient, De Maria said Deriv/SERV's expanded capabilities respond to Paulson and the PWG's recommendation to develop a longer-term plan for an integrated operational infrastructure in the OTC derivatives market.

The PWG calls for maximizing "the efficiencies obtainable from automation and electronic platforms by promoting standardization and interoperability of infrastructure components." It also urges participants to enhance their ability to "manage counterparty risk through netting and collateral agreements by promoting portfolio reconciliation and accurate valuation of trades."

De Maria noted that these initiatives are already part of the Warehouse's daily process. "Maintaining the most up-to-date information on trade details in one central portal addresses the challenges participants face in keeping their collective deal books in synch," he said. Because the Warehouse enables each participant to see the positions they hold with their counterparties, there is more transparency and portfolio reconciliation is much more seamless.

"The first step is efficient and timely reconciliation among counterparties," said De Maria. "That enables them to terminate, assign and amend positions as the front office sees fit and do so in a controlled manner."
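Reconciliation of this kind amounts to diffing two counterparties' views of the same bilateral deal book: finding trades missing on either side and trades whose economic terms disagree. The Python sketch below is a deliberately simplified illustration; the trade IDs and the single notional field are hypothetical.

def reconcile(book_a, book_b):
    # Each book maps trade_id -> economic terms (here, just a notional).
    only_a = sorted(set(book_a) - set(book_b))
    only_b = sorted(set(book_b) - set(book_a))
    mismatched = sorted(
        t for t in set(book_a) & set(book_b) if book_a[t] != book_b[t]
    )
    return only_a, only_b, mismatched

dealer = {"T1": 5_000_000, "T2": 10_000_000, "T3": 2_500_000}
fund = {"T1": 5_000_000, "T2": 12_000_000}

print(reconcile(dealer, fund))
# (['T3'], [], ['T2'])  -> T3 unknown to the fund; T2 terms disagree.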

As for an integrated infrastructure that encompasses the buy side as well as the dealer community, DTCC's Deriv/SERV already offers one. "Well over 1,000 of our customers represent buy-side firms," De Maria said. "We are an industry-owned organization whose policy and priorities are set in conjunction with our Operations Management Group, which includes representation from both dealers and buy-side firms. This helps us build consensus on our key initiatives that reflect the interests of a wide range of industry members."

To ensure broad participation in DTCC's derivatives services, it was critical that Deriv/SERV's matching and confirmation service and the Warehouse have a full spectrum of interface capabilities. "It can be used by the most technologically sophisticated firms, as well as by those who do not have as robust an infrastructure," De Maria said. It is also important to note that buy-side firms are charged no fees to use the service.

And what about risk mitigation? Risk awareness currently is a very hot topic, De Maria noted. "The OTC derivatives market has all three major forms of risk: market, credit and operational. In a philosophical sense, the market risks and the credit risks are what bring the business into being. But wasting money on operational risks benefits no one. If you increase automation, you both increase efficiency and reduce risk," he said.

A New Focus on Novation
The major new emphasis this year is on novation. Novation Consent, a new service launched earlier this year, is intended to automate the e-mail process that takes place between the counterparties to assignment transactions. Provided through the Trade Information Warehouse, Novation Consent automates the request, approval and notification procedures among the three trading parties involved in an OTC credit derivative contract assignment, as stipulated by ISDA in its Novation Protocol.

Under the ISDA Novation Protocol, when a party to an OTC derivative transaction wishes to exit that contract by assigning its position to a third party, that party - the transferor - must notify the remaining party and the entering party - the transferee - and seek permission for the assignment from the remaining party.
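The three-party flow just described maps naturally onto a small state machine: the transferor raises the request, only the remaining party can consent, and all three parties are notified of the outcome. The Python sketch below illustrates that shape under stated assumptions; it is not the actual Novation Consent service, and the class, states and messages are invented for the example.

class NovationRequest:
    def __init__(self, trade_id, transferor, transferee, remaining_party):
        self.trade_id = trade_id
        self.transferor = transferor
        self.transferee = transferee
        self.remaining_party = remaining_party
        self.status = "REQUESTED"  # transferor has notified both other parties

    def consent(self, party):
        # Under the protocol, only the remaining party's consent counts.
        if party != self.remaining_party:
            raise ValueError(f"{party} cannot consent to {self.trade_id}")
        self.status = "CONSENTED"

    def notify_all(self):
        # All three trading parties receive the final status.
        for p in (self.transferor, self.transferee, self.remaining_party):
            print(f"{p}: novation of {self.trade_id} is {self.status}")

req = NovationRequest("T2", transferor="FundB", transferee="FundC",
                      remaining_party="DealerA")
req.consent("DealerA")
req.notify_all()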

"Deriv/SERV worked with the OTC derivatives industry to build an automated tool for the processing of novations fully compliant with ISDA's Novation Protocol," De Maria said. "We have designed Novation Consent to deliver the features trading parties have been seeking in terms of speed, efficiency and interoperability across platforms."

Novation Consent streamlines assignment processing by allowing firms to consolidate consent messages. Furthermore, it leverages the Warehouse's power as a global repository of confirmed OTC credit derivative transactions by retrieving trade data from the Warehouse and enabling users to submit assignments to Deriv/SERV's Matching and Confirmation service.

With the new novation service, as with all its products, De Maria noted that DTCC is careful not to let the drive toward standardization and efficiency impinge on the flexibility that makes the OTC market so vibrant. "Dealers and end-users are working through ISDA standard master agreements," he said. "We will continue to see standardization in legal documents, allowing all participants to speak the same language, but there is still a great deal of flexibility. Clearly, our industry has been very proactive and has had great foresight."

About the Company:
Depository Trust & Clearing Corp (DTCC), through its subsidiaries, provides clearing, settlement and information services for equities, corporate and municipal bonds, government and mortgage-backed securities, money market instruments and OTC derivatives. The firm also is a leading processor of mutual fund and insurance transactions, linking funds and carriers with their distribution networks.

DTCC's depository provides custody and asset services for about 3.5 million securities issues from the U.S. and 110 other countries and territories, worth more than $40 trillion. In 2007, DTCC cleared and settled more than $1.86 quadrillion in securities transactions.

DTCC's OTC derivatives services are provided by its wholly owned subsidiary, DTCC Deriv/SERV. As managing director and chief operating officer of that subsidiary, Frank De Maria is responsible for the day-to-day operations of DTCC's automated services for the OTC derivatives market. He oversees the company's matching and confirmation system for credit derivatives and leads a cross-organizational team in charge of supporting and developing the Trade Information Warehouse.

For more information, please visit www.dtcc.com.


Growth in OTC Credit Derivatives Volume
Notional amounts outstanding, in trillions of U.S. dollars (Source: ISDA Market Survey)
2003: 3.8 | 2004: 8.4 | 2005: 17.1 | 2006: 34.4


www.dtcc.com

DTCC Deriv/SERV’s family of services for OTC derivatives is:

• Reducing risk

• Cutting costs

• Enhancing efficiency

• Building the largest community of users worldwide

DTCC Deriv/SERV's electronic trade matching and confirmation service and Trade Information Warehouse make paper-based, error-prone manual processing obsolete. Join global dealers and the buy-side community in automating and streamlining your OTC derivatives post-trade deal flow.

Deriv/SERV services OTC credit, equity and interest rate derivatives on a global basis, at no charge to buy-side firms and at cost to dealers.

To learn more, call London +44 (0)20 7444 0411 or New York +1 212 855 5424, or visit www.dtcc.com.

Say Buenos Dias to Automation, Sayonara to Risk

The Logical Solutions Provider

• Clearance and Settlement - Equities and Fixed Income

• Asset Servicing

• Mutual Funds

• Managed Accounts

• Alternative Investments

• Insurance

• Global Corporate Actions

• OTC Derivatives


Having comprehensive securities reference data is great.

Having a way to make sense of it is even better.

Add more value to your reference data with Standard & Poor’s Cross Reference Services™.

Now there is a global solution that can help tie all your securities reference data together - Standard & Poor's Cross Reference Services. Customizable and available through multiple delivery channels, the service links identifiers, entities, issuers and obligors across global and domestic markets. It's the insight you need to help manage your enterprise-wide exposure, enhance compliance, highlight potential conflicts of interest and identify investment opportunities.

Analytic services and products provided by Standard & Poor's are the result of separate activities designed to preserve the independence and objectivity of each analytic process. Standard & Poor's has established policies and procedures to maintain the confidentiality of non-public information received during each analytic process.

© 2008 Standard & Poor’s, a division of The McGraw-Hill Companies, Inc. All rights reserved. STANDARD & POOR’S is a registered trademark of The McGraw-Hill Companies, Inc.

Learn more. Visit www.sp.crossrefservices.com, e-mail [email protected] or call 212.438.4500 (North America) or +44.(0).20.7176.7445 (Europe). www.standardandpoors.com
