Big Data Drives Transformation in Capital Markets
Nadeem Asghar, Field CTO Financial Services, Hortonworks | Ramana Bhandaru, Vice President, Capital Market Practice Leader, Capgemini
© Hortonworks Inc. 2011 – 2016. All Rights Reserved
Transformation: Maturity Stages

Maturity stages: Awareness → Exploration → Optimization

Peer competitive scale:
- Standard among peer group
- Common among peer group
- Strategic among peer group
- New Innovations
No | Use Case Name
1. Single View of Institution
2. Predict Risk Exposures
3. Predict Counterparty Default
4. Automation of Client Due Diligence for consumer onboarding
5. Enhanced Transaction Monitoring
6. Enhance SAR Accuracy
7. Credit Risk Calculation
8a/8b. Regulatory Risk Calculations – Basel III & CCAR
9a/9b. Calculating VaR across multiple trading desks
10. Calculate credit risks across a variety of loan portfolios
11. Internal Surveillance of Trade Data
12. CAT (Consolidated Audit Trail)/OATS Reporting
13. EDW Offload
Use Cases at different levels of organizational maturity
[Chart: the numbered use cases above plotted by maturity stage for Corporate & IT Functions and Trading Desks, grouped into Surveillance and Security & Risk categories]
Capital Markets
Risk Reporting, AML/FATCA Compliance & Market/Trade Surveillance
Risk Data Aggregation (Credit, Market, Basel, FRTB, etc.);
Surveillance Reporting; market integrity & investor protection
Trade Lifecycle
Trade strategy development, backtesting across asset classes;
looking for correlations etc.
Sentiment Analytics
Leverage Social Media and other data feeds to drive
trading strategies and portfolio rebalancing decisions
Single View of Client & Client Benchmarking
Single View of Customer Activity & Risk across multiple
trading desks
Data Products
Analytic tools (statistical modeling, functional grouping, time series analysis) offered to clients around trade
data; reduce market data storage costs
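As a minimal sketch of the kind of time-series analysis mentioned here, the toy function below computes rolling mean and standard deviation over a window of trade prices. The window size, price series, and function name are all illustrative assumptions; production deployments would run such analytics at scale (e.g., in Spark), not in pure Python.

```python
from statistics import mean, stdev

def rolling_stats(prices, window):
    """Rolling mean and std-dev over a fixed window of trade prices.

    A toy stand-in for the time-series analytics the slide mentions;
    real systems would use Spark or pandas at market-data scale."""
    out = []
    for i in range(window - 1, len(prices)):
        w = prices[i - window + 1 : i + 1]
        out.append((mean(w), stdev(w)))
    return out

# Synthetic intraday tick prices (illustrative data, not from the deck)
ticks = [100.0, 100.5, 101.2, 100.8, 101.5, 102.0, 101.7]
stats = rolling_stats(ticks, window=5)
print(stats[0])  # mean/std of the first 5 ticks
```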
Using Social and Other Data Feeds to drive Trading Decisions

Data sources: Core Banking, Positions, Reference Data, Market Data, Docs, emails, Server logs, Online News Feeds, Broker Notes, Corporate Data, Social Media
- Streaming: Network Probes, Click Stream, Sensor, Location
- Batch: Call Detail Records
- On-Line: Customer Sentiment
- Unstructured: Text, Pictures, Video, Voice2Text

Decisions: Buy/Sell decisions; right-size client portfolio; who do clients trust; etc.
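One hedged illustration of how multiple feeds might combine into a decision signal: blend a social-sentiment score, a broker-note view, and a news score into one number, then threshold it. The weights, thresholds, and function names below are assumptions for illustration, not anything the deck specifies.

```python
def blend_signal(sentiment_score, broker_view, news_score,
                 weights=(0.5, 0.3, 0.2)):
    """Blend social sentiment, broker notes, and news feeds into a
    single score in [-1, 1]; positive leans buy, negative leans sell.
    Weights are illustrative assumptions."""
    w_s, w_b, w_n = weights
    return w_s * sentiment_score + w_b * broker_view + w_n * news_score

def decision(score, buy_thresh=0.3, sell_thresh=-0.3):
    """Map the blended score to a coarse action (thresholds assumed)."""
    if score >= buy_thresh:
        return "BUY"
    if score <= sell_thresh:
        return "SELL"
    return "HOLD"

score = blend_signal(0.8, 0.4, -0.2)  # bullish social chatter, mixed news
print(decision(score))
```

In a real pipeline each input would itself be the output of a model (e.g., NLP sentiment scoring over the social feed), with the blend recalibrated per asset class.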
Financial Risk Data Aggregation & Reporting
The Common Risk Types on HDP
- Credit Risk
- Market Risk
- Operational Risk
- Liquidity Risk
- Volcker Rule
- CCAR
- Basel III
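Aggregating desk-level P&L histories and reading off a tail percentile is the core of a historical-simulation VaR, one of the risk calculations listed above. The sketch below is deliberately simplified (synthetic data, a naive percentile index, no scaling); regulatory calculations involve far more, and on HDP the aggregation step would typically run in Hive or Spark.

```python
def portfolio_var(desk_pnl, confidence=0.95):
    """One-day VaR by historical simulation, aggregated across desks.

    desk_pnl: dict of desk name -> list of historical daily P&L values,
    all lists aligned on the same dates. Sums P&L across desks per day,
    then takes the loss at the (1 - confidence) percentile.
    A minimal sketch with a naive percentile index."""
    days = len(next(iter(desk_pnl.values())))
    total = [sum(pnl[d] for pnl in desk_pnl.values()) for d in range(days)]
    total.sort()
    idx = int((1 - confidence) * days)
    return -total[idx]  # loss is the negative of the low-percentile P&L

# Tiny synthetic example: two desks, four days of P&L
desks = {"rates": [1.0, -2.0, 3.0, -4.0], "fx": [0.0, 1.0, -1.0, 2.0]}
print(portfolio_var(desks))
```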
Introducing the FRTB

FRTB was introduced to rectify the shortcomings of Basel 2.5
– Reporting on FRTB by end of 2019
– VaR is replaced by a 97.5% Expected Shortfall (ES)
– The Incremental Risk Charge (IRC) is replaced by the Default Risk Charge (DRC)
– Liquidity horizons vary by risk factor (10-day base horizon)
– More models, more sophistication
– Models need higher accuracy and higher data quality
– FRTB introduces data management & governance challenges
– Hortonworks shines at data ingestion, data lineage & provenance
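The move from VaR to ES can be sketched in a few lines: ES is the average loss beyond the VaR quantile, rather than the quantile itself. The function below uses plain historical simulation on synthetic P&L and ignores FRTB's liquidity-horizon scaling and stressed calibration, so it is an illustration of the concept, not the regulatory calculation.

```python
def expected_shortfall(pnl, confidence=0.975):
    """Expected Shortfall: average loss in the tail beyond the
    (1 - confidence) quantile. Simplified historical simulation;
    FRTB's actual ES adds liquidity horizons and stressed periods."""
    losses = sorted((-p for p in pnl), reverse=True)  # largest losses first
    n_tail = max(1, int(round(len(losses) * (1 - confidence))))
    return sum(losses[:n_tail]) / n_tail

# Synthetic daily P&L (illustrative); with 8 days the 97.5% tail is 1 day
pnl = [-12.0, -3.0, 1.0, 4.0, -7.0, 2.0, 5.0, -1.0]
print(expected_shortfall(pnl))
```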
RDARR Reference Architecture
Case Study: A Business Data Lake for a financial services company, enabling low-latency risk detection and action
▪ Increased regulatory demand and market forces demand retention of large scale granular data and quick access to it
▪ The Financial Services company desires to move from the current ‘brittle’ DW architecture to a Data Reservoir, with a ‘minimalistic’ model
▪ Load data relating to trades, positions, valuations, etc. – and classify hot, warm and cold data according to latency of access desired (e.g., hot data represents most recent 5 days data and will reside in memory)
▪ 11 Scenarios successfully proved via the Capgemini CUBE environment, and scale to be showcased via Pivotal’s 1000-node AWB
▪ Warm and cold data will be accessed from the Reservoir via a SQL-like interface
Pivotal Data Reservoir architecture: Stream Ingestion, Spring Batch, In-Memory Processing (GemFire, SQLFire), Transformation, Pivotal HD, HDFS

Scenarios proved:
▪ S1: Initial data load
▪ S2: Continuous data load and aging (partial)
▪ S3: On-demand cache load of reservoir data
▪ S5: SQL access to warm data
▪ S6: SQL access to data in the reservoir
▪ S7: Direct access to reservoir data
▪ S8: SQL access to hot data; SQL access under peak data load conditions
▪ S9: Cache server node failure
▪ S10: Backup store node failure
▪ S11: Data reservoir node failure
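The hot/warm/cold classification by access latency can be sketched as a simple routing rule on record age. The 5-day hot window comes from the case study; the 90-day warm boundary and the function name are assumptions added for illustration.

```python
from datetime import date

def classify_tier(record_date, today, hot_days=5, warm_days=90):
    """Route a record to hot, warm, or cold storage by age.

    Hot = most recent 5 days, held in memory (per the case study);
    warm/cold split at 90 days is an assumed value for illustration."""
    age = (today - record_date).days
    if age <= hot_days:
        return "hot"    # in-memory cache (e.g., GemFire/SQLFire)
    if age <= warm_days:
        return "warm"   # reservoir, reachable via the SQL-like interface
    return "cold"       # HDFS archive

today = date(2016, 3, 15)
print(classify_tier(date(2016, 3, 12), today))  # 3 days old -> hot
```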
Case Study: Trade Execution Analytics for Morgan Stanley
Morgan Stanley
▪ Business Challenge:
▪ Joint-venture for Wealth Management requiring efficient execution of trades for customers
▪ Regulatory compliance requiring proof of multiple factors
▪ Solution:
▪ Production application on: Cloudera Hadoop + QlikView
▪ Application for: CEO, heads of trading desks.
▪ Post trade execution analytics
▪ Enables Trading Desk heads to track trade execution efficiency; Track lost trades and why
▪ Brings together Client Data, Market Prices, Inventory, Best Execution
▪ Ascertainable business benefit
http://www.forbes.com/sites/tomgroenfeldt/2012/05/30/morgan-stanley-takes-on-big-data-with-hadoop/
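One common post-trade execution metric is slippage against a best-execution benchmark, expressed in basis points. The sketch below is a generic illustration of that idea under assumed inputs; the deck does not describe Morgan Stanley's actual metrics or methodology.

```python
def execution_slippage_bps(fill_price, benchmark_price, side):
    """Per-trade slippage vs a best-execution benchmark, in basis points.

    Positive = worse than benchmark (paid more on a buy, received less
    on a sell). A generic illustration, not the case study's metric."""
    if side == "BUY":
        diff = fill_price - benchmark_price
    else:  # SELL
        diff = benchmark_price - fill_price
    return 10_000 * diff / benchmark_price

# Bought at 100.05 against a 100.00 benchmark: 5 bps of slippage
print(round(execution_slippage_bps(100.05, 100.00, "BUY"), 1))
```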
Copyright © Capgemini 2016. All Rights Reserved
Insights as a Service | March 2016
Processed 1.2 BN records within the prescribed SLAs and made hardware (compute, storage) as well as
Spark configuration recommendations for the client to implement within their environment
From Big Iron to Big Data: Transforming a Global Bank’s Core Reg Reporting Operations leveraging Platform as a Service
Client Overview: Founded in 1865 to finance trade between Asia and the West, today the client is one of the
world’s largest banking and financial services organizations, serving more than 47 million customers. HSBC’s aim
is to be acknowledged as the world’s leading international bank.
Client Challenges:
▪ Global General Ledger data processing and reporting runs on legacy mainframe application that was being
retired
▪ The client needed to choose between re-negotiating an expensive contract for multiple years of lock in or
transform the platform to leverage advances in Big Data
▪ Internal Big Data platform unable to provision the specialized environment needed for this program
Delivering Business Data Lake as a Service in a high-performance configuration that would meet the
client’s needs. Built-in support for the environment in a pay-per-use model, with the ability to scale up within a
week, gave the client the flexibility they needed.
On the innovation front, we developed a Rule Migration Framework that would transform 80,000 rules
and criteria from mainframe formats to an open standard, accelerating the current transformation,
remaining re-usable for future upgrades, and resulting in significant savings in manual effort and a reduction of
errors.
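The essence of such a rule migration is parsing a legacy rule expression into a structured, open representation. The sketch below translates a made-up "FIELD OP VALUE -> ACTION" legacy format into JSON; the legacy syntax, field names, and JSON schema here are all hypothetical, since the deck does not describe the actual mainframe rule format.

```python
import json

def migrate_rule(legacy_rule):
    """Translate a legacy rule string into an open JSON-style structure.

    Assumes a hypothetical 'FIELD OP VALUE -> ACTION' legacy syntax;
    the real mainframe format is not described in the source."""
    condition, action = (part.strip() for part in legacy_rule.split("->"))
    field, op, value = condition.split()
    return {"if": {"field": field, "op": op, "value": value},
            "then": action}

# Hypothetical general-ledger rule in the assumed legacy format
rule = migrate_rule("GL_BALANCE GT 1000000 -> FLAG_REVIEW")
print(json.dumps(rule))
```

Once rules live in a structured form like this, the same definitions can be re-validated and re-targeted in future platform upgrades instead of being rewritten by hand.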
Capgemini’s unique mRapid data ingestion framework was leveraged to ingest terabytes of transactional
data into the Business Data Lake platform in compressed timeframe, thereby allowing the data analysis
and transformation to commence sooner than planned.
Key Benefits delivered:
▪ Our Pay-per-use, secure, scalable Business Data Lake as a Service allowed the client to start this strategic
program 6 months sooner than they would have, were they to use internal resources.
▪ Our innovative technical frameworks reduced manual effort by 30% over the course of the engagement.
BDLaaS Solution Architecture
Our key solutions