Thomson Reuters is pleased to sponsor this year’s A-Team Entity Data and Applications Directory. This special publication lists the major suppliers of regulatory and risk data services, covering areas such as FATCA, Solvency II, EMIR, Dodd-Frank, UCITS, the LEI, counterparty risk and much more.
Entity Data Management and Downstream Applications
CONTENTS
Introduction
Foreword
Overview
Legal Entity Identifier
Regulation
Entity Data Suppliers
Entity Data Management
Downstream Applications
As a marketing or business manager, you know you need content
marketing if you’re going to succeed in attracting and engaging
with today’s more savvy buyer. But do you:
• Struggle to find time to create content consistently?
• Find it hard to think of fresh topics to write about?
• Lack the capacity to generate blogs, run or moderate webinars, seminars or events, or produce other valuable content?
• Fail to generate enough leads or sales conversions from your marketing efforts?
You’re not alone. While 93% of marketers use content marketing today, their top two challenges are a lack of time (69%) and producing enough content (55%).*
Come to the content experts at A-Team Group.
A-Team Group has, since 2001, been delivering distinguished
content based on in-depth domain expertise on behalf of B2B financial
technology suppliers. Run by experienced business journalists,
we thrive on taking complex business and technology topics and turning
them into compelling content assets to drive lead generation and
prospect nurturing with a measurable ROI.
Whether you just need support with content for your blog, help managing a webinar, or a full-service content marketing strategy and execution, A-Team Group has the experience, knowledge and content know-how to help you succeed.
* Source: 2013 survey of 1,217 respondents across a range of industries, functional areas
and company sizes, by Content Marketing Institute, MarketingProfs and Brightcove.
For a free consultation or to ask any questions, give us a
call 020 8090 2055 or email angela@a-teamgroup.com
Foreword
by Adam Devine, VP, Product Marketing, WorkFusion
2015 appears (fingers crossed) to mark the end of six years of turbulence for the financial services industry, but the ascent back to healthy profitability won’t come solely from winning new business. Fundamental to widening margins will be radically reducing the cost, without compromising the quality, of the industry’s most valuable asset: data.
No-brainer, right?
You’d think it would be, but the technology and services businesses that serve financial
data operations have failed to respond to the need for radical cost reduction with radical
innovation where most of the cost lives: data collection.
Aside from a very few players, business process outsourcing and knowledge process outsourcing providers have maintained the status quo, wary of letting go of the dated and increasingly expensive labor arbitrage model and embracing automation. Bank-backed utilities are promising, but their data feeds still require customization. Automation point solutions are quarantined from one another and dedicated to solving individual pain points. Internal IT projects suffer from lean budgets.
Radical profitability improvement requires radical innovation. We believe this radical
innovation comes in the form of machine learning. Machine learning – or software that
programs itself by watching humans work – has the power not only to remove cost
through automating data collection, but also to improve quality, speed and transparency.
Unburdening valuable data analysts of repetitive data collection lifts human intelligence
from chasing data to running with it. How much more value could the financial industry
add if the human capital spent on data collection was invested in customer service and
product innovation?
WorkFusion is sponsoring A-Team Group’s Data Management Summits through this year
in pursuit of the answer. Please visit our booth for a live demo of how the industry is using
WorkFusion’s machine learning powered platform to solve data collection. We’re also
pleased to be sponsoring this industry handbook as an invaluable guide to entity data
and its management. We hope you will find it useful in your own pursuit of the answer.
Reference Data Review
Your Reference Data Resource from A-Team Group

Forthcoming Webinars
For news on further Hot Topic webinars as they are added, go to bit.ly/rdrwebinars
If you would like to learn about webinar sponsorship and
speaking opportunities, please contact Caroline Statman at
caroline@a-teamgroup.com
• March 24th: BCBS 239 (Part of Basel III)
• April 28th: Enterprise Data Management - The Next Generation
• May 7th: Pricing and Valuations Data
• May 14th: Screening for Sanctions, Watch Lists and PEPs
• May 19th: Data Governance
• May 28th: Solvency II
• June 2nd: Utility Model for Data Management
• June 9th: A Collaborative Approach to Client and Entity Data for Client Onboarding
• June 16th: BCBS 239
• July 9th: Entity Data Management
• July 14th: Risk Data Analytics
bit.ly/rdrwebinars
Overview
Hierarchies and links
Entity data hierarchies describe how an entity is connected to parent entities, other related entities and, ultimately, a beneficial owner. Entity links describe relationships between clients, counterparties and issuers.

Entity Data Definition
Entity data identifies:
• Clients
• Counterparties
• Issuers
Data lineage
Problem: Governing sources, controlling access and generating a full audit trail of how entity data is sourced have always been important, and regulations have made them essential.
Solution: WorkFusion actively governs worker access to sources, delegates specific tasks to specific tiers of workers, and controls, tracks and reports touchpoints from the start to the finish of the entity data supply chain.
www.workfusion.com
The financial crisis of 2008 exposed many faults in capital
markets, not least a lack of entity data covering relationships
between customers, counterparties and issuers, and their
exposure to each other. At the epicentre of the crisis, when
Lehman Brothers filed for bankruptcy, the data gaps made
it impossible for regulators and market participants to trace
holdings and business hierarchies across Lehman-affiliated firms
and measure counterparty risk exposure with any speed.
Lehman was not alone in the crisis, which proved catastrophic
for a number of financial institutions. The outcome was a
meltdown in financial markets and reputational loss for both
regulators and banking leaders. And there were questions: what had happened, why had it happened, and who was to blame?
The lack of data left regulators piecing together answers to these questions, but as holdings were unravelled and links between entities identified, it became clear that the data vacuum exposed by the crisis needed to be filled with accurate, complete and high-quality entity data that could be constantly updated.
While recovery from the crisis has been slow and costly, it has
provided a period for radical change set in motion by the G20
when it mandated the Financial Stability Board to create a
Global Legal Entity Identifier System that would issue a unique
identifier to each registered entity and provide both regulators
and market participants with tools to improve risk management.
A wave of regulation also followed the crisis and it, too, focused
on entity data, requiring institutions to begin the turnaround
from a security-centric approach to business to an entity-centric
approach. The challenges of this change are significant, but
benefits and opportunities are emerging as regulators gain a
clearer insight into the market and financial institutions develop
entity data management processes that provide a framework
for compliance with existing and forthcoming regulations, as
well as a better understanding of customers and counterparties
on which to build new business initiatives.
Legal Entity Identifier
Overview
The Legal Entity Identifier (LEI) is a free-to-use standard
entity identifier that uniquely identifies parties to financial
transactions. Its development, and that of a global LEI
system to support its widespread use, was mandated by the
2011 G20 Cannes Summit in the wake of the 2008 financial
crisis and in the hope of averting further similar crises.
While the 2008 crisis highlighted the inability of regulators
to track parties to transactions, measure their counterparty
risk and understand overall exposures with any speed, the
LEI is designed to help regulators measure and monitor
systemic risk by identifying parties to financial transactions
quickly and consistently, and obtaining an accurate view
of their global exposures. Market participants are also
using the LEI to improve risk management within their own
organisations.
To date, the driver behind LEI adoption has been regulation,
with existing regulations such as Dodd-Frank and European
Market Infrastructure Regulation (EMIR) requiring firms
within their scope to use LEIs for trade reporting. The
identifier is also a requirement of forthcoming regulations
such as Solvency II and Markets in Financial Instruments
Directive II (MiFID II), and is expected to be mandated in
any further regulations touching on entity data.
Market participants are taking different approaches to
implementing the LEI. At this stage, most are taking a
tactical approach, mapping the LEI into multiple and
separate data stores, but some are taking a strategic
approach and using regulatory requirements around the LEI
as an opportunity to review how they acquire, manage and
distribute entity data.
With numbers of LEIs issued in the low hundreds of
thousands, market participants expect the tipping point for
wide-scale adoption to come when over one million LEIs are
issued, or sooner if regulators mandate increased use of the
identifier.
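Although the handbook does not detail the identifier’s internal format, ISO 17442 defines the LEI as a 20-character alphanumeric code whose final two characters are check digits computed with the ISO 7064 MOD 97-10 scheme, as used for IBANs. The validation step can be sketched as follows (function names are illustrative, not from any particular library):

```python
# Sketch of LEI validation per ISO 17442 / ISO 7064 MOD 97-10.

def _to_digits(s: str) -> str:
    # Map each character to its base-36 value, as MOD 97-10 requires
    # ('0'-'9' -> 0-9, 'A'-'Z' -> 10-35).
    return "".join(str(int(c, 36)) for c in s)

def lei_is_valid(lei: str) -> bool:
    """True if lei is 20 alphanumeric characters whose MOD 97-10 value is 1."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    return int(_to_digits(lei.upper())) % 97 == 1

def lei_check_digits(stem: str) -> str:
    """Compute the two check digits for an 18-character LEI stem."""
    remainder = int(_to_digits(stem.upper() + "00")) % 97
    return f"{98 - remainder:02d}"
```

An LOU issuing an identifier from its prefix and an entity-specific part would append `lei_check_digits(stem)` to produce the full 20-character code, and any consumer can re-run the modulus check to catch transcription errors.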
Development timeline
The mandate issued by the 2011 G20 Cannes Summit
aimed at stabilising global financial markets and, on this
basis, required the international Financial Stability Board
(FSB) to lead regulatory work and deliver recommendations
for a global LEI system by June 2012, ahead of the 2012
G20 Summit in Los Cabos.
At a Glance
The LEI is a standard and free-to-use entity identifier designed to work within the Global LEI System to help regulators stem systemic risk. It is not yet widely used, but is gaining traction as regulations mandate its use and early adopters find entity data management use cases for it beyond regulatory trade reporting.

Statistics
• Over 330,000 LEIs issued worldwide
• Four-digit prefixes allocated to 30 pre-Local Operating Units
• 22 operational pre-Local Operating Units
• 22 pre-Local Operating Units endorsed by the Regulatory Oversight Committee

www.counterpartylink.com

While previous attempts by the financial industry to create a common global entity identifier failed due to lack of collective intent, lessons learnt from the financial crisis led regulatory authorities and market participants to agree that a uniform global system for legal entity identification would be beneficial and to the public good.
Regulators would be in a better position to measure and
monitor systemic risk, and handle any resolutions; financial
firms would be able to improve risk aggregation and
reduce operational risks associated with reconciling the
identification of entities; and all parties would benefit from
higher quality and more accurate entity data.
With just a year to make recommendations and ambitious
plans to have a self-standing interim global LEI system in
place by March 2013, the FSB approved an International Organization for Standardization (ISO) proposal for an LEI standard in May 2012.
Initial plans proposed a single and central global LEI
registration authority – Swift favoured itself as a good fit
for this – but the plans were quashed by recommendations
from the FSB’s LEI expert and industry working groups
for a global federated model that would include local
organisations registering entities and issuing LEIs.
With an LEI standard in place and the decision made on
a federated system, the FSB issued a report in June 2012
entitled ‘A Global Legal Entity Identifier for Financial
Markets’. The report included 35 recommendations for the
development and implementation of the Global LEI System
and was approved by the 2012 Los Cabos G20 Summit.
Addressing the report’s recommendation that a global LEI
system should be developed for the benefit of both public
regulators and private market participants, the FSB established
an LEI Implementation Group and supporting Private Sector
Preparatory Group that would work together, following and
clarifying the recommendations for a three-tier global system
comprising a Regulatory Oversight Committee, a Central
Operating Unit and Local Operating Units.
At times, progress was slow, but the FSB deadline of having an interim global LEI system in place by March 2013 was met. Work continues on improving the system and it is expected to be complete once its central operations, including a central LEI database, are in place.

Avox, a wholly owned subsidiary of The Depository Trust & Clearing Corporation (DTCC), matches, enriches and maintains legal entity reference data for its clients, delivering corporate hierarchies, registered address information, industry sector codes and company identifiers. This approach ensures that clients can rely on the most accurate and timely data available to facilitate decision making and regulatory reporting. For more information, please visit www.avox.info.
The Global LEI System
Built on the basis of the FSB’s recommendations, the Global
LEI System includes three key elements:
The Regulatory Oversight Committee (ROC) – The ROC
includes regulators from around the world that have agreed
to participate in the Global LEI System, follow its principles
and purpose, and support its governance in the interests of the public good.
Outstanding issues
• Central LEI database to be established
• LEI hierarchy data to be defined
• Additional pre-Local Operating Units to be endorsed
• Transition from interim to complete Global LEI System
Key Links
Financial Stability Board report and recommendations: www.leiroc.org/publications/gls/roc_20120608.pdf
Pre-Local Operating Units endorsed by the Regulatory Oversight Committee: www.leiroc.org/publications/gls/lou_20131003_2.pdf
Charter of the Regulatory Oversight Committee for the Global LEI System: www.leiroc.org/publications/gls/roc_20121105.pdf
Statutes of the Global LEI Foundation: www.leiroc.org/publications/gls/gleif_20140824_3.pdf
The first pre-Local Operating Units endorsed by the ROC in October 2013 were the CICI utility,
sponsored by the US Commodity Futures Trading
Commission (CFTC) and operated by DTCC and Swift; WM
Datenservice, sponsored by the German Bundesanstalt für
Finanzdienstleistungsaufsicht; and the Institut National de la
Statistique et des Etudes Economiques, sponsored by the
French Ministry for Economy and Finance.
The CICI utility issued CFTC Interim Compliance Identifiers (CICIs) ahead of the ROC’s allocation of four-digit prefixes to pre-LOUs, and registered CICIs on behalf of third parties ahead of the FSB decision to allow only self-registration. It removed tens of thousands of CICIs from its database in 2013, before distancing itself from these early glitches by rebranding as the Global Markets Entity Identifier Utility in January 2014.
LEI hierarchy data
LEI hierarchy data has become a subject of significant debate
since the FSB issued its recommendations for a global LEI
system in June 2012. At the time, the FSB stated that initial
reference data to be used in the Global LEI System should
be the business card data described in the LEI standard ISO
17442:2012.
The FSB also recommended that the ROC undertake regular reviews of the LEI reference data and monitor required changes, additions, retirements and modifications. Further, it recommended that the LEI Implementation Group develop proposals, by the end of 2012, for additional reference data on the direct and ultimate parents of legal entities, and on relationship or ownership data more generally.

Significant Milestones
2011 – G20 mandates the Financial Stability Board (FSB) to deliver recommendations for a global LEI system
June 2012 – The FSB publishes recommendations for a global LEI system; the recommendations are endorsed by the G20
January 2013 – The Regulatory Oversight Committee (ROC) of the LEI takes over management of the LEI initiative from the FSB
March 2013 – Deadline for a self-standing global LEI system is met
October 2013 – First pre-Local Operating Units are endorsed by the ROC
June 2014 – The Global LEI Foundation (GLEIF) is established
July 2014 – Stephan Wolf, chief technology officer at Interactive Data Managed Solutions AG, is appointed CEO of the GLEIF

Registration fees for an LEI
Registration fees for LEIs are paid by entities to Local Operating Units (LOUs) within the Global LEI System. The system is based on sustainable funding and includes fees that must be paid by LOUs to support the Global Legal Entity Identifier Foundation (GLEIF). The foundation requires LOUs to pay a $20 annual licence fee for each LEI they issue. LOUs must also pay a member credit fee of $10 per LEI to supplement funding of initial GLEIF operations.
Registration fee examples:
• The London Stock Exchange is an endorsed pre-LOU. It charges £115 (plus VAT) for an initial LEI registration and an annual maintenance cost of £70 (plus VAT) per LEI. In line with sustainable funding, both costs include the LEI licence fee the LOU must pass back to the GLEIF.
• The US Global Markets Entity Identifier (GMEI) utility, operated by DTCC in collaboration with Swift, is an endorsed pre-LOU. It charges $200 for an initial LEI registration plus a charge of $20 that is passed back to the GLEIF. The annual maintenance cost is $100 plus a $20 charge that is passed back to the GLEIF.
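Using the figures quoted above for the GMEI utility, the fee pass-through structure can be sketched as a small calculation (illustrative only; fees are set by each LOU and may change):

```python
# Illustrative only: fee figures as quoted in the examples above.
GMEI_INITIAL_REGISTRATION = 200   # USD, paid to the LOU
GMEI_ANNUAL_MAINTENANCE = 100     # USD, paid to the LOU
GLEIF_PASS_THROUGH = 20           # USD, passed back to the GLEIF per LEI

def first_year_cost() -> int:
    # Initial registration plus the GLEIF pass-through charge.
    return GMEI_INITIAL_REGISTRATION + GLEIF_PASS_THROUGH

def ongoing_annual_cost() -> int:
    # Annual maintenance plus the GLEIF pass-through charge.
    return GMEI_ANNUAL_MAINTENANCE + GLEIF_PASS_THROUGH

print(first_year_cost(), ongoing_annual_cost())  # 220 120
```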
These proposals never came to light and after a significant
time lapse, the ROC returned to the outstanding issue of LEI
hierarchy data, which is essential to the original intent of the
LEI to help regulators measure and monitor systemic risk, in a
progress note published in January 2015.
The note acknowledged the importance of information
on organisational relationship structures, particularly
hierarchical structures. It also described a task force set up
by the ROC in December 2014 to develop a proposal for
the collection of information on direct and ultimate parents
of legal entities within the Global LEI System. A public
consultation on the topic is due to take place in 2015, with
discussion points likely to include the necessary extent of
hierarchy data, how it will be standardised on a global basis
and how it will be funded within the global system. After
this, the ROC suggests phased implementation of hierarchy
data will begin at the end of the year.
Outlook
Development of the Global LEI System has been sporadic,
but significant progress has been made since work began
in 2012 and positive market sentiment suggests the system
could succeed in its aim of helping regulators monitor
and measure systemic risk. The system benefits from joint
development by the public and private sectors, and on
this basis it is gaining credibility and buy-in across financial
markets.
From a practical perspective, the key components of
the system are in place and LEIs are being issued by an
increasing number of LOUs that have been endorsed by
the ROC. While the number of entities that have registered
for and been allocated an LEI is low in terms of the global
universe of entities, the number is rising and will continue
to rise as Dodd-Frank and EMIR drive adoption of the
identifier, and forthcoming regulations mandate its use.
The LEI is unlikely to be used by financial institutions as a
primary identifier for many years to come, if at all, but it
is here to stay and will become more useful as coverage
increases, hierarchy data is added to the basic LEI standard
and the Global LEI System is refined and strengthened to
give regulators a clearer view of market activity and systemic
risk, and global financial institutions a better understanding
of their customers and risk exposure.
Regulation

Overview
Regulatory reform following the financial crisis sought to
stabilise and secure capital markets by closing information
gaps exposed during the demise of financial institutions,
providing a clear view of counterparty and market risk
exposure, and improving the transparency of financial
transactions and market activity.
The US Government’s Dodd-Frank Wall Street Reform
and Consumer Protection Act was the first regulation
aimed at preventing further crises and took effect in July
2010. It was followed by the European Union’s European
Market Infrastructure Regulation (EMIR), which focuses
on transparency in the over-the-counter (OTC) derivatives
market and was implemented in August 2012 with an initial
reporting deadline of February 2014.
These regulations and others, including Know Your
Customer (KYC) and Anti-Money Laundering (AML)
regulations, Basel III and BCBS 239, Markets in Financial
Instruments Directive II (MiFID II), Solvency II and the
Alternative Investment Fund Managers Directive (AIFMD),
all require the use of legal entity data and/or Legal Entity
Identifiers (LEIs) as a means to standardise data that is
used to identify entities, discover entity exposure or
meet regulatory disclosure requirements. Similarly, the
Foreign Account Tax Compliance Act (FATCA) uses Global
Intermediary Identification Numbers to identify financial
institutions within its scope.
The entity data and LEI requirements of regulations already
using LEIs and entity data are described below.
Additional detail covering the full scope of the regulations
can be found in A-Team Group’s Regulatory Data
Handbook - http://bit.ly/regulatoryhandbookedition2
Dodd-Frank
The Dodd-Frank Wall Street Reform and Consumer Protection Act is US federal legislation that aims to promote oversight of financial institutions through a wide array of reforms. The legislation calls for the creation
of new data, issues guidelines on reporting formats and
maintaining and analysing existing data, and focuses on
standardisation of reference data across the industry.
The legislation’s requirement for standard reference data is
designed to improve the quality of financial data available
to regulators so that better analysis of risk and market
data can be made. It is also designed to improve market
transparency, initially in the OTC derivatives market.
The LEI facilitates these objectives by consistently
identifying parties to financial transactions and supporting
the aggregation of risk information associated with each
legal entity. The LEI is in early stages of adoption by
entities that are active in capital markets, but as take-
up grows, either by choice or as a result of regulatory
mandates, regulators will increasingly be able to
consolidate and analyse counterparty risk data without
having to reconcile multiple, non-standard datasets.
For financial institutions, the challenges of Dodd-Frank
include implementing the LEI, as many reference data
repositories are not readily extensible, and investing
in processes that allow the identifier to be included in
downstream systems that assess risk and counterparty
exposure. As the majority of entities do not yet have LEIs,
firms must also continue to use numerous proprietary and
vendor identifiers to access data from different sources of
entity data. This presents a significant cross-referencing
challenge that is expected to endure until global LEI
coverage is complete and firms become confident enough
in their use of the LEI to make it a primary entity identifier.
Although implementation of Dodd-Frank has been slow
since it became effective in July 2010 and firms are at
different stages in their responses to the regulation,
best practices are starting to emerge as large financial
institutions adopt the LEI and establish entity data
management and governance programmes that meet the
requirements of the regulation.
At a Glance
Regulation: Dodd-Frank Wall Street Reform and Consumer Protection Act
Regulatory Regime/Authority: US Government
Effective Date: July 21, 2010
Target Market: Global financial institutions
Core Data Requirements: Identification of entities – clients, counterparties and issuers
Regulatory data that keeps you on the right course. Only
Thomson Reuters has the depth and breadth of data, the global
footprint, local knowledge and proven experience to deliver the
exact data you need to not just comply – but thrive – anywhere
you do business. Step by step guidance for cost-effective
compliance, across the board, across the globe, including
specialist data sets for: FATCA, Basel III, Solvency II, EMIR,
Dodd-Frank, IFRS and more.
prdcommunity.com
At a Glance
Regulation: European Market Infrastructure Regulation (EMIR)
Regulatory Regime/Authority: European Union
First Reporting Deadline: February 12, 2014
Target Market: Global financial institutions
Core Data Requirements: Identification of entities – clients, counterparties and issuers
European Market Infrastructure Regulation
European Market Infrastructure Regulation (EMIR) is a
European Union regulation designed to ensure OTC
derivatives are cleared via a central counterparty (CCP).
In this context, a CCP must be listed in the European
Securities and Markets Authority (ESMA) registry and
set up and authorised as described in EMIR so that it is
recognised across member states. EMIR also introduces
risk management procedures for non-cleared OTC
derivatives and requirements for derivatives to be reported
to a trade repository.
Under EMIR, both parties to a trade must ensure that data
related to a concluded trade, as well as data related to
the entities involved in the trade, is reported to a trade
repository. All derivatives contracts regulated by EMIR,
including both OTC and exchange-traded derivatives,
must be reported, as well as lifecycle events such as give-
ups and terminations. Firms have until the working day
following the trade to meet reporting requirements.
EMIR mandates the use of LEIs for reporting as well as the
use of Unique Trade Identifiers (UTIs) that are common to
both parties to a trade and are used to report to a trade
repository. Both identifiers raise data management issues and, used together in a complex system, can be difficult to manage. One difficulty with the LEI is that
firms must map it to their client and counterparty entity
data. To ensure correct mapping, many firms are working
to centralise entity data and create an entity master that
will accommodate the LEI and other proprietary and
vendor identifiers, as well as support entity hierarchy data.
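The centralised entity master described above can be sketched as a simple cross-reference index. All class and field names here are hypothetical, not a vendor API:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

# Hypothetical sketch of an "entity master": one golden record per legal
# entity, indexed by every identifier scheme that names it.
@dataclass
class EntityRecord:
    lei: Optional[str]                                         # None until an LEI is registered
    identifiers: Dict[str, str] = field(default_factory=dict)  # scheme -> identifier
    parent_lei: Optional[str] = None                           # direct parent, for hierarchy data

class EntityMaster:
    def __init__(self) -> None:
        self._index: Dict[Tuple[str, str], EntityRecord] = {}

    def add(self, record: EntityRecord) -> None:
        # Index the record under its LEI and every proprietary/vendor identifier.
        if record.lei:
            self._index[("lei", record.lei)] = record
        for scheme, ident in record.identifiers.items():
            self._index[(scheme, ident)] = record

    def lookup(self, scheme: str, ident: str) -> Optional[EntityRecord]:
        # Resolve any known identifier to the single golden record.
        return self._index.get((scheme, ident))
```

The design point is that downstream systems keep their existing proprietary or vendor identifiers but always resolve to one golden record, so the LEI can be added as just another scheme rather than forcing a big-bang migration.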
The UTI poses different problems as there is no standard
mechanism for the issue of the identifiers. The result is that
UTIs are usually based on bilateral agreements between
trading parties. Without agreement on a common UTI,
firms have to deal with a large number of trade repository
reconciliation breaks.
As a result of the data management issues around LEIs and
UTIs, only a small percentage of trades have so far been
matched and reported correctly, a situation that needs to
improve as regulators increase their scrutiny across Europe
and apply fines for incorrect reporting.
EMIR was introduced in August 2012, with a reporting
deadline of February 2014. ESMA has registered six trade
repositories: DTCC Derivatives Repository, UnaVista,
KDPW, Regis-TR, CME TR and ICE Trade Vault Europe.
Know Your Customer
Know Your Customer (KYC) regulations are designed to
ensure that financial institutions can verify the identity of
their clients on an ongoing basis and are aimed at preventing
money laundering, financial fraud and, increasingly, activities
such as identity theft and terrorist financing. KYC is not a
single regulation and instead spans the requirements of
countries operating under different legal systems.
Essentially, companies subject to KYC regulations must
collect and retain information about clients before doing
business with them. The information is not new – it is legal entity data – but it can present data management challenges
for financial institutions that must quickly identify clients
and classify them correctly according to their circumstances,
including country of origin, business type, source of assets
and income, types and purpose of transactions, and funds
held. This information needs to be kept up to date and
frequently submitted to regulators, meaning firms must
continually reassess their KYC procedures to ensure their
client data is both accurate and complete.
The complexity of KYC reporting requirements means some
firms may need to do more than keep a central repository
of information and track related audit trails. They may need
to work towards linking KYC compliance requirements with
customer data due diligence. From a data perspective, both
internal and external data feeds must be maintained, not
only for purposes of data distribution, but also in support of
risk management processes that store the data and provide
analysis of customer records.
While KYC deadlines vary between countries, most countries
with anti-money laundering concerns have had regulations in
place since the early 2000s, prompting financial institutions
to get their KYC processes up to speed and avoid penalties
imposed for non-compliance.
In many cases, entity data management in support of local KYC compliance can help firms comply with international regulations that require entity data, such as Dodd-Frank and the US Foreign Account Tax Compliance Act (FATCA). Efficient management of KYC documentation used for client onboarding and standardisation of entity data used in the KYC process can also deliver significant cost savings that cannot be achieved using non-standard and manual processes.

At a Glance
Regulation: Know Your Customer (KYC)
Regulatory Regime/Authority: Multiple
Target Market: Global financial institutions
Core Data Requirements: Client identification, classification and ongoing customer data due diligence

As a seasoned standards practitioner, CGS aggressively promotes the LEI and propagates its global adoption through a collaboration with DTCC’s GMEI utility, allowing CUSIP/ISIN and LEI applications through a single interface. On the solutions side, CGS offers a robust directory of legal entity data, free to existing clients and produced via a collaboration with Avox, as well as a linkage file to connect related issuers in the CUSIP database.
www.cusip.com
Foreign Account Tax Compliance Act
The Foreign Account Tax Compliance Act (FATCA) is a US Government-issued regulation that requires foreign financial institutions (FFIs) to carry the burden of tax reporting to the US Internal Revenue Service (IRS) for any US clients. FFIs
must enter contracts with the IRS and register for a Global
Intermediary Identification Number (GIIN) through the IRS
portal. GIINs can be used to identify financial entities and
counterparties as being FATCA compliant.
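A GIIN follows the published pattern XXXXXX.XXXXX.XX.XXX, so a structural check can catch malformed identifiers before they enter an entity master. The sketch below is illustrative only; the category codes and character classes are drawn from the general GIIN composition and should be checked against the IRS specification before use.

```python
import re

# Hedged sketch: the GIIN format is XXXXXX.XXXXX.XX.XXX -- a 6-character
# FATCA ID, a 5-character financial institution identifier, a 2-letter
# category code (e.g. LE for lead, SP for sponsoring entity, BR for
# branch) and a 3-digit ISO 3166-1 numeric country code. The letter O
# is excluded here to avoid confusion with zero; verify exact rules
# against the IRS specification.
GIIN_PATTERN = re.compile(
    r"^[A-NP-Z0-9]{6}\.[A-NP-Z0-9]{5}\.(LE|SL|ME|BR|SP|SF|SD|SS|SB)\.\d{3}$"
)

def looks_like_giin(value: str) -> bool:
    """Cheap structural check before treating a string as a GIIN."""
    return bool(GIIN_PATTERN.match(value.strip().upper()))
```

Such a check only validates shape, not registration; confirming that a GIIN is actually issued still requires the IRS FFI list.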
In order to enforce FATCA, the US government is making
Intergovernmental Agreements (IGAs) with other countries
and has signed about 50 Model 1 agreements, which
require FFIs to report all FATCA information to their own
governmental agencies that then report to the IRS. It has
also signed a handful of Model 2 agreements, which require
FFIs to report directly to the IRS. Many more countries have
negotiated IGAs but not yet finalised them; these are being
treated as having an IGA in place following guidance set
down by the IRS in April 2014.
Beyond FATCA, and acknowledging the desire of countries
other than the US to clamp down on tax avoidance schemes, a global
version of the legislation, GATCA, is being promoted by the
Organisation for Economic Co-operation and Development
(OECD), which last year proposed a global standard for the
automatic exchange of tax information between countries.
The OECD proposal has been endorsed by the G20 and
accepted by more than 40 countries that could impose their
own FATCA style rules from the start of 2016.
The Global Legal Entity Data Source
Bloomberg’s high quality entity data is the result of the people,
process and technology we employ in acquiring, scrubbing,
normalizing, mapping and delivering the data. Our experts
monitor the evolving regulatory landscape, M&A activity,
sanctions lists, news and other primary sources with the goal
of delivering timely, complete data. Bloomberg’s entity data
is mapped to the LEI and fully integrated into all Bloomberg
data sets, ensuring firms are able to assess risk and maintain
compliance. bloomberg.com/enterprise
CounterpartyLink provides legal entity intelligence solutions to
global buy-side and sell-side institutions. Our services offer entity
information direct from registration and other primary sources
for KYC/AML, verified beneficial ownership to 10% to aid with
FATCA and the 3rd EU Money Laundering Directive, industry codes
(including LEIs) for Dodd-Frank and EMIR compliance, D&O
information for screening, BICs and FRNs for transaction reporting,
and full parent hierarchies for risk analysis, with documentary
evidence so you can prove it. www.counterpartylink.com
Regulation (cont.)
Overview
Entity data is growing in volume and importance as
financial institutions move from a securities-centric
approach to business to a more entity-centric approach.
This shift reflects the realities of the financial crisis,
which showed institutions lacking knowledge of their
counterparties and risk exposure, and the backlash of
regulation, which requires them to improve entity data
management and their understanding of counterparties
and exposure.
While securities data has been largely automated as part
of the trading process, entity data has lagged behind, but
it is catching up as data suppliers and consumers consider
how best it can be sourced, validated, cleansed and
consumed.
To date, most financial institutions have researched internally
a fair amount of the data related to entities they do, or intend
to do, business with, and sourced the rest from external
suppliers. But the balance is beginning to tip in favour of
external suppliers as banks are challenged by the sheer
volume of entity data they must source and maintain; use
cases for entity data grow in number; and institutions
realise the cost benefits and efficiencies of using third
parties to support ongoing entity data requirements.
Entity data suppliers
Financial institutions typically source entity data from a
number of suppliers. Some suppliers are large market data
vendors, others are niche providers dedicated to entity
data, and a few are industry-based organisations such as
business registries and Local Operating Units (LOUs) that
issue Legal Entity Identifiers (LEIs) within the Global LEI
System.
Whatever their size and scope, the role of entity data
suppliers in highly regulated markets is to continually
drive up the accuracy, consistency and quality of data.
Entity Data Suppliers
At a Glance
Regulation: Foreign Account Tax Compliance Act (FATCA)
Regulatory Regime/Authority: US Government
Compliance deadline: December 31, 2014
Target Market Segment: Global financial institutions
Core Data Requirements: Client onboarding, data maintenance and reporting
Entity Data Management
Overview
The financial crisis, increasing market regulation and the
need to identify new business opportunities are driving
financial institutions to migrate from securities-based
data management to entity data management. Where a
securities-centric view of capital markets failed to answer
questions on risk exposure raised during the financial crisis,
an entity-centric view provides a clear understanding of all
parties to a financial transaction, supports the aggregation
of risk exposure to counterparties, and offers a platform on
which to build new business initiatives.
The transition to entity data management poses challenges
and requires both time and management buy-in, but it is
not insurmountable and can be achieved using in-house
data management expertise, vendor solutions, or a mix of
build and buy components.
For many financial institutions, the challenges include
entity identification and pulling together entity data from
multiple data silos and data sources to create an entity
data master that can provide a single and consistent view
of entities, be they customers, counterparties or issuers.
Challenges
• Multiple silos of entity data
• A lack of links between internal entity data
• Multiple sources of entity data
• Numerous proprietary, vendor and industry standard entity identifiers
• Differences in granularity of entity data
• Growing volumes of entity data
• Internal risk management requirements
• External regulatory requirements
Entity identification and matching related entities across
unlinked datasets often involves fuzzy matching based
on entity names, addresses and other attributes. At best,
this is an imperfect science. A better solution to achieving
entity identification is an unambiguously defined and
universally recognised identifier that serves as a cross-
reference across entity datasets within a company and
can be used in communication with other companies.
The Legal Entity Identifier (LEI) is making a good start in
this direction, but is not expected to eclipse the many
proprietary and data vendor identifiers used in the market
any time soon.
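The fuzzy matching described above can be sketched in a few lines of standard-library Python. This is purely illustrative: production entity matching uses far richer normalisation and scoring, and the legal-suffix list and 0.85 threshold below are assumptions for the example, not industry values.

```python
from difflib import SequenceMatcher

# Assumed, simplified list of legal-form suffixes stripped before matching.
LEGAL_SUFFIXES = {"ltd", "limited", "inc", "plc", "llc", "ag", "sa", "gmbh"}

def normalise(name: str) -> str:
    """Lower-case a legal entity name and drop punctuation and suffixes."""
    tokens = name.lower().replace(",", " ").replace(".", " ").split()
    return " ".join(t for t in tokens if t not in LEGAL_SUFFIXES)

def name_similarity(a: str, b: str) -> float:
    """Similarity of two entity names on a 0.0-1.0 scale."""
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio()

def is_probable_match(a: str, b: str, threshold: float = 0.85) -> bool:
    return name_similarity(a, b) >= threshold
```

Even with normalisation, scores near the threshold would be routed to a human reviewer rather than matched automatically, which is precisely why an unambiguous identifier is preferable.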
Without the benefit of a single, global and standard
identifier, LEIs, proprietary identifiers and data vendor
identifiers must be mapped together and the underlying
data reconciled to deliver a single, accurate and consistent
view of an entity. When data is removed from silos in this
way and a master data management approach is taken,
it is possible to see real benefits. Not only are regulatory
requirements for entity data met accurately and in a timely
manner, but also, by way of example, enriched data can be
made available to the risk practice; account management
can link data about issuers, customers and counterparties
to see how firms are related, build deep relationships and
identify cross-selling opportunities; and trading can better
understand risk exposure and improve reporting.
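The cross-referencing of LEIs, proprietary and vendor identifiers onto one master record can be sketched as below. This is a minimal illustration, assuming an entity master keyed on the LEI; the identifier schemes and records are invented for the example.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class EntityMaster:
    """Toy entity master: one golden record per LEI, plus a cross-reference
    mapping (scheme, identifier) pairs onto that LEI."""
    records: dict = field(default_factory=dict)   # lei -> entity attributes
    xref: dict = field(default_factory=dict)      # (scheme, id) -> lei

    def add_entity(self, lei: str, attributes: dict) -> None:
        self.records[lei] = attributes
        self.xref[("LEI", lei)] = lei

    def map_identifier(self, scheme: str, identifier: str, lei: str) -> None:
        """Link a proprietary or vendor identifier to the master record."""
        self.xref[(scheme, identifier)] = lei

    def lookup(self, scheme: str, identifier: str) -> Optional[dict]:
        """Resolve any mapped identifier to the single golden record."""
        lei = self.xref.get((scheme, identifier))
        return self.records.get(lei) if lei else None
```

A real implementation must also reconcile conflicting attributes across sources and version the mappings for audit, but the lookup path, any identifier resolving to one consistent record, is the essence of the approach.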
A robust entity master can also help firms address
operational problems such as stale or out-of-date data that
is difficult to monitor and update when held in multiple
databases. It can also lower the cost of owning and
managing customer and other entity data.
Finally, the industry mantra of complete, accurate and
timely data is as applicable to entity data as to any other
data, but it is only part of the data management picture,
which should also include sound data governance and the
development and use of best practices by all stakeholders
in entity data.
Entity Data Management (cont.)
Solutions
• Deployed software solutions
• Hosted software solutions
• Managed services
• Entity data utilities
Filings Data Extraction
Problem: SEC filings contain few tags, not all 10-Ks contain even
semi-structured data, and most content is unstructured, which makes
data collection labor-intensive, slow, and expensive.
Solution: WorkFusion retrieves filings from EDGAR, breaks each
document into tasks based on relevant content, and distributes
tasks to human analysts. WorkFusion programmatically quality
controls their output, and cleansed data is used to train WorkFusion’s
machine learning algorithms to automate filings extraction. www.workfusion.com
Best practice
An example of a best practice project describes a bank using a vendor Legal Entity Identifier directory as an entity data file to which about 30 other identifiers are mapped. Downstream systems are populated with data from the entity master. If the project is executed well, the bank should be able to quickly and relatively easily calculate counterparty risk across the organisation.
Data management solutions
While the end game of entity data management is the
provision of a single, accurate, consistent and frequently
updated view of every entity with which a financial
institution has relationships and does business, there are a
number of ways to get there. Depending on existing data
management architectures and appetite for change, firms
may choose to build entity data management solutions
in house or opt for vendor solutions that fit their business
models. Vendor solutions include enterprise software,
hosted solutions, managed services and emerging data
utilities.
Enterprise software – Vendor software solutions that are
deployed in house by financial institutions to manage
entity data typically take a master data management
approach that links client, counterparty and issuer data
into a centralised entity master. These solutions provide
a single view of each client and its roles and relationships
across a business and can be used to facilitate regulatory
compliance, client onboarding, cross-selling and a better
understanding of risk exposure. Expected operational
improvements include greater efficiency, reduced cost and
enhanced data quality.
Hosted solutions – Firms with limited IT resources or
strategies to outsource non-core activities can benefit from
hosted solutions that remove the burden of deploying and
maintaining software in house and provide on demand
access to entity data management software that is hosted,
run and maintained by a vendor. These solutions are
usually based on a vendor’s enterprise software offering
and provide a dedicated instance of the software for
each client as well as the ability to configure the software
to meet business needs. Expected benefits of hosted
solutions include reduced total cost of ownership,
increased speed to implementation, and a lower
requirement for technical skills.
Entity Data Management (cont.)
Advanced Information Management
AIM Software’s GAIN Entity Master ensures consistent and high
quality entity data throughout the financial institution. Cleansing,
matching and maintaining entity data across all systems, business
lines and channels, GAIN acts as a central repository of high
quality entity data that fuels all operations and reporting. As a
result, firms benefit from a consistent view of their counterparties,
enabling better compliance and exposure control, an improved
client experience - on-boarding, multi-channel interaction - and
the identification of new client services. www.aimsoftware.com
Enriching data for the financial markets
Data quality has a direct impact on your ability to manage risk, on-board clients and
meet reporting requirements.
Avox provides market participants with high quality legal entity information, helping
them monitor risk and make informed decisions.
Clients can now access regulatory reporting content for help with Dodd-Frank, EMIR
and FATCA classifications.
For more information, smartdata@avox.info
www.avox.info
A DTCC COMPANY
BOSTON, LONDON, NEW YORK, SYDNEY, TOKYO, WREXHAM (WALES)
Entity Data Essentials
To meet client onboarding and regulatory compliance requirements, entity data must be:
• Accurate
• Complete
• Consistent
• Timely
• Accessible
• Auditable
• Updated
Overview
Accurate, complete and consistent legal entity data is
fundamental to many downstream applications. The
applications noted below are already mandatory for most
financial institutions, but they will continue to develop and
more may be added as further regulation is implemented and
existing regulation is more stringently enforced.
Most applications use entity data as a means of identifying
customers, counterparties and issuers. Some, such as client
onboarding and client screening, assess whether entities
are suitable to do business with and, if this is not clear, raise
an alert for additional checks to be made by application
users. Client onboarding is a particular pain point for many
institutions, as application processes need to be improved
to meet increasing regulatory scrutiny and avoid penalties
including large fines for non-compliance.
Other applications, such as risk management, use entity data
to discover an institution’s exposure to any other entity, a
data management task that can be complex as entities are
often related in a hierarchy or ‘family tree’ to other entities
that also carry risk that could affect them. Similarly, regulatory
compliance applications require entity data to discover and
disclose parties to financial transactions.
While these applications have different objectives, they also
have much in common, including the need for accurate,
complete, consistent and timely entity data and, perhaps most
importantly, data that is continually maintained, updated and
can be used to form a concise and precise audit trail.
Client onboarding
Client onboarding is a process that financial institutions
must complete before doing business with a client. It is an
extensive process, with entity data at its heart, and must
comply with Know Your Customer (KYC) regulations that
require firms to be able to verify the identity of their clients
on an ongoing basis with a view to preventing activities
Fenergo’s sophisticated Client Lifecycle Management solution enables
financial institutions to efficiently manage the end-to-end regulatory
onboarding and entity data management processes. Its rules-driven
solution ensures compliance with multiple regulatory frameworks
and supports the collection, centralization and sharing of client and
counterparty data and documentation across the financial institution.
By expediting compliance and improving operational efficiencies,
Fenergo’s solutions can onboard clients faster, improve time to
revenue and enhance overall client experience. www.fenergo.com
Downstream Applications
Managed services – Like hosted solutions, managed
services remove the burden of deploying and maintaining
software in house, but unlike hosted solutions they are
based on a central vendor managed platform that can
be accessed by many clients. A managed entity data
management service will typically cleanse and de-duplicate
client entity data, and map vendor and industry standard
entity identifiers to the client identifier to deliver an entity
master. The service will provide ongoing maintenance of
legal entity data, taking into account corporate actions
that may change the data, and deliver updates to the
client’s entity master file at requested intervals. Expected
benefits of managed services include reduced total cost of
ownership, increased speed to implementation, flexibility
to meet changing client requirements, and regular feeds of
updated entity data to clients’ master files.
Entity data utilities – Data utilities advance the concepts
behind existing managed services by proposing shared
services based on a single platform that handles data once
and disseminates it many times. In terms of entity data,
the aim is to ease the industry’s data management burden
and deliver economies of scale by consolidating data
management processes that are repeated across financial
institutions in data utilities. Both commercial vendors
and industry consortia are developing data management
utilities, but questions linger about how flexible they will
be in meeting different clients’ requirements, how they will
manage data ownership and governance, and the extent
to which firms will have to alter internal processes to take
advantage of the utility model. The expected benefits of
data utilities match those of other managed services, but
whether they can go further in terms of reducing costs,
improving entity data quality and enhancing operational
efficiency remains to be seen.
Entity Data Management (cont.)
Bureau van Dijk’s Compliance Catalyst is the ultimate risk
assessment tool that streamlines research and on-boarding
processes by combining extensive data with your own risk
models. Compliance Catalyst integrates comprehensive
information on companies including directors, company
structures, beneficial owners, PEPs and Sanctions intelligence and
adverse news stories, in a bespoke platform. It generates fully
audited and secure documentation detailing your analysis on a
company, its group and its environment. www.bvdinfo.com
including money laundering, financial fraud, identity theft and
terrorist financing.
For many years, client onboarding has been a predominantly
manual process that is suboptimal for both clients and banks.
Clients are bombarded with forms that must be filled in from
different departments of a bank selling different products,
but collecting similar information about the client, making the
onboarding process lengthy and cumbersome. This problem
is exacerbated for clients when they apply for products from
many banks.
Looking at the other side of the relationship, the time it takes
banks to onboard clients can extend time to revenue, reduce
customer service satisfaction and jeopardise the customer
relationship. A bank doing business with a client outside the
confines of KYC and Anti-Money Laundering regulations is
increasingly likely to be penalised and will not only pay hefty
financial fines, but also face damage to its reputation.
Increasing regulation coupled with tough competition to win
clients and sustain their loyalty is driving financial institutions
to readdress client onboarding using innovative technology
approaches designed to provide a swift, seamless and cost-
efficient process that will support not only onboarding, but
also business opportunities such as cross-selling and new
product development.
Some large banks are developing their own client onboarding
applications, a practice that is expected to dwindle in the
near future for reasons of cost, time and lack of human
resources, while other financial institutions, particularly
mid-sized organisations, are adopting or considering vendor
solutions. These tackle the data management challenges
of onboarding and solve the problems of client and bank
dissatisfaction with existing systems by automating a large
part of the process, ensuring ongoing data maintenance and
updates throughout the lifecycle of a client, and providing an
audit trail that will meet regulatory requirements.
While these outcomes are fairly consistent across vendor
iMeta’s Assassin platform delivers a complete end-to-end solution
for client and entity on-boarding and lifecycle management.
Providing a comprehensive on-boarding process that is “ready
to trade,” Assassin can manage the complex regulatory and
operational data requirements of capital market organisations.
Offering a single view of the whole client lifecycle, it supports
compliance with regulations covering AML, FATCA, MiFID, DFA
and EMIR. The full suite consists of: Assassin KYC, Assassin Credit
Legal and Assassin SSI. www.imeta.com
Downstream Applications (cont.)
Client and Entity On-boarding and Lifecycle Management
Assassin is a software platform that fully supports the end-to-end regulatory and operational processes financial institutions need for the on-boarding and ongoing management of clients and related entities.
• Provides a comprehensive on-boarding process that is "ready to trade"
• Supports data covering KYC, AML, FATCA, MiFID, DFA and EMIR
• Incorporates a market-leading platform for Account and SSI Management
• Integrates with industry data sources to reduce manual input and operational risk
• Configurable data model, workflow and business rules, which adapt to fit individual customer needs
• Flexible integration layer facilitates straight-through processing to downstream systems
Tel: +44 (0)2380 762012
Email: enquiries@imeta.co.uk
Web: www.imeta.com
solutions, there are various operating models and
technologies behind their delivery. A number of vendors
offer rules-based software solutions that typically provide a
central platform for the collection and sharing of client and
counterparty data and documentation across an organisation.
Data models, rules and workflows can be configured to meet
the needs of individual financial institutions and the central
data repository is constantly monitored and updated to cover
not only one-time client onboarding, but also the complete
lifecycle of a client. These types of solutions play well into
the demands of client onboarding, but with a centralised
entity database they can also support compliance with other
regulations that call for entity data management.
Another technology approach uses machine learning as
part of a software-as-a-service solution that automates data
collection for client onboarding and KYC. Raw data is fed
into the system, workers manage the data and algorithms
are trained to match the patterns of the workers to automate
data collection. Any exceptions are flagged to data analysts,
the algorithms learn again and processed data can be
uploaded to a customer database and updated in response
to any changes in the source data.
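The human-in-the-loop workflow described above can be sketched as a simple confidence-threshold router. This is an illustrative pattern only, assuming an arbitrary 0.9 threshold; the classifier, record shape and queue handling in any real SaaS offering will differ.

```python
def process_records(records, classify, review_queue, training_data,
                    threshold=0.9):
    """Route each record: automate high-confidence classifications,
    flag the rest as exceptions for human analysts.

    `classify` is assumed to return a (label, confidence) pair;
    analyst corrections would later be appended to `training_data`
    and used to retrain the model.
    """
    accepted = []
    for record in records:
        label, confidence = classify(record)
        if confidence >= threshold:
            accepted.append((record, label))   # automated
        else:
            review_queue.append(record)        # exception for an analyst
    return accepted
```

As analysts resolve exceptions and their corrections flow back into the training data, the share of records cleared automatically rises, which is the economic point of the approach.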
A number of solutions based on the utility model are also
emerging to meet the needs of customer onboarding.
Some of these are offered by data and data management
vendors, while others are the result of industry collaboration.
Utilities are based on the one-to-many model of handling
data once and distributing it many times. In the case of
client onboarding, clients upload data and documentation
to a utility once, the utility validates, stores and continually
updates the data, and it is used by multiple financial
institutions that participate in the utility for client onboarding.
The aim of the utility is to offer financial institutions
economies of scale, including reduced cost and improved
time to revenue, but it should be noted that regulatory
responsibility for the client data remains with the institution
and cannot be delegated to the utility.
Downstream Applications (cont.)
Corporate Actions
Problem: Actions (M&A, buy-backs, splits, etc.) emerge from a
wide variety of formats and sources and are hidden by different
names and identifiers, complicating data collection and attribution.
Solution: WorkFusion monitors, aggregates and ingests actions
data from any source, lets subject matter experts instrument the
optimal process, and automates the management of data analysts
and machines to collect, validate, and structure corporate actions
data. www.workfusion.com
A Better Way to Manage KYC, Regulatory Entity Data
Accelerated Client Onboarding | Full Regulatory Compliance | Greater Upsell/Cross-Sell | Improved Operational Efficiencies | Reduced Costs | Enhanced Client Experience
With Fenergo’s Client Counterparty Data Management solution,
up to 80% of all centralized entity data and documentation
can be re-used to support multiple regulatory obligations
– across FATCA, CRS and global OTC derivative rules
(like Dodd-Frank, EMIR, MiFID II, Canadian and APAC derivatives).
Find out more!
Download our new paper on Managing the Regulatory Delta and find out how your
institution can centralize and capitalize on your client and counterparty data.
Visit www.fenergo.com
With Fenergo, your Client and Counterparty Data is in safe hands!
Fenergo Entity Data Platform
New Approaches to Client Onboarding
• Rules and workflow-based software solutions for client lifecycle management
• Software-as-a-service machine learning for automated data collection
• Data utilities that handle data once and distribute it many times
Client screening
Client screening requires counterparty data to be checked
against financial sanctions, trade embargoes, politically
exposed persons (PEP) and other watch lists to detect
whether an order has been made to prohibit companies
carrying out transactions with an organisation or person.
The data included on sanctions and other watch lists is,
essentially, entity data. In the case of individuals, it includes
information such as name, title, date of birth, place of birth,
aliases, nationality, passport number, address and any other
information relevant to the identification of the individual.
Entity information follows a similar pattern, including name,
aliases, acronyms, address and other relevant information.
Like client onboarding, screening counterparties against
sanctions and other watch lists requires not only initial
checks ahead of doing business, but also ongoing checks
to ensure compliance with any changes to the lists. The
data management challenge is to continually monitor
counterparties for any change – perhaps a change in
sanction or PEP listing status, or a change of domicile
– and manage data quickly to ensure the right business
decisions are made to avoid any sanctions breaches.
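The screening check described above can be sketched as a fuzzy name comparison against list entries and their aliases, with a matching date of birth strengthening the score. This is not a compliance-grade implementation: the watch-list shape, field weighting and alert threshold are invented for the example.

```python
from difflib import SequenceMatcher

def screen(counterparty, watch_list, threshold=0.85):
    """Return (entry, score) pairs for watch-list entries whose name or
    alias is close to the counterparty's name. Both arguments are plain
    dicts with assumed keys: 'name', optional 'aliases' and 'dob'."""
    alerts = []
    for entry in watch_list:
        names = [entry["name"]] + entry.get("aliases", [])
        score = max(
            SequenceMatcher(None, counterparty["name"].lower(), n.lower()).ratio()
            for n in names
        )
        # A matching date of birth strengthens a borderline name match.
        if counterparty.get("dob") and counterparty.get("dob") == entry.get("dob"):
            score = min(1.0, score + 0.1)
        if score >= threshold:
            alerts.append((entry, round(score, 2)))
    return alerts
```

In practice the scoring would cover addresses, nationality and identifiers as well, and every alert would go to a human reviewer; the sketch only shows why list data must itself be clean, structured entity data for matching to work.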
Software applications for screening can be implemented
by financial institutions in house, but the trend is towards
managed services that maintain updated sanctions, PEP
and other watch lists and configure entity matching
options to meet institutions’ specific requirements.
Risk management
The need to aggregate exposure and concentration risk by
counterparty, country and asset class is a common theme
for both regulators and internal risk officers. It requires
extensive data, including legal entity data, as well as
sophisticated calculation, data management and reporting
tools. The entity data must be complete, accurate and
extensive to cover large institutions’ risk management
Downstream Applications (cont.)
requirements. It must also be mapped to a single entity
identifier and include hierarchy data to provide a clear and
comprehensive view of risk across an organisation, and a
true understanding of exposure.
The industry standard and free-to-use Legal Entity
Identifier (LEI), an element of the Global LEI System
that was introduced after the financial crisis exposed
the industry’s inability to identify parties to financial
transactions and understand their exposure, is designed
to help regulators measure and monitor systemic risk. It
is also being used by market participants as a common
entity identifier to better understand entity relationships
and improve risk management internally. While most
financial institutions maintain some entity data in house,
they also use data vendor feeds to drive risk management
applications.
Regulatory compliance
Regulations that already mandate the use of entity data
for compliance include Dodd-Frank and European Market
Infrastructure Regulation (EMIR), which require the LEI
to be used for transaction reporting, and the Foreign
Account Tax Compliance Act (FATCA), which requires
Global Intermediary Identification Numbers to be used to
identify financial institutions within its scope.
Entity data is also key to compliance with Know Your
Customer (KYC) and Anti-Money Laundering (AML)
regulations, and is mandated for use in forthcoming
regulations including Basel III, Markets in Financial
Instruments Directive II (MiFID II), Solvency II and the
Alternative Investment Fund Managers Directive (AIFMD).
The entity data required to comply with these regulations
can be supplied to financial institutions by data vendors
that offer entity data feeds and datasets designed to
meet the compliance requirements of specific regulations.
Compliance can also be achieved using data management
solutions that centralise entity data for both client
onboarding and regulatory compliance.
Downstream Applications (cont.)
Bureau van Dijk offers detailed, reliable information to help you
conduct customer due diligence, minimise reputation damage
and risk exposure. Use our extensive company data via our global
database Orbis, or our bespoke platform Compliance Catalyst,
a solution that streamlines your AML research and on-boarding
processes, combining extensive data with your own risk models.
We can help you screen your suppliers or customers against
sanction lists quickly and easily - even showing you indirect links
from companies associated with you to sanctioned entities. www.bvdinfo.com
Sanctions List Checks
Financial institutions must check counterparty data against financial sanctions, politically exposed persons and other watch lists that include entity data such as:
• Name
• Title
• Date of birth
• Place of birth
• Aliases
• Acronyms
• Address
• Nationality
• Passport number