Data Management Review

Data Management Review (formerly Reference Data Review) is your single destination for knowledge and resources covering data management approaches, trends and challenges as well as all the regulations impacting financial data management for the enterprise.

The January 1, 2022 implementation deadline of the Fundamental Review of the Trading Book (FRTB) regulation may be three years out, but the data management challenges of compliance need to be considered now. Most importantly, a BCBS consultation paper on revisions to the minimum capital requirements for market risk needs urgent industry input on what could become impossible requirements around non-modellable risk factors (NMRFs) under FRTB. The consultation paper, Revisions to the Minimum Capital Requirements for Market Risk, has a closing comment deadline of June 20, 2018.

The challenges of NMRFs include the requirement to provide frequent, real price observations of the risk factors if they are to be used in an internal model approach (IMA) to calculating minimum capital requirements for market risk in line with FRTB. Without this evidence of market liquidity, products can’t be traded; with the evidence, capital requirements may be sky-high, leading banks to withdraw unprofitable products.

The Data Management Super Bowl

Tim Lind, managing director of data services at DTCC, describes FRTB as ‘The Data Management Super Bowl’ in terms of trade data aggregation required for the risk factor eligibility test (RFET) of NMRFs, essentially the provision of at least 24 real price observations of the value of the risk factor over the previous 12 months, with no more than a one-month gap between any two observations. He adds: “Trade reporting per se proves nothing. This is all about the liquidity of underlying instruments as a proxy for risk.”
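To make the RFET mechanics concrete, here is a minimal sketch of the eligibility screen as described above. The function name, the data layout and the 31-day reading of ‘one month’ are illustrative assumptions, not BCBS definitions.

```python
from datetime import date, timedelta

def passes_rfet(observation_dates, as_of):
    """Hypothetical RFET screen: at least 24 real price observations in the
    trailing 12 months, with no gap of more than one month between any two
    consecutive observations (one month approximated here as 31 days)."""
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)
    if len(obs) < 24:
        return False
    return all((b - a).days <= 31 for a, b in zip(obs, obs[1:]))

# Example: 30 daily marks pass; the same marks thinned to quarterly would fail
daily = [date(2018, 1, 1) + timedelta(days=i) for i in range(30)]
print(passes_rfet(daily, date(2018, 2, 1)))  # True
```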

Considering, among other issues, approaches to gathering required data, the consultation paper states: “The Committee is also aware of nascent efforts to establish data pooling schemes that could improve the availability of real price observations for the RFET, but that may face confidentiality driven challenges that prohibit the sharing of actual prices to subscribers of such a service.”

Potential data pooling solutions

While many banks are working to understand whether they can meet NMRF data requirements internally, the likelihood is that they will not be able to demonstrate 24 real price observations in the required timeframe, across all relevant asset classes.

One potential solution is to pool data for wider industry use, a service that DTCC is developing for OTC credit derivatives. The organisation’s FRTB Real Price Observations Data Service uses DTCC’s global data infrastructure to pool observable transaction data, helping banks meet FRTB requirements and reduce risk capital charges.

Another potential solution is a utility-style response to the data management problems of NMRFs, which could be offered to a consortium of banks and is based on the Secure Data Pooling Service (SDPS) developed by Iason Consulting, a risk consultancy, in partnership with Financial Machineries. The SDPS provides cryptographic, blockchain-like technology that allows pooling of transacted data. A key feature of the service is that the administrator will neither know nor own the source data or the resultant data. Ownership of this data will be retained by participating banks, which can choose to share or commercialise the data in an open source format.

Other features include high levels of security: no bank contributing data has access to any other bank’s data, nor does the administrator see the source data. The only information made available to all parties is the final time series of volume-weighted average prices, obtained via calculations on the contributed data. The Secure Data Pooling algorithm is public to contributing members, with open source code and readily available information. The service will be hosted in the Amazon Web Services cloud.
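As a rough illustration of that pooling output, the sketch below aggregates contributed trades into a per-date volume-weighted average price, exposing only the aggregate to participants. The names and data shapes are assumptions; the actual SDPS computes over encrypted contributions rather than plaintext.

```python
from collections import defaultdict

def pooled_vwap(contributions):
    """Aggregate (date, price, volume) contributions from many banks into a
    per-date volume-weighted average price. Only the aggregate is returned,
    so no single bank's trades are exposed to the others."""
    notional, volume = defaultdict(float), defaultdict(float)
    for obs_date, price, vol in contributions:
        notional[obs_date] += price * vol
        volume[obs_date] += vol
    return {d: notional[d] / volume[d] for d in sorted(notional)}

trades = [("2018-05-01", 101.2, 5e6), ("2018-05-01", 101.4, 3e6),
          ("2018-05-02", 101.1, 4e6)]
print(pooled_vwap(trades))  # {'2018-05-01': 101.275, '2018-05-02': 101.1}
```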

Antonio Castagna, founder of Iason Consulting, says: “The key differentiator we offer is that this managed utility will not own the data. We will calculate and administer observable prices anonymously, but the resultant data will be owned by contributing banks, which can commercialise it if they wish.”

Whatever the level of difficulty in sourcing satisfactory data, the consultation paper does not back away from the concept of the RFET of NMRFs, and states: “The Committee proposes clarifications to the RFET and a number of principles to inform assessments of the quality of data that banks use to calibrate their internal models.”

Design flaws

Market participants have expressed concern that the approach defined for NMRFs may be subject to design flaws that result in disproportionately high capital requirements for some risk factors relative to the risk they pose to a bank, for example where an arguably liquid risk factor fails the RFET or where certain types of NMRFs receive an overly conservative treatment. On this, the Committee states it ‘has not received compelling evidence for these issues and seeks further feedback in response to the consultative document that could support a final decision on them. In the absence of compelling evidence, the Committee does not propose revisions to these aspects of the treatment of NMRFs’.

Damaging impact on liquidity

While banks need to consider whether their individual trading desks can pass FRTB’s rigorous approval process to use an IMA or whether they will have to fall back on the standardised approach described by the regulation, the problems of NMRFs also require them to carry out internal stress tests to see if a desk is viable from a profitability perspective. NMRFs may be a matter of FRTB compliance, but a lack of price observations, the capital costs the risk factors can generate and resulting decisions to withdraw products from the market could have a damaging impact on liquidity.

FRTB is a high-stakes game, so make sure you add your comments on the consultation paper before the deadline.

Author: ateamgroup
Posted: May 22, 2018, 9:17 am

Fenergo has partnered with technology consulting and managed services provider Delta Capita to deliver a ‘Powered by Fenergo’ client remediation offering designed to address market demand and act as a catalyst for the adoption of Fenergo’s client lifecycle management solution, either onsite or as a managed service delivered by Delta Capita. The strategic partnership is the first in Fenergo’s Partner Platform Programme.

The companies’ service is aimed at helping Tier 2 and Tier 3 banks, as well as non-bank financial services organisations, that are seeking to simplify and reduce the total cost of ownership associated with operating internal business platforms. It will also provide client remediation projects on flexible terms, with a view to helping clients that are struggling with remediation due to a lack of proper tooling, controls and reporting, a gap that results in escalating staff costs.

Marc Murphy, CEO at Fenergo, says: “This is a significant development for Fenergo as it allows us to service new segments of the market. Firms in these segments will not only benefit from Fenergo’s industry standard solution, but will also have access to our client community, a global regulatory collective of over 20,000 risk and compliance experts.”

Joe Channer, CEO at Delta Capita, adds: “This partnership will bring a disruptive service offering to market, helping clients meet their client lifecycle management challenges. The proprietary approach to managing the client lifecycle is unsustainable for many medium- and smaller-sized firms. Clients want to reduce the cost and complexity of ownership and are seeking utility style business models offering industry standardisation based on best practice and mutualised cost benefits. This partnership is a response to that demand and offers clients a real alternative to in-house managed solutions.”

The managed service will be a global solution, serving clients locally through Delta Capita hubs based in the UK, Benelux, Frankfurt, Johannesburg, New York, Singapore and Hong Kong.

Author: ateamgroup
Posted: May 21, 2018, 10:01 am

If your organisation is still struggling with the data management aspects of compliance with General Data Protection Regulation (GDPR), which comes into force next week, a report from the EDM Council recommending the use of its Data Management Capability Assessment Model (DCAM) may be helpful.

The report, General Data Protection Regulation (GDPR): The Role of Data Management, was developed with 40 practitioners from 24 member firms, including 14 global systemically important banks (G-SIBs) and regional banks.

The report points out that, because the privacy function is executed across the enterprise wherever personal data is managed, compliance with GDPR is only achievable where a comprehensive data management framework is in place. The group concluded that using DCAM provides the structure and critical capabilities for supporting the data and data management requirements of GDPR compliance.

The EDM Council report, along with two supporting documents, GDPR Requirements Analysis Quick Reference Guide and GDPR Work Group Analysis Worksheet, is available online at www.edmcouncil.org.

Author: ateamgroup
Posted: May 17, 2018, 10:28 am

Data lineage is key to regulatory compliance and financial institutions’ ability to understand and use their data to business advantage. It is also important from an operational perspective, as a successful implementation can identify systems and data feeds that are no longer necessary and can be switched off, saving money and resources. The webinar will consider the drivers of data lineage, best practice implementation and beneficial outcomes.

Webinar Date: 
Tuesday, June 26, 2018 - 15:00
Author: ateamgroup
Posted: May 17, 2018, 10:18 am

The Global Legal Entity Identifier Foundation (GLEIF) is planning a research project to discuss the use of distributed ledger technology (DLT) for LEI issuance and has this week introduced a free-of-charge certification process – the GLEIF Certification of LEI Mapping Service – that is designed to ensure organisations that map the LEI to their own identifiers use reasonable processes to do so accurately. Also on the agenda are LEI Golden Copy Files and API access to the GLEIF LEI database, both due to come on stream later this year.

The foundation is also braced for another run on the LEI as the six-month period of reduced requirements for the identifier to meet Markets in Financial Instruments Directive II (MiFID II) compliance ends in early June.

To find out more about GLEIF’s plans and expectations for LEI uptake in coming months, we caught up with CEO Stephan Wolf.

Looking first at the potential of DLT, Wolf says the GLEIF’s initial investigation into the technology will start in the second half of this year with an open research project to discuss how it could be used to support the LEI. Results of the research are expected to be published in 2019 and are likely to focus on the use of the distribution and encryption capabilities of DLT in LEI issuance.

LEI mapping service

The foundation’s most recent addition, the GLEIF Certification of LEI Mapping Service, certifies and publishes publicly available, open source relationship files that match identifiers against corresponding LEIs, easing the process of gathering, aggregating and reconciling counterparty information. This can be useful for purposes including compliance, regulatory reporting, client relationship management and due diligence.

Essentially, the mapping service evaluates the mapping processes and algorithms applied by organisations seeking to link other identification schemes to the LEI. When seeking certification, the mapping partner documents the nature and extent of its mapping processes; the GLEIF then reviews the processes, validates sample data from the mapping partner and determines whether the sample data meets or exceeds GLEIF’s pre-determined accuracy and quality criteria. Where it does, certification is confirmed, although the partner must maintain quality standards.
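The validation step lends itself to a simple sketch: sampled mappings are compared against an authoritative source and an accuracy rate is computed. The 95% threshold, function names and placeholder identifiers below are invented for illustration; GLEIF’s actual criteria are not published in this article.

```python
def mapping_accuracy(sample_pairs, authoritative):
    """Share of sampled (identifier, LEI) mappings that agree with an
    authoritative source; assumes a non-empty sample."""
    correct = sum(1 for other_id, lei in sample_pairs
                  if authoritative.get(other_id) == lei)
    return correct / len(sample_pairs)

sample = [("ISIN:US0378331005", "EXAMPLELEI0000000001")]  # placeholder LEI
truth = {"ISIN:US0378331005": "EXAMPLELEI0000000001"}
rate = mapping_accuracy(sample, truth)
print(f"accuracy {rate:.0%}, certified: {rate >= 0.95}")  # threshold assumed
```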

Golden Copy Files

The GLEIF’s Golden Copy Files were first published for public review in February 2018 and respond to market requirements for more frequent publication of LEI data to support different time zones. The files are generated from GLEIF LEI concatenated files, updated and distributed three times a day at eight-hour intervals, and accompanied by four delta files. The delta files are designed to identify only newly issued LEIs and revisions to a LEI’s reference data historically reported in a Golden Copy File published either eight hours, 24 hours, seven days, or 31 days earlier.
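In spirit, a delta file is just a comparison of two snapshots: LEIs that appear for the first time, plus LEIs whose reference data changed. A minimal sketch, assuming each snapshot is a dict keyed by LEI (the record layout and LEI values are invented for the example):

```python
def lei_delta(previous, current):
    """Return (new, revised): LEIs absent from the earlier snapshot, and
    LEIs whose reference data changed between the two snapshots."""
    new = {lei: rec for lei, rec in current.items() if lei not in previous}
    revised = {lei: rec for lei, rec in current.items()
               if lei in previous and rec != previous[lei]}
    return new, revised

older = {"EXAMPLELEI0000000001": {"name": "Example Corp", "city": "Dublin"}}
newer = {"EXAMPLELEI0000000001": {"name": "Example Corp", "city": "London"},
         "EXAMPLELEI0000000002": {"name": "New Entity", "city": "Paris"}}
print(lei_delta(older, newer))
```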

Wolf comments: “Golden Copy Files reduce complexity by providing one complete record for each LEI, while the delta files mean LEI users don’t have to parse large amounts of LEI data every day. By simplifying processes, we can drive out hindrances to using the LEI.”

The Golden Copy Files also add geocoded versions of addresses found in LEI data to each LEI record, providing normalised address data in a consistent format across all records.

The Golden Copy Files are in beta test with public release expected in the fourth quarter of 2018, although based on feedback from stakeholders, any changes will be minor. Wolf expects these files and delta files to be the main way of communicating LEI data going forward.

Also in beta test and due for release this summer is an application programming interface (API) that allows electronic access to the LEI database and Golden Copy Files to ease use of the identifier in downstream systems.
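The article gives no detail of the API itself, but a look-up might plausibly take the following shape, based on GLEIF’s published REST conventions; the endpoint path and response layout here are assumptions rather than a confirmed specification.

```python
import requests

BASE = "https://api.gleif.org/api/v1/lei-records"  # assumed endpoint

def lookup_legal_name(lei):
    """Fetch a single LEI record and return the entity's legal name."""
    resp = requests.get(f"{BASE}/{lei}", timeout=10)
    resp.raise_for_status()
    attributes = resp.json()["data"]["attributes"]
    return attributes["entity"]["legalName"]["name"]
```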

LEI issuance

Looking at LEI issuance, Wolf notes a surge of activity in the months before MiFID II and Markets in Financial Instruments Regulation (MiFIR), which also requires LEIs, went live on 3 January 2018; continued but slower growth after ESMA announced a six-month temporary easing that removed the threat of ‘No LEI, No Trade’; and ongoing uptake ahead of June 2018, when LEIs will be mandated for use in MiFID II and MiFIR.

With close to 1.2 million LEIs now issued to legal entities, Wolf says: “In the run up to MiFID II, daily issuance of LEIs was up to 10,000. Although that has slowed down considerably, we have seen continued growth in total LEIs issued through the ESMA delay and issuance is four times what it was at this time a year ago. One thing no one knew with any certainty before the advent of the LEI was how many legal entities there are in Europe; the LEI is bringing transparency to that.”

Author: ateamgroup
Posted: May 16, 2018, 11:51 am

Thomson Reuters has upped the fight against financial crime with the addition of Media Check to its World-Check One platform. Media Check provides media screening and processing powered by machine learning, delivering increased efficiency and accuracy by filtering unstructured content from over 11,000 global print and web sources.

Benefits of World-Check One’s Media Check include enhanced compliance workflow and assurance that only relevant content is presented to compliance professionals. This is achieved through intelligent searching, a unique Anti-Money Laundering taxonomy informed by 15 years of World-Check experience, and machine learning algorithms developed by the World-Check research team. The result is a reduction in false positives and improved content navigation for better and more informed decision making. Media Check also provides continuous, up-to-date media and data monitoring.

Phil Cotter, managing director, Risk Segment, Thomson Reuters, says: “Adding a machine learning dimension to our World-Check One platform gives clients an exceptional means to help pinpoint the most relevant media information, thereby maximising the efficiency of their due-diligence processes.”

World-Check One was introduced in 2014 as a screening solution and has since been enhanced, most recently in March 2018 with the addition of a Customer Risk Screener app that integrates World-Check data into the Salesforce AppExchange. Also released this year is a Watchlist Screening functionality that enables users to upload and manage internal and public third-party lists for screening alongside the World-Check data in one simple workflow.

Author: ateamgroup
Posted: May 16, 2018, 10:39 am

RSU Rating Service, a Munich-based provider of wholesale credit rating systems and related IT services for the Landesbanken in Germany, has selected GoldenSource’s Market Data Solution to support the ingestion of market data from multiple data suppliers and the provision of individual data feeds to many banks.

GoldenSource was selected above competitors on the basis of the total security of its system in terms of multi-tenant data separation and integrity, ensuring that RSU’s clients are only able to retrieve and view data meant for them.

Dana Wengrzik, managing director at RSU, says: “Alongside GoldenSource’s obvious technical capabilities, we were particularly impressed by its willingness to work alongside us as a long-term partner for our Central Market Data Service. The requirements of our business services are complex and we need a provider that can help us with operational efficiency while providing expertise on areas such as regulation and best practice.”

Charlie Browne, head of market data and risk solutions at GoldenSource, adds: “With European regulations becoming more data driven, RSU needs a solution that supports its business and compliance needs. We have the expertise to provide RSU with the best data foundation for managing and cleansing the growing volumes of market data required by the industry in an increasingly onerous regulatory environment.”

Author: ateamgroup
Posted: May 15, 2018, 5:27 pm

The SmartStream Reference Data Utility (RDU) is building momentum in capital markets, helping users drive down costs and preparing to add equity and fixed income reference data services to the listed derivatives solution it already provides.

To find out more about the utility’s plans, solutions and users, we caught up with CEO Peter Moss, who joined the company in August 2016, about a year after it was established by SmartStream and leading banks Goldman Sachs, JPMorgan Chase & Co and Morgan Stanley. His mission was to give the business direction, build repeatable products and create a robust and resilient organisation.

Moss explains: “The risk of a start-up is that it delivers consultancy-type projects rather than repeatable solutions; a utility has to provide repeatable products. The company’s first year focussed on satisfying the shareholder banks, the second on making everyone aware of the utility and building out our product line, and this year, the third year, is about actively selling the managed reference data service we created for the founding banks and our MiFID II products.”

The banks work closely with the RDU, providing expertise and development help, and were all live on the utility by the end of 2017. In line with initial plans, and solving one of the banks’ biggest problems, the utility provides listed derivatives reference data, a dataset that Moss describes as ‘second to none’. Moving forward, the company is building an equities reference data solution that is due to come to market in the third quarter of this year and a fixed income solution that is expected to be available in 2019.

Moss says: “The goal is to provide a full security master, so we need to add equities and fixed income as part of that. Equities is first as they are more closely aligned with listed derivatives.” While the RDU had to do most of the leg work to build a listed derivatives dataset as data vendors don’t make adequate provision in this space, the utility is able to lean harder on vendors to develop equity and fixed income datasets.

The extension of asset classes covered by the SmartStream RDU should attract additional users, although numbers are already rising in great part as a result of the company’s Markets in Financial Instruments Directive II (MiFID II) service, which was released on the weekend of New Year’s Eve after a working Christmas in the office. Moss says: “MiFID II is very dependent on instrument reference data, which is exactly what we provide. We make the data as available and easy to use as possible.”

Customers signed up as the service was released and more are following, many of which got to the MiFID II starting line, but are not necessarily comfortable with what they built. Moss comments: “We are having conversations and expect more contracts this year as banks begin to automate what they built for MiFID II. These banks tend to be Tier 2 and 3 and may be systematic internalisers at some point, but they didn’t throw money at MiFID II in the same way as Tier 1 banks.”

The company has since released a second MiFID II solution, a centralised Systematic Internaliser (SI) Registry developed in collaboration with a group of Approved Publication Arrangements (APAs) responsible for trade data collection and reporting under MiFID II. The registry is operated by SmartStream RDU and allows SIs to register, through their APA, the financial instruments for which they are providing SI services. The RDU then makes the data available to the market. This is important as industry participants must identify whether trading counterparties are SIs for the financial instruments they are trading so that they can determine which counterparty must report the trade. To date, more than 50 SIs are on the SI Registry and more are joining every day.
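To see why that matters operationally, here is a toy sketch of the which-counterparty-reports decision. The rule shown (an SI in the instrument reports, otherwise the seller does) is a simplification of the MiFID II logic, and all firm names and identifiers are invented.

```python
def reporting_party(buyer, seller, isin, si_registry):
    """Simplified MiFID II post-trade publication rule: an SI in the
    instrument reports; between two non-SIs the seller reports."""
    if (seller, isin) in si_registry:
        return seller
    if (buyer, isin) in si_registry:
        return buyer
    return seller

# (firm, ISIN) pairs registered via an APA; all names invented
registry = {("BANK_A", "DE0001102408")}
print(reporting_party("BANK_B", "BANK_A", "DE0001102408", registry))  # BANK_A
```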

These MiFID II solutions, as well as the RDU’s overarching reference data services, are driving momentum at the company, which has close to 20 customers and is working towards break-even. The addition of equity and fixed income data will open up a larger customer set including not only sell-side firms that are already serviced, but also buy-side firms.

The USP of the RDU is its ability to help customers reduce the costs of managing data. It also fills a resource gap at firms struggling to hire and retain staff to source, cleanse, normalise and automate reference data. And there is more. Moss says: “The benefits add up. We can provide better data quality than customers previously had to drive their automated processes and dramatically reduce exception management. Fewer exceptions are a key measure of success, with a recently onboarded client reducing its trade processing exceptions by 75%.”

The company’s credibility is based, in great part, on the fact that it is an industry initiative. In Moss, it has a highly experienced and respected CEO, who came out of retirement in 2015 after a 24-year career at Thomson Reuters that culminated in the role of managing director of a $6 billion global financial services business. On his leadership of the SmartStream RDU, Moss concludes: “I will stay at least until the company breaks even and is on a path to good profitability.”

Author: ateamgroup
Posted: May 14, 2018, 11:42 am

Nathan Wolaver has returned to Asset Control as managing director of the company’s Americas business after a three-year stint as global head of data management solutions at Broadridge. Based in Asset Control’s New York office, Wolaver is responsible for all aspects of the company’s business operations across the Americas, including sales, customer support and professional services. He reports to CEO Mark Hepsworth.

Wolaver certainly has plenty of experience of Asset Control, having first joined the company in 2005 as a senior sales executive. He moved on to Aleri in early 2008 as regional sales director, northeast, and re-joined Asset Control later in the year as vice president and sales director for the Americas, a role he held for six years before becoming managing director for nine months and then moving to Broadridge.

Wolaver re-joins Asset Control again at a time of accelerating growth in the Americas and as the company plans new product releases. Commenting on his appointment, Hepsworth says: “We are very pleased to welcome Nathan back to lead our American operations. His depth of knowledge of the financial data management sector and his previous experience at Asset Control are a real strength and a strong addition to our leadership team. The Americas is strategically very important for Asset Control. We have some of our largest clients globally here and we see excellent opportunities as we roll out our new range of products.”

Author: ateamgroup
Posted: May 10, 2018, 11:45 am

Governor Software has released an update to its institutional compliance oversight solution through a licensed link to the Financial Conduct Authority (FCA) Handbook, allowing clients to automatically integrate and update their internal policies and controls in line with FCA guidance. The move, believed to be the first of its kind, provides compliance professionals with a new visualisation tool to help them stay afloat amid regulatory data overload.

“Keeping abreast of the FCA Handbook is no mean feat for financial institutions, with requirements differing between organisations and updates frequent. While other companies publish this handbook within their software we believe we are the first to take the structure and enable clients to link their policies directly to it,” said CEO Richard Pike. “This not only saves considerable man-hours, automatically alerting appropriate team members when their policies need updating or reviewing, but also provides comprehensive oversight and compliance through auditable mapping.”

The FCA integration is the latest step forward from Governor, a Dublin-based firm partially owned by the Irish Government, which launched back in 2015 with the goal of turning oversight on its head to create a top-down solution based on dynamic visual status management.

With the FCA expected to extend its Senior Managers and Certification Regime (SMCR) and Conduct Regime (CR) to the vast majority of regulated financial firms by 2019, the question of compliance oversight is becoming an increasingly pressing priority for senior management, and an area that is currently both overlooked and underserved.

“Board members of financial institutions are under an increasing amount of scrutiny, and with well over 200 offences in the UK for which directors can be held personally liable, the cost of non-compliance can be severe,” explained Pike. “Yet the sheer volume of information to be processed makes the oversight of these requirements a complex and time-consuming business.”

The challenge is not one of regulatory compliance – banks and financial institutions will already have a wide range of robust and rigorous regulatory compliance procedures. The question is how to monitor, filter, track, and translate these systems to ensure effective and comprehensive oversight.

“To achieve excellence in governance you have to be able to understand your obligations to all stakeholders, measure your status against those obligations and prove that status to a third party,” noted Pike. “Just ringing up the head of AML every few months and asking if everything is OK just doesn’t cut it anymore. Our system takes the information from your core compliance and risk systems, and presents it at the higher levels in an auditable format.”

This is done through a unique process of visual normalisation, based on the individual risk appetite of the client. With data stored in multiple forms across multiple categories, a bottom-up approach struggles to compare apples with pears (or capital calculations with open audit test problems). By contrast, the Governor Software solution simply captures whether a client is “in” or “out” of appetite for each category of risk, using a traffic light system to deliver a set of binary results that removes the need to compare across metrics.

Displayed in the form of a dynamic graph rather than a static list, it operates like a kind of living mind-map, reflecting the performance of compliance functions in real time to provide an overall picture of oversight across the organisation.
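A minimal sketch of that binary capture, with invented categories and thresholds: each metric is reduced to in or out of appetite, so a capital ratio and an audit-issue count end up on the same scale.

```python
def appetite_status(metrics, appetite):
    """Reduce each metric to a binary in/out-of-appetite traffic light so
    unlike metrics never have to be compared directly."""
    report = {}
    for category, value in metrics.items():
        low, high = appetite[category]
        report[category] = "GREEN" if low <= value <= high else "RED"
    return report

metrics = {"capital_ratio": 0.14, "open_audit_issues": 7}
appetite = {"capital_ratio": (0.12, 1.0), "open_audit_issues": (0, 5)}
print(appetite_status(metrics, appetite))
# {'capital_ratio': 'GREEN', 'open_audit_issues': 'RED'}
```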

The recent FCA integration takes this refreshingly logical approach to the next level, allowing financial institutions to tailor their compliance to the demands of the financial watchdog. “When a regulator walks through your door they think and assess financial institutions in terms of the FCA Handbook,” noted Pike.

“This update basically allows our clients to present their data in the way the regulator thinks.”

Author: ateamgroup
Posted: May 10, 2018, 11:33 am

With just weeks to go before the compliance deadline of General Data Protection Regulation (GDPR) on 25 May 2018, will your organisation make it over the line, what are the toughest challenges of getting GDPR right, and what should you do if you are worried about being compliant in time?

To answer these and other questions, A-Team Group hosted a webinar dedicated to the countdown to GDPR. The webinar was moderated by A-Team editor Sarah Underwood and joined by Garry Manser, head of data governance at Visa; Sue Baldwin, a vendor management expert and independent consultant; and Tudor Borlea, product specialist at Collibra.

Webinar Recording: Countdown to GDPR

On the upside, an early poll of the webinar audience showed 6% of respondents completely ready to meet the GDPR compliance deadline, 52% expecting to be ready in time and 42% not expecting to be ready but able to show intent if challenged by regulators. Referencing the poll, Borlea said financial institutions are used to regulation, so are managing GDPR well and should be in a defensible position by 25 May. Baldwin noted that unfinished final guidelines from the Information Commissioner’s Office (ICO) and the EU are detracting from the process of getting completely compliant. We’ll reflect on how the GDPR deadline day goes in our next webinar covering the regulation on 29 May.

At this stage, outstanding data management challenges of the regulation identified by a second audience poll are, in descending order, managing data across silos, ensuring data is accurate and updated, setting alerts for data privacy breaches, identifying personal data, and providing data access for individuals.

The speakers agreed that legacy systems and data silos are significant challenges to compliance with this and other regulations, and noted the need for data remediation and culture change to help get GDPR right across the organisation.

Featured Download: Poll results on Countdown to GDPR from our recent webinar audience

Discussing what financial firms that won’t be ready for GDPR should be prioritising now, Manser said the key thing to do is cover consent by using GDPR-compliant privacy notices on your websites that describe how personal data is used and stored, and offer an option for data subjects to opt out of personal data being used by your organisation.

The panel warned against completion celebrations on 25 May, noting that this is only the start of a long journey to maintain GDPR compliance and transform early tactical approaches into strategic solutions. It also warned that there will be large numbers of ambulance chasers asking for access to their data just because they can.

If you’re tight for time, worried about compliance and concerned about the huge fines for non-compliance, make sure you can show reasonable levels of intent to be compliant on the big day, and look forward to the benefits of a robust GDPR solution, including improved reputation and rising profitability.

Listen to the webinar to find out about:

  • Approaches to compliance
  • Outstanding challenges
  • Last minute priorities
  • Data conflicts with MiFID II
  • Benefits of compliance
Author: ateamgroup
Posted: May 9, 2018, 12:48 pm

Private equity firm Cathay Capital and French public investment bank Bpifrance have signed up to acquire a majority stake in NeoXam. The data management and transaction solutions company has been up for sale for months and its acquisition leaves only Asset Control, which has been up for sale even longer, on the enterprise data management (EDM) shelf.

Cathay Capital and Bpifrance are buying a majority stake in NeoXam from BlackFin Capital Partners, which funded the company, along with private investors led by CEO Serge Delpla, when it was carved out in February 2014.

Over the past couple of years, and in the wake of a spending spree that included the acquisition of IBOR and EDM provider SmartCo (perhaps the company’s best buy, as it brought in a consistent data model and integration capability), NeoXam has increased revenue by 25% to €62.5 million, hired over 150 employees and signed more than 32 deals, including recent wins at United Overseas Bank and Quilvest Asset Management.

While BlackFin is known to have been looking to sell its stake in NeoXam, it has been working closely with the company to identify suitable suitors and supported the selection of Cathay Capital, which has resources in Europe, China and the US and will extend NeoXam’s market reach in the US and Asia-Pacific, and of Bpifrance for its institutional footprint.

Serge Delpla, founder and CEO at NeoXam, says: “Since 2014, we have been creating with BlackFin a new leader, aggregating together established software and seasoned teams with fast-growing French gems. This partnership has been successful. Now, Cathay Capital’s proven track record and Bpifrance’s institutional strength will be pivotal in providing us with the resources needed for NeoXam’s expansion plans.”

Laurent Bouyoux, chairman at BlackFin Capital Partners, adds: “We are proud of the robust and profitable model achieved by NeoXam in less than four years after its carve-out led by BlackFin and thanks to successful organic and external developments. We have been very supportive of Serge Delpla and the management in their ambitious geographical expansion strategy, which allowed Neoxam, initially France and Europe focused, to become a global player.”

The acquisition is expected to close at the end of this month.

Author: ateamgroup
Posted: May 4, 2018, 1:52 pm

The winners of A-Team Group’s RegTech Awards 2018 have been revealed, with winning solution and service providers named across categories from best data management solution for regulatory compliance to best reference data for regulatory compliance, best trade repository for regulatory disclosure, best vendor solution for data governance, and best compliance as a service solution.

As well as specific product and service categories, awards were presented for the best solutions meeting the requirements of specific regulations including Markets in Financial Instruments Directive II (MiFID II), Market Abuse Regulation (MAR) and General Data Protection Regulation (GDPR) – see complete list of categories and winners below.

The awards were presented by Lauren McAughtry, A-Team RegTech editor, at a well-attended ceremony at Brewers’ Hall in the City of London today. McAughtry said: “Congratulations to all the winners of this year’s A-Team Group RegTech Awards. It is a pleasure to present the awards and recognise some of the established vendors and start-ups offering RegTech solutions designed to improve market participants’ regulatory response and provide efficient and effective compliance. The burden of regulation is putting pressure on our industry to change its approach to compliance technology, and it is RegTech that will lead the way.”

This year’s A-Team RegTech awards received a record number of votes and celebrated not only winning vendors of RegTech solutions, but also highly commended providers that came very close in the voting.

The winners are:

  • Best AI Solution for Regulatory Compliance: IBM Watson Regulatory Compliance
  • Best Benchmark Regulations Vendor Solution: RIMES RegFocus BMR Control
  • Best Best-Execution Solution: Thomson Reuters Velocity Analytics
  • Best Data Management Solution for Regulatory Compliance: Asset Control - AC Plus
  • Best Data Solution for Tax Compliance: Bureau van Dijk
  • Best GDPR Vendor Solution: Collibra
  • Best Compliance as a Service Solution: Thomson Reuters Regulatory Intelligence
  • Best KYC Software for Client On-Boarding: Fenergo - Client Lifecycle Management Software
  • Best Reference Data Provider for Regulatory Compliance: Bloomberg
  • Best Regulatory Alert Management Solution: IBM Watson Regulatory Compliance
  • Best Regulatory Consultancy – Europe: Deloitte
  • Best Research Subscription Management Solution for MiFID II: Red Deer
  • Best Risk Calculation for Regulatory Compliance: AxiomSL
  • Best Solution for Managing Conduct Risk: Corlytics (Highly Commended: Revista Systems)
  • Best Solution for Managing Operational Risk: Thomson Reuters Connected Risk
  • Best Solution for Records Retention: smartAnalytics (Highly Commended: SteelEye)
  • Best Solution for Securities Financing Transactions Regulation: RegTek Solutions Validate.Trade for SFTR
  • Best Time Synchronisation Solution: Corvil UTC Traceability Solution for MiFID and CAT Compliance
  • Best Trade Repository for Regulatory Disclosure: DTCC Global Trade Repository (GTR)
  • Best Trade Surveillance Solution for MAD/MAR: NICE Actimize
  • Best Vendor Solution for Data Governance: ASG - Enterprise Data Intelligence
  • Best Vendor Solution to Address a Dodd-Frank Requirement: AxiomSL
  • Best Vendor Solution to Address an FRTB Requirement: IBM Algo Aggregation for FRTB
  • Contribution to Market Understanding of MiFID II: SmartStream RDU

Author: ateamgroup
Posted: May 3, 2018, 3:47 pm

Arachnys has made key marketing and sales appointments, naming Steve Mann as chief marketing officer, Robert Lloyd-Watts as regional sales director for EMEA and APAC, and Bill Bennett as regional sales director, North America.

The company, a provider of customer risk decisioning solutions for Know Your Customer (KYC), Anti-Money Laundering (AML) and Extended Due Diligence (EDD), says the additions to the team are being made at an inflection point as financial services firms face increasingly complex investigatory and operational challenges. Ed Sander, Arachnys president, adds: “These highly experienced individuals will play a key role in delivering comprehensive customer risk decision solutions to our clients.”

Mann will focus on developing the Arachnys brand and driving demand for its financial crime solutions. He has over 25 years’ experience in B2B enterprise marketing, product strategy and brand development and has worked at companies including LexisNexis, SAP and Computer Associates.

Lloyd-Watts will focus on solving complex KYC and AML operational investigation requirements for financial institutions. He has over 20 years’ experience delivering financial crime solutions and most recently worked at BAE Systems on AML, KYC and fraud detection.

Bennett has over 20 years’ experience working with financial institutions to reduce risk and improve performance. Prior to joining Arachnys, he was at SAS, where he delivered risk and compliance analytics solutions for global accounts.

Author: ateamgroup
Posted: May 2, 2018, 1:11 pm

It is no secret that Markets in Financial Instruments Directive II (MiFID II) remains a work in progress, despite going live back in January. But what still needs to be done, how easy will it be, and how can firms work towards accruing the benefits of compliance?

A recent A-Team Group webinar, MiFID II Revisited, discussed ongoing issues of implementation, including problems around interpretation, data sourcing, reporting, transparency, and the European Securities and Markets Authority’s (ESMA) Financial Instruments Reference Data System (FIRDS). It also touched on the benefits of compliance.

Webinar Recording: MiFID II revisited

The webinar was moderated by A-Team editor Sarah Underwood, and joined by Beate Born, global MiFID II trading project lead at UBS; Olivier Rose, head of projects and international data management at Société Générale Securities Services; Matthew Luff, director at QuA Vodis; and John Mason, global head, regulatory and market structure propositions in the Financial and Risk Division at Thomson Reuters.

An early poll of the webinar audience asked which aspect of MiFID II delivery was most challenging. Some 45% of respondents said interpreting the regulatory requirements, 38% noted sourcing, mapping and integrating MiFID II data, 8% late changes to the regulation made by ESMA, and 8% testing and integrating with Approved Publication Arrangements (APAs) and Approved Reporting Mechanisms (ARMs). A lucky 2% said they did not face any challenges.

Featured Download: Poll results on MiFID II revisited from our recent webinar audience

Commenting on the interpretation problem, Mason said an interpretive approach to MiFID II implementation has created inconsistency. He gave the example of two APAs making different interpretations of the requirements, which would result in different trade reporting, perhaps based on different data quality metrics or timeliness, and push both results into the MiFID II transparency regime.

On challenges around data sourcing, mapping and integration, Rose noted that all firms had ‘integrated something’, but now need to focus on where they are sourcing data from, whether it is correct and working for them, and whether it meets their expectations. He highlighted the arrival of the Legal Entity Identifier (LEI) as a mandatory addition to MiFID II reporting next month, saying: “The LEI is a major issue. Sourcing LEIs for the first time is relatively easy, renewing them is a nightmare.”

Discussing APAs and ARMs, Born said reporting to these mechanisms is not working particularly well at the moment, but suggested data quality will improve over coming months and ease reporting problems. Luff noted many buy-side firms moving to the assisted reporting model, which allows them to use third parties to support regulatory reporting. He also identified problems with accurate counterparty matching in transaction reporting and said this may need to be resolved by intervention from ESMA.

The speakers agreed that the early months of MiFID II are not delivering expected levels of consistent transparency, an issue that firms can address by improving adoption of standards and that regulators are more than likely to revisit.

ESMA’s FIRDS remains something of a dilemma, with the regulator saying it is not a source of golden reference data, but firms using it as such because there is no other way to source the data. Mason pointed out that regulators have not previously been part of firms’ operational workflows and said that, in this instance, FIRDS is not working as well as it should and needs improvement around data consistency and timeliness of publication to support effective trade and transaction reporting.

While there is much still to do to ensure sustainable MiFID II solutions, compliance does bring benefits, including business opportunities based on datasets created by the regulation’s requirements.

Listen to a recording of the webinar to find out more about:

  • Ongoing MiFID II challenges
  • Best practice solutions
  • New data generated by MiFID II
  • Likelihood of a consolidated tape provider
  • How to gain benefits from compliance
Author: ateamgroup
Posted: May 2, 2018, 10:44 am

Fenergo pulled off a coup today with the announcement of industry stalwart Conor Coughlan as its new Global Head of Marketing, a strategic poach from Thomson Reuters, where Coughlan has headed the marketing function for the risk division for the past two years.

An award-winning B2B financial services marketer, Coughlan is a leading RegTech social influencer and a recognised thought leader with over 19 years’ experience in financial services. At Fenergo, he will focus on its Client Lifecycle Management software solutions for financial institutions, which aim to help firms efficiently manage end-to-end regulatory onboarding and entity data management processes.

Reporting directly to CEO Marc Murphy, Coughlan’s new role should provide ample opportunities for him to spread his wings. “Conor has been brought onboard to pioneer Fenergo’s digital marketing transformation and to deepen our penetration within the financial services sector,” noted Murphy.

“I am delighted to be joining Fenergo and leading a team of such talented, committed and passionate marketing, communications and business development professionals,” said Coughlan. “Fenergo has established itself as a market leader and I look forward to helping the company build on its success and bring it to the next level.”

Author: ateamgroup
Posted: May 1, 2018, 1:30 pm

AIM Software is building its client base in North America with the selection of its GAIN enterprise data management (EDM) software by Conning, a global investment management firm headquartered in the region. The investment management company is implementing three GAIN modules covering security master, portfolio pricing and corporate actions as part of a drive to realise its 10-year strategic growth plan.

Maurice Heffernan, chief information officer at Conning, says: “We selected AIM Software because of its focus on the buy side and its productised approach. GAIN will serve as a data quality firewall that powers our downstream systems and operations with high quality data. It will also provide a centralised point of control for all of our data sources.”

Brian Baczyk, chief data officer at Conning, adds: “The platform will provide us with both the governance and agility required to manage business change and execute on our strategic growth goals.”

Conning will use the GAIN business applications as a centralised data hub to streamline the sourcing, scrubbing and validation of its security, pricing and corporate actions data, and to sustain data lineage and data governance. GAIN will be integrated with Conning’s IBOR and ABOR systems to deliver consistent and reliable data across all functions.

The deal complements AIM Software’s expansion in North America earlier this year, when it recruited a number of senior executives including Sanjay Vatsa, head of Americas. Commenting on the addition of Conning to the company’s client portfolio, Vatsa says: “We are pleased to welcome Conning to the AIM Software user community and to be playing a key role in the realisation of its growth strategy.”

Author: ateamgroup
Posted: May 1, 2018, 8:00 am

Changes to the controversial Consolidated Audit Trail (CAT) – which will record all equities and options traded in the US – have been put forward in a bid to tackle serious concerns in the industry about the level of sensitive client data the CAT will collect. The CAT Operating Committee approved a different approach to personally identifiable information (PII) collection earlier this month ‘in an effort to minimise the PII captured and stored in the CAT’. The change is not final, however, and is being discussed with the Securities and Exchange Commission (SEC).

The current CAT plan is to create a database that captures around 58 billion daily trading records and PII on some 100 million institutional and retail clients. This prompted the Securities Industry & Financial Markets Association (SIFMA) to warn last November: “This raises serious concerns around data protection and the ability to confidently secure critical investor information.”

Christopher Bok, programme manager with the Financial Information Forum (FIF) industry association, says the push now is for a two-stage approach to PII capture. He explains: “The first phase would require the reporting of firm designated identifiers (FDIDs), large trader identifiers (LTID) and legal entity identifiers (LEIs), as applicable on CAT order and allocation reports, concurrent with the start of industry member reporting. The second phase would require the building of an FDID request/response system, through which regulators could obtain from firms PII related to specific FDIDs, LTIDs and LEIs.”
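Schematically, the FIF proposal separates what is reported up front from what is resolvable on demand. The record and store below are invented shapes to illustrate that split, not the CAT technical specification.

```python
# Phase one: pseudonymous identifiers only; no PII travels to the CAT
phase_one_order_report = {
    "event": "NewOrder",
    "fdid": "FIRM-000123",           # firm designated identifier
    "ltid": "LT-99887",              # large trader identifier, if applicable
    "lei": "EXAMPLELEI0000000001",   # placeholder LEI
}

def resolve_fdid(fdid, firm_pii_store):
    """Phase two: only on a regulator's request/response query does the
    firm hand over the PII behind a specific FDID."""
    return firm_pii_store.get(fdid)

pii_store = {"FIRM-000123": {"name": "Jane Doe", "account": "A-1"}}
print(resolve_fdid(phase_one_order_report["fdid"], pii_store))
```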

The current CAT schedule is for large broker-dealers and self-regulatory organisations (SROs) to start sending data to the CAT Processor this November, with smaller firms following a year later in November 2019. FIF and the SROs have both called for changes to the start of reporting.

Bok said the SROs have proposed an extended implementation timeframe that would trigger reporting on April 13, 2020. The Plan Processor is working off this date in the technical spec drafting process. But FIF has put forward a more streamlined approach that it believes should allow phase one reporting to begin in November 2019.

Bok explains: “The current CAT implementation plan outlined in the CAT NMS Plan encompasses all CAT-reportable events – equities, options, allocations, PII – and that plan is working towards an April 13, 2020 implementation date, although the date has not been officially approved by the SEC. The FIF plan, in contrast, suggests initial reporting is possible in November 2019 because it would only include equities and equity market-making activity, with options and allocations, expanded equities and PII reported at a later date. The idea is that the CAT would phase in reportable events to allow it to develop and for complex business scenarios to get fleshed out and developed.”

Author: ateamgroup
Posted: April 30, 2018, 5:14 pm

Winston Churchill probably doesn’t spring to mind when considering Markets in Financial Instruments Directive II (MiFID II), but his famous Battle of El Alamein description – “This is not the end, it is not even the beginning of the end, but it is, perhaps, the end of the beginning” – fits the new EU regulation nicely.

Now we’re past MiFID II’s 3 January D-Day (sorry Winston), data management practitioners may be tempted to breathe a sigh of relief over its relative success. Some companies may already be scaling back their MiFID II teams, but work on the regulation is far from complete. So, what data management challenges remain, and when can firms expect to extract business benefits from the regulation?

The impact of MiFID II, which requires every instrument traded in Europe to have a set of reference and reporting data, has been huge. John Mason, global head of regulatory and market structure propositions at Thomson Reuters, estimates that around 3 trillion new data points have been introduced as a result of MiFID II. And the change keeps coming. Last month, the European Securities and Markets Authority (ESMA) introduced MiFID II trading restrictions relating to the double volume cap (DVC). In June, it will bring in its six-month-delayed requirement for mandatory Legal Entity Identifiers (LEIs) to be used across financial transactions. And in September, it is due to rule on which investment firms need to become Systematic Internalisers (SIs) based on trading volumes.

Live issues

In the meantime, MiFID II practitioners and observers point to plenty of live data sourcing, reporting, standardisation and consolidation issues. Which is inevitable, says Peter Moss, CEO of regulatory and trading reference data provider SmartStream RDU, considering the scale of MiFID II. He says: “The regulators have defined how data needs to be collected, but it’s coming from several hundred organisations. The reality is that several hundred organisations don’t end up having the same interpretation of what the standard is, not initially. The biggest problem we’re facing is just the bedding down of all of that new data.”

By way of example, Moss cites: “Definitional challenges around ISINs – the unique number that is used to identify instruments – how should they be allocated? Take an FX swap. The market would like to report that as two items because the industry generally feels that is the better approach, but the regulation says one, so there’s a conversation going on around that.”

Another live issue Moss notes is around data fields. He says: “ESMA says a data field is optional, but when it comes to a certain type of instrument it’s actually not. Take the maturity date on an interest rate swap. The maturities attribute is not relevant for some fields but is relevant and necessary for an interest rate swap, and some venues aren’t supplying it. Occasionally you get attributes that are just incorrectly understood, so you get invalid fields in the attribute.”
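Moss’s maturity-date example boils down to conditionally mandatory fields. A toy validator, with an invented rule table and field names, might look like this:

```python
# Fields that are optional overall but mandatory for particular instrument
# types; the rule table and names are invented for the example
CONDITIONALLY_REQUIRED = {"maturity_date": {"interest_rate_swap"}}

def validate_record(record):
    errors = []
    for field, instrument_types in CONDITIONALLY_REQUIRED.items():
        if record.get("instrument_type") in instrument_types and not record.get(field):
            errors.append(f"{field} is required for {record['instrument_type']}")
    return errors

print(validate_record({"instrument_type": "interest_rate_swap", "isin": "EZXXXXXXXXXX"}))
# ['maturity_date is required for interest_rate_swap']
```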

Mason agrees there are problems to deal with. He points to issues such as whether an instrument has been deemed traded on a trading venue (TOTV), and therefore whether it is reportable, and how instrument classifications are made. Mason classes these issues as ‘teething problems’, but they do still need work. As Moss warns: “A lot of firms decided that after 3 January, they would start to wind down their MiFID teams. The reality is you can’t stop, this is still an actively changing space and I think MiFID projects will be required for most of this year. They’ll need to continue to fix the things that aren’t quite working and stay aligned with the tweaks that ESMA and the regulators are putting out there.”

Beyond the teething problems

Looking forward, as well as ongoing ‘tweaks’ in the DVC, LEI, SI and other areas, the likelihood is that ESMA and other regulators will introduce more significant changes once they have reviewed the initial impact of MiFID II.

For example, Moss believes there are about 9 million instruments in the Financial Instrument Reference Database System (FIRDS), the EU’s consolidated source of financial reference data. But he says: “We have a client that expects the number of instruments to go up to several hundred million and, to be honest, there are organisations that are already struggling with data volumes. I think data volumes will grow to the point where ESMA might choose to moderate FIRDS to prevent volumes becoming too large.”

Mason too predicts major change as a result of MiFID II. He says: “I think we’ll see some fundamental shifts, as well as things just coming out in the wash. The inconsistencies across regulators on deferral periods for reporting mean we may see ESMA step in and say everything’s going to be deferred across Europe in an equivalent way.

“Changes should be about levelling the playing field. We have some Approved Publication Arrangements (APAs) that initially published data without necessarily feeling the need to do any validation, correction or enrichment to the data. The publication of data needs to become more consistent and drive greater transparency.

“Also, the level of transparency in the market isn’t what people were expecting it to be, either because data is not consistent or there isn’t enough data, and that’s potentially down to ESMA and the National Competent Authorities (NCAs) in terms of waivers, deferrals, large-scale block trading, all the things that are preventing publication of instruments to the marketplace for perfectly valid reasons. We may see some fundamental moves there and I suspect the move towards a consolidated tape in Europe is almost inevitable.”

Another unintended consequence of MiFID II is that ESMA’s FIRDS database is taking on the role of providing golden source reference and reporting data, despite the regulator’s opposite intention. As Moss says: “ESMA’s FIRDS database isn’t supposed to be a golden source, which ESMA has been very clear about, but the truth is, the data in the FIRDS database is absolutely essential. It’s coming from hundreds of organisations and there is no easy way for anybody else to actually pull the data together. So, ESMA may not want FIRDS to be used for golden copy, but there is no other source of the data, so it has inevitably become golden copy.”

Mason agrees: “For all ESMA’s comments about FIRDS not being a golden source, it would take a brave organisation to step away from whatever the regulator is classifying something as and say, ‘we think it’s something else’. It may happen over time, but it’s going to take an ongoing dialogue with the regulator to come to a consensus.”

Ultimate prize

One concern around MiFID II’s teething troubles and potential future change is that data managers will be preoccupied with this and, perhaps, take their eyes off the prize of extracting business benefit from all their regulatory work.

Considering this, Mason sees a route to achieving benefits through data standardisation enforced by MiFID II and other regulations. He says: “If I were a data manager, I would be pushing my organisation to adopt standards, because we know regulation is not going away. The concern is that regulators have asked for the same thing three different ways, and people have responded to this on a regulation-by-regulation basis. But there’s no reason why the Alternative Investment Fund Managers Directive (AIFMD) should be treated differently to MiFID II and European Market Infrastructure Regulation (EMIR). We should be able to classify financial instruments in the same way across the industry and I think we’ll see regulators starting to standardise taxonomies and classifications.

“LEI codes are a standard, ISINs are a standard. There are certain codification areas where standardisation can start. For data managers, there is a business benefit in that because once you start standardising, you start being able to get a more holistic view of your data. Your data can start going to work for you, you can start mining your data as a data treasure, not just to support a financial process.”

Mason contrasts financial services firms to data-led businesses such as Google or Yahoo. He says: “There is value in our data. If you think of the transactions that financial institutions make and the resulting details they have on us as individuals, there is richness in datasets, but I don’t think we necessarily tap into that enough. Google, Yahoo, these new businesses, make money from data and I think that will start to influence thinking in the financial services industry as to how data and the management of data can lend itself to business benefit. MiFID II hasn’t been implemented to drive that, but it’s a healthy by-product.”

Moss suggests firms could benefit from MiFID II long-term by putting in place strategic and sustainable regulatory data management systems, rather than making tactical changes for each regulation. He says: “Data managers need to continue to fix the things that aren’t quite working and stay aligned with ESMA, but a lot of firms have thrown stuff together, they’ve got infrastructure and a set of processes, but they’re not necessarily robust and sustainable for the longer term. Some of them may be a bit clunky from a technology perspective. Firms are starting to look at this and what they can do to make sure the processes they’ve built are suitable for the next 20 years. This is an ongoing requirement, it’s not going to go away.”

In the short term, however, Mason suggests the focus needs to be on the here and now. He concludes: “Over the next few months, MiFID II will be all about best execution. Are firms sourcing the data they need to demonstrate best execution, are they monitoring best execution, and are they providing clients with a high level of service here? This is where competition lies. Similarly, do firms have all the data they need for transaction reporting? I don’t think this is done and dusted and, three months in, not many firms, if any, can say ‘yes, transaction reporting is a well-oiled machine with all of the data attributes we need’.”

Author: ateamgroup
Posted: April 25, 2018, 10:37 am

Silwood Technology, a vendor of metadata discovery software, has expanded its series of General Data Protection Regulation (GDPR) Starter Packs for major application packages. Starter Packs for Oracle E-Business Suite and Microsoft Dynamics AX 2012 join existing solutions for SAP, JD Edwards and Siebel that are designed to help users quickly and simply identify the precise location of personal data.

Acknowledging the 25 May 2018 compliance deadline of GDPR, the company is encouraging application package users and its channel partners to ensure they have, at minimum, instigated the discovery and documentation of personal data before the deadline.

The GDPR Starter Packs are based on Safyr, Silwood’s metadata discovery software that enables users to access, understand, share and use the underlying data structures of major application packages.

Nick Porter, founder and technical director at Silwood, says there are two reasons for interest in the Starter Packs: “First, it is highly unlikely that all organisations will have documented personal data in leading application packages by 25 May. Those that are not fully compliant will need to undertake this work later as part of their data protection programme, but there is an urgency now to find tools to find data quickly and accurately.

“Second, GDPR compliance is not a one-time event. When maintaining compliance, Safyr will be of value in keeping data catalogues, inventories and glossaries up to date with the locations of personal data across packaged applications”.

Customers using Silwood’s Safyr application for supported application packages can request a Starter Pack free of charge. Each pack contains a set of Safyr Subject Areas that describe and identify each type of personal information, including social security number, IP address and phone numbers. The packs work with customer systems by overlaying the subject areas on metadata extracted from customised application packages.
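A crude flavour of what such discovery involves: scanning extracted metadata for column names that suggest personal data. The patterns and function below are invented for illustration and far simpler than Safyr’s actual subject areas.

```python
import re

# Invented name patterns hinting at personal data
PII_PATTERNS = [r"ssn|social_sec", r"ip_addr", r"phone", r"email", r"birth"]

def find_pii_columns(catalog):
    """catalog: iterable of (table, column) pairs from extracted metadata;
    returns the pairs whose column names look like personal data."""
    return [(table, column) for table, column in catalog
            if any(re.search(p, column, re.IGNORECASE) for p in PII_PATTERNS)]

print(find_pii_columns([("EMPLOYEES", "SSN_ENCRYPTED"), ("ORDERS", "ORDER_ID")]))
# [('EMPLOYEES', 'SSN_ENCRYPTED')]
```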

Author: ateamgroup
Posted: April 25, 2018, 10:02 am
