Data Management Review

Data Management Review (formerly Reference Data Review) is your single destination for knowledge and resources covering data management approaches, trends and challenges as well as all the regulations impacting financial data management for the enterprise.

Global investment house Schroders has become the first asset manager to partner with specialist benchmarking and data analytics provider The Disruption House (TDH) to assess the capabilities of new technology providers.

TDH says the aim is to “support Schroders’ innovation agenda and assist the firm in its goal of engaging more start-ups”.

“We are committed to a programme of innovation throughout the firm and are keen to work with appropriate early stage companies,” confirmed Stewart Carmichael, Chief Technology Officer at Schroders. “One of the challenges we face is having the time to evaluate all the options under consideration, when there are so many startups offering similar services. We hope that the accelerated market scanning and the company assessments will help us to identify and evaluate innovative startups quickly and effectively.”

The TDH platform helps financial institutions to assess and monitor startup counterparty risk through its TDH Scorecard program, which reviews and rates technology providers based on a range of criteria including financial performance, business model, customer service, leadership and technology solutions. It hopes that this approach can help financial institutions to feel more comfortable when selecting younger RegTech firms.

The company was founded by Rupert Bull, co-founder of research organization Expand Research, which was subsequently sold to Boston Consulting Group, and Chris Corson, who ran the emerging markets business and was a member of the global fixed income business at Credit Suisse.

Related: 
RegTech Summit for Capital Markets - London, 4th October 2018
Author: ateamgroup
Posted: July 18, 2018, 2:36 pm

Jackson National Asset Management (JNAM) has selected GoldenSource to provide a fully hosted data management platform to help meet its regulatory obligations. Specifically, the asset manager, a subsidiary of Jackson National Life Insurance, will use the GoldenSource platform to meet requirements for structured data elements inherent in the impending form N-PORT and N-CEN filings with the Securities and Exchange Commission.

The initiative is aimed at improving data quality, enabling increased capacity, automation and streamlined business processes. The GoldenSource platform will be used to generate golden copy data from multiple suppliers, ensuring the right data is available to meet JNAM’s requirements. To date, JNAM has used a single data supplier.
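
To make the idea of golden copy creation from multiple suppliers concrete, here is a minimal Python sketch of source-priority consolidation. It is purely illustrative: the vendor names, fields and priority rules are invented and do not reflect GoldenSource’s actual logic.

```python
# Illustrative golden copy consolidation: for each field, take the value from
# the highest-priority vendor that supplies it. Vendors and fields are invented.
SOURCE_PRIORITY = ["vendor_a", "vendor_b"]  # vendor_a preferred where available

vendor_records = {
    "vendor_a": {"isin": "US0000000001", "coupon": 2.5, "maturity": None},
    "vendor_b": {"isin": "US0000000001", "coupon": 2.5, "maturity": "2027-06-15"},
}

def build_golden_copy(records, priority):
    golden = {}
    fields = {f for rec in records.values() for f in rec}
    for field in sorted(fields):
        for source in priority:
            value = records.get(source, {}).get(field)
            if value is not None:
                golden[field] = {"value": value, "source": source}
                break
    return golden

print(build_golden_copy(vendor_records, SOURCE_PRIORITY))
```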

According to Daniel Koors, chief operating officer of Jackson National Asset Management, the firm recognised the need for a central data repository to support its activities in numerous areas. Using the GoldenSource hosted solution, he says, JNAM was able to integrate several different data suppliers “with minimal IT involvement from our side,” allowing the firm “to transition to a comprehensive data infrastructure solution to support our growing business.”

JNAM is an SEC-registered investment adviser that provides investment advisory, fund accounting and administration services for several funds and separate accounts that support parent company Jackson National Life’s variable products and employee 401(k) retirement plan.

Author: ateamgroup
Posted: July 18, 2018, 1:07 pm

By Ken Krupa, Chief Technology Officer, MarkLogic

The standard corporate financial reporting language known as eXtensible Business Reporting Language, or XBRL, has been around for about 20 years. Based on the eXtensible Markup Language (XML) of the World Wide Web Consortium (W3C), XBRL is a common global mechanism for communicating and sharing data across business systems. It has been adopted for the reporting of accounting, finance, tax, risk and climate change data, and is increasingly used for enterprise data management. It is used by the United States Securities and Exchange Commission (SEC) and the Committee of European Banking Supervisors, as well as the UK’s HM Revenue and Customs (HMRC) and Companies House. But why is XBRL arguably more important now than ever before?

Digital transformation is not a new term, but its impact is being felt across industries. It is therefore unsurprising that business reporting, accounting and auditing are among the next in line to be forced to adapt to the digital age. From 1 January 2020, new European Securities and Markets Authority (ESMA) requirements under the European Single Electronic Format (ESEF) take effect, and all European Union issuers will be required to use XBRL when filing their annual financial reports. In doing so, ESMA effectively mandates that all consolidated financial statements prepared under International Financial Reporting Standards (IFRS) be rendered machine-readable.

There is good reason for this. Corporate reporting cannot remain paper-based in a digital age. Annual reports are central to how any company understands and communicates its financial position. ESMA’s regulators believe that moving all corporate reporting to a standardised digital format will improve transparency and make annual financial statements easier to analyse, compare and access, for internal and external stakeholders as well as the general public. Regulators have recognised the shift to digital and are doing what they can to embrace it. Now it’s time for businesses to take note as well.

The implications of this regulation go beyond a standard digital reporting format, though. On the compliance side, ESEF will require all affected businesses to be able to demonstrate the taxonomy, that is, the structure used to classify their financial information, which will lead to better data governance. There is also a significant practical benefit to the change. Adopting XBRL as a corporate reporting language removes the often arduous and error-prone process of manually analysing large amounts of corporate financial information.

In making data machine-readable through XBRL, the ESEF directive will make the financial information of more than 5,000 companies in the European Union easily transferrable across technologies that natively process XML, such as NoSQL databases. In the UK, according to a white paper by the Financial Reporting Council, more than two million companies already report using Inline XBRL (iXBRL) to HMRC, while another two million file their accounts using iXBRL with Companies House. However, ESEF will require many more companies, including all listed companies, to file digital accounts with XBRL in the near future.
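
To show what ‘machine-readable’ means in practice, below is a minimal, illustrative XBRL-style instance fragment and a few lines of Python that extract a tagged fact. The XBRL instance namespace is the standard one, but the taxonomy namespace, tag name and values are invented for the example and do not reference a published taxonomy.

```python
import xml.etree.ElementTree as ET

# A minimal, illustrative XBRL-style fragment (the ifrs-x namespace and the
# Revenue tag are invented; real filings reference a published taxonomy).
INSTANCE = """<?xml version="1.0"?>
<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:ifrs-x="http://example.com/ifrs-taxonomy">
  <context id="FY2017">
    <period><startDate>2017-01-01</startDate><endDate>2017-12-31</endDate></period>
  </context>
  <unit id="EUR"><measure>iso4217:EUR</measure></unit>
  <ifrs-x:Revenue contextRef="FY2017" unitRef="EUR" decimals="0">1250000</ifrs-x:Revenue>
</xbrl>"""

NS = {"ifrs-x": "http://example.com/ifrs-taxonomy"}
fact = ET.fromstring(INSTANCE).find("ifrs-x:Revenue", NS)
print(fact.attrib["contextRef"], fact.attrib["unitRef"], int(fact.text))
```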

This is a sign of things to come in the UK and across the globe. The Bank of Japan was among the early adopters, but more recently the Bank of England announced a Proof of Concept (PoC) project to explore how XBRL could help it to significantly reduce the cost of change, drive resource efficiencies and improve speed and flexibility of access to large quantities of regulatory data from financial institutions. The NTT DATA-led PoC, which uses a MarkLogic NoSQL database at its core, gives users the ability to browse data on a web application whilst masking the perceived complexity of XBRL. Early indications show that the PoC is providing significant benefits in the speed at which regulatory data is being imported, stored, analysed and visualised.

Despite this shift towards digital reporting as a standard, many businesses may be thinking that the regulation is a long way off and that it is too early to act. That may be a workable strategy for some organisations, but for most businesses that want to make better use of the financial data at their disposal, shifting to XBRL reporting now will deliver immediate rewards beyond regulatory mandates. All too often, data for internal reporting, statutory reporting, tax reporting and prudential reporting is stored in silos, preventing businesses from making informed decisions about corporate data as a whole. A NoSQL database that uses XBRL can circumvent these issues.

The Financial Reporting Council has echoed this call to action. In its report, it recommends that “Compliance teams should embrace the adoption of the new ESEF standard as an opportunity to take a leap forward in digitalising the business reporting process, rather than seeing the new ESMA regulation as a reporting burden.” In short, forward-thinking businesses will see this change as a positive means to get their data to meet compliance requirements now and in the future, and to ensure that data is always visible and available for contextual analysis.

Solving these data challenges requires a database that empowers businesses to integrate all of their data with minimal disruption to the business. We recommend a design approach for reporting solutions that ensures agility and flexibility, while supporting multiple business outcomes that leverage the same integrated data sets. The solution should deliver a regulatory reporting platform that incorporates best practices and operational effectiveness and allows for adaptive growth in scope and scale. The design goal should not be to remedy one-off reporting requests, but to build in a capability to respond to emerging requirements with relative ease and cost efficiency.

As many companies have already discovered, there is an easy way to bring all these data silos together. Using an Operational Data Hub (ODH) for financial data - built on a flexible, enterprise-grade NoSQL database with integrated Google-like search - can pay dividends for data challenges where the data and requests from regulators change over time. Now is the time to act.

Author: ateamgroup
Posted: July 18, 2018, 10:48 am

Regtech and compliance specialist NorthRow has been selected to join the fourth cohort of the Financial Conduct Authority (FCA)’s Regulatory Sandbox, an initiative launched in 2015 to provide a channel for businesses to test products and services in a controlled and customised live market environment.

NorthRow (formerly Contego) delivers customer onboarding through a combination of automated and managed identity verification (IDV), AML and KYC services, provided via a single API. Launched in 2011, it was recently selected to handle the onboarding verification process for the Open Banking Implementation Entity (OBIE), a body created by the UK’s Competition and Markets Authority in 2016 to create software standards and industry guidelines that drive competition and innovation in UK retail banking.

Adrian Black, CEO of NorthRow, said: “Accessing regulated financial data in a protected environment differentiates us from other identity verification providers who rely on manual, time-consuming processes to verify bank account data. It is the first step in our ambition to achieve FCA approval, which is required to access bank account data.”

The FCA received 69 submissions for the fourth phase of the Sandbox, with 29 applications accepted – the largest Sandbox cohort to date.

Author: ateamgroup
Posted: July 16, 2018, 1:57 pm

By Liz Blake, Global Head of Eagle Managed Services

According to a recent Experian white paper, ‘Building a Business Case for Data Quality’, 83% of organisations have seen bad data stand in the way of reaching key business objectives. In particular, the research identified lost sales opportunities, inefficient processes, and client relationships as among the more prominent areas affected, but also underscored that the internal impact can extend all the way to the culture of the organisation.

Nearly everyone today recognises the challenges created by the exponential growth in the volume, velocity and variety of data. How asset managers deal with this information glut, however, can dictate whether it presents an opportunity or a threat.

True data visionaries, for instance, will rethink their data function altogether to leverage the right balance of technology and services to instil newfound agility and ensure data is working for the business, not against it. This is in stark contrast to an ‘incrementalist’ mentality, in which asset managers simply tack new capabilities onto legacy systems and struggle to keep pace with mounting internal and external demands.

The rise of managed services runs parallel to the transformation trend occurring across the industry as organisations seek new ways to streamline existing processes while making their data actionable through enhanced reporting, timely insights and evidenced-based decision making. It’s against this backdrop of transformation that managed services is equipping chief operating officers with a new paradigm to conceive a global operating model featuring scalable cost-effective solutions that solve current needs and future-proof the organisation as requirements evolve.

It is sometimes difficult to conceive the extent to which bad data can affect the culture of an organisation. In fact, it’s often not until after a transformation is complete and new capabilities have been put in place that the larger enterprise fully appreciates the advantages of a sustainable and robust data solution.

In one example that I will outline, Eagle Managed Services was tapped by a global asset manager whose internal data function effectively served as a triage unit to reconcile and resolve data errors. On certain days, the errors and false positives might number in the thousands over a 24-hour cycle. Given the capacity issues, in large part due to poor data quality, the data process was managed on a monthly basis. This only magnified the pressure to validate the data and deliver it in a timely manner across the enterprise.

A baseline analysis helped identify the breadth of the problem. We also supported testing for the upgrade process and ultimately took over the file and data monitoring functions. The number of errors was reduced significantly in short order and, following the systems upgrade, we incorporated daily controls and automated data quality checks. In addition to providing a permanent fix, this accelerated the pace at which monthly reporting could be produced for both internal and external clients.
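
As a generic illustration of what daily, automated data quality controls can look like (a simple sketch, not Eagle’s implementation; the record fields and tolerances are invented), consider:

```python
from datetime import date

# Illustrative daily data-quality control: flag records with missing fields,
# non-positive values or stale prices.
def run_daily_checks(records, as_of, max_staleness_days=3):
    exceptions = []
    for rec in records:
        if rec.get("price") is None:
            exceptions.append((rec["id"], "missing price"))
        elif rec["price"] <= 0:
            exceptions.append((rec["id"], "non-positive price"))
        if (as_of - rec["price_date"]).days > max_staleness_days:
            exceptions.append((rec["id"], "stale price"))
    return exceptions

records = [
    {"id": "BOND-1", "price": 101.2, "price_date": date(2018, 7, 13)},
    {"id": "BOND-2", "price": None,  "price_date": date(2018, 7, 2)},
]
print(run_daily_checks(records, as_of=date(2018, 7, 16)))
```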

More importantly, though, the data management function, with the support of managed services, became a strategic asset to the organisation as opposed to a bottleneck and source of scepticism. This is just one example and it only scratches the surface of the types of capabilities a managed services offering can impart. But it speaks to why so many organisations are rethinking their global operating model. And for many, managed services has simply become part of the buy versus build analysis that accompanies any new investment.

At our recent client conference, one of the speakers noted that before any operational decision is made ‘we will ask ourselves, is this a core competency?’ The speaker further explained that if it’s a function that can be managed by a vendor with specialised expertise, ‘money will be better spent on someone on the research side, seeking alpha’.

It may sound cut and dried, but recognising its core competencies, and acknowledging those areas that sit outside the core, is the first step an organisation takes towards becoming a true data visionary.

Author: ateamgroup
Posted: July 16, 2018, 9:00 am

By Martijn Groot, Vice President of Product Management, Asset Control

Business users across financial services are more data hungry than ever before. They want to interact with data directly and quickly collect, manipulate and analyse the data in order to streamline operations.

There are two drivers for this. First, users increasingly expect instant access to data. Second, jobs conducted by workers in risk, finance, control and operational roles are becoming more data intensive. Workers often need to aggregate content to draft a regulatory report, for example, or to sign off on a portfolio valuation.

To meet this growing user demand, organisations need to move to a new style of data management that manages large data volumes, and supports ease of access. Unfortunately, the data provision processes and technology infrastructure within financial institutions today are lagging behind this desired goal.

In addressing the challenge, the first step is to ‘acquire’ the necessary data sources and, if required, different perspectives on the data. Organisations may therefore need to assimilate and assess different prices and the differing opinions of brokers and market makers.

The next step is data mastering, which allows businesses to cross compare, cross reference and tie all the different data collection threads together. This helps them enrich their datasets or, for example, calculate average prices. The third element is making the data accessible to users, a process that includes ensuring it is easily embedded into workflows.
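
As a simplified illustration of the mastering step (cross-referencing two feeds on a common identifier and deriving a composite price), here is a short Python sketch. The identifiers, prices and field names are invented; real mastering involves far richer matching and validation rules.

```python
# Illustrative mastering: cross-reference two vendor feeds on a shared
# identifier (ISIN here) and derive a simple composite (average) price.
vendor_a = {"XS0000000001": {"price": 99.85, "currency": "EUR"}}
vendor_b = {"XS0000000001": {"price": 99.91, "currency": "EUR"},
            "XS0000000002": {"price": 101.10, "currency": "EUR"}}

def master(feeds):
    prices = {}
    for feed in feeds:
        for isin, rec in feed.items():
            prices.setdefault(isin, []).append(rec["price"])
    return {isin: {"composite_price": sum(p) / len(p), "sources": len(p)}
            for isin, p in prices.items()}

print(master([vendor_a, vendor_b]))
```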

In the past, businesses have tended to concentrate on sourcing as much data as possible and placing it in large data warehouses. Unfortunately, they have focused less on how to operationalise the data and make it accessible.

To address these issues, businesses need to look closely at the needs of the users they serve. The first group, operational users, need an overview of the whole data collection process. This should include insight into where data comes from, how much has been collected, and what gaps there are. Monitoring this gives the organisation an early warning if something goes wrong.

The second category consists of users who need to interact with the data. They might want to back-test a model or price a new complex security, and they need to be able to easily interrogate the data. The third group, data scientists, expect easy integration via languages like Python or R, or just enterprise search capabilities that enable them to quickly assess available datasets.

To address the needs of these groups, businesses need to deliver:

  • Visibility of the approved data production process to ease the operational burden and satisfy regulatory requirements
  • Easier programmatic integration for data scientists to enable them to access data easily and cheaply
  • A Google style enterprise search on the data set.
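
To illustrate the second point above, programmatic integration for data scientists often amounts to a thin API layer that returns analysis-ready data. The sketch below is hypothetical: the endpoint, parameters and response fields are invented, and the requests and pandas libraries are assumed to be available.

```python
import pandas as pd
import requests

# Hypothetical endpoint standing in for whatever API a data-management
# platform exposes; the URL, parameters and fields are invented.
BASE_URL = "https://data.example-internal.com/api/v1/timeseries"

def load_history(identifier, start, end):
    resp = requests.get(BASE_URL,
                        params={"id": identifier, "from": start, "to": end},
                        timeout=30)
    resp.raise_for_status()
    return pd.DataFrame(resp.json()["observations"])  # e.g. date, price, source

# Example usage (would require the hypothetical service to exist):
# df = load_history("XS0000000001", "2018-01-01", "2018-06-30")
# df.groupby("source")["price"].describe()
```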

Providing this level of business user enablement depends on having the right technological infrastructure in place. Many firms still carry complex legacy applications and, since the financial crisis, they have also faced significant cost pressures and need to get more from their existing infrastructure. There will therefore be a need to rationalise the landscape, but also a requirement to bring in new technologies to better deal with the data intensity of current risk and evaluation processes.

The current industry focus on prudent evaluation of risk and the emergence of regulations such as the Fundamental Review of the Trading Book (FRTB) are putting even greater pressure on financial services organisations. In line with the changes FRTB brings, market data and risk systems need to support more complex market data workflows, utilising new sources to meet real-price requirements and regulator-prescribed classifications. To manage all this, organisations need to find a way to smoothly source and integrate market data, track risk factor histories, and proactively manage data quality – all through one integrated and scalable platform.

Cloud and database technology

Often, organisations across this sector will need new capabilities to cater to the sheer volume of data they need to process. That typically means technologies that can manage new deployment models in the cloud, but also deliver ease of integration for data scientists and effective enterprise search for more general users.

From the database perspective, we see a trend for businesses to adopt new technologies such as NoSQL. Traditional database technologies are struggling to cope with the growing volumes of data these organisations are collecting via mobile banking apps and for regulatory filings, for example. NoSQL is also typically cheaper to run than these technologies. It scales more easily and delivers more flexible infrastructure cost control.

Finding a way forward

Today, organisations across the financial services sector are having to manage increasingly data intensive processes in areas like operations, evaluation and risk. At the same time, they are increasingly challenged by users who have different expectations of the data management systems they engage with and who are increasingly looking for a self-service approach.

In this new era of financial data management, they need to put in place new processes that focus on the needs of the user, and leverage technologies that are open and flexible and deliver high performance, ease of access and control.

Author: ateamgroup
Posted: July 12, 2018, 9:00 am

Client lifecycle management (CLM) solutions provider Fenergo is taking an ambitious jump into asset management with the launch of a strategic new buy-side division.

Headed up by Kevin O’Neill, formerly head of the US asset manager segment for the Royal Bank of Canada’s Investor & Treasury Services, the new team will focus on leveraging the firm’s existing client base to expand to the buy-side as part of its growth strategy.

Fenergo initially plans to build on its existing relationships with global financial institutions, many of which already incorporate asset and wealth management and/or fund administration entities. However, the firm is also in discussions with a number of larger European and US asset managers, and eventually expects a trickle-down effect into mid-size and boutique asset managers over the next few years as the digital agenda gains traction.

“What we are seeing to date is strong demand from our existing clients, and the feedback has been truly positive,” says O’Neill. “The buy-side represents a substantial opportunity for us.”

Fenergo believes that extending its onboarding and compliance capabilities to asset managers will enhance their end-client experience, reduce regulatory risk, and increase speed to market, making the buy-side more efficient.

But what new challenges will the buy-side represent, compared to the corporate and institutional base on which the firm has built its business? Apparently, surprisingly few. “Comparing the CLM demand aspect on both the buy-side and sell-side we in fact found many similarities, so extending our solutions to address onboarding and regulatory compliance requirements on the buy-side was a logical next step,” says O’Neill. “We don’t see a huge difference between the two sets of requirements, although there are of course some subtle distinctions, such as the need to meet different regulatory requirements.”

He identifies a number of key themes that Fenergo believes will be important for asset managers in the current environment, including continued market volatility, shifting centres of wealth towards emerging markets, and the importance of delivering high-quality performance with the right risk levels at the right price. “Given the intense competition in the financial services marketplace, the customer experience has to be seamless right from the onboarding process through to the complete lifecycle; we feel we have the technology and functionality to completely transform that experience for buy-side managers today.”

In the asset management space there is an ongoing shift towards alternatives, with more and more asset managers today looking towards private equity, liquid alternatives, real estate and infrastructure offerings. In these alternative segments, the underlying investments and fund structures can typically be extremely complex, involving multiple legal entities across numerous jurisdictions.

Having the ability to deal with this complexity, both at an entity level and at an investor level, is one of the key elements that Fenergo is keen to address with its new platform. “The complexity involved in distributing financial products to multiple jurisdictions can generate numerous administrative and regulatory challenges. Buy-side firms are struggling with that administrative burden. We want to ensure that their regulatory journey is aligned to their distribution strategy, and we are exploring how we can digitalize that journey to give clients access to the information through a simplified front door,” says O’Neill.

Although the firm will initially provide the same suite of CLM, onboarding, due diligence and regulatory compliance solutions to its buy-side clients, there are also plans afoot to eventually develop tailored products based on customer need.

To achieve this, Fenergo hopes to leverage its Regulatory and Technology Forums, where existing and potential clients gather to develop best practices and create new processes around client onboarding. “From an investment management perspective, what we are seeing is that managers are more willing to collaborate around those areas and that is a key part of our initiative,” notes O’Neill.

Author: ateamgroup
Posted: July 11, 2018, 1:04 pm

Banks are spending too much money trying to comply with big-ticket regulations because the tools they are using are not efficient enough. That is the view of IBM Watson Financial Services, which is in the process of expanding its regtech portfolio based on AI and machine learning, and covering governance, risk and compliance (GRC), financial crime, and financial risk.

IBM brought Watson Financial Services to market early last year before setting up a regtech business based on the Watson artificial intelligence and cognitive computing platform. At the same time, it acquired Promontory Financial Group, a specialist in risk management and regulatory compliance, which initially trained Watson on 60,000 regulatory citations.

Considering the challenges of regulatory compliance, Michael Curry, vice president of engineering at IBM Watson Financial Services, says: “Most of the banks we talk to do not have clarity on what their complete set of obligations are for the businesses and jurisdictions in which they operate. That is a scary prospect. On the other side, we know that many of our clients are struggling to be able to comply with some of these very big-ticket regulations like AML/KYC in a cost-efficient way. They are spending too much money trying to comply. That’s because the tools that they are using are not very efficient.”

Curry notes that current risk and compliance systems are disparate and disconnected, often with varying taxonomies, user interfaces, and skills requirements. He explains: “This makes it increasingly difficult to achieve a holistic view of risk and compliance, and results in a lack of confidence in what a bank’s real-time obligations and associated responsibilities are. Most importantly, this fails to engage the first line of defence that, lacking the proper tools, is often unable to make the best and most informed decisions. We are investing in how to engage these people more efficiently. It is the people at the coal face who need to be able to operationalise the system.”

One of IBM’s regtech developments based on IBM Watson capabilities addresses the efficiency of AML processes. David Marmer, vice president within the company’s industry platforms group, says: “This is an area where all the banks we work with are struggling. Most of the clients we talk to that have an existing AML implementation are seeing 90-99% false positives in the alerts that are coming out of the systems. We believe there is a huge opportunity to improve that, which is why we are investing heavily in a machine learning approach. We are seeing the ability to cut the false positives down by 30-50% on average, so we can really make a big impact.”
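
The false positive reduction described here can be pictured with a simple triage model: score historical alerts and auto-close only those the model is highly confident are false positives. The sketch below uses synthetic data and scikit-learn and is purely illustrative; it is not IBM’s approach or Watson’s implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic alerts: roughly 95% are false positives, mirroring the article's
# 90-99% figure. Features stand in for amounts, counts, risk scores, etc.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 2).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Auto-close only alerts the model scores as very unlikely to be true hits.
p_hit = model.predict_proba(X_test)[:, 1]
auto_close = p_hit < 0.05
print("alerts auto-closed:", round(float(auto_close.mean()), 3),
      "true hits wrongly closed:", int(y_test[auto_close].sum()))
```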

Moving forward, IBM is expanding its regtech offer through both organic growth and targeted investments. In May 2018, the company acquired Armanta, a provider of aggregation and analytics software to financial services firms. The acquisition will advance the company’s capabilities in financial risk, and there is more activity on the horizon, particularly around financial crime and AML/KYC.

Curry concludes: “As exposure to risk grows, more and more data must be considered to mitigate threats and gain competitive advantage. At the same time, organisations must keep pace with the growing volume of evolving regulations and compliance issues. That’s where our solutions come in. This is an extraordinarily strategic portfolio for us, and we are seeing very strong traction in this space.”

Author: ateamgroup
Posted: July 9, 2018, 9:56 am

Eagle Investment Systems is responding to financial institutions’ digital transformation programmes with a componentised platform, agile data management solutions, additional cloud options, and a revolution in the release of software updates.

To find out more about the company’s plans, we talked to Eagle CEO Mal Cullen about market development, client desires and the drive towards an open ecosystem that allows clients to make changes to their technology as requirements change, and partners to join the Eagle platform and extend its capabilities.

Cullen says: “Although Eagle has been a leader in data management solutions for over 20 years, the needs of our clients are changing, so we needed to invest in technology and resources.” He acknowledges that regulatory compliance continues to be a large part of Eagle’s business, but also recognises financial institutions’ requirement to move towards more efficient, agile and open systems. With this in mind, Eagle has made a major investment in redesigning its platform over the past 18 months.

Cullen explains: “There is technology available today unlike anything we have seen before. Our strategy is to create an open platform that allows components to be plugged in, provides the agility to swap out tools, and supports outsourcing where required.” The company’s commitment to collaboration as part of a componentised platform is clear, with about 40 strategic alliances already made, of which about one-third are new vendors in the market.

From a data perspective, Eagle is responding to client interest in being able to merge different datasets into existing datasets and integrate other types of data such as unstructured data and environmental, social and governance (ESG) data to create new datasets and gain more insight. Data can be delivered in flexible packaged solutions from the cloud, as a managed service or through application programming interfaces (APIs). Most recently, Eagle has added the option of allowing firms to use an API layer to build their own data extraction models.

The company’s cloud strategy is also developing. As well as delivering its data management, investment accounting and performance measurement solutions from its secure private cloud, Eagle Access, the company is collaborating with Microsoft to create a cloud-based multi-tenant data management platform. Offered through Microsoft Azure, the platform is designed to help investment managers capture the diverse data needed to manage assets and seek alpha. The relationship with Microsoft also builds on Eagle Access, creating a robust hybrid cloud offering. These developments provide cloud innovation that is increasingly expected by investment managers and also, as Cullen puts it, ‘a bigger surface area for Eagle to cover’.

The changes in Eagle’s platform design are matched by a major shift in how it releases software updates. The company previously released large updates every 18 months and released the last of these at the end of 2017. It has since moved to an agile development and deployment methodology, issuing smaller, targeted releases every four to six weeks, avoiding long gaps between large updates and giving clients more agility.

Eagle has about 200 direct clients on a global basis and is experiencing growing interest in its managed services, particularly in Europe and Asia. As the technology subsidiary of BNY Mellon, its solutions are also used by many of its parent’s clients, some of which could also be interested in joining the multi-tenanted data management platform.

Cullen says Eagle’s clients are split between firms seeking cost efficiencies by setting up new operating models that use managed services and outsourcing to deliver business requirements, and firms that have a growth perspective and are using Eagle solutions to gain insight from new data and add value to the front office. Whatever their approach, however, Cullen says clients and prospects are looking less at solution features and functions, and more at securing a long-term partner that can help them fulfil their strategy.

Author: ateamgroup
Posted: July 9, 2018, 8:00 am

The MiFID II mandate requiring the use of Legal Entity Identifiers (LEIs) to identify parties to financial transactions came into play on Monday, causing barely a ripple in trading markets, but still posing problems in terms of lapsed LEIs – a problem that Stephan Wolf, CEO of the Global LEI Foundation (GLEIF), suggests could be solved by embedding the identifier in digital certificates.

Day one of the delayed MiFID II mandate saw LEI issuance at a total of 1.2 million, although with 89 regulations using the identifier, it is impossible to assess how many LEIs have been issued to comply with MiFID II. The number of Local Operating Units (LOUs) – or LEI issuers – in the global LEI system stands at 32, with DTCC the largest issuer but losing market share, in part due to a subtle change made by the GLEIF in how LOUs operate.

While they were previously endorsed by the LEI Regulatory Oversight Committee (ROC) to operate in local jurisdictions, the GLEIF has ended the endorsement process and instead requires organisations to be able to demonstrate they have the necessary skills and access to local markets to become LOUs. Wolf provides an example: “An organisation can apply to issue LEIs in 10 jurisdictions, but it must be capable of doing so.”

Beyond MiFID II in the EU, the LEI is gaining traction in jurisdictions such as the US where it must be used by firms transacting with European counterparties, India where it is used to identify credit borrowers over a certain size, and Singapore and Hong Kong where it is also used as a regulatory identifier.

While getting the LEI this far since its inception in 2012 is quite a feat, particularly in light of previous identifier schemes that failed to flourish, it is not perfect, with lapsed LEIs causing concern. The problem stems from some jurisdictional regulations requiring use of the LEI and others requesting it. Exacerbating this, the MiFID II regime requires traders to renew LEIs annually, but not their counterparties. For example, if SAP is a counterparty, the European Securities and Markets Authority (ESMA) says a bank can’t insist on renewal. The US works along similar lines, although Canada requires all firms to renew LEIs on an annual basis.

This generates two types of LEI – those in financial services firms that are always renewed, and those of counterparties that don’t bother to renew them – which is why the number of lapsed LEIs is growing, says Wolf. He proposes a number of solutions to the problem, including banks asking counterparties for ‘power of attorney’ to register LEIs on their behalf and gather required reference data from an LOU.

A closer relationship between regulators and industry to make LEIs an industry tool could also help. For example, banks could issue onboarding policies requiring every client to have an LEI, a real benefit for Know Your Customer (KYC) and anti-money laundering (AML) processes. Alternatively, an LEI could be used in the front office to exchange data and build trust with a customer. Wolf says: “It is our job to look at the bottom line and add value for banks. For regulators we add transparency.”

The GLEIF runs regular meetings for LEI stakeholder groups such as LOUs, data vendors, and statisticians who gauge what the LEI is doing in the market. A group for technology firms is planned, as fintechs and regtechs build apps that could include the LEI.

On a larger scale and moving on from the initial rationale for the identifier, Wolf suggests the LEI could be included in digital certificates and become part of digital identification. He is keen to discuss this with industry and regulators, and envisages both private and public certificates to cover individuals in both their business and private capacity. He says: “This is the future of the digital economy. Including the LEI as an entity identifier in a digital certificate could combat cybercrime and identity theft, support the Internet of Things and make apps like KYC and client onboarding easier to fulfil.” The problem of lapsed LEIs would also be erased, as digital certificates must be renewed.
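
As a sketch of what embedding an LEI in a digital certificate might look like, the Python example below (using the third-party cryptography package) adds an LEI string as a custom X.509 extension in a self-signed certificate. The extension OID, the encoding and the placeholder LEI are assumptions made for illustration; they do not represent a GLEIF or eIDAS specification.

```python
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

LEI = "00000000000000000000"                      # placeholder, not a real LEI
LEI_EXT_OID = x509.ObjectIdentifier("2.999.1")    # example OID arc, illustrative only

key = ec.generate_private_key(ec.SECP256R1())
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "example-entity.com")])

cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)                            # self-signed for the sketch
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
    .add_extension(x509.UnrecognizedExtension(LEI_EXT_OID, LEI.encode("ascii")),
                   critical=False)
    .sign(key, hashes.SHA256())
)
# Read the embedded LEI back out of the certificate.
print(cert.extensions.get_extension_for_oid(LEI_EXT_OID).value.value)
```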

Applying for digital certificates from suppliers such as GlobalSign, Verisign and Thawte is voluntary, but that could begin to change with the introduction of eIDAS, an EU regulation covering electronic identification and trust services. Trust services are essentially services, such as electronic signatures, that confirm an online document or other electronic data comes from a trusted source, is authentic and has not been tampered with. The regulation came into effect on July 1, 2016, and by September 2018 EU governments must be able to allow citizens to use electronic identification to access online public services in all EU member states.

Wolf says: “eIDAS requires governments to give digital certificates to all users of their services. This is ground breaking, confirms identity and means no more signing of paper. It is only a matter of time until digital certificates are widely adopted. In that time, the GLEIF will work to include the LEI in certificates. This could make business transactions cheaper and access to transacted money faster.”

Author: ateamgroup
Posted: July 3, 2018, 10:14 am

The Derivatives Service Bureau (DSB), which is responsible for issuing ISINs for OTC derivatives under MiFID II, has released a second and final consultation paper regarding user fees and contracts for 2019. Industry feedback is invited on topics including potential changes to user support services, service level agreements and resiliency after some proposals in the first consultation were discarded due to industry feedback.

The first consultation sought industry views on a broad range of topics arising from user feedback during the prior 12 months. The second consultation is intended to summarise industry responses and set out further details, including next steps where additional feedback is provided. It opened on June 28th, 2018 and will close at 5pm UTC on July 27th, 2018. The second consultation document includes a response form to be emailed to industry_consultation@anna-dsb.com. A final consultation report is due to be published on August 20th, 2018.

Emma Kalliomaki, managing director of the DSB, says: “In the responses to the first consultation, some of the contrasting interests and needs of various user groups became apparent. In this second consultation, we are investigating these interests more deeply to determine the best path forward for the DSB and its users.” She noted that the recently formed DSB Technology Advisory Committee will also provide guidance on matters related to infrastructure, connectivity and disaster recovery.

Specific areas of investigation in the second consultation are: revision of the user categories and fee model; changes to DSB functionality, including provision of more market timeline adaptive template models; service levels, including time of operation, technical support, streaming thresholds and weekly caps; and access and user agreements, including potential changes to the terms of any differentiated agreements for intermediaries.

Author: ateamgroup
Posted: July 3, 2018, 9:59 am

StatPro has acquired the regulatory risk services bureau of ODDO BHF, which provides a managed service for regulatory risk reporting that will be integrated with the company’s Revolution portfolio analytics platform.

The deal was made for an undisclosed sum in cash and adds 10 clients to StatPro’s client base in Germany and Luxembourg. StatPro will also take on the employees of ODDO BHF risk services in Frankfurt, where they will join StatPro’s existing operations. The regulatory risk service extends StatPro’s capability and produces annualised recurring revenues of about €1.7 million.

Justin Wheatley, CEO at StatPro, says: “Over more than 10 years, ODDO BHF Bank has established its regulatory risk service as the benchmark in the German and Luxembourg markets. With this acquisition, we gain new clients, the expertise of the ODDO BHF risk team, and add to our existing managed services for valuations and performance measurement with risk reporting.

“Once software replacement is completed, we will expand the service to other geographies. Ultimately, we will be able to offer clients a choice of service delivery, with clients using either software-as-a-service or StatPro’s managed service.”

Author: ateamgroup
Posted: July 2, 2018, 4:25 pm

Despite the compliance deadline of Fundamental Review of the Trading Book (FRTB) regulation being pushed back to January 2022, the time to address the data management challenges of the regulation is now. A recent A-Team Group Webinar explored the key data challenges presented by the regulation, identified possible solutions, and discussed best practices for implementation.

Webinar Recording: Solving the data management challenges of FRTB

The Basel Committee’s Standards for Minimum Capital Requirements for Market Risk, also known as FRTB, first published in 2016, aim to address accepted weaknesses in the regulatory capital framework as it relates to firms’ trading books. The final part of the Basel III suite of rules, and effectively the last piece of major post-financial crisis regulation, FRTB upgrades the Basel 2.5 rules to ensure banks hold a big enough buffer of capital to protect against the risks they hold, not by inhibiting or reducing trading activity, but by making sure that where risks are taken, there is sufficient capital to support them.

Although the deadline has been pushed back, the amount of work that needs to be undertaken between now and then should not be underestimated – and not all firms have made the strongest start. In a poll of webinar participants, just 14% said they are making good progress, 36% remain in the planning stage, and a further 14% have not started at all.

Featured Download: Poll results on Solving the data management challenges of FRTB from our recent webinar audience

Ignacio Ruiz, founder and CEO of MoCaX Intelligence and former head of counterparty risk analytics at Credit Suisse, warned: “It might seem far away, but there is so much to do that if people don’t start now, it may be too late.”

Looking at the toughest data management challenges of FRTB, Jacob Rank-Broadley, director of regulatory and market structure propositions at Thomson Reuters, said: “Data sourcing and data quality are the two largest challenges for banks.” The choice of whether to use the internal or standardised model approach to satisfy the regulation’s requirements is also challenging. Ruiz said: “Using the internal model approach, the problem is not so much compliance, but the capital penalty, which is outrageous. Because there is a netting effect, it is very easy for the capital calculation to blow up as a result of poor quality data.”

Non-modellable risk factors (NMRFs) present further complexity and are linked to the risk factor eligibility test (RFET). NMRFs require banks to collect real price observations for executed trades or committed quotes on a regular basis. Rank-Broadley commented: “The idea of sourcing this kind of data is fundamentally new, at least from a risk perspective, so it is no small challenge for banks to source a wide enough range of executed trade data and committed quote data to get sufficiently complete results to pass the compliance test.”

Quality of data for NMRFs is one issue, managing it is another. Satinder (Sid) Jandu, director at Viewset and a former FRTB project manager at a Tier 1 bank, said: “This is what has people running for the hills, because this is the most difficult part.”

Another challenge is the P&L attribution test, a fiercely debated topic under FRTB. The test assesses the difference between the P&L calculated by the front office and the P&L calculated by risk. A gap should in theory indicate a breach, but effectively calculating that gap could be immensely expensive. “If you don’t nail down the technology now, it is going to be really costly in the future. Think about it carefully, look for new technologies, and invest time in making the right decisions,” said Ruiz.
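
As a rough illustration of the kind of comparison involved, the sketch below computes unexplained P&L (risk model P&L minus front-office P&L) for a desk and two simple ratio metrics in the spirit of the 2016 FRTB text. The data is synthetic and the thresholds shown are illustrative assumptions, not the definitive test.

```python
import numpy as np

# Simplified P&L attribution comparison: how closely does the risk model's
# P&L track the front-office (hypothetical) P&L for a desk?
def pla_metrics(hypothetical_pnl, risk_theoretical_pnl):
    unexplained = risk_theoretical_pnl - hypothetical_pnl
    mean_ratio = unexplained.mean() / hypothetical_pnl.std()
    var_ratio = unexplained.var() / hypothetical_pnl.var()
    return mean_ratio, var_ratio

rng = np.random.default_rng(1)
hypo = rng.normal(0, 1_000_000, size=250)           # one year of daily desk P&L
risk = hypo + rng.normal(0, 150_000, size=250)      # risk model tracks it closely

mean_ratio, var_ratio = pla_metrics(hypo, risk)
breach = abs(mean_ratio) > 0.10 or var_ratio > 0.20  # illustrative thresholds only
print(round(mean_ratio, 3), round(var_ratio, 3), "breach" if breach else "pass")
```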

So, what should firms be doing to prepare for FRTB implementation? First, engage a risk liaison team to help you interpret the regulations. Second, decide who is running the programme, make sure you engage the front office properly, and remember that not all solutions are IT. Ruiz said: “You cannot do this in isolation. You have to engage with other teams – with risk, with the front office, with finance – and the sooner you do this, the better. Delivering in phases is also really important. FRTB is a humongous programme and it touches nearly everything in the bank. If you don’t set up tangible phases with achievable targets that can be delivered in reasonable timeframes, it is very easy to get lost in the jungle.”

Right now, industry sentiment appears pretty pessimistic. In the final poll of the webinar, 89% of participants suggested FRTB will result in the withdrawal of some products, while 78% think it will mean higher costs of trading and 44% expect the closure of some trading desks. Some final advice from Rank-Broadley: “Start early, don’t underestimate how much time a proper FRTB evaluation will take.”

Related: 
Webinar Recording: Solving the data management challenges of FRTB
Author: ateamgroup
Posted: July 2, 2018, 11:45 am

North American financial services provider BMO Capital Markets has partnered with multi-client regulatory platform JWG to access the firm’s AI-powered, natural-language processing change management system, RegDelta.

The system aims to transform how global regulatory change is managed by curating a holistic data set, enriching it with sophisticated data science, and providing SaaS technology to the enterprise, enabling clients to retain control of their obligations efficiently and without drowning in mountains of paper. The platform, hosted on AWS, links the firm’s business operating model directly to its relevant obligations, creating a standardised, centralised repository to manage global regulatory change and mitigate risk.

“We are delighted to welcome BMO to the RegDelta fold,” said JWG CEO PJ Di Giammarino. “JWG are harnessing the power of an artificial intelligence technique called natural language processing (NLP) to replace manual efforts to map regulators’ changing requirements to business models in an efficient and traceable manner. By implementing industry standards, we are now quickly integrating our proprietary intelligence in a way that our clients control.”

Author: ateamgroup
Posted: June 28, 2018, 1:39 pm

Money.Net, a web-based platform for financial data, news and analysis, has selected OpenFin technology to deploy and deliver its software to global financial institutions. The company has also become a member of the Fintech Open Source Foundation (FINOS), a non-profit foundation promoting open innovation in financial services, and OpenFin’s Financial Desktop Connectivity and Collaboration Consortium (FDC3) initiative, which aims to drive standards for desktop application interoperability.

Morgan Downey, CEO of Money.Net, says: “Joining the OpenFin ecosystem gives us an opportunity to sell content directly to customers on the OpenFin platform, an entirely new channel to deliver, and monetise, innovations to some of the world’s largest companies. By presenting Money.Net content through a secure and compliant platform, we’re becoming part of the workflow of the modern worker.”

The company says the Money.Net desktop application, Excel add-in, Matlab integration and mobile apps provide global multi-asset class coverage – equities, commodities, FX, fixed income – at one-fifteenth of the cost of a legacy terminal.

Mazy Dar, CEO of OpenFin, says: “When a user adds Money.Net to their workflow, they’re getting real-time content that informs better decision-making and ultimately drives better performance. And that translates to the kind of results customers expect in the future of work.”

OpenFin is a secure desktop operating system built for capital markets and used by more than 50 of the world's largest banks, buy-side institutions and trading platforms to support digital transformation. It standardises the operating environment for hundreds of industry applications on financial desktops, allowing seamless deployment, interaction and interoperability between apps to enhance workflows and increase efficiency. OpenFin includes hundreds of apps that can now be leveraged by Money.Net.

Author: ateamgroup
Posted: June 27, 2018, 2:09 pm

By Tim Lind, Managing Director at DTCC Data Services

A decade on since the global financial crisis, large swathes of financial regulation aimed at addressing the weaknesses exposed during that time have come into force. Among many lessons learned in terms of market risk, there were two specific gaps that regulators wanted to address: ensure risk model sensitivities are appropriately calibrated to account for tail risk; and ensure banks are sufficiently capitalised to account for illiquidity in OTC asset classes. The Fundamental Review of the Trading Book (FRTB) published by the Basel Committee on Banking Supervision (BCBS) in January 2016 seeks to close those gaps.

FRTB represents the next phase of market risk capital rules and introduces a new set of data challenges that banks must overcome in order to avoid dramatic increases in capital allocation. The challenge for banks is that internal models used to assess risk will be dependent on trade data that is difficult to obtain, particularly for illiquid instruments.

This article outlines the data challenges that banks face and how best they can be addressed, as well as the imperative for market participants to start preparing now for the internal model application phase that will be introduced in 2019 to ensure readiness for full FRTB implementation in four years.

The fundamental aim of FRTB is to address issues in the market risk capital framework that surfaced during the global financial crisis. In short, banks will need to allocate more capital to less liquid instruments with higher risk profiles. The framework requires banks to provide evidence of sufficient liquidity across market risk factors related to the positions in their trading book, including those that are capitalised using approved internal models.

These models can only use risk factors that meet certain real-price criteria, and each of these factors must be supported by a minimum of 24 ‘real price observations’ a year, with a maximum of one month between two observations. A price qualifies as ‘real’ when it comes from an actual trade between arm’s-length parties or from a committed quote.
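
A hedged sketch of this eligibility check is shown below: count the observations and verify the maximum gap between consecutive dates. It is illustrative only; ‘one month’ is approximated as 31 days and the observation dates are made up.

```python
from datetime import date, timedelta

# Sketch of the real-price eligibility rule described above: at least 24
# observations a year, with no gap longer than roughly one month.
def is_modellable(observation_dates, min_obs=24, max_gap_days=31):
    dates = sorted(set(observation_dates))
    if len(dates) < min_obs:
        return False
    gaps = (b - a for a, b in zip(dates, dates[1:]))
    return all(gap <= timedelta(days=max_gap_days) for gap in gaps)

# A bank's own trades may be too sparse, while a pooled set of observations
# contributed by several banks can satisfy the same test.
own = [date(2017, 1, 1) + timedelta(days=30 * i) for i in range(12)]
pooled = [date(2017, 1, 1) + timedelta(days=14 * i) for i in range(26)]
print(is_modellable(own), is_modellable(pooled))
```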

Risk factors that cannot be incorporated into internal models, which can constitute more than 50% of all risk factors in some cases, are known as non-modellable risk factors (NMRFs). NMRF requirements under FRTB will potentially mandate large increases in capital that banks must maintain for market risk purposes (market risk capital).

Banks have an opportunity to reduce their market risk capital charges by using pooled observable transaction data to demonstrate that associated risk factors meet the real-price standards under FRTB. One of the unique factors of FRTB is that it will require unprecedented collaboration between banks and data suppliers to capture and normalise the maximum number of trade observations to reduce their overall number of NMRFs.

Concerns around NMRF requirements were raised by market participants during the BCBS consultation process. However, the latest update from the Committee, published in March 2018, states that there has been no compelling evidence in the form of actual data to support these concerns and that without this the BCBS does not propose revisions to the NMRF rule. Based on this stated position, market participants should start to prepare now for FRTB with the mindset that it is highly unlikely that the proposed risk factor rules will change significantly from where they stand currently.

No individual bank, regardless of its sophistication and risk infrastructure, can solve the challenge of NMRF requirements on its own. In 2017, DTCC conducted a new ‘Real Price Data Study’ which analysed 10 billion over-the-counter (OTC) derivative transactions. It revealed that by using pooled observation data, dealers have the potential to realise a 50% or greater reduction in non-modellability (by notional) in credit, rates and FX, and a 20% or greater relative reduction in equity positions.

Further, the research found that industry data pools demonstrate significantly higher levels of modellability than individual firm data. Based on this clear evidence, the optimal model to mitigate the impact of FRTB market risk capital charges is a collective one where banks pool their data to prove that associated risk factors meet the real-price standards. No actual pricing of individual contracts will be exposed in this process, so pooling of trade data can be achieved while protecting the proprietary trading strategies of dealers.

Although the new implementation date for FRTB is 2022, market participants need to start preparing for FRTB now as 2019 marks the first phase of the thematic review of the internal models approach (IMA). That phase involves the consultation, planning and creation of actual internal model applications, for which banks will need real-price observable data.

This process will also enable market participants to understand how increased capital allocations will impact their overall trading structure and understand their overall P&L attribution for individual trading desks. In some cases, the cost of capital may outweigh the revenue in trading certain asset classes or markets, which could lead to banks withdrawing liquidity from the market. These types of strategic decisions require time and implementation needs to be completed by the time the capital accord comes into effect, which is why we are now seeing FRTB programmes at many banks in high gear mode.

If market participants are to properly calibrate market risk capital requirements under FRTB, they will need access to pooled price observation data, which will maximise achievable modellability while minimising non-modellability. Further, given that the FRTB rules have the potential to impact the products in which banks continue to make markets, there is a need for them to start the planning process now to ensure readiness for the internal model application phase. Without the requisite data and ample preparation time, the challenges posed by FRTB may be hard to overcome.

FRTB represents a significant evolution in risk management to close gaps in existing methodologies. However, if not properly implemented it could have negative impacts not just on banks, but on the market and economy in general. If higher capital charges result in banks withdrawing liquidity from the market, the cost to manage credit, currency, interest rate, and commodity risk will increase for end users such as corporations, insurance companies, and asset managers.

The key lesson we should have learned 10 years ago is that the origin of crises begins when market participants withdraw liquidity from the market. Preparation, focus, and attention on data are going to be essential to ensure FRTB achieves the desired outcomes and doesn’t actually lead to higher costs to manage risk for the institutions that need it most. The stakes are high and the entire economy will certainly bear the consequences if FRTB is not appropriately applied and provisioned.

Show Author Info?: 
No
Author: ateamgroup
Posted: June 26, 2018, 5:36 pm

The Derivatives Service Bureau (DSB), which issues ISINs for OTC derivatives under MiFID II, has set up change and challenge processes. The change request process addresses issues with existing ISINs or with the product definition templates supplied by the DSB for the creation of OTC derivative ISINs. The challenge process allows users to challenge existing ISINs, for example on the basis of an inconsistent underlying ISIN, using a web-based form.
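One simple consistency check a user might run on an underlying ISIN before raising a challenge is the ISO 6166 check digit, a Luhn-style checksum over the code’s digit expansion. The sketch below is a hypothetical illustration only and is not part of the DSB’s own validation; the example identifiers are an ordinary equity ISIN and a deliberately corrupted copy.

    def isin_checksum_ok(isin: str) -> bool:
        # Luhn check over the digit expansion of the full 12-character ISIN,
        # with letters converted to numbers (A=10 ... Z=35).
        if len(isin) != 12 or not isin.isalnum():
            return False
        digits = "".join(str(int(c, 36)) for c in isin.upper())
        total = 0
        for i, d in enumerate(reversed(digits)):
            n = int(d)
            if i % 2 == 1:      # double every second digit from the right
                n = n * 2 - 9 if n * 2 > 9 else n * 2
            total += n
        return total % 10 == 0

    print(isin_checksum_ok("US0378331005"))  # True: well-formed check digit
    print(isin_checksum_ok("US0378331006"))  # False: corrupted check digit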

While firms within the scope of MiFID II say the DSB is doing a better job than previously expected of issuing ISINs for OTC derivatives, and that all is going relatively smoothly, the DSB acknowledges that some changes are inevitable.

To help users request changes, it has published a change request document that defines in detail the process for taking a change request from initiation through to release. It also provides examples of change scenarios.

The document notes the DSB’s aim to follow a common change request process that gives participants a clear understanding of the state of a request and the further steps to be taken in addressing an issue, and states: “The change request process set out in this document is expected to evolve as industry use of the DSB service matures, therefore this should be considered a living document.”

Show Author Info?: 
No
Author: ateamgroup
Posted: June 26, 2018, 10:58 am

Deutsche Bank has selected SmartStream Technologies’ Centre of Excellence (CoE) to provide an off-site operations platform including three services that will allow the bank to streamline, simplify and reduce the costs of its reconciliations environment. The decision to use the SmartStream managed service utility is part of the bank’s transformation programme.

The three managed services to be provided by SmartStream to Deutsche Bank cover reconciliations onboarding, production support and operational reconciliation services.

The services should make the processing of reconciliations faster and more cost-effective. Integration of the overall service will follow a phased approach, through which Deutsche Bank expects to increase productivity and reduce costs via mutualisation.

SmartStream’s Transaction Lifecycle Management (TLM) Reconciliations Premium solution performs Nostro and securities reconciliations, as well as intersystem reconciliations, at Deutsche Bank. These processes will be transferred to the CoE. Further efficiencies will be gained by reducing the complexity of systems, processes and technology, while creating a governance structure to define best practices for all reconciliations at the bank.

Bobby Handa, head of the global reconciliations group at Deutsche Bank, says: “This step is part of our journey to reduce the complexity of our IT environment. Modernising our reconciliation processing is critical to increasing productivity, reducing costs, as well as meeting regulatory requirements. Our aim with the CoE is to apply continuous process improvement across our business lines, as well as to minimise and eliminate any likely risks.”

Haytham Kaddoura, CEO at SmartStream, adds: “As a team, we are working to provide a complete and sophisticated end-to-end reconciliations process that is designed to give the bank a competitive edge with centralised reconciliation services. This will enable growth, provide stability and mitigate risk.”

Show Author Info?: 
No
Author: ateamgroup
Posted: June 21, 2018, 4:48 pm

The European Securities and Markets Authority (ESMA) confirmed today that the six-month delay to the mandate requiring the use of Legal Entity Identifiers (LEIs) for all issuers and counterparties to transactions under MiFID II will not be extended.

The initial delay from January 3, 2018, when MiFID II took effect, to July 3, 2018 was designed to smooth the introduction of LEIs after ESMA found that not all firms requiring the identifiers had obtained them in time for the MiFID II start date.

ESMA and National Competent Authorities (NCAs) say they have since observed a significant increase in the LEI coverage of both issuers and clients. Based on these observations, ESMA and the NCAs have concluded that there is no need to extend the initial six-month period.

Instead, NCA activity with respect to LEI requirements is shifting from monitoring to ongoing supervisory actions. To ensure a high degree of supervisory convergence and the full application of MiFID II, ESMA and the NCAs are coordinating the development of an appropriate and proportionate common supervisory action plan focused on compliance with the LEI reporting requirements under respective regulatory provisions.

Commenting on the ESMA decision from an industry perspective, Larry Thompson, DTCC vice chairman, says: “ESMA’s announcement that no additional forbearance will be afforded to market participants means they need to make it a priority to apply for their LEIs ahead of the July 2 expiry date. Firms outside Europe that transact in European markets must also put the necessary measures in place to comply with the MiFID II LEI requirement by this time, otherwise they won’t be able to trade with European counterparties.”

Show Author Info?: 
No
Author: ateamgroup
Posted: June 20, 2018, 12:37 pm

The Derivatives Service Bureau’s (DSB) Technology Advisory Committee (TAC), set up earlier this month, will hold its first meeting next Wednesday, June 27, 2018. Topics on the agenda range from infrastructure issues concerning the stability and resiliency of the DSB platform to usability issues related to download formats and increased efficiency. Many of the topics come from industry feedback on the DSB’s initial industry consultation on the 2019 fee model and user contract.

Marc Honegger, TAC sponsor and board member of the DSB, which is a subsidiary of the Association of National Numbering Agencies (ANNA), says: “As an industry utility run on a cost-recovery basis, we look forward to receiving the TAC's views on the appropriate level of investment in technology for the DSB to meet its responsibilities as a critical market infrastructure.”

Sassan Danesh, member of the DSB management team and the designated DSB officer of the TAC, adds: “In launching the DSB, we were focused on enabling derivatives markets to meet the MiFID II reporting requirements for OTC derivatives. We succeeded in that goal. Now we are addressing the refinements requested by DSB users. The TAC represents a substantial cross-section of stakeholders, but we also encourage other members and stakeholders to view the webinar to ensure that their interests are being addressed.”

In the interests of transparency, TAC meetings are public and observable through webinar access.

Additions and changes to the TAC membership since it was announced earlier this month include:

  • Rajiv Malik, vice president, JP Morgan
  • Henrik Martensson, markets CTO office, SEB
  • Elodie Cany, director, technology product development, Tradeweb
  • Rabobank has changed its representative to James Brown, delivery manager, IT systems
  • Thomson Reuters has changed its representative to David Bull, head of FI content management
Show Author Info?: 
No
Author: ateamgroup
Posted: June 20, 2018, 10:15 am

Contact Us

Epsilon Consulting Services
90 Broad Street, Suite 2003
New York, NY 10004

(347) 770-1748
(212) 931-0572

Careers

If you are interested in joining Epsilon, a financial consulting firm in New York City, please visit our Careers page to view open positions and submit a resume for consideration. See our service areas page for the specific locations where we provide consulting services.