
Hot Topics

Data Management Review

Data Management Review (formerly Reference Data Review) is your single destination for knowledge and resources covering data management approaches, trends and challenges as well as all the regulations impacting financial data management for the enterprise.

The New York edition of A-Team Group’s highly successful RegTech Summit is less than a week away and the industry is gearing up to converge on New York for an event filled with senior speakers, expert practitioners and cutting edge innovators in the RegTech space. Ahead of the big day, we stole a few moments with headliner Peter Moss, CEO of the SmartStream Reference Data Utility (RDU), to get a preview of his keynote speech on the challenges facing regtech as the industry evolves.

With the provocative title: ‘Garbage in, garbage out – why Reg “Tech” is only half the answer’, Moss is not shy of asking the hard questions. So how does he think the industry is likely to evolve? He says: “The RegTech evolution has been great; it has been a breath of fresh air for the industry. But - and this is a big but - as MiFID II has proven, accurate data is the foundation upon which everything is built.”

Moss’s presentation will explore just why it is so important to get the data right, and why an appropriate focus on that data is crucial in order to ensure regulatory processes function efficiently. He will also look at MiFID II as an example of how regulatory approaches are likely to evolve around the world.

“There is a new focus on getting data from the market and making sure that there are clear standards for collecting that data,” he told DMR. “That involves working with all market participants to ensure they understand what is needed and have the right interpretation of attributes. But it will also mean some big changes, such as requiring players to send in data rather than specific regulatory reports. MiFID II has actually taken a big step in that direction already, but we expect it to become a more general trend globally.”

So how can you ensure that you are prepared for this shift, and have the required focus on data standards?

To find out, register for A-Team Group’s RegTech Summit in New York City on November 15, 2018.

Show Author Info?: 
No
Related: 
RegTech Summit - New York City, 15th November 2018
Author: ateamgroup
Posted: November 8, 2018, 5:51 pm

RegTek.Solutions has formed a strategic partnership with Deutsche Boerse (DB) to deliver a certified testing and pre-validation service for clients of its ambitious new Regulatory Reporting Hub – the latest move in a major drive by the exchange to dominate the European regulatory reporting landscape. The deal represents the start of what is expected to be a substantial series of collaborations between the two organisations, as they combine their expertise to kickstart DB’s global goals.

DB has already been on a serious expansion drive over the past year as it seeks to capitalise on Brexit uncertainty and overtake London Stock Exchange (LSE)-owned LCH Clearnet as Europe’s leading clearing house. The German exchange operator reported a 15% jump in third-quarter group revenues and a 10% increase in net income, with particularly strong performances from its OTC clearing business (which saw revenues jump 123%) and its Eurex clearing house, which processed €4.3 trillion in Q3, up from just €315.9 billion the previous quarter. CEO Theodor Weimer has since confirmed that the group is actively seeking new M&A opportunities – with fixed-income, commodities, foreign exchange and data all key areas of interest.

To support this activity, the exchange has underpinned its expansion with an aggressive move into the regulatory reporting space: seeking to leverage its existing data handling capabilities into a one-stop-shop assisting clients to meet the onslaught of global regulatory reporting requirements of the past decade. Its Regulatory Reporting Hub was first launched in 2017, bundling all the Deutsche Boerse solutions for regulatory compliance onto one platform serving both buy-side and sell-side clients as well as corporates and trading venues. The exchange reports to 25 National Competent Authorities (NCAs), covering multiple regulations.

So where does RegTek.Solutions come in?

The relationship first started well over a year ago, when DB partnered with the software solutions provider and its parent company, consultancy firm Risk Focus, to provide key system components for the Regulatory Reporting Hub’s OTC Trade Reporting solution, including RegTek.Solutions’ Validate.Trade product.

In September 2017, DB joined forces with London-based VC investor Illuminate Financial to pump US$5 million into RegTek.Solutions. “While we have already been working with the team for some time to enrich our Regulatory Reporting Hub, the investment will help us deepen our relationship with RegTek.Solutions and enable the firm to service clients required to validate, report and reconcile trade data,” said Ankur Kamalia, MD & Head of Venture Portfolio Management and DB1 Ventures at Deutsche Boerse, at the time of the deal.

Last week’s partnership announcement marked the first official collaboration since the investment – offering over 2,300 clients of DB’s Approved Reporting Mechanism (ARM) service access to RegTek.Solutions’ new pre-validation platform. The solution will add enhanced testing and data quality remediation capabilities, as well as pre-submission validations, for all transactions destined for any of the 25 NCAs to which DB is connected. It is pre-configured to incorporate regulators’ expectations on data format and content in addition to DB’s API specifications, and will support a broad range of validations including regulation-specific validations, repository acceptance criteria, messaging standards, enhanced quality validations, and best practice. It will also enable firms to write and customise their own proprietary rules, avoiding a one-size-fits-all scenario.
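As a rough sketch of how layered pre-validation of this kind can work in principle (the field names, rules and error messages below are invented for illustration and are not RegTek.Solutions’ actual product or API), a validation pass can be modelled as a chain of rule functions applied to each transaction before submission, combining regulator-style format checks with a firm’s own proprietary rules:

```python
# Illustrative sketch only: fields, rule sets and messages are hypothetical.

def regulator_format_rules(txn):
    """Regulator-style format checks, e.g. an LEI must be 20 characters."""
    errors = []
    if len(txn.get("lei", "")) != 20:
        errors.append("LEI must be exactly 20 characters")
    if txn.get("quantity", 0) <= 0:
        errors.append("quantity must be positive")
    return errors

def proprietary_rules(txn):
    """A firm-specific rule, avoiding a one-size-fits-all scenario."""
    errors = []
    if txn.get("desk") not in {"RATES", "FX", "CREDIT"}:
        errors.append("unknown trading desk")
    return errors

def pre_validate(txn, rule_sets):
    """Run every rule set; only error-free transactions get submitted."""
    errors = [e for rules in rule_sets for e in rules(txn)]
    return (len(errors) == 0, errors)

ok, errors = pre_validate(
    {"lei": "X" * 20, "quantity": 100, "desk": "FX"},
    [regulator_format_rules, proprietary_rules],
)
print(ok, errors)
```

The design point is that each layer (regulatory, venue, proprietary) stays an independent, customisable list of rules, so firms can append their own checks without touching the shared ones.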

“We are now able to offer our clients an additional service, which pre-validates their data before being sent to the Regulatory Reporting Hub. The result is a significant improvement in the quality of data we receive and submit for validation,” says Georg Gross, Head of Regulatory Services at DB. “The pre-validation process will allow our clients to identify any deviation from ESMA’s guidelines even earlier in the life cycle of their transactions.”

This kind of work is nothing new for RegTek.Solutions, which already offers a similar service to other reporting venues, although not necessarily on the same basis. For example, it has a solutions agreement in place with the Depository Trust & Clearing Corporation (DTCC) in the US, helping clients with the integration, quality and accuracy of data that goes into the DTCC’s global trade repository. But the DB deal represents collaboration on a higher scale.

“This partnership is different because we are getting an advanced level of integration with DB,” says RegTek.Solutions CEO Brian Lynch, speaking to DMR this week. “They are helping us to maintain the service level around the product, and they are working with us to actively push it out to market – it is a much closer relationship than the deals we have in place with other venues. It is not officially exclusive, but it is certainly unique.”

A key benefit of the partnership is the range of coverage it provides. The two firms come from opposite ends of the spectrum: RegTek.Solutions is a software provider that is able to put solutions directly into clients’ hands and work on their side of the fence – assisting them to aggregate, extract and identify data and improve the quality of their reporting. Clients can then deliver that data to the DB hub, a central service provider that offers a pure utility function and manages the larger-scale integration across NCAs.

“Between the two of us we can span the entire scope of the reporting challenge, so together we make a very fine team,” says Lynch.

RegTek.Solutions also offers the advantage of global scale, with a strong North American footprint around Dodd-Frank and Canadian trade reporting for OTC derivatives as well as an established presence in Australia, Japan, Hong Kong and Singapore. Although DB is currently primarily Europe-focused, the group is already taking steps to expand out into new markets – witness the launch on November 1 of its brand new European Energy Exchange (EEX) in Singapore as part of a bid to revamp its Asia business – and RegTek.Solutions could certainly help along the way.

So although the validation engine might be the first collaboration between the two firms, it is unlikely to be the last – and going forward we should be able to expect plenty of further activity.

“There is an incredible amount of synergy between what we do and what DB are aspiring to achieve,” confirmed Lynch. “This is just the first step in many opportunities to collaborate. Our entire suite of tools is complementary to the Regulatory Reporting Hub’s goals, so we certainly expect to continue the relationship and build it out over the course of the next year.”

Author: ateamgroup
Posted: November 8, 2018, 10:36 am

Following its expansion into asset management earlier this year, client lifecycle management (CLM) specialist Fenergo is branching out further with the creation of a private banking and wealth management division led by Sales Kinetics founder Steve D’Souza. With plans for an IPO in 2020 and a €100 million revenue target for 2019, the move is the latest step in Fenergo’s ambitious expansion plan that has already seen revenues double for 2018.

The expansion into the private banking space ties in with the firm’s strategic objective of broadening its client base beyond core institutional and corporate banking institutions and into a wider range of financial markets, tapping new sectors as it enhances and develops its product proposition.

Fenergo CEO Marc Murphy says: “The creation of this division demonstrates our commitment to growing the business even further. We will continue to invest in our core CLM platform and will now enhance its capabilities to better serve this new segment.”

D’Souza will be based in London and report directly to Murphy. He worked with Unisys, Odyssey, Parity, Prospero and TCA Consulting before setting up Sales Kinetics in 2008.

On the Fenergo expansion, he says: “The objective is to expand the scope of Fenergo as a provider outside the corporate banking space, and my goals are to complement that universal message. As a bank, why are you still buying point solutions in these sectors, when you should be looking at a bank-wide utility?”

The division will apply Fenergo’s existing cross-border regulatory strength to the private banking market, but according to D’Souza, some new relationships could also be on the cards. “There will be some areas where we will need to build more capability, and we could potentially consider partnering with other firms to provide that capability,” he notes.

The firm will initially play to its strengths with a focus on Tier 1 accounts, but plans to leverage additional offerings including managed services and no-code solutions in order to appeal to Tier 2 and 3 organisations as it grows.

D’Souza explains: “We are looking at helping the end client as well as the customer. There are a lot of challenges in this space – digitisation, challenger banks, robo-advisors, regulatory change – and these are all putting pressure on private banks to evolve their client onboarding strategy. We are absolutely going to be an engine for regulatory change across this sector, and this will fit into the end client’s journey and improve their digital experience by improving the regulatory processes of our clients.”

The private banking division might not be the last expansion that Fenergo has up its sleeve for 2018, and we understand there could be one or two more announcements coming out in the next few months.

Meantime, the company reports that it has doubled its year-on-year gross margin for 2018, with a large proportion of revenue resulting directly from recurring software fees. Revenues have also doubled, to €58 million for the year ending March 31, 2018, up from €30 million in 2017. The company has also made a marked turnaround in profitability, with a pre-tax profit recorded at €2.6 million.

Murphy is pouring all this straight back into the business – the firm has invested around €30 million over the past year to fund strategic expansion into key regions across Asia Pacific, Europe and North America, along with a further €10 million into research and development. It also has plans to hire an additional 300 staff by March 2019, bringing total headcount to over 1,000.

Author: ateamgroup
Posted: November 7, 2018, 1:35 pm

Asset managers are adopting advanced analytics and alternative data to generate alpha and support client acquisition and retention, and business operations. The technology favoured for advanced analytics is machine learning, although natural language processing is also in the picture and smart robotic process automation is in trials.

Element22, a boutique data advisory firm, details adoption of alternative data and advanced analytics in a report sponsored by UBS Asset Management. The report, 2018 Analytics Power, discusses the results of a survey of 20 asset management firms in North America and Europe with combined assets under management (AuM) of $14.8 trillion, almost 20% of global AuM.

It notes that the survey participants are at varying stages of a four-year journey to develop robust alternative data and advanced analytics capabilities, and that some firms have reached an inflection point in generating alpha, improving business operations and increasing client acquisition and retention with alternative data and advanced analytics.

Predrag Dizdarevic, founding partner of Element22, says: “The benchmark study reveals broad-based experimentation with advanced analytics and alternative data across all types of asset managers. The leaders are realising substantial value from their programmes, especially in alpha generation, and we expect this to grow in the coming years. Newcomers should be as aggressive as possible in ramping up their programmes, otherwise they risk falling insurmountably behind the leaders, which could be a key differentiator in the industry.”

Ulrich Koerner, president, UBS Asset Management, concurs, saying: “Amid an environment of downward pressure on fees, and an increasing shift from active to passive investment strategies, asset managers must find ways to differentiate themselves and remain competitive in the coming years. With more alternative data available than ever before, the most successful firms will likely be those that leverage advanced data analytics solutions across their business to generate value for themselves and their clients.”

Author: ateamgroup
Posted: November 7, 2018, 1:31 pm

Regulatory reporting and risk management solutions provider AxiomSL has signed a landmark deal with Openbank, the digital bank of Spain’s Santander Group and one of the world’s first fully fledged online institutions. AxiomSL will provide Openbank with central bank, capital, liquidity, trade, transaction and AnaCredit reporting on a global level, for both European and South American countries, through its strategic platform.

The agreement extends an existing relationship with Santander Group, which already uses AxiomSL’s regulatory platform to meet reporting requirements in Mexico and the US. Cristobal Miralles, chief technology officer and chief operations officer at Openbank, says: “AxiomSL’s ability to accommodate multi-jurisdictional and multi-faceted regulatory requirements was the key factor behind our decision to select its solutions. The platform will automate the full process from data capture to reporting submission to meet our central bank, capital and liquidity compliance requirements, saving us time and resources to focus on our core business activities.”

Ed Royan, CEO of AxiomSL EMEA, comments: “As regulatory requirements intensify, firms must adopt agile solutions to efficiently meet highly complex demands with ease and confidence. AxiomSL offers a single platform that can be used to tackle these multiple compliance requirements globally.”

The AxiomSL deal comes a year after an enterprise-wide overhaul at 22-year-old Openbank, which holds more than €6 billion in deposits. Launched in 1995 as a telephone-based banking offshoot of Santander, in 2017 the bank transferred all its IT assets and client transactions to the cloud and revamped its online and mobile presence to become Spain’s first fully digital bank, with the goal of attracting 30 million new digital customers by 2018.

Author: ateamgroup
Posted: November 7, 2018, 11:46 am

By Giles Nelson, Chief Technology Officer, Financial Services, MarkLogic

The cost of dirty data – data that is inaccurate, incomplete or inconsistent – is enormous. Earlier this year, Gartner reported that, on average, poor quality data cost an organisation $15 million in 2017. These findings were reinforced by MIT Sloan Management Review, which reported that dirty data costs the average business an astonishing 15% to 25% of revenue.

With global revenues of around $80 billion per year, just in investment banking, this means the cost of dirty data in financial services is astronomical. So, where does it come from and what can be done about it?

What’s the source?

Human error is a significant source. An Experian study found human error influences over 60% of dirty data. When different departments enter related data into separate data silos without proper governance, fouling of downstream data warehouses, data marts and data lakes will occur. Records will be duplicated and corrupted by errors such as misspellings of names and addresses. Data silos with poor constraints will also lead to dates, account numbers or personal information being stored in different formats, making them difficult or impossible to reconcile automatically.

Further, once created, dirty data can remain hidden for years, which makes it even more difficult to detect and deal with when it is actually found. Most businesses only find out about dirty data when it’s reported by customers or prospects – a particularly poor way to track down and solve data issues.

And, still in 2018, dealing with print is an issue for many financial services firms. The scanning, marking up and import of printed documents is a recipe for the introduction of errors.

Many organisations search for inconsistent and inaccurate data using manual processes because their data is decentralised and in too many different systems. Harvard Business Review reports that analysts spend 50% of their time searching for data, correcting errors and seeking out confirmatory sources for data they don’t trust. These processes tend to fall into the same trap as the data – instead of consolidated processing, each department is responsible for its own data inaccuracies. While this may work in some instances, it also contributes to internal inconsistencies between department silos. The fix happens in one place, but not in another, which just leads to more data problems.

The impacts of dirty data

All of these issues result in enormous productivity losses and, perhaps worse, a systemic loss of confidence in the data being used to power the business. The estimates above of revenue loss because of poor data seem extraordinary, but even if they represent the upper limit of the true cost, the impact is still very significant.

In a highly regulated industry, such as financial services, dirty data has an even greater cost. Missing, incomplete and inaccurate data can lead to the wrong trade being made, decisions taking even longer as further manual checks are carried out, and regulatory breaches occurring. MiFID II has, of course, placed significant extra burdens on financial firms to ensure their data is in order.

Cleaning up the mess

What can be done? Here are a few things that organisations having difficulty with dirty data should be thinking about:

  • Achieving one golden version of data has long been an objective. Be careful though – doing this for all the data in an organisation, without setting the whole data estate in concrete, is an impossible task.
  • Take a data-first approach, rather than model first. Cleaning up dirty data involves the removal of invalid entries, duplicates, combining previously siloed records etc. The path to clean-up can be incremental. Taking the conventional approach and imposing a data model first, before doing anything with the data, leads to less flexibility and more cost.
  • Start building confidence in the data. Too often, data is present in isolation, with no knowledge of its provenance – when it was created, its source system and whether it’s been combined with other data. This metadata is valuable in proving a data item’s worth and actually preventing dirty data in the first place.
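The incremental, data-first clean-up described above can be made concrete with a small sketch. The silo records, field names and formats below are entirely hypothetical; the point is the pattern of normalising inconsistent formats, collapsing duplicates, and attaching provenance metadata as each record is cleaned:

```python
from datetime import datetime

# Hypothetical records from two department silos: the same client account
# captured with inconsistent date and account-number formats.
silo_a = [{"account": "0012-3456", "opened": "03/11/2017"}]
silo_b = [{"account": "123456", "opened": "2017-11-03"}]

def normalise(record, source):
    """Clean one record and attach provenance metadata (its source system)."""
    account = record["account"].replace("-", "").lstrip("0")
    # Try each date format seen in the silos; normalise to ISO 8601.
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
        try:
            opened = datetime.strptime(record["opened"], fmt).date().isoformat()
            break
        except ValueError:
            continue
    return {"account": account, "opened": opened, "source": source}

merged = {}
for source, silo in (("silo_a", silo_a), ("silo_b", silo_b)):
    for rec in silo:
        clean = normalise(rec, source)
        key = (clean["account"], clean["opened"])
        # Duplicate detection: identical normalised records collapse into
        # one entry, with every contributing source kept as metadata.
        merged.setdefault(key, {"record": clean, "sources": []})
        merged[key]["sources"].append(source)

print(merged)
```

Here the two silo entries collapse to a single record once formats are normalised, and the retained `sources` list is exactly the kind of provenance metadata that helps prove a data item’s worth.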

In conclusion, it’s worth stopping dirty data slowing you down. The business impact of dirty data is staggering, but an individual organisation can avoid the morass if it takes the right approach. Clean, reliable data makes the business more agile and responsive, and cuts down wasted efforts by data scientists and knowledge workers. And remember that 25% potential loss of revenue. It’s there to be clawed back.

Author: ateamgroup
Posted: November 6, 2018, 5:39 pm

Following the runaway success of A-Team Group’s RegTech Summit in London last month, the New York event rolls into town next Thursday (November 15) with a knock-out line-up of speakers and a showcase presenting innovators offering brand new solutions to help firms meet their regulatory and compliance obligations.

Ahead of the event, we caught up with Abel Picardi, managing director, compliance at Bank of China, a panellist on the Industry Leaders session, to get a sneak preview in advance of the big day.

Q. What does regtech mean to you?  Why are you excited about it?

A. Regtech is an advanced set of automated tech tools that will assist financial institutions to manage and mitigate regulatory risks. I’m extremely optimistic that regtech will eventually replace old and ineffective systems, eliminate spreadsheets and Word documents, and facilitate the documentation of compliance developments, along with their implementation. It will also enhance the quality of analyses and reporting, both externally and internally.

Q. What is your biggest regulatory challenge?

A. Currently, there are two critical challenges: one is quality of data, the other is the astounding volume of regulatory requirements, which are difficult to manage using spreadsheets.

Q. Why is it such a challenge?

A. Changing behaviour and enhancing a culture where integration of best standards and practices should be the enterprise-wide objective. Streamlining of ineffective regulatory requirements needs to occur.

Q. What role do regtech providers have to play in helping you solve your challenges?

A. Regtech providers need to move away from the one-size-fits-all approach. They need to come up with solutions that are unique to a financial institution. Regtech providers need to be able to explain how the architecture of the tools employed will deliver an effective design and lead to accurate results.

Q. What regtech firms are you currently using or planning to implement?

A. We are looking at various firms and their tools to determine whether they are compatible with our current and future needs. We are looking at end-to-end solutions that will be able to provide a single view and prioritise all the risks impacting the bank.

Q. What other cool or interesting regtech firms have you seen out there?

A. We’ve seen firms possessing potential capabilities in providing the right tools to identify, assess and report all risks. It is difficult to tell whether they can work in our business environment.

Related: 
RegTech Summit - New York City, 15th November 2018
Author: ateamgroup
Posted: November 6, 2018, 1:21 pm

By Dennis Slattery, CEO, EDMworks

Change is constant and unrelenting, and the causes are many and varied: challenger banks, fintech, regtech, cost pressure, cryptocurrencies, regulation and emerging technologies are just a few examples, with many more on the horizon.

Organisations have to deal with change, so how can they do this and what is the best approach?

All financial services companies have data at their heart. The approach to dealing with change has to ensure that curation and use of data is improved and the value of data to the organisation and its customers is increased over time.

Tactical approaches generally fail to increase the value of data to the wider enterprise. In fact, tactical approaches often diminish value by creating duplication, which in turn creates doubt and lack of trust in data. Trust is the foundation of any financial firm, so loss of trust is a serious matter.

Data is of systemic value to an organisation, requiring an approach to managing data in response to a series of internal and external changes to also be systemic. In EDMworks’ latest white paper, ‘Transform your Data Governance by Redefining and Reskilling the Organisation’, we explore the challenges facing organisations, share new insights into data, and describe how organisations need to adapt systemically, at all levels. In this way, managing data becomes part of the DNA of the company and an intrinsic part of its culture and processes.

Author: ateamgroup
Posted: October 30, 2018, 11:29 am

The FCA has updated its progress on Brexit, saying it is on course to be ready for a hard exit from the EU and that it is important for both sides to coordinate to avoid disruption in the event of a no-deal situation. Its preference, however, is a permanent arrangement post Brexit that would allow for close alignment and equivalence with the EU, without the UK being a rule taker.

FCA chief executive Andrew Bailey set out the FCA’s latest views on Brexit last night during a speech at the City Banquet held at Mansion House. On the issue of a smooth transition or hard exit, he said: “We are prepared for a range of outcomes including an implementation period that smooths transition and a hard and sudden exit. It’s a lot of work, but I think we are on course. . . So I think we can handle it. But, as I have said before, we urgently need the engagement of our EU counterparts so that we can put in place memorandums of understanding (MoUs) and other important practical arrangements. This is not just a self-serving UK point; it applies to both sides. MoUs will support cross border supervision of firms and data sharing will support our ability to jointly oversee markets.”

Bailey said this type of regulator-to-regulator coordination is essential to minimise disruption in a no-deal situation, but noted the FCA’s preference for a better deal. He said: “Of course, there is a broader solution to removing cliff edges which is for both the UK and EU to commit to taking reciprocal equivalence decisions on each other’s regimes, as early as possible. Our work to onshore the EU rulebook demonstrates that on day one, the UK will have the most equivalent framework to the EU of any country in the world.”

Implementing the EU rulebook would maintain reciprocal equivalence across the EU, an option favoured by the EU and underlined by Bailey: “My own view is that the principle of proactively recognising equivalence makes a great deal of sense, and is consistent with the arguments put forward by the EU in the context of Brexit in terms of not constraining domestic choices. So, it ought to have broad support, except probably amongst those who take a more mercantilist view and are prepared to sacrifice the principle of open markets, with which I strongly disagree.

“I think that if we appropriately temper our approach to the domestic side of things by a commitment to seeking broadly equivalent outcomes, and our opposition to any suggestion of a race to the bottom – a bonfire of rules and the like – protections can be made to work quite effectively.”

Author: ateamgroup
Posted: October 29, 2018, 11:37 am

With MiFID II nine months on from go-live in January 2018, and the systematic internaliser (SI) regime having become mandatory for all firms within its scope in early September, a panel of regulatory experts at A-Team Group’s recent RegTech Summit in London reviewed how the regulation is playing out and discussed how to achieve sustainable compliance.

The panel was moderated by Gouri Khatua, regulatory consultant at Grant Thornton. Panel members included Nicholas Philpott, head of market structure at Standard Chartered Bank; Martijn Groot, vice president of product management at Asset Control; Peter Moss, CEO at the SmartStream RDU; and Malavika Solanki, member of the management team at the Derivatives Service Bureau.

The conversation started with a review of the SI regime and the role of regtech in helping firms decide whether to be SIs in particular securities, and if they are or become SIs, operate effectively and efficiently. Talking more broadly about MiFID II, the panel noted ongoing reference data challenges in providing pre- and post-trade transparency, and in meeting the data management requirements of best execution reporting.

On a more positive note, panel members remarked on how firms that are MiFID II compliant are beginning to look at opportunities to exploit data gathered to generate value for the business, and how they have used implementation to make internal improvements, such as applying common standards across global locations, which has proved particularly beneficial in achieving the requirements of best execution under MiFID II.

At an industry level, the panel said MiFID II has driven more structure into the market, presented potential for greater harmonisation of regulations across Europe and the US, and indicated how the future of regulation may play out, with firms sending well-structured and complete data to regulators and regulators making better use of the data.

Related: 
A-Team Group RegTech Summit Reviews the First Nine Months of MiFID II
Author: ateamgroup
Posted: October 26, 2018, 9:09 am

MiFID II was designed to increase market transparency and improve investor protection. So, how is it doing nine months in, what challenges remain – and there are plenty – and what are the positive outcomes for firms within its scope and the industry as a whole?

You can find out the answers to these questions and more by listening to this podcast of a panel discussion on developing sustainable MiFID II compliance at A-Team Group’s recent RegTech Summit in London.


Author: ateamgroup
Posted: October 26, 2018, 9:03 am

AxiomSL has partnered with Germany’s SKS Unternehmensberatung in a move set to increase its influence in continental Europe.

AxiomSL’s regulatory reporting platform is already compliant with both national and international requirements in Germany, and its collaboration with SKS, a provider of regulatory, risk and compliance management systems, is expected to enhance its market position. The firm also hopes to use its new partner’s regional expertise and functional capabilities to build out a wider presence across neighbouring countries such as Austria and Luxembourg.

The two firms plan to work together on client implementation projects, targeting a simplified process whereby regulatory reports are generated and submitted to the relevant authorities within reduced implementation timeframes.

AxiomSL EMEA CEO Ed Royan comments: “AxiomSL’s platform and track record of global success combined with SKS’s functional capabilities and regional footprint will enable firms to benefit from differentiated solutions and meet regulatory demands in a hassle-free, timely and cost-effective manner, leveraging a single platform and proven regulatory reporting and risk management technology.”

Author: ateamgroup
Posted: October 25, 2018, 2:47 pm

Financial crime is rising exponentially, requiring financial institutions to review and renew client onboarding, Know Your Customer (KYC) and Anti-Money Laundering (AML) processes. A panel discussion at A-Team Group’s recent RegTech Summit in London considered the challenges and inefficiencies of today’s onboarding, KYC and AML solutions and proposed how regtech deployment could improve the situation.

The panel was moderated by Denisse Rudich, strategic advisor in financial crime at Firedrake, and joined by Anu Ratan, senior global AML policy and advisory manager, Tier 1 banks; Targ Patience, group chief compliance officer at the Gibraltar Stock Exchange; and Aoife Harney, regulatory consultant at Fenergo.

The panel highlighted the inefficiencies of the manual processes used for client onboarding, KYC and AML, and pointed to the potential of regtech. Harney proposed using regtech to provide a centralised data repository, with technologies such as machine learning and artificial intelligence supporting process automation and accuracy on an ongoing basis.

Patience described the Gibraltar Stock Exchange’s work with regtech to provide a centralised client identity repository, but noted that it is necessary to know not only your clients, but also the businesses you are working with, which can be difficult given often complex business ownership structures.

Ratan discussed the issues of bringing regtech solutions into live and legacy onboarding, KYC and AML environments, and advised financial institutions to consider how regtech can fit into the environment, its cost and return on investment, and its effect on customer loyalty, before investment and deployment.

Listen to this podcast to hear the views of the panel on how to improve onboarding, KYC and AML as a means of fighting financial crime.

Author: ateamgroup
Posted: October 25, 2018, 2:40 pm

The challenges of financial crime are significant and becoming harder to resolve, but advances can be made by implementing regtech solutions to improve customer onboarding, Know Your Customer (KYC) and Anti-Money Laundering (AML) systems at the heart of due diligence.

This podcast comes from a panel discussion at A-Team Group’s recent RegTech Summit in London. It reviews the inherent inefficiencies of AML; how firms are using digital identities for faster onboarding, KYC compliance and an improved customer experience; and the potential and limitations of regtech to support the fight against financial crime.


Author: ateamgroup
Posted: October 25, 2018, 2:36 pm

A report from Thomson Reuters and the Association of Certified Anti-Money Laundering Specialists (ACAMS) notes that since the launch of US AML requirements for financial institutions in May 2018, firms have shifted human capital focus away from regulatory change management towards more efficient customer due diligence (CDD).

According to the 2018 Anti-Money Laundering (AML) Insights Report, the increased certainty provided by the Financial Crimes Enforcement Network’s (FinCEN) new CDD Rule has had a dramatic impact on the human resources strategy of financial firms. Over a quarter (28%) of survey respondents anticipate an increase in staffing for AML compliance purposes, compared to just 8% in 2017. This focus has resulted in a decrease in regulatory enforcement, with just 22% of organisations experiencing regulatory action compared to 31% the previous year.

Chris Maguire, managing director, Corporate Legal at Thomson Reuters, says: “Developing customer risk ratings is a key component of the CDD Rule. The most commonly used factors to develop the risk rating were customer activity, geographic location and political exposure, with politically exposed persons being the top standard measure of risk, as it was in the 2017 report. Organisations have also improved their collection and speed of gathering necessary information.”

The CDD Rule may continue to require substantial time and investment, but improving data management and quality, investing in new technology and process automation, and streamlining business processes are key areas of focus. The main obstacles in these areas are increased regulatory expectations, a shortage of properly trained staff and outdated technology.

Author: ateamgroup
Posted: October 25, 2018, 10:44 am

Element22 is bringing its data strategy, analytics and execution expertise to Europe with the establishment of a London office and appointment of Mark Davies, formerly CEO of Avox, as lead of the European practice.

The company was set up in 2014 in New York City by managing partner Predrag Dizdarevic, and has developed into a boutique consultancy, working with financial institutions to create value through the deployment of high-level business strategy, monetisation of data assets, and development of products and services.

Beyond strategy, Element22 works with clients on execution by supporting programmes including the design of data governance, development of advanced analytics platforms, and building of sustainable data quality solutions. Emerging technologies are increasingly critical, with Element22 bringing its experience of technologies such as cloud tools, data lakes, knowledge graphs, ontologies and advanced analytics to client projects.

The company’s London office has been set up in response to client and prospect requests for similar capabilities in the UK and the rest of Europe. It is already finalising negotiations on three engagements in Europe, which will be led by Davies as a partner at Element22.

Davies says: “This is a great time to be working in this space. New technologies are breaking through and many firms that are struggling with legacy data challenges want to understand how these technologies can help them. We can bring new world technologies to old world problems and deliver real benefits for our clients while helping them modernise.”

Davies has experience of partnering with Element22 in previous roles and notes the solutions the company delivers in the US are equally relevant to the UK and the rest of Europe. Like the US business, the European practice will work across the sell-side and buy-side with local and global clients, and also with firms servicing financial institutions. The locations will share expertise while Davies builds a team of data management, analytics and technology specialists with experience in financial services.

Predrag Dizdarevic, managing partner of Element22 Group, says: “Opening our London office reflects both our expanding client needs and our growth on the global scene. We are pleased that Mark Davies, who was one of our long-standing partners, has decided to join us and lead our expansion. We are looking forward to working together to bring data value to our clients.”

Author: ateamgroup
Posted: October 24, 2018, 8:41 am

DTCC has introduced Equity Kinetics, a data services product providing institutional investors with a comprehensive view of market activity across all US equity trading venues. The data is aimed at quantitative market participants seeking deeper insight into the US equities markets.

DTCC Equity Kinetics facilitates analysis of US equity market activity by providing a daily feed of trade data based on activity cleared through DTCC’s National Securities Clearing Corporation (NSCC) subsidiary. This data includes aggregated trade volumes for the market, the 10 most-active brokers, and an anonymous peer group of nine global brokers, by security and transaction type, covering buy activity and sell activity including sale, short sale and short sale exempt data. The service includes historical data from December 2011 onwards.

Tim Lind, Managing Director of Data Services at DTCC, said, “Post-crisis regulation and the related focus on increased transparency through transaction and trade reporting have led to a surge in demand for data generated from financial market infrastructures (FMIs) like DTCC. We capture and optimise data from our processing engines and data repositories to provide innovative solutions that help our clients address challenges related to risk management, capital adequacy, liquidity and market transparency. DTCC Equity Kinetics allows clients to gain greater insight into movements and trends across select market segments and asset classes.”

Author: ateamgroup
Posted: October 23, 2018, 9:59 am

Broadridge Financial Solutions is attracting interest in a ‘designed by users for users’ solution that aims to address industry, business and operational challenges around global asset servicing. Developed with a global Tier 1 bank, and cloud and software-as-a-service (SaaS) enabled, the solution streamlines corporate actions, dividend and coupon processing across multiple asset classes, business lines and regions by automating the full asset servicing lifecycle.

The Broadridge global asset servicing solution is designed to help firms mitigate the drawbacks of inefficient processes in asset servicing by standardising and automating processes for announcements, notifications, elections, accruals, entitlements, and settlements globally. It can support and enhance front-office activities through comprehensive data management and analytics, helping traders and portfolio managers mitigate losses and pursue revenue generation, for example through arbitrage opportunities.

Tom Carey, president of Broadridge global technology and operations, says: “The number of corporate actions is increasing across global markets, each one navigating a complex network of intermediaries and custodians. With fragmented systems and regulatory pressures increasing processing challenges, financial institutions need a modern solution that simplifies architecture, streamlines operations and improves risk management. Our solution simplifies technology for capital markets clients and helps back-office functions drive business and operational value.”

The solution is being deployed by investment banking, wealth and asset management business lines that, with one centralised platform, can view, manage and report across portfolios, events and global trading models, bringing visibility and transparency to the asset servicing lifecycle.

Author: ateamgroup
Posted: October 22, 2018, 4:55 pm

By Tim Lind, Managing Director, Data Services at DTCC

In recent years, it has been claimed that data has eclipsed oil as the world’s most valuable resource. Financial market infrastructures (FMIs) are on a constant search for ‘new oil’ and the value of data is certainly on the list of new services they are developing. Post-crisis reforms and the requirement for greater transparency in financial markets through new transaction and trade reporting rules have led to a much larger volume of data being captured within the financial system.

However, while there’s no shortage of data, what the industry really needs is insights. Therefore, the challenge for institutions collectively is to harness the millions of transactions that flow through their infrastructures and create actionable information that will enhance decision-making at all levels. FMIs play an important role in the provisioning of data for the industry and a best practice approach must be adopted to ensure standards of data quality, confidentiality and security are maintained.

FMIs are naturally intermediated in transaction flow and can play a critical role in capturing, aggregating and discovering value in the data assets that flow through their services, and returning that value to the participants who use that infrastructure. This involves developing innovative and unique data products that provide insight into areas such as market risk, liquidity assessment, trade decision support, capital management models in preparation for constantly moving frameworks such as the upcoming Fundamental Review of the Trading Book (FRTB) requirements, and benchmark valuation and trade data to support alternative benchmark rates to replace LIBOR.

Historical, aggregated and anonymised transaction data provide the baseline for quantitative and analytical models that consider patterns of liquidity and trade activity and can facilitate more effective decision making, which can lead to improved trading, asset allocation, price discovery, client service, collateral management and risk management. Due to the wide-ranging applications of historical transaction data, the opportunity to create value for the community of participants in the infrastructure is very significant. While transaction data are a historical record of what happened in capital markets, marrying them to other economic data can help develop predictive analytics, which is the holy grail of value-added data services.

That said, while FMIs host data, this does not mean they are by default data providers. First, a significant amount of data management technology and infrastructure is required to govern the process of data provisioning. This includes procedures to extract, aggregate, normalise, curate, store, encrypt, entitle, publish and support data services. Second, data services require an invoicing process and a legal/contractual infrastructure, as well as a function that will continue to develop and innovate them.

The availability of transformative technologies, including artificial intelligence (AI), cloud and machine learning, is creating opportunities while lowering the cost and technical challenges of bringing new content services to market. Cloud makes it possible to store massive amounts of data in environments where AI can quickly execute regressions that discover patterns, outliers and relationships in large data sets. AI is largely dependent on access to normalised and consistent historical data to support predictive models.

Clearing and settlement infrastructures are natural aggregation points for the highest-quality data available, and the transactions that flow through infrastructure services are essentially the historical record and barometer of capital markets. Combined with AI tools and the skills and imagination of data scientists, historical data provides the foundation of probability, prediction and, indeed, the next generation of alpha creation.

In those cases where the data provider follows strict guidelines on governance and ensures that the data and its supporting infrastructure are of the requisite quality, the data which they provide can help market participants make more informed decisions.

Ultimately, data services are built on a foundation of trust, and for data providers to be successful it is critical to establish and maintain that trust with all market participants. To preserve this trust, appropriate aggregation and anonymity rules should be applied.

It is crucial that the proprietary investment or trading strategy of any entity is not divulged or inferred by allowing proprietary information to be reverse engineered. Capital markets is an industry based on data and predictions. The value created for investors by financial services institutions is the core of this industry, therefore an appropriate balance must be struck between transparency and the protection of proprietary work of any individual firm.
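As a toy illustration of the aggregation-and-anonymity principle described here (the threshold, field names and logic below are hypothetical assumptions for the sketch, not any FMI’s actual rules), an aggregate might only be published when enough distinct firms contribute to it, so that no single firm’s activity can be inferred:

```python
# Minimal sketch: suppress any per-security aggregate contributed to by
# fewer than a minimum number of distinct firms, so published volumes
# cannot be reverse-engineered into one participant's trading activity.
from collections import defaultdict

MIN_PARTICIPANTS = 5  # hypothetical anonymity threshold

def aggregate_volumes(trades):
    """Aggregate volumes per security, dropping buckets with too few firms."""
    volume = defaultdict(int)
    firms = defaultdict(set)
    for t in trades:
        volume[t["security"]] += t["quantity"]
        firms[t["security"]].add(t["firm"])
    return {
        sec: vol
        for sec, vol in volume.items()
        if len(firms[sec]) >= MIN_PARTICIPANTS
    }

# Six firms trade XYZ (publishable); only one firm trades ABC (suppressed).
trades = [
    {"security": "XYZ", "firm": f"F{i}", "quantity": 100} for i in range(6)
] + [{"security": "ABC", "firm": "F1", "quantity": 50}]
print(aggregate_volumes(trades))  # → {'XYZ': 600}
```

The design choice mirrors the text: the aggregate itself is still accurate for well-populated buckets, while thinly traded buckets, where one firm’s strategy could be inferred, are withheld entirely.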

Post-crisis regulation and the focus on increased transparency through transaction and trade reporting have led to a surge in the volume of data available to market participants. As a result, FMIs have an increasingly important role in deriving value from the data assets that flow through their services and delivering that value to the market participants who use the infrastructures. Critical to their continued success is the ability to follow a best practice approach to data governance as well as privacy and proprietary issues.

Author: ateamgroup
Posted: October 22, 2018, 12:49 pm

Refinitiv’s proposal to close the Wrexham facility where it produces its Verified Entity Data as a Service – formerly the entity data service provided by Avox – looks like the beginning of a slippery slope as the company states that the proposal ‘has been made as we seek to focus our operations in fewer, larger centres as part of a global review of our operations.’ The statement continues, ‘This means we intend to close the site in Wrexham and relocate the work to larger operational centres in other locations as we look to drive results for our customers in a highly competitive environment.’

The company’s intended closure of the site puts 300 jobs at risk and is expected to transfer the entity data service to Bangalore, where Refinitiv (formerly Thomson Reuters Financial & Risk business) has operated a large data processing centre for about 10 years. It says it is in a consultation process with employees at Wrexham to consider whether there are other job opportunities for them within Refinitiv, and notes that some technology roles at Wrexham may be moved to Nottingham, where the company has a technology development centre.

Refinitiv’s proposal to close the Wrexham site has been met with considerable local concern, and questioned by former management and staff. In a LinkedIn post, Ken Price, formerly CEO of Avox, writes: “It is with sadness and frustration that I read about the former Avox business which Steve French and I started in 2002 in Wrexham, Wales being closed down and relocated to Bangalore by the new owners, Blackstone. The spreadsheet jockeys have completely missed the boat on this one. The benefits of the Wrexham operation far outweigh the marginal additional costs.”

Another post questions whether the high quality of data and communication with customers provided by the Wrexham team will be the same going forward. Refinitiv states, ‘Should our proposal move forward, we would focus on maintaining the high levels of quality to which clients have become accustomed.’

Author: ateamgroup
Posted: October 17, 2018, 12:42 pm

Contact Us

Epsilon Consulting Services

90 Broad Street, Suite 2003
New York, NY 10004

(347) 770-1748

(212) 931-0572

Careers

If you are interested in joining Epsilon’s financial consulting firm in New York City, please visit our Careers page to view jobs and submit a resume for consideration. See our service areas page for the specific locations in which we provide consultations.