“…We engaged Epsilon to be our strategic partner…. I am happy to say that Epsilon met or exceeded our expectations on this project. We successfully completed the RFP project and selected a system that was right for our business…”

Chief Financial Officer


Data Management Review

Data Management Review (formerly Reference Data Review) is your single destination for knowledge and resources covering data management approaches, trends and challenges as well as all the regulations impacting financial data management for the enterprise.

By: Martijn Groot, vice president, product management, Asset Control.

Enterprise data management, or EDM, has long been associated with the discipline of sourcing, mastering and distributing data that typically includes valuation data, instrument master data and entity data. However, major changes in business and regulatory reporting requirements and enabling technologies mean that each of the main steps in the EDM process is undergoing massive change, heralding nothing short of the demise of EDM as we know it.

Over the past few years, organisations have leveraged the increasing power of EDM systems to bulk-load and warehouse security master data, typically on a daily basis. But while effective, such models no longer meet regulatory requirements – for example, the demands set out in Markets in Financial Instruments Directive II (MiFID II). Both data and EDM suppliers are therefore evolving from the traditional end-of-day, file-based delivery of reference data towards a more targeted model, where individual data items are sourced on demand via application programming interfaces (APIs).

This is augmented by an increasing availability of content that is not available via the structured offerings from the enterprise data providers, which means an opportunity for data scientists to differentiate and extract new insights. The richness of raw materials to work with includes web crawling sources to mine news and spot corporate events, sentiment analysis, satellite and geospatial information, traffic and travel patterns and property listings. All this data will come with new indices and summary statistics which, when properly accessible, can be analysed and monitored for investment signals and risk management.

But what does this mean for end-to-end data management processes? How will organisations manage the complex mix of real-time, intra-day sourced data, new content and existing data resources; ensure information is used efficiently and effectively in servicing their business users; and satisfy the regulators?

Effective Data Capture

The key to successfully leveraging real-time data is a different approach from the classic EDM model of creating the security master data source. Unlike the daily download and reconciliation, systems must now be able to respond to real-time user requests for information. MiFID II regulation, for example, presents the need for organisations to retrieve a specific set of data including, but not limited to, International Securities Identification Numbers (ISINs) for OTC derivatives, and Traded on a Trading Venue (TOTV) and underlying assets Traded on a Trading Venue (uTOTV) flags for new as well as existing OTC instruments.

To support these needs, the ANNA Derivatives Service Bureau (DSB) announced its financial data service, which requires users to supply a product definition and further input attributes to the ANNA DSB via a RESTful API or the FIX protocol. In response, an EDM solution must capture client or instrument set-up requests in real time, verify whether they can be serviced from an existing data set – to prevent an unnecessary and costly hit on an external source – and only go out to the ANNA DSB for additional data when needed.
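The cache-first pattern described above – service the request from the existing master where possible, and only call the external source on a miss – can be sketched as follows. This is an illustrative sketch only: the endpoint URL, payload fields and response field are invented placeholders, not the actual DSB API.

```python
# Illustrative sketch: service an OTC ISIN request from a local master first,
# and only call an external (hypothetical) DSB-style REST endpoint on a miss.
# The URL and "ISIN" response field are assumptions for illustration only.
import json
import urllib.request

local_master = {}  # in-memory stand-in for the existing security master


def get_otc_isin(product_definition: dict) -> str:
    # Canonical cache key so equivalent product definitions match
    key = json.dumps(product_definition, sort_keys=True)
    if key in local_master:
        return local_master[key]          # serviced from existing data set

    # Cache miss: go out to the external source (the costly call, made once)
    req = urllib.request.Request(
        "https://example-dsb-api.invalid/v1/isin",   # placeholder URL
        data=json.dumps(product_definition).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        isin = json.load(resp)["ISIN"]

    local_master[key] = isin              # master the result for reuse
    return isin
```

In practice the local master would be a persistent store and the miss path would also capture the full DSB response for downstream distribution, but the control flow – check, then fetch, then master – is the same.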

Author: ateamgroup
Posted: January 18, 2018, 11:21 am

Brian Sentance has stepped down from the role of CEO at Xenomorph Software after 23 years at the helm and following a successful first close of a two-stage funding round. He is replaced as CEO by Ron Zeghibe, a director of Xenomorph Holdings since May 2016. Matthew Skinner, managing director EMEA, joins Zeghibe as part of the new management team.

Sentance founded Xenomorph with Chris Budgen and Mark Woodgate in 1995. The principles of the company were to make it easy for clients to manage and analyse large quantities of data, and do so with enough flexibility to match the ever-changing needs of financial markets.

In a farewell letter posted on LinkedIn, Sentance writes: “These principles have enabled the company to successfully navigate the various waves of technological, economic and regulatory changes that have challenged both us and our clients these past 23 years. Fast forward to 2018 and industry terminology such as ‘Big Data’ and ‘Self-Service Business Intelligence’ becoming mainstream has ultimately validated many of our original ideas, ideas that form a great foundation for Xenomorph to help its clients successfully meet the future data challenges of the industry.”

The company’s key product, TimeScape EDM+, has been developed as both a cloud and enterprise data management and analytics solution designed to support a number of regulatory, risk and valuation challenges.

Sentance concludes: “So what next for me? I will take a bit of a break, but I still hope to be involved in the finance and technology markets in a variety of capacities over coming months and years.”

Author: ateamgroup
Posted: January 17, 2018, 1:09 pm

Danske Bank UK has implemented Datactics’ RegMetrics data quality solution to improve its single view of customers in Northern Ireland. The bank concluded a successful proof of concept last summer, went live with RegMetrics this month, and is looking at additional use cases for the software.

The bank looked at three solutions to enhance its single customer view processes – Datactics RegMetrics, a vendor solution already implemented but not performing well, and in-house software that would need to be built out – before selecting Datactics on the basis of its ease of use, quality of data output and customer service. The initial use of the software will provide greater assurance of compliance with Financial Services Compensation Scheme (FSCS) requirements for accuracy of customer data. The bank also expects to achieve a better customer experience through higher levels of data quality.

Marion Rybnikar, head of data at Danske, says that beyond improving the quality and accuracy of customer data, the bank is considering using RegMetrics on a wider scale to cover other types of data, such as product information and transactional data. She explains: “RegMetrics is fast and easy to use, allowing us to properly cleanse and match our data to one single customer view ahead of submission to the FSCS. On top of this, its usability means our SMEs can develop data quality rules themselves – extending the functionality to multiple regulatory requirements and broader data quality and governance applications – and automatically generate meaningful interactive reports in Tableau, our house reporting tool. It’s the quality and accuracy of these outputs that ensures we can save time, improve our data and enhance the efficiency of our processes.”

Danske hopes to make further improvements to its mortgages and loans products through a collaboration with a data scientist at Queen's University Belfast, who will use the bank's improved customer data to support innovation. Looking forward, Rybnikar suggests the bank's UK robotics team could work with Datactics' DQM workflow interface and FlowDesigner data quality process designer to identify and then fix broken data in the bank's online portal. The software could also support automation across the bank.

Stuart Harvey, CEO at Datactics, says the company is working with a number of European retail banks on regulations such as FSCS and Section 17, and notes that while many banks are taking a one-off approach to compliance projects, Danske Bank is looking at a more innovative implementation of the company’s solutions. He says: “The bank is creating a central engine that cleans, matches and dedupes data to achieve a single customer view, but it sees this as a repeatable solution and can add value by using the data in upstream analytics.”
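The clean/match/dedupe engine described above can be illustrated with a toy example. Production tools such as RegMetrics use fuzzy matching and configurable survivorship rules; the sketch below is an invented, exact-match simplification purely to show the shape of the workflow.

```python
# Toy sketch of the clean/match/dedupe idea behind a single customer view.
# Normalise each record to a matching key, group records that share a key,
# and keep one "golden" record per group (naive first-record survivorship).
from collections import defaultdict


def normalise(record: dict) -> tuple:
    name = " ".join(record["name"].lower().split())       # trim, case-fold
    postcode = record["postcode"].replace(" ", "").upper()
    return (name, postcode)


def single_customer_view(records: list) -> list:
    groups = defaultdict(list)
    for rec in records:
        groups[normalise(rec)].append(rec)   # records sharing a key are duplicates
    # Survivorship: keep the first record of each group as the master
    return [recs[0] for recs in groups.values()]


customers = [
    {"name": "Jane  Smith", "postcode": "bt1 1aa"},
    {"name": "jane smith",  "postcode": "BT1 1AA"},   # duplicate of the above
    {"name": "John Doe",    "postcode": "BT2 2BB"},
]
```

Running `single_customer_view(customers)` collapses the two Jane Smith records into one, leaving two customer masters. Real matching must also handle typos, aliases and address variants, which is where fuzzy-matching rules come in.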

Author: ateamgroup
Posted: January 17, 2018, 1:01 pm

Tim Lind has returned to the front line of capital markets solutions as managing director of DTCC Data Services. At DTCC, he will be responsible for guiding the firm’s data businesses, including services that leverage data derived from DTCC’s global processing platforms, and ensuring ongoing alignment with risk management and regulatory requirements.

Lind previously headed regtech consultancy RTech Advisors, which he established towards the end of 2016 having left Thomson Reuters in a wave of redundancies. He worked at Thomson Reuters for six years, most recently as global head of financial regulatory solutions. Before joining Thomson Reuters, Lind was managing director and chief strategy officer at post-trade operations firm Omgeo, which was acquired by DTCC in 2013.

Tim Keady, managing director and head of DTCC Solutions, comments: “Tim brings to DTCC more than 25 years of domestic and international experience in capital markets, including a rich background in data offerings. We look forward to Tim’s contributions, working with our community to continue advancing our data strategy with an emphasis on reducing risk, enhancing data transparency and maximising value for our clients.”

Author: ateamgroup
Posted: January 17, 2018, 10:16 am

The Derivatives Services Bureau (DSB) set up to create ISINs required for OTC derivatives under Markets in Financial Instruments Directive II (MiFID II) has finalised fees for user contracts running from October 2, 2017 to December 31, 2018. It notes, ‘we are mindful that the smaller than expected number of users contributing to the DSB’s cost recovery results in an increase in individual user fees’.

Explaining the higher than expected user fees, the DSB, a subsidiary of the Association of National Numbering Agencies (ANNA), says a large number of investment firms have subscribed to the DSB’s free data services, which enable OTC ISIN lookups and downloads. Some trading venues, which originally discussed subscribing multiple Multilateral Trading Facilities (MTFs) and Organised Trading Facilities (OTFs) as fee-paying Power Users with the DSB, have so far contracted significantly lower numbers.

User fees recover the DSB's overhead costs. The total annual overhead on which the cost-recovery fees were calculated is €9.2 million, which is 4.8% higher than the €8.8 million previously stated. The additional sum reflects development and operating costs identified in Q4 2017 in response to regulatory imperatives and industry requests.

The fee calculation was based on contracts in force as of January 5, 2018 and the user categories those contracts represent (see table below). Excess revenues caused by additional contracts signed after January 5, 2018 will go to defraying user fees for the next contract year.

User Numbers

The differences between the preliminary and final annual fees are as follows:

[Table: user type; contracted firms as of 5 January 2018; preliminary annual fees; final annual fees]
The current proportion of cost-recovery payments by business sector is as follows:

[Table: total value and % cost recovery for investment banks, trading venues, and other sectors including asset management and data management]
Adding to the comment above about the increase in individual user fees, Emma Kalliomaki, DSB managing director, says: “We believe that all the user numbers will continue to grow. We are continuing to receive new inquiries for the DSB paid-for services from firms that are just realising they will be creating OTC ISINs, and we expect new users with the increase in systematic internalisers later in the year.”

On cost-recovery payments by business sector, she says: “The proportionately higher participation of banks, relative to trading venues in the cost recovery, validates the design of the OTC-ISIN as internally useful for business operations beyond satisfying reference data reporting obligations under MiFID II. In 2018, we will continue our collaborations with industry to ensure the DSB receives appropriate guidance on industry’s evolving needs.”

Later this year, the DSB will reopen the fee model consultation with the industry. The objective will be to refine the cost-recovery model for 2019, considering the data and usage patterns established in 2018.

Author: ateamgroup
Posted: January 16, 2018, 4:37 pm

With regulators taking a tough stance on non-compliance, enforcement actions running into billions of dollars, and compliance departments under stress, how can financial institutions pull back from the brink, improve control of compliance, and reduce regulatory risk? We caught up with John Byrne, CEO at Corlytics (and former CEO at Information Mosaic), to discuss how firms can improve compliance by employing regulatory risk intelligence and analytics.

Corlytics was founded in 2013 to provide regulatory risk intelligence to both financial institutions and regulators. A self-styled regtech, the company collects, normalises and analyses global enforcement data and other important regulatory information to give firms evidence-based intelligence that can be used to make better regulatory planning and execution decisions.

Byrne explains: “Post-trade activity is often seen as a cost, but most regulatory issues, such as fraud, often have a root cause in lack of controls in the middle and back office. Financial services firms invest in risk and compliance systems, many invest about 20% of IT spend here, but the outcome of regulation can still be horrific as middle and back office controls are starved of spend. Which is why we started Corlytics. People talk about regulatory risk, but we define and measure it on a risk weighted basis.”

Corlytics Controls Explorer gathers enforcement action, under licence, from regulators around the world. An action from the SEC, by way of example, could be about 200 pages of legal judgement. Corlytics categorises the information into 160 data attributes that are also used across UK Financial Conduct Authority (FCA) and Asian enforcement actions - to ensure it is making like-for-like comparisons. It then normalises the data before applying analytics to expose the root causes of enforcement actions.

This is no mean feat, with Corlytics employing a multidisciplinary team of legal analysts, risk professionals and data scientists to understand and analyse enforcement actions. The company uses a modicum of machine learning to match legal text and the 160 data attributes it uses in risk models developed in Python, and artificial intelligence bots to pick up any changes published by global regulators.
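The idea of mapping free legal text onto a fixed attribute taxonomy can be illustrated with a deliberately simple sketch. Corlytics' actual taxonomy of 160 attributes and its machine-learning models are proprietary; the attribute names and keyword rules below are invented purely for illustration.

```python
# Toy sketch of tagging enforcement-action text with categorical attributes.
# A real system would use trained models over legal text; this stand-in uses
# hand-written keyword rules against an invented three-attribute taxonomy.
ATTRIBUTE_KEYWORDS = {
    "market_abuse": ["insider trading", "market manipulation"],
    "aml": ["money laundering", "sanctions"],
    "reporting": ["transaction reporting", "late filing"],
}


def tag_action(text: str) -> list:
    """Return the sorted list of attributes whose keywords appear in the text."""
    text = text.lower()
    return sorted(
        attr for attr, words in ATTRIBUTE_KEYWORDS.items()
        if any(w in text for w in words)
    )
```

For example, `tag_action("Fine imposed for insider trading and late filing")` yields both a market-abuse and a reporting tag, which is the kind of normalised output that makes enforcement actions from different regulators comparable.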

The software is usually procured by compliance, internal audit, heads of non-financial risk, or chief control officers, an emerging function in financial institutions’ front-line defence. Byrne says: “Banks model credit and market risk, but do little around the biggest risk in the bank, legal and regulatory risk. Balance sheets often show about 100 people working on credit impairments and only three on legal issues with regs.”

By using Corlytics Controls Explorer, financial institutions can understand their regulatory risks, implement controls to reduce them or, at least, gain early warnings of potential risk.

The company has more than 10 clients split pretty equally between regulators and banks, and expects the balance to endure as client numbers rise in response to the desire of both regulators and banks to achieve beneficial outcomes from regulation.

Corlytics’ recent projects include work with the FCA to produce an intelligent regulatory handbook. The project applied a central, common taxonomy to all regulations, allowing material in the handbook to be tagged and machine read, thus turning a legal document into a searchable database. The company is also involved in an FCA sandbox working on how to reduce the risk associated with compliance modelling for regulated firms.

Author: ateamgroup
Posted: January 15, 2018, 4:29 pm

By: Doug Morgan, Group Chief Executive, Cordium

The past few years have been challenging for the compliance teams of financial firms and 2018 will be no different – if anything, the pace of evolution is set to accelerate on a number of fronts. From input gathered from clients and industry contacts, Cordium has pulled together a list of the 10 trends that investment firms should focus on in 2018 and beyond. In no particular order, these are:

  • Don’t assume MiFID II is done and dusted: The deadline for MiFID II compliance may have passed, but this package of EU regulation will continue to have a significant impact during 2018. While it appears many firms missed the initial deadline and extensions have been granted in some markets, we expect firms will need to continue to fix and update the programs they’ve implemented. There is also a high likelihood of a sequel to MiFID II, focused on addressing elements found to be problematic during implementation.
  • Prepare yourself for increased risk and capital rules: EU and Hong Kong-based investment firms will need to start implementing new frameworks designed to improve the way they manage risk. In particular, firms regulated by the Hong Kong Securities & Futures Commission now have to implement a new Fund Manager Code of Conduct, which will come into force in November 2018. EU firms are facing an even more significant set of changes with the implementation of a new prudential capital framework for investment managers. New rules could be in effect from as early as 2019.
  • Check your conduct: Regulators will increase their focus on monitoring the behaviour of individuals within financial firms through structured frameworks that hold individuals more accountable for their decisions and actions. For example, UK investment firms will need to prepare for a new senior managers and certification regime, expected to come into force in mid-to-late 2019. In Hong Kong, a similar regime was published in December 2016 requiring firms to submit their management structure and comply by April 16, 2018.
  • Be prepared for increased vigilance around market abuse: Regulators will continue to put real priority on stamping out market abuse and insider trading. For example, it’s likely the SEC in the US will issue some form of guidance around material non-public information (MNPI) during 2018, which will require firms to implement more specific policies and procedures. In Europe, there will be an increased focus placed on using the data generated by MiFID II’s transaction reporting – estimated at more than one trillion data points each year – to tackle market abuse.
  • Brace yourself for Brexit: Firms should be planning now for the UK’s EU withdrawal in March 2019, creating operational strategies which can be implemented depending on the specific outcomes of the negotiations. They need to closely examine a range of factors, including where current and future revenue streams will come from and how their supply chain might be impacted by any potential deal. It’s important to allocate senior management and board time to these issues and for the firm to engage with key external stakeholders. Firms need to plan during 2018 to ensure they are not only positioned to continue business as usual but also to prosper despite what Brexit brings.
  • Prepare for increased regulatory scrutiny: The sheer volume of rulemaking in the wake of the financial crisis – which occurred a decade ago – has been tremendous. With these rules mostly in place, regulators will focus on ensuring they are being properly adhered to. For example, the SEC has invested heavily in analytics and requested additional data from firms through regulatory filings, such as the recently amended Form ADV. Regulators are actively using this information to better understand the overall industry environment and to target firms with issues more selectively.
  • Create comprehensive cybersecurity processes: Cybersecurity will remain one of the most aggressive areas of regulatory evolution in 2018. Across the globe, governments and regulators are scrambling to implement new rules and improve existing frameworks for the management of the cybersecurity risk of financial firms. All firms will need to be able to evidence the specifics of their cybersecurity programmes to regulators.
  • Prepare for a clampdown on cryptocurrencies: Regulators have begun to state clearly how they will regulate the financial products and markets associated with cryptocurrencies. Their focus on this sector will continue to increase as firms look for safe ways to incorporate cryptocurrencies into their investment strategies. While monitoring the rapidly evolving regulatory activity in this space, all firms investing in cryptocurrencies need to be prepared to answer questions about risks, such as secure custody, and how they’re addressing existing compliance requirements.
  • Embrace FinTech and RegTech: Firms need to keep up on developments in this space to better understand how technology can help them comply with regulations in more cost effective ways. Many firms are already considering document-tracking solutions to ensure the firms’ compliance practices match their stated policies and to evidence these processes to regulators.
  • Reduce your reputational risk: Increased regulation creates the potential for greater reputational damage, so compliance teams need to think more strategically about how to mitigate this risk. This could involve regular reviews by a Chief Information Security Officer (CISO) to ensure data is adequately controlled and protected, providing a document solution to track compliance policy adherence and process completion, or installing a solution that captures MNPI from employees more effectively.

Reflecting a new normal, 2018 will bring new challenges for compliance and technology teams at investment firms. By taking fresh approaches to the methods used to achieve compliance and aligning programmes with their firms’ growth strategies, compliance teams can be more confident in confronting the changes which the new year brings.

Author: ateamgroup
Posted: January 15, 2018, 10:47 am

Thomson Reuters has fulfilled its commitment to clients within the scope of Markets in Financial Instruments Directive II (MiFID II) with the go-live of key services designed to help them achieve compliance.

Services that have been updated since the January 3rd, 2018 MiFID II compliance deadline include:

  • updated MiFID II-compliant data already available to clients from 57 global exchanges and eight new MiFID II trading and reporting venues, including Tradeweb’s Approved Publication Arrangement (APA) and MTS BondVision’s Multilateral Trading Facility (MTF);
  • an enhanced MTF, which went live on schedule with trading and MiFID II-compliant trade reporting taking place from the morning of January 3rd;
  • updated instrument reference data capabilities to ensure comprehensive coverage of the key financial instruments covered by the regulation, including the addition of 1.6 million new pre-fixed individual identifiers (ISINs) for over-the-counter (OTC) derivatives from the ANNA Derivatives Service Bureau (DSB), coverage of 300,000 new financial instruments, additional data for 900,000 existing instruments, and the addition of over 5 million records from the Financial Instruments Reference Data System (FIRDS).

These services are part of a comprehensive suite of MiFID II solutions designed by Thomson Reuters to help clients navigate compliance. The solutions include: enhancements to the company’s data analytics platform, Velocity Analytics, to support best execution compliance, transaction cost analysis and systematic trading; a partnership with VisibleAlpha and enhancements to Eikon to assist in compliance with research unbundling; the introduction of an APA connectivity solution for trade reporting requirements; a legal entity identifier (LEI) profiling solution; and enhancements to its tick history feed.

Debra Walton, global head of customer proposition at Thomson Reuters, says: “Implementing MiFID II has been a major test for the industry and we are happy to have played our part as a trusted partner to make it as easy as possible for firms to comply. The MiFID II journey doesn’t end here, and we will be working closely with our clients to help them meet the next phase of deadlines, such as best execution and the reporting requirements for systematic internalisers.

“As well as being a significant challenge, MiFID II means there is more data flowing through the global financial community than ever before, creating exciting opportunities for firms that can harness the data efficiently and discover profitable new insights.”

Author: ateamgroup
Posted: January 11, 2018, 5:51 pm

Optimising client onboarding and Know Your Customer (KYC) processes continues to challenge banks operating in a highly regulated and competitive market. The challenge is exacerbated by increasing requirements to track and understand entity hierarchies and ultimate beneficial ownership. The webinar will discuss the ongoing challenges of client onboarding and KYC, and how best they can be addressed. It will also detail requirements for ultimate beneficial ownership data.

Webinar Date: 
Tuesday, May 1, 2018 - 15:00
Author: ateamgroup
Posted: January 11, 2018, 1:46 pm

The 2018 regulatory agenda is in motion, with Packaged Retail and Insurance-based Investment Products (PRIIPs) going live on January 1st, and Markets in Financial Instruments Directive II (MiFID II) and Markets in Financial Instruments Regulation (MiFIR) taking effect on January 3rd. Looking forward, the compliance deadline for General Data Protection Regulation (GDPR) is May 25th, and although pushed back from a 2019 deadline, compliance with the Fundamental Review of the Trading Book needs attention this year.

With so many regulations on the agenda, we looked at their data management requirements and the increasingly critical need to take a strategic approach to compliance during a recent A-Team Group webinar, Data Management Requirements for the 2018 Regulatory Agenda.

An early poll of the webinar audience asked which regulation delegates expected to be the most onerous in terms of data management at their organisation. Some 44% said the MiFID II and MiFIR hangover, 41% GDPR, 10% preparing for FRTB, 2% Benchmark Regulation, and 2% PRIIPs.

The webinar speakers – Dessa Glasser, principal consultant, Financial Risk Group (FRG) and former CDO at JP Morgan Chase; Kelvin Dickenson, vice president of compliance solutions at Opus/Alacra; and Chris Casey, global head of regulatory and reference data at Bloomberg – noted similar priorities. Dickenson commented: “MiFID II will remain top of mind in 2018. Even if firms were ready on the compliance deadline, in 2018 they will be questioning whether their solution is working well and doing what it is intended to do.” Casey added: “Many of the firms we are working with didn’t have enough time to complete testing as a result of late additions to MiFID II and new connections to data sources such as the Derivatives Service Bureau, so there is still work to do.”

Glasser selected GDPR as the toughest challenge in 2018, on the basis of its extensive reach and significant penalties for non-compliance.

Looking at the data requirement for 2018, the speakers acknowledged an ongoing rise in volumes of data, the introduction of more external third-party data, and the resulting need for improved data control and lineage. The speakers went on to discuss best practice approaches to this year’s regulatory requirements and a variety of technology solutions.


They concluded that the 2018 agenda requires firms to stop talking about a strategic approach to compliance and take action. Dickenson commented: “This is critical going forward. Data requirements of different regulations can be brought together, and a strategic approach can replace regulations in silos that duplicate work and can drive customers away.”

An audience poll showed 40% of respondents taking a strategic approach, 20% implementing such an approach, and 11% with no plans for a strategic approach. Similar percentages said a strategic approach is on the agenda, nothing has yet been done, or a strategic approach is being planned.


Indeed, plans aplenty, but who should lead the strategy programme and what are the goals? Casey said: “People often say the chief compliance officer should lead the strategy, but this role is limited in what control it has, so there needs to be buy-in from the chief technology officer and chief data officer too. A successful strategy includes common data sources, a common infrastructure layer, high security and low level of customisation.” Or, as Dickenson put it: “The only person who can own the strategy is the chief compliance officer. This officer is best at deciphering what needs to be done, but worst at execution, so needs to work in partnership with the chief technology officer, chief data officer and chief risk officer.”

Author: ateamgroup
Posted: January 10, 2018, 12:41 pm

After a period of intense activity, acquisition and reorganisation, NeoXam moves into 2018 with a focus on its Investment Book of Record (IBOR) and DataHub solutions. It is also responding to customer requests to include standardised data in its solution modules and aiming to help asset managers deal with the challenges of regulation, data quality and cost control by implementing its best-of-hybrid model – more about that later.

NeoXam took off in February 2014 with funding from Blackfin Capital Partners and private investors led by CEO Serge Delpla. Its early acquisitions were the GP3 fund accounting and Decalog compliance solutions that it bought from SunGard. Staying on the acquisition trail, NeoXam bought a further three software companies and by June 2015 its portfolio included Density Technologies, a provider of front-to-back office software solutions; Nexfi, a provider of complex fund management software; and SmartCo, provider of the DataHub enterprise data management (EDM) and IBOR solutions.

The company operates three lines of business covering investment management, investment accounting and data management, the latter based on the SmartCo DataHub and the company’s recognition that asset managers could gain value from data consistency provided by a central EDM solution integrated with both NeoXam and other vendor software solutions – the company’s best-of-hybrid approach. The initial use case for DataHub is usually one of its many modules, which include corporate actions, reference data and a security master. These use cases often lead on to adoption of the company’s product master and then interest in its IBOR solution.

Yan De Kerland, NeoXam head of sales EMEA, explains: “The acquisition of SmartCo, particularly DataHub, has enabled us to deliver our vision of integrated systems. Two years ago, we were lacking a shared data layer among our products. SmartCo brought a common, consistent data model and better ability to integrate systems.” On the company’s investment management products, he says: “We cover the asset management value chain, but customers can take what they need. All our solutions are modular and have open application programming interfaces (APIs) so that we remain product agnostic, but can provide tight integration.”

From a product perspective, the company is developing a solution for the Fundamental Review of the Trading Book (FRTB) based on DataHub. The solution supports the regulation’s market data management and governance, and data quality and model building requirements, and interfaces with customers’ risk systems. Considering the high cost of market data, it also controls and optimises data requests.

Investment is also being made in the expansion of delivery models for the company’s investment management solutions. While most of NeoXam’s clients have deployed solutions and some use a hosted version of the company’s software, more and more would like to migrate to data-as-a-service, an area that the company is working on. Mobility and data distribution are also on the agenda, with NeoXam working to improve delivery of data to mobile devices and end users.

From a geographic perspective and in line with the company’s history – it is headquartered in Paris, France – NeoXam’s initial markets were in continental Europe, where its investment management products gained traction and opened up opportunities in the UK. China also became a hotspot for the company, where a localised version of the GP3 fund accounting software found favour. It has since built sizeable buy-side and sell-side markets in the US and Asia, which, like Europe, are growth targets for this year and beyond.

Author: ateamgroup
Posted: January 10, 2018, 12:17 pm

By: Conor Coughlan, global head of risk, regulatory and compliance marketing, Financial and Risk Division, Thomson Reuters

Looking forward and following current patterns of industry change and analysis, 2018 is going to be a pivotal year for regtech collaboration, service and solutions providers.

In essence, the themes we can expect to see more of in 2018 are:

  • Continuing expansion and scale of current offerings
  • Greater innovation and collaboration between market players
  • Maturity and increased investment (backing) for regtech firms
  • Start of an acquisition and commercial partnership movement

More broadly, the niche perception of regtech as relating solely to firms addressing small-scale, singular regulatory or compliance workflow challenges is over. Regtech is now seen to encapsulate all forms of solutions origination, innovation, development and deployment for specific or large-scale risk, regulatory or compliance challenges.

In addition to smaller players, more mature market vendors such as Thomson Reuters, Bloomberg, Six Financial, Accenture and ICE have now aligned elements of their businesses to focus on dealing with regtech. In addition, hundreds of new players are entering the market every year. This quasi ‘ICT, governance, risk and compliance (GRC) and reporting’ solutions and services arena is, to put it bluntly, booming. Given the ever-increasing regulatory demands facing the banking, financial services and insurance (BFSI) sector, and now with General Data Protection Regulation (GDPR), that is not surprising.

Industry participants are looking to large-scale players and proven entrants to collaborate more effectively and deliver a broader and deeper range of solutions and services that can, in the future, offer end-to-end compliance, governance, risk management and reporting oversight and capabilities.

In particular, due to the current and future level of regulatory oversight, the BFSI sector, although not exclusively, is looking to regtech players to address how firms can be compliant with GDPR, MiFID II, FRTB, PSD2, PRIIPs and many more; deal, in some cases, with contradictory requirements such as data usage restrictions; and meet complementary obligations relating to data collation, analysis, transparency and reporting, which may offer synergistic benefits if successfully addressed.

Collaboration and partnership

In 2018, it is evident that collaboration, partnership and, more formally, ‘joint ventures’ will continue to increase as the BFSI sector endeavours to address a swathe of regulatory pressures and better align the ‘right firms with the right players’ in order to drive scale. Evidence clearly exists of large-scale market vendors (in part due to direct customer requests) working with more specialist regtech firms to address specific challenges. This has helped such longer-term players pivot some of their related business offerings and bring their considerable experience and best practices to bear. In many cases, such players are offering regtech firms access to their Innovation Labs and Incubation Centres.

Equally, many previously perceived niche players are now ripe for acquisition, having proven the validity of their business models and clear demand for their services; many are also highly lucrative. Not all will follow this path in 2018, but many are now coming of age. This year, we can expect to see larger market vendors purchasing a range of proven regtech firms. In addition, we will see some venture capital (VC) and private equity (PE) players starting to curate portfolios of aligned or complementary firms, in order to shape and develop a new regtech leader or to align some clear and obvious capabilities into one group or alliance of such firms.

Equally, many conceptual and unproven regtech start-ups will fall by the wayside, but a firm like www.enforcd.com, which has won the backing of the Bank of England and Accenture, is likely to take its offering and scale it even further.

Overall, 2018 is going to be a very exciting and evolving period for regtech. We should see the landscape starting to consolidate and the possible emergence of some new leaders.

What are the greatest obstacles for regtech?

For me, the greatest obstacles for regtech, and for innovation in general, are very clear. To put it simply:

  • Public and private investment
  • Government and competent authority policy
  • Culture

Without investment, direct or indirect, our ability to innovate is greatly stunted. There are clear examples around the world, such as Singapore and Hong Kong, where this is not the case. However, when I look to the UK, its level of public investment is appalling compared to other jurisdictions. Governments and devolved regions, in addition to private entities, must actively invest in innovation and, in this case, in regtech players. Innovation is not free; it rarely happens by accident, or without a mandate for innovation and experimentation already being in place. If you want to see rapid change, you must strategically invest to get it.

Government, including development agencies, and regulatory authority strategies and policies, must be aligned to stimulate and foster greater regtech innovation. Innovation requires strategic forward-looking policies and investment positions. If you want to win the ‘Regtech Race’ then you best make it a strategic objective and form a coherent policy and framework around it. Merely offering sandbox access is not the same as strategically stating you want to be a hub for regtech innovation. Having clear public strategies and policies in place is vital to innovation, experimentation and development.

Culture, corporate and public alike, needs to be aligned to fostering innovation. In particular, corporations and private entities need to open their minds to innovation and seek new solutions and methods to address regulatory and compliance challenges. If you want to be more agile, responsive and capable, you can’t keep operating the way you always have. Equally, if public organisations are seeking greater innovation, change and solutions to better address known or unknown risk, regulatory or compliance challenges, then they need to foster a culture of curiosity, openness and innovation.

Without one or more of the above being in place, our ability to further develop and expand the regtech arena is greatly reduced. We will not see the transformational changes we want in the near future.

Regtech technologies with the biggest impact

Blockchain: Distributed ledgers, including blockchain, bitcoin, Ethereum platforms and so on, are already generating new use cases and it’s clear their capabilities will lend themselves to the regtech arena. One such area is how firms address Know Your Customer (KYC) and Anti-Money Laundering (AML) requirements and obligations.

It’s very possible that the crypto-currency bubble will in fact power new regtech ecosystems that offer far-reaching benefits and innovation capabilities long after the initial currency and transaction concepts have disappeared. Any low-cost, ideally low-energy, multi-hosted, shared, standardised digital platform that can offer increased transparency, tracking, auditing and reporting capabilities will have many benefits for the regtech community, provided the platforms are secure, stable and scalable. So yes, these platforms will better enable the development of a new range of regtech solutions and services.

Smart Data: The era of Big Data is over. It’s now the era of ‘Smart Data’ or, to be more precise, of knowing what to do with your data and how it can offer your firm and customers value-adding returns. Collating data sets for the sake of it does not offer your firm or the market any true advantage unless someone can derive value from them.

Assuming you have processed, normalised, standardised and aggregated all your content/data (big task), you are now in a position to critically examine and possibly correlate a range of value adding data sets, or establish previously unknown correlations or indirect/direct patterns.
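As an illustration of that correlation step, a minimal sketch might look like the following. The data series are entirely invented (a hypothetical footfall-driven transaction series and an unrelated complaints series), used only to show how a correlation matrix can surface a previously unknown relationship between normalised data sets:

```python
import numpy as np

# Hypothetical, already-normalised daily series for three data sets:
# branch footfall, transaction volume (partly driven by footfall),
# and complaint counts (unrelated to the other two).
rng = np.random.default_rng(0)
footfall = rng.normal(100, 10, 250)
transactions = 0.8 * footfall + rng.normal(0, 5, 250)
complaints = rng.normal(20, 4, 250)

# Pairwise Pearson correlations across the three series.
data = np.vstack([footfall, transactions, complaints])
corr = np.corrcoef(data)

# A strong off-diagonal value flags a candidate relationship worth
# investigating further; near-zero values suggest unrelated data sets.
print(corr.round(2))
```

In practice, the interesting output is exactly the unexpected off-diagonal entries: correlations you did not already know about, which then feed the monetisation use cases discussed below.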

Unless you have specifically selected to focus on a range of data points (better for smaller firms, as it lowers data processing costs), you are now facing a huge headache when it comes to storing and processing large volumes of company and market data.

Ultimately, data analysis, smart data and our ability to make sense of large amounts of fixed or real-time data comes down to our ability to deploy machine learning and artificial intelligence (AI).

Machine Learning and Artificial Intelligence: Machine learning, or defined pattern recognition and processing, is more established and proven. However, it does tend to be far more rigid and process specific. It has been in use for decades and offers many of the world’s leading firms a competitive advantage when it comes to processing large swathes of complex data.

AI is still unproven; limited forms have been deployed to address very standardised workflows. The vision of the all-knowing, highly intuitive, exceptionally intelligent computer mind or assistant that helps you process vast amounts of complex and varying data has not yet arrived.

Value-adding versions of lesser AI powered systems, processes and platforms are coming online. AI solutions and services will help firms and their customers to better identify risk, regulatory and compliance patterns and challenges.

In particular, with regard to regtech, AI, or versions thereof, could help firms better analyse trading patterns, money movements, engagements by suppliers and partners, procurement practices, and internal employee actions (or non-actions), in addition to customer usage patterns. In practice, this could help firms better predict and determine how to offer more tailored, value-adding services while meeting their own regulatory and compliance requirements.

One example of successful AI development is Thomson Reuters Data Privacy Advisor (TR DPA), which has been developed with embedded AI capabilities to better enable firms to meet their global data privacy obligations. This service launches formally in 2018 and has been designed for related legal, risk and compliance professionals. In this case, it is dealing with a highly complex and ever-changing set of multi-jurisdictional challenges. However, you will need to wait for the formal launch to see what else the solution does.

Author: ateamgroup
Posted: January 9, 2018, 11:32 am

General Data Protection Regulation (GDPR) comes into force on May 25, 2018, replacing and extending data privacy rules set down in 1995. This time around, the key to successful implementation is strong and sustainable data governance. The webinar will discuss the role of data governance in GDPR compliance, explore best practice implementation, and detail how compliance can be maintained in the face of exorbitant fines for non-compliance.

Webinar Date: 
Thursday, February 22, 2018 - 15:00
Author: ateamgroup
Posted: January 4, 2018, 1:23 pm

Happy New Year and welcome to the Year of Regulation, aka the Year of the Dog and, for some capital markets participants struggling to stay abreast of regulatory requirements, the Year of Anxiety. At the top of the agenda are Markets in Financial Instruments Directive II (MiFID II) and Markets in Financial Instruments Regulation (MiFIR), which go live today despite a few hiccups and gaps in the specs.

Also taking effect this week are Benchmarks Regulation, and Packaged Retail and Insurance based Investment Products (PRIIPs) regulation. These regulations will be followed in May by General Data Protection Regulation (GDPR) and towards the end of the year by early reporting under Securities Financing Transactions Regulation (SFTR).

The compliance deadline for the Fundamental Review of the Trading Book (FRTB) has slipped from January 2019 to January 2020, and it may slip further, but its complex requirements and capital calculations also need ongoing commitment to implementation. You can find out more about these regulations and others in the fifth edition of the A-Team Group Regulatory Data Handbook, or by registering for some of our forthcoming webinars.


Go live of MiFID II and MiFIR marks the biggest market reform in over a decade and the start of an EU regime driving financial markets transparency and investor protection. Despite today’s compliance deadline, for most firms within the scope of MiFID II, this is only the beginning of fixing things that don’t quite work, dealing with information that is due to be published by the European Securities and Markets Authority (ESMA), such as a list of firms holding Organised Trading Facility (OTF) licences, and preparing to work with unfinished elements of the regulation such as the Financial Instruments Reference Data System (FIRDS). The success of trading OTC derivatives on regulated markets using ISINs distributed by the Derivatives Service Bureau operated by the Association of National Numbering Agencies (ANNA) also hangs in the balance.

On the regulatory front, while the European Commission took a hard line on compliance last year, cracks are already beginning to show. Perhaps the biggest, and most disappointing, pull back is on the No LEI, No Trade rule, with ESMA handing out a six-month period of grace on the mandate for Legal Entity Identifier (LEIs) to be included in MiFID II reporting less than two weeks before the compliance deadline.

Despite the No LEI, No Trade mantra and programmes designed by firms to help their counterparties obtain LEIs, ESMA said that not all firms would have required LEIs by January 3, 2018 and suggested the last-minute reprieve would ensure a smoother introduction of the rules. Another blow for the LEI, but hopefully one it will recover from within the next six months.

More locally, the UK Financial Conduct Authority (FCA) today granted ICE Futures Europe and the London Metal Exchange an additional 30 months to comply with MiFID II’s clearing regulations. In a statement, the FCA said it had granted the extension after taking account of the risks and to ensure the 'orderly functioning' of the clearing market is maintained. The FCA’s decision follows yesterday’s move by the German regulator, BaFin, to grant the Deutsche Börse owned futures exchange Eurex a MiFID II extension.

These early regulatory amendments to MiFID II are not ideal, but perhaps not exceptional in the circumstances of regulatory implementation with a cost pitched at about $2.5 billion and ongoing compliance costs of $750 million.

The rest of the regulations

If MiFID II is top of the agenda today, don’t lose focus on the rest of this year’s regulations. Benchmarks Regulation came into force this week, affecting all firms using benchmark data, as did PRIIPs, a huge undertaking in data management and product information distribution that has led some firms to review their product ranges.

GDPR is also a giant, replacing and extending previous privacy rules to cover the processing of personal data of all EU residents. Data management challenges include huge volumes of data, record keeping, security, avoiding breaches, and problems of data proliferation. Fines for non-compliance run up to a whopping 4% of group annual turnover, begging the question of whether financial services firms will meet the May 25, 2018 compliance deadline.

Hot on the heels of these regulations come EU SFTR requirements designed to increase the transparency of shadow banking; ongoing SEC regulatory modernisation and the implementation of the US Consolidated Audit Trail; and the next ‘big thing’, FRTB.

So, there it is, 2018 - The Year of Regulation. We hope it is a good one for you.

Author: ateamgroup
Posted: January 3, 2018, 2:02 pm

By: Sapient Consulting

Data has become one of an organisation’s most valuable assets, although many organisations struggle to turn it into a profitable one. In addition to lacking specialist knowledge, as well as the right tools and experience, companies often face challenges with the availability and usability of their data, overcomplicated legacy technologies, and a shortage of resources for analysis. Another challenge is stricter global guidelines for sharing personal information.

Recent advances in retail banking regulation resulting from the European Payment Services Directive II (PSD II) and the work of the Competition and Markets Authority (CMA), as well as regulators taking a more stringent approach on openness and competition, are forcing banks to open up their application programming interfaces (APIs). This brings new entrants to the market and allows a more open comparison of products and services under the Open Banking Programme.

These changes are designed to disrupt existing business models. Similarly, for investment banks a refocus on fee-based business activities requires a better understanding of how to service customer needs when proprietary trading is no longer profitable. In the face of these changes, organisations will have to come up with innovative ways to find value for their customers.

Strategies for data monetisation

When it comes to increasing the value of data, there are a number of broad strategies that companies can use to monetise data. The preferred strategy or combination of strategies will shape the direction of the analysis to focus on either one or more outcomes.

The strategic opportunities for monetisation can be classified into three broad categories:

  • Reduced cost: Organisations will primarily realise cost reductions by increasing operational efficiencies. Using data to streamline technologies and infrastructure can decrease costs by improving efficiencies and eliminating duplication across systems. Data can underpin improvements in team productivity by optimising processes, which can lead to lower workforce-related costs.

    There are also opportunities to ease spending on data brought into a company, and improving the quality of data and its distribution can dramatically lower costs through better use of golden sources.
  • Increased revenue: With a deeper understanding of data, organisations can gain and explore new business insights and revenue opportunities. Once data is analysed, useful insights about customer behaviour and purchasing trends can help drive more profitable services.
  • Sell data: Data is the most basic commodity to sell when looking to monetise your information assets. When data is exclusive, the potential for higher revenue exists as long as there is a market for it. When data is too sensitive to sell directly, the opportunities should still be considered because there could be value in anonymising it.

Three steps to unlock the value of data

A disciplined three-step approach will help organisations understand and obtain the most value from their data.

Step 1 – Business model analysis

To maximize data monetisation, organisations should first evaluate their current business model. A good understanding of the operating landscape will help steer an organisation in the right direction as well as understanding the full context of business potential and constraints.

The key areas firms should assess include:

  • Customer behaviour: Performing a behaviour analysis helps to understand the customer and their needs in more detail. The aim is to recognise what the customer is doing when interacting with the organisation and why. It also identifies certain behaviours that can help predict similar patterns in the future or among other customer groups.
  • Company structure and governance: A company’s structure and governance is the link between people, skillset and desired output. This is a clear guide to the maturity and readiness of data. Companies should aim to have a disciplined data governance structure set up to help enforce a clear plan to develop required skillsets and technologies, as well as secure budget and embed trust in the data quality framework.
  • Key activities and partners: Looking at the key activities the organisation performs helps clarify what data is gathered, held and used throughout day-to-day activities. The firm’s key partners and suppliers should also be examined, along with any dependencies and SLAs in place. This picture of the firm’s activities provides visibility into the data types, usage and manipulation occurring, and offers a complete view of the internal current state of all data, not just the information deemed important by regulations or record-keeping requirements.
  • Customer relationships and channels: In the digital age, customer relationships and channels form an important aspect of most business models. This includes how the company interacts with customers, levels of customer intimacy and self-service, as well as the overall benefits and value proposition for the customer.
  • Potential boundaries: The potential boundaries an organisation could be facing need to be taken into consideration. This includes every reason why a firm cannot use its data freely, sell data or explore new business streams or growth opportunities due to regulatory, reputational, ethical or other issues.
  • Cost structure and revenue streams: Organisations must assess their cost structure. Are the main costs data related and if so, why? Are key resources required to keep expensive and inefficient processes alive due to data processing that is not optimised? Revenue streams need to be understood to know what customers are paying for and what value they would be willing to pay for.
  • Ideas and strategies from other organisations and industries: Lastly, it is important to look at how organisations both within the same industry and in different industries are monetising data.

Step 2 - Data methodology

Conducting a business model analysis clarifies the context behind how and what existing data is generated and how it is used by an organisation. It sets the stage for a deeper analysis, looking closely at the overlaps and intersections of data sets to find areas of impact where the customer needs exceed the services offered. Output from the previous step provides an understanding of: customer needs; industry-wide view; assessment of how customer channels are evolving; and a competitor view.

The data methodology that follows the business model analysis contains six steps:

  • Inventory of data sets: The first step is to understand existing data models. Creating a similar data set for the customer’s needs and competitor’s offerings indicates the gaps between the services the organisation is offering versus customer need, competitor offerings and industry-wide developments.
  • Use cases for data monetisation: After completing the inventory of data sets, organisations can create potential use cases for the identified data models. Identifying and analysing where data sets overlap will highlight how firms can apply these elements in multiple areas or circumstances. Finding revenue generating ideas from the changes invoked during regulatory compliance is an example of this process.
  • Feasibility of data monetisation: Because of constraints around the usage of data for deriving monetary value, firms should conduct a feasibility study to see which scenarios fall outside of legal and regulatory boundaries and hence can be ruled out in an early stage.
  • Data preparation: Once firms know which data sets are available and can be used without restrictions, the data needs to be transformed to a state where analytics can be run, and insights generated.
  • Data analytics and building data sets for monetisation: Analytics can now be performed on the prepared data to derive insights.
  • Industrialise analytics: Once monetary insights are available, it’s important to industrialise the process by formalising data ingestion through production source systems.
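The six steps above can be sketched as a simple filter-and-transform pipeline. All data set names, predicates and the restricted list below are hypothetical illustrations, not the authors' method:

```python
def inventory(sources):
    """Step 1: keep only data sets the firm actually holds."""
    return [s for s in sources if s["available"]]

def propose_use_cases(datasets):
    """Step 2: pair each data set with a candidate monetisation idea."""
    return [{"dataset": d["name"], "idea": f"insight product from {d['name']}"}
            for d in datasets]

def feasibility_filter(cases, restricted):
    """Step 3: rule out scenarios outside legal/regulatory boundaries early."""
    return [c for c in cases if c["dataset"] not in restricted]

def prepare(case):
    """Step 4: transform data to a state where analytics can be run."""
    return {**case, "prepared": True}

def analyse(case):
    """Step 5: run analytics on the prepared data (placeholder flag here)."""
    return {**case, "insight": True}

def industrialise(case):
    """Step 6: formalise ingestion through production source systems."""
    return {**case, "production": True}

sources = [
    {"name": "trades", "available": True},
    {"name": "client_emails", "available": True},
    {"name": "legacy_gl", "available": False},
]
restricted = {"client_emails"}  # e.g. blocked by data protection rules

pipeline = [industrialise(analyse(prepare(c)))
            for c in feasibility_filter(propose_use_cases(inventory(sources)),
                                        restricted)]
print([c["dataset"] for c in pipeline])  # only unrestricted, available data survives
```

The point of the sketch is the ordering: infeasible scenarios are discarded before any costly preparation or analytics work is done.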

Step 3 – Estimate the value of the data

The final part of the approach relates to assigning a monetary value to the data. Measuring and improving data quality takes time and effort; similarly, making data available from legacy systems comes with a cost. Understanding the regulatory and legal impact, and any steps needed to comply with data protection regulations in certain jurisdictions, should also be considered when calculating the total value that can be extracted from the data.

To understand the potential monetary value for capitalisation, consider the equation below. Even if high-level estimates are taken, all variables in the equation must be represented in monetary terms.

This equation is only a starting point. With the insights gained in the previous stages, firms should customise the equation to their particular analysis. The opportunity value can be derived using standard market-sizing techniques, and from the previous analysis the constraints and costs to be subtracted from the opportunity value will be well understood.
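Where the equation itself is not shown, a plausible form, based on the surrounding description (an opportunity value from which constraints and costs are subtracted), is the following; the symbols are illustrative assumptions, not the authors' notation:

```latex
V_{\text{data}} = V_{\text{opportunity}} - C_{\text{constraints}} - C_{\text{extraction}}
```

Here the opportunity value comes from standard market-sizing, the constraints term monetises regulatory, legal and reputational restrictions, and the extraction term covers data quality work and making data available from legacy systems.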


  • Data has become a critical asset for organisations in their efforts to survive and thrive in today’s fast-changing landscape, with many turning to it to unlock opportunities for competitive advantage.
  • The key to success is to holistically look at a company’s business model and operating environment before beginning to understand the details of the data.
  • Often potential opportunities may seem too conceptual. By leveraging the disciplined three-step approach detailed in this article, firms can separate concept from reality and successfully capitalise on their information assets.

Authors: Maria Hammargren, a senior business consultant with Sapient Consulting; Prateek Kulshreshtha, a senior business consultant with Sapient Consulting; and Cian Ó Braonáin, global lead of Sapient Consulting’s Regulatory Reporting practice.

Author: ateamgroup
Posted: January 2, 2018, 12:52 pm

General Data Protection Regulation (GDPR) takes effect on May 25, 2018, requiring financial institutions to meet stringent new rules on managing the personal data of EU residents, and setting astronomic fines for those that fail to comply.

Webinar Date: 
Thursday, January 25, 2018 - 15:00
Author: ateamgroup
Posted: December 14, 2017, 7:07 pm

A group of six banks and data vendors is working with Ethereum smart contracts to improve the quality of counterparty reference data through anonymous reconciliation. The blockchain initiative was borne out of the need to improve data quality for Markets in Financial Instruments Directive II (MiFID II) and Markets in Financial Instruments Regulation (MiFIR), and a pilot project is underway and due to be complete by the end of January 2018.

The project was initiated by UBS and has been joined by Barclays, Credit Suisse, KBC, SIX and Thomson Reuters. Its intent is to reconcile the Legal Entity Identifier (LEI) reference data mandated for use under MiFID II and MiFIR, and to streamline the LEI process for all participants.

Christophe Tummers, head of data at UBS, explains: “Traditionally, a firm such as ours quality checks data against multiple sources, but we do not have a quality baseline against peers. By using blockchain inspired smart contracts, the reconciliation of data can happen in almost real-time for all participants, anonymously.”

The Ethereum solution takes specific reference data for each legal entity and cryptographically conceals it at each institution using a process called hashing. The source data is held and remains within the participating institution. Only the hashed data is submitted, anonymously, to an Ethereum private blockchain powered by Microsoft Azure. The Ethereum smart contracts then reconcile the data against the consensus and provide each participant, via a user interface, the ability to search and view their own specific data in real-time. A user can then see where the anomalies lie in the data set and work to resolve those.
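The hashed-reconciliation idea can be illustrated with a minimal, hypothetical sketch: each participant hashes its own copy of a record locally, only the hashes are shared, and the consensus hash exposes outliers. The participant names and data below are invented, and a local script obviously stands in for the private Ethereum chain and smart contracts described above; a real scheme would also need to protect low-entropy attributes against guessing attacks:

```python
import hashlib
import json
from collections import Counter

def hash_attributes(record: dict, salt: str) -> str:
    """Hash a reference data record so copies can be compared without revealing them."""
    canonical = json.dumps(record, sort_keys=True)  # deterministic serialisation
    return hashlib.sha256((salt + canonical).encode("utf-8")).hexdigest()

lei = "529900XXXXXXXXXXXX00"  # hypothetical LEI
salt = lei  # shared, non-secret salt: identical records hash identically

# Each participant hashes its own copy locally; source data never leaves the firm.
participants = {
    "bank_a": {"lei": lei, "name": "Example Corp", "country": "DE"},
    "bank_b": {"lei": lei, "name": "Example Corp", "country": "DE"},
    "bank_c": {"lei": lei, "name": "Example Corp", "country": "FR"},  # divergent copy
}
hashes = {p: hash_attributes(r, salt) for p, r in participants.items()}

# The shared ledger sees only hashes; consensus is the most common hash,
# and any participant whose hash differs holds an anomalous record.
consensus, _ = Counter(hashes.values()).most_common(1)[0]
outliers = [p for p, h in hashes.items() if h != consensus]
print(outliers)  # bank_c's record disagrees with the peer consensus
```

The design choice mirrors the article: reconciliation happens on concealed values, so each firm learns only that its own record deviates from the consensus, not what its peers hold.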

Robert Jeanbart, division CEO at SIX Financial Information, says: “MiFID II creates complex data management challenges for businesses. This initiative presents a unique opportunity for firms to benchmark content alongside their peers before it is used in regulatory reporting.”

Mark Davies, global head of RMS Data Services at Thomson Reuters, says: “This is a collaborative project that uses the latest blockchain technology to solve a real-world business challenge by improving the quality of counterparty reference data.” Emmanuel Aidoo, head of blockchain strategy at Credit Suisse, adds: “This is an important project as it establishes blockchain benefits in a broader context than clearing and settlement. The use of blockchain to solve regulatory requirements in a cost-effective way is very appealing.”

The project is in a pilot phase in a mock-live environment using 22,000 non-sensitive LEI reference attributes for cash equity issuers. It is due to complete by the end of January 2018, with further, staged rollout dependent on the findings.

Author: ateamgroup
Posted: December 13, 2017, 1:35 pm

Kingland Systems continues to innovate with the fourth generation of its enterprise software platform. The platform provides artificial intelligence (AI) focused on data management, extended enterprise data management capabilities, new analytics, and cloud optimised DevOps software to support high performance software strategies. It also accelerates specific solution delivery by avoiding extensive customisation, providing 60% to 80% of core capability on Day 1, and focussing remaining time and budget on unique client requirements.

The platform uses a microservices architecture of more than 40 components to create client specific and cloud optimised solutions, and consists of four elements covering cognitive computing, data analytics, enterprise data management, and an enterprise applications foundation that accelerates project implementation and provides cloud optimisation, scalability and automated testing and deployment. The fourth-generation platform updates all these elements.

Tony Brownlee, a partner at Kingland, explains: “The fourth-generation platform formalises our cognitive computing capability and reimagines how master data management needs to operate on a modern platform. The microservices architecture helps our clients build, maintain and upgrade solutions.” He adds: “The platform is not a product, but key capabilities and components that solve clients’ problems and deliver quick, agile systems that can be maintained over years to come.”

The AI element of the platform is cloud-based and uses application programming interfaces (APIs) and software-as-a-service (SaaS) delivery to integrate with legacy systems. Its focus is on data management, data collection, and business process automation, and it has been enhanced in response to client requests to unlock data in legacy documents.

Brownlee says: “Our AI engine is very fast. It can read a 300-page document and extract data in seconds. This helps users discover and maximise new data. Typically, the data covers customers, legal entities and individuals, noting their location, services they have received, how they are related to each other, and news about issues such as mergers and acquisitions or bankruptcy.”
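Kingland has not disclosed how its engine works internally; as a rough illustration of the kind of output such document extraction produces, the toy sketch below pulls candidate LEI codes and corporate-event keywords from raw text with regular expressions. A production engine would rely on trained models rather than patterns, and all names and codes here are invented.

```python
import re

# Toy patterns -- a production engine would use trained models, not regexes.
LEI_PATTERN = re.compile(r"\b[A-Z0-9]{18}[0-9]{2}\b")  # 20-character LEI code
EVENT_PATTERN = re.compile(r"\b(merger|acquisition|bankruptcy)\b", re.IGNORECASE)

def extract_entities(text: str) -> dict:
    """Pull candidate LEI codes and corporate-event keywords from raw text."""
    return {
        "leis": LEI_PATTERN.findall(text),
        "events": sorted({m.lower() for m in EVENT_PATTERN.findall(text)}),
    }

doc = ("Counterparty 5493001KJTIIGC8Y1R12 announced a merger with its parent "
       "following the bankruptcy of a subsidiary.")
print(extract_entities(doc))
```

The value of this kind of extraction is less in any single match than in running it across thousands of legacy documents, turning unstructured text into structured entity and event data that can feed a master data platform.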

The company is also experiencing growing demand for cloud-based machine learning, particularly for risk, credit risk, transaction processing, clearing and settlement, and compliance. Brownlee comments: “Clients want more machine learning and the ability to load diverse types of data. The goal is to deliver data faster than can be done internally at a lower cost.”

While Kingland continues to invest in its technology and deployments across a number of industries, Brownlee concludes: “The fourth generation is something to celebrate for us. It can solve some significant problems in the world.”

Author: ateamgroup
Posted: December 13, 2017, 1:28 pm

Understanding in-house consumption of vendor data and ensuring compliance with multiple contracts for market data and other information can be challenging, particularly for financial institutions managing large volumes of contract clauses across hundreds of suppliers.

Author: ateamgroup
Posted: December 12, 2017, 5:52 pm

A-Team Group announced the winners of its Data Management Review (DMR) Awards at a well-attended ceremony at Merchant Taylors’ Hall today. The awards recognise leading providers of data management solutions and services to capital markets participants. See the complete list of awards and winners below.

The DMR awards are now in their fifth year and, on this occasion, were hosted by Andrew Delaney, Chief Content Officer at A-Team Group. Delaney was joined by Michael Edwards, better known as Eddie ‘The Eagle’ Edwards, as guest speaker and presenter of the awards. Eddie ‘The Eagle’ Edwards was the first competitor to represent Great Britain in Olympic ski jumping, at the 1988 Winter Games.

Delaney says: “Congratulations to the winners of this year’s A-Team Group DMR Awards and thank you to everyone who voted. We had a large number of nominations and votes across all categories, which signals growing acceptance of third-party data management solutions and a drive towards innovation in capital markets.”

This year’s award categories range from best sell-side and buy-side data management platforms to solutions for managed services, corporate actions processing, data aggregation, data lineage, entity data, Know Your Client (KYC) and client onboarding, performance measurement, and fund accounting, portfolio and data management. Categories for data providers include best sell-side data and best buy-side data, pricing and valuations data, index data and corporate actions data.

As well as product categories, editor’s recognition awards were presented at the event for best data management practitioner, best data management vendor professional, and best innovation in data management. These particularly prestigious awards were received by Julia Bardmesser, global head of data integration at Deutsche Bank; John Mason, global head, regulatory and market structure propositions, Financial and Risk Division at Thomson Reuters; and S&P Global Market Intelligence.

And the winners are:


Category: Winner

Best Sell-Side Enterprise Data Management Platform:
Best Buy-Side Data Management Platform: IHS Markit Enterprise Data Management
Best Sell-Side Managed Services Solution: SmartStream RDU
Best Buy-Side Managed Services Solution: Bloomberg PolarLake
Best Corporate Actions Processing Platform: IHS Markit Information Mosaic
Best Risk Data Aggregation Platform:
Best Fund Accounting, Portfolio Management & Data Platform:
Best Data Provider to the Sell Side: Thomson Reuters
Best Data Provider to the Buy Side:
Best Entity Data Solution: Thomson Reuters
Best Pricing & Valuations Data Provider: ICE Data Services
Best Index Data Provider:
Best KYC & Client On-Boarding Solution:
Best Data Lineage Solution:
Best Corporate Actions Data Provider: ICE Data Services
Best Performance Measurement System:
Editor's Recognition Award for Best Data Management Practitioner: Julia Bardmesser
Editor's Recognition Award for Best Data Management Vendor Professional: John Mason
Editor's Recognition Award for Innovation: S&P Global Market Intelligence


Author: ateamgroup
Posted: December 8, 2017, 4:02 pm

Contact Us

Epsilon Consulting Services

90 Broad Street, Suite 2003
New York, NY 10004

(347) 770-1748
(212) 931-0572
If you are interested in joining Epsilon, a financial consulting firm in New York City, please visit our Careers page to view open positions and submit a resume for consideration. See our service areas page for the specific locations in which we provide consulting services.