Data Management Review
By: Stephen Wood, Global Head of Enterprise Deployment at OpenFin
Let’s say you’re running IT for a bank. You will own software: you could build it in-house and, of course, you’ve got a choice of Software-as-a-Service (SaaS) options from third-party vendors. But those vendors have to be vetted. Do they meet your firm’s security requirements? Will they still be around a year from now? Will they be supporting the product?
Security onboarding is a tangible and time-consuming concern, but that’s only the beginning. Once you’ve picked a vendor, there’s the install itself to consider. Quality assurance, functionality, fault and conflict testing are all time-consuming processes as you work through the inevitable kinks before a platform is production ready. Then, you need to send it to your internal packaging team to ensure it works in your desktop environment and meets security requirements. Desktop apps are notorious for presenting the largest attack surface on the desktop, so you have to be sure of what you’re getting yourself into.
Now, let’s say you’re a humble software vendor. Your resources may be hard-pressed to withstand the intensive and intrusive process of a large bank’s vendor risk assessment – penetration tests of your infrastructure, questions about your business continuity plan, information about your own risk compliance and processes, how you onboard your staff, and the wealth of documentation they require. And this is before you get to packaging.
Working with your client’s IT team, you’ll spend the next three months (but probably more) in a cyclical process of testing, iterating and packaging. In the meantime, you’re burning cash while your product is quarantined from the marketplace. Interestingly, your client is also burning cash, about $600,000 a year of internal costs to package platforms like yours. Never mind upgrade rollouts, which your new client might not even take because of the daunting nature of the packaging process.
Caught in the vortex of compliance, security issues and environmental complexity, this cycle creates friction for financial institutions and vendors alike. Lucky for everyone involved, there’s an easier way.
OpenFin streamlines desktop app deployment and completely eliminates the need for software packaging and costly security reviews. It looks like magic, but it's actually quite straightforward. OpenFin enables developers to build web apps that look and feel like native desktop apps – multi-window, multi-monitor, push notifications, popups and everything else a native app can do.
With OpenFin OS, a desktop app is web deployed just like a website is deployed via a browser – instantly. OpenFin ensures desktop security in the same way (and using the same mechanism) that is used by a web browser. Of course OpenFin OS itself needs to be approved (just like a browser), but once that is done there is no need for desktop apps deployed via OpenFin to be packaged and security reviewed. OpenFin also provides IT security with tools to seamlessly update to new approved versions of the OS.
OpenFin OS is already approved and running in over 1,500 buy and sell-side firms in over 60 countries and there are already more than 1,000 applications in the ecosystem. So you can finally say goodbye to software packaging. There is a better way.
Refinitiv has named Sherry Madera as global head of industry and government affairs with a view to enhancing its engagement across government, policy initiatives, industry trends and regulatory developments. Madera joins Refinitiv from the City of London Corporation and has over 20 years’ experience across both the public and private sectors, including corporate finance, banking, asset management and, more recently, global policy leadership and advocacy in financial and professional services. At Refinitiv, Madera and her team will engage with customers, industry groups, regulators and governments to help tackle key customer challenges and advocate for fair, efficient and sustainable financial markets.
RegTek.Solutions has hired Chris Cornish as a consultant business analyst based in London. He joins from IHS Markit where he was a senior business analyst and a Securities Financing Transaction Regulation (SFTR) specialist. He was previously at ICBC Standard Bank, where he worked on SFTR implementation. At RegTek.Solutions, Cornish will focus on Validate.Trade, a product used by banks and asset managers to monitor the quality of reporting data, with a view to adding SFTR reporting quality assurance.
Banking software specialist Temenos has formed a strategic collaboration with Bloomberg that could catapult it into the buy-side market. The collaboration makes the Temenos Multifonds Global Accounting product accessible through the Bloomberg terminal, allowing asset managers to generate NAV estimates independent of, and in parallel to, their fund administrators, and enabling accurate daily oversight and continuity of operations in case of outages.
The collaboration, made under the auspices of Philippe Chambadal, executive vice president of capital markets at Temenos, builds on Bloomberg’s relationship with the buy-side to provide sales, marketing and services, and provides a new business opportunity for Temenos of around 40,000 buy-side firms, of which about two-thirds are based in North America.
Max Chuard, chief financial officer and chief operating officer at Temenos, says the collaboration will ‘dramatically increase’ the company’s growth in the $5 billion buy-side technology space by supporting asset managers and alternative investment funds that use Bloomberg buy-side solutions.
He adds: “We will offer an easy, plug-and-play solution with a low total cost of ownership, starting with NAV oversight and contingency, and with a view to unlocking further opportunities in the front and middle office over time.”
Jean-Paul Zammitt, global head of financial products at Bloomberg, says the collaboration enables Bloomberg to help buy-side firms meet their need for dependable and independent NAV oversight to support operational contingency plans and satisfy investors. He also notes the potential of the collaboration to deliver additional solutions to buy-side institutions in the future.
Accuity has promoted David Wilson to the role of managing director and CEO with responsibility to deliver innovative solutions for payments, risk and compliance, develop new strategic partnerships, and continue to deliver high quality client service. He steps up to the role after serving for 18 months as chief operating officer (COO) at the provider of financial crime compliance, payments and Know Your Customer (KYC) solutions.
Wilson replaces former CEO and president of Accuity, Hugh Jones, who moves on to become a global managing director of Accuity’s parent company RELX Group, with ongoing oversight of Accuity. Also promoted is Tom Golding, who steps into the COO role. Golding joined Accuity from Thomson Reuters in 2016 to manage the company’s risk and compliance business line, and will now guide technology, product and operations across the company.
Commenting on his appointment as lead of Accuity, Wilson says: “Accuity is an exciting business with a global base of clients, tremendous talent, and a dedication to technology innovation. I look forward to guiding our teams as we strive to lead the industry with solutions that transform the way our clients manage financial crime compliance and create efficiencies for global payments.”
Datactics has made plans for 2019 that will move selected solutions into the public cloud, extend implementation of innovative technologies, create a data quality clinic, and add a matching engine for sanctions screening to its open data projects. The company is also working with clients on data-quality-as-a-service.
The company’s commitment to cloud technology, initially private cloud solutions, is aimed at increasing performance and reducing costs, and allows Datactics to scale its data profiling, cleansing, scoring and matching capabilities for data quality and regulatory compliance on demand.
Alex Brown, pre-sales R&D manager at Datactics, comments: “Cloud solutions provide instant matching for purposes such as looking at end of day reference data from vendors and exchanges and carrying out symbology matching and data quality checks.”
Moving into the public cloud and working with Amazon Web Services (AWS), Datactics has already spun up an instance of its Legal Entity Identifier (LEI) Match Engine and is developing additional public cloud solutions.
Stuart Harvey, CEO at Datactics, explains: “Our experience is that banking clients are adopting cloud technology rapidly and need cloud-enabled solutions to run in-house, typically from their own private cloud. The Datactics LEI Match Engine and imminent Sanctions Match Engine, which will support banks’ AML activities, are public cloud, live data demonstrations of our technology. They complement the local, on-premise, private cloud mainstay of our deployments in data quality and matching open and proprietary data sources.”
The Sanctions Match Engine adds to Datactics’ open data projects, which already include Refinitiv PermID and LEI matching engines, and will consume, cleanse, normalise and match sanctions data from multiple organisations including the US Office of Foreign Assets Control (OFAC).
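Sanctions screening of this kind typically boils down to name normalisation followed by fuzzy matching. As a rough illustration of the idea (not Datactics’ actual engine), a minimal sketch using only the Python standard library:

```python
import re
from difflib import SequenceMatcher

def normalise(name: str) -> str:
    """Lower-case, strip punctuation and common legal-form suffixes."""
    name = name.lower()
    name = re.sub(r"[^\w\s]", " ", name)
    for suffix in ("limited", "ltd", "plc", "inc", "llc", "corp", "sa"):
        name = re.sub(rf"\b{suffix}\b", "", name)
    return " ".join(name.split())

def screen(candidate: str, sanctions: list[str], threshold: float = 0.85):
    """Return sanctions entries whose normalised form is similar to the candidate."""
    target = normalise(candidate)
    hits = []
    for entry in sanctions:
        score = SequenceMatcher(None, target, normalise(entry)).ratio()
        if score >= threshold:
            hits.append((entry, round(score, 2)))
    return hits

# Sample, invented sanctions entries for illustration only
sanctions_list = ["ACME Trading Ltd.", "Globex Corp", "Initech LLC"]
print(screen("Acme Trading Limited", sanctions_list))  # → [('ACME Trading Ltd.', 1.0)]
```

Production engines layer on phonetic matching, transliteration and scoring models, but the normalise-then-compare shape is the same.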
Datactics is also planning to introduce additional innovative technologies, including machine learning and other strands of artificial intelligence (AI), to automate workflows, address issues including the resolution of data errors, and support business cases requiring data matching and reconciliation. An early automation programme provides machine-assisted data remediation. In terms of business use cases of machine learning and AI, Datactics pre-sales manager Luca Rovesti suggests Know Your Customer (KYC) compliance, data migration and deduplication.
The company’s machine-assisted data remediation service is being developed in collaboration with a team from Ulster University and will be the first element of a Data Quality Clinic that Datactics plans to release this year. The company is also working with the university on data discovery, which could be used for regulatory reporting, risk data aggregation and data feed onboarding.
The Global Legal Entity Identifier Foundation (GLEIF) is calling for applications from C-suite level executives in the UK, Switzerland, China and India to join its board of directors. New board members will be appointed in June 2019 in line with the mandatory rotation of members of the board and to ensure the right balance of jurisdictional representation.
To ensure the balance of sectors and skills represented on the board, two candidates will be selected for their knowledge of data standards, operations and/or technology for the digital economy. One candidate will be selected from a non-financial sector, in this case the supply chain and logistics sector, and one from the financial sector, preferably the banking sector and with an extensive knowledge of Know Your Customer (KYC), trade finance and correspondent banking.
The term of office for a board member is three years, renewable for a second term of three years. To achieve more diversity in the board of directors, the GLEIF is encouraging female candidates to apply. All applicants should have contacts with the leaders and/or regulators in their industry and/or jurisdiction in order to serve as effective ambassadors for the GLEIF.
Details of the application process to join the board of the GLEIF are available on the GLEIF website. Applications must be submitted by Monday February 11, 2019.
By: Tony Brownlee, Chief Strategy Officer at Kingland Systems
Sometimes we all need a good story, especially as we look at the priorities for 2019.
We all know the classic children’s story of Goldilocks and the Three Bears. Goldilocks stumbles into a house and proceeds to discover three bowls of porridge, three chairs, and three beds before the three bears find her sleeping in a bed. What if our good friend Goldilocks lived in today’s world of data? What would her world be like? Would she be surprised by what she found? Let’s take a look.
Like all of us, Goldilocks had a perfect career plan to get into the world of ‘data’. She’d dreamt of her data career since she was a little girl and always knew she wanted to live and breathe data. After making all the right decisions, wrangling data, governing data, and resolving a forest of data issues, she’s now a data executive at a major financial institution. Life couldn’t be better.
One day, Goldilocks found herself reviewing three data programmes for her company, looking for the perfect programme to prioritise for next year. She reviewed the vision, details, and plans of all of them and was trying to decide where she should spend her time.
The first data programme was too scary. The data quality was terrible – multiple systems and manual entries exacerbated the quality, and no one trusted the data. The regulatory and risk pressures were mounting and leadership was just realising how important the data was.
The second data programme was too big. There were too many types of data. There were hundreds of feeds as well as millions of documents and unstructured data. The vision was clear, but would be impossible to deliver without significant change.
The third data programme was too disruptive. There were too many legacy systems. There were hundreds, if not thousands of people trying their best to maintain and use the data. For every good idea, there were three excuses why it would never work.
So, what did Goldilocks do? Well, I can tell you this...she didn’t find a bed and curl up for a nap!
The focus on data started at most major enterprises around the 2000-2005 timeframe. The internet was maturing. Information was more and more digitised. The first Chief Data Offices were getting established. Everyone’s favourite data conferences and industry events were starting to pop up on the calendar. In those days, master data management, data quality, data integration, and data governance were the hammers that many of us, even our friend Goldilocks, swung at every data programme.
But now, nearly 15 years later, Goldilocks has got a little smarter. Whether it’s a data programme to launch a new product and optimise growth, or an enterprise data programme driven by risk management or compliance, most data programmes are scary, big, and disruptive. This is the 21st century Goldilocks scenario that most executives find themselves in today.
The hard reality is that Goldilocks doesn’t just get to pick one data programme; she actually has to deliver on all three. But to do so, she’s going to think about things differently…
- First, she knows these data programmes must deliver results. Her decision making framework will be focused on the investments that will deliver short-term improvements to the data, but in a long-term sustainable way.
- Second, she knows that enterprise data management doesn’t mean all data must be centralised and in a golden copy. Those projects take too long, and as fast as data evolves, they will be too hard to maintain in a large institution. Delivering more valuable data to a division or a department will be faster, less expensive, and less risky to implement.
- Third, she knows that technology has evolved. She can now take advantage of cloud solutions to deliver faster. She knows that capabilities such as matching, text extraction, data quality, or search can be delivered more modularly and integrated with existing processes and data.
- Fourth, she knows to look at structured and unstructured data, and different data models. She doesn’t have to buy more data feeds when artificial intelligence (AI) can take unstructured sources or existing processes and produce the same data. She knows relational data structures and data lakes should and can be used with much less investment than 10 years ago.
- Lastly, she knows new data requirements will emerge. She’ll look for ‘patterns’ to expand data quality and data management practices so that new attributes can be managed without adding hundreds of people to process exceptions and perform remediation projects.
I know a lot of Goldilocks executives in the banking, insurance, public accounting and retail industries. These are the leaders who continuously learn, investigate new technologies, and take that extra time to understand the data. Too scary? Improve the data quality on the most important data. Too big? Look to innovative technology and deliver more from unstructured data alongside some heavily used data or existing processes. Too disruptive? Implement a stand-alone solution that can integrate and progressively introduce the data rather than making wholesale change.
And after all that, if you're still looking for a bed to curl up in... grab a blanket, or give us a call and let's talk.
The Regulatory Oversight Committee (ROC) of the Legal Entity Identifier (LEI) is calling for final comments on a second consultation document on fund relationships in the Global LEI System (GLEIS) by Monday January 14, 2019. The consultation follows an initial review of fund relationships towards the end of 2017 and is aimed at making sure the implementation of relationship data is consistent across the GLEIS and facilitating standardised collection of fund relationship information at a global level.
The question of how fund relationships should be represented in the GLEIS has been open since the introduction of the LEI over five years ago. The second consultation document details proposals for a limited update to the way fund relationships are recorded in the GLEIS and suggests the aims of consistent relationship data across the GLEIS and standardised collection of fund relationship data can be achieved by providing a definition for each fund relationship and better aligning the data structure with the structure for direct and ultimate accounting parent entities defined in a ROC report published in March 2016.
Responses to a questionnaire annexed to the second consultation document must be made by January 14, 2019 and will inform the final version of the policy framework for fund relationships that the ROC will approve for implementation by the Global LEI Foundation (GLEIF). Implementation will not take place before January 2020.
Are you struggling with client onboarding and lifecycle management? Investing data management resources in onboarding and not achieving a commensurate return on investment? Worried about customer loyalty and competition?
Welcome to the fourth edition of A-Team Group’s Entity Data Management Handbook sponsored by entity data specialist Bureau van Dijk, a Moody’s Analytics company.
DTCC has partnered with Xceptor to support regulatory reporting under the Securities Financing Transactions Regulation (SFTR). SFTR reporting obligations are expected to come into play in late 2019 or early 2020.
The partnership allows DTCC clients to use Xceptor’s artificial intelligence (AI) automation software within the DTCC global trade repository to transform data in support of compliance with SFTR. This should reduce the operational burden of reporting by allowing firms to enrich, normalise and validate data before submitting it to a trade repository. Firms will be able to enrich reporting with both internal and external reference data, manage exceptions using native workflows, and benefit from real-time gap analysis and testing.
Andrew Kouloumbrides, CEO at Xceptor, explains: “SFTR places importance on data quality, and with low reporting tolerances, the self-service capabilities of the DTCC service, powered by Xceptor, will enable firms to get their data into shape while also being able to use existing trade file formats. End-of-day books and records reconciliation is also native to the service, ensuring clients have full transparency of their submissions.”
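The enrich-normalise-validate flow described above can be sketched in a few lines. The field names, sample LEI and reference data below are hypothetical illustrations, not the actual DTCC/Xceptor schema:

```python
# Illustrative pre-submission checks; field names are hypothetical,
# not the real SFTR message schema.
REQUIRED_FIELDS = {"uti", "reporting_counterparty_lei", "other_counterparty_lei",
                   "trade_date", "collateral_isin"}

# Internal reference data used for enrichment (sample value)
REFERENCE_DATA = {"ACME": "5493001KJTIIGC8Y1R12"}

def enrich(record: dict) -> dict:
    """Fill in the counterparty LEI from internal reference data if missing."""
    if not record.get("other_counterparty_lei"):
        name = record.get("other_counterparty_name", "")
        record["other_counterparty_lei"] = REFERENCE_DATA.get(name)
    return record

def validate(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means ready to submit."""
    errors = [f"missing {f}" for f in sorted(REQUIRED_FIELDS) if not record.get(f)]
    lei = record.get("reporting_counterparty_lei", "")
    if lei and len(lei) != 20:
        errors.append("reporting_counterparty_lei must be 20 characters")
    return errors

record = {"uti": "UTI-001", "reporting_counterparty_lei": "5493001KJTIIGC8Y1R12",
          "other_counterparty_name": "ACME", "trade_date": "2019-01-10",
          "collateral_isin": "GB00B03MLX29"}
print(validate(enrich(record)))  # → []
```

Catching gaps like these before submission is what reduces the rejection and remediation burden at the trade repository.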
As well as working with Xceptor to streamline SFTR reporting processes, DTCC will explore other opportunities to use the software provider’s capabilities across the DTCC platform.
genesis, an international capital markets software firm, and OpenFin have partnered to provide clients with scalable and interoperable technology and desktop solutions that can be built and deployed at speed. The partnership is designed to drive front to back office digital transformation projects by bringing together genesis’ agile software development and OpenFin’s application interoperability across financial desktops.
genesis provides a microservices technology framework that enables fast and agile software development by breaking down problems into small components of functionality, while ensuring that data used is consistent in real-time. OpenFin is an operating system designed for capital markets and used by banks, asset managers, hedge funds and brokers to power digital transformation strategies by securely deploying interoperable financial applications directly onto permissioned desktops.
By partnering with OpenFin, genesis can build and deploy its suite of products and solutions as desktop applications directly onto OpenFin’s operating system. Clients benefit from the interoperability and connectivity provided by OpenFin, as it allows genesis’ solutions to share information, context and intent with other applications on the end user’s desktop. At the same time, the OpenFin Creative Studio tool can help potential users reduce the time to build proofs of concept to hours and days, rather than weeks and months, accelerating application deployment and time to market. genesis has already implemented a solution for clients supporting treasury, broker dealer and wealth management requirements on the OpenFin operating system.
Stephen Murphy, CEO at genesis, says: “The combination of genesis’ microservices framework and OpenFin’s operating system allows us to provide highly scalable solutions on both the server and desktop. This gives clients an alternative to legacy in-house or vendor technologies and provides technology that is open and interoperable.”
US institutions are paying more than $25 billion a year to comply with financial crime requirements. A survey by LexisNexis Risk Solutions, based on responses from over 150 decision-makers at banks, investment, asset management and insurance firms, suggests smaller firms are hit hardest relative to their bottom lines, with the cost of AML compliance reaching up to 0.83% of total assets. Larger firms typically see costs of up to 0.08% of total assets.
Daniel Wager, vice president of global financial crime compliance at LexisNexis Risk Solutions, says: “As compliance costs rise, mid- to large-sized firms are using a wider array of newer technologies and data sources to prevent financial crime. While these firms report a higher average compliance spend per year ($18.9 million), they are actually lowering the cost of compliance. The overarching goal is to achieve compliance with greater efficiency and with less human capital.”
The executives surveyed reported that regulatory reporting, customer risk profiling and sanctions screening are among the key challenges for US financial firms. Operational inefficiencies pose significant challenges at firms that use less technology. Financial institutions are now seeking to leverage AML compliance processes to better understand and manage customer relationships and improve financial risk management.
The survey report suggests that implementing a layered approach to AML compliance technology may not only be necessary, but crucial, to improving compliance processes. Firms that use layered solutions, including multiple services like cloud-based KYC procedures, shared interbank databases and machine learning and artificial intelligence (AI), take significantly less time to complete due diligence than those using just one of these technologies.
The report concludes: “Many firms are still relying on manual efforts with their AML compliance technology, which is not optimal for either performance or cost-effectiveness.”
The Global Legal Entity Identifier Foundation (GLEIF) continues to build out services around the LEI with the introduction of a beta version 2.0 of the LEI search tool. Version 2.0 allows users to explore information on more than 1.3 million organisations contained in the public LEI data pool and provides enhanced functionality including the option to identify corporate ownership structures or pinpoint other identifiers that have been mapped to an LEI.
Key additions to version 2.0 of the LEI search tool include: a user interface supporting quick and customised research; easily identifiable ownership information; other identifiers mapped to the LEI showing automatically with search results. The tool also offers an expert mode that allows users to configure and combine their own search filters to facilitate the design of complex queries. For example, identifying all LEIs registered within a defined timeline and with a legal name containing the term ‘bank’ and owning companies in a specific country.
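For readers who prefer an API to the search tool, GLEIF also exposes the LEI data pool via a public REST endpoint. A minimal sketch of composing a filtered query is below; the filter parameter names follow the GLEIF API’s documented convention but should be checked against the current API reference:

```python
from urllib.parse import urlencode

BASE = "https://api.gleif.org/api/v1/lei-records"

def build_query(legal_name: str, country: str, page_size: int = 10) -> str:
    """Compose an LEI search query combining filters, in the spirit of the
    search tool's expert mode. Filter names follow the public GLEIF API
    convention but should be verified against the current documentation."""
    params = {
        "filter[entity.legalName]": legal_name,
        "filter[entity.legalAddress.country]": country,
        "page[size]": page_size,
    }
    return f"{BASE}?{urlencode(params)}"

# e.g. all LEI records with 'bank' in the legal name, registered in Germany
print(build_query("bank", "DE"))
```

Combining filters this way mirrors the complex queries the expert mode supports, such as restricting results by registration date and ownership country.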
Stephan Wolf, CEO at the GLEIF, says: “To ensure the tool continues to evolve in line with market needs, the GLEIF invites comments on the beta version 2.0 of the LEI search tool by June 30, 2019.”
The Derivatives Service Bureau (DSB) has published 2019 user fees for the provision of ISINs for OTC derivatives. Power and standard user fees rise 4.5% to €117,500 and €39,200 respectively, while infrequent users, essentially low volume or ad hoc users of the service, will see their fees unchanged at €3,000 in 2019.
The service works on a cost recovery basis, with changes in the numbers of various types of users affecting 2019 fees. Based on 2018 figures, banks pay the largest percentage of the user fees at 56%, with trading venues paying 31%, and other categories, including buy-side firms and data vendors, paying 13%. As well as serving paying users, the DSB provides OTC ISIN data free of cost to registered users, the largest category, which makes up two-thirds of the DSB’s user base.
MUFG Europe, the European subsidiary of Japan’s Mitsubishi UFJ Group, has selected Wolters Kluwer’s OneSumX software to manage a number of regulatory and risk reporting obligations, including AnaCredit and Interest Rate Risk in The Banking Book (IRRBB), for its Dutch and German business lines.
The bank is an existing user of OneSumX, which uses a single source of data to ensure consistency, reconciliation and accuracy across regulatory reporting and includes Wolters Kluwer’s Regulatory Update Service that provides active monitoring for regulatory change in around 30 countries.
MUFG will also use Wolters Kluwer’s OneSumX AnaCredit solution, an integrated and scalable solution that uses grid and in-memory computing to handle the large data volumes required for AnaCredit, a European Central Bank project to set up a dataset containing detailed information on individual bank loans in the euro area.
Sander van der Laan, executive director and head of MUFG Europe’s risk management division, says: “New requirements under AnaCredit and IRRBB are changing the frequency on which we report. As an existing customer of Wolters Kluwer, it was a natural choice to extend our use of OneSumX.”
A-Team Group has announced the winners of its 2018 Data Management Awards. The annual awards, now in their sixth year, are designed to recognise leading providers of data management solutions, services and consultancy to capital markets participants. See the full list of winners below.
The awards were hosted by Sarah Underwood, Editor of A-Team’s Data Management Review, and presented by guest speaker John Sergeant, a renowned political journalist and broadcaster, after a celebratory lunch attended by over 100 guests at Merchant Taylors Hall in the City of London.
Award categories ranged from best sell-side and buy-side enterprise data management platforms to managed services, corporate actions, risk data aggregation, entity data, quality analysis, Know Your Client (KYC) and client onboarding, data governance, data lineage and regulatory data management. Award categories for data providers covered sell-side and buy-side reference data, corporate actions data and index data.
As well as solution categories, three editor’s recognition awards were presented. Andrew Barnett, chief data officer at Legal & General IM, was presented with the award for best data management practitioner; Peter Moss, CEO at the SmartStream RDU, with the award for best data management vendor professional; and ASG Technologies with the award for innovation in data management.
Underwood said: “Congratulations to the winners of this year’s A-Team Group Data Management Awards and thank you to everyone who voted. We had a large number of nominations and votes across all categories reflecting the criticality of excellent data services and data management solutions in capital markets.”
A-Team Group 2018 Data Management Awards
Best Sell-Side Enterprise Data Management Platform – Asset Control
Best Buy-Side Enterprise Data Management Platform – NeoXam, DataHub
Best Sell-Side EDM Managed Services Solution – SmartStream RDU
Best Buy-Side EDM Managed Services Solution – State Street Corporation, DataGX
Best Corporate Actions Solution – AIM Software, GAIN Corporate Actions
Best Risk Data Aggregation Platform – Moody’s Analytics
Best Reference Data Provider to the Sell-Side – SIX
Best Reference Data Provider to the Buy-Side – Refinitiv
Best Entity Data Solution – Bureau van Dijk, Orbis
Best Corporate Actions Data Provider – Exchange Data International
Best Index Data Provider – RIMES Technologies
Best Graph Database Provider – Neo4j
Best Smart Lake Provider – Cambridge Semantics
Best Data Discovery Provider – Global IDS
Best Data Preparation Provider – Trifacta
Best Data Quality Analysis Tool – Solidatus
Best KYC & Client On-Boarding Solution – Fenergo, Client Lifecycle Management
Best Data Governance Solution – Collibra
Best Data Lineage Solution – MarkLogic, Operational and Transaction Enterprise NoSQL Solution
Best Regulatory Data Management Platform – SteelEye
Best Big Data Analytics Solutions Provider – SAS
Best Proposition for AI, Machine Learning and Data Science – IBM Watson
Best Consultancy in Data Management – Deloitte
Editor's Recognition Award for Best Data Management Vendor Professional – Peter Moss, CEO at the SmartStream RDU
Editor's Recognition Award for Best Data Management Practitioner – Andrew Barnett, Chief Data Officer at Legal & General IM
Editor's Recognition Award for Innovation – ASG Technologies
Just two months after completion of the deal that saw Blackstone acquire a 55% stake in Thomson Reuters’ financial and risk business, and rebrand it as Refinitiv, Thomson Reuters has put plans in place to reduce costs and streamline its remaining news and information businesses. Some 3,200 jobs will be cut over the next two years, locations will be cut by 30% to 133 by 2020, and the company will reduce the number of products it sells.
Setting out its strategy and growth plans, Thomson Reuters said that following the Blackstone deal, about 43% of Thomson Reuters revenues come from its legal business, 23% from corporate clients, and 15% from its tax business. Reuters News accounts for only 6% of revenues, but will remain a key part of the business under Michael Friedenberg, who joined the company early this week as president of news and media operations. Friedenberg was previously a board member and CEO at IDG Communications.
Thomson Reuters CEO Jim Smith outlined the company’s plans during an investor meeting yesterday. He said the company aims to grow annual sales by 3.5% to 4.5% by 2020, cross-sell more products to existing customers and new customers, and cut the number of products it sells.
The company has also set a target to reduce capital expenditure from 10% to between 7% and 8% of revenue in 2020, and has set aside $2 billion of the $17 billion proceeds from the Blackstone deal to make purchases to help grow its legal and tax businesses.
Smith said: “We’re going to simplify the company in every way that we can, working on sales effectiveness and on ways to make it easier both for our customers to do business with us and for our frontline troops to navigate inside the organisation.” On the company’s news service, he added: “We believe we can make Reuters News an even greater part of our growth story going forward.”
Quality assurance provider Kaizen Reporting has branched out to include Markets in Financial Instruments Regulation (MiFIR) reporting in its flagship ReportShield service, which will be led by new hire Chris Machin, the brains behind the London Stock Exchange’s (LSE’s) MiFID II transparency service, TRADEcho.
Kaizen managing director Ian Rennie notes: “With both buy-side and sell-side financial firms turning their attention to the quality of the data published in the real-time reports through Approved Publication Arrangements, trade reporting is an important focus for our clients.”
Under MiFID II, investment firms are required to report on both trades and transactions. Trade reporting provides near real-time trade transparency information to potential investors: financial details of trades are reported to an Approved Publication Arrangement (APA) for dissemination to the market.
In transaction reporting, trade details must be reported by T+1 to an Approved Reporting Mechanism (ARM) that validates the data before sending the reports to regulators. Kaizen’s ReportShield assurance service provides a set of four controls that test the accuracy and completeness of these regulatory reports.
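Kaizen’s four controls are proprietary, but a completeness check of this kind can be illustrated generically: flag any trade with no transaction report filed by T+1. The sketch below uses calendar days for simplicity, whereas the regime counts working days:

```python
from datetime import date, timedelta

def completeness_check(trades: list[dict], reports: list[dict]) -> list[str]:
    """Flag trades with no transaction report filed by T+1.
    Generic sketch only; Kaizen's actual controls are proprietary.
    Uses calendar days, whereas MiFIR T+1 is counted in working days."""
    reported = {r["trade_id"]: date.fromisoformat(r["report_date"]) for r in reports}
    issues = []
    for t in trades:
        traded = date.fromisoformat(t["trade_date"])
        filed = reported.get(t["trade_id"])
        if filed is None:
            issues.append(f"{t['trade_id']}: no report filed")
        elif filed > traded + timedelta(days=1):
            issues.append(f"{t['trade_id']}: reported late ({filed})")
    return issues

# Sample, invented records for illustration
trades = [{"trade_id": "A", "trade_date": "2019-01-07"},
          {"trade_id": "B", "trade_date": "2019-01-07"}]
reports = [{"trade_id": "A", "report_date": "2019-01-08"}]
print(completeness_check(trades, reports))  # → ['B: no report filed']
```

Accuracy controls work the other way around: field-by-field comparison of what was submitted to the ARM against the firm’s own books and records.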
Machin, who as head of client support for Simplitium was instrumental in developing the APA TRADEcho in partnership with the LSE, brings with him to Kaizen a wealth of experience on regulatory regimes including MiFIR, CSDR, SFTR, MAR, EMIR and Dodd-Frank.
Formerly in global banking and with experience at both UBS and Credit Suisse, Machin’s move to Kaizen adds to the company’s growing list of regulatory experts. Earlier this year, the regtech firm hired former regulator and EMIR and MiFIR expert David Nowell, along with the former head of the ICMA Taskforce on SFTR, Jonathan Lee.