“…We engaged Epsilon to be our strategic partner… I am happy to say that Epsilon met or exceeded our expectations on this project. We successfully completed the RFP project and selected a system that was right for our business…”

FHLB
Chief Financial Officer

Hot Topics

Data Management Review

Data Management Review (formerly Reference Data Review) is your single destination for knowledge and resources covering data management approaches, trends and challenges as well as all the regulations impacting financial data management for the enterprise.

Impending regulations, following recent price manipulation investigations, will require trading firms to have new data governance and quality controls in place.

Author: ateamgroup
Posted: September 22, 2017, 10:22 am

With the compliance deadline for Markets in Financial Instruments Directive II (MiFID II) just over two months away, A-Team Group has updated its MiFID II handbook to bring you the latest details on the regulation’s compliance requirements.

Author: ateamgroup
Posted: September 20, 2017, 4:15 pm

Alyne is a Munich-based RegTech company offering a Software-as-a-Service (SaaS) solution designed to help organisations improve risk insight. Its proposition is built on a combination of a curated content library crafted by industry experts and highly usable functionality at an extremely competitive price. Its mission statement: “We make gaining risk insights to your business as easy as browsing your social media.”

To find out more about Alyne’s views on RegTech and its product, we talked to Stefan Sulistyo, co-founder and chief customer officer at the company. You can see Alyne in action during the RegTech showcase at A-Team Group’s RegTech Summit for Capital Markets in London on October 5th.

View the comprehensive agenda here

Q: What does RegTech mean to you?

A: In the tradition of other ‘Techs’, such as FinTech, we understand RegTech as the digitisation of regulatory compliance processes. Digitisation is a buzzword itself, but it helps to frame RegTech in the context of automation. In other industries, there are visions of completely substituting manual processes at some point, or at least of augmenting human capabilities and capacity to rapidly scale and leverage the output of manual processes.

One prominent technology analyst firm lists technologies in something called the Hype Cycle. RegTech, in the form envisioned by non-experts, is rapidly rising towards the so-called ‘peak of inflated expectations’. However, it is worth mentioning that automation in the context of compliance controls is not really new. But as with many concepts in the governance-risk-compliance (GRC) arena, it has long been plagued by incomplete and ivory-tower implementations that neglect the human factor, i.e. the behaviours and daily decisions of people that so often make the difference between being compliant and facing an embarrassing press conference for the board and significant fines.

It should also be noted that RegTech should not just be seen in the context of the financial services industry. All organisations are subject to some form of regulation and the basic management methods and technical and organisational controls for operational risk are usually similar.

Q: What problem does the financial institution have that you believe you can solve?

A: The financial industry faces digital disruption and, at the same time, regulatory compliance is one of the industry’s biggest pain points. You could argue that risk management and compliance are already largely digitalised domains, however it’s worth taking a second look when you consider the following characteristics we’ve encountered in these functions in companies around the world:

· Lots of manual interaction

Many risk management and compliance processes involve multiple spreadsheets, feedback and additions being sent back and forth via emails; spreadsheet outcomes pasted into slide decks; and probably some printouts with a busy executive’s hand-written comments.

· Labour-intensive processes

Compliance reporting and risk management requirements have developed so rapidly that highly regulated organisations like banks have solved an immediate need by hiring more people. Compliance and risk management departments have grown to enormous dimensions.

· Generic and outdated toolsets

While digital tools are used in risk management and compliance, they are often generic tools, such as spreadsheets, or outdated solutions, like many GRC tools currently on the market. Processes are not necessarily streamlined and the quality of the output is highly dependent on the user’s own structuring and content, as little guidance or content is provided.

Q: Why do they have this problem?

A: For big institutions and companies, it is often difficult to integrate new business models into their operations. Instead of finding a well-suited solution, they throw an insane number of people at the problem, and too many cooks always spoil the broth.

Q: What regulations are of primary concern to you and your customers?

A: We are focused on technology and operational risk topics. So relevant regulations come from financial regulators, but also data privacy rules such as General Data Protection Regulation (GDPR) and cyber security specific rule sets, which are a hot topic in all industries at the moment.

Q: How do you solve regulatory compliance problems?

A: We offer a deep, context-rich content library with more than 850 control statements that enable organisations in all industries and of all sizes to mature their cyber security, risk management and compliance. You can bring this to life in our cloud software platform by using our smart workflows and intelligent risk analytics.

Q: What technology do you use?

A: Our SaaS is built on a pure JavaScript tech stack leveraging state-of-the-art frameworks and cloud services. In addition, we just launched Project Citadel, on which we are collaborating with the Technical University of Munich to develop the capability to match regulation, policy documents, questionnaires and more to our Alyne library by using natural language processing techniques. You can sign up for updates if you’re interested.
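
To make this concrete, here is a minimal, illustrative sketch of NLP-based matching: policy paragraphs are scored against a small control library using TF-IDF vectors and cosine similarity. The control statements, paragraph texts and variable names are all invented for the example, and Alyne has not published Project Citadel’s internals, so treat this as one plausible technique rather than a description of its product.

    # Illustrative sketch only: match policy paragraphs to a control
    # library with TF-IDF and cosine similarity. All texts are invented.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    controls = [
        "Access to production systems is restricted to authorised personnel.",
        "Personal data is encrypted at rest and in transit.",
        "Incident response plans are tested at least annually.",
    ]
    policy_paragraphs = [
        "All customer records must be stored using strong encryption.",
        "Only named administrators may log on to live servers.",
    ]

    vectoriser = TfidfVectorizer(stop_words="english")
    # Fit on both corpora so the vocabulary covers controls and policy text.
    matrix = vectoriser.fit_transform(controls + policy_paragraphs)
    control_vecs = matrix[: len(controls)]
    policy_vecs = matrix[len(controls):]

    for i, row in enumerate(cosine_similarity(policy_vecs, control_vecs)):
        best = row.argmax()
        print(f"Paragraph {i} best matches control {best} (score {row[best]:.2f})")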

Q: How do you fit into a financial institution’s architecture and data flows?

A: We require no technical integration to get started with simple use cases. If clients are looking for deeper integration, we can offer various interfaces to existing databases such as user directories, asset libraries, customer management, and contract management.

Q: What other cool RegTech companies have you seen out there?

A: Most of the movers and shakers in the industry have joined the International RegTech Association (IRTA) – https://regtechassociation.org/

Q: Why are you taking part in the A-Team’s RegTech Summit for Capital Markets event?

A: It is always good to be in London for events as the city is one of the main FinTech and RegTech hotspots in the world. There is a high concentration of banks, financial service institutions and a supporting ecosystem.

Q: What type of people are you hoping to meet at the Summit?

A: Potential customers and partners.

Related: 
RegTech Summit for Capital Markets - London, 5th October 2017
RegTech Summit for Capital Markets - New York City, 16th November 2017
Author: ateamgroup
Posted: September 20, 2017, 1:26 pm

Roll up, Roll up! The A-Team Group RegTech Summit for Capital Markets is coming to town, with an event in London on Thursday October 5th and in New York City on Thursday November 16th. The agendas for both summits are packed with keynotes, panels and presentations dedicated to solving the regulatory conundrum, while a RegTech start-up showcase will provide insight into innovative compliance solutions.

The London event will open with a keynote from Sophia Bantanidis, EMEA head of regulatory and market strategy, transaction banking, at Citi, which will cover the relationship between RegTech and the regulatory landscape, examples of RegTech solutions, RegTech trends and more.

View the comprehensive agenda here

Also in the morning session are a user panel that will discuss how to overcome barriers to RegTech and assess which solutions are right for your firm, a keynote interview with Christian Krohn, head of European regulatory reform at Standard Chartered Bank, and keynotes from summit sponsors Thomson Reuters and SmartStream.

RegTech showcase

To give you some insight into RegTech innovation and start-ups in the market, we have put together a RegTech showcase that will demonstrate regulatory solutions from Onfido, Encompass, Alyne and RegTek Solutions. The showcase will be short, sharp and an opportunity for you to influence the roadmap of these solutions providers. Out in the exhibition hall, you will find more RegTech start-ups with technologies designed to ease your regulatory pain points.

MiFID II, MAR, MAD, GDPR and FRTB

The afternoon session splits into two streams, one covering the role of RegTech in reducing the complexity and minimising the costs of Markets in Financial Instruments Directive II (MiFID II), the other focusing on innovative technologies for optimal compliance with the regulatory requirements of Know Your Customer (KYC) and client onboarding, Market Abuse Regulation (MAR), and the Market Abuse Directive (MAD).

Moving back into the main conference room, find out how the Bank of England is standardising data to drive efficiencies and lay the foundations for RegTech solutions from Beju Shah, head of data collection and publication in the technology directorate at the bank.

A deep dive into General Data Protection Regulation (GDPR) and the Fundamental Review of the Trading Book (FRTB) will finish formal proceedings in time for a networking and drinks reception sponsored by Thomson Reuters.

What a day! We hope you can join us and look forward to meeting you.

PS, we’ll fill you in on the details of the New York City RegTech Summit in the next week or so, but in the meantime, check out the early agenda here.

Related: 
RegTech Summit for Capital Markets - London, 5th October 2017
RegTech Summit for Capital Markets - New York City, 16th November 2017
Author: ateamgroup
Posted: September 20, 2017, 11:20 am

Exchange Data International (EDI) has introduced a corporate actions service that it says undercuts data redistribution costs charged by the New York Stock Exchange (NYSE). EDI’s move is a response to a June 2017 NYSE policy that extends redistribution charges for corporate actions data in the equities space.

Jonathan Bloch, founder and CEO of EDI, explains: “Beyond its extremely high fees, NYSE imposed a requirement on its redistributors to provide names of downstream consumers of its data in order to charge an additional levy, should they redistribute the data, making the data more costly. We don’t do that.”

EDI has built up its corporate actions business over the past 20 years and in 2015 decided to source data on companies listed on NYSE independently of the exchange. The company has since built a corporate actions service based on this data and contends that its stance could be a game changer. Bloch says: “It’s about time the corporate actions sector had a competitive environment. We now provide redistribution users with the same quality datasets, for half the cost, without any onerous redistribution rules.”

He says redistributors are worried that other exchanges will follow NYSE’s approach, making corporate actions data expensive at a time when users are beginning to view it as a commodity. As well as containing redistribution costs, EDI is using technology tools, including web monitoring and machine learning, to automate as much corporate actions data processing as possible and deliver accurate and affordable services.

Author: ateamgroup
Posted: September 19, 2017, 8:44 am

Ahead of A-Team Group’s RegTech Summit for Capital Markets in London, we caught up with Matt Smith, CEO at SteelEye, to discuss the company’s RegTech proposition, potential users, and participation in our London event.

View the comprehensive agenda here

Q: What does RegTech mean to you?

A: For me, RegTech is the application of technical innovation to reduce the complexity and cost of meeting regulatory obligations – with the bonus of improved insight from better organisation of your data. As regulatory markets continue to evolve, many financial firms impacted by new regulations are challenged by the system implications. For those lucky enough to have the resources (in terms of both human and financial capital), these burdens can still be demanding. However, for most firms, especially smaller firms, resources are scarce and there are few places to turn for help.

Q: What problem does the financial institution have that you believe you can solve?

A: We believe there is a big opportunity coming out of complying with the regulations that will help regulated firms gain valuable insight into their commercial performance and operations. Under MiFID II and AIFMD, financial firms in Europe are required to store data (both communications and trade) for between five and seven years. SteelEye allows firms using our platform to leverage data not just for record keeping, but also to help meet their obligations for transaction reporting, best execution and trade reconstruction. The resulting treasure trove of consolidated data, if organised properly, can be used to gain insight and facilitate improvement across a firm's operations, with benefits for not just risk and compliance, but also cost and process improvements, and enhanced customer service.

Q: Why do they have this problem?

A: Legacy systems are typically organised like silos and don't permit easy exchange of data that is now demanded by regulators. Firms that have never undertaken an initiative to build a data warehouse are now having to do so, with very little time available to succeed in this effort.

Q: What regulations are of primary concern to you and your customers?

A: Although the main focus of SteelEye today is to help firms address MiFID II and AIFMD, our flexible, innovative technology can cope with any regulation that requires financial firms to record data or report transactions. Our platform can also support other regulations, including Dodd-Frank, REMIT and EMIR.

Q: How do you solve regulatory compliance problems?

A: SteelEye brings together communications data – voice, mobile, email, instant message, Bloomberg, Reuters etc – and trade data from any order management platform, execution system or alternative sources such as spreadsheets. Our advanced data analytics and case management system allow firms to bring this data together, interrogate it and make sense of otherwise disparate information.
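
As a rough illustration of what bringing this data together can involve, the sketch below aligns each trade with the most recent prior communication from the same trader, a basic building block of trade reconstruction. The column names, sample records and five-minute window are invented for the example; SteelEye’s actual data model is not public in this detail.

    # Rough sketch: align each trade with the latest prior communication
    # from the same trader, a building block of trade reconstruction.
    # Column names, records and the tolerance window are invented.
    import pandas as pd

    trades = pd.DataFrame({
        "ts": pd.to_datetime(["2017-09-01 10:02", "2017-09-01 11:30"]),
        "trader": ["alice", "bob"],
        "instrument": ["XS0000000001", "GB0000000002"],
    })
    comms = pd.DataFrame({
        "ts": pd.to_datetime(["2017-09-01 09:58", "2017-09-01 11:29"]),
        "trader": ["alice", "bob"],
        "channel": ["email", "voice"],
    })

    # merge_asof needs both frames sorted by the time key.
    reconstruction = pd.merge_asof(
        trades.sort_values("ts"),
        comms.sort_values("ts"),
        on="ts",
        by="trader",
        tolerance=pd.Timedelta("5min"),
        direction="backward",  # latest communication at or before the trade
    )
    print(reconstruction)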

Q: What technology do you use?

A: Our platform is fully cloud-based, enabling us to reduce costs for our clients and apply the most advanced security capabilities, ensuring our clients’ data is stored cost-effectively, fully encrypted and secure.

Q: How do you fit into a financial institution's architecture and data flows?

A: We are an aggregator. Client source systems feed information into the SteelEye platform. What makes us special is that after the data is stored securely for regulatory and compliance purposes, our clients can work with their information using our fully open API framework. This means not only can our clients access and control their data, but they can also build applications and models using their data in our cloud data platform.

Q: What other cool RegTech companies have you seen out there?

A: I am very excited about the AlphaExchange (https://www.alpha-exchange.com). The team there is tackling the complicated challenge of research management and unbundling with a view to revolutionising research with the concept of an open exchange for firms to trade research in a transparent way.

Q: Why are you taking part in the A-Team’s RegTech Summit for Capital Markets event?

A: The market is looking for innovators that are able to take out cost and complexity from the increasing overhead of meeting the regulatory demands imposed by governments around the world. We feel this is an excellent venue for newcomers to the market to present new thinking to firms in need of affordable solutions.

Q: What type of people are you hoping to meet at the Summit?

A: We are looking forward to meeting financial firms that are struggling with endless, complicated and conflicting regulations, and are seeking cost effective answers!

Related: 
RegTech Summit for Capital Markets - London, 5th October 2017
RegTech Summit for Capital Markets - New York City, 16th November 2017
Author: ateamgroup
Posted: September 14, 2017, 10:00 am

General Data Protection Regulation (GDPR) will curb the digital wild west and improve protection of personal data, but the cost of implementation will be high and now is the time to get started to meet the compliance deadline of May 25, 2018.

Taking part in an A-Team Group webinar on the regulation, subject matter experts outlined the challenges of GDPR as well as approaches to best practice implementation. The webinar was moderated by A-Team editor Sarah Underwood and joined by Sue Geuens, president at DAMA and an independent consultant in financial services; Chiara Rustici, an independent GDPR analyst; Abigail Dubiniecki, a GDPR specialist at My Inhouse Lawyer; and Tudor Borlea, sales engineer and GDPR specialist at Collibra. Rustici will moderate a panel on the regulation – GDPR: A game changer – are you ready? – at A-Team’s RegTech Summit for Capital Markets in London on October 5, 2017.

Webinar Recording: General Data Protection Regulation – Where are we now?

An early poll of the webinar audience showed most firms in the early days of working towards compliance, with 34% of respondents saying they were in the planning phase, 28% close to implementing a solution and 16% yet to start preparation. Some 15% said they are implementing a solution and just 5% are completely prepared with a solution in place.

Featured Download: Poll results on General Data Protection Regulation – Where are we now? from our recent webinar audience

The message from the webinar speakers was clear. With fewer than 200 working days before the compliance deadline, senior management needs to accept that there is no avoiding GDPR, and act now.

Dubiniecki said that at a basic level firms need to respond to the regulation’s requirements by considering whether they hold data legally, have informed data subjects of how they will use their data and of their rights, and can ensure access rights to personal data. She added: “GDPR ends the digital wild west, but the challenges of understanding what data is held and how it is used are considerable. The need is to prioritise data, identify higher risk areas and plan for them first.”

Borlea expanded on this, explaining a compliance process that starts with establishing a dedicated, multi-departmental team with management buy-in and goes on to identify the information structure within a firm and how it needs to be adapted to support GDPR. Data can then be collected and assessed, with priority given to high risk data. A gap analysis and a mitigation plan are also needed. With these elements in place, GDPR becomes part of core operations processes and can be monitored and reported on.

Geuens noted the imperative to implement data governance in line with GDPR – a second poll showed firms working towards this – and advised firms not to panic, but to start now. Advice from other speakers included: start with an understanding of your data landscape, stop hoarding data, and collaborate across the enterprise.

Listen to the webinar to find out more about:

  • GDPR requirements
  • Data management challenges
  • Best practice approaches
  • Technology solutions
Related: 
RegTech Summit for Capital Markets - London, 5th October 2017
RegTech Summit for Capital Markets - New York City, 16th November 2017
Author: ateamgroup
Posted: September 13, 2017, 10:28 am

Fenergo is building out its infrastructure with the appointment of two sales leads, 100 new recruits, office extensions and a customer success function dedicated to nurturing customer experience. The expansion is a response to a growth spurt at the RegTech provider, which expects to generate revenue of over €50 million and break even in the fiscal year to the end of March 2018.

The new sales positions are held by Michele Shepard, who joins as chief revenue officer (CRO), and Greg Watson, who takes the role of managing director of sales and strategy. Shepard, formerly senior vice president of sales and marketing, and CRO at Vertafore, a technology provider to the insurance industry that was sold to private equity for $2.7 billion, takes responsibility for all revenue generation and strategy.

Watson joins from HSBC, where he was most recently managing director and global head of the client management group for the global banking and markets division. In this role, he implemented Fenergo software to support client onboarding across 35 countries. At Fenergo, Watson will use his subject matter expertise to help customers implementing or running Fenergo solutions.

Shepard says: “There are 62 banks live on the Fenergo client lifecycle management platform. We need to maintain these bank relationships as we add more.” She expects another seven to 10 banks to go live before the end of the fiscal year and notes opportunities to extend existing relationships with global banks across additional lines of business, all of which will be supported by a customer success function dedicated to the client experience.

The addition of 100 jobs follows the addition of 200 jobs in April 2017 and will bring employee numbers to around the 600 mark by the end of 2018. The new jobs span the company’s functions from product and technical development to professional services, sales and marketing. The company has also added an office in Toronto, Canada to meet regional demand and plans to expand its New York office, complementing a Boston operation with close to 50 employees.

Shepard says business is being driven by regulation, particularly Know Your Customer (KYC) and Anti Money Laundering (AML), but adds: “We are starting to see more of our bank customers running initiatives that consider their customers’ lifecycle and focus on improving services. Ultimately, this reduces costs and improves the customer experience.”

 

· Find out more about the potential of RegTech at A-Team Group’s upcoming RegTech Summits for Capital Markets in London and New York City

Author: ateamgroup
Posted: September 7, 2017, 3:45 pm

By: Phillip Lynch, Head of Markets, Products & Strategy, SIX Financial Information

Keeping up with the latest popular television shows was once a relatively simple proposition. With only a handful of major channels vying for your attention, all you had to do was set aside an hour or two each week and you’d have everything you needed to participate in Monday’s water cooler conversation with your coworkers. Today, we’re living in a so-called ‘golden age’, with dozens of critically acclaimed, must-watch shows available online and on demand. For TV fanatics, it’s a great time to be alive. But for those of us with more pressing things to attend to than sitting on our couches and watching TV, we often find ourselves wondering: “How in the world am I supposed to keep up with all of this?”

It’s a question that compliance professionals can relate to, especially when it comes to sanctions. As the global geopolitical landscape continues to grow more complicated, there is a growing array of trade restrictions, the targets of which are in a near-constant state of flux. In just the past few weeks, alterations have been explored or enacted to sanction programmes targeting entities in North Korea, China, Russia, Iran and Venezuela. The changes show no sign of letting up any time soon.

Identifying sanctioned entities and all of their issued securities, as well as detecting domestic or foreign subsidiaries (or other holdings of more than 50%), is a challenging task. Even trickier is deciphering the complex web of beneficial ownership rules and relationships. Corporate actions may affect the sanctioned securities, and an investment in structured products may increase the risk as well. Determining whether sanctioned individuals have beneficial ownership is a critical step in putting together firms’ lists of ‘do not trade’ securities, adding to the compliance challenge. On top of this, the information must be fed into the enterprise data management system, rules have to be programmed, and all of the data needs to be kept constantly up to date.
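
The 50% figure echoes the aggregation logic of OFAC’s so-called 50 Percent Rule, under which an entity owned 50% or more in aggregate by blocked persons is itself treated as blocked. A deliberately simplified sketch of that aggregation, with invented names and stakes, might look like this; real screening must also trace indirect ownership chains and use authoritative sanctions list feeds.

    # Simplified sketch of 50%-rule screening: flag an entity when
    # sanctioned parties' aggregate direct ownership reaches 50%.
    # Names and stakes are invented; indirect chains are ignored here.
    sanctioned = {"Blocked Holdco A", "Blocked Person B"}

    # (owner, owned entity, percentage) - direct stakes only.
    ownership = [
        ("Blocked Holdco A", "Target Corp", 30.0),
        ("Blocked Person B", "Target Corp", 25.0),
        ("Clean Fund C", "Target Corp", 45.0),
        ("Clean Fund C", "Other Corp", 80.0),
    ]

    def flag_entities(ownership, sanctioned, threshold=50.0):
        totals = {}
        for owner, owned, pct in ownership:
            if owner in sanctioned:
                totals[owned] = totals.get(owned, 0.0) + pct
        return {entity for entity, pct in totals.items() if pct >= threshold}

    print(flag_entities(ownership, sanctioned))  # {'Target Corp'}: 30 + 25 = 55%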

How can financial institutions keep track of all these tasks while keeping their operations running smoothly? Those charged with protecting their firms against penalties can understandably feel like they’re constantly struggling to keep pace.

Without a well thought out strategy, financial institutions run a daily risk of not just getting fined, but of receiving serious reputational damage as well. According to figures published on its website, the US Treasury Department’s Office of Foreign Assets Control (OFAC) has already doled out more than $100,000,000 worth of penalties this year. Firms can’t simply cross their fingers and hope not to be next. While some may be able to afford the financial hit, the public relations damage can endure for years.

Compliance departments are going to need to innovate if they want to keep up with the constant stream of new restrictions. Putting together the comprehensive, accurate, and up-to-date lists of sanctioned securities needed to avoid penalties is an undertaking that challenges even the largest and most advanced organisations.

There are compliance departments that think they can save themselves from penalties simply by being cautious. If there’s uncertainty about the beneficial ownership of an asset, they simply won’t trade it. But this approach, while allowing firms to evade the ire of enforcement agencies, ultimately causes damage. If firms compensate for their lack of comprehensive information about sanctioned entities by passing up on perfectly good trades, they are bound to leave money on the table and put themselves at a significant competitive disadvantage.

Rather than fruitlessly struggling to compile all the necessary information by themselves, compliance departments can look to new technology. So-called RegTech offers new possibilities for firms to automatically receive up-to-date and comprehensive daily lists of securities and entities to steer clear of in trades. These tools enable compliance staff to focus on more urgent and productive tasks, and free traders to operate confidently and efficiently.

Firms can create fully automated systems that sift through the huge quantities of necessary data, automatically alerting traders when they are in danger of violating the law and providing them with information about which sanctions a trade violates and how an instrument is attached to a sanctioned entity, saving them from the need for time-consuming research.

These innovations don’t merely enable firms to keep up with the task of managing sanctions data. They make it possible to operate more safely and efficiently than was possible even when sanctions regimes were relatively simple. Trades can go through without hiccups and firms can have faith in their compliance without having to manually check each transaction. Most importantly, once compliance staff have been freed from the constant research that sanctions once necessitated, they will be able to devote themselves to protecting their firms against more complex and serious risks. They may even have some time left over to watch TV.

Author: ateamgroup
Posted: September 5, 2017, 4:22 pm

The compliance requirements of Markets in Financial Instruments Directive II (MiFID II) are complex and far reaching, calling on firms within the scope of the regulation to dig deep before deadline day on January 3, 2018. While timing is tight, some of the challenges presented by MiFID II can be addressed using a strong, yet flexible, security master providing complete and high-quality entity, instrument and listings data. This webinar will consider the importance of a strong security master to regulatory compliance at large, and focus on how such a solution can support the investor protection and transparency aspects of MiFID II.

Webinar Date: 
Tuesday, November 7, 2017 - 15:00
Author: ateamgroup
Posted: August 31, 2017, 10:19 am

SIX Financial Information is standing ready to help firms meet the data and document exchange requirements of Markets in Financial Instruments Directive II (MiFID II) and Packaged Retail and Insurance-based Investment Products (PRIIPs) – both of which have early January 2018 compliance deadlines.

The company released its Regulatory Document Hub (RegHub) a month or so ago and is now onboarding product manufacturers and customers, and testing the solution in readiness for MiFID II and PRIIPs compliance. It is also working on adding data and document exchange for FIDLEG, the Swiss Financial Services Act that mirrors MiFID II requirements and is expected to come into force some time in 2018.

RegHub was initially developed to support PRIIPs Key Information Documents (KIDs) in line with a compliance deadline of January 2017, which was pushed back by the European Commission to January 2018 after a failure to finalise Regulatory Technical Standards. In consultation with industry, it was then extended to support MiFID II investor protection requirements such as the need for product manufacturers to define target markets for their products and understand from wealth managers who the products have been sold to.

Philip Lynch, head of markets, products and strategy at SIX Financial Information, explains: “Our owning banks said they didn’t have the facility to support such structured requirements, so we collaborated to build the RegHub platform.” The hub is populated by about 500 product manufacturers, although this number is expected to rise to nearer 1,200 by the end of the year, and allows manufacturers to distribute documents in real time to permissioned asset managers and wealth managers through application programming interfaces (APIs).

In response to MiFID II requirements, the hub is connected to the ANNA Derivatives Service Bureau, allowing OTC product manufacturers to obtain ISIN identifiers for OTC products. On a broader scale, the platform acts as a central source for investor protection data, allowing product data, information and documents to be structured and exchanged in real time and in a standardised way, in the right format for each regulation. Information and documents can also be requested on demand at the point of sale.

Lynch comments: “The exchange of information in real time has never before been a requirement, but PRIIPS and MiFID II make it a ‘must do’. RegHub, with its ready to consume data, will help banks automate and scale their operations, reducing costs and increasing the speed of compliance.”

Related: 
RegTech Summit for Capital Markets - London, 5th October 2017
RegTech Summit for Capital Markets - New York City, 16th November 2017
Author: ateamgroup
Posted: August 30, 2017, 11:37 am

John Randles has left the role of CEO at Bloomberg PolarLake and is on gardening leave until the end of September, after which he will become CEO of a technology start-up. Randles has been replaced by Warren Buckley, chief technology officer (CTO) at Bloomberg PolarLake and founder of PolarLake in 2002.

The switch took place last week, with Randles saying he enjoyed his time at Bloomberg, but is not yet free to name or describe the start-up he is moving to. Randles joined PolarLake as CEO in May 2006 and went on to work with the company for six years as it was transformed from a provider of an industry-agnostic enterprise service bus into a provider of a data management platform designed for the securities industry. Innovation, such as early adoption of semantic technology, supported the company’s acquisition by Bloomberg in 2012.

Buckley steps up to the post of CEO at Bloomberg PolarLake having founded PolarLake in 2002 and taken the role of CTO, a role he maintained after Bloomberg’s acquisition of the company and is only now relinquishing. His early career includes three years as founder and CTO of XIAM, a provider of wireless content targeting solutions that went on to be acquired by Qualcomm, and five years as a systems architect at Bank of Ireland Group Treasury.

Randles started his career with a seven-year stint at multi-channel banking specialist Eontec before it was acquired by Siebel Systems, where he became CTO of the company’s banking business.

Author: ateamgroup
Posted: August 29, 2017, 2:47 pm

By: Martijn Groot, Vice President of Product Strategy, Asset Control

IT rationalisation has become a major focus for financial services firms over the past couple of years – from Deutsche Bank’s Strategy 2020, which includes modernising outdated and fragmented IT architecture, to HSBC’s Simplify the Bank plan, which includes an architecture-led strategy to halve the number of applications across the whole group over a 10-year period.

This emphasis on streamlining complex infrastructure is being driven by the competitive and regulatory landscape. It has become very clear over the past decade that continuing with line of business data silos has become a significant risk, given not only the cost of regulatory compliance with its demands for cross-sectional reporting, but also the implications for speed of business change.

As a result, a key part of this rationalisation process has been investment in APIs (application programming interfaces) to enable interoperability between applications and, hopefully, eradicate duplication. However, while many organisations have appointed data stewards with a remit to determine data and application requirements across specific business functions, the siloed mentality remains due to a lack of data governance maturity. From cost reduction to business agility, the success of any application rationalisation or data supply chain improvement project will require significantly improved models for data governance.

Demand for Openness

At the same time, the business focus is turning increasingly outward, as organisations recognise the importance of the new financial ecosystem. IT is tasked not only with rationalisation but also with moving away from individual process automation towards automating an end-to-end supply chain involving different service providers.

With a need to expose data to new fintech partners, as well as customers, many banks are putting in place their own API marketplaces through which they expose their data to selected third parties. While such changes in the retail market are being driven in the EU by the revised Payment Services Directive (PSD2), corporate products in cash, foreign exchange, liquidity and finance data will also demand new APIs.

Given this demand for openness both internally and externally, a common, cross-application taxonomy of products and services and a uniform data dictionary are clearly important. But this model has to go further. Creating a common data model is a great start, but business users have to be empowered to explore and exploit this consistent information resource, not only to meet regulatory demands, but also to support business change.

Opening up a single, consistent data source to business users via standardised, self-service technologies – such as the Representational State Transfer (REST) API – is transformative. A simple browser-based interface that enables business users to select required data on demand, with the addition of formatting and frequency tools, effectively opens up the data asset to drive new value. Data can be accessed, integrated into other systems and/or explored via standard data discovery tools – all without reliance on IT.
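
As a simplified illustration of the self-service pattern described above, the snippet below pulls a slice of reference data from a REST endpoint with a plain HTTP call. The endpoint, parameters and token are hypothetical, invented for the example, and are not Asset Control’s actual API.

    # Hypothetical self-service REST call: fetch a slice of reference data
    # as JSON. The endpoint, parameters and token are all invented.
    import requests

    BASE_URL = "https://datahub.example.com/api/v1"  # hypothetical service

    response = requests.get(
        f"{BASE_URL}/instruments",
        params={
            "fields": "isin,issuer,currency",  # only the columns needed
            "asof": "2017-08-29",              # point-in-time view
        },
        headers={"Authorization": "Bearer <token>"},  # permissions enforced server-side
        timeout=30,
    )
    response.raise_for_status()
    for instrument in response.json()["items"]:
        print(instrument["isin"], instrument["currency"])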

Maintaining Control

Obviously, this model has to be controlled – from avoiding a data deluge to ensuring confidentiality is maintained, the data cannot be left open to everyone. The ability to manage permissions for service providers, internal users and customers is essential if the organisation is to ensure compliance with data privacy laws, adherence to content licence agreements and protection of commercially sensitive information. A REST API should therefore include the ability to control access to specific data, avoiding exposure to users who are not permitted to see it due to licence constraints or data sensitivity.

With the right security measures in place, information that would have taken business users weeks to access while waiting for IT can now be discovered and reported on in days. Given the increasing need for reports – both regulatory and data discovery to support business change – this self-service access to trusted, standardised data is key.

Conclusion

The regulatory reporting requirements that have evolved over the past decade may have put the spotlight on the endemic, silo-based infrastructure model, but it has also become very clear to the financial services industry that if operational costs are to be reduced, IT rationalisation is an imperative. At the same time, an integrated financial ecosystem is becoming vital in both retail and corporate markets. Without a mature data governance model that leverages new enablers, including APIs and standard data dictionaries, organisations will struggle to realise both rationalisation and extension goals.

To realise the vision of agile, simplified financial services business models that are competitive in new digital markets, organisations need to not only create a centralised data source, but also explore new standardised technologies to mobilise data and empower users throughout the business and beyond.

Author: ateamgroup
Posted: August 21, 2017, 3:16 pm

After an eight-year career as a senior data management executive at HSBC, Peter Serenita has moved on to join Toronto-headquartered Scotiabank as US chief data officer. He will continue to be based in the New York City area, where, before joining HSBC, he spent 17 years in technology and data management roles at JP Morgan and, later, JPMorgan Chase.

Serenita joined HSBC as global head of data management – Global Banking and Markets Client Onboarding and Account Maintenance – in July 2009. In this role, he implemented a customer data system integrated with the bank’s customer onboarding, trading account opening and trading documentation systems. The implementation supported Global Banking & Markets’ customer programme and HSBC’s compliance with Dodd-Frank requirements for Legal Entity Identifiers (LEIs) to identify derivatives counterparties.

Serenita moved on to become chief data officer of Global Banking and Markets at HSBC in August 2012, a position he held until March 2013, when he became group chief data officer, his final post before leaving the bank to join Scotiabank.

Among his achievements as group chief data officer at HSBC, Serenita implemented a global data organisation covering all businesses and functions across 74 locations. He established data and analytics as a board level priority and defined HSBC data strategy and execution of the strategy to deliver business value. He also defined and delivered data management processes and tools to improve data quality across the organisation. On the regulatory front Serenita was responsible for the bank’s successful BCBS 239 compliance programme.

Author: ateamgroup
Posted: August 17, 2017, 11:16 am

Financial institutions around the world are bracing themselves for the onset of the EU’s General Data Protection Regulation (GDPR), which introduces eye-watering financial penalties for firms failing to meet stringent new rules on managing the personal data of EU residents. GDPR – which comes into effect in May 2018 – will have a major impact on the way financial services firms manage client and prospect information.

Author: ateamgroup
Posted: August 15, 2017, 12:32 pm

MiFID II’s “No LEI, No Trade” requirement mandates that all entities trading with European counterparties, across all asset classes, obtain legal entity identifiers (LEIs).
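
For context, an LEI is a 20-character ISO 17442 code whose final two characters are check digits computed under the ISO 7064 MOD 97-10 scheme (the same family used for IBANs). A minimal validation sketch follows; the 18-character base string is invented for the example, not a real issued LEI.

    # Minimal LEI check per ISO 17442: 20 alphanumeric characters whose
    # ISO 7064 MOD 97-10 checksum is 1 (letters map A=10 .. Z=35).
    ALPHABET = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"

    def lei_check_digits(base18: str) -> str:
        # Append "00", convert letters to digits, pick digits so mod 97 == 1.
        num = int("".join(str(int(c, 36)) for c in base18.upper() + "00"))
        return f"{98 - num % 97:02d}"

    def is_valid_lei(lei: str) -> bool:
        lei = lei.strip().upper()
        if len(lei) != 20 or any(c not in ALPHABET for c in lei):
            return False
        return int("".join(str(int(c, 36)) for c in lei)) % 97 == 1

    base = "5493001KJTIIGC8Y1R"  # invented base, not a real issued LEI
    lei = base + lei_check_digits(base)
    print(lei, is_valid_lei(lei))  # valid by construction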

Webinar Date: 
Tuesday, September 19, 2017 - 15:00
Author: ateamgroup
Posted: August 10, 2017, 1:01 pm

By John Stuart-Clarke, Data Protection & e-Privacy Specialist

John works at a large UK insurer and leads a team of product owners tasked with strengthening the data protection control environment in readiness for the GDPR and the ePrivacy Regulation.

The risks and potential sanctions that GDPR exposes organisations to are eye-wateringly attention-grabbing. They have triggered a variety of responses ranging from the opportunistic repackaging of security products as GDPR Swiss army knives to blind panic at the prospect of a fine that could be big enough to take a company out of business.

In amongst the scare-mongering, confusion and still surprisingly widespread ignorance are the oases of calm we all need to gravitate towards if we’re going to survive until 25th May 2018 and beyond. 

The best path to your personal oasis is one that leads you between these extremes whilst avoiding the road on which GDPR is just another acronym passing you by. You do need to act and, to be successful, you need to Keep Calm and Think.

Hear from John Stuart-Clarke, GDPR specialist, at our RegTech Summit for Capital Markets

Form a Plan

Having a plan means knowing where you are right now, where you want to get to and how you are going to get there.

If you don’t know where you are going, how will you know when you’ve arrived?

In a data protection context, knowing where you are right now involves understanding your state of compliance with currently applicable data protection laws. In my case, chief amongst these are the UK’s Data Protection Act (DPA) and the Privacy and Electronic Communications Regulations (PECR). Don’t be tempted to skip this important first step: if you do, you may miss key deliverables that need to be included within your plan and risk laying poor foundations based on incorrect assumptions.

To know where you want to get to you must create a vision of the future. The best visions are short, exciting and memorable. Your vision may be to fully comply with the law or it may aspire to radical, transformational goals, such as reconstructing your organisation’s relationship with its customers or bringing about a step-change in how your staff think about data.

Envisioning is vital. Even with the most modest of GDPR-related goals, reaching your oasis will involve a long journey and if you don’t keep your eyes fixed on a clearly-defined end-point, you may at some point end up floundering, unsure which way to turn next.

Deciding how you are going to get there requires further analysis. Firstly, understand what’s going to change under GDPR – learn about the new and enhanced rights for data subjects, increased accountability for data controllers and processors, and other key changes. You can do this using the free resources provided by your local data protection authority (the ICO in the UK) and any of the excellent freely-accessible blogs published by legal firms such as DLA Piper, Fieldfisher, Bird & Bird and DAC Beachcroft. If I have missed any other good ones, please mention them in the comments section.

I also suggest you join networking groups, ask lots of questions and read some of the books that cover this subject. Consider joining the International Association of Privacy Professionals (IAPP) who offer a wide range of member-only resources that may prove very useful too.

Now you are ready to identify the changes you need to make within your organisation to enable you to move along your journey. You can do this by comparing where you are now with where you want to get to and identifying the gaps. The gaps represent things you need to change or create. I will call these things products, as you’ll need to produce them. Turn them into a list or draw them as a tree-like structure if you prefer to think of them hierarchically.

Your products will be delivered by work and the work you need to perform will require people with certain skills to help you. It may also require some funding. Trying to size, cost, prioritise and resource this work product by product is a lot easier than trying to comprehend the entire endeavour all at once.

Look from Every Angle

When you’re thinking about products, use a range of different perspectives so that you create the most robust plan possible. I like to think from the perspective of people, organisation, process, information and technology, rather than the more commonly referred-to reduced set of people, process and technology.

The organisation perspective is especially helpful when thinking about GDPR, because controllers, processors and sub-processors are very likely to be organisations.

Understanding the connections between these different types of organisations and the numerous individual elements of GDPR is key to achieving a good level of understanding of what needs to be done.

The people perspective is also vital not least because of the emphasis on accountability and on the need to take measured action proportionate to risk. You may need to make some specialised appointments to meet the new obligations: do you need a Data Protection Officer? What about data stewards and data owners?

Even if it’s not necessary to make new appointments, you will need to think about training and support for your specialist (privacy professionals, data protection champions) and non-specialist colleagues (project teams, IT security professionals, front-line operations teams) who will ultimately determine how well your organisation adjusts to GDPR.

If this effort feels like overkill for your organisation, take comfort from the fact that preparing even the simplest of plans will make you more aware of the challenges you face. A coherent plan helps break the massive problem of GDPR down into much more manageable chunks, which makes it much easier to tackle and a lot less scary to contemplate.

Take Your People on the Journey

GDPR isn’t just a regulatory compliance issue and it’s certainly not a once-and-done project. Embedding GDPR successfully will permanently change the way you do business and your organisation may need to adjust its culture to get these changes to stick.

It’s important that you involve as many of your people in preparing for GDPR as possible so that they get to come on the journey too. The sooner they are engaged, the sooner they will start to think of potential impacts, spot changes that need to be made and identify opportunities that you may never otherwise see.

Use Your Own People to Help You

When I speak at public events, I often describe myself as a business analyst who bends himself into whatever shape is required to enable me to achieve the outcome I am in pursuit of.

Business analysts have massive tool chests, full of goodies such as internal and external environment analysis techniques, requirement elicitation skills, experience of crafting business cases and the drive to doggedly ask “why?” over and over, until the tip of the root of the problem is finally unearthed. Business analysts also know how to map business activities and model data flows, and how to present each of these views of the world at different levels of detail, tailored for specific audiences.

You may not have any business analysts within your organisation but you almost certainly have colleagues who have displayed skills such as those described above. These people can help you along your GDPR journey.

It’s important to recognise that you will also need executive support. High-ranking sponsorship (hands-on ownership is even better) for the vision you have created and air-cover that can be called on when challenges are raining down will help you avoid roadblocks and maintain momentum.

Using the Expertise of Others

Whilst you can achieve a great deal on your own, you may eventually need the help of an external expert or two to complete your journey. Whilst expert risk management, information security or legal advice may be very useful to you, I suggest you do as much as you can with the resources you already have before you bring in external experts. This will help you home in on your true needs before you commit to formal support arrangements and their associated financial consequences.

Does Any of This Sound Unfamiliar?

I would be surprised if much of what I describe sounds entirely alien to you as most of these suggestions are common sense and the rest are drawn from widely prevalent best practice. But I guess that’s the point of this article.

Whilst GDPR is undoubtedly new, data protection law is not and organisational change is an ever-challenging constant. Once the hype of GDPR fades, what you are left with is an (admittedly complex) organisational change. So rather than searching for silver bullets or mythical super-beings, use what has worked well for you in the past to make GDPR a success for your organisation.

Disclaimer: these are my own words and opinions and they are not necessarily shared by my employer. I am not a lawyer nor am I providing legal advice. Please treat my assertions as opinions and feel at liberty to challenge or contradict them as you wish.

Join us at The RegTech Summit for Capital Markets and hear John speak in the GDPR Panel.

Author: ateamgroup
Posted: August 3, 2017, 2:27 pm

As lines of business demand access to more market data and firms seek to cut the overall cost of data services, the need to understand what market data services are being accessed and used becomes imperative to optimising data subscriptions. This is a challenge, but is made more difficult in an environment where new data services are increasingly delivered via the web rather than traditional desktop applications.

Author: ateamgroup
Posted: August 1, 2017, 8:10 am

TP ICAP’s decision to use Bloomberg Entity Exchange to help clients of its broking businesses register on new trading venues under Markets in Financial Instruments Directive II (MiFID II) was made after considering several Know Your Customer (KYC) solutions and is expected to simplify the repapering process for clients ahead of the January 3, 2018 MiFID II deadline. For Bloomberg, the TP ICAP use case of Entity Exchange is significant in its application to emerging regulation.

TP ICAP broking businesses including Tullett Prebon and ICAP have applied to operate Organised Trading Facilities (OTFs) under MiFID II. This requires them to collect information about venue users and transactions, and distribute information including risk disclosures to venue users. Bloomberg Entity Exchange matches these requirements with a web-based electronic platform that centralises the exchange of information and documentation required by TP ICAP to offer execution services to its customers in compliance with MiFID II.

Nicolas Breteau, chief executive at TP ICAP Global Broking, says partnering with Bloomberg will help its clients “understand what the new rules will mean to their trading relationships, especially around trade execution, reporting and transparency”.

Dan Matthies, global head of Bloomberg Entity Exchange, says the platform is well suited to handling the documentation challenges of MiFID II, which include the exchange of millions of pieces of paper. Entity Exchange applies data science to documents, taking resulting data points and using them to auto-match documents to requests. The collection of data points also allows auto-population of standard or custom digitised forms or questionnaires.

Beyond TP ICAP’s use of Entity Exchange to meet MiFID II compliance, Matthies says: “Given Entity Exchange's flexible and policy agnostic approach to documents and data, we are seeing a number of different use cases across the regulatory compliance space globally. KYC information, regulatory driven affirmations and questionnaires are permissioned in an encrypted environment with a full audit trail and version control. As a result, legal, compliance, operations and investor relations professionals are using Entity Exchange to deliver, manage and track legal, regulatory and operational data and documents as a matter of safe and sound practices and efficient compliance.”
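
To make the idea of extracting data points and auto-populating forms less abstract, here is a toy sketch that pulls simple fields out of document text and uses them to pre-fill a questionnaire. The patterns, field names and sample document are invented for the example; Bloomberg has not published how Entity Exchange works internally.

    # Toy sketch: extract simple data points from document text and use
    # them to pre-populate a questionnaire. Patterns and fields invented;
    # not a description of Bloomberg Entity Exchange internals.
    import re

    document = """
    Legal name: Example Trading Ltd
    LEI: 5493001KJTIIGC8Y1R36
    Jurisdiction: United Kingdom
    """

    patterns = {
        "legal_name": r"Legal name:\s*(.+)",
        "lei": r"LEI:\s*([A-Z0-9]{20})",
        "jurisdiction": r"Jurisdiction:\s*(.+)",
    }

    questionnaire = {}
    for field, pattern in patterns.items():
        match = re.search(pattern, document)
        questionnaire[field] = match.group(1).strip() if match else None

    print(questionnaire)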

Author: ateamgroup
Posted: July 26, 2017, 12:46 pm

Lessons learned from a recent business intelligence project

By: Marc Alvarez

One of the biggest motivations of working in a chief data officer (CDO) role is the opportunity to spend time and thought on applying state-of-the-art business intelligence (BI) to the firm’s own data. Unfortunately, given the pressing need to support regulatory reporting and simply keep data operations humming, this opportunity always seems to get pushed onto the back burner, and it tends to stay there as the next crisis comes around the corner.

That’s too bad. I recently proposed to management an effort to take a look under the covers at the details and it’s clear that there is tremendous value in applying today’s BI methods (including artificial intelligence (AI), machine learning, predictive analytics and the like) to drive the business. This observation likely comes as no surprise to the many technology and analytics vendors out there (bit of an “ah ha!” moment), but there is one lesson learned in my case that isn’t immediately obvious: in putting together an initial working model, it turned out that the ‘data science’ components proved to be the least challenging part of the exercise.

The concept

Here’s a quick breakdown of the concept – the idea was to take the trading activity carried out by the broker/dealer and compare it to broader market activity. In particular, the goal was to look for patterns over time and then generate and send notifications of anomalous market activity to sales and trading desks so they could get a heads up on potential activities affecting their clients.

The premise is built on the hypothesis that by generating and recording statistical coefficients over time, today’s BI capabilities should be able to identify patterns and, more importantly, outliers to provide insight to the dynamics in the market. The most important word there is ‘statistical’ – the goal is to produce a model that is entirely empirical and hopefully free from bias. Anybody familiar with market surveillance applications is likely familiar with the premise.
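
A deliberately simple sketch of that statistical idea: compute a rolling z-score of the firm’s share of market volume and flag days that sit far outside the recent norm. The numbers are invented and real surveillance models are far richer, but the empirical, bias-free flavour is the same.

    # Simple sketch of empirical outlier detection: flag days where the
    # firm's share of market volume deviates sharply from its recent norm.
    # All numbers are invented; real surveillance models are far richer.
    import pandas as pd

    share = pd.Series(
        [0.051, 0.049, 0.052, 0.050, 0.048, 0.053, 0.090, 0.051],
        index=pd.date_range("2017-07-03", periods=8, freq="B"),
    )

    window = 5
    rolling = share.rolling(window)
    # z-score each day against the trailing window, shifted so the day
    # under test does not contaminate its own baseline.
    zscore = (share - rolling.mean().shift(1)) / rolling.std().shift(1)

    print(share[zscore.abs() > 3])  # the 0.090 day stands out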

So far, so good. Getting this onto a whiteboard was the first step. The next was to figure out how to make it a reality – in particular, a proof of concept (PoC) was called for in order to validate that the application could actually work in dealing with OTC instruments. Breaking it down, this entailed:

  • First was to decide how to execute on this – in a busy firm treading water with demands on the IT department, this would likely require partnering with a solution provider. No big deal – write up a brief, send it around and get some dialogue going. Very interesting dialogue, by the way: the tools and resources available from solution vendors today are truly impressive.
  • Next up, in order to avoid the exercise becoming a ‘neverendum’, the scope of the PoC needed to be defined and documented – it’s only fair to the solution vendor to avoid scope creep.
  • Select a partner, or partners – it’s worth taking multiple kicks at the can if possible to validate as many different options as possible – and define a plan; do not try to do this without a basic plan, in order to avoid scope creep and tangents.
  • Assemble the data content and make it available to your solution partner.
  • Integrate the data and perform the analysis on the data.
  • Present the exciting new tool to your management and win plaudits all round.

All sounds pretty straightforward, doesn’t it? Here’s a look at the realities.

The realities

First, there’s the question of selecting a partner. Given that this is largely an experiment, there is, understandably, some reluctance to spend any significant sum of money on something that may or may not actually work. From the solution partners’ perspective, this looks like an unbudgeted initiative – something they could easily be drawn into that ends up costing them a lot of money. From the firm’s perspective, the need for non-disclosure and protection of intellectual property is top of mind.

Lesson learned – getting even to within sight of the starting line is a path full of obstacles and dependencies. It takes a lot longer than anybody could ever imagine going into the exercise.

Second is the need to define the scope of the project to a sufficient level of detail that an actual meaningful conversation can take place. All too often, what happens here is a concept is sketched out on a white board and then a talented technology person runs with it to produce ‘something’. The temptation is to get activity going as quickly as possible, without thinking through the whole approach.

The biggest issue that came up in this experience was clarifying precisely what would be demonstrated and how success (or failure) would be judged. Fortunately, I’ve been down this path before and learned firsthand the costs of rushing into things. So, time and effort was spent describing the scope and functionality in writing and, most important of all, the criteria by which success was to be judged. As this was intended to be a PoC, this involved a lot of back and forth with prospective business users.

Lesson learned – engaging prospective business users at this stage proved a time-consuming distraction. The whole concept proved to be so novel and out of the comfort zone (good things in my opinion) that it was very difficult to get much meaningful feedback beyond ‘we’ll look at it when it’s ready’. Seriously, this was the experience: clearly a lot of interest, but a user community that finds it difficult, if not impossible, to express its business requirements. No amount of discovery or requirements definition effort is going to help you here – at the cutting edge, an operating, demonstrable application is called for. And good luck with getting anybody to read a document!

Data content

Okay, so at this point you have the general concept defined, buy-in from your business users, requirements defined, and solution partner(s) selected. You’re ready to go…or are you? The missing piece of the puzzle is the data content, which almost always seems to get overlooked.

In this case, most of the content needed to kick-start the project was readily available from in-house databases (note, you need a handy SQL person on hand to get at it, since the IT team is pretty much fully booked on mission critical projects) and vendors. The latter is another hurdle that needs to be considered. In this case, the request was for a custom extract of data on a one-off basis to prove a concept. Finding a vendor willing to work on a custom request and put together a contract for the content is another task that needs to be considered, along with getting budget approval for the spend. And it all takes time and a lot of effort to specify the data requirement and how it will be delivered.

Meanwhile, the clock is ticking and your prospective business users are reading more and more about AI and the like in the press and are starting to ask questions about when there will be something to see. On top of it all, there’s a day job to do so things are really not getting as much focus as they should.

The magic

Finally, all the inputs are delivered to your solution partner(s) and their data scientists. Then the magic happens – with today’s capabilities and the expertise they had available for the project, the solution partners easily turned around initial results and sample applications in as little as a week. More importantly, as more data became available, the application became increasingly functional and impressive. This also starts to throw up all the issues in the data provided as input. In many cases these proved to be showstoppers, requiring corrections and/or additional data to resolve. So, this turns out to be a highly iterative activity dependent on the same complexity of acquiring and delivering data content.

In the end, the concept becomes demonstrable and quite impressive. Be warned, however: at the first presentation to business users you should expect a firehose of questions and requirements – now they provide the input! And of course, as this was a PoC exercise (marked by really getting your business users to buy in), there has been little thought put into what a production platform is going to look like. It may sound like a nice problem to have, but as the logical next step is to secure budget to produce an operational service, well, you know how that discussion goes.

Here are the key lessons from a front of the curve effort to monetise the firm’s data assets:

  • The data science really was the easy part. Selected solution partners had the skills and technology to deliver statistical and quantitative analysis in a surprisingly timely manner. That was great to see.
  • It takes far, far longer than anyone could anticipate to define and set the scope for the PoC. The temptation to go beyond an initial set scope is very strong and can easily become a major distraction.
  • Getting early input from the prospective business user community is very difficult. The technologies and analytics involved are simply too new and unproven to them. Thought leadership is required as well as a good dose of salesmanship to develop a compelling vision.
  • Plan for success – once it’s all put together and demonstrable, don’t underestimate the demand that will be generated. It’s absolutely essential to communicate how the new analytics are going to get into production and become a resource to help drive the business. In particular, this is where the loop needs to be closed with the IT team as getting something into production is almost certainly going to require the team’s assistance and effort.

All told, it’s clear that options available today to apply data science to help drive the business are here and real. However, successful adoption involves a significant increase in areas that may not at first glance seem obvious. As firms increasingly adopt digital methods of operating and deploy increasingly sophisticated quantitative and statistical techniques, there is a corresponding increase in dependency on the firm’s ability to acquire, collate and manage data content. Welcome to the new normal!

Author: ateamgroup
Posted: July 25, 2017, 1:19 pm

Contact Us

 

 

Epsilon Consulting Services

90 Broad Street, Suite 2003
New York, NY 10004

 

(347) 770-1748

(212) 931-0572

 


 

Careers

 

If you are interested in joining Epsilon’s financial consulting firm in New York City, please visit our Careers page to view jobs and submit a resume for consideration. See our service areas page for specific locations we provide consultations in.