“…We engaged Epsilon to be our strategic partner…. I am happy to say that Epsilon met or exceeded our expectations on this project. We successfully completed the RFP project and selected a system that was right for our business…”

Chief Financial Officer

Hot Topics

Data Management Review

Data Management Review (formerly Reference Data Review) is your single destination for knowledge and resources covering data management approaches, trends and challenges as well as all the regulations impacting financial data management for the enterprise.

A-Team Group’s London Data Management Summit rolls into town next Thursday with a fine line-up of speakers and a showcase presenting fintech innovators that could help firms create business value from their data.

Ahead of the event, we caught up with Rupert Brown, chief technology officer at The Cyber Consultants, to discuss his views on data management problems and how they can be solved with fintech solutions.

Q: What data management problems do financial institutions have that you believe you can solve?

A: We believe we solve the problem of direct traceability from regulations and standards to the evidence that proves compliance, in a standardised way, even for principles-based regulations such as the General Data Protection Regulation (GDPR), BCBS 239 and the Senior Managers & Certification Regime (SMCR), where there is no standard reporting specification.

Q: Why do firms have this problem?

A: There are two reasons: first, until now there has been no standardised way of reporting against principles-based regulations – this is our core IP; second, responses have typically been constructed as ‘fairy stories’ made out of randomly structured, ad hoc spreadsheets.

Q: How do you solve the problem?

A: We build dynamic, semantically rigorous models of arguments in pictorial form, supported by a set of verification and analysis algorithms that guide users on what they need to do next and highlight potential areas of risk.
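For illustration only – the actual platform is proprietary – the general idea of an argument model with a verification pass can be sketched in a few lines of Python. All class and field names here are hypothetical, not The Cyber Consultants’ API:

```python
# Minimal sketch of a compliance argument model: claims decompose into
# sub-claims, and leaf claims require linked evidence. A verification
# pass flags unsupported claims, i.e. "what you need to do next".
# All structures are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Claim:
    text: str
    sub_claims: list = field(default_factory=list)
    evidence: list = field(default_factory=list)  # documents, logs, extracts

def unsupported(claim, path=()):
    """Yield paths to leaf claims that have no evidence attached."""
    here = path + (claim.text,)
    if not claim.sub_claims:
        if not claim.evidence:
            yield here
    else:
        for sub in claim.sub_claims:
            yield from unsupported(sub, here)

gdpr = Claim("Personal data is processed lawfully", sub_claims=[
    Claim("Consent is recorded for all marketing uses",
          evidence=["consent_db_extract_2018-03.csv"]),
    Claim("Data subjects can exercise the right to erasure"),  # no evidence yet
])

for gap in unsupported(gdpr):
    print("Missing evidence:", " -> ".join(gap))
```

The verification step is what turns the picture into guidance: any leaf claim without evidence is, by construction, the next thing to work on.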

Q: What technology do you use?

A: We use a mixture of COTS graphical design tools and enterprise NoSQL data management technologies. We can also partner with most of the new wave of GDPR-driven enterprise content analysis toolsets.

Q: How do you fit into a financial institution’s architecture and data flows?

A: Typically, we connect our platform to any or all of a financial institution’s (and other sectors’) information and management processing platforms, wherever relevant evidence is deemed to exist to support our compliance argument models.

Q: Which emerging technologies do you see as having the most potential to improve data management and why?

A: We have still to see the proper fusion of software defined networking and data management toolsets. When you think about it, all intra and inter-company dataflows occur across a network and therefore management techniques must be derived from understanding, monitoring and controlling those flows, which does not happen today.

Data Management Summit (DMS) - London, March 22nd 2018
Author: ateamgroup
Posted: March 15, 2018, 10:32 am

By: Roland Bullivant, Silwood Technology

The EU’s new rules on data protection enshrined in the General Data Protection Regulation (GDPR) come into force in May 2018. They fortify the rights of citizens over their own data and put more obligations on organisations of all sizes to manage and protect that personal data.

The UK Information Commissioner’s Office provides a great deal of useful information, including a 12-step guide to becoming compliant.

One of these steps, Number 2, suggests you ‘should document what personal data you hold, where it came from and who you share it with. You may need to organise an information audit across the organisation or within particular business areas’. One reason for this is the requirement to be able to support requests from data subjects. For example, if a customer wants to know what data is held about her, wishes to have data erased or corrected, or decides to withdraw consent to data being processed, it will be necessary to respond to those requests quickly and effectively.

In addition, if an organisation discovers it has inaccurate personal data that it has shared with other organisations, it will need to inform the other organisations about the inaccuracy, so they can amend their own records.

This is impossible unless the organisation knows what personal data is being held and its location. This article discusses the possible methods organisations might employ to find where personal data exists in an ERP or CRM system, and an alternative software-driven approach.

Needles in haystacks?

For many organisations, the primary mechanism used to record what personal data is held will be an enterprise data catalogue, data dictionary or data governance platform. This will provide data analysts with the information about personal data required to enable compliance with GDPR consent requirements and the rights of data subjects.

The key ingredient for these platforms will be metadata from across the enterprise IT landscape. For many systems, finding that information is quite straightforward as the metadata or data model or source data is easy to locate and understand. In addition, there are many software products that can scan systems and deliver required information relatively easily.

However, if an organisation is running enterprise CRM or ERP applications from SAP, Oracle, Salesforce, Microsoft or others, finding metadata that relates to personal data will be more of a challenge, especially if its location is not already known.

This is because of the size, complexity and level of customisation of the data models (metadata) that underpin these systems. Also, in most cases, the metadata is very opaque because the database system catalogue provides nothing useful in the form of business names for tables and fields, and no information about table relationships. This means standard database tools or scanners will not deliver anything of value in the search for personal data.

What methods can you use to find personal data information?

As an example, consider the methods you might employ to locate all the tables that store a particular personal data attribute in an SAP system. In the example used for this article, 'Date of Birth' has been selected as the piece of personal data to be located. A base SAP ERP system has well over 90,000 tables and 900,000 fields. In practice, the data model is often much larger and made more complex by the number of customisations that have been implemented.

While SAP has by far the largest and most complex data model, other ERP and CRM packages also have significant numbers of tables and fields. For example, a typical Oracle eBusiness Suite system has over 22,000 tables and 500,000 fields. A standard Microsoft Dynamics AX system has over 7,000 tables and 100,000 fields. Even a large Salesforce implementation can have over 3,000 tables.
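To make the scale of the task concrete, here is a minimal sketch of the brute-force approach, assuming the data dictionary texts have already been exported to a CSV with table, field and description columns (a hypothetical format; in a live SAP system this metadata sits in dictionary tables and is far harder to reach than this suggests):

```python
# Brute-force search for a personal data attribute (e.g. 'date of birth')
# across an exported data dictionary. The CSV export format is invented
# for illustration; real packaged-application metadata is rarely this
# accessible or this well described.

import csv
from collections import defaultdict

def find_attribute(metadata_csv, search_term):
    hits = defaultdict(list)  # table -> [matching fields]
    with open(metadata_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if search_term.lower() in row["description"].lower():
                hits[row["table"]].append(row["field"])
    return hits

tables = find_attribute("sap_data_dictionary.csv", "date of birth")
print(f"{len(tables)} tables contain a 'date of birth' field")
```

Even this trivial scan presupposes a clean export with business descriptions, which, as discussed above, is precisely what packaged applications rarely provide.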

Using documentation

The first question to ask about documentation relating to metadata is ‘does it exist?’. If it does, can you access it easily? Another potential problem is confirming whether it reflects any changes made to the data model during the delivery project or subsequently.

If documentation is present it can provide a good starting point for metadata discovery. However, for large and complex ERP and CRM systems, trying to find individual instances of attributes that relate to personal data could present a problem. This is because the task of searching for each personal data item through documentation relating to tens of thousands of tables and hundreds of thousands of attributes is significant.

Finally, there is the additional task of ensuring the information is accurately recorded in the data catalogue, which could involve data being rekeyed or copied into the system.

Asking internal technical specialists

Your internal technical specialists will have access to whatever tools are provided for exploring the definitions of tables and field attributes within the system by each ERP and CRM vendor. It is likely they will also have good knowledge of the system and the particular way in which it has been customised to meet specific requirements.

They should then be able to go through a process to locate the personal data attributes for each table and record that information, perhaps in a spreadsheet or directly in the data catalogue tool.

One challenge with this approach is that searching and recording individual personal data attributes across large numbers of tables may not be supported by the tools provided. This would increase the amount of time it takes to achieve the task using this method. One result could be uncertainty as to whether all personal data attributes have been identified.

Engaging software vendors, staff or consultants

It may be necessary to engage the services of external consultants to achieve the same results as internal specialists. These could be application or GDPR specialists from a systems integrator or possibly from the supplier of the data catalogue or data dictionary software.

Depending on their experience and level of competence, they may be able to make use of whatever tools are available to them. Alternatively, they could provide some base templates that can provide the foundation for further exploration and comparison with the metadata as implemented.

Whichever approach is taken, some work will be needed for them to familiarise themselves with the particular changes that have been made to the underlying data model and that might be relevant in the context of GDPR. This familiarisation is likely to be a drag on the personal data discovery process.

It may be possible, if the metadata has been located using software tools or copied into a spreadsheet for example, to automate some of the processes for bringing personal data attributes into the data catalogue or glossary.

Internet search

When all else fails, staff may resort to searching the internet for data models they hope will contain the personal data they are seeking from source ERP or CRM packages. This can be helpful; however, it is best approached with a degree of caution. There is a risk that whatever is found may not represent what is in your own systems, so some work will be necessary to compare the two versions.

It may be possible to search for a list of all attributes in, say, an SAP system; however, with over 900,000 in a standard implementation, isolating those relevant for GDPR would be an extensive task.

Using dedicated metadata discovery software

Almost all data catalogue and data glossary software vendors have facilities to connect to source systems, locate and then import metadata into their platforms. This works really well if the source systems have relatively small amounts of metadata that is easy to find, understand and use.

Packaged ERP and CRM systems present a much more arduous challenge when it comes to accessing and making sense of their metadata. The size of their data models, combined with high levels of customisation, often impervious naming conventions, and lack of meaningful information in the database itself mean traditional methods and non-specialist tools are of limited value.

A few software products offer an alternative approach and provide data analysts with unique intelligence about the metadata in ERP and CRM packages. Typically, these products work by accessing or extracting rich metadata, as implemented, from where it resides in the application and then storing it in a repository. Often this is the data dictionary, although some applications maintain their metadata elsewhere.

Doing it this way means that customisations to the data model are automatically surfaced so that users can be confident they are working with accurate metadata. Importantly, these products provide logical as well as physical information about tables and attributes, and discover the relationships between tables. This means it is easier to search for and locate personal data attributes.

For example, without a discovery product capable of searching attributes across an entire SAP system for, say, the string ‘social security’, the analyst is reduced to hoping that someone knows the physical name for that field in the database is ‘CS04’. Likewise, without the specialist search facilities these products offer, it would not be possible to quickly identify that a particular instance of SAP has 90 tables containing the string ‘date of birth’.

[Figure: metadata discovery software searching for, and finding, a list of SAP tables that contain one or more fields with the string ‘date of birth’]
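The repository approach these products take can be illustrated with a toy index that maps logical (business) names to physical names. The entries below are invented for illustration (‘CS04’ is the physical name cited above):

```python
# Toy metadata repository: index business (logical) names against physical
# names so a search for 'social security' resolves to the physical field
# without prior knowledge of it. Entries are illustrative only.

repository = [
    {"logical": "Social security number", "physical": "CS04",  "table": "PA0002"},
    {"logical": "Date of birth",          "physical": "GBDAT", "table": "PA0002"},
]

def search(term):
    term = term.lower()
    return [e for e in repository if term in e["logical"].lower()]

for entry in search("social security"):
    print(f"{entry['logical']} -> {entry['table']}.{entry['physical']}")
```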

Ideally, these metadata discovery products should support sharing of the relevant personal data metadata with other software platforms, including data catalogues and data glossaries. In contrast to traditional, manual and more resource-hungry methods, this software-driven approach means the whole process can be accomplished much more quickly and accurately.

Using technology in this way to help discovery of ERP and CRM metadata will assist and accelerate the information gathering part of the GDPR compliance process and improve the confidence and trust the business can have in the data.



Author: ateamgroup
Posted: March 12, 2018, 6:12 pm

Risk Focus has introduced Regulatory Reporting Advisory (RRA), a service designed to help clients comply with trade reporting regulations, meet aggressive deadlines and build solutions for regulatory compliance, including upcoming regulations such as Securities Finance Transaction Reporting (SFTR), which takes effect in early 2019.

Risk Focus RRA offers support for all aspects of regulatory reporting and controls. Workshops, health checks, assurance reviews, education and training aim to ensure the service provides stability in times of constant change. As the preferred implementation partner of RegTek Solutions, Risk Focus RRA has held on-site and remote workshops for clients, guiding them through changes made to European Market Infrastructure Regulation (EMIR) Regulatory Technical Standards (RTS) late last year and the DTCC Global Trade Repository.

Srikant Ganesan, senior managing director at Risk Focus Financial Services Solutions, comments: “Risk Focus is the firm to turn to for expertise and advice related to trade and transaction reporting challenges, be it a health-check, a training session or a full-on implementation.”

Risk Focus RRA can help clients assess, select, build and implement control tools and systems around both existing and upcoming regulations. These cover all aspects of regulatory reporting, including key controls such as completeness and accuracy assurances.

Author: ateamgroup
Posted: March 12, 2018, 4:02 pm

OpenFin is leading an initiative to bring universal connectivity and standards to the desktop applications used across capital markets. The initiative is called the Financial Desktop Connectivity and Collaboration Consortium (FDC3) and initial members include Algomi, AllianceBernstein, Barclays, BNP Paribas, ChartIQ, Citadel, Cloud9, FactSet, Fidessa, GreenKey, J.P. Morgan, Morgan Stanley, OTAS Technologies, RBC, TP ICAP, Wellington Management Company and OpenFin.

The aim of FDC3 is to address the fractured software landscape of capital markets and deliver common software and standards across desktop applications used for trading, market data, order management, analytics and productivity to support faster decision making, improved productivity and streamlined workflow.

OpenFin, provider of an operating system created for financial markets and designed to be unifying and application agnostic, has contributed open source code to support the initiative. It is also open sourcing and making freely available its desktop connectivity technology, which is used by many large banks and buy-side and vendor firms, and providing a central app directory that will be freely accessible and allow applications to identify one another safely and securely.

Mazy Dar, CEO of OpenFin, says: “Today, the humans sitting at desktops are the integration layer between their applications. We believe the time has come to enable financial desktops with the same app interoperability that we take for granted on iOS and Android devices.”

Among the members of FDC3 commenting on the initiative, Bhupesh Vora, managing director, markets technology at Barclays, says: “Communicating and sharing context between multiple apps without the huge overhead of bespoke integration will be a massive boost to productivity for both the development community and ultimately our salespeople and traders. FDC3 will give us on the desktop what FIX gave us for server-side interoperability between venues and clients”.

Jim Adams, managing director, CIB technology at J.P. Morgan, comments: “Our corporate and investment banking staff can use anywhere between five and fifteen applications in their daily workflow. Interoperability would allow this workflow to become seamless across applications and platforms, ultimately making our employees more productive and informed when talking to internal and external clients.”

From a vendor perspective, Steve Grob, director of group strategy at Fidessa, says: “FDC3 aligns with our broader outreach to top-tier firms. In particular, the OpenFin messaging bus resonates with our own approach to provide customers with better levels of innovation and control. This is important because all banks need to move faster than ever before, but without breaking what they already have in place.”

Author: ateamgroup
Posted: March 8, 2018, 2:00 pm

S&P Global plans to strengthen its capabilities in emerging technologies through the acquisition of Kensho Technologies, a provider of next-generation data analytics, artificial intelligence (AI), machine learning and data visualisation systems to financial institutions and the national security community. The company will acquire Kensho for about $550 million and expects the deal to close late this quarter or early next.

Kensho was founded in 2013 by Daniel Nadler, a Harvard PhD, with a mission to develop and deploy scalable AI systems that have a real-world impact across government and commercial organisations. The company has been named a ‘technology pioneer’ by the World Economic Forum.

The acquisition of Kensho will strengthen S&P Global's emerging technology capabilities, enhance its ability to deliver actionable insights for clients, and accelerate efforts to improve efficiency and effectiveness of internal operations.

Douglas Peterson, president and CEO at S&P Global, says: “In just a short amount of time, Kensho's intuitive platforms, sophisticated algorithms, and machine learning capabilities have established a wide following throughout Wall Street and the technology world. With this acquisition, S&P Global is demonstrating a commitment to not just participate in the fintech evolution, but lead it.”

Nadler, founder and CEO at Kensho, adds: “Kensho has assembled one of the most elite AI teams in the world, drawing from the scientific community's leading global research universities. Combining our industry-leading expertise in machine learning with S&P Global's deep datasets, global scale analytics platforms, essential benchmarks, reputation and leadership team will allow Kensho to expand and innovate faster, further and in new ways. This deal values Kensho at a premium to its most recent funding round.”

In 2017, S&P Global launched a Fintech Venture Investment programme and invested in several fintech companies including Algomi, a London-based fintech company; Ursa Space Systems, an alternative data technology company; and Kensho.

Author: ateamgroup
Posted: March 7, 2018, 2:41 pm

Alberta Investment Management Corporation (AIMCo), one of Canada’s largest and most diversified institutional investment managers, has selected FactSet as its investment risk management solution.

Remco van Eeuwijk, chief risk officer at AIMCo, says: “To be considered world class in our approach to risk management, AIMCo required a partner that complemented our organisation and enabled us to focus on our core competency. By providing us with a holistic platform that enables us to integrate risk management with portfolio management, while addressing market data management and operations, FactSet is that partner.”

Following implementation of the FactSet risk management solution by the vendor’s professional services group, AIMCo will have access to risk management, portfolio analytics and investment research.

FactSet recently extended its multi-asset class solution set with the addition of a linear risk model and the integration of the Cognity fat-tail approach gained through the acquisition of BISAM.

On the AIMCo deal, Rob Robie, senior vice president and global head of analytics solutions at FactSet, says: “This is an important win as it showcases the benefits of FactSet’s unified risk capabilities, which are a powerful combination of our multi-asset class risk offering advanced by the Cognity risk framework.”

Author: ateamgroup
Posted: March 7, 2018, 1:01 pm

Singapore’s United Overseas Bank has selected NeoXam’s DataHub to support group-wide data management and compliance with regulatory requirements of the Fundamental Review of the Trading Book (FRTB) and risk data aggregation and reporting under BCBS 239.

Lim Ann Liat, head of markets and enterprise technology, group technology and operations at United Overseas Bank, says: “One of our aims in harnessing technology is to improve business performance. By tapping NeoXam’s DataHub, we will be able to increase the speed in which we manage immense quantities of market data through automation while driving productivity improvements and mitigating operational risk, all in a cost-effective way.”

NeoXam DataHub is a customisable software solution for centralised data management that provides an easy to govern and transparent way to manage the data supply chain for risk management. It acts as a single repository of market data and helps reduce operational risk, address regulatory requirements and achieve market data cost savings.

Tim Versteeg, chief sales officer and general manager of NeoXam Asia Pacific (ex-China), says: “When FRTB is implemented, a vast amount of market data is going to be required on a daily basis. Not only is the consolidation of this important, but market data costs could soon spiral out of control if there isn’t an optimised process in place. We are working with United Overseas Bank Group in Singapore to ensure that FRTB compliance is a smooth and cost-effective process. Our data centric approach to market data management enables swift implementation of a United Overseas Bank Group golden copy of data and a governance model that will keep the bank one step ahead of the regulatory curve.”

Author: ateamgroup
Posted: March 7, 2018, 11:29 am

Quantexa, provider of big data management software, and Arachnys, provider of a financial crime risk assessment platform, have teamed up to identify and monitor customer risk.

Quantexa will use the Arachnys cloud-based investigation platform and global news assets to dynamically screen against negative news, locate missing Know Your Customer (KYC) data and provide enhanced risk scoring, giving financial institutions a deeper understanding of the risks associated with their customers.

Arachnys will use Quantexa software to compute relationship and network risk, identify high-risk entities and ultimate beneficial ownership structures, and trigger events for KYC data collection. The combination of technologies is designed to reduce false positive matches and ensure complete views of customer risk across populations, while assuring compliance with regulations.

The partnership comes ahead of the US Treasury’s Office of the Comptroller of the Currency’s (OCC) final rule on customer due diligence, which will be implemented on May 11, 2018. The rule states that all financial institutions must meet specific requirements for identifying the ultimate beneficial owner of every newly opened account.
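As an illustration of what ‘ultimate’ means here, the sketch below walks an invented ownership graph and multiplies stakes along each chain of holding companies. The 25% threshold is the commonly cited benchmark, used purely for illustration; none of this reflects Quantexa’s or Arachnys’ actual implementations:

```python
# Sketch of ultimate beneficial ownership (UBO) discovery: walk an
# ownership graph from a legal entity up to natural persons, multiplying
# stakes along each path. Entities and stakes are invented.

ownership = {  # entity -> list of (owner, fraction owned)
    "AcmeHoldCo": [("TrustA", 0.60), ("PersonX", 0.40)],
    "TrustA":     [("PersonY", 0.50), ("PersonZ", 0.50)],
}

def ultimate_owners(entity, stake=1.0):
    """Return {person: effective stake} for all natural-person owners."""
    owners = {}
    for owner, fraction in ownership.get(entity, []):
        effective = stake * fraction
        if owner in ownership:                  # intermediate company
            for person, s in ultimate_owners(owner, effective).items():
                owners[person] = owners.get(person, 0.0) + s
        else:                                   # natural person (leaf)
            owners[owner] = owners.get(owner, 0.0) + effective
    return owners

ubos = {p: s for p, s in ultimate_owners("AcmeHoldCo").items() if s >= 0.25}
print(ubos)  # {'PersonY': 0.3, 'PersonZ': 0.3, 'PersonX': 0.4}
```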

Vishal Marria, CEO at Quantexa, says: “With the final date on the OCC’s ruling fast approaching, we are able to offer financial institutions technology to help keep them compliant and tackle a crucial issue.”

Ed Sander, president of Arachnys, comments: “The identification of ultimate beneficial owners will challenge financial institutions in 2018. The combined Arachnys and Quantexa capabilities will provide institutions with improvements in their risk management capabilities and help them in the ongoing fight against financial crime.”

Author: ateamgroup
Posted: March 6, 2018, 6:26 pm

Siren, Capgemini and Cambridge Intelligence have launched the Transformative Enterprise Knowledge Graphs initiative with a view to encouraging industry-wide discussion on the development of principles for data intelligence. The group describes data intelligence as the enabler for ‘knowledge-centric companies to join the dots among thousands of internal and external datasets to perform business critical functions’. Connected knowledge graphs are at its core.
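As a toy illustration of that ‘joining the dots’ idea, the sketch below links records from two invented datasets into one graph through a shared identifier, using the networkx library. None of this reflects Siren’s actual platform:

```python
# Sketch of a knowledge graph joining records across datasets: records
# become nodes, and shared identifiers (here an email address) become the
# edges that connect them. All data is invented.

import networkx as nx

crm      = [{"id": "crm:1", "name": "Jane Doe", "email": "jane@acme.com"}]
invoices = [{"id": "inv:9", "payer": "Acme Ltd", "contact": "jane@acme.com"}]

g = nx.Graph()
for rec in crm:
    g.add_node(rec["id"], **rec)
    g.add_edge(rec["id"], rec["email"])      # link record to its identifier
for rec in invoices:
    g.add_node(rec["id"], **rec)
    g.add_edge(rec["id"], rec["contact"])

# Any two records reachable through a shared identifier are now connected:
print(nx.has_path(g, "crm:1", "inv:9"))      # True
```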

The initiative comprises webinars, events and articles, and will discuss topics and principles including:

  • Lightweight: finding ways to deliver knowledge graph projects without significant upheaval of existing data infrastructure
  • Multiscale: allowing data analysis and exploration at any given scale, from ‘atom to galaxy’
  • Smart and intuitive: harnessing technologies, including artificial intelligence (AI) and semantics, to deliver knowledge graph value in context with the user activity
  • Planning for uncertainties and noise: no enterprise dataset is perfect, so knowledge graph technologies need to effectively solve the problems of search, uncertain links and fuzzy data
  • Multi-modality and variety: creating knowledge graphs that naturally derive value by encompassing the greatest variety of data, including unstructured content and multimedia.

Arindam Choudhury, vice president and global leader for Big Data at Capgemini, explains: “As organisations start to pivot from being data driven to insights driven, time to market for delivering new insights is crucial. Platforms that enable the use of graphs, machine learning and AI to discover connections among disparate data sources to deliver insights, while maximising the use of investments in Big Data technology, will become the go-to tools for organisations in 2018 and beyond.”

Joe Parry, CEO at Cambridge Intelligence, adds: “Companies are excited by the transformative potential of knowledge graph technologies, but that excitement often turns to frustration when they struggle to find the accessible, intuitive and powerful knowledge graph exploration tools they need. This initiative presents practical solutions to that challenge. Graph visualisation is the most powerful, flexible and scalable way to understand and explore graphs, bridging the gap between data and insight."

Coinciding with the introduction of the Transformative Enterprise Knowledge Graphs initiative, Siren released version 10 of its data intelligence platform, which has the ability to analyse data and provide knowledge graph capabilities without the need to move data from existing databases and Big Data systems.

John Randles, CEO at Siren, concludes: “Data is becoming a recognised asset in every firm in every industry, but there is widespread frustration when it comes to solving problems with data. If we can all agree a common set of principles, technology providers and enterprise users alike, it should help us all use data intelligence to drive business value.”

Author: ateamgroup
Posted: March 6, 2018, 5:36 pm

Time is running out: the compliance deadline for General Data Protection Regulation (GDPR) is May 25, 2018, and with fines running up to 4% of annual turnover or €20 million for firms that fail to protect European citizens’ personal data, no-one can afford to take this EU data privacy law lightly. From a data management perspective, market participants cite the regulation’s ‘right to be forgotten’, accountability, and getting the best technology in place to sustain compliance as the outstanding challenges.

Essentially, GDPR requires firms to track down and secure all the personal data they hold on EU citizens, wherever they hold it. They must tell the individuals concerned they have the data and get their consent to use or share it. They also have to allow individuals to access their data, change it if it’s wrong and get it deleted or removed if there’s no compelling reason for the company to use it, the right to be forgotten. You can find out more about the regulatory requirement and response in A-Team Group’s GDPR Handbook.

Despite the threat of significant financial and reputational damage for non-compliance with GDPR, readiness for Day 1 is mixed. A poll taken during a recent A-Team Group webinar on GDPR showed 44% of respondents hoping to be ready by May 25, 25% expecting to be ready, 20% expecting to be ready but with numerous workarounds, 6% not expecting to be ready, and only 6% already ready.

Webinar Recording and Transcript: GDPR Programme Insights for GDPR Readiness

GDPR consultant Sue Baldwin, who is helping Lloyd’s of London with its preparations, says: “Everybody wants to be able to say they are going to be compliant by 25 May and that is not going to be possible.” Colin Ware, BNY Mellon regulatory product manager and former head of Barclays’ GDPR impact assessment, agrees: “There aren’t compliance gaps as such, it’s the scale of the programme that will undoubtedly mean that, although the ethos and the spirit of the regulation will be met by May, I will be astonished if anyone says, ‘well I’ve done absolutely everything’.”

The key question for data managers is: if you haven’t done absolutely everything, what should you focus on now to be GDPR compliant and escape the regulator’s wrath?

Right to be forgotten

One major data management challenge facing financial firms is GDPR’s ‘right to be forgotten’, which mandates that firms must, in most cases, act on individuals’ requests to delete personal data they hold unnecessarily. The problem is that this conflicts with the core principle of many other financial services regulations – such as Markets in Financial Instruments Directive II (MiFID II), BCBS 239, the second payment services directive (PSD2) and the US Consolidated Audit Trail (CAT) – that requires firms to collect and keep more and more data to demonstrate they are acting above board.

Featured Download: Poll results on GDPR Programme Insights for GDPR Readiness from our recent webinar audience

Ware says: “This is probably the biggest dichotomy. The right to be forgotten goes against a lot of the regulations that have been coming out around financial services. We’re being told we need to be more open and transparent, against this idea of ‘well, actually I don’t want you to remember any of my data’. This is a huge problem for financial services firms. At the end of the day, we can’t eradicate the data because to do so breaks compliance with every other regulation.”

Baldwin agrees: “One area that jumps out at me is data retention. There is conflict between what GDPR says and other regulations such as Know Your Customer (KYC) and Anti-Money Laundering (AML). The biggest issue for me is this cross-regulation, where people are going to say, what regulation takes precedence here? It’s very difficult for organisations to know and that’s where they’re going to need more help from regulators. You don’t want to get hit by the Financial Conduct Authority (FCA) for not doing something.”

Until the Information Commissioner’s Office (ICO) offers clarity, Baldwin suggests firms should be very specific in their data retention policy and identify what they are going to do and how long they are going to keep data. She says: “If there is a conflict, perhaps you’re going to keep data longer than the individual may want, you have to say the reason we’re doing this is not because we’re following GDPR, but because we’re following other regulations.”

Ware says it may be beneficial to tighten controls on who within an organisation can access personal data. He also suggests obfuscation – where firms hide or protect data in a different system or different database, but don’t completely delete it – may be helpful.
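Ware’s obfuscation suggestion is essentially pseudonymisation. A minimal sketch, assuming invented record fields and a simple token vault, might look like this:

```python
# Sketch of pseudonymisation: replace direct identifiers with random
# tokens and hold the token-to-identity mapping in a separate, tightly
# controlled store. Record fields are invented for illustration.

import secrets

vault = {}  # token -> original value; in practice a separate, locked-down system

def pseudonymise(value):
    token = secrets.token_hex(8)
    vault[token] = value
    return token

record = {"name": "Jane Doe", "dob": "1970-01-01", "balance": 1042.17}
record["name"] = pseudonymise(record["name"])
record["dob"]  = pseudonymise(record["dob"])
print(record)   # identifiers replaced; balance retained for reporting
```

The point of the separate vault is that the main system no longer holds directly identifying values, while the record itself survives for retention obligations under other regulations.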


Accountability

Another key requirement of GDPR is to get ‘accountability’ right. UK data privacy regulator Elizabeth Denham calls this ‘the most important aspect of GDPR’. Essentially, accountability is the need for financial firms to ‘own’ data privacy and set up comprehensive measures to show they take the responsibility seriously.

According to the ICO, the actions needed to achieve this ‘may include’ auditing your data, staff training, and security measures like data minimisation and pseudonymisation.

But the fact that this advice is not entirely clear-cut, combined with the sheer scale of GDPR, means accountability remains a major data management challenge.

To demonstrate sufficient accountability and compliance by May, firms should follow the ICO guidelines and make sure they document all they have done to achieve this. They should also recognise that, even with the right compliance and accountability programmes underway, they may well fail to complete the compliance task by May 25.

Baldwin says: “It’s a vast area. Most firms will have done their data audit and got to the stage where there are gaps in their evidence and documentation. Everyone’s trying to fix that by May 25. Bigger organisations may know they’re not going to make it by the deadline, but they‘ve got everything documented and know what’s on the timeline.”

Ware sees a similar scenario unfolding: “Firms will be training staff, reiterating how personal data should be handled and making sure clients are aware of what they are doing.” By May 25, he suggests: “Every company will be able to demonstrate compliance – demonstrate what they have done, the policies, procedures, training. But the bit that no company will really be able to say is, ‘If someone says right now I’m going to come and audit you across everything’, that's going to be OK.”

Ware believes larger financial firms could be working through their compliance programmes for the whole of 2018, after focusing first on securing highest-risk personal data. During the recent A-Team Group webinar mentioned above, he said: “Most financial services companies are taking a pragmatic view. They are taking a risk-based approach, looking first at areas with more sensitive and high-risk types of personal data, then they have plans to manage lower-risk areas going forward.”

The saving grace here is that if firms can show they are actively seeking to be ‘accountable’, the regulator is likely to go easy on them – even if they fail to achieve full auditable compliance by May 25.

Denham said as much in a recent speech: “Yes, GDPR gives me greater sanctions for those that flout the law. You can expect the ICO will uphold the law and there will be no grace period – you’ve had two years to prepare. But I know that when May 25 dawns, there will be many organisations that are less than 100% compliant. This is a long haul and preparations will be ongoing. If you self-report a breach, engage with us to resolve issues, can demonstrate effective accountability arrangements, you will find us to be fair. Enforcement will be a last resort.”

The right technology?

Compliance raises the question of which technologies data managers should be using to deal with GDPR. Baldwin suggests data mining tools are important, though not necessarily those tailored especially for GDPR. She says: “There are a lot of data management tools in the marketplace. These tools need to focus on data mining because what you’re looking at for GDPR is being able to mine specific privacy fields. Use these tools to pinpoint privacy data, rather than looking to GDPR specific tools.”

GDPR security expert Jamie Graves, CEO of software company ZoneFox, says: “The toughest challenge for data practitioners is balancing the need for data to be available versus ensuring it has necessary privacy and security aspects in place to protect it. A proactive approach is vital. User and entity behaviour analytics can make a huge difference to regulating data access and usage in an organisation. This type of technology builds up a baseline of normal user behaviour and alerts unusual behaviour, perhaps a sales report copied onto a USB drive, or someone connecting to payroll through public access Wi-Fi. Incidents like these can be flagged by user and entity behaviour analytics, making it easier to secure against sensitive customer data being exposed to risk and an organisation failing to meet its compliance duties.”
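The baselining Graves describes can be reduced to a toy example: learn each user’s normal activity level and flag large deviations. Real UEBA products model far richer signals; the data and thresholds below are invented:

```python
# Minimal sketch of behaviour baselining: learn each user's normal rate
# of sensitive-data accesses, then flag counts far outside that baseline.

from statistics import mean, stdev

history = {"alice": [3, 4, 2, 5, 3, 4], "bob": [1, 0, 2, 1, 1, 2]}

def is_anomalous(user, todays_count, threshold=3.0):
    baseline = history[user]
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(todays_count - mu) > threshold * max(sigma, 1e-9)

print(is_anomalous("bob", 40))   # True  -- e.g. a bulk copy to a USB drive
print(is_anomalous("alice", 4))  # False -- within normal variation
```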

For firms still struggling with compliance projects and systems, the ICO website offers resources including guidance, checklists and sector-specific FAQs. Denham also says the ICO runs voluntary audits, so firms can check they are on the right track and identify weaknesses or red flags before they cause real problems. She concludes: “No strings attached and it’s free.”

* Visit: https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/

GDPR Handbook
Webinar Recording: GDPR Programme Insights for GDPR Readiness
Author: ateamgroup
Posted: March 6, 2018, 4:39 pm

Legacy systems and manual processes used across the insurance industry to market products and manage claims are close to breaking point, requiring insurance firms to increase the digitalisation of their operations and deliver robust, timely and compliant services to their clients. The webinar will consider the critical need for digitalisation and discuss data management approaches to improve the client experience and the implementation of automation to optimise internal efficiency. It will also touch on technology solutions and the benefits insurance firms can gain from digitalisation.

Webinar Date: 
Tuesday, April 17, 2018 - 15:00
Author: ateamgroup
Posted: March 5, 2018, 5:46 pm

Financial services firms may be trailing the field when it comes to exploiting digitalisation to improve client experience and there are obstacles to success, but the application of analytics to behavioural data can help firms drive business intelligence and improve client outcomes.

Author: ateamgroup
Posted: March 5, 2018, 4:02 pm

AIM Software is building out its North American operations with a view to replicating the success of its European business. The company has recruited a number of senior executives to further its drive into North America and continues to invest in R&D and engage with its user community to develop enterprise data management applications.

To increase momentum in the US and Canada, AIM has hired Sanjay Vatsa as head of Americas. He joined the company in New York a couple of months ago, having previously held leadership positions at Citibank, BlackRock and Merrill Lynch. In his role at AIM, Vatsa will focus on expanding the company’s footprint in North America and ensuring success for the region’s user community. He is joined in the New York office by Jose Manso as sales director, and Jared Geer as sales executive. AIM’s senior product specialist, Adrian Englisch, has relocated from headquarters in Vienna to New York as director of professional services.

We caught up with Vatsa and Gayatri Raman, CEO at AIM Software, to discuss the company’s approach to North American markets and its broader global development. Vatsa says: “I joined AIM because of its value proposition. AIM provides business applications in the reference data space that promote straight-through processing and reduced total cost of ownership, and have configurable business rules enabling flexibility to meet the varying needs of our customers. AIM solutions can be implemented with minimal change to the out-of-the-box functionality. This productised approach to data management has found favour in Europe and is gaining traction in North America.”

Talking about the context of the company’s growth plans for North America, Raman explains: “We are owned by US private equity firm Welsh, Carson, Anderson & Stowe (WCAS). The premise for WCAS’ investment in 2015 was to replicate AIM’s success in Europe in North America. AIM’s focus on business applications for the buy-side, backed up by a leadership team with expertise in Europe and North America, is helping AIM accelerate its growth in North America.”

Having built a team of 10 product, services and sales specialists in New York, and with plans to add more staff, AIM intends to improve its understanding of the client community and deepen its credibility. The company has six clients in North America – two wealth managers, one custodian, two asset managers in the US and one in Canada – and a total of about 100 worldwide, including both direct users of its GAIN enterprise data management platform and indirect users accessing the company’s solutions through partners.

Raman says the company’s success is based on its productised, modular business applications focused on the buy-side, and continual engagement with its user community on product direction. Her goal is to make AIM the market leader in data management for buy-side firms across Europe, North America and Asia Pacific. This, she says, demands focus on innovation and investment in R&D. AIM invests 30% of revenue in R&D.

Looking forward, the company is working on proof of concepts to discover new technologies that could benefit its approach. Raman explains: “We want to partner with or integrate solutions from vendors with innovative technologies that don’t have the balance sheet to claim the market.” Some of these technologies could power new applications AIM is planning to bring to market in the second quarter of this year, adding to existing applications including reference data management, portfolio pricing and analytics, corporate actions data and legal entity data management.

Author: ateamgroup
Posted: February 28, 2018, 9:51 am

The May 25, 2018 compliance deadline of General Data Protection Regulation (GDPR) is approaching fast, requiring financial institutions to understand what personal data they hold, why they process it, and whether it is shared with other organisations. In line with individuals’ rights under the regulation, they must also provide access to individuals’ personal data and be ready to change, delete or explain its use.

Author: ateamgroup
Posted: February 26, 2018, 4:37 pm

In case you missed it, the Financial Conduct Authority (FCA) has made a call for input on how technology can make it easier for firms to meet regulatory reporting requirements and improve the quality of the information they provide.

The call for input outlines a proof of concept (POC) developed by the FCA and Bank of England during a two-week TechSprint late last year to examine how technology can make regulatory reporting more accurate, efficient and consistent. The POC demonstrates how regulatory reporting requirements could be made machine-readable and executable. This means firms could map reporting requirements directly to the data they hold, creating the potential for automated, straight-through processing of regulatory returns.
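To see what ‘machine-readable and executable’ might mean in practice, consider this sketch, in which a reporting requirement is expressed as data and applied directly to a firm’s records. The rule format is entirely invented; the actual POC is not specified at this level of detail in the call for input:

```python
# Sketch of a machine-readable, machine-executable reporting rule: the
# requirement is data (report, source, filter, fields) rather than prose,
# so a regulatory return can be generated straight from firm records.

rule = {
    "report": "LOANS_TO_SMES",
    "source": "loan_book",
    "where": lambda r: r["counterparty_type"] == "SME",
    "fields": ["loan_id", "outstanding_gbp"],
}

firm_data = {"loan_book": [
    {"loan_id": "L1", "counterparty_type": "SME",  "outstanding_gbp": 250000},
    {"loan_id": "L2", "counterparty_type": "CORP", "outstanding_gbp": 900000},
]}

def generate_return(rule, data):
    rows = [r for r in data[rule["source"]] if rule["where"](r)]
    return [{f: r[f] for f in rule["fields"]} for r in rows]

print(generate_return(rule, firm_data))
# [{'loan_id': 'L1', 'outstanding_gbp': 250000}]
```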

The TechSprint participants suggest this could, for example, improve the accuracy of data submissions, reduce their costs, and allow changes to regulatory requirements to be implemented more quickly. They also note that a reduction in compliance costs could lower barriers to entry and promote competition.

The FCA call for input outlines how the POC was developed and asks for views on how the FCA can improve on it. It also seeks feedback on broader issues surrounding the role technology can play in regulatory reporting.

Christopher Woolard, executive director of strategy and competition at the FCA, says: “Technology is a powerful shaper of financial regulation, able to make compliance simpler and more efficient. Our TechSprints bring people from across the financial services world together to share their collective knowledge to solve common problems.”

The call for input closes on June 20, 2018. A feedback statement summarising the views received and proposed next steps will be published later in the summer.

Author: ateamgroup
Posted: February 26, 2018, 2:35 pm

Despite the compliance deadline of the Fundamental Review of the Trading Book (FRTB) being pushed back to 2020 by many national regulators, the time to address the data management challenges of the regulation is now. The webinar will discuss key elements of FRTB – including risk models, liquidity horizons and data sourcing for risk calculations, back testing and hedging – and consider the data management challenges these present. It will also identify solutions to the challenges and consider how best they can be implemented and to what advantage.

Webinar Date: 
Tuesday, June 19, 2018 - 15:00
Author: ateamgroup
Posted: February 23, 2018, 5:13 pm

Regulatory compliance is non-negotiable and business performance based on a solid understanding of data is no longer a ‘nice to have’ but critical to success. The webinar will consider the challenges of sourcing and managing data for both compliance and performance, identify areas where data can be processed once for both purposes, and discuss data management techniques and technologies that can reduce the cost of compliance and promote business performance.

Webinar Date: 
Thursday, June 7, 2018 - 15:00
Author: ateamgroup
Posted: February 23, 2018, 4:42 pm

By: Roy Kirby, Senior Product Manager at SIX

When major regulatory deadlines loom large, there's an inevitable tendency for the financial industry to scramble for minimum viable compliance. In layman’s terms, this means doing whatever it takes, regardless of the expense, just to keep the prying eye of the regulator away. Ring any recent bells?

The trouble is, while taking this approach may seem like a sensible option now, it’s unlikely to service future requirements and actually goes against the spirit of the regulations. This is why, as the post-January 3, 2018 dust starts to settle, financial institutions need to quickly adjust to ensure compliance with all regulations, not just Markets in Financial Instruments Directive II (MiFID II).

To achieve sustainable long-term compliance, firms cannot afford to keep adding to the vast array of information already housed across multiple systems every time a new rule is enforced. After all, regardless of the rule in question, the regulations all require overlapping sets of data.

Instead, firms need to clean up the siloed information scattered across the business and consolidate their approach. And be under no illusions: as local regulators begin to shift their focus from demanding data consistency alone to seeking both data consistency and quality, now is the time to reassess.

In response, expect firms to pile pressure on data vendors to provide one really strong source of information. The problem is, as highlighted by our end-of-year survey of over 100 sell-side and buy-side participants, over a third of firms are still addressing regulations separately with their own data and systems. If this remains the case, how can the sector even begin to think about moving beyond doing the bare minimum to be compliant?

It may well be that the sheer scale and complexity of MiFID II becomes the catalyst for change among these firms. Since the implementation deadline, there are certainly signs that institutions are adopting a more consolidated approach as they begin to re-evaluate what more can be done with their reference and market data.

It is not hard to see why. Embracing a standardised, more scalable data service that enables firms to extract the reference and pricing information needed for each regulation is an obvious next step. The crossover between MiFID II and the recently implemented Packaged Retail and Insurance-based Investment Products (PRIIPs) regulation is a prime case in point. A lot of the data market participants are currently distributing for MiFID II is already reflected under PRIIPs.
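The overlap argument can be made concrete with a toy golden record from which regulation-specific views are derived, so each overlapping field is sourced once. All field names below are invented for illustration and do not represent any actual MiFID II or PRIIPs reporting schema:

```python
# Sketch of the consolidation idea: hold one golden record per instrument
# and derive regulation-specific views from it, rather than sourcing the
# overlapping fields separately for each regime.

golden_record = {
    "isin": "XS0000000000",
    "issuer_lei": "5493001234567890ABCD",
    "risk_indicator": 4,
    "costs_pct": 1.2,
    "trading_venue": "XLON",
}

views = {
    "MiFID II": ["isin", "issuer_lei", "trading_venue"],
    "PRIIPs":   ["isin", "risk_indicator", "costs_pct"],
}

for regime, fields in views.items():
    print(regime, {f: golden_record[f] for f in fields})
```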

As the industry navigates its way through these uncharted post-MiFID II waters, it is clear that those who took the tick-box approach to January 3rd may well feel significant cost and operational ramifications further down the line. The challenge for these firms is that MiFID II is just one of many requirements intertwined like a plate of regulatory spaghetti. While there may be something of a post-implementation lull in the air currently, it will not last for long. As and when the next piece of legislation lands, firms can’t simply add layer upon layer; eventually, the entire stack of compliance cards will come tumbling down.

With this in mind, the natural next step is to look at the role of mutualised approaches to the challenge, through which key industry players can bring the required information together. This is exactly why firms grappling with this regulatory onslaught should be challenging their data partners to provide exactly what they need right across the business, in the form they need it. After all, the age of ready-to-consume data has very much arrived, so there should be nothing stopping financial institutions from using it.

Author: ateamgroup
Posted: February 22, 2018, 3:32 pm

The US Commodity Futures Trading Commission (CFTC) and the UK Financial Conduct Authority (FCA) have agreed to collaborate and support innovative firms through their respective fintech initiatives, LabCFTC and FCA Innovate.

The so-called Cooperation Arrangement on Financial Technology Innovation (FinTech Arrangement) supports both regulators’ efforts to facilitate responsible fintech innovation and ensure international collaboration on emerging regulatory best practices. It also provides for sharing information on fintech market trends and developments, and facilitates referrals of US and UK fintechs interested in entering each other’s markets.

CFTC chairman J. Christopher Giancarlo says FCA Innovate is ‘the gold standard for thoughtful regulatory engagement with emerging technological innovation.’ This is the first fintech arrangement the CFTC has made with a non-US counterpart. Giancarlo says: “By collaborating with the best-in-class FCA fintech team, the CFTC can contribute to the growing awareness of the critical role of regulators in digital markets.”

Andrew Bailey, chief executive of the FCA, adds: “International borders shouldn’t act as a barrier to innovation and competition in financial services. That is why agreements like the one we have signed with the CFTC are so important. As our first agreement of this kind with a US regulator, we look forward to working with LabCFTC in assisting firms, both here in the UK and in the US, that want to scale and expand internationally in our respective markets.”

As part of the arrangement, the FCA and CFTC are planning to host a joint event in London to demonstrate how firms can engage with both regulators.

Author: ateamgroup
Posted: February 20, 2018, 2:11 pm

Silwood Technology, provider of Safyr metadata discovery software, has turned its attention to how firms running vendor application packages can meet the May 25, 2018 compliance deadline for the General Data Protection Regulation (GDPR).

The company has researched five large and widely used application packages – SAP, JD Edwards, Microsoft Dynamics AX 2012, Siebel and Oracle E-Business Suite – to determine how difficult it will be to identify personal data in the applications (rather than databases) as required by GDPR. The terms date of birth and social security number were selected for research purposes, although many other elements of personal data could be used, and searches were performed to see how often they appeared. Safyr retrieves metadata about each application from the application layer, including any customisation, and can return searches in a few minutes.

Silwood’s research looked at several instances of each package to provide an indication of how many occurrences of each personal data field might be found in a typical system. By way of example, it found there are typically more than 90,000 tables in a SAP ERP system and over 900,000 fields. Social security number appeared in over 900 tables and date of birth in over 80 tables.

Nick Porter, founder and technical director at Silwood, points out that less than 1% of a typical SAP system contains personal data, but that small fraction could be the source of GDPR data breaches costing an organisation up to 4% of its annual turnover. While some firms are approaching the discovery of personal data manually, Porter argues that at this stage in the GDPR game, automation is the only way to reach compliance on time.

He says: “Silwood and Safyr are a small, but important part of GDPR compliance. The market is short on data discovery tools, which are often the elephant in the room, but we bring automation to identifying personal data, not just for GDPR, but for programmes that need to govern data but must first find the data.”

Safyr acts as a repository of metadata from a vendor package and identifies where the data is in the system. With the data discovery task done, Silwood exports the metadata to partners and resellers offering data analysis and governance for GDPR. These include ASG Technologies, IBM, Adaptive, Datum and Erwin, many of which use Safyr as a scanner for personal data embedded in vendor application packages.

Silwood has also released a Safyr GDPR Starter Pack for SAP users trying to find personal data in their ERP systems and will soon release a starter pack for JD Edwards. Starter packs for the other vendor applications mentioned above are in development.

Author: ateamgroup
Posted: February 20, 2018, 10:46 am

Contact Us



Epsilon Consulting Services

90 Broad Street, Suite 2003
New York, NY 10004


(347) 770-1748

(212) 931-0572





If you are interested in joining Epsilon, a financial consulting firm in New York City, please visit our Careers page to view jobs and submit a resume for consideration. See our service areas page for the specific locations in which we provide consultations.