Data Management Review

Data Management Review (formerly Reference Data Review) is your single destination for knowledge and resources covering data management approaches, trends and challenges as well as all the regulations impacting financial data management for the enterprise.

A-Team Group’s Data Management Summit hit an all-time high in New York City last week with a packed auditorium, buzzing exhibition hall, great line-up of speakers, market leading sponsors and interactive discussion throughout the day on how to deliver the next level of data management and business success.

Keynote presentations

Suvrat Bansal, head of innovation, chief data officer and managing director at UBS Asset Management, led the conference with an inspirational keynote on the firm’s journey towards effective, value-adding digital transformation. In line with UBS Asset Management’s long history, the focus of the transformation programme is on providing innovative investment products and services for clients, and improving client service. The data strategy involves people and information, and is designed to equip staff with the digital information they need, when they need it, making everyone more productive.

A second keynote, presented by Brennan Carly, global head of enterprise at Thomson Reuters, discussed why trusted data belongs in the cloud. He put the cost of data consumption and management at four to eight dollars for every dollar spent on data, and argued the case for cloud in terms of scale, agility, reduced costs and greater data integrity. Reviewing results of a recent survey on cloud take-up by Thomson Reuters, Carly described the surge in adoption of public cloud solutions over the past few years as cloud providers have improved data security to a level on a par with, or greater than, enterprise data security. To succeed in the cloud, he recommended a multi-vendor strategy and strong data governance.

Peter Moss, CEO of the SmartStream RDU, presented the summit’s final keynote, asking delegates if their firm’s data management strategy is effective. He reviewed today’s data management challenges, including rising customer expectations around electronic transactions, the need to implement data mining and artificial intelligence solutions to differentiate your trading business, and the regulatory requirement to provide full visibility into your systems, processes and data. Reflecting Bansal’s keynote, he also touched on CEOs’ desire for firms to be ‘digital’. Based on the scale of these challenges, Moss emphasised the need for a strong master data foundation that can be delivered by a move from sometimes dated enterprise data management systems to the next chapter of data management, the utility model.

Speakers and panel sessions

Harry Chopra, chief client officer at AxiomSL, presented a session covering how to improve data rivers and the beneficial role of regulatory initiatives across multi-business line financial institutions. He described challenges ranging from the profitability, risk, liability and liquidity pressures faced by consumer banks, to managing settlement risk in capital markets, and managing suitability, credit and tax reporting in private banking and wealth management. Looking at solutions, he said a data entity and control platform can address the challenges, while regulatory initiatives provide a catalyst to integrate data rivers across risk and finance, optimise risk and regulatory spending by reusing data, and yield business intelligence.

The summit’s panel sessions were informative, interactive and not without a touch of humour. Predrag Dizdarevic, partner at Element 22, followed Bansal’s keynote with a CDO panel that discussed how to add value to data by using metrics and analytics. Dessa Glasser, principal at Financial Risk Group, moderated a session on the regulatory landscape, which covered the data management implications of existing and forthcoming regulations, such as the Fundamental Review of the Trading Book (FRTB), and considered how firms are coping with compliance and whether regtech can help.

The summit chair, Sarah Underwood, an editor at A-Team Group, hosted a panel on identifiers and data standards, which agreed that the utopia of data standardisation is unlikely to be achieved. She also moderated a lively panel looking at the challenges of data lineage and how to get it right.

A discussion on diversity in data moderated by Jennifer Ippoliti, legal practice data management director at JP Morgan Chase, set out statistics that caused a sharp intake of breath, but countered them with proactive ideas and long-term plans to improve diversity across the workplace.

The final panel sessions of the day focussed on the future, with David Blaszkowsky, an independent consultant, leading an upbeat conversation on the use of innovative technologies such as machine learning and artificial intelligence to revolutionise financial organisations. Dale Richards, president at Island 20 Ventures, kept the ball rolling with a discussion on the potential of alternative data and its use by data scientists, but also the problems of incomplete, low volume and inaccurate data.

With the summit’s keynotes, presentations and panel discussions complete, and more delegate questions asked than could be answered during the conference, conversations continued during an animated drinks reception hosted by A-Team Group and joined by many of the summit participants.

If you would like to catch up with the latest opinions and guidance on data management, podcasts of some of the sessions from the New York Data Management Summit will be available soon. To make sure you secure a place at our next Data Management Summit, which will be in London on 21 March 2019, register now.

Related: 
Data Management Summit (DMS) - London, 21st March 2019
Author: ateamgroup
Posted: September 25, 2018, 9:15 am

The Derivatives Service Bureau (DSB), founded by the Association of National Numbering Agencies (ANNA) to facilitate the allocation and maintenance of International Securities Identification Numbers (ISINs) for OTC derivatives, is looking to expand its committee to include custodians and data vendors in 2019.

It is now inviting applications to participate in its Product Committee from January 8th as part of a scheme to encourage broader industry representation beyond its existing buy-side, sell-side and trading venue membership. The change would increase the number of voting members from 9 to 15. Participation is also open to trade associations as non-voting members.

The Product Committee discussion will move in 2019 from developing ISIN products and defining product templates to solving additional OTC derivative ISIN use cases and examining the introduction of hierarchies, as well as discussing what to prioritise. The membership expansion follows previous calls from the industry to be involved in developing the DSB ISINs Road Map.

“Working successfully with industry over the past eighteen months, the DSB has produced a fully automated open and easily accessible near-real-time allocation of ISINs for 82 product templates,” said Malavika Solanki, a member of the DSB Management Team, who spoke this week at our A-Team Group TradingTech Briefing entitled “MiFID II: Interacting with the New Market Structure” in London on Tuesday. “By expanding the composition and structure of the Product Committee and including trade associations in discussions, the DSB can continue to remain agile and flexible as industry’s use of the DSB service evolves.”

“Having broader industry participation on the Product Committee to discuss the best possible ISIN creation and use will only serve to bring greater transparency and efficiency to the OTC derivatives market,” added Emma Kalliomaki, Managing Director of ANNA and the DSB.

The new Committee will be announced on December 4th 2018. Interested parties have until November 2nd to submit their applications.

Author: ateamgroup
Posted: September 20, 2018, 11:42 am

FactSet plans to roll out additional elements of its cloud-based FactSet Data Exploration platform that will deliver a fully hosted application production environment over the next six months.

The company introduced the platform on Microsoft Azure this summer, offering content from the Open:FactSet Marketplace as well as programming tools including Microsoft SQL, RStudio and Python. Its aim is to allow users to evaluate financial and alternative data sets and build investment applications efficiently, without the need to invest in IT infrastructure to test and adopt new datasets.

Two months after the introduction of Data Exploration, which can be used by clients with or without FactSet terminals, Rich Newman, senior vice president and global head of content and technology solutions at FactSet, says the platform is gaining interest. This is coming from large buy-side asset managers and hedge funds, as well as sell-side investment banks and advisory firms.

The company plans to add more datasets including alternative data to Open:FactSet Marketplace for use in Data Exploration, although Newman notes the need to scrutinise all data before it is brought into the marketplace and integrate the data to provide a single data environment.

The platform’s roadmap moves on from offering access to data to providing a data testing environment in the cloud, then a research environment to which firms can add their own data, and finally a production environment fully hosted in the Azure cloud and managed by data scientists building investment applications.

Newman concludes: “The platform is designed to help firms build investment strategies that generate alpha. It supports speed of development by moving production into the cloud, while Azure provides unlimited capacity for clients to test data.”

Author: ateamgroup
Posted: September 20, 2018, 8:25 am

By: Clive Bellmore, CEO Europe and Africa, BackOffice Associates

Analogies about how vital data is to the digital economy are bountiful. It has even been branded ‘the new gold’, a critical and precious asset that directly impacts business outcomes for companies across the globe and in all sectors. And never more so than in the capital markets sector.

Electronic trading alone generates millions of messages every day. In addition, volumes of transactional data are rapidly growing in response to increasing numbers of transactions. The bulk of this data is scattered across various departments, geographies and systems – with the quality often varying.

Putting data in the hands of employees across a company can be a powerful thing, but ensuring that it’s refined and clean can be a complete game-changer. As well as being the base for operational, risk management and trading decisions, data is intrinsic to the reporting requirements capital market companies have for clients and regulatory bodies. Given the advent of MiFID II and PRIIPS, reporting regulations have increased significantly.

Recent research from State Street Corporation shows the majority (88%) of asset managers see data requirements as a challenge to their distribution strategies, while more than a third (36%) agree that MiFID II will make cross-border activity more difficult.

Poor data quality is a potential risk for senior executives and business leaders alike. Not only can it have a potential impact on business performance, but within the capital markets sector dirty data (inaccurate, incomplete or inconsistent data, for example) could expose CEOs and chief financial officers to regulatory disclosure risk. Despite this, a large number of companies still seem unable to refine their data properly, and aren’t giving data the respect it deserves.

This needs to change. What is lacking in the sector is a data-first mindset. Once instilled within an organisation, it can ensure that not only is the data refined to the best possible quality, but that it retains its commercial value while adhering to regulatory requirements. 

So how can this be done?

Creating, or changing, a mindset can be daunting. Even when it’s done for both regulatory and revenue reasons. For it to be a success, it needs to be reinforced and accepted by everyone in the company, starting from the top down.

It may cause an eye-roll or two, but education needs to be the starting point. Often this lies firmly at the feet of the chief financial officer or chief data officer, but given that the chief marketing officer is using data on a daily basis, it needs to be a team effort. Everyone needs to play their part. Together they need to clearly communicate to the entire organisation the breadth of risks that poor data quality creates.

Once you’ve educated the organisation on the importance of data and why each team should care, another way to reinforce and establish change is to appoint a ‘data advocate’ within each team. This must be someone who is willing and able to move the needle and help reinforce the message and cultural change.

Having buy-in from the board should also help to release the budgets needed to develop, implement and maintain ongoing and consistent data governance strategies and best practices. In order for these to work and be accepted, it is vital realistic time frames are put in place. Contingency plans also need to be in place for team members who may be pulled away from their data duties when any extraordinary demands from the business occur. This will also ensure the whole business sees the importance that is being placed on data.

Finally, and perhaps most importantly, there is the need to implement technology that can take on some of the heavy lifting. Data quality management services are often highly flexible and can be customised to meet the specific needs of a company. Alongside providing best practices in data quality and the correct processes for remediating data errors and securing data, their ability to minimise the need for internal staffing is precious. Especially when it comes to developing and maintaining a data-first mindset throughout a business, it helps to take the perceived burden off already stretched teams and their resources.

Making data central to any capital markets business is essential. Once it becomes the core of the company culture, it can be intrinsic in unlocking valuable revenue opportunities, and help keep the regulatory wolves from the door.

It’s also a lot easier and less painful than many companies may think. With a solid and fully supported education campaign, feasible data governance strategies, best practices and the right technology, developing a data-first mindset should be a walk in the park. Companies within the sector need to get working on putting this into practice and cannot rest on their laurels; the risks are simply not worth it.

Author: ateamgroup
Posted: September 19, 2018, 2:56 pm

Dr Kay Swinburne, the Welsh Conservative MEP known for her instrumental role in the architecture of MiFID II, has issued a stark warning on the highly politicised environment to be expected over the next six months.

As Brexit negotiations take centre stage and technical considerations become subsumed by partisan agendas, Swinburne further stressed the potential impact this approach could have on MiFID market developments.

Addressing over 150 industry players in her keynote speech at our very own A-Team Group TradingTech Briefing entitled “MiFID II: Interacting with the New Market Structure” in London yesterday, she emphasised: “Politics are being prioritised in the nitty gritty of legislation like never before. The next six months will be unprecedented disruption and it will be very, very turbulent… my advice to you is to keep your head down.”

Swinburne noted that the UK’s departure from the EU in 2019 would bring substantial changes to a system already in flux. The leaving date in March 2019 will be followed by the election of a new EU Parliament in May, along with a whole new set of EU Commissioners and a new President of the EU Commission, and the addition of extended powers for EU supervisory authorities under the next mandate. A new President of the European Central Bank (ECB) will succeed Mario Draghi in October, with a subsequent board level reshuffle. In short, “everything will change” and a lot of people will be coming to the table “cold” with little knowledge of MiFID II and new agendas to promote – which is why, Swinburne urged, the industry must present clear, independent and transparent data and educate the new cohort in an unbiased manner to establish trust.

But there are other challenges ahead. For example, how will thresholds be calibrated following a UK exit? Will the EU decide to include or exclude the UK’s data (and, based on the recent example of Switzerland, to which side would a data exclusion actually be detrimental?).

“It is a strange situation we are in,” mused Swinburne. “After Brexit I think the UK will stay fairly closely wedded to MiFID II and the principles within it. I am not so sure our EU colleagues will, however, so it is a question of wait and see. The UK was a great supporter of MiFID II and I think will continue that spirit going forwards. But if I had known that we would be using the Third Country Provisions in MiFID II ourselves, I might have spent a little more time and detail in writing them!”

The Third Country Provisions present one of the key concerns in a post-Brexit MiFID landscape.

Since the UK’s EU Referendum in June 2016, the approach from EU countries towards the provisions has been varied. “In some, it has been ignored altogether,” explained Swinburne. “In others, complex tiering systems have been proposed – for example, replacing a country by country approach with a company by company system for those firms with systemic importance.”

Another worry is that with the absence of the UK, the capital markets focus in the EU may not be the same going forward. “Without the UK putting data on the table and talking in a very open and pragmatic way, MiFID II could start to look very different,” Swinburne warned her audience. “So you as an industry need to supply that data and tell them what you need. You have to set the tone for future iterations of MiFID and direct where it goes to. It is really important that the UK remains involved, and really important that as we start to exit, we find ways of re-engaging and keeping that information and data flowing.”

Before we look at future iterations, we must evaluate current performance – and according to Swinburne, it is too soon to tell how MiFID II has impacted the market. “MiFID II was not a post-crisis piece of work, and it predated even the financial crisis. It was always supposed to be part of a much broader package of reforms, and part of a bigger look at the trading environment in Europe. It was always envisaged as a never-ending project, and we knew that it was a massive piece of work from the outset – it was never going to be easy, straightforward or definitive,” she explained. “We spent nine years working on MiFID II and now we are being asked to play judge and jury as to its effects, just nine months after implementation? Yes, there are teething problems, and we all have questions, but it is too early to say whether it has been successful yet.”

All the same, some adjustments are already in motion – in particular the SI tick size regime, which is due to be addressed in an amendment to the EU Investment Firm Review (IFR) proposed by German MEP Markus Ferber. “The reality is that even before MiFID II was implemented in January, we knew there was a tick size issue,” admitted Swinburne. “We needed to find a legislative vehicle to implement the change, which is why we are using the IFR. We do believe that Systematic Internalisation (SI) activity goes against the spirit of MiFID II legislation – nobody in Parliament believes that marginal incremental improvements in price justify giving SIs special status. Parliament has decided it is not going to turn a blind eye to this. We will have that change coming very shortly.”

Looking ahead, however, Swinburne played down the impact of a new iteration, stressing that “MiFID III” was unlikely to bring the same level of regulatory revolution as its predecessor.

“MiFID III inspires a lot of fear in absolutely everybody,” she noted. “When it comes into effect, it will be about politics as well as about the impact on the market – the timeframe will coincide with Brexit and the changes will be about Brexit and how the market must adapt…

“But it will also be about expansion and finessing – it won’t be a major piece of work. MiFID II broke all the rules for a follow-on piece of legislation – reviews are built into every piece of EU legislation and usually each review makes very small, technical changes that build on each other.

“MiFID III next time round will be no exception.”

Related: 
RegTech Summit - London, 4th October 2018
RegTech Summit - New York City, 15th November 2018
Data Management Summit (DMS) - London, 21st March 2019
Author: ateamgroup
Posted: September 19, 2018, 10:29 am

Get ready as the second annual A-Team Group RegTech Summits are fast approaching. Held in both London and New York, 2018 promises to be a bumper year as the aftermath of MiFID II, the upcoming FRTB, the chaos of GDPR and the opacity around CAT (to name just a few) create a raft of questions about the implications for data and technology to which capital markets urgently need the answers.

On October 4, 2018 at the Guoman Tower Hotel in London, the RegTech Summit will provide a valuable forum for exactly these topics. Following on from the success of the inaugural RegTech Summit last year, Europe’s leading RegTech conference will bring together over 400 practitioners managing regulatory change and implementing RegTech solutions to discuss the current state of play on adoption in capital markets and how they can leverage the full potential of RegTech to move forward.

Stephane Malrait, MD and Global Head of Market Structure and Innovation in Financial Markets at ING, will open the day with a keynote address exploring the relationship between RegTech itself and the wider regulatory landscape, and asking how market players and solutions providers can better collaborate to create a genuine impact within the RegTech space.

Other discussion topics include:

  • An evaluation of sustainable MiFID II compliance;
  • Best practices for regulatory reporting efficiency;
  • The potential of AI and machine learning to advance RegTech solutions;
  • The evolution of trade surveillance;
  • How to fight financial crime by improving AML and KYC;
  • The much-needed creation of a regulatory change management framework to manage risk and navigate change; and
  • A RegTech Showcase offering insight into emerging compliance solutions.

Distinguished speakers include Kevin Taylor, Head of Compliance for Europe and Asia Pacific at TD Bank; Jean-Marc Guiteau, Global Head of Regtech Innovation & Development at BNP Paribas; Paul Clulow-Phillips, Managing Director and Global Head of Capital Markets Surveillance at Societe Generale; Nicole Sandler, Vice President of Fintech and Regtech at Barclays; Che Sidanius, Global Head of Financial Regulatory & Industry Affairs at Thomson Reuters; Peter Moss, CEO of SmartStream RDU; Beju Shah, Head of Data Collection & Publication at the Bank of England; Rabya Anwar, former Head of Regulatory Change and Brexit at an investment management firm; Justin Nathan, Chief Technical Surveillance Officer at Credit Suisse; Targ Patience, Group Chief Compliance Officer at the Gibraltar Stock Exchange; and Giles Spungin, Managing Director and Global Head of Regulatory Compliance and Operational Risk Analytics at HSBC.

The event will also feature industry-leading RegTech sponsors including Thomson Reuters, The SmartStream Reference Data Utility, Fenergo, Asset Control, Synechron, Verint, RegTek Solutions, Bertin, Datactics and Aquis Technologies.

Just a few short weeks after the London Summit, A-Team Group heads Stateside to finish off the season with the RegTech Summit for Capital Markets in New York City on November 15, 2018 at Convene in the Financial District.

Key speakers include Sophia Bantanidis, EMEA Head of Regulatory & Market Strategy and Citi Accelerator Mentor at Citi; Brad Giemza, Chief Risk Officer at R.J. O'Brien; Robin Doyle, Managing Director, Office of Regulatory Affairs at JP Morgan Chase; Raymond Hanson, Managing Director, Global Head of Global Markets Program Delivery & Regulatory Technology at Credit Suisse; Joshua Beaton, Head of Americas Trade and Transaction Reporting at Morgan Stanley; Laura Glynn, Director of Global Regulatory Compliance at Fenergo; and many more.

If you are not already registered, visit https://datamanagementreview.com/events/regtech-summit-capital-markets-london/book to reserve your place for the London event and for the New York event.

Related: 
RegTech Summit - London, 4th October 2018
RegTech Summit - New York City, 15th November 2018
Author: ateamgroup
Posted: September 19, 2018, 9:33 am

Digitalisation is at the top of the agenda at many large capital markets firms, but how can it be achieved without disruption and with outcomes that benefit your clients, staff and stakeholders? Suvrat Bansal, head of innovation and chief data officer, and managing director at UBS Asset Management, will share his experience of leading a major digitalisation programme at the asset manager at this week’s A-Team Group Data Management Summit in New York City.

To give you a little insight into Bansal’s success story at UBS, we talked to him ahead of the event to find out how he has approached and is implementing digitalisation across the firm. The focus, in line with UBS Asset Management’s long history, is on providing innovative investment products and services for clients, and improving client service. The data strategy involves people and information, and is designed to equip staff with the digital information they need, when they need it, making everyone more productive.

Bansal explains: “We have approached digitalisation in two ways: edge cases, such as UBS Partner, where we are developing new digital advisory services for our clients; and digital transformation of our current businesses, which requires significant engagement, staging and careful execution. Executing these in parallel with data at the centre is ultimately where we add most value.”

To create an inclusive programme, Bansal decided on a relatively simple and easily understood implementation policy. The policy creates central capabilities in analytics, master data and data governance. These capabilities are then offered to business and functional leaders for their digital transformation efforts prioritised by value drivers. The goal is to provide access to the central capabilities in data and technology, empowering divisional leaders to develop digitalisation in their areas.

Bansal says: “Each transformation effort is defined as a 'service pod'. Each pod develops metrics on the state of its business area and what it should look like after digitalisation. Triangulation of service pods, central data capabilities, and technology, drives organic transformation internally.”

The programme is investing in new technologies where appropriate and making best use of open source solutions. It is also looking actively at the integration of alternative data to drive new products and services, but only if they work for the firm’s investment products and clients. As Bansal concludes: “The drivers behind optimal digitalisation are the same as always, improvements for our clients.”

Related: 
Data Management Summit (DMS) - New York City, September 20th 2018
Author: ateamgroup
Posted: September 17, 2018, 5:36 pm

Bloomberg has responded to customer calls for easier access to data with Enterprise Access Point, an online platform that provides normalised reference, pricing, regulatory and historical datasets to Bloomberg data license holders. Following the launch of the platform last week, we caught up with Gerard Francis, global head of enterprise data at Bloomberg, to find out more about the service and its potential going forward.

He says: “Enterprise Access Point responds to customer challenges of understanding what data they have already licensed and what data would be useful to them, and normalising the data – we do that for them. The platform also makes data directly programmable for developers and data scientists.”

Francis describes Enterprise Access Point as a managed service and notes that it makes no changes to the company’s data license model. It uses open technology standards to encourage adoption, covers all data except real-time data, and is based on the Bloomberg cloud, allowing clients who are permissioned to pull data directly from the platform’s website. Francis comments: “For existing clients, data is easier to access and integration and normalisation costs are reduced, if not completely eliminated. For new clients, the platform makes data very accessible very quickly.” Quantifying the cost reduction, he says the industry norm is that every dollar spent on data requires a further $5 to $7 to make the data ready to consume. Enterprise Access Point reduces that cost.

By pre-preparing data, the platform allows users to browse quality data online, examine the metadata, trial sample datasets prior to acquisition, and immediately put them to use. If a user is not licensed to use particular data, the top 10 rows of the data can be accessed to give the user a feel for whether it could be useful and whether to subscribe to the data.

For business users, access to the data is provided by a RESTful API. Francis suggests use cases including improved risk management.

For developers and data scientists, data from Enterprise Access Point is available as CSV data frames and supports multiple technologies including Jupyter and Python Pandas. For professionals leveraging artificial intelligence (AI), the data is also available in a graph format. Web developers using the service can benefit from Bloomberg’s RESTful Hypermedia API, which allows URL-consistent data to feed directly into an enterprise’s software components, including machine learning tools.
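
By way of illustration only, a licensed user pulling one of these CSV datasets into a pandas data frame might write something like the Python sketch below. The endpoint URL, dataset name and token are hypothetical stand-ins rather than documented Enterprise Access Point details:

    from io import StringIO

    import pandas as pd
    import requests

    # Hypothetical base URL and token: the real Enterprise Access Point
    # endpoints, dataset identifiers and authentication scheme will differ.
    BASE_URL = "https://example.com/eap/catalogs/reference/datasets"
    TOKEN = "YOUR_API_TOKEN"

    def fetch_dataset(dataset_id: str) -> pd.DataFrame:
        """Download one dataset snapshot as CSV and load it into pandas."""
        resp = requests.get(
            f"{BASE_URL}/{dataset_id}/snapshot.csv",
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()
        return pd.read_csv(StringIO(resp.text))

    prices = fetch_dataset("corporate-bond-pricing")  # hypothetical dataset name
    print(prices.head(10))  # echoes the 10-row preview offered pre-licence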

With historical datasets covering the past 10 years, Francis notes potential use of the platform by not only data scientists, but also quants and compliance teams working on Fundamental Review of the Trading Book (FRTB) regulation.

Matthew Rawlings, chief data officer in Bloomberg’s enterprise data department, says: “Having access to deep data history is critical for any investing or business governance strategy based on data science insights. By providing consistent data feeds along with history through API protocols, Enterprise Access Point allows scientists to apply data models with greater confidence and efficiency.”

Enterprise Access Point initially offers Bloomberg data, including some alternative datasets, but this is expected to change over time as more and different data is added to the platform.

Author: ateamgroup
Posted: September 17, 2018, 10:48 am

The Derivatives Service Bureau (DSB) is planning to develop a strategic technology roadmap for OTC derivative identifiers. The roadmap is a response to feedback from the DSB’s latest industry consultation that has also led to the creation of a Strategy Subcommittee of the DSB Technology Advisory Committee (TAC) that will take forward the technology-related work of the original ISO standards working group SG2.

The subcommittee will follow an open and inclusive process with a diverse stakeholder base, and will take into consideration other relevant regulatory and industry initiatives that the DSB should aim to be consistent with.

Marc Honegger, TAC sponsor and board member of the DSB, says: “Having successfully enabled OTC derivative users to meet their MiFID II reporting requirements in 2018, we now look forward to working with industry in 2019 to develop the DSB strategic technology roadmap.”

Membership of the TAC Subcommittee will be announced on October 11, 2018, with the first meeting due to be held on November 8, 2018. The meeting agenda, full list of TAC Strategy Subcommittee participants, and records of meetings will be available on the DSB website.

Author: ateamgroup
Posted: September 14, 2018, 9:20 am

Nearly a year after Marlin Equity Partners put Asset Control up for sale with a price tag of £100 million – an astonishing eight times EBITDA – in November 2017, it has found a buyer in Sovereign Capital Partners, a UK-based private equity firm with a track record of acquiring high quality businesses and supporting their growth. The price of the purchase was not disclosed.

Under Sovereign’s management, Asset Control’s existing management team will continue to lead the company, including Mark Hepsworth, who was appointed as CEO in August 2016 and has focussed on developing the company’s product offering to meet client needs and growing the company. The team will be joined by Brian Traquair, former president of capital markets at SunGard Data Systems and a fintech advisor and investor, as chairman of the board of directors.

Commenting on the acquisition, Hepsworth said: “This investment will benefit our clients, who can expect continued stability and investment in product development to further enhance their data management capabilities. Asset Control has grown rapidly in recent years and we look forward to continued successes with Sovereign in launching new products and winning new clients.”

Sovereign intends to use its experience in the financial services and technology sectors – its main claim here is a 2010 investment in compliance and regulatory consultancy Cordium, which it exited in 2015 – to support Asset Control through its next phase of growth.

Sunil Jain, investment director at the private equity firm, said: “The global marketplace for data management systems is growing rapidly and Asset Control is uniquely positioned to capitalise on this opportunity in the financial services market. Asset Control stands out due to its robust historic performance and attractive growth potential. We look forward to building on what Mark and the team have accomplished in recent years”.

Author: ateamgroup
Posted: September 14, 2018, 8:47 am

With regulatory compliance still at the top of the agenda and business demand for meaningful data rising, is your firm’s data management strategy effective? Join next week’s A-Team Group Data Management Summit in New York City to answer the question with the help of Peter Moss, CEO of the SmartStream RDU.

Moss will discuss today’s data management challenges, including rising customer expectations around electronic transactions, the need to implement data mining and artificial intelligence solutions to differentiate your trading business, and the regulatory requirement to provide full visibility into your systems, processes and data. He will also touch on the CEO’s desire for the company to be ‘digital’.

These challenges are, and will continue to be, immense – but they can be met with a strong master data foundation.

So how do you build that foundation? Sourcing data from one vendor will leave you with data gaps and limited commercial leverage. Multiple data vendors can provide better data coverage, but the data will use different standards and formats and you will need to manage the mapping. To manage the mapping, you will probably have implemented an enterprise data management (EDM) solution and the rules you defined will be complex and constantly changing, and there will be many manual exceptions and workarounds. If your EDM solution looks rather like this, it could be holding back your trading business.
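
To make the mapping problem concrete, the minimal Python sketch below shows the kind of cross-reference logic an EDM layer accumulates when consolidating two vendors’ records for the same instrument into one master record. The vendors, field names and values are invented for illustration:

    # Each vendor delivers the same instrument under its own symbology and
    # field names; the EDM layer reconciles them onto one master record.
    VENDOR_A = {"RIC": "VOD.L", "Ccy": "GBp"}           # invented vendor A record
    VENDOR_B = {"Symbol": "VOD LN", "Currency": "GBP"}  # invented vendor B record

    FIELD_MAP = {
        "vendor_a": {"RIC": "vendor_a_id", "Ccy": "currency"},
        "vendor_b": {"Symbol": "vendor_b_id", "Currency": "currency"},
    }

    def to_master(record: dict, vendor: str) -> dict:
        """Rename a vendor's fields to master fields, dropping the rest."""
        mapping = FIELD_MAP[vendor]
        return {mapping[k]: v for k, v in record.items() if k in mapping}

    master = {**to_master(VENDOR_A, "vendor_a"), **to_master(VENDOR_B, "vendor_b")}
    print(master)
    # The vendors disagree on currency notation (GBp pence vs GBP pounds):
    # exactly the kind of exception that breeds manual rules and workarounds.

Multiply this by thousands of instruments, dozens of fields and frequent vendor format changes, and the complex, constantly changing rule base described above emerges.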

Considering the challenges of data management and the problems posed by vendor data and ageing systems, Moss will move the story forward and discuss the potential of the utility model to manage data on your behalf and provide access to accurate, standardised master reference data that can meet the needs of your compliance, risk management and business data users.

Related: 
Data Management Summit (DMS) - New York City, September 20th 2018
Author: ateamgroup
Posted: September 13, 2018, 1:12 pm

Asset Control has released ACX, a cloud-deployed delivery, storage and analytics platform designed to manage increasing data volumes, reduce infrastructure costs and provide business and technical users with easy access to data exploration. The platform is an additional module in the company’s AC Plus product family and includes open source technologies, although it is not an open source product.

Martijn Groot, vice president of product management at Asset Control, says: “Our clients have outgrown technology infrastructure that has lots of duplicated data stores, which make it difficult to mine and access data, and can no longer cope with regulatory, business and data lineage requirements. They need to escape the boundaries of enterprise data management (EDM) and embrace data management that puts data in the hands of users and enables broader business use cases.”

He adds: “By designing ACX specifically for the cloud environment, it can not only provide easier access to data, but also help firms reduce infrastructure costs by moving data into the cloud and sunsetting some internal systems.”

The platform was developed in conjunction with clients over the past year. Open source components were selected on the basis that they are scalable to match growing data volumes and are easily deployed in the cloud. They also offer a more attractive licensing model for users than EDM solutions.

Groot comments: “While banks like the capabilities of open source solutions, they also need the solid support of a service contract.”

From a practical perspective, ACX integrates with AC Plus, which provides operational control and financial data mastering, while ACX stores complete data and records of change to support data exploration. The platform also makes it easier to integrate data back into workflows.

ACX can be deployed in a public cloud (it currently runs in the Google cloud but is cloud agnostic) or internally in a firm’s private cloud. Its use cases include model development, stress testing, product control and compliance with complex regulations such as Fundamental Review of the Trading Book (FRTB). The platform is being tested by a Tier 1 bank and Groot says it is gaining interest from other large banks that need to review their technology landscape.

The platform’s technology stack includes Cassandra, a highly scalable NoSQL database; Spark, an open source component used for data processing that allows users to bring models to the data; and Kafka, messaging software for fast real-time streaming of data that ensures any updates are instantaneously reflected in ACX.

It supports NoSQL-based market data discovery and distribution with programmatic access to market data via REST services and native integration with Python and R. It can also be integrated with third-party business intelligence tools, all of which reduces the time required for risk model development and deployment. Business users can access data directly through enterprise search capability and use Spark to manipulate data and develop and onboard financial models.
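
As a rough illustration of the streaming leg of such a stack, the Python sketch below subscribes to a Kafka topic of market data updates using the open source kafka-python package. The broker address, topic name and message format are assumptions made for illustration, not ACX specifics:

    import json

    from kafka import KafkaConsumer  # pip install kafka-python

    # Hypothetical broker and topic; a real deployment would use the
    # addresses and topics configured for the platform.
    consumer = KafkaConsumer(
        "market-data-updates",
        bootstrap_servers=["localhost:9092"],
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
        auto_offset_reset="latest",  # react only to new updates
    )

    # Assume each message is a JSON update for one instrument, e.g.
    # {"isin": "US0378331005", "field": "close", "value": 227.63}.
    for message in consumer:
        update = message.value
        print(f"{update['isin']}: {update['field']} -> {update['value']}")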

Author: ateamgroup
Posted: September 13, 2018, 10:40 am

The Chartered Financial Analyst (CFA) Institute released the Global Investment Performance Standards (GIPS) 20/20 Exposure Draft for public comment late last month. The draft outlines significant changes to the current standards that could alter financial firms’ approaches to compliance. Comments on the draft must be made by December 31, 2018, with final adoption of the revised standards expected in mid-2019 and a targeted implementation date of January 1, 2020.

GIPS are a set of voluntary reporting guidelines based on the principles of full disclosure and fair representation of investment performance and have been widely adopted by asset managers keen to promote their compliant capabilities to clients.

The new and simplified GIPS 20/20 draft comprises numerous updates, including specific guidance for asset owners, and should go some way towards extending the reach of the GIPS initiative, which has already been adopted by 1,538 firms representing more than 60% of global assets under management.

With the draft updates now publicly available, tech firms are looking towards upgrading solutions to help asset managers meet the new requirements. Most recently, BNY Mellon’s Eagle Investment Systems and ACA Compliance Group announced a collaboration to improve GIPS compliance and verification.

Mark Goodey, senior principal of investment analytics at Eagle, comments: “An asset manager’s compliance with GIPS standards has become a necessity in order to have credibility with investors. By streamlining the data collection process, we’re helping our clients provide the kind of transparency that is expected today.”

The GIPS 20/20 Exposure Draft provides specific sections for both firms and asset owners, with the goal of reducing complexity and eliminating the need to go back and forth between sections to determine which requirements apply. This builds on a growing trend of asset owners keen to leverage the standards to demonstrate the integrity of their own practices to other constituents.

Key changes to the draft include removal of the requirement to create a composite that includes only one or more pooled funds if the firm does not offer the strategy of the pooled fund as a composite strategy to segregated accounts. If a pooled fund strategy is the same as a composite strategy, the pooled fund must be included in the composite and a GIPS Composite Report must be presented to prospective composite clients.

Firms selling participation in a limited distribution pooled fund must prepare and present a GIPS Pooled Fund Report to all pooled fund prospective investors, reflecting only the specific pooled fund’s information. Broad distribution pooled funds do not require a report, although firms must offer either a Pooled Fund Report or a GIPS Advertisement if they wish to promote their GIPS compliance. This update could make GIPS compliance not only more attainable, but more attractive for fund managers.

GIPS 20/20 also relaxes the rules around money-weighted returns, removing the asset class distinction and offering more flexibility in terms of reporting categorisation. It also reverses the requirement imposed in 2010 to manage a carve-out with the firm’s own cash balance, instead allowing GIPS firms to allocate cash to carve-outs, in a move that could make GIPS compliance substantially more attractive to private equity and real estate fund managers. In terms of performance portability, the current one-year grace period to bring any non-compliant assets into compliance will only apply to performance at the new or acquiring firm, with no limit on when firms may port history from the prior firm or affiliation.
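
For readers less familiar with the term, a money-weighted return is the internal rate of return that sets the net present value of a portfolio’s dated external cash flows, including its ending value, to zero. Below is a minimal Python sketch with invented cash flows, solving for the rate numerically:

    from scipy.optimize import brentq

    def money_weighted_return(cashflows):
        """Find r such that the NPV of (time_in_years, amount) pairs is zero.

        Contributions are negative; withdrawals and ending value positive.
        """
        def npv(r):
            return sum(cf / (1.0 + r) ** t for t, cf in cashflows)
        # Search a wide bracket; assumes the NPV changes sign within it.
        return brentq(npv, -0.99, 10.0)

    # Invented example: invest 100 at the start of the year, add 50 at
    # mid-year, and hold a portfolio worth 165 at year end.
    flows = [(0.0, -100.0), (0.5, -50.0), (1.0, 165.0)]
    print(f"money-weighted return: {money_weighted_return(flows):.2%}")  # ~12%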

Author: ateamgroup
Posted: September 12, 2018, 11:57 am

Mention financial crime prevention and your mind probably jumps to the latest machine learning and artificial intelligence solutions for transaction monitoring, fraud prevention, autonomous monitoring and so on. But these solutions, while essential and invaluable, only address part of the problem – they are reactive rather than proactive, catching issues only after the fact.

This is a fundamental challenge at the heart of money laundering, terrorist financing and other forms of financial crime, but there are answers. Ed Sander, president of Arachnys, a New York-based firm focusing on customer risk intelligence solutions for Know Your Customer (KYC), anti-money laundering (AML) and enhanced due diligence (EDD), says: “Previous solutions have focused primarily on spotting transactions that have already occurred or are in flight. Unlike fraud, money laundering is very hard to follow in real time, which means that most AML solutions track events that have already happened.”

However, a new movement is evolving – one that focuses on entities that could perform the criminal transactions rather than the transactions themselves, with the goal of catching money laundering activity before it even happens.

“If you know enough about a known bad actor, then you can identify unknown bad actors through matching behaviour patterns,” explains Sander. “If you can do this, then you have the potential to stop events before they happen.” In essence, this is a form of profiling, through which analysts can track, flag, and in theory, prevent risk before it enters the bank.
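
A heavily simplified Python sketch of the idea: summarise each entity’s behaviour as a numeric feature vector and escalate unknown entities whose pattern sits close to a known bad actor’s. The features, numbers and threshold are invented for illustration; production entity-profiling systems are far richer:

    import numpy as np

    # Invented behavioural features per entity, e.g. monthly counts of
    # cash-intensive deposits, cross-border wires, shell-company links
    # and rapid pass-through transfers.
    KNOWN_BAD = {"actor_1": np.array([9.0, 14.0, 3.0, 7.0])}
    CANDIDATES = {
        "entity_a": np.array([8.5, 13.0, 2.0, 6.5]),
        "entity_b": np.array([0.0, 1.0, 0.0, 0.0]),
    }

    def cosine(u: np.ndarray, v: np.ndarray) -> float:
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    THRESHOLD = 0.95  # invented cut-off for escalation to an analyst

    for name, profile in CANDIDATES.items():
        score = max(cosine(profile, bad) for bad in KNOWN_BAD.values())
        if score >= THRESHOLD:
            print(f"{name}: similarity {score:.3f} -> escalate for review")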

While a relatively new area, this intelligence-led approach is already gaining traction. In April 2018 HSBC announced plans to integrate new technology from Quantexa, a UK-based big data analytics company, to combat financial crime by analysing internal, publicly available and transactional data within a customer’s wider network to stop potential money laundering before it happens.

The deployment of the technology follows a pilot of the software at HSBC in 2017 and will see the global bank and data start-up work together to better detect potentially illegal activity in its broader context, helping the bank fulfil its regulatory responsibilities and provide better understanding of the overall risk.

Quantexa CEO Vishal Marria says: “We will be supporting the bank to join the dots of all their data to give a broader understanding of their customers and transactions across the globe. Through a better understanding, HSBC will be better equipped in its fight against financial crime.”

Arachnys too has joined the movement, with the recent development of cloud-based Entity Management solutions (soft launched in August) that leverage machine learning and work with banks to help them spot potential issues in advance. 

“It is a whole new world, with a paradigm shift occurring in the industry,” says Sander. “Of course you need the old technologies – the regulators still require solutions to monitor transactions. But a lot of banks, around the world, are investing significant sums into entity profiling capabilities, and this area of AML and financial crime prevention is likely to grow very quickly.”

 

Author: ateamgroup
Posted: September 10, 2018, 2:14 pm

SS&C has announced its fourth acquisition of the year – the purchase of virtual data room pioneer Intralinks from Siris Capital Group for $1.5 billion. The addition of Intralinks will increase SS&C’s account footprint and add cloud-based virtual data rooms and secure collaboration solutions for its global banking, deal making and capital markets clients.

The acquisition, which is expected to close in the fourth quarter of 2018 and will likely give Siris Capital co-founder Frank Baker a seat on the SS&C board, follows hard on the heels of the company’s $1.45 billion purchase of Boston-based Eze Software announced at the end of July.

In March, the firm swallowed CACEIS North America to boost its North American presence, while in April it acquired DST Systems to expand its presence in the institutional, alternative, wealth management and healthcare segments. In the same month, SS&C backed out of the bidding race for Fidessa after ION Trading made a late offer to gazump initial bidder Temenos.

Bill Stone, chairman and CEO at SS&C Technologies, says: “Intralinks brings a wealth of expertise and a leadership position in the data sharing and collaboration technology space. We share many of the industry’s largest customers and together are well-positioned to meet the needs of major banks, alternative funds and other corporations seeking to automate document-centric, collaborative workflows.”

Author: ateamgroup
Posted: September 10, 2018, 11:25 am
AIM Software has named Florian Rosenberg as head of engineering with responsibility for leading the company’s product engineering team in the development and delivery of the GAIN product suite. He reports to AIM chief technology officer Deepak Srinivasan and is based at headquarters in Vienna.
 
Rosenberg has several years of experience in technology leadership, having worked at IBM to deliver innovative products and cloud services with a particular focus on DevOps and artificial intelligence, including the development of IBM’s ‘deep learning as a service’. He has also held leadership positions at CSIRO’s Data61 in Australia and Braintribe in Vienna.
 
Gayatri Raman, CEO at AIM Software, says: “We are constantly focused on doing better for our clients. Florian Rosenberg has the experience and technical vision to help accelerate our innovation plans and capitalise on the opportunities in front of us.”
 
Rosenberg adds: “AIM is a global leader in EDM software for the buy-side with a world class customer base. We have an incredible team of people, an unwavering focus on meeting client needs, and a clear vision for the future. This combination is exciting and I look forward to being part of the next chapter.”  
Author: ateamgroup
Posted: September 5, 2018, 1:11 pm
Thomson Reuters has responded to the need for Systematic Internalisers (SIs) to make their first regulatory report under MiFID II by December 2018 with RTS 27 Now, a targeted reporting solution that uses the company’s high-performance processing platform, Velocity Analytics, to manage data required for SI reports.
Under the MiFID II SI regime, banks had to register with their national competent authorities as SIs by the end of August 2018, based on whether their trading activity exceeded levels set across different instruments by ESMA on August 1, 2018. By the end of the year, all banks that are registered as an SI must submit an RTS 27 report on execution quality covering price, cost, size and speed of execution. This creates a challenge for some banks to compile the trading and market data they need and analyse it in time to submit their first report.
 
Brennan Carley, global head of enterprise for the Financial & Risk business at Thomson Reuters, says: “The MiFID II reporting regime is complex and many banks face a short window to conduct an urgent data retrieval and analysis exercise to compile their first report, which is why we have created a simple service with all the data, analytics and support they need.” 
 
To help SIs within the MiFID II regime, RTS 27 Now provides a flexible operating model and rapid service-based response, including operational data services so users can more easily complete their first report before the December deadline. As part of the Thomson Reuters Elektron Data Platform, Velocity Analytics can leverage all the data provided by Thomson Reuters, including new data resulting from 28 venues created by MiFID II. 
While RTS 27 Now is designed to support a specific reporting requirement, Carley says firms could use the underlying Velocity Analytics platform across the business.
 
The platform, which is powered by in-memory, time-series database technology from Kx, provides ultra-high-speed processing of cross-asset real-time and historical data, enabling firms to solve a wide range of challenges that require high-performance analysis of large datasets. These include managing trading against reporting thresholds, best execution reporting, transaction cost analysis, and quantitative and systematic trading, including real-time SI determination.
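
To give a feel for the data exercise involved, the heart of an RTS 27-style analysis is aggregating executed trades into per-instrument quality metrics covering price, cost, size and speed. The pandas sketch below uses invented trade records and column names; the actual RTS 27 templates are considerably more detailed:

    import pandas as pd

    # Invented trade records; a real exercise draws on full trading and
    # market data per instrument and per trading day.
    trades = pd.DataFrame({
        "isin": ["DE0001102309", "DE0001102309", "FR0000120271"],
        "price": [101.32, 101.35, 55.10],
        "size": [1_000_000, 250_000, 500_000],
        "fees": [120.0, 40.0, 75.0],
        "latency_ms": [340, 510, 275],  # order receipt to execution
    })

    report = trades.groupby("isin").agg(
        avg_price=("price", "mean"),
        total_size=("size", "sum"),
        total_cost=("fees", "sum"),
        avg_speed_ms=("latency_ms", "mean"),
        trade_count=("isin", "size"),
    )
    print(report)
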
Author: ateamgroup
Posted: September 5, 2018, 12:34 pm
The Association of National Numbering Agencies (ANNA) and the Global Legal Entity Identifier Foundation (GLEIF) have agreed an initiative that will link International Securities Identification Numbers (ISINs) and Legal Entity Identifiers (LEIs). The aim is to improve market transparency and support risk and exposure management.
The initiative will map new and legacy ISINs to corresponding LEIs, allowing firms to aggregate the data required to gain a clear view of their securities exposure within a given issuer and its related entities. Once implemented, the ISIN-to-LEI mapping table will be freely available to all on both the GLEIF and ANNA websites.
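
Once the table is published, the aggregation use case is straightforward. The minimal pandas sketch below joins a book of positions to an ISIN-to-LEI mapping and sums exposure per issuer LEI; the identifiers, values and column names are illustrative only:

    import pandas as pd

    # Illustrative mapping table and positions keyed by ISIN; two of the
    # securities below are assumed to share the same issuer LEI.
    isin_to_lei = pd.DataFrame({
        "isin": ["US0378331005", "US037833AR12", "DE0007164600"],
        "lei": ["LEIOFISSUER000000001", "LEIOFISSUER000000001",
                "LEIOFISSUER000000002"],
    })
    positions = pd.DataFrame({
        "isin": ["US0378331005", "US037833AR12", "DE0007164600"],
        "market_value": [5_000_000, 2_000_000, 3_500_000],
    })

    # Join each position to its issuer's LEI, then aggregate to see total
    # exposure to each legal entity across all of its securities.
    exposure = (
        positions.merge(isin_to_lei, on="isin", how="left")
                 .groupby("lei")["market_value"]
                 .sum()
    )
    print(exposure)  # 7,000,000 against issuer 1; 3,500,000 against issuer 2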
 
Stephan Wolf, CEO at the GLEIF, says: “While linking ISINs and LEIs has been mandated by some regulations, we see this initiative as beneficial to the global market as it is the first step towards having the tools to aggregate data necessary to assist with risk and exposure management.”  
Confirming ANNA’s commitment to promoting the use of standards, the initiative includes the ISO standards for the ISIN (ISO 6166) and the LEI (ISO 17442). Dan Kuhnel, chairman at ANNA, comments: “We are constantly looking at ways to promote standardisation and bring about harmony in the financial industry. We look forward to working with the National Numbering Agencies to help move this initiative forward into the implementation stage.”
 
Author: ateamgroup
Posted: September 5, 2018, 12:22 pm

Data monetisation has become key to revenue growth at financial institutions, but how can they get it right and achieve competitive advantage, and how will General Data Protection Regulation (GDPR) impact their progress?

Webinar Date: 
Thursday, September 27, 2018 - 15:00
Author: ateamgroup
Posted: August 30, 2018, 12:25 pm

Singapore Exchange (SGX) has introduced a Reference Data System (RDS) that is based on NeoXam’s DataHub data management solution and is designed to provide a central data repository for its instrument-related reference data and corporate actions data. NeoXam’s software is also expected to support future enhancements to RDS, such as automatic generation of events and the introduction of more data products to the platform.

The exchange says NeoXam’s scalable solution was the most suitable platform on which to build RDS. Besides acting as a central data source within the exchange, RDS facilitates the creation and delivery of an external SGX reference data feed providing timely and accurate Singapore security, issuer and financial statement data. These datasets can be used in parallel with SGX’s existing Corporate Actions Feed service for an enriched offering.

Ng Kin Yee, head of market data and connectivity at SGX, says: “We were looking for software which is scalable while having the capacity for our team to maintain the system internally. NeoXam’s DataHub allowed us to adhere to ISO and industry conventions, with the flexibility to adapt to changing data demands of the industry.”

Tim Versteeg, general manager at NeoXam APAC (ex China), adds: “The infrastructure of an exchange and numbering agency like SGX has to meet the highest regulatory and global standards. We are working with SGX to achieve and continue to ensure compliance in a smooth and cost-effective manner by using our out-of-the-box, adjustable data models and workflows, which enabled a swift and agile implementation.”

Author: ateamgroup
Posted: August 29, 2018, 1:00 pm
