Toward A Modern Data Fabric For Capital Markets

Every financial institution I talk to is reconsidering some aspect of its data infrastructure. Renewal is, of course, an ever-present feature of the financial technology landscape, with firms racing to compete. But current conversations feel different. When it comes to data, the subject turns quickly away from discrete systems toward data strategy and a unifying data fabric that elevates enterprise data management to a new level.

Financial institutions across the spectrum of capital markets activities are recognizing that insights into client preferences, market trends and other factors can yield substantial benefits in terms of customer satisfaction, new product development, and ultimately profitability.

At the same time, they’re realizing that the systems they have in place, some of them decades old, aren’t equipped to deal with the demands of modern financial markets.

Existing legacy data solutions—often deployed across different clouds and on-premises—are starting to create a drag on innovation, hindering firms’ ability to profit from emerging digital technologies. This is driving efforts to adopt new data fabric technologies to harness the vast data sets at their disposal and transform themselves into data-driven organizations.

The ongoing digital transformation in capital markets comes against the backdrop of a broader migration trend toward cloud hosting and delivery. Although they were initially hesitant for security and performance reasons, many financial firms now find cloud attractive. Cloud can boost agility to respond to emerging opportunities, while underpinning the shift from an on-premises capital expense model to a hosted operating cost model, reducing pressures on the balance sheet.

But for many, the journey to becoming a data-driven enterprise is more a marathon than a sprint. As they explore new opportunities for leveraging their substantial data inventories, firms are finding hurdles and bottlenecks in the form of data quality issues and integration challenges.

Often, the fragmented nature of the institution—the legacy of relentless M&A activity in financial services—means different operational functions have different requirements from the same data sets.

This is hindering firms’ ability to derive true insight from their data sets, in the form of advanced analytics that can drive new innovations, new products and new areas of business.

The Legacy of Legacy

The legacy of legacy, in short, is lack of performance, whichever way you cut it. It can mean the inability of legacy platforms—like the mainframes and client/server installations practitioners (quietly) admit to retaining for important number-crunching applications—to perform in the face of today’s unyielding volatility and huge data volumes. Or it can mean the shortcomings of conventional cloud database solutions, which continually fail to deliver on their promise of performance at a lower cost.

As a result, financial institutions are facing pressure to modernize legacy applications and infrastructure. Despite the trading profits they enjoyed during the market volatility of the Covid-19 pandemic, sell-side firms are struggling to make money from their traditional lines of business in the ongoing low interest-rate environment. They’re under pressure to wring cost savings from existing systems and processes.

Buy-side firms continue to struggle to demonstrate the value of active vs. passive investment strategies, with even the best-performing hedge funds leaving investors underwhelmed. Aside from the largest investment managers, these firms have limited expert IT resources, making it difficult to pull together the data and analytics they need to seek out investment opportunities and optimize performance.

Adding to their operational challenges, firms continue to face heightened regulatory scrutiny across many areas of their activities. For sell-side firms, capital adequacy and risk remain data-intensive challenges with severe consequences for getting it wrong. For buy-side firms, the emergence of ESG (environmental, social and governance) investing as a mainstream discipline is creating new data sourcing, ingestion, and management challenges.

What’s to Be Done?

Faced with these challenges, firms are assessing their legacy estates with a view to modernizing applications. They’re transforming infrastructures to adopt digital technologies that streamline processes and improve client journeys and outcomes.

In many cases, this involves migration of the data infrastructures supporting specific business functions to cloud and SaaS delivery. This transformation can help add agility, support sophisticated analytics, and streamline the onboarding of clients and financial products. All of these require the kind of data-centric decision-making a consistent data fabric can enable.

Regulatory Challenges

But many firms are struggling to realize the vision of gaining insights into client preferences and market opportunities that may yield new products and activities. Emerging regulatory measures around capital adequacy are increasing the need for more granular analysis of their position and transaction data, to optimize the amount of capital available to business lines.

The stakes are high: Underemployment of capital equates to opportunity loss, resulting in poorer business outcomes. Non-compliance with capital adequacy rules risks regulatory censure and reputational damage.
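
To make the granularity requirement concrete, here is a minimal Python sketch of a per-desk capital-utilization analysis in the spirit described above. The position fields, risk weights, and the 8% minimum ratio are illustrative assumptions, not a regulatory calculation.

```python
# Illustrative sketch: per-desk capital utilization from position-level data.
# All figures, field names, and risk weights are hypothetical.
from collections import defaultdict

positions = [
    {"desk": "rates",    "notional": 50_000_000, "risk_weight": 0.20},
    {"desk": "rates",    "notional": 30_000_000, "risk_weight": 0.50},
    {"desk": "credit",   "notional": 20_000_000, "risk_weight": 1.00},
    {"desk": "equities", "notional": 10_000_000, "risk_weight": 1.00},
]
allocated_capital = {"rates": 3_000_000, "credit": 2_500_000, "equities": 1_500_000}

# Aggregate risk-weighted assets (RWA) per desk from granular positions.
rwa = defaultdict(float)
for p in positions:
    rwa[p["desk"]] += p["notional"] * p["risk_weight"]

MIN_RATIO = 0.08  # assumed Basel-style minimum capital ratio
for desk, desk_rwa in rwa.items():
    required = desk_rwa * MIN_RATIO          # capital the desk's book consumes
    utilization = required / allocated_capital[desk]
    print(f"{desk}: RWA={desk_rwa:,.0f} required={required:,.0f} "
          f"utilization={utilization:.0%}")
```

In this toy reading, low utilization flags under-employed capital (opportunity loss), while utilization at or above 100% flags a compliance risk.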

Firms are also finding that rules around privacy, client data, conduct, and ESG investing are forcing them to deal with new types of data, often in unstructured formats. These rules can also restrict where data can be hosted, creating data sovereignty challenges.

Incoming regulations around operational risk, such as the EU’s Digital Operational Resilience Act (DORA), are impacting the run-time analytics environment and further straining operational resources.

Combined, these factors are making it difficult for firms to progress toward a more digital approach to business, as data quality, consistency and availability issues hinder automation. Ongoing market volatility is boosting data volumes, even as firms face the need to deal with more granular data sets. This presents performance challenges that can hamper innovation and adoption of data-driven strategies, while substantially adding to costs.

Operational Hurdles

Siloed and fragmented data sets—often demarcated along lines of business and often the result of years of M&A activity—present a significant barrier to tackling the data quality issues that are hindering automation and advanced analytics.

Different user types, across trading, portfolio management, risk, compliance, and finance, have different data needs. This creates a barrier to adopting a standardized approach to data management and inhibits the drive to institute a data-driven organization and automate the enterprise.

Firms may lack a single golden copy of the truth for many data types, including transaction data, entity data, security master files and even calendar information. This situation is often perpetuated by a propensity for individual businesses to build their own solutions to pressing issues when no solution is immediately available.
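
As a rough illustration of what establishing a golden copy involves, the Python sketch below collapses conflicting per-business-line records into a single record per identifier. The records, field names, and the most-recently-updated survivorship rule are hypothetical; real implementations typically layer on source-priority and field-level rules.

```python
# Illustrative sketch: deriving a golden-copy security master from
# conflicting records held by different business lines. All records,
# fields, and the survivorship rule are hypothetical.
records = [
    {"isin": "US0378331005", "name": "Apple Inc",  "source": "trading", "updated": "2023-05-01"},
    {"isin": "US0378331005", "name": "APPLE INC.", "source": "risk",    "updated": "2023-04-15"},
    {"isin": "GB0002374006", "name": "Diageo plc", "source": "risk",    "updated": "2023-05-02"},
]

# Simple survivorship rule: keep the most recently updated record per ISIN.
golden = {}
for rec in records:
    isin = rec["isin"]
    if isin not in golden or rec["updated"] > golden[isin]["updated"]:
        golden[isin] = rec

for isin, rec in sorted(golden.items()):
    print(isin, rec["name"], f"(from {rec['source']})")
```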

Traditional approaches typically involve some element of centralization, restricting firms’ ability to share information cost-effectively when architectures really need to support the data-dispersed reality of many financial services businesses.

Finally, data teams often are not adequately resourced to perform the data quality functions needed to ensure data integrity in support of more automated processes. As a result, they may struggle to conduct the data cleansing, normalization, validation, and integration required for the task at hand.
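
A minimal sketch of that cleansing and validation work, assuming a hypothetical trade feed: identifiers are normalized, fields are type-checked, and failing records are quarantined so only clean rows reach downstream automation. The rules shown are illustrative, not a full data quality framework.

```python
# Illustrative sketch: basic cleansing and validation on an inbound feed.
# The feed layout and the validation rules are hypothetical.
import re

raw_trades = [
    {"isin": " us0378331005 ", "quantity": "100", "price": "172.5"},
    {"isin": "BADID",          "quantity": "50",  "price": "12.0"},
    {"isin": "GB0002374006",   "quantity": "-10", "price": ""},
]

# ISIN shape: 2-letter country code, 9 alphanumerics, 1 check digit.
ISIN_RE = re.compile(r"^[A-Z]{2}[A-Z0-9]{9}\d$")

clean, rejected = [], []
for t in raw_trades:
    isin = t["isin"].strip().upper()  # normalization
    errors = []
    if not ISIN_RE.match(isin):
        errors.append("invalid ISIN")
    if not t["price"]:
        errors.append("missing price")
    if errors:
        rejected.append({**t, "errors": errors})  # quarantine for review
    else:
        clean.append({"isin": isin,
                      "quantity": int(t["quantity"]),
                      "price": float(t["price"])})

print(f"{len(clean)} clean, {len(rejected)} rejected")
```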

Because data is frequently specific to, and tightly coupled with, individual lines of business, there is often little appetite to move to a more data-centric, service-driven approach to data provision across the enterprise.

How Yellowbrick Can Help

Yellowbrick is a modern, open, elastic data warehouse that runs consistently across cloud and on-premises environments with predictable and controllable costs. For firms across the spectrum of financial services, from banks and insurers, buy-side and sell-side firms, through to custodians, service providers, and other intermediaries, Yellowbrick is a catalyst for modernizing data warehouse infrastructure. It allows firms to reduce concentration risk in their exposure to cloud operators, increase efficiency through predictable spend and faster time to analysis and insight, and enjoy the benefits of digital transformation.

Yellowbrick delivers on the Distributed Data Cloud (DDC), an approach to enterprise data across cloud platforms that recognizes that different data sets may have different ownership within the same cloud region, across different regions, or spanning on-premises and cloud.

DDC offers a single operating model and base technology across the entire data fabric, simplifying data architecture, lowering the cost of operation, and enabling data and functions to shift seamlessly between cloud and on-premises platforms.

Yellowbrick’s “Your Data Anywhere” approach supports the financial institution’s journey from on-premises deployment to hybrid private and public cloud, enabling firms to streamline and modernize the applications and IT/data infrastructures supporting their capital markets activities.
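
As a sketch of what a single operating model can mean in practice, the snippet below runs the same SQL unchanged against an on-premises instance and a cloud instance. The hostnames, credentials, and trades table are assumptions for illustration; connectivity here goes through Yellowbrick’s PostgreSQL-compatible interface using the standard psycopg2 driver.

```python
# Illustrative sketch: one query, unchanged, against two deployments.
# Hostnames, credentials, and the trades table are hypothetical.
import psycopg2  # standard PostgreSQL driver; Yellowbrick exposes a
                 # PostgreSQL-compatible interface

ENDPOINTS = {
    "on_prem": "host=yb-onprem.example.internal dbname=markets user=analyst",
    "cloud":   "host=yb-cloud.example.com dbname=markets user=analyst",
}

QUERY = "SELECT trade_date, count(*) FROM trades GROUP BY trade_date ORDER BY trade_date"

for name, dsn in ENDPOINTS.items():
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(QUERY)  # identical SQL, wherever the data lives
        print(name, cur.fetchall()[:3])
```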

Yellowbrick de-risks firms’ efforts to migrate appropriate capital markets activities to cloud, and provides the performance and cost-efficiency firms need to handle mission-critical, time-sensitive tasks with a high cost of failure.

Whatever their scale of operations, banks, investment banks and brokerages, hedge funds and asset managers, fund administrators and data vendors all need to bring their legacy data estates up to date to support world-class trading and investment technology.

Yellowbrick’s innovative architecture means it can achieve performance comparable to legacy solutions on a significantly smaller footprint, resulting in cost savings, energy savings, and sustainability benefits.

Learn More

The challenges associated with establishing a modern data fabric are discussed in an A-Team Group webinar sponsored by Yellowbrick and featuring expert practitioners from Morgan Stanley and Voya.

The Modernising Data Infrastructures: Challenges and Opportunities of Managing Complex Financial Data & Analytics in Hybrid & Multi-Cloud Environments webinar explores how to manage the transition from deeply entrenched legacy platforms to emerging hybrid, on-premises, and cloud environments. Learn more about the challenges facing data management, infrastructure, and engineering teams as they grapple with implementing modernization programs.
