Red Hat and Hortonworks collaborate on a modern data architecture for financial services.

There are very few industries that are as data-centric as the banking and financial services industries. Every interaction that a client or partner system has with a banking institution produces actionable data that has potential business value, as well as a level of risk associated with it. To stay competitive, financial services institutions have to capture, store and analyze all this data to more accurately forecast market movement, understand operations, screen for fraud and comply with regulations.

For these reasons, the strategic alliance of Red Hat and Hortonworks is focused on helping financial services companies adopt enterprise Apache Hadoop and create competitive advantages by improving risk management, reducing fraud and improving investment decisions. 

Challenges That Financial Institutions Face

Financial services firms have operated under significantly increased regulatory requirements, such as Basel III, since the 2008 financial crisis. As capital and liquidity reserve requirements have increased, knowing exactly how much capital must be reserved, based on current exposures, has become critical. Unnecessarily tying up excess capital can keep a firm from taking advantage of business and market opportunities. Today's risk management systems must respond to new reporting requirements and also handle ever-growing amounts of data to perform more comprehensive analysis of credit, counterparty, and geopolitical risk.

However, existing systems that were not designed to meet today's requirements cannot finish reporting in time for the start of business or trading, which can lead to uninformed decisions. The problem is compounded by the increasing need for intra-day reporting, as well as the short window for overnight batch processing imposed by global trading and electronic exchanges. And many of these systems are inflexible and expensive to operate.

There are other problems that aging in-house solutions can present, such as:

  • Data is often stored across many silos throughout the firm, using multiple technologies which all require different methods for obtaining access. Instead of focusing on analysis and reporting, valuable time can be wasted as teams try to figure out how to reliably obtain the necessary data.
  • Many proprietary solutions have been built using high performance computers or grid computing clusters that are inflexible and can consume large portions of the available technology budget without meeting evolving challenges. Since these systems often don’t use any standard interfaces, off-the-shelf tools can’t be used or require custom development.
  • Existing systems typically lack the security and controls necessary to keep up with compliance and data security requirements.

Requirements for a Successful Big Data Deployment

There are several requirements that financial institutions need to consider as they deploy a big data solution:

  • Analyzing liquidity risk requires access to both real-time and frequently updated pricing data.
  • Longer-term information such as reference data, historical pricing and results of past valuations also need to be accessed.

The two types of data have vastly different requirements for processing and storage. In the past, systems that attempted to address both types of requirements with a single data store generally either failed to meet performance requirements or were prohibitively expensive.

To address this, a modern data architecture for risk management has been developed, including both a repository that is well suited to real-time data with frequent updates, and a repository that is specifically designed to accumulate large volumes of data. The modern data architecture for financial services makes it easy for batch or real-time applications to get the data they need. It also enables analysts to perform interactive ad-hoc research. Many standard data interfaces are available, which also enables financial institutions to use off-the-shelf software without having to develop custom adapters.
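To make the two-repository idea concrete, here is a minimal sketch in Python. The class, names, and data are all hypothetical, not the actual Red Hat/Hortonworks implementation: a dictionary stands in for the low-latency in-memory grid, and an append-only log stands in for the high-volume historical store that Hadoop would provide.

```python
from collections import defaultdict

class TieredRiskStore:
    """Illustrative sketch of the two-tier architecture:
    a 'hot' store for the latest prices (in the real solution,
    an in-memory data grid) alongside an append-only 'cold' log
    for historical analysis (in the real solution, HDFS)."""

    def __init__(self):
        self.latest = {}                  # hot tier: instrument -> last price
        self.history = defaultdict(list)  # cold tier: instrument -> all ticks

    def ingest(self, instrument, ts, price):
        # Every tick updates the hot tier and is appended to the cold log,
        # so real-time and batch consumers each get a store suited to them.
        self.latest[instrument] = price
        self.history[instrument].append((ts, price))

    def current_price(self, instrument):
        # Real-time path: constant-time lookup of the latest price.
        return self.latest[instrument]

    def price_series(self, instrument):
        # Batch/analytics path: the full history for back-testing or risk runs.
        return list(self.history[instrument])
```

For example, after ingesting two ticks for a hypothetical instrument "ACME", `current_price` serves the real-time path while `price_series` serves batch analysis, without either workload interfering with the other.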

Big Data Use Cases

Big data use cases in banking can be classified into two broad areas from a business value perspective:

  • Building competitive advantage in the Red Ocean part of the business.
  • Developing new markets, i.e., Blue Oceans.

The CIO’s primary concern should be business value and competitiveness, not simply keeping up with trends for the sake of it. There are four distinct use cases where Red Hat sees firms beginning to use big data related technologies:

  • Risk management;
  • Fraud detection and management;
  • Intelligent customer upsell in retail banking and wealth management; and
  • Investment management.

The Red Hat and Hortonworks Solution

Red Hat and Hortonworks have collaborated to offer a solution that provides an open, modern data architecture for financial services risk management. The solution is designed to scale out using small commodity systems. An in-memory data grid stores real-time data with low latency, while the Hortonworks Data Platform uses Apache Hadoop to efficiently process and store huge volumes of data that support advanced analytics capabilities. The key to scalability is that each additional system provides increased memory and storage, as well as more capacity for computation. Flexibility is another advantage of the Red Hat and Hortonworks solution. Data is easily accessible to batch jobs, real-time processing, or interactive research. Standard interfaces enable use of a variety of off-the-shelf business software.
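To illustrate the kind of advanced analytics the historical tier is designed to feed, here is a hedged sketch of one-day historical value-at-risk computed from an accumulated price series. The function name and data are hypothetical; a production risk engine would run this class of computation at scale over years of pricing history rather than a short Python list.

```python
def historical_var(prices, confidence=0.95):
    """One-day historical value-at-risk from a series of closing prices.
    Returns the loss (as a positive fraction of portfolio value) that is
    not exceeded at the given confidence level. Assumes at least two
    prices so that at least one daily return can be computed."""
    # Daily returns from consecutive closing prices.
    returns = [(b - a) / a for a, b in zip(prices, prices[1:])]
    # Sort returns worst-first and take the cutoff at (1 - confidence).
    returns.sort()
    idx = int((1 - confidence) * len(returns))
    return -returns[idx]
```

For instance, for the illustrative series [100, 95, 100] the worst daily return is -5%, so the 95% one-day VaR is 0.05. The point of the architecture is that this batch-style computation reads from the accumulated historical store without touching the low-latency real-time path.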

The strategic alliance between Hortonworks and Red Hat offers many benefits, including
engineering and integrated customer support. Both companies are aligned on an open
source business model, which focuses on making innovations from open communities ready for the enterprise through tested, certified software releases, and world-class support.

To learn more about the Red Hat and Hortonworks solution for financial services, you can listen to this webinar we aired in April. Or check out this white paper, “Open Modern Data Architecture for Financial Services Risk Management” we authored in partnership with Hortonworks. You can also find out more about the Red Hat and Hortonworks partnership here.




