The bursting of the real estate bubble in 2008 sent shockwaves worldwide -- an event so significant that the US economy was only determined this year to have recovered to pre-bubble levels. But it also set off shockwaves of another kind: regulatory ones.

To address these regulatory needs for Wall Street, Red Hat Storage teamed up with Hortonworks to build an enterprise-grade big data risk management solution. At the recent Strata+Hadoop World, Vamsi Chemitiganti (chief architect, financial services) presented the solution in a session -- which you can watch for yourself at the bottom of this post.

[Slide: solution architecture workflow from Vamsi Chemitiganti's session]

The slide does a great job of breaking down the workflow. To spare your eyes, here are the steps:

  1. The Red Hat JBoss Data Grid and the Hortonworks Data Platform (HDP) are primed with real-time and batch loads (see the first sketch below)
  2. Data transformations take place in HDP; data flows back and forth between the Red Hat JBoss Data Grid in-memory compute tier and HDP
  3. In-memory calculations take place within Red Hat JBoss Data Grid, and the results are routed to HDP, where they feed interactive and batch jobs
  4. Interactive SQL queries and batch jobs execute within HDP (see the second sketch below)
  5. Business users leverage business intelligence tools to interactively query data in HDP and/or a relational database management system (RDBMS).
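To make steps 1 and 3 a little more concrete, here is a minimal sketch of priming a JBoss Data Grid (Infinispan) cache and running a simple in-memory calculation over it. The cache name, keys, and exposure figures are made up for illustration, and a real deployment would join the clustered compute tier rather than spin up an embedded cache manager:

```java
import org.infinispan.Cache;
import org.infinispan.configuration.cache.ConfigurationBuilder;
import org.infinispan.manager.DefaultCacheManager;

public class GridPrimingSketch {
    public static void main(String[] args) {
        // Embedded cache manager for illustration only; a production setup
        // would bootstrap from a JBoss Data Grid configuration and join the
        // clustered in-memory compute tier.
        DefaultCacheManager cacheManager = new DefaultCacheManager();
        cacheManager.defineConfiguration("exposures", new ConfigurationBuilder().build());
        Cache<String, Double> exposures = cacheManager.getCache("exposures");

        // Step 1: prime the grid -- hard-coded values here stand in for the
        // real-time and batch position feeds.
        exposures.put("POS-1", 1_000_000d);
        exposures.put("POS-2", 500_000d);

        // Step 3: a toy in-memory calculation over the cached data, e.g.
        // total exposure across positions, before routing results to HDP.
        double total = exposures.values().stream()
                .mapToDouble(Double::doubleValue)
                .sum();
        System.out.printf("Total exposure: %.2f%n", total);

        cacheManager.stop();
    }
}
```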

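And for step 4, here's a sketch of an interactive SQL query against HDP through the HiveServer2 JDBC driver. The host, port, credentials, and the trades table are placeholders for illustration, not details from the presentation:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class InteractiveQuerySketch {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC driver shipped with Apache Hive / HDP.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hdp-master:10000/default", "hive", "");
             Statement stmt = conn.createStatement();
             // Toy interactive query: aggregate notional per trading book.
             ResultSet rs = stmt.executeQuery(
                 "SELECT book, SUM(notional) AS total_notional "
                 + "FROM trades GROUP BY book")) {
            while (rs.next()) {
                System.out.println(rs.getString("book") + " -> "
                    + rs.getDouble("total_notional"));
            }
        }
    }
}
```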
Check out the full session right here: