
Friday, April 26, 2013

Are You Monetizing Your MDM Information Hub?

Enterprises with existing Master Data Management (MDM) programs can benefit by looking at some innovative ways to “monetize” their MDM data assets. Most existing MDM programs were launched with a mission statement that did not include directly generating revenue from the MDM Information Hub data asset. Traditionally, these programs were put in place to:

·         Improve the accuracy of data shared between disparate operational systems and external trade partners by creating a gold-standard master in the Information Hub.

·         Improve operational efficiency by automating the daunting tasks of validating, cleansing, and integrating the critical reference data shared between operational applications and trade partners.

·         Reduce business risk to the enterprise from erroneous data in critical compliance and regulatory reporting.

MDM programs are expensive! A lot of investment is poured into them year after year, and many are achieving their mission. The question for consideration: is saving our way to better profit enough, or is there an opportunity to increase revenues and boost the ROI of MDM investments? Enterprises need help finding ways to monetize that data asset so it generates revenue in addition to contributing savings and mitigating risk. For example, in the healthcare industry:

·         Consider the ICD-9-to-ICD-10 conversions. ICD-10 opens the door to analyzing medical procedures, protocols, and their outcomes at a much more granular level. As companies achieve meaningful use of EMR (Electronic Medical Record) applications, more detail about symptoms, diagnoses, treatments, protocols, supplies, involved professionals, outcomes, ongoing patient vitals, and re-admittance is available for analysis than ever before. ICD-10 encoding provides the level of detail needed for more robust analysis (a small illustration follows this list).

·         This opens the door to more sophisticated business discovery and cause/effect predictive/prescriptive analytics. As healthcare providers seek membership in an Accountable Care Organization (ACO), the ability to identify areas of potential treatment improvement through predictive and prescriptive analysis puts them in a much better position to score well across the ACO criteria established by the Centers for Medicare and Medicaid Services (CMS). Today there are 33 quality-of-care measures. Score well on these and Medicare/Medicaid will reimburse at 100% of the defined eligible benefit; providers that do a poorer job of providing care (as evidenced by lower scores) receive less of the total eligible reimbursement. That is lost revenue (revenue leakage).
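To make the granularity point concrete, here is a minimal sketch (Python/pandas, with hypothetical codes and column names): readmission rates computed per full ICD-10 code reveal variation that a coarser, ICD-9-era rollup averages away.

    # A minimal sketch of why ICD-10 granularity matters for analysis.
    # Codes and column names are hypothetical, for illustration only.
    import pandas as pd

    encounters = pd.DataFrame({
        "icd10_code": ["S72.001A", "S72.002A", "S72.001A", "S72.002A"],
        "readmitted": [1, 0, 1, 1],
    })

    # Coarse view: roll up to the 3-character category (roughly the
    # level of detail an ICD-9-era analysis would see).
    encounters["category"] = encounters["icd10_code"].str[:3]

    coarse = encounters.groupby("category")["readmitted"].mean()
    fine = encounters.groupby("icd10_code")["readmitted"].mean()

    print(coarse)  # one averaged rate for the whole S72 category
    print(fine)    # distinct rates per fully specified ICD-10 code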

Another area where advanced predictive/prescriptive analytics can help is in finding the root causes of the data quality issues that MDM hubs are commissioned to resolve. The goal is to improve the quality of the data (enforce the MDM gold-standard business rules) at the point of original inception, moving MDM cleansing/golden-record processing from a bulk-batch method to an on-demand, transactional method. Predictive analytics helps uncover the root causes of poor quality and can predict which data elements will suffer the greatest adverse ramifications from overnight processing lag. That prediction can be used to prioritize the most critical elements to move to on-demand cleansing and gold-standard processing, as sketched below.
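A minimal sketch of that prioritization, with hypothetical element names, error rates, and impact figures: rank each element by its observed error rate weighted by the predicted cost of the overnight lag, and move the top of the list to on-demand processing first.

    # Hypothetical prioritization of MDM data elements for on-demand
    # cleansing. All names and figures are illustrative assumptions.
    elements = [
        # (element, observed_error_rate, predicted_cost_of_lag in $/day)
        ("provider_npi",    0.04, 50_000),
        ("payer_plan_code", 0.11,  8_000),
        ("patient_address", 0.02,  1_500),
    ]

    # Expected daily exposure = error rate x predicted cost of the lag.
    ranked = sorted(elements, key=lambda e: e[1] * e[2], reverse=True)
    for name, err, cost in ranked:
        print(f"{name}: expected daily exposure ${err * cost:,.0f}")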

Many MDM programs have no means to analyze the overall enterprise data quality improvement of an MDM Information Hub over time, nor do they track the changes and trends in the level of effort and operational cost applied across all of the data elements in the hub. Here is an opportunity to implement data warehouse, business discovery, and advanced analytics principles on top of the MDM business function itself, much as enterprises do for traditional line-of-business functions such as Marketing Campaign Management, Sales Activity Management, Order Fulfillment, and Inventory Management.
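As a sketch of what that could look like, here is a toy version of a quality fact table: snapshot a rule-pass rate per data element per period and report the trend. The metric names and scoring rule are assumptions for illustration.

    # A toy "data quality fact table" for the MDM hub itself:
    # one pass-rate snapshot per element per period, then a trend report.
    import datetime as dt
    from collections import defaultdict

    quality_facts = defaultdict(list)  # element -> [(date, pass_rate)]

    def record_snapshot(element, records_checked, records_passing, when):
        """Store the share of records passing the gold-standard rules."""
        quality_facts[element].append((when, records_passing / records_checked))

    record_snapshot("customer_name", 10_000, 9_100, dt.date(2013, 3, 1))
    record_snapshot("customer_name", 10_000, 9_650, dt.date(2013, 4, 1))

    for element, series in quality_facts.items():
        series.sort()
        first, last = series[0][1], series[-1][1]
        print(f"{element}: {first:.1%} -> {last:.1%} ({last - first:+.1%})")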

Some enterprises are realizing that the gold-standard data asset within a specific information hub has intrinsic value to their industry. Innovation-minded enterprises are finding ways to leverage their hubs to sell packaged, value-added analytics and data-provider services to other enterprises and service providers in their industry. In healthcare, as Health Information Exchanges (HIEs) gain adoption, care-provider enterprises with high-quality information hubs will contribute to and take advantage of various HIEs, improving their ability to market services via the HIE open market by exposing and promoting their performance metrics against those of local competitors and global niche-provider/specialist competitors.

Are you looking for ways to monetize your MDM Information Hubs? Are you already monetizing them? I would love to hear from you.

Monday, February 4, 2013

HANA - Is it Real? Should You Wait?

I'm a practitioner who initially reserved a high degree of skepticism about the HANA promise. That is, until I had the chance to put HANA to the test. My findings? There's a lot of conjecture floating around about HANA. It's new, so there's not a lot of first-hand experience being shared.

HANA is the real deal, and that's based on first-hand experience.

I invested in and led a development team that took an Oracle DW, pointed the DDL at HANA, ran the existing ETL, and loaded the new HANA tables, all in just a couple of days. The ETL jobs and the existing report queries showed immediate performance improvement with very little development effort.
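For flavor, here is a minimal sketch of the kind of mechanical DDL retargeting involved, with a deliberately partial Oracle-to-HANA type map. A real migration would use vendor tooling and a far more complete mapping.

    # Illustrative Oracle -> HANA DDL retargeting. The type map is
    # partial and the table is a placeholder; this is not a converter.
    import re

    ORACLE_TO_HANA = {
        r"\bNUMBER\((\d+),\s*(\d+)\)": r"DECIMAL(\1,\2)",
        r"\bNUMBER\b":                 "DECIMAL",
        r"\bVARCHAR2\b":               "NVARCHAR",
    }

    def retarget_ddl(oracle_ddl):
        hana_ddl = oracle_ddl
        for pattern, replacement in ORACLE_TO_HANA.items():
            hana_ddl = re.sub(pattern, replacement, hana_ddl)
        # Analytic tables on HANA are typically column-store tables.
        return hana_ddl.replace("CREATE TABLE", "CREATE COLUMN TABLE")

    print(retarget_ddl(
        "CREATE TABLE sales (id NUMBER(10,0), region VARCHAR2(40), sold DATE)"
    ))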

Then we got after building an application that included mapping and loading additional source data into the integrated data schema on HANA, triggering event-based invocations of statistical scoring models (R), generating dynamic messages and alerts to mobile devices, capturing and committing write-backs from the mobile devices, and dynamically updating a set of performance management dashboards. In less than four months, with an agile team (1 DW Architect, 1 Data Scientist, 0.5 BI Architect, 0.5 Mobile Architect, and access to the client's subject-matter experts), we designed and deployed a robust application that is revolutionizing the business of higher education.
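The scoring models in the real application were written in R; the following is only a Python sketch of the event-driven shape, with hypothetical stand-ins (score_student, push_alert) for the model invocation and the mobile alert service.

    # Event-driven shape only: a committed record triggers a scoring
    # call, and a threshold crossing produces a mobile alert.
    # score_student() and push_alert() are hypothetical stand-ins.
    RISK_THRESHOLD = 0.8

    def score_student(record):
        # Stand-in for the statistical model invocation (R, in the real app).
        return 0.9 if record["missed_sessions"] > 3 else 0.2

    def push_alert(recipient, message):
        # Stand-in for the mobile alert/notification service.
        print(f"ALERT to {recipient}: {message}")

    def on_record_committed(record):
        risk = score_student(record)
        if risk >= RISK_THRESHOLD:
            push_alert(record["advisor"],
                       f"Student {record['id']} at risk ({risk:.0%})")

    on_record_committed({"id": "S-1001", "advisor": "jdoe", "missed_sessions": 5})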

Performance for ingest and load times, statistical model execution, mobile device alerts and write-backs, and dashboard response times all measured in seconds or less. In my experience this raises the bar on the other platforms I've worked on (and I've worked on most of them).

As for migrating BW tables from traditional RDBMS platforms to the HANA IMDBMS? With the Rapid Deployment Solutions (RDS) approach, migrating BW to HANA is a short and relatively simple tactical exercise. Out of the box, load and query performance is many times faster (mileage will vary depending on deployment). You then have the opportunity to optimize the underlying analytic and calculation views to extend functionality and improve the performance of your BW InfoCubes and reporting.

SAP has already started migrating the more intense and complex OLTP transactions to read from and commit updates to ERP tables running on HANA, and is seeing positive results. Why would you do this? Simple performance lift is one key reason. The more compelling reason is that you start to realize the benefits of an IMDBMS supporting hybrid OLTP and OLAP workloads against a single, integrated database design. Yes, the split of OLTP and OLAP into separate ecosystems is on the path to obsolescence, because the IMDBMS eliminates the performance constraints that forced those workloads apart in the first place.
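Here is a minimal sketch of the hybrid-workload idea, assuming SAP's hdbcli Python driver and a placeholder orders table: an OLTP insert and an OLAP aggregate run against the same table, with no replication hop to a separate warehouse.

    # Hybrid OLTP + OLAP against one table. Connection details and the
    # orders table are placeholders; hdbcli is SAP's HANA Python driver.
    from hdbcli import dbapi

    conn = dbapi.connect(address="hana-host", port=30015,
                         user="APP_USER", password="***")
    cur = conn.cursor()

    # OLTP: commit a single order row.
    cur.execute("INSERT INTO orders (id, region, amount) VALUES (?, ?, ?)",
                (1001, "WEST", 250.00))
    conn.commit()

    # OLAP: aggregate the very same table immediately, with no ETL hop.
    cur.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
    for region, total in cur.fetchall():
        print(region, total)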

SAP's hardware partners are coming out with scale-out reference architectures to leverage HANA's parallelism. This, on top of high compression ratios (10X plus) and innovative temperature-controlled archiving solutions from third-party software partners, opens up a capacity-planning model for scaling on HANA while providing a compelling overall (OLTP and OLAP ecosystem) TCO.
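A back-of-the-envelope capacity sketch using the compression figure above; the working-memory multiplier and warm-tier split are illustrative assumptions, not SAP sizing guidance. Use the vendor sizing tools for real planning.

    # Back-of-the-envelope HANA capacity planning. The multipliers
    # below are assumptions for illustration, not sizing guidance.
    source_footprint_tb = 40.0  # uncompressed data on the legacy RDBMS
    compression_ratio = 10.0    # the "10X plus" figure from the post
    working_mem_factor = 2.0    # headroom for intermediate results
    warm_tier_share = 0.5       # portion archived to a cooler tier

    hot_data_tb = source_footprint_tb / compression_ratio * (1 - warm_tier_share)
    ram_needed_tb = hot_data_tb * working_mem_factor

    print(f"In-memory (hot) data: {hot_data_tb:.1f} TB")
    print(f"RAM to plan for:      {ram_needed_tb:.1f} TB")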

Visionary enterprises are seeing the value proposition of moving to a platform that enables ultra-low-latency OLAP and OLTP applications while significantly reducing the resources and level of effort needed to design and maintain those applications. Less effort, because you don't have to over-engineer the data models, ingest layers, and query SQL to squeeze out marginal performance gains. You don't have to peel off subsets of data to run complex multi-pass statistical scoring models for fear of dimming the lights on other workloads. That means more accurate scoring algorithms and more agile plan, test, do cycles.

You can wait, but you may end up wondering why you did as you stare at your competitors' taillights. I highly recommend taking a good look at the platform: get out of the box, find a compelling use case, and map out the potential business outcomes before you decide to wait and see.