
Thursday, July 3, 2014

Mortgage Lending Looking Up...and Looking Out for "The TRID"


Finally, some light at the end of the tunnel for the mortgage lending industry, and it's not a train coming head-on.  John Bancroft from Inside Mortgage Finance (IMF) noted, "Fannie Mae and Freddie Mac…issued $141.83 billion of single-family mortgage-backed securities during the second quarter, an encouraging 9.4 percent increase from the dreary levels recorded in the first three months of 2014."  The market has a way to go before we can be confident it's out of the woods, but Fannie & Freddie saw a solid 15.2% increase in single-family mortgage-backed securities issued from May 2014 to June 2014.

Along with that promising news, IMF reported that lenders are improving their closing times for purchase mortgages and increasing the accuracy of good faith estimates, which means fewer closing-cost surprises for borrowers.

In the mortgage lending industry, the name of the game is having best-in-class Loan Production processes, as measured by shorter cycle times and increased throughput, while ensuring quality credit decisions and accurate loan documentation.  I work with several mortgage lenders who use diagnostic-based analytic techniques and self-service Business Discovery platforms to continually monitor the proficiency and productivity of their Loan Production processes.  When they see blips on the performance radar, they zero in on the deal, team member and/or originator to determine root cause and effect the necessary changes to bring performance back into alignment with corporate performance benchmarks.

A best-in-class Loan Production process matters because Loan Originator (Broker, Correspondent, etc.) "lender loyalty" is largely a function of which lender can return the best credit decision the quickest while providing a frictionless experience in producing the interim and final loan documentation.  Improvements in "lender loyalty" performance measures increase the lender's share of that originator's business, and that means more earning assets in the lender's portfolio.

Here comes the TRID! The CFPB's (Consumer Financial Protection Bureau) TILA/RESPA (Truth in Lending Act/Real Estate Settlement Procedures Act) Integrated Mortgage Disclosure rule.  In short, this rule applies to "most" closed-end consumer mortgages and mandates the consolidation of four (4) forms down to two (2) more understandable, informative forms. The rule also mandates timing requirements for when those completed and accurate forms MUST be delivered to the loan applicant/borrower.

·         First, the Good Faith Estimate (GFE) and the initial Truth-in-Lending disclosure (initial TIL) have been combined into a new form, the Loan Estimate.  The Loan Estimate must be provided to consumers no later than the third business day after they submit a loan application.

·         Second, the HUD-1 and final Truth-in-Lending disclosure (final TIL and, together with the initial TIL, the Truth-in-Lending forms) have been combined into another new form, the Closing Disclosure.  The Closing Disclosure must be provided to consumers at least three business days before consummation of the loan.

My "spidey-sense" tells me TRID is a report specification change (new layouts, new filter criteria, new reporting elements) plus a new set of performance benchmarks against which lenders MUST manage their operations. Are mortgage lenders anxiously awaiting a software "regulatory patch" from their Loan Origination System (LOS) software vendor(s)?  Maybe there's a better way.  How about using their Enterprise Business Discovery platforms to produce the reports (forms), track performance and proactively trigger alerts for loan applications requiring attention BEFORE they violate regulatory performance rules?  Today's Business Discovery tools are by and large the most agile and nimble platforms for modifying existing reports, creating new ones, providing interactive performance dashboards and generating alerts.  Even if the lender doesn't have all the necessary loan application detail flowing into their enterprise data warehouse, they can still use their business discovery platform to bring the right data together to handle regulation changes such as TRID.
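To make that concrete, here is a minimal, hypothetical sketch (in Python) of the kind of proactive TRID deadline alerting a Business Discovery platform could drive. The field names (application_date, loan_estimate_sent, consummation_date) are assumptions for illustration; a real implementation would source them from the LOS and honor the lender's business-day and federal-holiday calendar.

```python
from datetime import date, timedelta

def business_days_between(start: date, end: date) -> int:
    """Count business days (Mon-Fri) after `start` up to and including `end`; holidays ignored for simplicity."""
    days, d = 0, start
    while d < end:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Mon=0 ... Fri=4
            days += 1
    return days

def trid_alerts(loans, today: date):
    """Yield alert messages for loans at risk of missing the TRID delivery deadlines."""
    for loan in loans:
        # Loan Estimate: due no later than the third business day after application.
        if loan["loan_estimate_sent"] is None:
            elapsed = business_days_between(loan["application_date"], today)
            if elapsed >= 2:
                yield f"{loan['loan_id']}: Loan Estimate due (business day {elapsed + 1} of 3)"
        # Closing Disclosure: due at least three business days before consummation.
        if loan["closing_disclosure_sent"] is None and loan.get("consummation_date"):
            remaining = business_days_between(today, loan["consummation_date"])
            if remaining <= 3:
                yield f"{loan['loan_id']}: Closing Disclosure must go out now ({remaining} business days to closing)"

if __name__ == "__main__":
    pipeline = [
        {"loan_id": "A-1001", "application_date": date(2014, 7, 1),
         "loan_estimate_sent": None, "closing_disclosure_sent": None,
         "consummation_date": date(2014, 8, 4)},
    ]
    for alert in trid_alerts(pipeline, today=date(2014, 7, 3)):
        print(alert)
```

The same pattern extends to any timing rule the CFPB hands down: express the rule as a computed measure, then alert on loans approaching the threshold instead of reporting on violations after the fact.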

The CFPB's TRID rule is targeted at the core steps within the Loan Production process. After all that hard work optimizing the Loan Production process, it would be a shame to erode that performance by scrambling to make LOS system changes and business process changes to comply with the rule (the deadline is August 2015).  With the right business discovery platform, the new forms can be simulated within days to weeks, and the new processing performance measures can be modeled and implemented now.

Still not convinced?  Please consider that the CFPB has only recently completed taking over responsibility for RESPA from the Department of Housing and Urban Development.  During that multi-year transition, enforcement of RESPA Section 8 was in abeyance. Industry experts are now forewarning that "…the CFPB is picking up where HUD left off, and then some." This means more rules will be handed down with the intent of making things clearer to the borrower. That sounds like more meaningful reports and more regulatory performance measures to implement in order to avoid costly fines for non-compliance.

TRID is considered massive and should not be taken lightly. A lot can be done in thirteen months using your business discovery platform.  Consider using it to get a jump on implementing regulatory changes such as TRID.

Sunday, March 9, 2014

Why Can't the BI Hardware & Software Vendors Leave Good Enough Alone?


It's tough being a VP of Business Analytics these days.  The vendors are relentless. They'll call on anyone in the enterprise and convince them to acquire their technology. My business community is restless, dare I say, sometimes reckless.  They buy into these pitches and acquire products, oftentimes without my even knowing. And oh my... the myriad of options out there is mind-boggling and seems to grow every quarter.  Why can't I just keep my shop running as it is?
The challenge facing most providers of business analytics (BI/DW IT shops) is that their solutions take too long and typically miss the requirements mark. After a lot of hard work they end up with a largely dissatisfied consumer population.  Our heritage is to focus on getting that "single version of truth": a pristine set of curated data before we allow the business community to consume it.  Our techniques and our architectural reference patterns were built upon an industry heritage and historical tradition developed on technologies that are rapidly becoming obsolete. Our understanding of the true meaning of operational data is limited and our mission has become too data-centric. (We think data is the "ends", when it is merely the "means" to an end; namely, optimized functional performance through actionable insights from better business analytics.)  End result: the business community takes BI into its own hands. Wham! Spread-marts, user-marts, data sprawl, data anarchy, disastrous business decisions, fines, lost licenses, jail time, workforce reductions, bankruptcy.  That is a bit unnerving!  So I get the CIO to convince his peers to slap the business' hands, allocate the time and money to rationalize and simplify that data mess back into IT control, recoup costs, mitigate risks, get it under control.  Whew, success... then Wham! The 2-3 year cycle starts again. Where's the innovation in that?
Over the last 7-8 years the BI tool offerings, especially the self-service data discovery tools, have cut through the critical path of any data warehouse/BI project because the end-consumers who really know their data interact WITH production data through advanced visualization techniques. In collaboration with IT, the business drives out the formulation of a federated, integrated business view, complete with the difficult integration, transformation and data quality rules applied against it, all validated by the business in real time. End result?  Usable, reliable, agile self-service solutions that enable better decision making now.   If that same set of information needs to be accessible to other consumers via other tools, the developers now have a validated set of specifications in that self-service BI tool's metadata repository. This metadata export is used as the basis for implementing the more traditional ETL-->Enterprise DW-->reporting Data Mart (OLAP) architecture. This may not be for everybody. However, it behooves us to continually assess the viability of new self-service enabling technologies. The sooner the better, because next on the horizon is self-service, near-real-time Predictive & Prescriptive Analytics! The pristine designs that we spent years, even decades, refining are built on enabling hardware and software stacks which are increasingly obsolete and relatively more expensive to maintain versus the newer options in the market.  So I ask myself, "Self? What is the acceptable life-span of good enough?"

On the big data front: for the last 3 years that term ("big data") raised the hackles on my neck. "We've been doing big data for decades, folks!" Well, while that is true, I had to reckon with the fact that a lot of investment is being poured into hadoop-alooza. It is a new architecture that is spawning a whole new revolutionary round of innovation. More importantly, there is a deluge of digital information sets floating around the world waiting to be monetized, now. This is information that previously just wasn't available.  Properly harnessed, it can bring richer insight into markets, competitive landscapes, prospect/customer behaviors & psyches, world events and causal relationships that can and will impact/influence how my enterprise leaders are informed.  "Big Data" is enabling just that. It is my responsibility as a BI solution provider to investigate how we can cost-effectively snatch that data out of the air-waves and avail it to the creative minds of our business analysts & data scientists to find that next "a-hah" moment and proactively act upon it before the competition does, or before an early-stage outbreak becomes the next world epidemic.
Data Warehousing/Business Intelligence has been mainstream for a good 15-20 years. It was the radical idea back in those early-adopter years and was met with a lot of resistance.   Imagine the state of our world had we taken the position that the truckload of green-bar reports delivered to the mailroom was good enough.  As I look across the Business Analytics landscape, I see a new generation taking the helm in our business management and leadership roles.  This generation literally grew up with technology in their hands. They were weaned on gaming technology that learns overnight based on the previous day's world-wide interactions.  They are demanding that their business operational systems learn how to do the same.

I applaud the innovation coming out of our hardware and software provider communities. They challenge me as a Business Analytics solution provider to establish an "innovation culture".

What's good enough? A culture that comes in each morning firmly anchored in the truth that change is a constant that must be embraced. A culture excited to learn one new thing and propose one new idea to try. I challenge myself and my team to find ways to uplift our current investment value and embrace new technologies and techniques with prudent governance.

Tuesday, February 25, 2014

"Advanced Analytics" - A Brief Interview



 
I recently had the pleasure of a short interview with Fern Halper, Director of Research for The Data Warehousing Institute (TDWI), during the 2014 TDWI World Conference in Las Vegas. Here's a quick recap of our discussion.

 

 How do you define advanced analytics?

Unfortunately, the term Advanced Analytics has become a bit amorphous; what seems advanced to some isn't really advanced to others.  I think the intent is to clarify the distinction between business intelligence (point-in-time reporting/dashboards/scorecards) and exploratory analytics. First it's important to describe the various modes of "analytics". I rather like the point of view of Tom Davenport's International Institute for Analytics (www.iianalytics.com):


Descriptive, Diagnostic, Predictive, and Prescriptive. That said, when I hear "advanced analytics", I tend to associate the term with the diagnostic, predictive and prescriptive modes. Let me explain (a rough sketch in code follows the list):

·         Diagnostic Analytics is the mode where interesting hypotheses are formulated, using techniques and tools that enable data discovery, data mining and text mining to uncover apparent patterns, anomalies and/or behavior trends.

·         Predictive Analytics is the mode where those hypotheses are tested using predictive modeling disciplines, applying future-state assumptions and running various simulations to find and validate causal relationships which prove, disprove or even morph the hypothesis.

·         Prescriptive Analytics is the mode where recommendations for change or remediation are developed: a prescription, if you will, for how the targeted business process/activity should be tweaked to achieve a higher level of optimization given the predicted future state.  The ultimate goal is to effect change that improves process execution for a better business outcome.
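For illustration only (this wasn't part of the interview), here is a toy Python sketch of the three modes against a made-up series of weekly loan cycle times: flag the anomalies (diagnostic), project the trend (predictive), and recommend an action when the projection breaches a benchmark (prescriptive).

```python
from statistics import mean, stdev

cycle_times = [21, 22, 20, 23, 21, 29, 24, 26, 27, 28]  # hypothetical weekly cycle times (days)

# Diagnostic: surface the anomalous weeks that warrant root-cause investigation.
mu, sigma = mean(cycle_times), stdev(cycle_times)
anomalies = [(week, ct) for week, ct in enumerate(cycle_times, 1) if abs(ct - mu) > 1.5 * sigma]

# Predictive: fit a simple linear trend and project next week's cycle time.
n = len(cycle_times)
xs = range(1, n + 1)
slope = (n * sum(x * y for x, y in zip(xs, cycle_times)) - sum(xs) * sum(cycle_times)) / \
        (n * sum(x * x for x in xs) - sum(xs) ** 2)
intercept = mean(cycle_times) - slope * mean(xs)
forecast = intercept + slope * (n + 1)

# Prescriptive: recommend an intervention when the forecast breaches the benchmark.
target = 25
action = "re-balance underwriting workload" if forecast > target else "no change required"

print(f"anomalous weeks: {anomalies}")
print(f"forecast for week {n + 1}: {forecast:.1f} days -> {action}")
```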

 

What do you see your clients doing with the technology?  What’s new and interesting? 

The self-service data discovery tools are being used by business-community clients to do much more than rapidly create visually rich performance dashboards.

·         Clients are using this technology to flesh out complex specifications for things like data quality rules and MDM match-merge-survivorship rules. 

·         Clients are finding that next “a-hah” moment, discovering things about how the business is running by diagnosing why the data looks the way it does. They are using these data discovery disciplines as a divining rod to feed candidate projects into the predictive/prescriptive programs.

Virtually all of the BI platform providers are developing integration kits between their visualization platforms and the predictive modeling/scoring platforms (both open source as well as the mainstream commercial offerings). This will improve their ability to visualize the data at the macro-level with drill downs to the detailed level during model development and testing. This also improves the ability to visualize the ongoing shifts in business outcomes as a result of implementing the predictive model and acting on the prescribed changes.

Virtually all of the BI Platform providers and Data Integration Platform providers are developing integration kits and transformation objects to embrace the Hadoop eco-system for “big data”.  This opens the door for the Data Scientists to integrate more and more external, contextual data from 3rd parties or public sources thereby enriching their models.

Another early trend I am seeing is increased collaboration between the internal process improvement teams and the business analytics teams. Companies with Lean or Six Sigma programs are defining process performance measures during value-stream and activity-based costing analysis. When those data points are integrated into the business analytics platforms, they give business leaders visibility into how the company is performing, not only on the traditional lagging key performance indicators but also at a lower level, where the process improvement measures provide leading-indicator insight into how well business functions are running.

Finally, another emerging trend is the use of automated process discovery tools and workforce engagement tools that capture how processes are actually being performed and what people are actually engaged in throughout the day. Armed with this functional view of the business operations, we can make better prescriptions for how to optimize those activities going forward.

 

What’s the biggest challenge companies face with analytics and how do you see them dealing with it?

The biggest challenge is definitely not the technology; rather, it is the organizational alignment and the culture of the enterprise, particularly around shifting from "wisdom of experience" decisions toward more "data-driven" decisions, which arm our business leaders with facts to support their decision making. This culture shift is being led by the "Chief Analytics Officer" role.

Effective CAOs create a culture of best-practice evolution as organizations build an analytics ecosystem that prizes data-based decisions; they manage the changes that analytics-driven decisions bring about, act as a trusted partner across the various internal lines of business, and find and champion compelling business decisions and projects on behalf of the functional line-of-business owners. A CAO is a gifted storyteller, speaking the language of the business as well as the language of the data scientist, and an evangelist promoting the need to explore the possibilities. Effective CAOs take ownership of and demonstrate confidence in decision outcomes; they are willing to take risks and take responsibility for the results, good and bad, by implementing a culture that creates transparency, accountability and "air-cover".

Another key challenge is developing and acquiring the skills in the enterprise to become Advanced Analytics mission-capable. According to a 2011 study by the McKinsey Global Institute, US businesses alone will need 190,000 more workers with deep analytical expertise and 1.5 million more data-literate managers by 2018. That means:

·         Business managers who have the capacity for a deeper level of thoughtfulness and rigor around decision making, along with a willingness to explore and modify assumptions.

·         Business analysts with the expertise to mine data with an eye toward detecting patterns, anomalies or behaviors.

·         Data Scientists for developing predictive models.  These Data Scientists need deep expertise in statistical model design, complex data mining and scripting in various programming languages, and they need to be good storytellers as well.

Friday, April 26, 2013

Are You Monetizing Your MDM Information Hub?

Enterprises with existing Master Data Management (MDM) programs will benefit by looking at some innovative ways to "monetize" their MDM data assets.  Most existing MDM programs were launched with a mission statement that did not include directly generating revenue from the MDM Information Hub data asset; rather, traditional MDM programs were put in place to:

·         Improve the accuracy of data that needs to be shared between disparate operational systems and external trade partners by creating a Gold Standard master in that Info Hub (a minimal match-merge-survivorship sketch follows this list).

·         Improve the operational efficiencies by automating the daunting tasks of validating, cleansing and integrating critical reference data that needs to be shared between operational applications and trade partners.

·         Reduce the business risks to the enterprise due to erroneous data on critical compliance and regulatory reporting.
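As a rough illustration of what that Gold Standard mastering entails, here is a minimal match-merge-survivorship sketch in Python. The match criteria (tax ID, or normalized name plus postal code) and the most-recently-updated survivorship rule are hypothetical; production hubs apply far richer probabilistic matching and per-attribute trust rules.

```python
from datetime import date

def normalize(rec):
    """Crude standardization before matching; real hubs apply name/address cleansing services."""
    return {k: (v.strip().upper() if isinstance(v, str) else v) for k, v in rec.items()}

def is_match(a, b):
    """Hypothetical match rule: same tax ID, or same normalized name plus postal code."""
    if a.get("tax_id") and a.get("tax_id") == b.get("tax_id"):
        return True
    return a.get("name") == b.get("name") and a.get("postal_code") == b.get("postal_code")

def survive(cluster):
    """Survivorship: the most recently updated non-empty value wins each attribute."""
    golden = {}
    for rec in sorted(cluster, key=lambda r: r["updated_on"], reverse=True):
        for field, value in rec.items():
            if field not in golden and value not in (None, ""):
                golden[field] = value
    return golden

source_records = [
    {"source": "CRM", "name": "Acme Corp", "tax_id": "12-3456789", "postal_code": "60601",
     "phone": "", "updated_on": date(2013, 4, 1)},
    {"source": "Billing", "name": "ACME CORPORATION", "tax_id": "12-3456789", "postal_code": "60601",
     "phone": "312-555-0100", "updated_on": date(2013, 3, 15)},
]

records = [normalize(r) for r in source_records]
cluster = [r for r in records if is_match(records[0], r)]
print(survive(cluster))  # the Gold Standard record published to subscribing systems
```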

MDM programs are expensive! A lot of investment is poured into them year after year, and many are achieving their mission. The question for consideration: "Is saving our way to better profit enough, or is there an opportunity to increase revenues and boost the ROI of MDM investments?" Enterprises need help finding ways to monetize that data asset to generate more revenue, in addition to contributing savings and mitigating risk.  For example, in the Health Care industry:

·        In the case of the ICD-9-to-ICD-10 conversions: ICD-10 opens the door to analyzing medical procedures and protocols and their outcomes at a more granular level. As companies achieve meaningful use of EMR (Electronic Medical Record) applications, more detail about symptoms, diagnoses, treatments, protocols, supplies, involved professionals, outcomes, ongoing patient vitals and re-admittance is available for analysis than ever before.  ICD-10 encoding provides the level of detail needed to enable more robust analysis.

·       This opens the door to more sophisticated Business Discovery and cause/effect predictive/prescriptive analytics.  As healthcare providers seek to join an Accountable Care Organization (ACO), the ability to identify areas of potential treatment improvement through predictive and prescriptive analysis puts them in a much better position to score well across the ACO scoring criteria established by the Centers for Medicare and Medicaid Services (CMS).   Today, there are 33 measures of quality care.  Score well on these and Medicare/Medicaid will reimburse at 100% of the defined eligible benefit.  Providers that do a poorer job of providing care (as evidenced by lower scores) will receive less of the total eligible reimbursement.   This is lost revenue (revenue leakage).

Another area where advanced predictive/prescriptive analytics can help is in finding root causes for the data quality issues that MDM hubs are commissioned to resolve.  The goal is to improve the quality of the data (enforce the MDM gold-standard business rules) at the point of original inception, moving MDM cleansing/Golden Record processing from a bulk-batch method to an on-demand/transactional processing method.   Predictive analytics helps uncover root causes of poor quality and can predict which data elements will suffer the greatest adverse ramifications from overnight processing lag times. That prediction can be used to prioritize the most critical elements to move to on-demand cleansing and gold-standard processing, as sketched below.
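A simple sketch of that prioritization, assuming hypothetical per-element metrics (defect rate, downstream feed count, estimated cost of a one-day lag); a real effort would fit these weights from historical data quality incidents rather than multiplying raw proxies.

```python
# Hypothetical per-element metrics gathered from DQ profiling and lineage analysis.
elements = [
    {"element": "customer.tax_id",  "defect_rate": 0.04, "downstream_feeds": 12, "lag_cost_per_day": 900},
    {"element": "customer.segment", "defect_rate": 0.11, "downstream_feeds": 3,  "lag_cost_per_day": 120},
    {"element": "provider.npi",     "defect_rate": 0.02, "downstream_feeds": 18, "lag_cost_per_day": 1500},
]

def priority_score(e):
    # Naive expected-impact proxy; a fitted model would replace this product of raw measures.
    return e["defect_rate"] * e["downstream_feeds"] * e["lag_cost_per_day"]

# Rank the candidates for a move from overnight batch to on-demand cleansing.
for e in sorted(elements, key=priority_score, reverse=True):
    print(f"{e['element']:<20} score={priority_score(e):8.1f}")
```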

Many MDM programs do not have a means to analyze the overall enterprise data quality improvement delivered by an MDM Information Hub over time, nor do they track the changes and trends in the level of effort and operational cost applied across all of the data elements in the MDM Info Hub.  Here is an opportunity to implement data warehouse, business discovery and advanced analytics principles on top of the MDM business function itself (much like enterprises do for their traditional line-of-business functions such as Marketing Campaign Management, Sales Activity Management, Order Fulfillment, Inventory Management, etc.).

Some enterprises are realizing that the gold-standard data asset within a specific information hub has intrinsic value to their industry. Innovation-minded enterprises are finding ways to leverage their hubs to sell packaged, value-added analytics and data-provider services to other enterprises and service providers in their industry.  In the Healthcare industry, as Health Information Exchanges (HIEs) gain adoption, care-provider enterprises with high-quality information hubs will contribute to and take advantage of various HIEs to improve their ability to market services via the HIE open market, by exposing and promoting their performance metrics versus those of the local competition and/or the global niche-provider/specialist competition.

Are you looking for ways to monetize your MDM Information Hubs?  Are you already monetizing your MDM Information Hubs? Would love to hear from you.

Monday, February 4, 2013

HANA - Is it Real? Should You Wait?

I'm a practitioner who initially reserved a high degree of skepticism about the HANA promise. That is, until I had the chance to put HANA to the test. My findings? There's a lot of conjecture floating around about HANA.  It's new, so there's not a lot of first-hand experience being shared.

HANA is the real deal, and that's based on first-hand experience.

I invested in and led a development team that took an Oracle DW, pointed the DDL at HANA, ran the existing ETL and loaded the new HANA tables, all in just a couple of days. The ETL jobs and the existing report queries showed immediate performance improvement with very little development effort.

Then we got after building an application that included mapping and loading additional source data into the integrated data schema on HANA, triggering event-based invocations of statistical scoring models (R), generating dynamic messages & alerts to mobile devices, capturing and committing write-backs from the mobile devices, and dynamically updating a set of performance management dashboards. In less than 4 months, with an agile team (1 DW Architect, 1 Data Scientist, 0.5 BI Architect, 0.5 Mobile Architect and access to the client's subject-matter experts), we designed and deployed a robust application that is revolutionizing the business of higher education.

Performance for ingest & load times, statistical model execution, mobile device alerts/write-backs, and dashboard response times all measured in seconds or less. In my experience this raises the bar on the other platforms I've worked on (and I've worked on most of them).
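The event-to-alert pattern behind that application, stripped of the HANA/R specifics, looks roughly like the following Python sketch. Every name here (the student-risk fields, the threshold, the stub functions) is hypothetical; the point is the shape of the loop: pick up newly landed rows, score them, push an alert when the score crosses a threshold.

```python
RISK_THRESHOLD = 0.7  # hypothetical cutoff that triggers an advisor alert

def fetch_new_events(since_id):
    """Stand-in for pulling newly loaded rows from the integrated schema (a real app would query the DB)."""
    return [{"event_id": since_id + 1, "student_id": "S-42", "missed_sessions": 3, "gpa_delta": -0.4}]

def score(event):
    """Stand-in for the statistical model; the real application invoked R scoring models on new events."""
    return min(1.0, 0.2 * event["missed_sessions"] + abs(min(event["gpa_delta"], 0.0)))

def push_alert(event, risk):
    """Stand-in for the mobile notification service; write-backs would flow back into the same schema."""
    print(f"ALERT {event['student_id']}: risk={risk:.2f} -> advisor follow-up recommended")

def run_once(last_seen_id=0):
    """One pass of the event loop: fetch, score, alert."""
    for event in fetch_new_events(last_seen_id):
        risk = score(event)
        if risk >= RISK_THRESHOLD:
            push_alert(event, risk)
        last_seen_id = event["event_id"]
    return last_seen_id

if __name__ == "__main__":
    run_once()
```

In the real deployment the scoring ran as R models invoked on new events and the alerts flowed to mobile devices, with write-backs committed to the same integrated schema.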

As for migrating the BW tables from traditional RDBMS platforms to the HANA IMDBMS: with the Rapid Deployment Solutions (RDS) approach, migrating BW to HANA is a short and relatively simple tactical exercise. Out of the box, load and query performance is many times faster (mileage will vary depending on deployment). Then you have the opportunity to optimize the underlying analytic and calculation views to extend functionality and improve the performance of your BW InfoCubes and reporting.

SAP has already started migrating the more intense/complex OLTP transactions to read and commit updates against the ERP tables that have HANA underneath them, and they are seeing positive results. Why would you do this? A simple performance lift is one key reason. The more compelling reason is that you start to realize the benefits of an IMDBMS supporting hybrid OLTP and OLAP workloads against a single, integrated database design. Yes, the split of OLTP and OLAP into separate eco-systems is on the path toward obviation, because the IMDBMS eliminates the performance constraints that forced OLTP and OLAP workloads to be split apart in the first place.

SAP's hardware partners are coming out with scale-out reference architectures to leverage HANA's parallelism. This, on top of the high compression ratios (10X plus) and innovative temperature-controlled archiving solutions from 3rd-party software partners, is opening up a capacity planning model for scaling on HANA while providing a compelling overall (OLTP & OLAP eco-system) TCO.

The visionary enterprises are seeing the value proposition of moving to a platform that enables ultra-low-latency OLAP and OLTP applications while significantly reducing the resources and level of effort needed to design and maintain those applications. Less effort, because you don't have to over-engineer the data models, ingest layers and query SQL in order to squeeze out marginal performance gains. You don't have to peel off subsets of data to run the complex multi-pass statistical scoring models for fear of dimming the lights on other workloads. That means more accurate scoring algorithms and more agile plan, test, do cycles.

You can wait, but you may end up wondering why you did as you look at your competitors' taillights. I highly recommend taking a good look at the platform: get out of the box, find a compelling use case and map out the potential business outcomes before you decide to wait and see.

Saturday, August 21, 2010

BI Appliances: Why does it have to be so hard?

There's lots of buzz these days about appliances in the Business Intelligence and Data Warehouse space. Twenty-eight years ago, Teradata established the first reference pattern for a database machine, which later became known as a data warehouse appliance when "data warehouse" caught on. Netezza made the term "appliance" popular in 2000 with its marketing campaign. Now several database vendors (Greenplum, Asterdata, Vertica, ParAccel, Dataupia, to name a few) offer their product as an appliance, either with their own DBMS software or combined with a pre-defined hardware configuration from a hardware partner. According to Gartner, roughly 50% of the vendors on the 2010 Data Warehouse Database Management Systems Magic Quadrant are considered appliances.



So today's concept of an "appliance" in the data warehouse & business intelligence space appears to be centered on the database management function. That's a great start, but how do I get the data into these database appliances, and how do I get it out for consumption and action? What about the data integration, data visualization, data analytics and operational performance management functions? Will we see stand-alone appliances pop up in the data integration, visualization, analytics and performance management functional areas, or will we see the database management appliances expand their scope to include these other mission-critical functions? Will we find ourselves inter-connecting functional-silo appliances, or will we see an evolution toward bundled "BI in a Rack" multi-functional appliances? Will the multi-function appliances contain all of the compute, storage and network services necessary to seamlessly enable inter-operability across the data integration, data management, visualization, analytics and performance management functions? Can I fill up cages with "BI in a Rack" functional appliances to facilitate "sprawl" (multi-node grid processing across these functions)?



Presuming for a moment that the short answer to my barrage of speculative questions is a resounding yes, yes, yes, let's ponder the next question. Can I simply subscribe to these functional "BI in a Rack" appliances as a set of services? Dare I say it...can I get Business Intelligence as a Service?

I'd like to hear your thoughts.

Thursday, November 12, 2009

Entrenched Success-It's a Good Thing Right?

Two weeks ago I read an article about the higher education system in the US. I was surprised to learn that the US has something like 35 of the world's top 50 universities and 8 of the top 10, and it's been that way for quite a while. "Wow, that's impressive," thought I. The article went on to highlight that in the '50s the US automakers were the world's best, and they stayed that way for quite a while. In both cases these US-based enterprises enjoyed what was termed "entrenched success" in their industries and markets. Things were going along swimmingly, day after day, quarter over quarter, year after year. "If it ain't broke, don't change it," I mused. But entrenched success is a dangerous thing. It breeds complacency and blindness to the reality that things are changing all around, all the time. The US automakers came to realize that when Toyota took over the market. The US higher education system could be heading for the same demise if we don't change with the times.

Last week I attended a course on strategic leadership. The instructor shared the "ladder of inference", where we naturally use our belief systems and past experiences to hone our ability to look at the right data, come to the right conclusions and take the right actions. Right? Wrong! The inherent problem is the reflexive loop, which causes us to filter down to only those data points upon which we've come to rely. This leaves us blind to the myriad of other data points that could be relevant and shed some breakthrough insight. Wow! The very experience base that has contributed to our success can cause us to become blind to change.

Change is constant, and that includes the need to change our personal definition of what breeds success. We need to be aware of the impacts of the reflexive loop and force ourselves to take off the filter when dipping into the data pool. Ask others for input, promote constructive debate, challenge the definition of success, then draw conclusions and take action. We'll produce better results in our roles as family members, work colleagues and members of our communities.