Thursday, July 3, 2014

Mortgage Lending Looking Up...and Looking Out for "The TRID"


Finally, some light at the end of the tunnel for the Mortgage Lending industry, and it’s not a train coming head on.  John Bancroft from Inside Mortgage Finance (IMF) noted, “Fannie Mae and Freddie Mac…issued $141.83 billion of single-family mortgage-backed securities during the second quarter, an encouraging 9.4 percent increase from the dreary levels recorded in the first three months of 2014.”  The market has a way to go before we can be confident it’s out of the woods, but Fannie & Freddie saw a solid 15.2% increase in single-family mortgage-backed securities issued from May 2014 to June 2014.
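For context, that 9.4 percent gain implies roughly $129.6 billion was issued in the first quarter ($141.83 billion / 1.094 ≈ $129.6 billion).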

Along with that promising news, IMF reported that lenders are improving their closing times for purchase mortgages and increasing the accuracy of good faith estimates, which results in fewer closing-cost surprises for borrowers.

In the mortgage lending industry, the name of the game is having the best-in-class Loan Production processes as measured by shorter cycle-times and increased throughput, while ensuring quality credit decisions and accurate loan documentation.  I work with several mortgage lenders who use diagnostic-based analytic techniques and self-service Business Discovery platforms to continually monitor the proficiency and productivity of their Loan Production processes.  When they see blips on the performance radar, they zero in on the deal, team member and/or originator to determine root cause and effect the necessary change to bring performance back into alignment with corporate performance benchmarks.

A best-in-class Loan Production process matters because Loan Originator (Broker, Correspondent, etc.) “lender loyalty” is largely a function of which lender can return the best credit decision the quickest while providing a frictionless experience in producing the interim and final loan documentation.  Improvements in “lender loyalty” performance measures increase the lender's share of that originator’s business, and that means more earning assets in the lender’s portfolio.

Here comes the TRID! The CFPB’s (Consumer Financial Protection Bureau) TILA/RESPA (Truth in Lending Act/Real Estate Settlement Procedures Act) Integrated Mortgage Disclosure rule.   In short, this rule applies to “most” closed-end consumer mortgages and mandates the consolidation of four (4) forms down to two (2) more understandable/informative forms. The rule also mandates timing requirements for when those completed and accurate forms MUST be delivered to the loan applicant/borrower.

·         First, the Good Faith Estimate (GFE) and the initial Truth-in-Lending disclosure (initial TIL) have been combined into a new form, the Loan Estimate.  The Loan Estimate must be provided to consumers no later than the third business day after they submit a loan application.

·         Second, the HUD-1 and final Truth-in-Lending disclosure (final TIL and, together with the initial TIL, the Truth-in-Lending forms) have been combined into another new form, the Closing Disclosure.  The Closing Disclosure must be provided to consumers at least three business days before consummation of the loan.

My “spidey-sense” tells me TRID is a report specification change (new layouts, new filter criteria, new reporting elements) and a new set of performance benchmarks against which lenders MUST manage their operations. Are mortgage lenders anxiously awaiting a software “regulatory patch” from their Loan Origination System (LOS) software vendor(s)?  Maybe there’s a better way.  How about using their Enterprise Business Discovery platforms to produce reports (forms), track performance and proactively trigger alerts for loan applications requiring attention BEFORE they violate regulatory performance rules?  Today’s Business Discovery tools are by and large the most agile and nimble platforms for modifying existing or creating new “reports”, providing interactive performance dashboards and generating alerts.  Even if the lender doesn’t have all the necessary loan application detail flowing into their enterprise data warehouse, they can still use their business discovery platforms to bring the right data together to handle regulation changes such as TRID.
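To make that idea concrete, here is a rough sketch (in Python) of how the Loan Estimate timing rule could be encoded as a proactive pipeline alert. This is illustrative only: the loan record fields, the simplified business-day logic (weekends only, no holidays), and the one-business-day warning threshold are my own assumptions, not the rule text or the behavior of any particular LOS or Business Discovery product.

from datetime import date, timedelta

def add_business_days(start, days):
    # Advance a date by N business days, skipping weekends.
    # (Federal holidays are ignored in this simplified sketch.)
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday-Friday
            days -= 1
    return current

def loan_estimate_deadline(application_date):
    # Loan Estimate: no later than the third business day after application.
    return add_business_days(application_date, 3)

def flag_loan_estimates_at_risk(pipeline, today):
    # Flag applications whose Loan Estimate has not gone out and is due
    # within one business day, so someone can act BEFORE the deadline.
    alerts = []
    for loan in pipeline:  # hypothetical records pulled from the LOS
        due = loan_estimate_deadline(loan["application_date"])
        if not loan["loan_estimate_sent"] and due <= add_business_days(today, 1):
            alerts.append((loan["loan_id"], due))
    return alerts

# Example: two hypothetical applications checked on July 3, 2014.
pipeline = [
    {"loan_id": "A-1001", "application_date": date(2014, 7, 1), "loan_estimate_sent": False},
    {"loan_id": "A-1002", "application_date": date(2014, 6, 27), "loan_estimate_sent": True},
]
print(flag_loan_estimates_at_risk(pipeline, today=date(2014, 7, 3)))

The same pattern applies to the Closing Disclosure, counting three business days back from the scheduled consummation date instead of forward from the application date.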

The CFPB’s TRID rule is targeted at the core steps within the Loan Production process. After all that hard work optimizing the Loan Production process, it would be unfortunate to erode that performance by scrambling to make LOS system changes and business process changes to comply with the rule (the deadline is August 2015).  With the right business discovery platform, the new forms can be simulated within days to weeks, and the new processing performance measures can be modeled and implemented now.

Still not convinced?  Consider the fact that the CFPB has only recently completed taking over responsibility for RESPA from the Department of Housing and Urban Development.  During this several-year transition, the enforcement of RESPA Section 8 was in abeyance. Industry experts are now forewarning that “…the CFPB is picking up where HUD left off, and then some.” This means more rules will be handed down with the intent of making things clearer to the borrower. That sounds like more meaningful reports and more regulatory performance measures to implement in order to avoid costly fines for non-compliance.

TRID is considered to be massive and should not be taken lightly. A lot can be done in thirteen months using your business discovery platform.  Consider using it to get a jump on implementing regulatory changes such as TRID.

Sunday, March 9, 2014

Why Can't the BI Hardware & Software Vendors Leave Good Enough Alone?


It's tough being a VP of Business Analytics these days.  The vendors are relentless. They'll call on anyone in the enterprise and convince them to acquire their technology. My business community is restless, dare I say, sometimes reckless.  They buy into these pitches and acquire products, oftentimes without my even knowing. And oh my...the myriad of options out there is mind-boggling and seems to be growing every quarter.  Why can't I just keep my shop running as it is?
The challenge facing most providers of business analytics (BI/DW IT shops) is that their solutions take too long and typically miss the requirements mark. After a lot of hard work they end up with a largely dissatisfied consumer population.  Our heritage is to focus on getting that "single version of the truth": a pristine set of curated data before we allow the business community to consume it.  Our techniques and our architectural reference patterns were built upon an industry heritage and historical tradition developed on technologies that are rapidly becoming obsolete. Our understanding of the true meaning of operational data is limited and our mission has become too data-centric. (We think data is the "ends", when it is merely the "means" to an end, namely optimized functional performance through actionable insights from better business analytics.)  End result: the business community takes BI into their own hands. Wham! Spread-marts, user-marts, data sprawl, data anarchy, disastrous business decisions, fines, lost licenses, jail time, workforce reductions, bankruptcy.  That is a bit unnerving!  So I get the CIO to convince his peers to slap the business' hands, spend the time and money to rationalize/simplify that data mess back into IT control, recoup costs, mitigate risks, get it under control.  Whew, success... then Wham! The 2-3 year cycle starts again. Where's the innovation in that?
Over the last 7-8 years the BI tool offerings, especially the self-service data discovery tools, have cut through the critical path of any data warehouse/BI project because the end-consumers who really know their data interact WITH production data through advanced visualization techniques. In collaboration with IT, the business drives out the formulation of a federated, integrated business view, complete with the difficult integration, transformation and data quality rules applied against it, all validated by the business in real time. End result?  Usable, reliable, agile self-service solutions that enable better decision making now.   If that same set of information needs to be accessible to other consumers via other tools, the developers now have a validated set of specifications in the self-service BI tool's metadata repository. This metadata export is used as the basis for implementing the more traditional ETL-->Enterprise DW-->reporting Data Mart (OLAP) architecture. This may not be for everybody. However, it behooves us to continually assess the viability of new self-service enabling technologies. The sooner the better, because next on the horizon will be self-service, near-real-time Predictive & Prescriptive Analytics! The pristine designs that we spent years, even decades, refining are built on enabling hardware and software stacks that are increasingly becoming obsolete and relatively more expensive to maintain versus the newer options in the market.  So I ask myself: "Self? What is the acceptable life-span of good enough?"

On the big data front: for the last 3 years that term ("big data") raised the hackles on my neck. "We've been doing big data for decades, folks!" Well, while that is true, I had to reckon with the fact that a lot of investment is being poured into hadoop-alooza. It is a new architecture that is spawning a whole new revolutionary round of innovation. More importantly, there is a deluge of digital information sets floating around the world waiting to be monetized, now. This is information that previously just wasn't available.  Information that, properly harnessed, can bring richer insight into markets, competitive landscapes, prospect/customer behaviors & psyches, world events and the causal relationships that can & will influence how well my enterprise leaders are informed.  “Big Data” is enabling just that. It is my responsibility as a BI solution provider to investigate how we can cost-effectively snatch that data out of the air-waves and avail it to the creative minds of our business analysts & data scientists to find that next "a-hah" moment and proactively act upon it before the competition does, or before an early-stage outbreak becomes the next world epidemic.
Data Warehousing/Business Intelligence has been mainstream for a good 15-20 years. It was the radical idea back in those early-adopter years and was met with a lot of resistance.   Imagine the state of our world had we taken the position that the truckload of green-bar reports delivered to the mailroom was good enough.  As I look across the Business Analytics landscape, I see a new generation taking the helm in our business management and leadership roles.  This generation literally grew up with technology in their hands. They were weaned on gaming technology that learns overnight based on the previous day's world-wide interactions.  They are demanding their business operational systems learn how to do the same.

I applaud the innovation coming out of our hardware and software provider communities. They challenge me as a Business Analytics solution provider to establish an "innovation culture".

What's good enough? A culture that comes in each morning firmly anchored in the truth that change is a constant that must be embraced. A culture excited to learn one new thing and propose one new idea to try. I challenge myself and my team to find ways to uplift our current investment value and embrace new technologies and techniques with prudent governance.

Tuesday, February 25, 2014

"Advanced Analytics" - A Brief Interview



 
I recently had the pleasure of a short interview with Fern Halper, Director of Research for The Data Warehousing Institute (TDWI), during the 2014 TDWI World Conference in Las Vegas. Here's a quick recap of our discussion.

 

 How do you define advanced analytics?

Unfortunately the term Advanced Analytics has become a bit amorphous. What seems advanced to some isn’t really advanced to others.  I think the intent is to clarify the distinctions between business intelligence, point-in-time reporting/dashboards/scorecards, and exploratory analytics. First it’s important to describe the various modes of “analytics”. I rather like the point of view of Tom Davenport’s International Institute for Analytics (www.iianalytics.com): Descriptive, Diagnostic, Predictive, and Prescriptive. That said, when I hear "advanced analytics", I tend to associate the term with the diagnostic, predictive, and prescriptive modes. Let me explain:

·         Diagnostic Analytics is the mode where interesting hypotheses are formulated, using techniques and tools that enable data discovery, data mining, and text mining to uncover apparent patterns, anomalies and/or behavior trends.

·         Predictive Analytics is the mode where those hypotheses are tested using predictive modeling disciplines, applying future-state assumptions and running various simulations to find and validate causal relationships that prove, disprove, or even morph the hypothesis (see the brief sketch after this list).

·         Prescriptive Analytics is the mode where recommendations for change or remediation are developed: a prescription, if you will, for how the targeted business process/activity should be tweaked to achieve a higher level of optimization given the predicted future state.  The ultimate goal is to effect change to realize an improvement in process execution for a better business outcome.
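For readers who want something concrete, here is a minimal sketch (in Python) of the predictive mode. Everything in it is hypothetical: the synthetic data, the two process measures, and the "fallout" outcome are placeholders meant only to show a hypothesis being tested with a simple model, not a recommendation of any particular tool or technique.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical process measures for 1,000 loan files.
rng = np.random.default_rng(0)
n = 1000
cycle_time = rng.normal(30, 8, n)    # days from application to close
touch_count = rng.poisson(5, n)      # manual touches per file

# Assumed "ground truth": longer cycles and more touches raise fallout risk.
p = 1 / (1 + np.exp(-(-4 + 0.08 * cycle_time + 0.2 * touch_count)))
fallout = rng.binomial(1, p)

# Test the hypothesis by fitting a simple model and scoring hold-out data.
X = np.column_stack([cycle_time, touch_count])
X_train, X_test, y_train, y_test = train_test_split(X, fallout, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"hold-out AUC: {auc:.2f}, coefficients: {model.coef_.round(3)}")

If the hold-out performance holds up, the fitted relationship becomes the input to the prescriptive step: recommending which process levers to adjust, and by how much.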

 

What do you see your clients doing with the technology?  What’s new and interesting? 

The self-service data discovery tools are being used by business-community clients to do much more than rapidly create visually rich performance dashboards.

·         Clients are using this technology to flesh out complex specifications for things like data quality rules and MDM match-merge-survivor rules.

·         Clients are finding that next “a-hah” moment, discovering things about how the business is running by diagnosing why the data looks the way it does. They are using these data discovery disciplines as a divining rod to feed candidate projects into the predictive/prescriptive programs.

Virtually all of the BI platform providers are developing integration kits between their visualization platforms and the predictive modeling/scoring platforms (both open source as well as the mainstream commercial offerings). This will improve their ability to visualize the data at the macro-level with drill downs to the detailed level during model development and testing. This also improves the ability to visualize the ongoing shifts in business outcomes as a result of implementing the predictive model and acting on the prescribed changes.

Virtually all of the BI Platform providers and Data Integration Platform providers are developing integration kits and transformation objects to embrace the Hadoop eco-system for “big data”.  This opens the door for the Data Scientists to integrate more and more external, contextual data from 3rd parties or public sources thereby enriching their models.

Another early trend I am seeing is increased collaboration between the internal process improvement teams and the business analytics teams. Companies with Lean or Six Sigma programs are defining process performance measures during value-stream and activity-based costing analysis. Those data points, when integrated into the business analytics platforms, give business leaders visibility into how the company is performing, not only on the traditional, lagging key performance indicators but also at a lower level, based on these process improvement measures, for insight into how well business functions are running on a lead-time basis.

Finally, another emerging trend is the use of automated process discovery tools and workforce engagement tools that capture how processes are actually being performed and what people are actually engaged in throughout the day. Armed with this functional view of the business operations, we can make better prescriptions for how to optimize those activities going forward.

 

What’s the biggest challenge companies face with analytics and how do you see them dealing with it?

The biggest challenge is definitely not the technology; rather, it is the organizational alignment and the culture of the enterprise, particularly around shifting from decisions based on the “wisdom of experience” to more “data-driven” decisions. This arms our business leaders with facts to support their decision making. This culture shift is being led by the “Chief Analytics Officer” role.

Effective CAOs create a culture of best-practice evolution as organizations build an analytics ecosystem that prizes data-based decisions; they manage the changes that analytics decisions bring about, act as a trusted partner across the various internal lines of business, and find and champion compelling business decisions and projects on behalf of the functional line-of-business owners. A CAO is a gifted storyteller, speaking the language of the business as well as the language of the data scientist, and an evangelist promoting the need to explore the possibilities. They take ownership of and demonstrate confidence in decision outcomes; they are willing to take risks and take responsibility for the results (good and bad) by implementing a culture that creates transparency, accountability and “air cover”.

Another key challenge is developing/acquiring the skills in the enterprise to become Advanced Analytics mission-capable. According to a 2011 study by the McKinsey Global Institute, US businesses alone will need 190,000 more workers with deep analytical expertise and 1.5 million more data-literate managers by 2018.

·         Business managers who have the capacity for a deeper level of thoughtfulness and rigor around decision making, along with a willingness to explore and modify assumptions.

·         Business analysts with the expertise to mine data with an eye toward detecting patterns, anomalies or behaviors.

·         Data Scientists for developing predictive models.  These Data Scientists need deep expertise in statistical model design, complex data mining and scripting in various programming languages, as well as being good storytellers.