Data is the fuel. And analytics is the engine.
That was the message driven home repeatedly at the recently concluded SAS Global Forum in Orlando, Fla. Bankers in particular need to take this signal to heart.
Put into practical terms, this is the concept:
With the right approach, the right models, and the right know-how, raw data can be transformed, through analytics, into usable intelligence. That intelligence can greatly benefit any bank’s marketing, compliance, and fraud mitigation efforts, to name a few key functions.
The value of advanced analytics seemed to course through the nearly 5,600 attendees of this conference as surely as oxygen through the bloodstream. The employees, customers, and observers there, coming from six of the seven continents and representing a wide range of industries, truly embrace the current and nearly unlimited future possibilities of analytic applications.
Good intelligence simply left on the table
SAS CEO Jim Goodnight, who used the data fuel/analytics engine analogy at the opening session, expands on the concept in the recent SAS annual report to investors:
“Data has become the lifeblood of our world, pumping through the heart of everything. It is already being put to work to drive progress and make a better future. But far too much data is still being left on the table, expiring before it can ever be used. That’s wasted potential.”
That sentiment of wasted potential especially applies to financial institutions, according to David Wallace, Global Financial Services Marketing Manager for SAS, in an interview with Banking Exchange.
The good news: With new tools and new approaches, banks can turn “just-okay decisions” into “much-improved decisions.”
Wallace explains that banks were among the first SAS clients when the company started in 1976, and remain so today. More than 90% of the top 100 global banks use SAS, as well as about 3,500 other financial institutions.
Remarking on the data-analytics connection, he says: “It’s really all about trying to understand what the data are saying. With advanced analytics and optimization you actually can take all the data that’s coming from digital channels and you can take those strategies that started with [old forms of segmentation] and you can modify that and optimize it for each one of us and the particular situation that we’re in.”
Linking data to observations
One particular line of development that’s just starting to take hold is the concept of using unstructured data, and applying it to marketing.
“Suppose I am sitting across from you in a bank branch,” says Wallace. “You’re the customer and I’m the branch manager. I have your records in front of me. I am going to look at the information that I have. I am also going to be asking you questions and I am going to be processing the answers that you give me verbally.”
But there’s more to a face-to-face consultation, Wallace adds.
“I am also going to process all the nonverbal information,” he explains. “If you start squirming in your chair, I am going to navigate the conversation to a different place—that’s unstructured data.”
With certain applied technology, digitized forms of such interchanges can enhance the bank-customer relationship.
“As customers increasingly do business in digital channels, you have the data coming in—your unstructured data. You can see where customers navigate in the website, what pages they abandon, what applications they use and abandon,” he says. “You need to first make much more effective use out of the data that you have in the bank.”
Unfortunately for banks, says Wallace, “most unstructured data is not being utilized today. Call center notes. Branch sales reports. A lot of that is not very well used today.”
And that’s just what the bank already has and records.
“As banks are getting more ‘social,’ you can also process that social media data,” Wallace notes.
One way to do so is through text analytics. This approach, in brief, tracks given words or phrases that pop up in certain contexts and certain media. This technique can give a bank hints regarding a given customer’s thinking.
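At its simplest, the phrase tracking Wallace describes can be sketched as keyword counting over free text. The phrase watchlist and sample note below are purely hypothetical, made up for illustration; a production text-analytics system would add tokenization, sentiment scoring, and entity recognition on top of this kind of matching.

```python
import re
from collections import Counter

# Hypothetical watchlist of phrases a bank might track in call center
# notes or social media posts (illustrative only).
WATCH_PHRASES = ["close my account", "fee", "mortgage rate", "frustrated"]

def phrase_counts(text: str, phrases=WATCH_PHRASES) -> Counter:
    """Count occurrences of each watched phrase in a piece of free text."""
    lowered = text.lower()
    return Counter({p: len(re.findall(re.escape(p), lowered)) for p in phrases})

# Example call-center note (invented for the sketch).
notes = "Customer was frustrated about the overdraft fee; asked to close my account."
counts = phrase_counts(notes)
```

A spike in phrases like “close my account” across a customer’s recent interactions is the kind of hint the technique surfaces for marketing or retention teams.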
Analytics and the coming CECL accounting standard
Another huge area for analytics applications is for compliance and risk management.
As an example, Wallace mentions the pending implementation of the accounting standard known as current expected credit loss, or CECL, due in 2020.
(Briefly, under the previous incurred-loss model, banks estimated impairments using fairly simple analytical models and recognized losses only once a loss became probable. Under the new standard, they’ll need to incorporate granular historical information, current conditions, and supportable forecasts to calculate expected loss over the life of each loan, recognizing those expected losses at origination.)
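One common way to frame a lifetime expected-loss calculation is as the discounted sum of marginal probability of default (PD) times loss given default (LGD) times exposure at default (EAD) across each period of the loan. The sketch below uses that textbook decomposition with made-up figures; actual CECL models, loss curves, and discounting conventions vary by institution.

```python
def lifetime_expected_loss(pd_curve, lgd, ead_curve, discount_rate):
    """Discounted sum of marginal PD * LGD * EAD over the loan's life.

    pd_curve[t]  -- marginal probability of default in period t (hypothetical)
    ead_curve[t] -- expected exposure at default in period t
    """
    ecl = 0.0
    for t, (pd_t, ead_t) in enumerate(zip(pd_curve, ead_curve), start=1):
        ecl += pd_t * lgd * ead_t / (1 + discount_rate) ** t
    return ecl

# Illustrative 3-year amortizing loan; every number here is invented.
ecl = lifetime_expected_loss(
    pd_curve=[0.01, 0.015, 0.02],
    lgd=0.4,
    ead_curve=[100_000, 70_000, 40_000],
    discount_rate=0.05,
)
```

Running the model per loan, every reporting cycle, with auditable inputs is what drives the operational burden Birade describes below.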
As explained by Laurent Birade, Americas Region Lead for CECL at SAS, in a statement: “Building forward-looking loss models is nothing new. What is new is having to run these models more frequently with the level of scrutiny required for internal controls for financial reporting. We expect that examinations coming from external regulators as well as auditors will raise the burden of proof that banks must show in setting reserves.”
Says Wallace: “Although CECL is an accounting standard, it does require the risk function and the finance function to work together on models. They have to evaluate the risk of credit in a way that requires advanced modeling. When the CFO has to sign a document every month that says the regulatory financials and the quarterly statements are correct, there is a legal risk if the CFO signs something that is incorrect.”
Mysterious black box no longer acceptable
Another aspect of advanced analytics gaining ground is the application of artificial intelligence, sometimes referred to as machine learning. A bank, in particular, has to be careful in how it employs this, Wallace notes. The reason: some AI techniques that use multiple layers of a neural-network model can reach the point where the operator can’t actually say how the model came up with a particular answer.
That wouldn’t work for credit decision models, he says, which are highly regulated. In such areas, examiners need to know not only what the results are and what data was used, but how the model process itself proceeded.
“But in the fraud area, fraud models aren’t as highly regulated. So you want to use all the horsepower of the engine, all the data as the fuel to figure out in the authorization and decision stream if you can catch fraud in real time,” he says.
With factors like real-time payments coming, says Wallace, anti-fraud efforts must be running at the same speed as the payment process.
“The credit card authorization cycle today is about 120-150 milliseconds. That means the fraud decisioning process has to be somewhere between 20-50 milliseconds,” Wallace says.
“You can only do that with advanced machine learning or what you would call artificial intelligence models,” he explains. “The fact is, once you train them on the data, you let them loose and they can make adjustments. Of course, you still have to constantly validate the process.”
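The real-time constraint Wallace describes can be sketched as a scoring call wrapped in a latency check. The weights below are hypothetical, standing in for a trained model, and the 50-millisecond budget comes from the authorization figures he cites; a production system would use a far richer feature set and model.

```python
import math
import time

# Toy pre-trained logistic model: weights are invented for illustration,
# standing in for a machine-learning model trained on historical transactions.
WEIGHTS = {"amount_zscore": 1.2, "foreign": 0.8, "night": 0.5}
BIAS = -3.0

def fraud_score(features: dict) -> float:
    """Logistic score in [0, 1]; higher means more likely fraudulent."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def decide(features: dict, budget_ms: float = 50.0):
    """Score one transaction and report whether we met the latency budget."""
    start = time.perf_counter()
    score = fraud_score(features)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return score, elapsed_ms <= budget_ms

# A large, foreign, late-night transaction scores high on this toy model.
score, on_time = decide({"amount_zscore": 3.0, "foreign": 1.0, "night": 1.0})
```

Keeping the scoring path this lean, with the heavy lifting done offline at training time, is how fraud decisioning fits inside the authorization window.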
Common ground: eye on the customer
All of this—unstructured data, compliance risk management, artificial intelligence—circles back to catering to, providing for, and protecting customers, even as all of it remains invisible to those same customers.
“It is exactly what banks have to do for customers that they will never see and the customers will never know about. And they’ll never say what they need. But they will expect you as the bank to know what they need,” Wallace says.