Using Valuable Data Assets with AI in the Cloud

3 February 2020
Marty Ellingsworth

This is my first blog at Celent, continuing my journey as analytics evangelist and change agent.

You know I can be counted on for a bit of levity, even at the expense of our industry, so let's get the "2020 perfect vision" pun out of the way: the world awoke in 2020 realizing that data is a valuable asset, and that no one is in charge of it.

(For a bit more fun, I wrote the blog below in exactly 2020 words.)

Cloudy as that forecast appears, it does not take an artificial intelligence to look at today's organizational charts and correctly classify that not a single box is in charge of data quality. An AI would therefore infer that if data is so valuable, EVERY BOX must be in charge of data quality, because without good data, how can you make good models and good decisions?

You don't need to be a data scientist to observe that human nature is the same as ever: when something is everybody's responsibility, everyone assumes someone else is doing it.

Quality suffers, because there is plenty of confusion and little accountability.

There is plenty of corporate blame going around about who is accountable for systematic, siloed, soiled data. I've been doing analytics since the '80s, as recently as last quarter, and I have led teams that launched industry-changing data products and services.

So, not to worry: it has been proven that you don't need perfect data for great AI solutions. But since this topic keeps executives up at night and bleary-eyed on a regular basis, it helps explain the shorter tenures of many incumbents as well as the expanding number of chairs at the top table for analytics. All the top boxes need an innovation and AI-in-cloud roadmap.

For great AI solutions, you absolutely do need an analytics strategy that informs your data strategy and both of these should support your business strategy. The best practice for an AI learning system is to get good models into production and then make them better, by both continuing to learn and by improving your data quality where it matters most.

There's nothing new here about dirty data; there are plenty of ways to make things work, in context.

What is new now is that a handful of companies that embrace being analytically data-driven enterprises have, for the first time in history, breached the $1 trillion market valuation level. And there are dozens of companies doing nothing more than pulling data together that are now making billions, some while still seemingly start-ups. Employees dream of dancing unicorns, while executives dream of herds of dancing unicorns. Data, AI, and cloud unicorns are even seen flying.

That "T" for trillion has every CEO, shareholder, and member of a board of directors starting to understand that data is an asset. It can contribute (or destroy) value. It can help grow the top line. It can help improve profitability. It can potentially be a proprietary asset that can be monetized, and maybe even a key competitive advantage. It can also be a living nightmare: security breaches, regulatory trouble, brand-damaging publicity, and frustrating customer experiences.

The big "IF" on risk/reward is whether teams can figure out how to use it before their competitors do. In this digital experience economy, there is a real fear of becoming a $1B company by starting as a $5B company and not changing fast enough.

Going digital and delivering end-to-end customer experiences with AI and the cloud is a catalyst to make companies look at their data like an asset, and that’s when things can get blurry.

There are a lot of issues, but we will set aside chasm-like skill gaps, budget priorities, legacy project backlogs, legacy culture, cyber concerns, and weak legacy policy and practice around privacy and data governance. Our focus here is on data quality and "fit for use".

An underlying structural issue in current operations is that companies are historically product-centric and siloed. This fact makes creating a personalized, customer-centric, consumer-facing digital experience particularly challenging – everything is organized at the wrong level of analysis and experience.

A single customer may have their name and address entered differently in multiple account, policy, billing, contact, call center, agent, ledger, claims, and payment systems.
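A minimal sketch of how such cross-system mismatches are often surfaced, using Python's standard difflib for string similarity; the system names, records, and 0.8 threshold below are hypothetical illustrations, not anything from this post:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1] between two normalized strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

# Hypothetical records for the same customer across siloed systems.
records = {
    "policy":  "Martin J. Ellingsworth, 12 Oak St.",
    "billing": "MARTIN ELLINGSWORTH, 12 Oak Street",
    "claims":  "M. Ellingsworth, 12 Oak St",
}

# Compare every system's record against the policy system's version
# and flag low-similarity entries for human review.
reference = records["policy"]
for system, value in records.items():
    score = similarity(reference, value)
    status = "ok" if score >= 0.8 else "review"
    print(f"{system:8s} {score:.2f} {status}")
```

Real master-data-management tools use far richer matching (phonetics, address standardization, probabilistic linkage), but the shape of the problem is the same: score, threshold, and route the doubtful cases to a person.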

Outside vendors and data suppliers often add to the "dirty data" when their feeds are pooled back into analytic and operational reporting, modeling, service, and decisioning processes.

There is often shock and surprise when, eyes wide open, executives see they do not have detailed access to any of the data assets processed at their vendors.

Even more eye opening, those vendors are creating AI systems from those data and then selling the solutions back to industry at large. AI models can’t un-see the data they look at. Keep that in mind when setting your analytics strategy, your procurement and data strategy, and also your data governance and regulatory strategy.

Every company is unique, given its own history of data quality neglect, where cleaning data has been seen as an unnecessary expense with little impact on the bottom line.

When first efforts begin for analytics, “having” and “seeing” the data are the preamble to “knowing” then “using” it.

The people, process, and technology fundamentals can be looked at as an information assembly line in a data factory. Every company and line of business is different, yet most need the same raw materials, inspections, services, sales, and statutory audits.

That's why "all things as a service" are being offered externally, and each existing company-as-a-data-factory can plug them in, via AI and cloud, at points along its data-driven digital decision supply chain.

This means that each AI project starts with a business or customer need and then works out the analytics strategy to discover insights available in the current data assets that can satisfy that need, while also charting a course to get better over time. Sometimes entirely new data must be created; in other cases, using existing data in new ways creates the value.

In most cases, an analytic strategy fusing data from multiple systems, finding valuable new features, and streaming them forward to be used in production, requires new skills.

Do not quit before you start: professionals can help improve business issues and data issues at the same time. AI, cloud, data, analytics, digital: these are all skills that can be learned, with tools that keep getting better. As in every trade, the tools, know-how, and tradecraft matter a lot.

You can see that you didn’t need an electrician before you had electricity. You need to see now that data, AI, and cloud are your new digital electricity. As in the progression from horses, to steam engines, to gas engines, to electric motors, you will see that you can’t stop progress.

Going digital is necessary. Time is of the essence and journeyperson mistakes can be shocking and calamitous. You still need to run your existing business efficiently and effectively – data, AI, and cloud will help with that too.

There are a few scenarios to consider when it comes to using your data, ranging from things done "for" a customer, to things done "with" a customer, to things done "to" a customer. [All of the other great projects that improve business and employee problems are excellent AI and cloud candidates in their own right, but here we focus on customer-centric scenarios.] Each scenario (for, with, to) has different current hurdles of quality, care, and compliance on a "fit for purpose" level for business partners and consumers. AI and cloud make it possible to press on.

Let’s take a quick look.

Learning at the edge helps fine-tune and personalize many AI-inside devices, since end-user preferences and living situation can be part of the adaptation. A smart hearing aid, for example, learns to change with the wearer's abilities in different ambient noise settings; then, WOW, it also offers real-time multi-language translation of voice data, interacting with a paired smartphone display to let people communicate. And if you need more information or point-to-point directions, just a touch and the paired smartphone will whisper into your ear. This is data, AI, and cloud working "for" the consumer. Instant, powerful, effortless, and remarkably valuable.

Back to insurance, banking, and finance – “can you hear me now?”

Peace of mind, ease of use, value for price, with personalized advice and information on-demand, that adjusts over time to the context of existing situations and circumstances.

Everyone could use a smart, translation-capable, voice-to-text-to-voice, location-intelligent, enhanced, and internet-connectable metaphorical "hearing-aid-as-a-coach" when it comes to managing their money and risk.

Furthermore, every company would want that type of customer for a lifetime, since they make better choices as informed consumers who use information to manage uncertainty. Engagement matters!

Even when their choices don't turn out so well, because their data is shared, auxiliary services, mitigation, and resiliency opportunities arise, and the relationship can be deepened and proven more valuable by being there in moments of need as well as moments of joy.

Of course, it matters more when we can rely on everything being connected -- the people, places, things, cars, homes, accounts, and services. The data about them and the interaction between them all point to the experiences we expect when dealing with them. There is mounting frustration felt when things don’t connect or when the information is wrong.

How the data quality issue matters often hinges on whether the in-the-moment use of it is “for”, “with”, or “to” the user. The information available in the AI embedded in the hearing aid was an ancillary benefit “for” the wearer, since the main objective was hearing improvement. Making the device an access point to the internet and additional AI solutions was an innovation.

I will save anecdotes of “with” and “to” for future blogs (or you can call me), but here are two quick examples.

Helping to organize a consumer's monthly budget balance sheet to categorize and track spend and risk, along with benchmarks to their peers, is a good example of "with". Making real-time decisions on whether or not to start a new business relationship with a consumer, what price, terms, and conditions are offered, and how their accounts must be set up are all things done "to" them. Of course, these are the activities that historically draw the most legal and regulatory scrutiny, tracking the accuracy and fairness of decisions made from individual pieces of data ascribed to consumers for a specific purposeful decision.

You don’t know today what data may solve tomorrow’s opportunities, but you can be certain that people want what they want, when they want it, as cheap as they can get it.

So now, the circle is complete – “We can’t do that” turns into “It must be done”.

What is holding companies back is the most mundane of things.

The historical data come from business processes and systems that are siloed, legacy, product-based, offline, and often manual. These manual processes have been the "fake it 'til you make it" glue holding all of the customer channels together to deliver branded experiences. The new way of working is to use AI and cloud to fuse data assets together in an omnichannel fashion and to eliminate as much of the manual and paper-based work as possible.

Emerging digital platforms and ecosystems are working through the data quality issues. As data are actually used, they meet the naked reality of an end-to-end digital lens, where things often don't look ready to show to a customer or to use for decisions. Knowing when "not to use AI" becomes the digital off-ramp. This is truer now because expectations for data accuracy rise when every experience is transparent to the customer. Over time, learning how to fix data quality issues in the source systems of record will minimize off-ramp activity. Hint: this is a really good problem to solve.
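The digital off-ramp described above can be sketched as a simple confidence gate that routes low-confidence AI outputs to a human instead of acting on them; the threshold, labels, and function name here are illustrative assumptions, not a pattern taken from this post:

```python
def route_decision(prediction: str, confidence: float,
                   threshold: float = 0.9) -> tuple[str, str]:
    """Gate an AI output: act automatically only above the confidence
    threshold; otherwise take the off-ramp to human review."""
    if confidence >= threshold:
        return ("auto", prediction)        # safe to surface to the customer
    return ("human_review", prediction)    # off-ramp: a person checks it first

# Illustrative usage on two hypothetical underwriting outputs.
high = route_decision("approve", 0.97)   # acted on automatically
low = route_decision("approve", 0.62)    # routed to a human queue
```

In practice the threshold would be tuned per decision type (and per regulatory exposure), and the off-ramp volume itself becomes the metric that tells you where to fix source-system data quality first.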

Expect the need to retain and audit every scrap of data going forward, but not just for regulatory reasons. Test and learn loops will help you make agile sprints forward faster, and as more data is made available, smarter AI in faster clouds can deliver better experiences and decisions “for”, “with” and “to” customers. This will drive your business forward.

Data - if you have it, you can see if it is valuable, and once you know that, you can figure out how to use it. Repeat for new problems, new data, new models, and new decisions. That's the Have-See-Know-Use analytic value framework.

Insight details

Insight Format: Blogs
Geographic Focus: Asia-Pacific, EMEA, LATAM, North America