Whether in banking, insurance or any other industry, the pressure to improve the customer experience is immense. How can an organization engage more closely with consumers and become more responsive to their needs? By using data strategically through AI and machine learning (ML). Cognizant’s Erik Dingvall, Director of AI & Analytics, explains why the data foundation is essential to becoming truly digital.
In most organizations, every part of the business is rapidly becoming a data-driven play. Every function generates information that can be used to develop and improve services, for those who manage to harvest it, and doing so is crucial to success in today’s marketplace.
Insurers, for example, want to understand data around claims, improve loss prevention, make sure customers are correctly insured, and learn from experience to strengthen the customer experience. And with the rise of connected devices and cars, there are immense possibilities to learn from data and to identify new, profitable business models.
Within banking, the needs likewise revolve around getting to know customers better. Among other things, banks want to understand customer lifetime moments so they can serve finance needs just in time, build new ecosystems with fintech partners, and improve risk management with predictive capabilities.
Get the Foundation Right
Yet the majority of organizations are not leveraging enough data to become truly digital and data-driven through AI and ML. What’s stopping them? Typically, too many sources, siloed information, and a mixture of structured and unstructured data make it hard to extract real value from the existing data. Other success drivers may also be missing: executive support, the ability to retain and nurture talent, focus on organizational change efforts, and attention to human–computer collaboration.
In this post, I would like to focus on the technology foundation imperatives. Even though AI and machine learning are business enablers, we need to get a bit techie. Utilizing AI and ML technologies at scale requires a functional data foundation. This is commonly viewed as an infrastructure issue rather than a business matter, and it may not be prioritized as one, but it really should be: it has the potential to enable new revenue opportunities, grow existing business and improve your bottom line.
What you need is an engine that allows you to capture any type of data, generate insights through analytical models, and integrate those insights in real time with core processes. The engine should also support the processing of data from live events. On top of that, you need the capability to scale your data-driven initiatives beyond the proof-of-concept stage, and to foster collaboration around data within your business teams by providing easy access to data and tools to experiment with it.
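To make the capture–insight–integrate loop concrete, here is a minimal Python sketch of such an engine handling one live event. Everything here is illustrative: the event shape, the `score` rule (a stand-in for a real trained model) and the routing actions are assumptions, not part of any specific product.

```python
import json
from dataclasses import dataclass
from typing import Callable

# Hypothetical event: e.g., a claim notification arriving from a live feed.
@dataclass
class Event:
    customer_id: str
    kind: str
    payload: dict

def capture(raw: str) -> Event:
    """Capture step: accept any JSON-shaped event, no fixed schema required."""
    data = json.loads(raw)
    return Event(data["customer_id"], data["kind"], data.get("payload", {}))

def score(event: Event) -> float:
    """Insight step: placeholder for a trained analytical model."""
    # Illustrative rule only: treat large claims as high risk.
    return 0.9 if event.payload.get("amount", 0) > 10_000 else 0.1

def integrate(event: Event, risk: float, act: Callable[[str, str], None]) -> None:
    """Integration step: push the insight back into the core process."""
    if risk > 0.5:
        act(event.customer_id, "route_to_manual_review")
    else:
        act(event.customer_id, "auto_approve")

decisions = []
raw_event = '{"customer_id": "C42", "kind": "claim", "payload": {"amount": 25000}}'
event = capture(raw_event)
integrate(event, score(event), lambda cid, action: decisions.append((cid, action)))
print(decisions)  # [('C42', 'route_to_manual_review')]
```

In a production setting the `act` callback would be a call into the core system (a workflow engine, a case queue), and `score` would load a model trained on historical data; the point is only the shape of the loop.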
Is a Data Lake the Answer?
Traditional data platforms, like relational databases, may not be able to meet many AI needs. They rely on rigid technology infrastructure to capture mostly internal data in predefined formats, and this has become insufficient. Hence, identifying the right data infrastructure (big data, data lakes, and other modern data platforms) is key to getting the data foundation right.
In my experience, a data lake is a good option: a low-cost data storage environment built on commodity hardware and an integrated technology stack of open-source data and analytical tools. Most cloud vendors also have strong commercial offerings around it, including storage, compute, utilities and data science workbenches and services.
The data lake can store vast amounts of structured and unstructured data for AI purposes and help drive more value from existing data assets by combining, analyzing and using traditional and new types of data. It also gives the opportunity to democratize data by providing enterprise-wide access to information.
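The idea of a lake holding structured and unstructured data side by side, with a curated zone for enterprise-wide access, can be sketched in a few lines. The raw/curated zoning convention and all file names below are illustrative assumptions, not any vendor’s layout.

```python
import json
import pathlib
import tempfile

# A throwaway directory stands in for the lake's storage layer.
lake = pathlib.Path(tempfile.mkdtemp())

# A structured record lands as-is in the raw zone, in its source format.
raw = lake / "raw" / "claims" / "2024" / "claim_001.json"
raw.parent.mkdir(parents=True)
raw.write_text(json.dumps({"claim_id": 1, "amount": 25000}))

# Unstructured data (e.g., an adjuster's free-text note) lives alongside it.
note = lake / "raw" / "notes" / "claim_001.txt"
note.parent.mkdir(parents=True)
note.write_text("Water damage, kitchen. Customer reported at 08:15.")

# A curated zone holds combined, analysis-ready versions for broad access.
curated = lake / "curated" / "claims" / "claim_001.json"
curated.parent.mkdir(parents=True)
record = json.loads(raw.read_text())
record["note"] = note.read_text()   # combine structured + unstructured data
curated.write_text(json.dumps(record))

print(sorted(p.relative_to(lake).as_posix() for p in lake.rglob("*.json")))
# ['curated/claims/claim_001.json', 'raw/claims/2024/claim_001.json']
```

Real lakes replace the local directory with distributed object storage and the JSON files with columnar formats, but the zoning principle, raw data kept in source form and curated data opened up to the whole organization, is the same.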
Obviously, simply storing the data is not enough. You also need to work with it, prepare it, experiment with it and finally make it AI-ready so your teams can build analytics products on top of it (e.g., predictive models). Roughly 25–30 different technology components (many of them open source) have to come together and work in tandem. This is by no means an easy feat, and you need an experienced team that can scale your data pipelines in production.
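At its core, each of those components is a stage that data flows through on its way to becoming AI-ready. A toy sketch of that composition, with purely illustrative stage names and rules, looks like this:

```python
# Toy pipeline sketch: stages composed in tandem, the way the many
# components of a real stack would be chained. All names are illustrative.
def ingest(rows):
    """Copy incoming records so later stages never mutate the source."""
    return [dict(r) for r in rows]

def clean(rows):
    """Drop records that are unusable for modeling (here: missing amount)."""
    return [r for r in rows if r.get("amount") is not None]

def featurize(rows):
    """Derive a model-ready feature from the raw field."""
    for r in rows:
        r["large_claim"] = r["amount"] > 10_000
    return rows

def run_pipeline(rows, stages):
    """Feed each stage's output into the next."""
    for stage in stages:
        rows = stage(rows)
    return rows

raw = [{"amount": 25000}, {"amount": None}, {"amount": 900}]
ready = run_pipeline(raw, [ingest, clean, featurize])
print(ready)
# [{'amount': 25000, 'large_claim': True}, {'amount': 900, 'large_claim': False}]
```

In production each stage would be a separate, independently scalable component (an ingestion service, a data-quality tool, a feature store), which is exactly why an experienced team is needed to keep the whole chain running.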
Some Advice on the Way
Cognizant has been involved in several such data modernization projects, among them an insurance company and a Nordic-Baltic banking group, in preparation for AI and ML initiatives. All in all, our experience from across the globe shows some distinct make-or-break factors that I’d like to share with you:
• Get the internal organization on board and put effort into democratizing the use of data with modern techniques.
• Be ambitious: pick business drivers and use cases that will make an impact, as opposed to low-hanging fruit that sometimes demands similar effort but doesn’t deliver results sharp enough to enthuse the business.
• Find the center of gravity and pinpoint accountability for building a platform that, in turn, enables business teams to experiment and build data products on their own – otherwise it won’t happen.
• Go all the way. Building a good predictive model is just 10–15% of the activity. Think about applying the insights to your business processes. This is not just a frontend integration issue but, more importantly, a change management one, where business teams need to learn to work with machine intelligence.
• Make a plan to revisit key design and technology decisions so that you are continuously improving and keeping your foundation future-proof.
Businesses all around the world have realized that AI will play a key role in the disruption of industries globally. If you get the required foundational enablers in place quickly, you’re likely to be among the winners in your industry.