CBA is building digital twins of its financial planning systems


The Commonwealth Bank of Australia is building digital twins on Confluent’s event-streaming platform to analyse its financial planning systems and improve them in real time.


CBA’s digital twins are replicas of its systems that use machine learning to forecast future performance and de-risk changes to processes by running virtual tests before deployment.

CBA told the 2024 Kafka Summit in Bangalore earlier this month that it is designing proof-of-concepts for digital twins in its “events hub”, which runs on Apache Kafka.

Real-time analysis 

The bank has been experimenting with using digital twins to better analyse decision-making frameworks for functions like home lending processes since 2021.

However, CBA senior vice president and advanced analytics chapter area lead Sharmistha Chatterjee said the current focus is on connecting them to real-time analysis systems, rather than feeding them insights at intervals through batch processing.

“Without the continuous feedback, the machine learning can’t act on the feedback from the observability; not without having a real-time streaming framework and that is the role of Kafka in the data feedback.” 
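As an illustration of the kind of continuous feedback loop Chatterjee describes, a minimal sketch could update a model incrementally as events stream in, rather than waiting for a batch extract. The broker address, the “observability-events” topic, the JSON fields and the choice of model here are all assumptions for illustration, not confirmed details of CBA’s setup.

```python
# Sketch only: topic name, fields and model are illustrative assumptions.
import json

from confluent_kafka import Consumer
from sklearn.linear_model import SGDClassifier

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # placeholder broker address
    "group.id": "feedback-loop",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["observability-events"])  # hypothetical topic name

model = SGDClassifier(loss="log_loss")
classes = [0, 1]  # e.g. "healthy" vs "degraded"

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    event = json.loads(msg.value())
    features = [event["latency_ms"], event["error_rate"], event["throughput"]]
    # Incremental update: the model learns from each event as it arrives,
    # instead of being retrained on a periodic batch extract.
    model.partial_fit([features], [event["outcome"]], classes=classes)
```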

Event-based, virtual replicas of CBA’s systems for managing “financial planning” will launch “at the end of this year or early next year,” she said. 

“Right now, it’s not fully yet time, but that is the place we have to go.”

The proof-of-concepts focus on dividing CBA’s customers into more granular segments and improving the tailoring of financial products to them, Chatterjee said.

“So, for personalisation offers that we give customers, we need to see if these personalised offers are getting translated or not. And then we understand and analyse ‘what is the best way we can retain this customer’ and ‘what is something we can do better?’

“What we’re trying to look for in the pipeline is where you have a lot of variables, different customer segments that you take; you have data continuously where the volume of data is so huge that you have gigabytes of data and terabytes of data streaming in, and then you have to segregate [them].

“We want real-time visualisations of different sorts of customers in the same industry in different regions, different targets and other different customer segments. We need that to go through our efficient streaming pipeline.”
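A minimal sketch of that kind of segmentation step, assuming a hypothetical “transactions” topic whose JSON records already carry a customer segment and region (illustrative names, not CBA’s), could keep rolling per-segment totals for a real-time view:

```python
# Sketch only: topic name and record fields are illustrative assumptions.
import json
from collections import defaultdict

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # placeholder broker address
    "group.id": "segment-rollup",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["transactions"])  # hypothetical topic name

# Running totals keyed by (segment, region) - the kind of rolling view a
# real-time dashboard could poll for visualisation.
totals = defaultdict(lambda: {"count": 0, "value": 0.0})

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    txn = json.loads(msg.value())
    key = (txn["segment"], txn["region"])
    totals[key]["count"] += 1
    totals[key]["value"] += txn["amount"]
```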

In CBA’s case, real-time events are typically either different transaction types or CBA’s reactions to them, which are logged into different Kafka topics. 
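A minimal producer-side sketch of that pattern, with illustrative event types and topic names rather than CBA’s actual ones, might route each event to a topic named after its type:

```python
# Sketch only: event types, topic names and fields are illustrative.
import json

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # placeholder

def publish(event: dict) -> None:
    # Route each event to a topic named after its type, so transaction
    # events and the bank's reactions to them land on separate topics
    # that downstream consumers can subscribe to independently.
    producer.produce(event["type"], key=event["customer_id"], value=json.dumps(event))

publish({"type": "card-payment", "customer_id": "c-123", "amount": 42.50})
publish({"type": "personalised-offer", "customer_id": "c-123", "offer": "home-loan"})
producer.flush()
```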

Continuous correction 

Chatterjee said that live streaming provides real-time observability of the present only, whereas the digital twin component built on top of it uses machine learning prediction models to extend that observability into the future.

“We are not only sending more real-time alerts but being proactive with the observability; we have become proactive before the issue happens.”

Moreover, before implementing the fix that the digital twin recommends, CBA will be able to trial multiple fixes in the virtual ‘world’, compare their results and select the optimal one. 

“We have a lot of data; we ingest that data; we send the feed; we design all the simulations and use-cases and use that to understand what could go wrong; what could go right; and we replicate it exactly; then we send the impact back to the physical model.

“We build a resilient end-to-end system that can help AI-driven decision-making practices, threat detection, and demand forecasting and then analyse that in post-mortem, which we call the causal analysis.

“We analyse how the systems are reacting; what has caused what problem, and what was the action that was actually taken on the ground by the business that has helped to remediate this?”
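A toy sketch of that trial-then-select step, with a placeholder scoring function standing in for the twin’s machine learning forecast and hypothetical candidate fixes, could look like this:

```python
# Sketch only: predict_outcome stands in for the twin's trained forecasting
# model; the state fields and candidate fixes are hypothetical.
from typing import Callable

def predict_outcome(system_state: dict) -> float:
    """Placeholder for the twin's ML model: returns a forecast score
    (higher is better) for a proposed system configuration."""
    return 1.0 / (1.0 + system_state["queue_depth"] * system_state["retry_rate"])

def simulate_fix(current_state: dict, fix: Callable[[dict], dict]) -> float:
    # Apply the candidate fix to a copy of the state, never the live system.
    return predict_outcome(fix(dict(current_state)))

current = {"queue_depth": 120, "retry_rate": 0.3}

candidate_fixes = {
    "add_consumers": lambda s: {**s, "queue_depth": s["queue_depth"] / 2},
    "raise_timeout": lambda s: {**s, "retry_rate": s["retry_rate"] * 0.5},
}

# Trial every fix in the virtual "world", compare forecasts, pick the best,
# and only then push that single change back to the physical system.
scores = {name: simulate_fix(current, fix) for name, fix in candidate_fixes.items()}
best = max(scores, key=scores.get)
print(best, scores)
```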

CommBank’s event hub 

CBA has previously invested extensively in live data processing architecture, adopting its first real-time processing platform more than a decade ago. 

In 2009, CBA selected Progress Apama’s complex event processing platform for equity trading, integrating it with both its internal systems and the ASX.

CBA started using Kafka in 2016. At the time, then-CIO David Whiteing told iTnews that “moving to an event-based architecture so we can separate out the write and read transactions” would improve the way customers experienced banking services.

Within two years, CBA’s Kafka-based ‘event hub’ hosted more than “70 projects across the bank, either in test or in production, with various use cases,” CBA product owner and lead architect Christopher Arthur told the 2018 Kafka Summit in the US.

“Since then, Kafka has become a mission-critical platform in the bank and it is the core component in our event-driven architecture strategy.”

One of the first major projects CBA managed on its Kafka cluster was its implementation of the New Payments Platform – a system for real-time transfers between banks, which launched in February 2018.   

In 2020, CBA used Kafka to de-risk its cloud migration. Marrying the platform with Neo4j’s knowledge graphs, CBA mapped different services’ dependencies across its network to avoid disruptions as it moved systems off-prem.
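A minimal sketch of that pairing, assuming a hypothetical “service-calls” topic and illustrative Neo4j connection details rather than anything CBA has described, could turn each observed call into a dependency edge in the graph:

```python
# Sketch only: topic, field names and credentials are illustrative.
import json

from confluent_kafka import Consumer
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # placeholder broker address
    "group.id": "dependency-mapper",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["service-calls"])  # hypothetical topic of observed calls

MERGE_EDGE = """
MERGE (a:Service {name: $caller})
MERGE (b:Service {name: $callee})
MERGE (a)-[:DEPENDS_ON]->(b)
"""

with driver.session() as session:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        call = json.loads(msg.value())
        # Each observed call becomes an edge in the knowledge graph, so the
        # graph accumulates a map of which services depend on which.
        session.run(MERGE_EDGE, caller=call["caller"], callee=call["callee"])
```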

Jeremy Nadel attended Kafka Summit Bangalore as a guest of Confluent.
