Operationalizing Fraud Prevention on IBM z16: Reducing Losses in Banking, Cards, and Payments

5 April 2022

Celent estimates that applying AI inferencing models to all banking, card, and payments transactions running on IBM zSystem mainframes could reduce fraud losses by approximately US$161 billion globally.

Abstract

Advances in artificial intelligence (AI) such as deep learning are enabling significant improvements in fraud detection. However, large banks and payments processors that use AI models often run them on only a fraction of transactions because of throughput and latency constraints in their fraud detection systems. As a result, many fraudulent transactions go unmonitored and undetected.
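The coverage constraint described above can be illustrated with a toy simulation: when inference throughput forces a bank to score only a fraction of its transactions, fraud in the unscored remainder goes undetected. Everything here is an illustrative assumption, not Celent's or IBM's methodology: the `score` stub stands in for a real deep-learning model, and the 1% fraud rate, 0.5 decision threshold, and sampling scheme are invented for the sketch.

```python
import random

def score(txn):
    # Hypothetical fraud model: returns the probability that a transaction
    # is fraudulent. Stands in for a real deep-learning inference call.
    return 0.9 if txn["is_fraud"] else 0.1

def detect(transactions, coverage):
    # Score only a random fraction of transactions, a common workaround
    # when inference throughput cannot keep up with transaction volume.
    # Returns the number of fraudulent transactions flagged by the model.
    caught = 0
    for txn in transactions:
        if random.random() < coverage and score(txn) > 0.5:
            caught += 1
    return caught

random.seed(0)
# Simulate 100,000 transactions with an assumed 1% fraud rate.
txns = [{"is_fraud": random.random() < 0.01} for _ in range(100_000)]
fraud_total = sum(t["is_fraud"] for t in txns)

# Scoring 100% of transactions catches every model-detectable fraud;
# scoring only 10% leaves roughly 90% of fraud unmonitored.
full_coverage = detect(txns, 1.0)
partial_coverage = detect(txns, 0.1)
```

Under these assumptions, `full_coverage` equals the total number of fraudulent transactions, while `partial_coverage` catches only about a tenth of them; the gap is the fraud that goes unexamined when latency and throughput limits cap model coverage.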

The IBM Integrated Accelerator for AI, part of IBM’s new Telum mainframe processor, is designed to run inferencing for real-time workloads at scale and at low latency. The chip is designed to support real-time fraud detection even in high-volume bank, card, or payments processing environments.

To help banks and payments processors understand the potential value of this innovation for fraud operations, Celent has developed estimates of the potential reduction in fraud losses if these entities applied AI inferencing to 100% of their transactions.

This report was commissioned by IBM, but the analysis and conclusions are Celent’s alone.