As the wave of artificial intelligence sweeps across the global technology industry, a seasoned veteran is preparing to make a comeback. IBM recently released a 28-page report titled "Mainframes as the Vanguard of Digital Transformation," aiming to show that this 60-year-old computing platform remains indispensable in the AI era. The report, produced by the IBM Institute for Business Value, not only surveys the current state of mainframes but also highlights their critical role in AI-driven digital transformation.
The report reveals that 79% of IT executives believe mainframes are crucial for achieving AI-driven innovation. Over the past 60 years, mainframes have become the backbone for storing and processing massive amounts of critical business data. As organizations embark on their AI-driven digital transformation journeys, mainframes will play a key role in enhancing the value of data.
IBM seems concerned that mainframe users might think modern generative AI workloads are only suitable for public clouds and/or data center x86 and GPU servers. Therefore, the report emphasizes the importance of mainframes in this field. IBM proposes a hybrid approach, combining mainframes, public clouds, and edge computing, to select the most appropriate platform based on the characteristics of the workload.
The report advises mainframe users to "leverage AI for in-transaction insights to enhance business use cases including fraud detection, anti-money laundering, credit decisions, product recommendations, dynamic pricing, and sentiment analysis." A notable case is a North American bank that moved its credit card transaction scoring application onto the mainframe: scoring latency fell from 80 milliseconds to 2 milliseconds per transaction, coverage rose from 20% to 100% of transactions at up to 15,000 transactions per second, and the bank saved about $20 million annually in prevented fraud.
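The gains in that case come from scoring inside the transaction path rather than shipping data to a separate system. The sketch below illustrates the pattern in Java; the FraudModel interface, the threshold, and the toy linear model are assumptions made for illustration, not IBM APIs.

```java
// Minimal sketch of in-transaction fraud scoring, co-located with the
// transaction workload. All names and values here are illustrative.
public class InTransactionScoring {

    /** Stand-in for a model served on the same platform as the transaction. */
    interface FraudModel {
        double score(double amount, int merchantCategory, double distanceKm);
    }

    static final double BLOCK_THRESHOLD = 0.90;

    /** Scores every payment synchronously; a real deployment would enforce a
     *  millisecond-level latency budget so scoring never delays authorization. */
    static boolean authorize(FraudModel model, double amount, int mcc, double distanceKm) {
        double risk = model.score(amount, mcc, distanceKm);
        return risk < BLOCK_THRESHOLD;   // block only the highest-risk payments
    }

    public static void main(String[] args) {
        // Toy linear model standing in for a trained classifier.
        FraudModel model = (amount, mcc, distanceKm) ->
                Math.min(1.0, 0.0001 * amount + 0.0005 * distanceKm);

        System.out.println(authorize(model, 42.50, 5411, 3.0));     // true: low risk
        System.out.println(authorize(model, 9000.0, 5967, 8000.0)); // false: blocked
    }
}
```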
IBM emphasizes that mainframes equipped with embedded on-chip AI accelerators "can scale to handle millions of inference requests per second with extremely low latency, which is particularly important for transactional AI use cases such as detecting payment fraud." IBM advocates an "ensemble AI" approach, combining existing machine learning models with newer large language models (LLMs) to improve prediction accuracy.
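The ensemble idea amounts to blending the output of a traditional model with a second signal derived from an LLM. The following sketch shows one simple way to do that; the interfaces and the 0.7/0.3 weighting are assumptions for illustration, not an IBM implementation.

```java
// Minimal sketch of an "ensemble AI" blend: a weighted combination of a
// conventional ML score and an LLM-derived risk estimate.
public class EnsembleScoring {

    interface MlModel  { double score(String transactionJson); }
    interface LlmJudge { double riskEstimate(String transactionJson); }

    /** Weighted blend of the traditional model and the LLM-based signal. */
    static double ensembleScore(MlModel ml, LlmJudge llm, String txn) {
        return 0.7 * ml.score(txn) + 0.3 * llm.riskEstimate(txn);
    }

    public static void main(String[] args) {
        MlModel ml  = txn -> 0.42;   // stand-in for an existing trained classifier
        LlmJudge llm = txn -> 0.80;  // stand-in for an LLM-based assessment
        System.out.println(ensembleScore(ml, llm, "{\"amount\": 120.0}")); // 0.534
    }
}
```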
Beyond business applications, AI can also be used to improve mainframe management itself. The report finds that 74% of executives consider it crucial to integrate AI into mainframe operations, transforming how these systems are managed and maintained. AI-driven automation, predictive analytics, self-healing, and self-tuning capabilities can proactively detect and prevent issues, optimize workflows, and improve system reliability.
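A simple version of the predictive-operations idea is to watch a system metric, flag readings that deviate sharply from recent behavior, and trigger a corrective action before an outage. The sketch below is an assumption-laden illustration of that loop, not IBM tooling; the metric, window size, and threshold are invented.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Minimal sketch: rolling-window anomaly detection on a system metric,
// triggering a self-healing action when a reading spikes.
public class PredictiveOps {

    private final Deque<Double> window = new ArrayDeque<>();
    private final int windowSize = 20;

    /** Returns true when the latest reading deviates sharply from the recent mean. */
    boolean isAnomalous(double reading) {
        double mean = window.stream().mapToDouble(Double::doubleValue).average().orElse(reading);
        window.addLast(reading);
        if (window.size() > windowSize) window.removeFirst();
        return Math.abs(reading - mean) > 0.5 * Math.max(mean, 1.0);
    }

    public static void main(String[] args) {
        PredictiveOps ops = new PredictiveOps();
        double[] cpuUtilization = {40, 42, 41, 43, 39, 95};  // last reading spikes
        for (double u : cpuUtilization) {
            if (ops.isAnomalous(u)) {
                System.out.println("Anomaly at " + u + "% - trigger self-healing runbook");
            }
        }
    }
}
```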
In terms of security, mainframes can use AI to monitor, analyze, detect, and respond to cyber threats. In addition, generative AI code assistants can accelerate the translation of legacy languages such as COBOL into Java, as well as JCL development, "bridging the skill gap by enabling developers to modernize or build applications faster and more efficiently."
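To make the COBOL-to-Java idea concrete, here is an invented example of the kind of translation such an assistant is meant to speed up. Both the COBOL paragraph in the comment and the Java rendering are illustrative, not actual tool output.

```java
// Illustrative modernization: a COBOL interest calculation rewritten in Java.
//
//   COMPUTE WS-INTEREST ROUNDED = WS-BALANCE * WS-RATE / 100
//   ADD WS-INTEREST TO WS-BALANCE
public class InterestCalculator {

    /** Applies one period of simple interest, rounded to cents like the ROUNDED clause. */
    static java.math.BigDecimal applyInterest(java.math.BigDecimal balance,
                                              java.math.BigDecimal ratePercent) {
        java.math.BigDecimal interest = balance.multiply(ratePercent)
                .divide(java.math.BigDecimal.valueOf(100), 2, java.math.RoundingMode.HALF_UP);
        return balance.add(interest);
    }

    public static void main(String[] args) {
        System.out.println(applyInterest(new java.math.BigDecimal("1000.00"),
                                         new java.math.BigDecimal("3.5"))); // 1035.00
    }
}
```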
IBM is taking an AI-processing offload approach with its next-generation mainframe, the successor to the z16 expected in 2025, which will be equipped with dedicated AI data processing units (DPUs). The new machine will feature up to 32 Telum II processors, each with on-chip AI inference acceleration rated at 24 TOPS. The Spyre accelerator will add 32 AI accelerator cores and 1GB of DRAM, matching the performance of the Telum II on-chip AI accelerator.
However, IBM has not mentioned plans to add GPUs to its mainframe architecture. Inference workloads will run well on mainframes, but AI training workloads will not. We can expect IBM to bring vectorization and vector database capabilities to mainframes to support retrieval-augmented generation (RAG) in inference workloads.
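At inference time, RAG means embedding the query, finding the closest stored vectors, and prepending the retrieved text to the prompt before the model runs. The sketch below shows that flow; the toy embeddings, documents, and in-memory store are assumptions, and a real deployment would use a proper embedding model and vector database.

```java
import java.util.Map;

// Minimal sketch of retrieval-augmented generation at inference time:
// embed the query, retrieve the nearest document by cosine similarity,
// and prepend it to the prompt.
public class RagSketch {

    static double cosine(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i]; }
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    /** Returns the stored document whose embedding is closest to the query embedding. */
    static String retrieve(double[] queryEmbedding, Map<String, double[]> store) {
        return store.entrySet().stream()
                .max((x, y) -> Double.compare(cosine(queryEmbedding, x.getValue()),
                                              cosine(queryEmbedding, y.getValue())))
                .map(Map.Entry::getKey).orElse("");
    }

    public static void main(String[] args) {
        Map<String, double[]> store = Map.of(
                "Refund policy: refunds within 30 days.", new double[]{0.9, 0.1, 0.0},
                "Fraud rule: block mismatched billing country.", new double[]{0.1, 0.9, 0.2});

        double[] queryEmbedding = {0.2, 0.8, 0.1};   // stand-in for an embedded question
        String context = retrieve(queryEmbedding, store);

        // The retrieved passage is prepended to the question before inference.
        String prompt = "Context: " + context + "\nQuestion: Why was this payment blocked?";
        System.out.println(prompt);
    }
}
```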
For this commentator, adding GPUs to mainframes would be a "holy grail" breakthrough, as it would open the door to running AI training workloads on this classic computing platform. Perhaps this idea, namely GPU co-processors, will become a feature of the z17 mainframe generation.