The mainframe is still not dead. But it could definitely use some modernization
Those of you who are computer history buffs may remember the insightful 1996 documentary film Triumph of the Nerds, which tells the story of the invention of the personal computer and how it changed the world.
This documentary by Bob Cringely (pen name of technology journalist Mark Stephens) is based on his 1992 best-selling book Accidental Empires, in which he also predicted that the mainframe computer era would end. And he was quite specific about it.
According to Cringely, mainframe computing was supposed to die, along with IBM mind you, at the turn of the millennium, on December 31, 1999.
Well, he was wrong on both counts.
A modern IBM z Systems Mainframe
Arguments for and against the demise of the mainframe live on, and you can find dozens of articles and blogs along the lines of “The mainframe is dead. Long live the mainframe.” In reality, the mainframe is here, and it will be here for the foreseeable future. A Forrester survey released in 2019 found that 56% of respondents planned to increase their mainframe usage over the next two years, while another 36% planned to maintain their current level of usage.
But commercial mainframes have been around for over 70 years, and it’s fair to say that the age of mainframe computing technology and its surrounding ecosystem is starting to show. Literally.
Assembly of an IBM 1401, ca. 1960
Take COBOL, for example. If you are under 50 years old, it won’t be surprising if you’ve never heard of it. An acronym for Common Business-Oriented Language, COBOL was designed in 1959 and standardized in 1968. And while it would be reasonable to assume it’s all but extinct outside of computer museums, nothing could be further from reality.
A 2017 Reuters report found that COBOL underpins much of the U.S. financial industry: 43% of banking systems are built on COBOL, 80% of in-person transactions use it, 95% of ATM swipes rely on COBOL code, and there are 220 billion (!) lines of COBOL in use today.
This is not limited to the financial industry alone. A 2018 report by the inspector general for the Social Security Administration found that the administration maintained more than 60 million lines of COBOL, with “millions more lines of other legacy programming languages.” Meanwhile, the typical COBOL programmer is 45 to 55 years old, or older.
Many American universities have not taught COBOL in their computer science programs since the 1980s. As a result, as COBOL programmers retire, fewer and fewer young programmers are filling the ranks. Let’s face it: learning an ancient programming language to maintain legacy business applications does not sound like the most attractive career path.
The cover of the COBOL report to the CODASYL committee, 1960
Mainframe and the CO2 Emissions Standard – The Challenge of Speed and Scale
You might wonder what mainframes have to do with CO2 emissions standards. Well, European regulation requires car manufacturers to calculate the CO2 emissions of each car they make, and companies can face huge fines when they exceed their assigned quota.
Since every consumer can customize their vehicle with hundreds of different options and configurations that affect the car’s CO2 emission levels, a real-time model must run every time such a configuration is created.
European car manufacturers found that the expected number of such calculations per second, coming from their various brands, dealerships and partners, would be over ten times the maximum capacity of the production mainframe that runs them. They also found that by reducing mainframe queries they could avoid growing their mainframe footprint while, in parallel, deploying modern services for their customers.
As a result, they save millions of dollars, deliver millisecond response times, and have a modern data platform that can take them to the cloud.
Mainframes and Financial Services – The Challenge of High Costs
While mainframes don’t usually reach the cloud, the costs of running them can certainly get sky-high. And the higher the transaction volume, the more expensive it gets.
One example is a top-3 US multinational bank that runs up to 1 billion transactions a day for over 35 million daily customers. The bank found that by reducing mainframe queries it could cut MIPS-related license fees while seamlessly scaling its deployment to serve rapidly growing digital service loads. It now saves $20M annually.
Open Banking has also dramatically increased the number of queries hitting banks’ databases, straining existing architecture that is itself difficult and expensive to scale. At the same time, financial services organizations are struggling to monetize Open Banking APIs for new applications and a better customer experience.
Enter COVID-19 – The Challenge of Unexpected Peak Loads
You might wonder what COBOL and mainframe computers have to do with COVID-19. States such as Connecticut and New Jersey learned this the hard way when they struggled to process hundreds of thousands of unexpected unemployment claims from people who lost their jobs due to the pandemic.
The overloaded mainframes could not handle it.
eCommerce sites are also suffering as digital service usage skyrockets under the social distancing guidelines now in place. The new normal will see an overall increase in digital services.
The Mainframe Dilemma
Should these organizations buy more extremely expensive mainframes to make up for the missing capacity? Should they hunt for new COBOL developers, or retrain their Java, Spark, and other modern-stack coders (that will definitely not work)? Should they rip out their mainframes and start from scratch (would you take that risk)?
This is where the concept of modernizing mainframe architecture can be the perfect strategy.
So What Does A Mainframe Modernization Strategy Look Like?
- Capture changes to your mainframe data (e.g., Db2) using Change Data Capture (CDC) and stream them into a distributed in-memory data fabric
- Leverage extreme, ACID-compliant transaction processing and fast analytics for business intelligence with minimal latency
- Optionally expose your COBOL or CICS applications via microservice-based APIs for integration with modern-technology applications
The solution’s smart caching of legacy application data drastically lowers the need to access the mainframe, reducing MIPS consumption.
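The caching step above can be sketched as a cache-aside read path: the in-memory fabric answers repeated queries, and only cache misses fall through to the mainframe, while CDC events keep cached data consistent. This is a minimal illustration in plain Python; the class and method names are hypothetical, not a GigaSpaces API.

```python
# Illustrative cache-aside pattern: the in-memory store answers most reads;
# only misses reach the mainframe. All names here are hypothetical.

class MainframeDB:
    """Stand-in for an expensive mainframe query (e.g., a Db2 lookup)."""
    def __init__(self):
        self.query_count = 0  # each query here consumes MIPS

    def fetch_account(self, account_id):
        self.query_count += 1
        return {"id": account_id, "balance": 100.0}

class SmartCache:
    """In-memory fabric in front of the mainframe (cache-aside)."""
    def __init__(self, backend):
        self.backend = backend
        self.store = {}

    def get_account(self, account_id):
        if account_id not in self.store:  # miss: query the mainframe once
            self.store[account_id] = self.backend.fetch_account(account_id)
        return self.store[account_id]

    def apply_cdc_event(self, account_id, record):
        # A CDC feed pushes mainframe changes into the cache, keeping it
        # consistent without re-querying the mainframe.
        self.store[account_id] = record

db = MainframeDB()
cache = SmartCache(db)
for _ in range(1000):          # 1,000 reads of the same account...
    cache.get_account("ACC-1")
print(db.query_count)          # ...cost only 1 mainframe query
```

In this toy run, a thousand application reads translate into a single mainframe query; the real offload ratio depends on the workload's cache hit rate.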
Mainframe Modernization Benefits
Reducing MIPS Costs
Processes running on the mainframe have direct and indirect costs. There are many different mainframe-oriented pricing models, but generally speaking, the more load that runs on the mainframe, the higher the hardware and software license costs. By offloading transactions from the mainframe into a distributed data fabric, you can save millions.
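A back-of-the-envelope calculation shows how offloading translates into savings. Every figure below is an illustrative assumption (per-MIPS costs vary widely by contract), not vendor pricing:

```python
# Rough MIPS savings estimate from offloading reads to an in-memory cache.
# All figures are illustrative assumptions, not actual pricing.

monthly_cost_per_mips = 1000.0   # assumed $/MIPS/month (varies by contract)
baseline_mips = 2000             # assumed current peak MIPS consumption
read_fraction = 0.80             # assumed share of load that is read traffic
cache_hit_rate = 0.90            # assumed share of reads served from cache

offloaded_mips = baseline_mips * read_fraction * cache_hit_rate
annual_savings = offloaded_mips * monthly_cost_per_mips * 12
print(f"Offloaded MIPS: {offloaded_mips:.0f}")        # 1440
print(f"Annual savings: ${annual_savings:,.0f}")      # $17,280,000
```

The point of the arithmetic: because most transaction mixes are read-heavy, even a cache that only serves reads can offload the majority of billable load.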
Modernizing Legacy Stacks
Critical financial and government applications in production still run COBOL code written decades ago, with too few technical resources to maintain it effectively and to handle rising workloads. By making these applications available through modern APIs, you gain the agility to integrate legacy mainframe applications with modern distributed applications.
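A common chore in that API layer is translating fixed-width COBOL copybook records into structured data a modern service can return. The sketch below uses a made-up record layout (not a real copybook) to show the idea:

```python
# Translating a fixed-width mainframe record into JSON for a modern API.
# The CUSTOMER-RECORD layout below is hypothetical, for illustration only.
import json

# (field name, offset, length) -- assumed copybook layout
LAYOUT = [
    ("customer_id",   0,  6),   # PIC 9(6)
    ("name",          6, 20),   # PIC X(20), space-padded
    ("balance_cents", 26, 9),   # PIC 9(9)
]

def copybook_to_json(record: str) -> str:
    """Convert one fixed-width record into a JSON document."""
    fields = {name: record[off:off + ln].strip() for name, off, ln in LAYOUT}
    fields["balance_cents"] = int(fields["balance_cents"])  # numeric field
    return json.dumps(fields)

raw = "000042" + "John Smith".ljust(20) + "000012550"
print(copybook_to_json(raw))
# {"customer_id": "000042", "name": "John Smith", "balance_cents": 12550}
```

A microservice wrapping a CICS transaction would perform this translation at its boundary, so downstream consumers never see the fixed-width format.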
Leverage the Cloud
The cloud offers many benefits that are usually out of reach for legacy applications running on mainframes: greater agility, scalability, and cost-effectiveness. Modernizing your mainframe architecture means you can leverage the cloud and take advantage of these benefits.
GigaSpaces Can Help
The GigaSpaces Mainframe Modernization Solution is designed to optimize your mainframe investment and perfect your legacy strategy with elastic scale and agility while dramatically reducing costs. It’s based on four key components:
- Change Data Capture (CDC) – GigaSpaces offers a CDC solution that tracks changes in the mainframe database to ensure strong consistency between the data residing in the mainframe and in GigaSpaces
- Smart Caching – Allows you to offload MIPS from the mainframe by storing data in-memory and responding to requests from other applications with millisecond latency
- Modern Data Platform – Simplifies enterprise data architecture and powers the delivery of multiple use cases. Key features of the platform include:
- Microservices architecture
- Elastic scaling to support peaks
- Containerized and therefore can be deployed anywhere
- Supports all workloads (read-intensive, write-intensive, compute-intensive, batch & streaming)
- Combined OLTP & OLAP, allowing ACID-compliant transaction processing alongside analytics
- Runs AI/ML through native Spark support
- Supports any data structure (structured, semi-structured, unstructured)
- Complete agility package to support DevOps, MLOps & DataOps
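The “combined OLTP & OLAP” item deserves a concrete picture: the same in-memory dataset serves both transactional writes and analytical aggregation, with no ETL hop to a separate warehouse. The snippet below is a plain-Python illustration of that idea, not the GigaSpaces API:

```python
# Illustration of OLTP & OLAP over one in-memory dataset (hypothetical names).
from collections import defaultdict

orders = []  # shared in-memory dataset

def record_order(customer, amount_cents):
    """OLTP path: append one transaction."""
    orders.append({"customer": customer, "amount_cents": amount_cents})

def revenue_by_customer():
    """OLAP path: aggregate over the same live data, no ETL step."""
    totals = defaultdict(int)
    for order in orders:
        totals[order["customer"]] += order["amount_cents"]
    return dict(totals)

record_order("alice", 1200)
record_order("bob", 800)
record_order("alice", 300)
print(revenue_by_customer())  # {'alice': 1500, 'bob': 800}
```

In a real platform the analytical path would be distributed and indexed, but the architectural point is the same: analytics run against the transactional data as it lands.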
GigaSpaces InsightEdge: A Modern Data Platform Approach
- Multi-Region / Multi-Cloud – GigaSpaces provides an out-of-the-box, intelligent and efficient replication solution between remote sites. This capability can be used to replicate data to the cloud, allowing enterprises to migrate digital assets to the cloud gradually and continuously, with low risk.
If you’re interested in perfecting your legacy strategy to:
- Lower TCO by up to 80%
- Modernize legacy stacks and accelerate time-to-market by up to 10x
- Reduce risks and simplify legacy-to-cloud migration with a continuous migration strategy