Mainframe and the Value of Data
November 20, 2020

In the words of Sherlock Holmes: "It is a capital mistake to theorize before one has data."

This writer has always been partial to Big Iron, but with Big Iron comes a big amount of data. Each year, IBM Z is estimated to handle $23 billion worth of ATM transactions and over $7 trillion in credit and debit card transactions, producing a massive amount of complex data. Considering that more than 70% of global Fortune 500 companies use Z mainframes to run their core business functions, that is a lot of data to manage.

Of course, industry analysts and experts tell us there are many varied ways to find value in mainframe data. Yet the greater value lies in being able to find and use that data. There is tremendous value in historical data, and even more when the past is married to the present. The net benefit is the ability to make accurate predictions for the future based on knowledge of the past!

Only when we understand this data do we have the information to make decisions. For many, this is the value of Artificial Intelligence, but what if we could have that information now? Tactical and strategic decisions would not be made with stale data but with current and predictive information. With these capabilities, the future of your organization and its digital transformation can be supported through complex data analytics. Marissa Mayer, former Yahoo CEO and a leader in global business, said, "With data collection, the sooner the better is always the best answer." Why rely on data that is 15 minutes old, or much older, when you can have up-to-the-second decision tools at the ready?

How does this create value for your organization? Value is measured in different ways and there are many different types of worth, but what is value to you? Maybe it is reducing Mean Time to Recovery for critical systems and applications, or ensuring you recover long before your customers even realize there was an issue. Or perhaps it is creating budgetary visibility for your organization by reducing OPEX and CAPEX through optimization. These are examples of value that could deliver additional benefits to your business today.

Too often, barriers stop you from achieving these benefits, and the digital disruptions that challenge your organization are varied. You need solutions that mitigate those disruptions to deliver the business value your organization seeks. Traditionally, organizations use many different tools and solutions spread across various silos, departments, and teams. The situation can be further complicated by a lack of in-house skills and tribal knowledge. Because, after all, we know what happens with tribal knowledge – it often walks out the door with the Subject Matter Expert (SME). Further complicating the picture, Z mainframe skills are growing scarcer throughout the industry. As many as 62% of data center organizations recognize they will have a skills shortage within the next five years. We need a solution that can help us achieve our goals with less reliance on mainframe SMEs.

Z Subject Matter Experts spend time collecting and curating data. It is possible, even highly likely, that multiple teams within the organization are unknowingly collecting and processing the same data over and over; in typical legacy Z environments this is often the case. Just think about the resource savings if the data only has to be researched, collected, and modeled once. The potential savings for the business are significant.

Once data collection is automated, what other benefits could be realized? A significant benefit to an organization is the capability to capture and analyze data in a way that does not require ever-scarcer Z SMEs. Even more return comes from delivering the data in a form that lets you build out what-if scenarios and predictive business decisions for future growth. Both the business and IT need to be involved and to have meaningful reports for decision making, with the right data at the right time.

When developing an Information Technology strategic plan, the capability to determine which resources are being used by which parts of the organization delivers greater insight into the operational environment. For example, if a component of the strategy is to upgrade to a new processor or add CPUs, how valuable would it be to simulate the change before making the investment, based on historical, current, and forecasted data? That could allow you to defer hardware costs until you actually need the capacity.
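To make the what-if idea concrete, here is a minimal sketch, not tied to any particular analytics product and using made-up utilization figures: it fits a simple linear trend to hypothetical monthly peak CPU utilization and estimates when an assumed 90% upgrade threshold would be crossed.

```python
# Illustrative only: fit a linear trend to monthly peak CPU utilization (%)
# and estimate when a hypothetical 90% upgrade threshold would be reached.
# The utilization numbers below are invented for the sketch.
import numpy as np

months = np.arange(12)                               # last 12 months of history
peak_util = np.array([61, 63, 62, 65, 67, 66, 69,    # hypothetical monthly peaks
                      71, 72, 74, 76, 78], dtype=float)

slope, intercept = np.polyfit(months, peak_util, 1)  # simple linear trend

threshold = 90.0                                     # assumed upgrade trigger
months_to_threshold = (threshold - intercept) / slope - months[-1]

print(f"Trend: {slope:.2f}% per month")
print(f"Estimated months until {threshold:.0f}% peak utilization: "
      f"{months_to_threshold:.1f}")
```

A real capacity analytics tool would work from far richer workload data and models, but even this toy forecast shows how historical data can inform a defer-or-buy hardware decision.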

We live in a world where projects always contend for a slice of the annual budget. The capability to manage projects, budgets, and strategic planning with data that demonstrates hard-dollar savings through reduced mainframe resource usage can greatly enhance your business case to the executives. This is especially significant when dashboards and reports are delivered in real time to show where the value is within the organization. All without the need for a Z Subject Matter Expert.

Sound good? Take control of the total cost of ownership of your mainframe. 

IZPCA VIDEO SERIES: Episode 1

In this video, we illustrate how to use IBM Z Performance and Capacity Analytics, an #AIOps on #IBMZ solution, to create custom #Cognos reports and modify the underlying SQL.

21CS is a leader in the development of software solutions that are designed to create value across the business and IT spectrum.


© 2023 21CS. All Rights Reserved.