Mainframe modernization, enterprise hybrid batch applications, mainframe in the era of cloud computing, DevOps to deliver better software faster, and so on and so forth …

Frans Beyl
4 min read · Mar 11, 2022

Over the past five years, I have been repeatedly confronted by people talking about mainframe application modernization, mainframe and the cloud, and the ever-imminent end of mainframe computing if this does not happen.

All this time I have not seen it happen, and I became curious whether the reason might be that it's technically difficult to accomplish on z/OS.

Setting up some test cases taught me that it is not difficult at all, but that the will to discover and use the multitude of existing, mostly unknown, open source software packages is indispensable.

The image above shows the main tools I used for it. The most innovative tool is undoubtedly Zowe CLI, because it lets you integrate z/OS automation into most development tools in a simple way. To build these few cases, I chose Visual Studio Code with the Microsoft Extension Pack for Java, Red Hat's Language Support for Java, Broadcom's COBOL & JCL language support, and Zowe CLI + Zowe Explorer to dramatically speed up the development, test and deploy phases using VS Code tasks.
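To give an idea of what such a VS Code task looks like, here is a minimal `tasks.json` sketch wiring a Zowe CLI job submission to a one-keystroke task; the dataset and member names are placeholders, not the actual project's:

```json
{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "Submit compile & link job on z/OS",
      "type": "shell",
      "command": "zowe zos-jobs submit data-set \"MY.HLQ.JCL(COMPLINK)\" --view-all-spool-content",
      "problemMatcher": []
    }
  ]
}
```

The `--view-all-spool-content` option makes the job's spool output appear directly in the VS Code terminal, so compile listings can be checked without leaving the editor.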

To check whether a mainframe application can participate in a hybrid setup, I created a small Java app (available on GitHub) that delivers info to and consumes info from a Kafka instance in the cloud, and does so running on Linux, z/OS, or both (or any OS supporting Java). Since the free CloudKarafka solution has a browser interface to consume and produce messages, and because I liked the idea, I added some logic to halt the process on z/OS or Linux, and to stop both, reacting to messages sent to the topic from the browser page. I chose Java because it runs on multiple platforms, because it is mature on all of them, and because it runs on z/OS specialty engines, which should reduce software costs.
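The actual app is on GitHub; as a minimal sketch of the control-message logic described above, the consumer loop on each platform can map every incoming Kafka message value to an action. The message values "halt" and "stop" and the class/method names here are illustrative assumptions, not the project's real identifiers:

```java
// Sketch of the halt/stop control logic: a Kafka consumer loop would pass
// each record's value to decide() and act on the result. The values "halt"
// and "stop" are hypothetical conventions for this illustration.
public class ControlMessages {

    enum Action { PRODUCE, HALT, STOP }

    // Decide what to do with an incoming message value.
    static Action decide(String value) {
        if (value == null) return Action.PRODUCE;
        switch (value.trim().toLowerCase()) {
            case "halt": return Action.HALT; // pause producing on this host
            case "stop": return Action.STOP; // terminate the process
            default:     return Action.PRODUCE; // ordinary payload
        }
    }

    public static void main(String[] args) {
        System.out.println(decide("HALT "));   // HALT
        System.out.println(decide("stop"));    // STOP
        System.out.println(decide("payload")); // PRODUCE
    }
}
```

Because the same class file runs unchanged on Linux and on z/OS, sending one "stop" message from the CloudKarafka browser page ends both runs at once.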

Since this is Java, dev and debug were done locally inside VS Code using the lightweight Java debugger extension; no unit testing was done.

The following video shows the Java class started from VS Code to run on z/OS as a JES2 job, and the same class running in a Java JVM inside the VS Code terminal on Linux Mint. The browser interface of the CloudKarafka instance visualises the Kafka topic's incoming messages from the z/OS and Linux executions, and shows how both runs act on halt or stop messages sent from within the browser.

This shows a z/OS job + a Linux process + a cloud app acting together as one hybrid application. Functionally trivial, but technically a valid proof of concept, I suppose.

To check whether a mainframe COBOL application can invoke Java classes, I then wrote a COBOL program that invokes the Java project's main class, thus starting the project from within COBOL. To check whether a mainframe COBOL program can exchange data with Java classes, I added data-set input to the COBOL module, passed the records one by one to the invoked Java class via a DBB (direct byte buffer), and delivered them to the Kafka instance topic as a producer client.
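On the Java side of that hand-off, a direct byte buffer lets COBOL expose the record storage to the JVM without an extra copy. The following sketch shows the receiving end only; the class and method names are illustrative (the real project is on GitHub), and the simulation uses ASCII where a z/OS record would be EBCDIC:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Sketch of the Java side of the COBOL -> Java record hand-off: COBOL passes
// a direct byte buffer wrapping the record, and Java copies the bytes out so
// they can become the value of a Kafka producer record.
public class RecordBridge {

    // Copy the record bytes out of the (direct) buffer; the resulting
    // byte[] can be sent as-is by a KafkaProducer<byte[], byte[]>.
    public static byte[] recordBytes(ByteBuffer dbb, int recordLength) {
        byte[] record = new byte[recordLength];
        dbb.rewind();
        dbb.get(record, 0, recordLength);
        return record;
    }

    public static void main(String[] args) {
        // Simulate what COBOL would pass: a direct buffer holding one record.
        byte[] data = "HELLO FROM COBOL".getBytes(StandardCharsets.US_ASCII);
        ByteBuffer dbb = ByteBuffer.allocateDirect(data.length);
        dbb.put(data);
        byte[] record = recordBytes(dbb, data.length);
        System.out.println(new String(record, StandardCharsets.US_ASCII));
    }
}
```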

Once this was done, I wanted the input dataset to be a bit more 'old style' mainframe, and adapted the mainframe COBOL program to read a variable-blocked dataset with multi-structure records containing typical COBOL fields: packed-decimal, zoned-decimal and EBCDIC-coded character fields. The records are passed as-is (without any form of conversion) to the Kafka topic, and a Java class consumes the topic using the com.ibm.jzos.fields package to handle the COBOL record layouts. This class then runs in a JVM on Linux, z/OS or both, to create a PDF file with one simple structured page per record. The Java packages org.apache.kafka.clients, org.apache.pdfbox and com.ibm.jzos.fields were used.
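To illustrate what "passing the record as-is" means, here is roughly what a packed-decimal (COMP-3) field looks like on the wire and what a decoder such as the JZOS field classes must do with it. This hand-rolled decoder is for explanation only; it is not the com.ibm.jzos.fields API:

```java
// Illustrative decoder for a COBOL packed-decimal (COMP-3) field:
// two BCD digits per byte, with the final nibble holding the sign
// (0xD = negative, 0xC or 0xF = positive/unsigned).
public class PackedDecimal {

    static long decode(byte[] field) {
        long value = 0;
        for (int i = 0; i < field.length; i++) {
            int hi = (field[i] >> 4) & 0x0F; // high nibble: always a digit
            int lo = field[i] & 0x0F;        // low nibble: digit or sign
            value = value * 10 + hi;
            if (i < field.length - 1) {
                value = value * 10 + lo;
            } else {
                return lo == 0x0D ? -value : value; // last nibble is the sign
            }
        }
        return value;
    }

    public static void main(String[] args) {
        // A PIC S9(5) COMP-3 value of +12345 is X'12345C'; -12345 is X'12345D'
        System.out.println(decode(new byte[]{0x12, 0x34, 0x5C})); // 12345
        System.out.println(decode(new byte[]{0x12, 0x34, 0x5D})); // -12345
    }
}
```

This is exactly why the records can travel through Kafka untouched: the consumer only needs the record layout (the copybook) to make sense of the bytes, wherever it runs.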

The input dataset’s contents:

To develop and test the COBOL and Java on z/OS, I simply submitted good old compile & link / test jobs using VS Code tasks executing Zowe CLI commands, and browsed job status and output with Zowe Explorer. I still had to use x3270 to browse/edit the contents of MVS datasets with mixed binary/text content (GitHub issues exist for Zowe and vscode-hexeditor to solve this in the future). The code is available on GitHub.

I believe the last test case shows that one can modernize even decades-old COBOL batch applications with little effort, possibly even at limited cost when exploiting Z hardware features such as encryption (Crypto coprocessors) and compression (zEDC Express, or on the z15 even integrated on each processor), … all available to Java on z/OS.

This is just one simple example of application modernization usable during a transition period. There is much more waiting to be put into practice, e.g. REST APIs for z/OS services (z/OS Connect to CICS, IMS, Db2 REST, MQ), Container Extensions (zCX) available since z/OS 2.4, … and why not Change Data Capture, streaming data from an enterprise database to a data warehouse over Kafka (Change Data Capture with Debezium)?

After performing this exercise, I still wonder why I see so little of this possible modernization … All the information is available on the web, no specific knowledge is needed, and everything has been done before …

Some info and used links

Infrastructure used
Mainframe:
Zowe — z/OS Bootcamp labs system on IBM z14 running z/OS V2.3
Distributed:
Dell Precision M4400 laptop running Linux Mint 20.3
Cloud:
Cloudkarafka ‘Developer Duck’ instance on AWS

Tools used
Traditional mainframe:
x3270 v3.6ga4 free terminal emulator
Nontraditional mainframe:
Zowe CLI, Visual Studio Code with plugins Zowe Explorer, Microsoft Extension Pack for Java, Git, Broadcom JCL language support, Red Hat Language Support for Java, Maven for Java, …

Source reference
https://www.baeldung.com
https://www.w3schools.com/java/
https://kafka.apache.org
https://www.cloudkarafka.com
https://github.com/CloudKarafka/java-kafka-example
http://x3270.bgp.nu
https://code.visualstudio.com
https://www.zowe.org
https://marketplace.visualstudio.com/items?itemName=vscjava.vscode-java-pack

Documentation
New Ways of Running IBM z/OS Batch Applications (redbook)
Supercharge IMS Business Applications with Java
