Master the Mainframe – Meetup

We are proud to present a Master the Mainframe meetup on 22nd November 2018 in Erlangen.

Info about the meetup:

When? 22nd November 2018, 18:00 (CET)
Where? Blaues Hochhaus, LME Bildlabor (01.134), Martensstr. 3, 91058 Erlangen
Registration: cs5-mainprog@fau.de

Want to join us? All you have to bring is your notebook; we serve pizza and Mate!

What is Master the Mainframe?

You will participate in a challenge-based competition on your own computer. You will also receive Master the Mainframe badges when you complete Parts 2 and 3 of the challenges. And it’s fun! With interactive, hands-on activities, you will learn about the technology that controls 80% of the world’s data.

What is a Mainframe?

Industry relies on the mainframe for its speed, reliability, scalability, and unmatched security. Mainframes are critical to the success of a wide range of industries for high-speed cloud computing, real-time analytics, and more.

Here are some examples of how enterprises in different industries use mainframes in their daily business:

    • 44 out of the top 50 worldwide banks
    • 10 out of the top 10 insurers
    • 18 out of the top 25 retailers
    • 90% of the largest airlines

Also, 80% of the world’s data resides on mainframes (rather than on laptops or other servers), and 87% of all credit card transactions are processed via mainframes ($7.7 trillion per year).

You can read more about Master the Mainframe at
https://ibm.biz/masterthemainframe.

Practicum/Project: Flexible utilization/visualization of monitoring data (e.g. using Jupyter)

Database management systems such as Db2 for z/OS have numerous measuring instruments and metrics at their disposal to provide comprehensive, regular information about their status, the data they store and the queries they process. Monitoring tools make it easy to view and evaluate this information. In the context of SQL performance tuning, for example, CA Detector makes it possible to view the performance of SQL statements (execution times, wait times, etc.) in a predefined way and to compare it with historical data. Its visualization capabilities, however, are severely limited.

Based on the performance data pre-processed by the Detector, custom evaluations can be created flexibly via SQL in a suitable environment and visualized with suitable tools (e.g. Excel). However, this involves multiple media breaks (Detector in the terminal emulation, SQL execution in a separate environment, visualization e.g. in Excel) with various manual steps. Jupyter Notebook, which combines text, executable code and dynamic visualizations, could simplify this task considerably and enable flexible evaluations “from a single source”.

The aim of the work is to get to know Jupyter Notebook and its capabilities in order to evaluate the feasibility of the above thesis (flexible reports using Jupyter). If it proves suitable, selected example evaluations are to be implemented in Jupyter notebooks, and a compact guideline is to be developed on how to create such evaluations.
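The idea can be illustrated with a minimal sketch in which the SQL runs and the result is inspected in the same notebook cell. SQLite serves here as a stand-in for an actual Db2 connection, and the table and column names (stmt_perf, elapsed_ms) are purely hypothetical:

```python
import sqlite3

# Stand-in for a Db2 connection; table layout and names are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE stmt_perf (stmt_id TEXT, day TEXT, elapsed_ms REAL)")
con.executemany(
    "INSERT INTO stmt_perf VALUES (?, ?, ?)",
    [("Q1", "2018-11-01", 120.0), ("Q1", "2018-11-02", 180.0),
     ("Q2", "2018-11-01", 40.0),  ("Q2", "2018-11-02", 35.0)],
)

# The same SQL that would otherwise run in a separate environment
# is executed and its result examined in one notebook cell.
rows = con.execute(
    "SELECT stmt_id, AVG(elapsed_ms) FROM stmt_perf "
    "GROUP BY stmt_id ORDER BY stmt_id"
).fetchall()
for stmt, avg_ms in rows:
    print(f"{stmt}: average elapsed time {avg_ms:.1f} ms")
```

In a real notebook, the resulting rows could be handed straight to a plotting library in the next cell, removing the media break between SQL execution and visualization.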

Practicum/Project: Aggregation of detector data

The 24/7 cloud database operation of a relational database management system such as Db2 for z/OS at DATEV generates a large amount of important data. This includes, for example, the number of calls to individual applications and to individual SQL statements, as well as much other detailed information collected at system level: the so-called “detector data”. These mass data are currently cumulated/aggregated only to a small extent and can therefore be evaluated only over a limited period of time (10 or 30 days). To make it easier to evaluate their extremely valuable information, such as:

  • trends in applications
  • progressively evolving weak points
  • typical patterns in application operation, etc.

meaningful aggregations are required. These would then make it possible to:

  • evaluate the mass data quickly and efficiently
  • significantly extend the period of data retention and the possibilities for retrospective analysis (objective: annual and multi-year evaluations)

The aim of the work is to implement an appropriate aggregation of the available data via SQL on a database. Based on this, initial example evaluations are to be created that point out emerging weak points and the possibilities of trend analyses.
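As a sketch of the kind of aggregation meant here (SQLite again as a stand-in for Db2; the table detector_raw with one row per application and day is purely hypothetical), daily call counts could be condensed into a compact monthly table:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Hypothetical shape of the raw detector data: one row per application and day.
con.execute("CREATE TABLE detector_raw (app TEXT, day TEXT, calls INTEGER)")
con.executemany("INSERT INTO detector_raw VALUES (?, ?, ?)", [
    ("APP1", "2018-10-30", 500), ("APP1", "2018-10-31", 700),
    ("APP1", "2018-11-01", 900), ("APP2", "2018-11-01", 100),
])

# Aggregate to one row per application and month. The compact table can be
# retained far longer than the raw data, enabling long-term trend analyses.
con.execute("""
    CREATE TABLE detector_monthly AS
    SELECT app, substr(day, 1, 7) AS month, SUM(calls) AS calls
    FROM detector_raw
    GROUP BY app, month
""")
monthly = con.execute(
    "SELECT app, month, calls FROM detector_monthly ORDER BY app, month"
).fetchall()
print(monthly)
```

The aggregated table is small enough to be kept far beyond the 10- or 30-day window of the raw data, which is what would enable annual and multi-year evaluations.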

Practicum/Project: Software Development Mainframe (IT security)

The Access Management team is responsible for the technical implementation of IT security guidelines on the IBM mainframe (System z). Implementing, adapting and extending automated processes within System z security projects is part of its core business.

You bring:

  • Basic knowledge of programming
  • Experience with scripting languages
  • Experience in website design with HTML
  • Experience in REST API programming
  • Confident handling of Microsoft Office products
  • An independent way of working and the ability to work in a team
  • Interest in new topics and complex problems
  • Enthusiasm for IT security

Practicum/Project: Technical Basis Human Resources Management

The aim of the internship is to create a new data catalog in which the technical and functional descriptions of the data structures for payroll data at the DATEV data center are stored. Db2 serves as the storage system for the data catalog. The technical description is populated by analyzing program modules (Assembler or Cobol) during production transfer. The technical attributes will probably be recorded via a web frontend (Java, JavaScript).

Practicum/Project: Low Code Platforms – An Alternative for Development under z/OS?

Low-code platforms are currently on the rise, and the range of available platforms is constantly growing. To accelerate the development of software solutions, companies are increasingly relying on these development platforms, which make it possible to develop applications without having to write code. Low-code platforms offer a kind of “construction kit” of prefabricated components that can be combined into an application using drag & drop. Independently of this, custom code can be added if required.

There are now a number of commercial providers of low-code platforms and tools on the market. The topic of the internship work is the evaluation of the relevant low-code platforms for the development of business applications and thus for a potential use at DATEV.

The aim is to investigate which market-relevant low-code platforms are generally available for the development of business applications and to evaluate these platforms with regard to their possible use at DATEV. A prerequisite for using one of these platforms is the ability to create modern, service-oriented applications that are largely platform-independent and can therefore be hosted on different target systems. Since the DATEV data center currently runs a high workload of business applications on the z-platform, one of the target platforms to be considered is the z-platform under z/OS or zLinux. The second target platform is Cloud Foundry on Linux under x86.

Practicum/Project: Transactions Human Resources

Over 13 million employees receive their pay slips via DATEV. This is why quality assurance has top priority.

The payroll core for the payroll accounting product LODAS is maintained and further developed by RZ. Due to the domain-specific as well as technical complexity of this payroll core, there is a high demand for individually prepared tests.

Your challenge is the development and testing of elements of test automation, module/unit tests and nightly builds, with the aim of improving our quality assurance.

Skills/technologies that can be acquired in this way:

  • Software development on the mainframe
  • Software development with modern programming languages (e.g. Java)
  • Tools such as ID/z, Git, host emulations/tools
  • Test methodology/test management
  • Agile development process
  • Teamwork
  • Involvement of customers (= the software developers of the LODAS accounting core on the data center side)

Searchable, syntax-highlighted 3270 terminal screen as PDF

To print your terminal screen to a PDF file, execute the following steps:

1. Requirements

You need a 3270 terminal emulator (x3270 or c3270) as well as wkhtmltopdf, pdfcrop and pdfinvert.rb (all used in the steps below).

2. Print the terminal content into an HTML file

  • In the 3270 terminal emulator, navigate to the screen you want to print
  • Open the “Enter Action” dialog
    • in x3270: from the menu, select File -> Execute an Action
    • in c3270: open the c3270> prompt
  • Enter the following action, where filename is the name of the output file:
    printText(html, filename.html)

3. Convert the HTML to a PDF file

First, convert the filename.html file to PDF using wkhtmltopdf:

wkhtmltopdf filename.html filename.pdf

Then crop the pdf:

pdfcrop filename.pdf filename.pdf

Those two steps produce the following result: filename

4. Change the color

To invert the colors in the filename.pdf file, run the following command:

./pdfinvert.rb filename.pdf filenameInverted.pdf

You can also specify a colorfile, which contains a colormap that replaces colors in the PDF with other specified colors:

./pdfinvert.rb -c colorfile filename.pdf filenameColored.pdf

My colorfile:

00 #000000 #ffffff
00 #ffffff #1d1f21
00 #00bfff #3971ed
00 #ff0000 #cc342b
00 #ffc0cb #a36ac7
00 #00ff00 #198844
00 #40e0d0 #39c6ed
00 #ffff00 #fba922

To learn more about the colorfile, read the documentation of pdfinvert.rb.

After changing the colors, crop the pdf one last time:

pdfcrop filenameColored.pdf filenameColored.pdf
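The steps above can also be collected in a small helper script. The sketch below only assembles the command lines from the steps above and prints them (wkhtmltopdf, pdfcrop and pdfinvert.rb are assumed to be installed; the function name build_pipeline is our own):

```python
def build_pipeline(name, colorfile=None):
    """Commands that turn name.html into a cropped, recolored PDF."""
    if colorfile:
        # step 4 with a colorfile: replace colors according to the colormap
        out = f"{name}Colored.pdf"
        invert = ["./pdfinvert.rb", "-c", colorfile, f"{name}.pdf", out]
    else:
        # step 4 without a colorfile: plain color inversion
        out = f"{name}Inverted.pdf"
        invert = ["./pdfinvert.rb", f"{name}.pdf", out]
    return [
        ["wkhtmltopdf", f"{name}.html", f"{name}.pdf"],  # step 3: HTML -> PDF
        ["pdfcrop", f"{name}.pdf", f"{name}.pdf"],       # step 3: crop
        invert,                                          # step 4: change colors
        ["pdfcrop", out, out],                           # final crop
    ]

# Print the commands instead of executing them.
for cmd in build_pipeline("screen", colorfile="colorfile"):
    print(" ".join(cmd))
```

Each printed command could then be executed, for example with Python’s subprocess.run(cmd, check=True).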

The result looks like this: filenameColored

 

Update: replaced colorfile.

Getting started: Cobol development with Gradle in 3 steps

1. Install

To use the plugin, you need GnuCOBOL and Gradle.

On Ubuntu 18.10 and higher:

sudo apt install gradle gnucobol

On Arch (via yaourt):

yaourt gnu-cobol gradle

 

2. Configure your project

Create the project structure:

.
├── build.gradle  (empty)
└── settings.gradle  (empty)

Import the plugin from the provided repo (in your settings.gradle):

pluginManagement {
	repositories {
		maven {
			url 'https://sebastianruziczka.de/repo/mvn/'
		}
	}
}

Add to your build.gradle:

plugins {
     id 'de.sebastianruziczka.Cobol' version 'latest'
}

3. Run HELLOWORLD.cbl

Place HELLOWORLD.cbl in src/main/cobol and run it with a single command:

gradle helloWorld

Running

Run your application with

gradle runDebug

or build a complete executable and run it with:

gradle runExecutable

Additional configuration

A minimal configuration for more than one Cobol source file:

cobol {
     srcMain = 'HelloWorld' // Path to your main file in src/main/cobol without the file extension .cbl
}

 

Additional information

 

Sebastian Ruziczka: Continuous Integration of Cobol software using the example of “Bank of Cobol” – can open source frameworks compete with commercial software products?

Researcher: Sebastian Ruziczka

Start date: 29.03.2018

Introduction

In modern software development, continuous integration is an integral part of the development process. Centralized compilation of the source code, software tests, and the calculation of software metrics, for example, serve long-term quality assurance.

At around 60 years of age, the Cobol programming language is one of the oldest high-level languages still in active use. Especially in systemically relevant areas, Cobol forms the foundation for banks and insurance companies. Due to their historical conception, however, the programs currently in use have hardly been adapted to modern software development. The long lifetime of Cobol source code and the criticality of these applications in particular offer great potential for continuous integration.

Because Cobol targets different platforms, commercial and free tools for individual sub-functions of software integration have emerged. However, a complete integration process based on open source software is still missing. In order to keep the Cobol programming language positioned in a future-proof way, the Cobol universe should be extended by an open source option for continuous software integration.

Link to the GitHub repository.