Database management systems such as Db2 for z/OS expose numerous metrics and counters that regularly provide comprehensive information about their status, the data they store and the queries they process. Monitoring tools make it easy to view and evaluate this information. In the context of SQL performance tuning, CA Detector is one example: it presents the performance of SQL statements (execution times, wait times, etc.) in a predefined way and compares them with historical data. Its visualization capabilities, however, are severely limited.
Based on the performance data pre-processed by Detector, custom evaluations can be created flexibly via SQL in a suitable environment and visualized with suitable tools (e.g. Excel). However, this involves multiple media breaks (Detector in the terminal emulation, SQL execution in a separate environment, visualization e.g. in Excel) and various manual steps. Jupyter Notebook could simplify this task considerably by combining text, executable code and dynamic visualizations, enabling flexible evaluations "from a single source".
The aim of the work is to get to know Jupyter Notebook and its capabilities in order to evaluate the feasibility of the above thesis (flexible reports using Jupyter). If it proves suitable, selected example evaluations are to be implemented in Jupyter notebooks, and a compact guideline is to be developed on how to create such evaluations.
The 24/7 cloud database operation of a relational database management system such as Db2 for z/OS at DATEV generates a large amount of important data: for example, the number of calls to individual applications and to individual SQL statements, along with much other detailed information collected at system level – the so-called "detector data". These mass data are currently cumulated/aggregated only to a small extent and can therefore be evaluated only over a limited period (10 or 30 days). To make their extremely valuable information easier to evaluate, meaningful aggregations are required.
The aim of the work is to implement an appropriate aggregation of the available data via SQL on a database. Based on this, initial example evaluations are to be created that point out resulting weak points and the possibilities for trend analyses.
The Access Management team is responsible for the technical implementation of IT security guidelines on the IBM mainframe (System z). Implementing, adapting and extending automated processes within System z security projects is part of its core business.
Low-code platforms are currently on the rise, and the range of available platforms is constantly growing. To accelerate the development of software solutions, companies increasingly rely on these development platforms, which make it possible to build applications without writing code. Low-code platforms offer a kind of "construction kit" of prefabricated components that can be combined into an application via drag & drop. Independently of this, custom code can be added if required.
There are now a number of commercial providers of low-code platforms and tools on the market. The topic of this internship is the evaluation of the relevant low-code platforms for the development of business applications and thus for potential use at DATEV.
The aim is to investigate which market-relevant low-code platforms are available for developing business applications and to evaluate them with regard to possible use at DATEV. A prerequisite for using one of these platforms is the ability to create modern, service-oriented applications that are largely platform-independent and can therefore be hosted on different target systems. Since the DATEV data center currently runs a high workload of business applications on the z platform, one target platform to be considered is System z under z/OS or zLinux; the second is Cloud Foundry on Linux under x86.
Over 13 million employees receive their pay slips via DATEV. This is why quality assurance has top priority.
The payroll core for the payroll accounting product LODAS is maintained and further developed by RZ. Due to the domain as well as technical complexity of this payroll core, there is a high demand for individually prepared tests.
Your challenge is the development and testing of test-automation elements (module/unit tests, nightly builds) with the aim of strengthening our quality assurance.
Skills/technologies that can be acquired in this way:
To print your terminal screen to a PDF file, execute the following steps:
First, convert the filename.html file to PDF using wkhtmltopdf:
wkhtmltopdf filename.html filename.pdf
Then crop the pdf:
pdfcrop filename.pdf filename.pdf
Those two steps produce the following result:
[image: filename.pdf]
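The two commands above can be bundled into a small helper; a minimal sketch, assuming wkhtmltopdf and pdfcrop are on PATH (the function name html2croppedpdf is just illustrative):

```shell
# Sketch: render an HTML terminal dump to a tightly cropped PDF.
# Assumes wkhtmltopdf and pdfcrop are installed and on PATH.
html2croppedpdf() {
    base="$1"                              # file name without extension
    wkhtmltopdf "$base.html" "$base.pdf"   # HTML -> PDF
    pdfcrop "$base.pdf" "$base.pdf"        # trim the margins in place
}
```

Call it as `html2croppedpdf filename` to get a cropped filename.pdf next to filename.html.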
To invert the colors in the filename.pdf file, run the following command:
./pdfinvert.rb filename.pdf filenameInverted.pdf
You can also specify a colorfile containing a colormap that replaces colors in the PDF with other colors specified in the map:
./pdfinvert.rb -c colorfile filename.pdf filenameColored.pdf
00 #000000 #ffffff
00 #ffffff #1d1f21
00 #00bfff #3971ed
00 #ff0000 #cc342b
00 #ffc0cb #a36ac7
00 #00ff00 #198844
00 #40e0d0 #39c6ed
00 #ffff00 #fba922
To learn more about the colorfile, read the documentation of pdfinvert.rb.
After changing the colors, crop the PDF one last time:
pdfcrop filenameColored.pdf filenameColored.pdf
The result looks like this:
[image: filenameColored.pdf]
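The whole pipeline (convert, crop, recolor, crop again) can be sketched as one function. This is only a sketch under the assumptions that wkhtmltopdf and pdfcrop are on PATH and that pdfinvert.rb sits in the current directory; the function name terminal2pdf and the PDFINVERT override variable are my own additions, not part of the tools:

```shell
# Sketch of the full pipeline: HTML dump -> cropped, recolored PDF.
# PDFINVERT lets you point at wherever pdfinvert.rb actually lives.
PDFINVERT="${PDFINVERT:-./pdfinvert.rb}"

terminal2pdf() {
    base="$1"        # file name without extension
    colorfile="$2"   # optional colormap; empty means plain inversion

    wkhtmltopdf "$base.html" "$base.pdf"   # render the HTML dump
    pdfcrop "$base.pdf" "$base.pdf"        # trim surrounding whitespace

    if [ -n "$colorfile" ]; then
        # recolor via the colormap, then crop the recolored file
        "$PDFINVERT" -c "$colorfile" "$base.pdf" "${base}Colored.pdf"
        pdfcrop "${base}Colored.pdf" "${base}Colored.pdf"
    else
        # plain color inversion, then crop
        "$PDFINVERT" "$base.pdf" "${base}Inverted.pdf"
        pdfcrop "${base}Inverted.pdf" "${base}Inverted.pdf"
    fi
}
```

For example, `terminal2pdf filename colorfile` produces the cropped filenameColored.pdf shown above, while `terminal2pdf filename` produces filenameInverted.pdf.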
Update: replaced colorfile.