Database management systems such as Db2 for z/OS expose numerous metrics and counters that regularly provide comprehensive information about their status, the data they store, and the queries they process. Monitoring tools make this information easy to view and evaluate. In the context of SQL performance tuning, CA Detector is one example: it presents the performance of SQL statements (execution times, wait times, etc.) in a predefined way and allows comparison with historical data. Its visualization capabilities, however, are severely limited.
Based on the performance data pre-processed by Detector, custom evaluations can be created flexibly via SQL in a suitable environment and visualized with suitable tools (e.g. Excel). However, this involves multiple media breaks (Detector in the terminal emulation, SQL execution in a separate environment, visualization e.g. in Excel) and various manual steps. Jupyter Notebook could simplify this task considerably by combining text, executable code and dynamic visualizations, enabling flexible evaluations "from a single source".
The aim of the work is to become familiar with Jupyter Notebook and its capabilities in order to evaluate the feasibility of the above thesis (flexible reports using Jupyter). If it proves suitable, selected example evaluations are to be implemented in Jupyter notebooks, and a compact guideline is to be developed describing how to create such evaluations.
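To give a flavor of how such a workflow could collapse into a single notebook cell, here is a minimal sketch in plain Python. The row layout, statement ids and timings are invented for illustration; a real notebook would load an actual Detector export (e.g. with pandas) instead of hard-coded sample rows:

```python
import statistics
from collections import defaultdict

# Hypothetical rows from a Detector export: (statement id, elapsed time in ms).
# Ids and values are illustrative only, not the real export format.
current = [("Q1", 120), ("Q1", 140), ("Q2", 30), ("Q2", 34)]
history = [("Q1", 90), ("Q1", 100), ("Q2", 31), ("Q2", 33)]

def mean_by_stmt(rows):
    """Average elapsed time per SQL statement id."""
    groups = defaultdict(list)
    for stmt, elapsed in rows:
        groups[stmt].append(elapsed)
    return {stmt: statistics.mean(vals) for stmt, vals in groups.items()}

cur, hist = mean_by_stmt(current), mean_by_stmt(history)

# Flag statements whose average elapsed time grew by more than 20 %
# compared to the historical baseline.
regressions = {s for s in cur if cur[s] > 1.2 * hist.get(s, float("inf"))}
print(sorted(regressions))  # prints ['Q1']
```

In a notebook, the resulting figures could feed directly into a bar chart or trend line in the same document, avoiding the media break described above.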
The 24/7 cloud database operation of a relational database management system such as Db2 for z/OS at DATEV generates a large amount of important data. This includes, for example, the number of calls to individual applications and individual SQL statements, along with much other detailed information collected at system level: the so-called "detector data". These mass data are currently cumulated/aggregated only to a small extent and can therefore only be evaluated over a limited period (10 or 30 days). To make the extremely valuable information they contain easier to evaluate:
- trends in applications
- progressively emerging weak points
- typical patterns in application operation, etc.
meaningful aggregations are required. These would then enable the following:
- make the mass data quickly and efficiently evaluable
- significantly extend the period of data retention and the scope for retrospective analysis (objective: annual / multi-year evaluations)
The aim of the work is to implement an appropriate aggregation of the available data via SQL on a database. Based on this, initial example evaluations are to be created that demonstrate the resulting weak points and the possibilities of trend analyses.
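The intended aggregation can be sketched as follows, using SQLite as a stand-in for Db2. The table and column names are invented for illustration and will differ from the real detector tables:

```python
import sqlite3

# In-memory stand-in for the detector data; table and column names are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE detector_raw (day TEXT, application TEXT, calls INTEGER)")
con.executemany(
    "INSERT INTO detector_raw VALUES (?, ?, ?)",
    [
        ("2023-01-01", "AppA", 100),
        ("2023-01-02", "AppA", 120),
        ("2023-01-01", "AppB", 50),
        ("2023-01-02", "AppB", 55),
    ],
)

# Aggregate the daily mass data into one compact row per application and month.
# These compact rows can be retained far beyond the raw 10/30-day window,
# enabling annual or multi-year trend evaluations.
rows = con.execute(
    """
    SELECT substr(day, 1, 7) AS month,
           application,
           SUM(calls) AS total_calls
    FROM detector_raw
    GROUP BY month, application
    ORDER BY application
    """
).fetchall()
print(rows)  # [('2023-01', 'AppA', 220), ('2023-01', 'AppB', 105)]
```

The same GROUP BY pattern, applied to the real detector tables on Db2, would reduce the mass data to a size that supports long-term storage and fast retrospective queries.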
The Access Management Team is responsible for the technical implementation of IT security guidelines on the IBM mainframe (System z). The implementation, adaptation and extension of automated processes within the framework of System z security projects is part of its core business.
- Basic knowledge of programming
- Experience with scripting languages
- Experience in website design with HTML
- Experience in REST-API programming
- Confident handling of Microsoft Office products
- An independent way of working and the ability to work in a team
- Interest in new topics and complex problems
- Enthusiasm for IT security
Low-code platforms are currently on the rise, and the range of available platforms is constantly growing. To accelerate the development of software solutions, companies are increasingly relying on these development platforms, which make it possible to develop applications without having to write code. Low-code platforms offer a kind of "construction kit" of prefabricated components that can be combined into an application via drag & drop. Custom code can still be added where required.
There are now a number of commercial providers of low-code platforms and tools on the market. The topic of this internship project is the evaluation of the relevant low-code platforms for the development of business applications, and thus for potential use at DATEV.
The aim is to investigate which market-relevant low-code platforms are available for the development of business applications and to evaluate these platforms with regard to their possible use at DATEV. A prerequisite for using one of these platforms is that it supports the creation of modern, service-oriented applications that are largely platform-independent and can therefore be hosted on different target systems. Since the DATEV data center currently runs a high workload of business applications on the z platform, one of the target platforms to be considered is the z platform under z/OS or zLinux. The second target platform is Cloud Foundry on Linux under x86.
Over 13 million employees receive their pay slips via DATEV. Quality assurance therefore has top priority.
The payroll core of the payroll accounting product LODAS is maintained and further developed by RZ. Due to the professional and technical complexity of this payroll core, there is a high demand for individually prepared tests.
Your challenge is to test and develop elements of test automation, module/unit tests and nightly builds, with the aim of strengthening our quality assurance.
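To illustrate the kind of unit test meant here, a minimal sketch follows (in Python for brevity; the payroll function and its rules are entirely hypothetical, and the actual project work would use the technologies listed below, e.g. Java):

```python
import unittest

def gross_to_net(gross: float, tax_rate: float) -> float:
    """Hypothetical, highly simplified payroll helper: deduct a flat tax."""
    if not 0 <= tax_rate < 1:
        raise ValueError("tax_rate must be in [0, 1)")
    return round(gross * (1 - tax_rate), 2)

class GrossToNetTest(unittest.TestCase):
    """The kind of unit test a nightly build would run automatically."""

    def test_flat_tax(self):
        self.assertEqual(gross_to_net(3000.0, 0.3), 2100.0)

    def test_invalid_rate_rejected(self):
        with self.assertRaises(ValueError):
            gross_to_net(3000.0, 1.5)

# Run the tests programmatically; in a nightly build this step is automated.
result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(GrossToNetTest)
)
```

A nightly build runs such suites against every change to the payroll core, so regressions in the complex accounting logic surface before they reach customers.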
Skills/technologies that can be acquired in this way:
- Software development on the mainframe
- Software development with modern programming languages (e.g. Java)
- Tools such as IDz, Git, host emulations/tools
- Test methodology / test management
- Agile development process
- Involvement of customers (= the software developers of the LODAS accounting core on the data center side)