
Overview

Google “Digital Transformation” and you’ll get a lot of diagrams, white papers, and dire predictions for companies not fully embracing it. The common theme is that businesses must innovate at ever-increasing speed. The limitations of the traditional IT delivery model make it difficult to evolve systems at this pace. The recipe is to let business teams rapidly deliver their own projects by reusing core business data and logic packaged as APIs. Modern low code development tools further democratize the delivery of new UI and process layers. The result is a network of applications that address specific business capabilities, evolve rapidly and independently of other components, and communicate through a common integration fabric.

It’s a great model for startups that can get up and running in no time with modern cloud-based applications such as Salesforce, NetSuite, Workday, and ServiceNow (an extra shot of readily available capital never hurts either). Most modern SaaS applications have been built from the ground up around core APIs and can be easily plugged into an integration platform. By contrast, established companies are saddled with a number of applications that have been developed over time with limited or no pre-built APIs or other out-of-the-box integration methods.

In this post, I discuss typical IBM i integration use cases and summarize the technology options for plugging into the application network. As we all know, for any given IT problem there are always at least half a dozen tools and methods, and often many more. Below I list the tools our team has used or evaluated. Don’t see your tool or technique of choice listed here? I’d appreciate it if you sent it my way!

Use Cases Summary

 

I want to… → Then use this pattern…

  • Consume IBM i data → Direct access to DB2 for i, or IBM i API provider
  • Execute IBM i business logic → Direct call to IBM i programs, or IBM i API provider
  • Consume IBM i “green screen” applications → IBM i UI to API
  • Execute a batch process on IBM i → Bulk Data Processing, or IBM i API provider
  • Consume external data from IBM i → Remote DB access from IBM i, or IBM i API consumer
  • Execute external business logic from IBM i → IBM i API consumer
  • Stream IBM i events to another system(s) → Remote DB access from IBM i, or IBM i API consumer
  • Meet the Infoview and MuleSoft crew in person and discuss my use cases → Join us at the MuleSoft Summit Chicago and the COMMON Fall Conference in Columbus! There will be a great lineup of speakers and, of course, awesome food and an after-glow reception.

Pattern Implementation Details

We identified the following common IBM i integration patterns:

  • Direct access to DB2 and IBM i programs
  • IBM i API provider
  • IBM i UI as API
  • Remote DB access from IBM i
  • IBM i API consumer
  • Bulk Data Processing

Below is a high-level overview of each pattern. I will discuss the use case details and examples further in follow-up posts.

Direct Access to DB2 and IBM i Programs


The majority of IBM i applications use the integrated DB2 database as their data store. IBM DB2 is a mature and robust relational database with standard JDBC and ODBC connectivity. When faced with a requirement to access IBM i application data from external clients, the first thing that comes to mind is to simply connect to the IBM i directly from the source system.

A similar approach can be used for executing IBM i business logic: the client system directly calls IBM i programs or executes a DB2 stored procedure that wraps the IBM i program.
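As an illustration of the stored procedure wrapper, the DDL below exposes an RPG program as an SQL procedure that any JDBC/ODBC client can call. The library, program, and parameter names are hypothetical; the actual signature must match the target program's parameter list.

```sql
-- Wrap the RPG program MYLIB/GETCUST as an SQL stored procedure.
-- External clients can then run: CALL MYLIB.GETCUST('C000123', ?)
CREATE PROCEDURE MYLIB.GETCUST (
    IN  CUSTID  CHAR(7),
    OUT CUSTDTA CHAR(100)
)
LANGUAGE RPGLE
EXTERNAL NAME 'MYLIB/GETCUST'
PARAMETER STYLE GENERAL;
```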

Pros

  • Easy to implement for simple data structures
  • May provide more low-level performance tuning options and faster response times for sequential data retrieval

Cons

  • Point to point
  • No reuse; tight coupling to the IBM i DB structure
  • Each consumer must implement the IBM i application-specific transformation and business logic
  • Limited control over data access
  • Potential impact on IBM i system performance
  • Potential security risks due to opening the IBM i DB to multiple client applications

Tools

Java: IBM i Toolbox for Java (JTOpen, JT400) is a Java library that includes JDBC driver as well as tools for working with various IBM i objects, such as programs, service programs, command calls, system values, IFS, etc.

.Net: IBM i Access for Windows provides an ODBC driver for .Net clients – CWBX.dll and the IBM i Data Provider enable DB2 access and allow working with various IBM i objects (calling programs, checking output queues, etc.).

PHP: PHP Toolkit for IBM i provides DB access and the ability to call programs, work with queues etc.

When to use

  • Not recommended, except in limited cases when the current and projected number of interfaces with IBM i is small and IT has full control over the consumer applications that access the IBM i database
  • Interfaces dealing with very large data sets that warrant specialized clients tailored to a particular optimization goal

IBM i as API Provider


The biggest problem with clients accessing the application DB or calling programs directly is that the client code is now tightly coupled to the IBM i data model. In many cases, each client must also implement application-specific business rules for extracting and formatting the data. Even with just a few client applications, it becomes painful to evolve the IBM i application and DB model or the dependent clients. After some time it gets to the point where even a small change in one system takes months of coordination across multiple development teams.

The solution is to provide access to data and business logic via reusable APIs. Instead of accessing the IBM i database or calling a program directly, the consumer application calls an API that executes the IBM i data access program and/or accesses the IBM i database. This approach insulates the clients from internal IBM i application changes as long as they don’t affect the request/response structure.
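To make the insulation idea concrete, here is a minimal sketch of such an API facade, written in Python for brevity. The program call is a stub, and all names and the JSON shape are hypothetical; a real implementation would sit on an integration platform and call the actual IBM i data access program.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stub standing in for the real IBM i data access program call
# (in practice this would go through JT400, a connector, etc.).
def call_data_access_program(customer_id: str) -> dict:
    return {"id": customer_id, "name": "ACME Corp", "balance": 1250.00}

class CustomerApi(BaseHTTPRequestHandler):
    """Minimal REST facade: clients see a stable JSON contract,
    never the underlying DB2 schema or program parameters."""

    def do_GET(self):
        customer_id = self.path.rsplit("/", 1)[-1]
        body = json.dumps(call_data_access_program(customer_id)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *_):
        pass  # keep demo output quiet

server = HTTPServer(("127.0.0.1", 0), CustomerApi)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/customers/C000123"
with urllib.request.urlopen(url) as resp:
    result = json.load(resp)
server.shutdown()
```

The point of the sketch: the internal DB model or program can change freely, and as long as the JSON contract stays stable, no consumer is affected.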

There are a number of technical approaches for executing IBM i programs remotely. An important factor to consider is where the request/response transformation happens. I found the most efficient approach is to have RPG programs deal with IBM i database operations and program calls, and to use robust mapping and prototyping tools, such as MuleSoft DataWeave, for XML / JSON / flat file translations.
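To illustrate where that transformation layer sits, the Python sketch below maps a JSON request onto a fixed-width RPG parameter buffer and back. The field names and widths are invented for illustration; in practice a mapping tool such as DataWeave handles this step.

```python
import json

# Hypothetical parameter data structure of an RPG data access program:
# CUSTID (7 chars), NAME (25 chars), BALANCE (11 chars).
FIELDS = [("custid", 7), ("name", 25), ("balance", 11)]

def to_rpg_buffer(payload: dict) -> str:
    """Map a parsed JSON request onto the fixed-width parameter buffer."""
    return "".join(str(payload.get(name, "")).ljust(width)[:width]
                   for name, width in FIELDS)

def from_rpg_buffer(buffer: str) -> dict:
    """Parse the fixed-width buffer returned by the program into a dict."""
    result, pos = {}, 0
    for name, width in FIELDS:
        result[name] = buffer[pos:pos + width].strip()
        pos += width
    return result

request = json.loads('{"custid": "C000123", "name": "ACME Corp"}')
buffer = to_rpg_buffer(request)      # hand this to the RPG program call
response = from_rpg_buffer(buffer)   # and map its output back to JSON
```

The RPG program only ever sees the flat buffer; all JSON handling stays in the integration layer, which is exactly the separation of duties argued for above.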

Pros

  • Avoids tight coupling of the consumer to the IBM i data model and program parameters
  • Plays well in point-to-point, hub and spoke, application network, and other integration topologies
  • Enables discovery and reuse of IBM i APIs across multiple consumers/systems
  • Enables security and governance with an API management tool
  • Standard REST or SOAP APIs are easy to consume

Cons

  • Extra effort and cost to implement and operate an API layer
  • Performance considerations for very high data volumes

Tools

Language-specific remote program call implementations

  • IBM i Toolbox for Java
  • IBM i Access for Windows
  • PHP Toolkit for IBM i
  • DB2 Stored Procedure via Database connectivity

Implementation Tools and platforms

  • MuleSoft Anypoint + IBM i (AS/400) connector + Web Transaction Framework
  • IBM Integration Bus
  • IBM i Integrated Web Services Server
  • IBM i Apache HTTP server + custom CGI programs
  • Newlook by Fresche Legacy (look software)
  • OpenLegacy Server
  • RPGXML by KrengelTech
  • Rocket API by Rocket Software

API lifecycle suites

  • Mulesoft Anypoint
  • IBM API Connect
  • CA API Management
  • Apigee (Google)
  • Microsoft Azure API Management

When to use

  • Preferred option except for processing very large data sets

IBM i UI to API


The above two use cases covered common integration scenarios of providing easy access to IBM i application data or executing business logic. Data access works great when the data model is well designed and aligns with the business domain language. It’s also pretty straightforward to produce an IBM i API when the business logic is externalized as (or can be easily extracted into) callable programs.

For scenarios where the application source code is not available, or complex business logic is embedded in UI programs and cannot be easily extracted, the common integration approach is to use a “screen scraper” tool that taps into the IBM i user interface stream (the 5250 protocol), sends the request data via simulated user keystrokes, and retrieves the response by reading the green screen data. The end result is an API wrapper, with the IBM i UI program working in the background as if a user had performed all the data entry. Another flavor of the same approach uses RPG Open Access handlers for scenarios where the UI source code is available for modification.
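To make the mechanics concrete, here is a small Python sketch of the core screen-scraping operations: reading and writing fields on a flattened 24x80 screen buffer by (row, column, length) coordinates. The layout and field positions are invented for illustration; real tools also handle keyboard AIDs, navigation, and error screens.

```python
# A 5250 screen is a 24x80 character grid; a scraper reads and writes
# fields by coordinates taken from the application's screen layout.
ROWS, COLS = 24, 80

def read_field(screen: str, row: int, col: int, length: int) -> str:
    """Extract a field from the flattened screen buffer (1-based row/col)."""
    start = (row - 1) * COLS + (col - 1)
    return screen[start:start + length].strip()

def write_field(screen: str, row: int, col: int, value: str) -> str:
    """Simulate typing a value into an input field on the screen."""
    start = (row - 1) * COLS + (col - 1)
    return screen[:start] + value + screen[start + len(value):]

blank = " " * (ROWS * COLS)
# Simulate the application painting a customer name at row 5, column 20.
screen = write_field(blank, 5, 20, "ACME Corp")
name = read_field(screen, 5, 20, 25)
```

The fragility is visible in the sketch itself: if the application moves the field by one row or column, the wrapper silently reads the wrong data, which is why these interfaces are so tightly coupled to the UI.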

Screen scraping generally carries a bad stigma, unless, of course, you work for one of the vendors in this space. The typical use of such tools is to modernize the green screen UI with minimal effort, and the main concerns are the low quality of auto-generated UIs, tight coupling to the underlying IBM i application, limited reuse, duplication of effort, and the impact on CI/CD. Of course, real life is far from black and white, and there are use cases where screen-scraping technology can bring value. Modernizing the IBM i UI, though, is a whole separate topic for another time and another post.

In the context of system integrations, however, the approach can be pretty efficient in the short or medium term for externalizing business processes, especially when the source code is not available, or when the business logic is very complex and spread across multiple UI programs, making it difficult and risky to externalize.

Pros

  • Exposes IBM i business logic even if there’s no access to source code
  • Fast implementation cycle

Cons

  • Tightly coupled to the underlying UI screens
  • Performance may suffer depending on the user “path”

Tools

As mentioned above, the limited list below includes the tools our team has used or evaluated. There are a number of other vendors in the UI modernization space, and I would think most can, to a certain degree, address the 5250-to-API transformation.

  • Tn5250j
  • IBM Host Access Transformation Services (HATS)
  • Zend 5250 Bridge
  • OpenLegacy iSuite
  • Rocket API
  • Axes Robot

When to use

  • Need to create API for COTS application with no source code
  • Business logic is complex and embedded into UI code
  • Aggressive project timelines
  • UI screens are not being actively modified
  • Performance is not a concern

IBM i remote DB access


In this scenario, a process running on IBM i directly accesses a remote DB. I have seen this approach used extensively in environments where the application database is partitioned across multiple IBM i systems and there’s a need for real-time data federation. Another common use case for a direct interface with external systems is where IBM i developers have strong Java skillsets and need to just “get things done” without relying on an overburdened SOA team.

The most typical approach is to use Java / JDBC to connect an IBM i process to the remote database. There are several strategies and considerations for JVM and class loading, performance, security, and monitoring.

Pros

  • Easy to implement for simple data structures
  • May provide more low-level performance tuning options and faster response times for sequential data retrieval

Cons

  • Point to point
  • No reuse; tight coupling to the remote DB structure
  • The IBM i process must implement the remote application-specific transformation and business logic
  • Limited control over data access
  • Potential impact on IBM i system performance
  • Potential security risks

Tools and techniques:

  • Java / JDBC
  • DRDA / DDM (for DB2 target database only)

When to use

  • Avoid unless absolutely necessary due to specific project requirements, use IBM i API consumer instead

IBM i API Consumer


When working on IBM i transformational initiatives, there’s a lot of talk about unlocking IBM i data and processes: the IBM i provides APIs for easy consumption by other systems, either directly or via an integration fabric such as MuleSoft Anypoint. This focus is typical for large but relatively self-contained, stable, and slow-changing IBM i systems that are being plugged into growing application networks.

In contrast, many evolving IBM i applications have a smaller scope and often need to consume external data sources or business logic exposed via APIs. As with other integration patterns, there are a number of technical approaches, tools, and tips. The key considerations are ease of use, scalability, governance, and operational insight.

The separation of duties between the traditional IBM i languages and integration-specific languages directly impacts the speed of development and ongoing refactoring. It’s possible to build and parse XML and JSON in RPG, and there are several frameworks and tools that support templates, etc., but it requires specific skills and often slows down integration initiatives. My strong preference is to let RPG handle the business transformations, database tables, and program calls, and to implement all data transformations, security, etc. with the help of mapping tools such as MuleSoft DataWeave. It makes a huge difference in the speed and quality of integration development.
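As a sketch of the consumer direction, the Python fragment below flattens a nested external API response into the kind of fixed-width record an RPG program can consume. The API response shape, field names, and record layout are all hypothetical; the flattening would normally live in the mapping layer, not in RPG.

```python
import json

# Hypothetical response from an external REST API, already fetched by
# the integration layer; RPG receives a flat, fixed-width record instead.
api_response = json.loads("""{
  "order": {"id": "SO-1001", "currency": "USD", "total": 149.95,
            "customer": {"name": "ACME Corp"}}
}""")

# Layout of the flat record the RPG program expects (illustrative).
LAYOUT = [("orderid", 10), ("currency", 3), ("total", 11), ("custname", 25)]

def flatten(resp: dict) -> dict:
    """Pull the fields the RPG program cares about out of the nested JSON."""
    order = resp["order"]
    return {"orderid": order["id"], "currency": order["currency"],
            "total": f"{order['total']:.2f}",
            "custname": order["customer"]["name"]}

def to_record(values: dict) -> str:
    """Serialize the flattened fields to the fixed-width record layout."""
    return "".join(str(values[name]).ljust(width)[:width]
                   for name, width in LAYOUT)

record = to_record(flatten(api_response))
```

Keeping this flattening outside RPG means the external API can restructure its JSON without touching the RPG code, as long as the flat record layout stays the same.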

Pros

  • Avoids tight coupling of the IBM i process to the remote data model and program parameters
  • Plays well in point-to-point, hub and spoke, application network, and other integration topologies
  • Enables security and governance with an API management tool

Cons

  • Extra effort and cost to implement and operate an API layer
  • Performance considerations for very high data volumes

Tools

  • Mulesoft Anypoint + IBM i (AS/400) connector + Web Transaction Framework
  • IBM i Integrated Web Services Client for ILE
  • Custom Java / JAX-RS or JAX-WS

API lifecycle suites

  • Mulesoft Anypoint
  • IBM API Connect
  • CA API Management
  • Apigee (Google)
  • Microsoft Azure API Management

When to use

  • Preferred option except for processing very large data sets

Bulk IBM i Data Processing


Some applications, such as Billing and A/R, are batch-oriented and deal with large data volumes. When such a process needs to access external data or logic, or to share the results of a batch run, calling an API for each record can often slow the process down to hours or even days. Instead, such bulk processes can be implemented via good old file transfer or bulk ETL. Back in the ’90s and early 2000s we built quite a few such integrations based on FTP scripts, and many of these interfaces, for better or worse, are still in use.

Specialized ETL tools streamline application mapping and transformation rules, address encryption of data in flight, and in some cases greatly improve performance by splitting the processing into chunks and running the chunks in parallel threads.
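The chunk-and-parallelize idea can be sketched in a few lines of Python; the per-record transform here is a made-up stand-in for a real ETL mapping, and real tools add restartability, checkpointing, and error handling on top.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-record transformation standing in for real ETL mapping.
def transform(record: dict) -> dict:
    return {**record, "amount_cents": round(record["amount"] * 100)}

def process_bulk(records, chunk_size=1000, workers=4):
    """Split the data set into chunks and process them in parallel threads."""
    chunks = [records[i:i + chunk_size]
              for i in range(0, len(records), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        processed = pool.map(lambda chunk: [transform(r) for r in chunk],
                             chunks)
    return [row for chunk in processed for row in chunk]

data = [{"id": i, "amount": i * 1.5} for i in range(10_000)]
result = process_bulk(data)
```

Processing per chunk rather than per record is what avoids the per-call overhead that makes record-at-a-time API calls so slow at this scale.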

Pros

  • Support for large data volumes
  • Specialized mapping tools
  • Encryption and security

Cons

  • Point to point
  • Learning curve
  • Not well suited for time-sensitive transactions

Tools

  • Mulesoft Anypoint Batch
  • Informatica
  • IBM InfoSphere DataStage
  • Linoma GoAnywhere

When to use

  • Large volumes of data are processed periodically (as opposed to streaming)

Summary

IBM i is a modern platform that offers a number of built-in integration options. Since its introduction back in the ’80s, the platform has attracted a large ecosystem of vendors, further expanding the integration choices. Selecting the right tool and approach often reminds me of wandering through an outdoor equipment store: there’s a lot of hype and an overwhelming number of vendors trying to differentiate their products. I like REI because their sales folks get to use their gear, can explain different options based on their experience, and don’t hesitate to suggest sticking to good old battle-tested stuff that still works. I hope this post helps you navigate the multitude of IBM i integration choices. Feel free to contact me if you have any questions, comments, or suggestions.
