Connecting z/OS applications to Kafka via MQ

Apache Kafka is the de facto standard open-source event streaming platform. In event-driven architectures, applications publish events when data changes, allowing other systems to react in real time rather than polling for updates.

An example is a CRM application that serves as the system of record for customer data. When a customer’s address changes, instead of having every application repeatedly query the CRM for current address data, the CRM can publish an ‘address-update’ event. Interested applications subscribe to these events and maintain their own current copy of the data.

Example of a use case for a Kafka broker

Kafka provides native programming interfaces for Java, Python, and Scala. This article demonstrates how traditional z/OS applications can participate in Kafka-based event streaming using IBM MQ and Kafka Connect.

Native Kafka programming interfaces and Kafka Connect

Applications can interact directly with Kafka through native programming interfaces. Kafka, being Java-based, naturally supports Java applications. Other languages with native Kafka support include Python and Scala. IBM recently introduced a Kafka SDK for COBOL on z/OS, though I will not explore that approach here.

Kafka Connect bridges the gap for applications without native Kafka support. This open-source component sits between Kafka and other middleware technologies like databases and messaging systems, translating between their protocols and Kafka’s event streaming format.

Solution Architecture

Our solution enables z/OS applications to produce and consume Kafka events through IBM MQ, leveraging the well-established asynchronous messaging patterns familiar to mainframe developers.

Key Benefits:

  • Uses proven MQ messaging patterns
  • Works with both CICS online and batch applications
  • Supports any z/OS programming language that can create MQ messages (COBOL, PL/I, Java, Python, Node.js, Go)
  • No application code changes required beyond message formatting

Architecture Overview

The solution uses Kafka Connect as a bridge between MQ queues and Kafka topics.

Architecture overview of a z/OS application producing and consuming Kafka events through an MQ interface

For Event Production:

  • z/OS applications send messages to dedicated MQ queues
  • Kafka Connect reads from these queues
  • Messages are published to corresponding Kafka topics
  • Kafka broker makes events available to subscribers
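
As an illustration, here is a minimal sketch of the producing side in Java, using the IBM MQ classes for JMS over a local bindings connection. The queue manager name, queue name, and payload are assumptions chosen for the address-update example; a COBOL or PL/I program would issue the equivalent MQPUT.

```java
// Sketch: a z/OS Java application publishing an event by putting a message
// on a dedicated MQ queue that Kafka Connect forwards to a Kafka topic.
// Queue manager and queue names are illustrative; the IBM MQ classes for JMS
// must be on the classpath.
import javax.jms.JMSContext;
import javax.jms.Queue;
import com.ibm.msg.client.jms.JmsConnectionFactory;
import com.ibm.msg.client.jms.JmsFactoryFactory;
import com.ibm.msg.client.wmq.WMQConstants;

public class AddressEventProducer {
    public static void main(String[] args) throws Exception {
        JmsFactoryFactory ff = JmsFactoryFactory.getInstance(WMQConstants.WMQ_PROVIDER);
        JmsConnectionFactory cf = ff.createConnectionFactory();

        // Local bindings connection to a queue manager on the same LPAR (names are assumptions)
        cf.setStringProperty(WMQConstants.WMQ_QUEUE_MANAGER, "MQP1");
        cf.setIntProperty(WMQConstants.WMQ_CONNECTION_MODE, WMQConstants.WMQ_CM_BINDINGS);

        String event = "{\"customerId\":\"C0042\",\"event\":\"address-update\",\"city\":\"Oslo\"}";

        try (JMSContext ctx = cf.createContext()) {
            // Dedicated queue that maps one-to-one to the Kafka topic
            Queue queue = ctx.createQueue("queue:///CUSTOMER.ADDRESS.EVT");
            ctx.createProducer().send(queue, event);
        }
    }
}
```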

For Event Consumption:

  • Kafka Connect subscribes to Kafka topics
  • Incoming events are placed on corresponding MQ queues
  • z/OS applications read from queues for business processing
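
The consuming side is symmetric: the application simply gets messages from its dedicated queue. Below is a hedged Java/JMS sketch of that flow; again, the queue manager and queue names are illustrative.

```java
// Sketch: a z/OS Java application reading Kafka events that Kafka Connect
// has delivered to an MQ queue. Names are illustrative.
import javax.jms.JMSContext;
import javax.jms.Queue;
import com.ibm.msg.client.jms.JmsConnectionFactory;
import com.ibm.msg.client.jms.JmsFactoryFactory;
import com.ibm.msg.client.wmq.WMQConstants;

public class AddressEventConsumer {
    public static void main(String[] args) throws Exception {
        JmsFactoryFactory ff = JmsFactoryFactory.getInstance(WMQConstants.WMQ_PROVIDER);
        JmsConnectionFactory cf = ff.createConnectionFactory();
        cf.setStringProperty(WMQConstants.WMQ_QUEUE_MANAGER, "MQP1");
        cf.setIntProperty(WMQConstants.WMQ_CONNECTION_MODE, WMQConstants.WMQ_CM_BINDINGS);

        try (JMSContext ctx = cf.createContext()) {
            Queue queue = ctx.createQueue("queue:///CUSTOMER.ADDRESS.IN");
            // Wait up to 5 seconds for the next event delivered by Kafka Connect
            String payload = ctx.createConsumer(queue).receiveBody(String.class, 5000);
            if (payload != null) {
                System.out.println("Received event: " + payload);
            }
        }
    }
}
```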

Queue-to-Topic Mapping

Each Kafka topic has a dedicated MQ queue. This one-to-one mapping simplifies configuration and makes the data flow transparent for both operations and development teams.

Software Components

Kafka Connect runs as a started task on z/OS. Multiple instances can serve the same workload by sharing startup parameters, providing scalability and high availability.

Kafka Connect includes a REST API for:

  • Configuring connectors for your applications
  • Monitoring connector status
  • Integrating with provisioning and deployment processes
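
To make this concrete, the following sketch registers an MQ source connector and then checks its status through the Kafka Connect REST API, using Java's built-in HTTP client. Note how the configuration also expresses the one-to-one queue-to-topic mapping described above. The host, port, connector name, queue and topic names are assumptions, and the property keys follow the open-source kafka-connect-mq-source connector, so verify them against the connector version you actually deploy.

```java
// Hedged sketch: configuring and monitoring an MQ source connector via the
// Kafka Connect REST API. Host, names, and config values are assumptions.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterMqSourceConnector {
    public static void main(String[] args) throws Exception {
        String config = """
            {
              "name": "customer-address-source",
              "config": {
                "connector.class": "com.ibm.eventstreams.connect.mqsource.MQSourceConnector",
                "mq.queue.manager": "MQP1",
                "mq.connection.mode": "bindings",
                "mq.queue": "CUSTOMER.ADDRESS.EVT",
                "topic": "customer-address",
                "mq.message.body.jms": "true",
                "value.converter": "org.apache.kafka.connect.storage.StringConverter"
              }
            }
            """;

        HttpClient client = HttpClient.newHttpClient();

        // Create the connector definition on the Kafka Connect worker
        HttpRequest create = HttpRequest.newBuilder()
                .uri(URI.create("http://connect.example.com:8083/connectors"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(config))
                .build();
        System.out.println(client.send(create, HttpResponse.BodyHandlers.ofString()).body());

        // Query the connector and task status
        HttpRequest status = HttpRequest.newBuilder()
                .uri(URI.create("http://connect.example.com:8083/connectors/customer-address-source/status"))
                .GET()
                .build();
        System.out.println(client.send(status, HttpResponse.BodyHandlers.ofString()).body());
    }
}
```

The consumption path is configured the same way with a matching sink connector definition that maps a topic back to a queue.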

Production Configuration

In a production environment, multiple Kafka Connect instances run across different LPARs for high availability. Each instance accesses application queues through MQ local binding connections. MQ queue sharing groups distribute workload across LPARs, ensuring both performance and resilience.

Kafka MQ z/OS solution details

The infrastructure setup supports:

  • Load balancing across multiple z/OS instances
  • Fault tolerance through redundant components
  • Efficient local MQ connections

Summary

This article describes an architecture that provides a clean, straightforward path for z/OS applications to participate in event-driven systems using Apache Kafka. By leveraging existing MQ messaging patterns and Kafka Connect middleware, traditional mainframe applications can integrate with modern streaming platforms without requiring extensive code changes or new programming paradigms.
The solution maintains the reliability and performance characteristics that z/OS environments demand while opening doors to real-time data integration and event-driven architectures.

On the REST API provided by IBM MQ

A few notes on the possibilities of the MQ REST API.

With the MQ messaging REST API you can put and get messages on an MQ queue over HTTP. This capability only supports text messages: you get the payload back as a string, not as a “neat” JSON structure.

This is explained in Using the messaging REST API – IBM Documentation.
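
For example, a put and a destructive get could look like the Java sketch below. The host, port, credentials, queue manager, and queue names are made up, and the version segment in the URL path depends on your MQ release; check the IBM documentation referenced above for the exact path and headers on your installation.

```java
// Hedged sketch: putting and getting a text message through the MQ messaging
// REST API with Java's built-in HTTP client. All names and the URL path are
// assumptions to be adjusted to your installation.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class MqMessagingRestExample {
    public static void main(String[] args) throws Exception {
        String base = "https://mqweb.example.com:9443/ibmmq/rest/v2/messaging"
                + "/qmgr/MQP1/queue/CUSTOMER.ADDRESS.EVT/message";
        String auth = "Basic " + Base64.getEncoder().encodeToString("mqadmin:password".getBytes());

        HttpClient client = HttpClient.newHttpClient();

        // Put a text message on the queue (HTTP POST)
        HttpRequest put = HttpRequest.newBuilder()
                .uri(URI.create(base))
                .header("Authorization", auth)
                .header("ibm-mq-rest-csrf-token", "anyvalue") // header must be present
                .header("Content-Type", "text/plain;charset=utf-8")
                .POST(HttpRequest.BodyPublishers.ofString("{\"customerId\":\"C0042\",\"city\":\"Oslo\"}"))
                .build();
        System.out.println("PUT status: " + client.send(put, HttpResponse.BodyHandlers.ofString()).statusCode());

        // Destructively get the next message from the queue (HTTP DELETE)
        HttpRequest get = HttpRequest.newBuilder()
                .uri(URI.create(base))
                .header("Authorization", auth)
                .header("ibm-mq-rest-csrf-token", "anyvalue")
                .DELETE()
                .build();
        // The payload comes back as a plain string in the response body
        System.out.println(client.send(get, HttpResponse.BodyHandlers.ofString()).body());
    }
}
```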

If you want a “neat” JSON API, with the text payload mapped to a proper JSON structure, you should use z/OS Connect.

Matt Leming from IBM explains this very clearly in the presentation REST APIs and MQ (slideshare.net).

Note that the z/OS Connect option also requires the MQ REST API infrastructure to talk to MQ.