We’re excited to share that Infoview Systems has joined the Connect with Confluent technology partner program. This program helps businesses accelerate the development of real-time applications through fully managed integrations with Confluent Cloud. Our customers now have the best experience for working with data streams right within infoConnect, paving a faster path to powering next-generation customer experiences and business operations with real-time data.
Expanding the data streaming ecosystem together with Confluent
Connect with Confluent brings fully managed data streams directly to organizations through a single integration to the cloud-native and complete data streaming platform, Confluent Cloud. It’s now easier than ever for organizations to stream any data to or from infoConnect with a fully managed Apache Kafka® service that spans hybrid, multi-cloud, and on-premises environments. In addition, the program supercharges partners’ go-to-market efforts with access to Confluent engineering, sales, and marketing resources. This ensures customer success at every stage from onboarding through technical support.
Build real-time applications with infoConnect
Built exclusively for IBM i systems and composed of three connectors with source and sink capabilities, Infoview’s infoConnect can be used with open-source Kafka frameworks and Confluent Cloud to exchange real-time events between IBM i-based systems and Confluent. Using infoConnect does not require a strong background in IBM i development. It works seamlessly with both RPG and COBOL, supports data replication with DB2, and integrates with both homegrown and commercial IBM i applications.
For teams looking to unlock real-time data for analytics, keep systems in sync, and lay the foundation for AI enablement, infoConnect serves as an easy-to-implement, standalone component that accelerates IBM i modernization.
- Does not require any additional infrastructure to leverage
- Easy to implement and fully supported by our expert team
- Fair pricing and no restrictions on transaction volumes
- Real-Time Replication: Works natively with InfoCDC and delivers data change events from InfoCDC Replication Queues to Kafka topics and on to the targets, with no custom IBM i or Confluent development.
- Execute Business Logic: Execute IBM i business logic (via a program call or by automating green-screen user actions) based on a message in a Kafka topic.
- Send Messages: Send messages to IBM i Data Queues or stream messages from Data Queues to Kafka topics.
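To make the message-driven pattern above concrete, here is a minimal sketch of an application publishing a request to a Kafka topic that a Data Queue sink connector could then deliver to an IBM i Data Queue. The topic name, payload fields, and credentials are illustrative placeholders, not infoConnect-specific settings.

```python
# A minimal sketch, assuming Confluent Cloud credentials and a hypothetical
# topic ("ibmi.dataqueue.inbound") that a Data Queue sink connector watches.
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

# Hypothetical payload: a request the IBM i side would pick up from its Data Queue.
event = {"action": "PRICE_CHECK", "item": "A1001", "requested_by": "web-store"}

producer.produce("ibmi.dataqueue.inbound", value=json.dumps(event).encode("utf-8"))
producer.flush()
```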
Confluent Connector: How to Configure & Sample Use Case
Sample Use Case: A manufacturer and B2B-focused distributor runs a legacy ERP on the IBM i (formerly AS/400, iSeries) platform. The ERP works as expected and has been adapted over time to very specific business needs. Now they want to expand their reach and open a Direct-to-Consumer (D2C) channel.
One often overlooked consideration is that D2C transaction volumes are typically higher than the B2B levels the platforms and processes are currently optimized for. Among other things, the order status check is performed in several channel applications (mobile, web, and partner marketplaces) and starts overwhelming the ERP/WMS systems during peak load periods. Scaling up IBM i infrastructure is an expensive proposition, and it still doesn’t protect against spikes in demand such as Black Friday.
The problem is not new. One way to attack it is to maintain a real-time replica of the most often used data (in this example, order information). Then, redirect the order status check API to use that replica instead of hitting the primary back-end ERP.
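As an illustration of this pattern, the order status API can be pointed at the replica with an ordinary database lookup. The sketch below assumes a hypothetical MySQL replica schema (an orders_replica database with an orders_header table and order_no, status, and last_updated columns); the actual layout is whatever the sink connector writes.

```python
# A minimal sketch of the order-status lookup served from the MySQL replica
# instead of the IBM i ERP. Connection details, table, and column names are
# hypothetical placeholders.
import mysql.connector

def get_order_status(order_no):
    conn = mysql.connector.connect(
        host="replica-mysql", user="app", password="***", database="orders_replica"
    )
    try:
        cur = conn.cursor(dictionary=True)
        cur.execute(
            "SELECT order_no, status, last_updated "
            "FROM orders_header WHERE order_no = %s",
            (order_no,),
        )
        return cur.fetchone()  # None if the order has not been replicated yet
    finally:
        conn.close()
```

Because the CDC pipeline keeps the replica current, the IBM i ERP is no longer in the request path for status checks.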
Our infoCDC and Kafka Connect solution offers a lightweight, streamlined, and economical product stack, which is a good fit when the number of tables to replicate is not very high and there’s a benefit in filtering just relevant tables, columns, and rows.
The system will automatically load the table structure and keys, validate that the table is properly journaled, and prompt the user to select the filtering and replication rules:
After the table is defined and saved, the product will auto-generate all required objects, including the replication data queue, replication message format table, etc.
In this case, the order consists of the order header and details, so we will need to define both tables in the product.
Next, we need to start the replication flow for the associated journal in the second menu option.
IBM i Data Queue Source Connector Configuration: Follow the configuration steps to visually configure the Data Queue listener.
The connector will read a message from the IBM i Data Queue, perform type translation per the format table, transform it into JSON or Avro format, and place the resulting message on a Kafka topic.
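For a quick look at what lands on the topic, the sketch below consumes one of these change events with the Confluent Python client. The topic name and field names are illustrative; the actual payload layout is defined by the infoCDC replication message format table.

```python
# A minimal sketch: read one change event from a hypothetical topic fed by the
# Data Queue source connector and print a couple of illustrative fields.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
    "group.id": "order-replica-debug",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["ibmi.orders.header"])  # hypothetical topic name

msg = consumer.poll(10.0)
if msg is not None and msg.error() is None:
    change = json.loads(msg.value())
    print(change.get("ORDER_NO"), change.get("STATUS"))  # illustrative columns
consumer.close()
```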
MySQL JDBC Sink Configuration: We will use a simple JDBC Sink connector that inserts or updates a record in the target MySQL table for each new message the Data Queue source delivers to the Kafka topic configured above.
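For reference, the sketch below registers such a JDBC sink on a self-managed Kafka Connect worker through its REST API. The topic, table, connection details, and key column are hypothetical, and the configuration assumes the worker’s converters supply schema information (for example, Avro with Schema Registry), which the JDBC sink needs for upserts; a fully managed Confluent Cloud MySQL sink would instead be configured in the UI with its own property names.

```python
# A minimal sketch: register a JDBC sink via the Kafka Connect REST API.
# Connection details, topic, and key column are placeholders.
import requests

sink_config = {
    "name": "orders-header-mysql-sink",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "connection.url": "jdbc:mysql://replica-mysql:3306/orders_replica",
        "connection.user": "app",
        "connection.password": "***",
        "topics": "ibmi.orders.header",
        "insert.mode": "upsert",       # insert new orders, update existing ones
        "pk.mode": "record_value",
        "pk.fields": "ORDER_NO",       # hypothetical key column
        "auto.create": "true",
    },
}

resp = requests.post("http://connect:8083/connectors", json=sink_config)
resp.raise_for_status()
```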
We will provide a similar configuration for Order Details, which will have its own Kafka topic, source, and target connector configurations.
Once the Kafka cluster is up and running with the related topics, connectors, and brokers, we are ready to test our replication solution.
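Before creating test data, it is worth confirming that both connectors report a RUNNING state. On a self-managed Connect worker this can be checked through the standard status endpoint (the connector name below is the hypothetical one used in the earlier sketch); in Confluent Cloud the same status is shown in the connector UI.

```python
# Sanity check: confirm the connector and its tasks are RUNNING before testing.
# The Connect URL and connector name are placeholders from the earlier sketch.
import requests

status = requests.get(
    "http://connect:8083/connectors/orders-header-mysql-sink/status"
).json()

print(status["connector"]["state"])                   # expect "RUNNING"
print([t["state"] for t in status.get("tasks", [])])  # each task should be "RUNNING"
```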
In our “ERP” we will create a new order:
We did not have to write a single line of code for our controlled use case. The infoCDC product, combined with the Confluent platform and the infoConnect for Kafka connectors, provides a simple no-code solution for replicating data from IBM i to other databases in real time.
Getting Started
Getting started with infoConnect and Confluent Cloud is simple. You can begin right away with a 30-day free trial of infoConnect, supported by allocated consultants, complete knowledge transfer, Proof of Concept (POC) assistance, and a bench of IBM i resources for platform implementation, staff augmentation, and more.
- Learn more → Explore our blog articles and case studies to dive deeper into use cases and best practices.
- Find support → Explore our additional IBM i services for businesses of all sizes.
- Talk to our team → Contact us for a personalized walkthrough and to discuss how we can accelerate your modernization initiatives.
Through our expert team and partnership with Confluent, you’ll have the confidence of a fully supported, enterprise-grade solution that unlocks real-time data streaming – without the hassle.
Not yet a Confluent customer? Start your free trial of Confluent Cloud today. New signups receive $400 to spend during their first 30 days – no credit card required.