
infoConnect for Kafka

Composed of three connectors with source and sink capabilities, Infoview's infoConnect suite works with both open-source Apache Kafka and Confluent Enterprise to exchange events between IBM i and Kafka.

[Diagram: infoConnect for Kafka – real-time bi-directional integrations]
Data Replication

Real-time bi-directional integrations

  • Enable event-based use cases with homegrown or commercial IBM i applications
  • Exclusively for IBM i (AS/400, iSeries) based systems
  • Does not require IBM i expertise to leverage
  • Supports data replication functionality with DB2
Source and Sink Operations

Works with RPG and COBOL Languages

  • Data queue source: Reads data from an IBM i data queue and publishes it to a Kafka topic
  • Data queue sink: Polls data from a Kafka topic and publishes it to an IBM i data queue
  • Program call sink: Writes to an IBM i data queue, calls IBM i programs, and maps program call parameters
[Diagram: Kafka source and sink operations]
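For orientation, the sketch below shows how one of the source connectors above might be registered with a Kafka Connect worker through its standard REST API. This is a minimal illustration only: the connector class name and the IBM i property keys are assumptions, not the published configuration reference, so consult the infoConnect User Guide for the exact settings.

# Minimal sketch (assumptions noted above): register a data queue source
# connector with a Kafka Connect worker via its REST API.
import requests

CONNECT_URL = "http://localhost:8083/connectors"  # Kafka Connect worker REST endpoint

connector = {
    "name": "ibmi-dataqueue-source",
    "config": {
        # Hypothetical class name -- check the installed plug-in for the real one.
        "connector.class": "com.infoviewsystems.kafka.connect.dq.SourceConnector",
        "tasks.max": "1",
        # Hypothetical IBM i connection and data queue settings.
        "ibmi.host": "myibmi.example.com",
        "ibmi.user": "KAFKAUSR",
        "ibmi.password": "********",
        "ibmi.dataqueue": "/QSYS.LIB/MYLIB.LIB/ORDERS.DTAQ",
        # Kafka topic that receives the data queue entries.
        "topic": "ibmi.orders",
    },
}

resp = requests.post(CONNECT_URL, json=connector)
resp.raise_for_status()
print("Created connector:", resp.json()["name"])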

Flexible Proof of Concept Models

For prospects who wish to evaluate the connector entirely in-house, designing and implementing the Confluent/Kafka components and IBM i configuration for their specific use case, a 30-day trial license is supplied along with installation documentation. The Infoview team is available to answer questions and assist with configuration or troubleshooting on request.
Alternatively, a consultant can be allocated for a short period (around 40 hours) to review the POC scope, address questions, provide feedback, and participate in the development and testing of the Confluent/Kafka and IBM i components. For this approach, the scope would be small, covering 1-2 simplified scenarios working end to end in a non-production environment.
Finally, after receiving the required information, the Infoview product team can recreate a use case within our own sandbox environment and demonstrate it once complete. This approach is aimed at parties who are interested in infoConnect for Kafka but may not have the time or in-house resources to run the POC process themselves.

Infoview Ownership and Implementation

Our team would implement the desired use cases in full alignment with your team's Kafka/Confluent and IBM i development guidelines and best practices, covering integration architecture review, IBM i integration adoption, IBM i development and configuration, and go-live support.

The end result would be working integrations deployed to production, along with knowledge-transfer sessions for the client team members who will take ownership of the implemented components and integrations. The benefits of this approach are a quick implementation and a solid launch point for future IBM i integration projects. This engagement typically lasts 2-3 months and requires a formal SOW, depending on complexity and expectations. Additionally, a commercial connector license is required by the time the project goes live.

Common Integration and Streaming Use Cases

  • Extract data from a data queue periodically
  • Real-time IBM i data replication pipeline – works natively with InfoCDC and delivers data change events from InfoCDC replication queues to Kafka topics and on to the targets, with no custom IBM i or Kafka development (a consumer sketch follows this list)
  • External database
  • Execute IBM i business logic (via program call or by automating the green-screen user action) based on a message in a Kafka topic
  • AS400 Data Queue Sink Connector
  • Send messages to an IBM i data queue, or stream messages from a data queue to Kafka topics
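As referenced in the replication use case above, a target application on the Kafka side can consume the InfoCDC change events with a plain Kafka consumer. The sketch below is a minimal illustration assuming the confluent-kafka Python client, a hypothetical topic name (ibmi.cdc.customer), and a JSON event payload; the actual topic naming and event schema depend on your InfoCDC and connector configuration.

import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "ibmi-replication-target",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["ibmi.cdc.customer"])  # hypothetical replication topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print("Consumer error:", msg.error())
            continue
        # Each record is one change event delivered from the InfoCDC
        # replication queue by the source connector; apply it to the target.
        event = json.loads(msg.value())
        print("Change event:", event)
finally:
    consumer.close()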

Frequently Asked Questions

Functionality, compatibility, and pricing

  • How does the connector work?

    The connector communicates with IBM i via low-level socket-based connections facilitated by the IBM Toolbox for Java library. It runs as a plug-in within an on-premise, cloud, or Confluent-hosted Kafka Connect runtime.
  • What was the intent behind the product's creation?

    We wanted to provide Kafka development teams with an easy way to connect their IBM i based systems to Kafka without any special knowledge of IBM i, and without a need to implement and operate another piece of middleware software.
  • How is the product priced?

    Pricing is based on the number of physical IBM i servers leveraged regardless of server type (production, non-production, DR). For additional pricing and subscription information, please reach out to our team.
  • Is the connector compatible with commercial off-the-shelf IBM i backend systems?

    The connector can be used to execute business logic or exchange messages with IBM i based commercial applications.
  • Is the connector compatible with the Confluent suite?

    The Kafka connector suite can indeed be leveraged on both Confluent and Apache Kafka stacks; the only difference is that the implementation and configuration steps differ slightly for the Confluent suite.
  • How does infoConnect compare to JDBC connectors?

    When leveraging a JDBC connector, changes cannot be captured in real time, leading to an inevitable time lag. infoConnect for Kafka enables real-time, bidirectional integration directly between Kafka and IBM i without any additional application servers or code generation. Furthermore, some database models are complex and additional rules must be applied; in those cases, our product's ability to call back IBM i programs instead of replicating business logic in the integration layer is a feature customers have found useful (a short producer sketch follows this FAQ list).
  • What security comes standard with the connector?

    The connector supports SSL, SASL, and ACL as part of the certification process. Each iteration of the connector undergoes security review by the Confluent team.
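
As noted in the JDBC comparison above, the sink connectors are driven entirely from Kafka: a producer publishes a message to the topic the sink watches, and the connector maps it to a data queue entry or program call parameters on the IBM i. The sketch below is a minimal, hypothetical example using the confluent-kafka Python client; the topic name and payload fields are assumptions, and the real message format is defined by your connector configuration.

import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

# Hypothetical payload; the sink connector maps these fields to program
# call parameters or a data queue entry on the IBM i.
order = {"orderId": "A1001", "customer": "ACME", "quantity": 25}
producer.produce("ibmi.order.requests", key=order["orderId"], value=json.dumps(order))
producer.flush()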

infoConnect for Apache Kafka Demonstration

WATCH HOW IT WORKS

Connector suite configuration and Salesforce demonstration

Subscription Models, Delivery, and Support

Product pricing is based on the desired term of agreement (one-year or three-year) and is not restricted by the number of IBM i servers, server types (production, development, DR), LPARs, or middleware environments used with the product license. We also offer cost-free evaluation licenses as well as proof-of-concept assistance.

Available for download on the Confluent Hub. Requires a license available directly through Infoview Systems.

  • Standard: During the entirety of the subscription term, support covers product deployment, error/bug resolution, best-practice advice, and subsequent product releases.
  • Priority: Expedited incident resolution, bug fixes, and small enhancements. Unused support hours roll over to the following month.
  • Priority 24×7: A support engineer is available for incident resolution during standard business hours, with on-call rotations for all nights/weekends/holidays. Unused support hours roll over to the following month.
  • Dedicated 24×7: A dedicated support engineer is online and ready to jump in at any point, day or night. Time outside incident resolution can be used for additional project work.
The Infoview team is also available for product implementation, POC guidance, and IBM i and Kafka system integration.

Learn more about Infoview’s Kafka Connector Suite

  • Connector Overview
  • Confluent Cloud Hub Listing
  • Kafka User Guide
  • End User License Agreement
  • Launch Announcement
  • Product Support & Maintenance Terms

