Java Message Service (JMS) API

Java Message Service (JMS) is an API that provides the facility to create, send, and read messages from one application to another. Cavisson supports JMS APIs (IBMMQ, Kafka, and Tibco) for producer and consumer scripts through the NetStorm Script Manager. The configuration set via the JMS API window is reflected in the flow file of the script.

Adding a JMS API

There are two ways to add a JMS API in script:

  1. Via ‘File’ menu
  2. Via ‘Flow’ file

Via File Menu

To add a JMS API via File menu, go to File > Insert > JMS API.

Via ‘Flow’ File

To add a JMS API via Flow file, right-click on a flow file of script, go to Insert > JMS API.

Both ways of adding a JMS API display the Add JMS window.

Here, the user can add IBMMQ, Kafka, and Tibco APIs by selecting the required one from the JMS type drop-down list. Each JMS type is further categorized into Producer and Consumer type scripts. Producers and consumers are virtual users that produce or consume messages using a queue.

IBMMQ

On selecting this JMS API, the user can create an IBMMQ Producer / Consumer script.

The user needs to provide the following inputs:

  • Producer: It is used to send the message. Select this option to create a ‘Producer’ type script.
  • Consumer: It is used to receive messages from a topic or a queue. Select this option to create a ‘Consumer’ type script.
  • Pool Size: It is the maximum number of connections. The user can specify a maximum of 65535 connections.
  • Message: It is the message that a user needs to provide while creating a ‘Producer’ type script.

The user needs to provide the following inputs within JMS Configuration:

  • Server IP/ Host Name: It is the IBM MQ Server IP / Host name. It is a mandatory field.
  • Port: It is the port where IBM MQ is listening. It is a mandatory field.
  • Queue Manager: It manages the resources associated with a queue that it owns. It is a mandatory field.
  • Channel: It is the element used to transfer messages between queues. It is a mandatory field.
  • Queue: It holds the message until the receiver is ready. It is a mandatory field.
  • User ID / Password: It is the User ID and password for IBM MQ by which the user can have the monitoring access. It is an optional field, depending on the server.
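As a rough illustration, the fields collected by this dialog can be thought of as a set of key/value pairs. The key names and sample values below are hypothetical (they are not IBM MQ client property names), though 1414 is the conventional IBM MQ listener port and the `DEV.*` names follow IBM's developer defaults:

```java
import java.util.Properties;

// Illustrative sketch of the IBM MQ JMS Configuration fields as plain
// key/value pairs. Key names and values are examples only.
public class IbmMqConfig {
    public static Properties build() {
        Properties p = new Properties();
        p.setProperty("host", "10.10.30.1");          // Server IP / Host Name (mandatory)
        p.setProperty("port", "1414");                // Port (mandatory; 1414 is the IBM MQ default)
        p.setProperty("queueManager", "QM1");         // Queue Manager (mandatory)
        p.setProperty("channel", "DEV.APP.SVRCONN");  // Channel (mandatory)
        p.setProperty("queue", "DEV.QUEUE.1");        // Queue (mandatory)
        // User ID / Password omitted: optional, depending on the server
        return p;
    }

    public static void main(String[] args) {
        System.out.println(build());
    }
}
```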

Note: After the required configurations, click the Test Connection button to test the connection on the server.

Kafka

On selecting this JMS API, you can create a Kafka Producer / Consumer script for the C and Java type script files.

Note: You can add Kafka Avro only for Java type scripts.

Provide the following information:

  • Producer: It is used to create objects that are required to send the message. Select this option to create a ‘Producer’ type script.
  • Consumer: It is used to receive messages from a topic or a queue. Select this option to create a ‘Consumer’ type script.
  • Avro: It is used to create an Avro type script.
  • Pool Size: It is the maximum number of connections. The user can specify a maximum of 65535 connections.
  • Message: It is the message that a user needs to provide while creating a ‘Producer’ type script.

The user needs to provide the following inputs within JMS Configuration:

  • Server IP/ Host Name: It is the Kafka Server IP / Host name. It is a mandatory field.
  • Port: It is the port where Kafka is listening. It is a mandatory field.
  • Topic: It is used as message-oriented middleware that is responsible for holding and delivering messages. It is a mandatory field.
  • Consumer Group: It enables multi-threaded or multi-machine consumption from Kafka topics, giving Kafka the flexibility to combine the advantages of both the message queuing and publish-subscribe models. A consumer group has a unique ID, and each consumer group is a subscriber to one or more Kafka topics. It is a mandatory field on selecting the ‘Consumer’ option.
  • User ID / Password: It is the User ID and password for Kafka by which the user can have the monitoring access. It is an optional field, depending on the server.
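These dialog fields map naturally onto standard Kafka client configuration. As a sketch (the `bootstrap.servers` and `group.id` keys are genuine Kafka client configuration keys; the host, port, and group values are examples):

```java
import java.util.Properties;

// Sketch of how the JMS Configuration fields for Kafka map onto
// standard Kafka client configuration keys.
public class KafkaJmsConfig {
    public static Properties consumerProps() {
        Properties p = new Properties();
        // Server IP/Host Name + Port combine into bootstrap.servers
        p.setProperty("bootstrap.servers", "10.10.30.2:9092");
        // Consumer Group (mandatory only for 'Consumer' type scripts)
        p.setProperty("group.id", "netstorm-consumers");
        return p;
    }

    public static void main(String[] args) {
        System.out.println(consumerProps());
    }
}
```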

Java Script Files

You can enable the Avro option for the Java type script files. Selecting Avro adds a section for specifying the Avro configuration, as shown in the following figure:

Adding a Generic Record Type in Avro

To add a generic record type:

  1. Select the Generic record type in the Avro Config section.
  2. Specify the schema registry URL.
  3. Click OK. The schema file is added in the flow.java file.
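For a generic record, the only Avro-specific input is the schema registry URL. In Confluent-style Avro serializer configuration this is passed under the `schema.registry.url` key (a real Confluent serde key); the URL below is an example, with 8081 being the conventional Schema Registry port:

```java
import java.util.Properties;

// Sketch of generic-record Avro configuration: only the registry URL
// is required. The URL value is an example.
public class AvroGenericConfig {
    public static Properties build(String registryUrl) {
        Properties p = new Properties();
        p.setProperty("schema.registry.url", registryUrl);
        return p;
    }

    public static void main(String[] args) {
        System.out.println(build("http://10.10.30.2:8081"));
    }
}
```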

Adding a Specific Record Type in Avro

To add a specific record type:

  1. Select the Specific record type in the Avro Config section.
  2. Specify the schema registry URL.
  3. Specify a schema file. You can also browse to the required schema file.
  4. Click Generate Specific File. The schema file is added in the script.
  5. Click OK.

Enabling SSL Script

To enable SSL in the script, the user needs to select the Use Security Protocols check box.

On selecting that, the user is required to provide the following details within Advanced Settings for SSL:

  • Security Protocol: Kafka supports the following security protocols for scripting:
    • ssl: It is a security protocol that ensures the communication is encrypted and authenticated using SSL.
    • sasl_plaintext: It is used if SSL encryption is not enabled.
    • sasl_ssl: It is used if SSL encryption is enabled (SSL encryption should always be used if the SASL mechanism is PLAIN). All broker/client communication uses the sasl_ssl security protocol, which ensures that the communication is encrypted and authenticated using SASL/PLAIN.
  • Ciphers: These are the cipher suite names. The user can either browse the file path or enter the file path manually. The user can provide multiple ciphers separated by a colon.
  • Key File: These are OpenSSL-generated keys created with the crypto toolkit and saved into files with a .key or .pem extension.
  • Password: It is the password of the .key file.
  • Certificate File: These are small data files that digitally bind a cryptographic key to an organization’s details.
  • CA Certificate File: It is a digital certificate file that certifies the ownership of a public key by the named subject of the certificate.
  • CRT File: It is used to verify a secure website’s authenticity, distributed by certificate authority (CA) companies.
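The settings above correspond closely to the SSL keys used by Kafka clients. As a hedged sketch (the `security.protocol` and `ssl.cipher.suites` keys are standard Kafka client keys; the `ssl.*.location` keys follow the librdkafka/C-client convention, and all paths and values are examples):

```java
import java.util.Properties;

// Sketch of the Advanced Settings for SSL expressed as Kafka-style
// client configuration. Paths, password, and cipher list are examples.
public class KafkaSslConfig {
    public static Properties sslProps() {
        Properties p = new Properties();
        // One of: ssl, sasl_plaintext, sasl_ssl
        p.setProperty("security.protocol", "ssl");
        // Multiple ciphers are colon-separated, as in the dialog
        p.setProperty("ssl.cipher.suites",
            "ECDHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES128-GCM-SHA256");
        p.setProperty("ssl.key.location", "/home/cavisson/keys/client.key");  // Key File
        p.setProperty("ssl.key.password", "changeit");                        // Password of the .key file
        p.setProperty("ssl.certificate.location", "/home/cavisson/keys/client.pem"); // Certificate File
        p.setProperty("ssl.ca.location", "/home/cavisson/keys/ca.pem");       // CA Certificate File
        return p;
    }

    public static void main(String[] args) {
        System.out.println(sslProps());
    }
}
```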

Note: After the required configurations, click the Test Connection button to test the connection on the server.

Tibco

On selecting this JMS API, the user can create a Tibco Producer / Consumer script.

The user needs to provide the following inputs:

  • Producer: It is used to create objects that are required to send the message. Select this option to create a ‘Producer’ type script.
  • Consumer: It is used to receive messages from a topic or a queue. Select this option to create a ‘Consumer’ type script.
  • Pool Size: It is the maximum number of connections. The user can specify a maximum of 65535 connections.
  • Message: It is the message that a user needs to provide while creating a ‘Producer’ type script.

The user needs to provide the following inputs within JMS Configuration:

  • Server IP / Host Name: It is the Tibco Server IP / Host name. It is a mandatory field.
  • Port: It is the port where Tibco is listening. It is a mandatory field.
  • Topic / Queue: A queue holds the message until the receiver is ready, while a topic is used as message-oriented middleware that is responsible for holding and delivering messages. The user can select either ‘Topic’ or ‘Queue’.
  • User ID / Password: It is the User ID and password for Tibco by which the user can have the monitoring access. It is an optional field, depending on the server.
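As with the other JMS types, the Tibco dialog fields can be pictured as key/value pairs. The key names and values below are purely illustrative (they are not Tibco EMS client property names), though 7222 is the conventional Tibco EMS listen port:

```java
import java.util.Properties;

// Illustrative sketch of the Tibco JMS Configuration fields.
// Key names and values are examples only.
public class TibcoJmsConfig {
    public static Properties build(boolean useTopic) {
        Properties p = new Properties();
        p.setProperty("host", "10.10.30.3");  // Server IP / Host Name (mandatory)
        p.setProperty("port", "7222");        // Port (mandatory)
        // The user selects either 'Topic' or 'Queue', not both
        p.setProperty("destinationType", useTopic ? "topic" : "queue");
        p.setProperty("destinationName", "netstorm.test");
        return p;
    }

    public static void main(String[] args) {
        System.out.println(build(true));
    }
}
```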

To enable SSL in the script, the user needs to select the SSL check box.

On selecting that, the user is required to provide the following details within Advanced Settings for SSL:

  • SSL: It is used to decrypt client identity and is required if identity file is set.
  • Ciphers: It is an algorithm for encrypting and decrypting data. The user can either browse the file path or enter the file path manually. The user can provide multiple ciphers separated by a colon.
  • Private Key File: It is a separate file that is used in the encryption/decryption of data sent between server and the connecting clients. The user can either browse the file path or enter the file path manually.
  • PK Password: It is the password of the private key file.
  • Trusted CA: It is the trusted certificates, which are data files used to cryptographically link an entity with a public key. The user can either browse the file path or enter the file path manually.
  • Issuer: It is the client issuer file. The user can either browse the file path or enter the file path manually.
  • Identity: It is the client identity file. The user can either browse the file path or enter the file path manually.

Once a JMS API is added for IBM / Kafka / Tibco, the configuration set via JMS API window is reflected in the flow file of the script.

Amazon Simple Queue Service (Amazon SQS)

Amazon Simple Queue Service (Amazon SQS) offers a secure, durable, and available hosted queue that lets you integrate and decouple distributed software systems and components. Amazon SQS supports both standard and FIFO queues (FIFO queue names end with the .fifo suffix).

High Level Usage

  1. A producer sends message A to a queue, and the message is distributed across the Amazon SQS servers redundantly.
  2. When a consumer is ready to process messages, it consumes messages from the queue, and message A is returned. While message A is being processed, it remains in the queue and is not returned to subsequent receive requests for the duration of the visibility timeout.
  3. The consumer deletes message A from the queue to prevent the message from being received and processed again when the visibility timeout expires.
  4. Amazon SQS automatically deletes messages that have been in a queue for more than the maximum message retention period.
  5. The default message retention period is 4 days. However, you can set the message retention period to a value from 60 seconds to 1,209,600 seconds (14 days) using the SetQueueAttributes action.
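The retention bounds above can be checked with a little arithmetic: 1,209,600 seconds is exactly 14 days, and the 4-day default is 345,600 seconds. A small sketch (the class and method names are ours, not part of any SDK):

```java
import java.time.Duration;

// Sketch verifying the Amazon SQS message retention limits quoted above.
public class SqsRetention {
    static final long MIN_RETENTION_SECONDS = 60;
    static final long MAX_RETENTION_SECONDS = 1_209_600;           // 14 days
    static final long DEFAULT_RETENTION_SECONDS =
        Duration.ofDays(4).getSeconds();                           // 345,600

    // True if the value is acceptable to SetQueueAttributes
    public static boolean isValidRetention(long seconds) {
        return seconds >= MIN_RETENTION_SECONDS && seconds <= MAX_RETENTION_SECONDS;
    }

    public static void main(String[] args) {
        System.out.println(Duration.ofSeconds(MAX_RETENTION_SECONDS).toDays()); // prints 14
    }
}
```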

Benefits

  1. Better Performance: Message queues enable asynchronous communication, which means that the endpoints that are producing and consuming messages interact with the queue, not each other.
  2. Increased Reliability: Queues make your data persistent, and reduce the errors that happen when different parts of your system go offline.
  3. Granular Scalability: Message queues make it possible to scale precisely where it is needed. When workloads peak, multiple instances of the application can add all requests to the queue without risk of collision.
  4. Simplified Decoupling: Message queues remove dependencies between components and significantly simplify the coding of decoupled applications.

How to Access

  1. Select Amazon SQS from the JMS Type drop-down list.

  2. Provide the following inputs:

  • Producer: It is used to create objects that are required to send the message. Select this option to create a ‘Producer’ type script.
  • Consumer: It is used to receive messages from a topic or a queue. Select this option to create a ‘Consumer’ type script.
  • Pool Size: It is the maximum number of connections. The user can specify a maximum of 65535 connections.
  • Message: It is the message that a user needs to provide while creating a ‘Producer’ type script.

  3. In the JMS Config section, provide the following inputs:

  • Queue Name: The queue holds the message until the receiver is ready.
  • Queue Type: The type of queue. Available options are:
    • Standard: It is the default queue type. Standard queues support a nearly unlimited number of transactions per second (TPS) per API action (SendMessage, ReceiveMessage, or DeleteMessage). Standard queues support at-least-once message delivery.
    • FIFO: FIFO queues have all the capabilities of the standard queue. FIFO queues provide exactly-once processing but support a limited number of transactions per second (TPS).
  • Group ID: The tag that specifies that a message belongs to a specific message group. Messages that belong to the same message group are always processed one by one, in a strict order relative to the message group. This option is available only when the FIFO queue type is selected in a producer script.
  • Region: The default AWS Region to which requests are sent.

  4. In the Access Key Settings section, provide the following details:

  • Access Key: The AWS access key used as part of the credentials to authenticate the command request.
  • Secret Access Key: The AWS secret key used as part of the credentials to authenticate the command request.
  • Access Key Path: The path to the credential file. A credentials file is a plaintext file that contains your access keys.
  • Profile: A profile is used to select a specific set of access key and secret access key.
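The Access Key Path points at an AWS-style shared credentials file, and the Profile selects one of its named sections. A typical file looks like the following (the key values shown are AWS's published documentation samples, and the `loadtest` profile name is an example):

```
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

[loadtest]
aws_access_key_id = AKIAI44QH8DHBEXAMPLE
aws_secret_access_key = je7MtGbClwBF/2Zp9Utk/h3yCo8nvbEXAMPLEKEY
```

Selecting the `loadtest` profile makes the script authenticate with the second key pair instead of the default one.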

Adding Message Header

To add the message header for Amazon SQS in a script file, the user is required to write the following API:

ns_amazonsqs_set_message_header(JMSKey key, String header_name, String value_type, Object header_value)

The return type of this API is boolean.

Header Value Format (any one of these)

  • NS_BOOLEAN
  • NS_BYTE
  • NS_DOUBLE
  • NS_FLOAT
  • NS_INTEGER
  • NS_LONG
  • NS_SHORT
  • NS_STRING
  • NS_OBJECT
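The value_type argument must be one of the NS_* formats listed above. A minimal sketch of that constraint (this helper class is hypothetical, written here only to illustrate the allowed values; it is not part of the Cavisson API):

```java
import java.util.Set;

// Hypothetical helper illustrating the header value formats accepted
// by ns_amazonsqs_set_message_header's value_type argument.
public class HeaderValueType {
    static final Set<String> SUPPORTED = Set.of(
        "NS_BOOLEAN", "NS_BYTE", "NS_DOUBLE", "NS_FLOAT",
        "NS_INTEGER", "NS_LONG", "NS_SHORT", "NS_STRING", "NS_OBJECT");

    public static boolean isSupported(String valueType) {
        return SUPPORTED.contains(valueType);
    }

    public static void main(String[] args) {
        System.out.println(isSupported("NS_STRING")); // prints true
    }
}
```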