Execute queries or macros in Teradata using TIBCO EMS

Is it possible to execute queries or call macros in Teradata through TIBCO EMS?
If so, what is the process?
If not, what is the correct Tibco product to perform this action?

TIBCO EMS is a messaging product that transports data from one point to another; it cannot execute queries by itself. You need a producer and a consumer of messages connected to it, and the consumer of those messages is what would execute the queries in Teradata.
If, instead, you want to connect to Teradata and execute queries in order to visualize the results, then TIBCO Spotfire is the TIBCO product best suited for the job. You can find details on how to connect to Teradata using TIBCO Spotfire in its documentation.

You would connect the EMS queues/topics to a listener service developed in C, Java, .NET, or BusinessWorks that reads the incoming messages; for each message received, the service can then act on Teradata. I am not familiar with Teradata, but a Java API (the Teradata JDBC driver, for instance) should let you call macros from the listener service.
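As a concrete illustration of the listener approach, here is a minimal Java sketch. The JMS wiring (a javax.jms.MessageListener registered on the EMS queue via TIBCO's client library) is summarized in comments so the core stays self-contained; the host name, credentials, and the convention that the message body carries a macro name are all assumptions for this sketch. Teradata executes a macro with an EXEC statement, and its JDBC driver uses jdbc:teradata:// URLs.

```java
// In a real service this class would implement javax.jms.MessageListener and be
// registered on the EMS queue (e.g. via com.tibco.tibjms.TibjmsConnectionFactory);
// the JMS wiring is elided here so the core logic stays self-contained.
public class TeradataMacroRunner {

    // Teradata runs a macro with "EXEC <name>;". Build that statement from the
    // text of an incoming message. The convention that the message body is a
    // macro name is an assumption for this sketch.
    static String buildMacroCall(String macroName) {
        return "EXEC " + macroName.trim() + ";";
    }

    // Called once per consumed message (from MessageListener.onMessage in a
    // real listener). Executes the macro over the Teradata JDBC driver.
    static void handleMessage(String messageBody) throws Exception {
        try (java.sql.Connection db = java.sql.DriverManager.getConnection(
                     "jdbc:teradata://td-host/", "user", "password"); // placeholder host/credentials
             java.sql.Statement stmt = db.createStatement()) {
            stmt.execute(buildMacroCall(messageBody));
        }
    }
}
```

In production you would also decide what to do with messages whose macro fails, for example routing them to an error queue rather than discarding them.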

Related

Streaming data from Oracle with Kafka

I'm starting with Kafka and I need to capture the inserts on a specific Oracle table and send the new records through Kafka as they happen. I have no control over the database, so Debezium is excluded in principle. How can I do this without using triggers?
I've written a producer in Java (in Eclipse) that reads data from Oracle, but it makes constant polling requests to the database. I also use Java to simulate an ETL with a consumer.
PS: I work on Windows, but that's secondary.
If I understand your problem correctly, you are trying to route inserts from Kafka to an Oracle database. There are a few possibilities:
You implement a Kafka consumer, and as soon as your Kafka cluster gets a message the consumer makes an insert. You could reuse your Java code here; just remove the polling part. Please visit here
If you have Kafka deployed in a cloud environment and are using it as a service (AWS MSK), you would have the option of handling the events there. Again, you can use a Java program or write a Python script to make the inserts. Please visit here
I would also like to understand your throughput requirements: whether you really need Kafka as a distributed messaging system, or whether a simple AWS SQS queue would work just fine. If you can use SQS, things would be straightforward: you create a queue and write a listener in Python or Java.
boto3 is an excellent Python library for working with SQS.
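The first option (a consumer that inserts each record as it arrives) can be sketched as follows. The Kafka poll loop is summarized in a comment (in real code it would use org.apache.kafka.clients.consumer.KafkaConsumer); the CSV record format and the table and column names are assumptions for illustration.

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the consumer-side insert logic. A real consumer would wrap this in:
//   while (true) { for (ConsumerRecord<String,String> rec : consumer.poll(...))
//       insert(db, rec.value()); }
// using org.apache.kafka.clients.consumer.KafkaConsumer.
public class RecordInserter {

    // Split one consumed record (assumed to be "id,name,amount") into bind values.
    static List<String> parseRecord(String value) {
        return Arrays.asList(value.split(",", -1));
    }

    // Insert one record with a parameterized statement (avoids SQL injection).
    // "target_table" and its columns are hypothetical names for this sketch.
    static void insert(java.sql.Connection db, String value) throws Exception {
        List<String> fields = parseRecord(value);
        try (java.sql.PreparedStatement ps = db.prepareStatement(
                "INSERT INTO target_table (id, name, amount) VALUES (?, ?, ?)")) {
            for (int i = 0; i < fields.size(); i++) {
                ps.setString(i + 1, fields.get(i));
            }
            ps.executeUpdate();
        }
    }
}
```

A batched variant (accumulating rows and calling addBatch/executeBatch) is usually worth it once throughput grows.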

Is there any custom MySQL input event adapter for WSO2 CEP?

I want to receive events/streams from a database instead of JMS, email, or HTTP.
In WSO2 CEP, MySQL/DB is available only as an output adapter, not as an input adapter.
So is there any custom MySQL input adapter available?
Please let me know if there is any alternative solution for a DB adapter.
I am afraid that WSO2 CEP does not have an input event adapter to receive events from a database (you can find the list of available input event adapters in the product documentation).
To my understanding, this is mainly because WSO2 CEP is designed for realtime stream processing.
I guess the database is not the original source that generates the events? If so, there should be another event publisher which writes to the database. If you have control over that publisher, wouldn't it be possible to get it to send events to the WSO2 CEP server directly, rather than writing to the database and then reading from it?
In my opinion, that would be a better solution than reading from the database.

Can we consume multiple messages from a JMS queue in a single read?

I only want to call my JMS adapter once, and in return I want a maximum of 100 messages to be returned in the response. Is that even possible?
I am using the 12c release of Fusion Middleware.
Any pointers would be very helpful.
Unfortunately, this is not possible in a single call. The JMS API is not very rich in this respect: each receive() returns only one message, so a client has to loop to build up a batch of 100.
This is the basic behavior of JMS, at least up to the 12c version; Oracle may enhance it in a future Fusion Middleware release.
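Although a single receive() cannot return a batch, the client-side loop that builds one is short. In this sketch the Supplier stands in for a real javax.jms.MessageConsumer's receive(timeout) call, which returns null when no message arrives within the timeout; using a stand-in keeps the batching logic testable without a broker.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Client-side batching over the one-message-at-a-time JMS API: keep receiving
// until the queue is momentarily empty (receive returns null) or the cap is hit.
public class BatchDrain {

    static List<String> drainUpTo(int max, Supplier<String> receive) {
        List<String> batch = new ArrayList<>();
        while (batch.size() < max) {
            String msg = receive.get();   // consumer.receive(timeoutMillis) in real code
            if (msg == null) {
                break;                    // no message waiting: close out this batch
            }
            batch.add(msg);
        }
        return batch;
    }
}
```

With a real consumer you would call drainUpTo(100, () -> receiveText(consumer)) and then process the returned list as one unit.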

Enqueuing a JMS message directly into an Oracle persistent store

Is there a way to enqueue a JMS message into an Oracle table that is used as a persistent store for WebLogic JMS Server?
Thanks!
When you create a JMS server, it asks you to configure a persistent store. If you configure and use a JDBC store (vs. a file store), it asks for a database connection and creates a table there (named WLStore by default, with a configurable prefix), which it uses to store messages.
Are you asking whether you can manually write a message into that table?
You yourself mentioned AQ; why not keep using AQ and configure WebLogic to consume messages from AQ directly?
It is not advisable to write messages into the JMS JDBC store yourself. The store holds not only the message body but also a bunch of extra information, like message state and destination information, which won't be straightforward to populate programmatically.
Oracle hasn't documented a way to do this anyway.

Connecting to TIBCO and viewing the message body from the queue

I am working on an application in which we send messages to a TIBCO queue through Spring JMS. We have been provided with a customized interface that graphically shows the message count, so when that number increases we can see that messages are being sent to the queue. However, I am looking for a GUI tool that can connect to my TIBCO queue and also show the message body, i.e. the message content in detail. Please advise on such free tools; I am thinking of something similar to the queuezee tool.
JMS Browser (http://jmsbrowser.com/) is a paid one; I am looking for a free one.
You could try GEMS (Graphical Administration Tool for EMS). It's a GUI tool built specifically to monitor the queues and topics in TIBCO EMS.
TIBCO Community: GEMS tool
There is also another tool called Hermes, which offers almost the same functionality as GEMS.
You can use ActiveMQ; it is from Apache and it's free.
You can see the messages sent and the messages received.
Both queue and topic message counts can also be viewed.