Enqueuing a JMS message directly into an Oracle persistent store - oracle

Is there a way to enqueue a JMS message into an Oracle table that is used as a persistent store for WebLogic JMS Server?
Thanks!

When you create a JMS Server, it will ask you to configure a persistent store. If you configure and use a JDBC store (vs. a file store), it will ask for a database connection and create a table there called WL_Store, which it will use to store messages.
Are you asking if you can manually write a message into the WL_Store table?

You have yourself mentioned AQ; why not continue using AQ and configure WLS to consume messages from AQ itself?
It's not advisable to write messages into the JMS JDBC store directly. The JMS JDBC store holds not only the messages but also a bunch of extra information such as message state and destination details, which won't be straightforward to populate programmatically.
Oracle hasn't provided a documented way to do this anyway.
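If the goal is simply to get a message onto the WebLogic-hosted destination, the supported route is to send it through the JMS API and let the JMS server persist it into its own store. Below is a minimal sketch of such a producer, assuming WebLogic's standard JNDI context factory; the provider URL and the JNDI names jms/MyConnectionFactory and jms/MyQueue are placeholders, not anything your domain necessarily defines.

    import java.util.Hashtable;
    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.Context;
    import javax.naming.InitialContext;

    public class EnqueueViaJmsApi {
        public static void main(String[] args) throws Exception {
            // Connect to the WebLogic JNDI tree (URL is a placeholder).
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
            env.put(Context.PROVIDER_URL, "t3://localhost:7001");
            Context ctx = new InitialContext(env);

            // Look up the connection factory and queue by their (hypothetical) JNDI names.
            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/MyConnectionFactory");
            Queue queue = (Queue) ctx.lookup("jms/MyQueue");

            Connection con = cf.createConnection();
            try {
                Session session = con.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(queue);

                // The JMS server itself writes the message into its persistent store
                // (file or JDBC); the client never touches the WL_Store table.
                TextMessage msg = session.createTextMessage("hello from the JMS API");
                producer.send(msg);
            } finally {
                con.close();
                ctx.close();
            }
        }
    }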

Related

Is there any custom MySQL input event adapter for WSO2 CEP

I wanted to have the event receiver/stream read from a DB instead of JMS, email, or HTTP.
In WSO2 CEP, MySQL/DB is available only as an output adapter, not as an input adapter.
So is there any custom MySQL input adapter available?
Please let me know if there is any alternative solution for a DB adapter.
I am afraid that WSO2 CEP does not have an Input Event Adapter to receive events from a database (you can find the list of available input event adapters in the product documentation).
To my understanding, this is mainly because WSO2 CEP is designed for realtime stream processing.
I guess the database here is not the original source that generates the events?
If so, there should be another event publisher which writes to the database. If you have control over this publisher, wouldn't it be possible to get that publisher to send events to the WSO2 CEP server directly, rather than writing to the database and then reading from it?
In my opinion, that will be a better solution compared to reading from the database.

Can we consume multiple messages from a JMS queue with a single read?

I only want to call my JMS adapter once, and in return I want a maximum of 100 messages to be returned in the response. Is that even possible?
I am using 12c of Fusion Middleware.
Any pointers will be very helpful.
Unfortunately, this is not possible. JMS is not very rich in this respect and reads only one message per call.
This is the basic behavior of JMS, at least up to the 12c version.
Oracle might enhance it in a future Fusion Middleware release.
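To make the one-message-per-call behaviour concrete: with the plain JMS API any batching has to be layered on top by the caller, typically by draining the queue in a loop up to a fixed batch size. A minimal sketch, assuming a MessageConsumer has already been created against the queue (how you obtain it is deliberately left out); whether the Oracle JMS Adapter exposes anything equivalent is a separate question.

    import java.util.ArrayList;
    import java.util.List;
    import javax.jms.Message;
    import javax.jms.MessageConsumer;

    public class BatchDrain {
        /**
         * Pulls up to maxMessages from the queue in one pass. Each receive() call
         * still returns a single message; the "batch" only exists in this loop.
         */
        public static List<Message> drain(MessageConsumer consumer, int maxMessages) throws Exception {
            List<Message> batch = new ArrayList<>();
            while (batch.size() < maxMessages) {
                // Short timeout so the loop ends once the queue is (momentarily) empty.
                Message msg = consumer.receive(100);
                if (msg == null) {
                    break;
                }
                batch.add(msg);
            }
            return batch;
        }
    }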

Oracle service bus with BigData

I do not have much experience with Oracle Service Bus; I am trying to design a logging solution with BigData.
As I read, the default log and report activities in OSB will put the data into the domain's server log file or into the database we set up with the server domain. If I want to put all the logs into a separate BigData database, I will need to use one of these approaches:
Java callout, use JMS or some other technology to send data to the bigdata server.
Web service callout, create a separate web service to handle the logging.
Create custom report provider to replace the default one in OSB Reporting.
Something else
Please give me some ideas about which method I should be using, and please provide your reasons if you can. Thank you so much.
Isn't the logging framework in WebLogic based on Log4j? That means you can use a JMSAppender (probably prudent to wrap it in an async Log4j appender if you can) and handle it however you want.
Or, if you're talking about the OSB Reporting framework, there are a few options:
Configure the default JMS reporting provider (which uses the underlying SOAINFRA database, which hopefully is set up to be something better than the default Derby instance), then write an MDB that pulls reports off the queue and inserts them into SAS BigData.
Turn the JMS provider off and use a custom provider, which can do anything you want. If you want, you can still do a two-step process, where the reporting provider itself puts reports on a JMS queue so it returns quickly, and a different MDB pulls messages off and persists them at its own pace (a rough sketch of such an MDB follows at the end of this answer).
I do not recommend a web service or database callout without an async step in the middle, because you need logging and reporting to be very quick and to use as few resources for as short a period as possible.
You don't want logging to hog threads while you're experiencing load. I have seen entire buses brought down because of one hiccup: the logging database suffered a performance blip, which caused a bunch of open threads trying to log to it, which caused thread starvation or timeouts, which caused more error logging...
If you have a buffer like a JMS queue, then you can handle peaks by planning ahead. You can say "actually I want a JMS queue of 10,000 messages, and if that overflows for whatever reason, I want to (push the overflow to a separate queue on this other box) or (filter out all the non-essential messages) or (throw new messages away) or (action of your choice). Oh yeah, and if the logging database fails then I will try three times to commit and if not, move it to this other queue". Or whatever you want.
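As a rough illustration of that two-step idea (reporting provider puts reports on a queue, an MDB persists them at its own pace), a minimal message-driven bean might look like the sketch below. The queue JNDI name jms/osbReportQueue and the persist() helper are hypothetical placeholders; the actual write to your BigData store would go where the comment indicates.

    import javax.ejb.ActivationConfigProperty;
    import javax.ejb.MessageDriven;
    import javax.jms.Message;
    import javax.jms.MessageListener;
    import javax.jms.TextMessage;

    @MessageDriven(activationConfig = {
        @ActivationConfigProperty(propertyName = "destinationType", propertyValue = "javax.jms.Queue"),
        // Hypothetical JNDI name of the queue the reporting provider writes to.
        @ActivationConfigProperty(propertyName = "destination", propertyValue = "jms/osbReportQueue")
    })
    public class ReportArchiverBean implements MessageListener {

        @Override
        public void onMessage(Message message) {
            try {
                if (message instanceof TextMessage) {
                    String report = ((TextMessage) message).getText();
                    // Placeholder: persist the report into the BigData store of your choice.
                    // Doing this here, off the OSB request thread, is what protects the bus
                    // from slowdowns in the logging backend.
                    persist(report);
                }
            } catch (Exception e) {
                // A runtime exception makes the container roll back and redeliver.
                throw new RuntimeException("Failed to archive report", e);
            }
        }

        private void persist(String report) {
            // Hypothetical: JDBC batch insert, HBase/Hive client, Kafka producer, etc.
        }
    }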
There are multiple ways to achieve this. You could use the report activity to push to JMS or use the log activity.
You can also write a small routine such as this (either in OSB or outside it) that can read anything you are logging (such as via the log activity, but also the additional metadata that is logged when you turn on monitoring of OSB components) and do with it whatever is needed (such as pushing it to a database or BigData store).
The key is to avoid writing an explicit service call in each pipeline/flow; the above approach(es) use standard OSB/ODL* loggers.
*Oracle Diagnostic Logging

How to read the MQTT Mosquitto server persisted DB file

I am using the Mosquitto server for the MQTT protocol.
Using the persistence setting in a configuration file passed with the -c option, I am able to save the data.
However, the file generated is a binary one.
How would one be able to read that file?
Is there any specific tool available for that?
Appreciate your views.
Thanks!
Amit
Why do you want to read it?
The data is only kept there while messages (QOS1 or QOS2) are in flight to ensure they are not lost in transit while waiting for a response from the subscribed client.
Data may also be kept for clients that are disconnected but have persistent subscriptions (cleanSession=false) until that client reconnects.
If you are looking to persist all messages for later consumption, you will have to write a client that subscribes and stores this data in a DB of your choosing. One possible option to do this quickly and simply is Node-RED, but there are others, and some brokers even have plugins for this, e.g. HiveMQ.
If you really want to read the file, then you will probably have to write your own tool to do this, based on the Mosquitto source code.
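As a rough sketch of the "write a client that subscribes and stores the data" option, an Eclipse Paho (Java) subscriber could look like the following; the broker URL, client id, topic filter and the store() helper are all placeholders.

    import org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;
    import org.eclipse.paho.client.mqttv3.MqttCallback;
    import org.eclipse.paho.client.mqttv3.MqttClient;
    import org.eclipse.paho.client.mqttv3.MqttMessage;
    import org.eclipse.paho.client.mqttv3.persist.MemoryPersistence;

    public class ArchivingSubscriber {
        public static void main(String[] args) throws Exception {
            // Placeholder broker URL and client id.
            MqttClient client = new MqttClient("tcp://localhost:1883", "archiver", new MemoryPersistence());

            client.setCallback(new MqttCallback() {
                @Override
                public void connectionLost(Throwable cause) {
                    // Reconnect logic would go here.
                }

                @Override
                public void messageArrived(String topic, MqttMessage message) {
                    // Placeholder: insert the payload into whatever DB you choose.
                    store(topic, new String(message.getPayload()));
                }

                @Override
                public void deliveryComplete(IMqttDeliveryToken token) {
                    // Not used for a pure subscriber.
                }
            });

            client.connect();
            client.subscribe("#"); // Subscribe to everything; narrow the filter as needed.

            // Keep the process alive so the callback keeps receiving messages.
            Thread.currentThread().join();
        }

        private static void store(String topic, String payload) {
            // Hypothetical persistence call (JDBC insert, etc.).
        }
    }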

WebSphere MQ/MQSeries - Possible to send a message to multiple queues with single call?

I'm queuing messages to a WebSphere MQ queue (NB: A point-to-point queue -- not a topic) using a stored procedure in my Oracle database. Is there a way to publish each message to multiple queues with a single call? What I would like is to find a solution that would incur zero additional latency on my database compared to sending the message to a single queue.
Solutions that involve changing my WebSphere MQ settings are certainly welcome! What I had in mind was somehow creating a "clone" queue that got all the same messages as the original one, but I've been unable to locate anything like this in the documentation.
Thanks,
Jeff
With WMQ v7 you can do this easily and with administration only. You would create a topic object and then an alias over the topic. The Oracle app writes to the alias and does not know that it is actually publishing.
Meanwhile, you make two administrative subscriptions on the topic so that publications are delivered to your two destination queues. The apps consuming them have no idea that the messages were published as opposed to delivered through point-to-point queues.
If you are not familiar with the new WMQ v7 features, take a look at the Infocenter. In particular, the "What's New in V7" section and the sections on Pub/Sub.
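A rough MQSC sketch of those administrative definitions (all object names and the topic string are made up for illustration, not anything WMQ prescribes):

    * Topic object plus a queue alias that resolves to it; the Oracle app keeps
    * putting to ORDERS.ALIAS as if it were an ordinary queue.
    DEFINE TOPIC(ORDERS.TOPIC) TOPICSTR('orders/events')
    DEFINE QALIAS(ORDERS.ALIAS) TARGET(ORDERS.TOPIC) TARGTYPE(TOPIC)

    * Two administrative subscriptions deliver every publication to both queues.
    DEFINE QLOCAL(ORDERS.COPY1)
    DEFINE QLOCAL(ORDERS.COPY2)
    DEFINE SUB(ORDERS.SUB1) TOPICSTR('orders/events') DEST(ORDERS.COPY1)
    DEFINE SUB(ORDERS.SUB2) TOPICSTR('orders/events') DEST(ORDERS.COPY2)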
You can accomplish this using "Distribution Lists" in WebSphere MQ. These have to be configured on your queue manager.
Take a look at the WebSphere MQ Application Programming Guide for more info.
