How do you read the Oracle transaction log?

Instead of placing triggers on tables everywhere in an Oracle database, is there a Java API that I can use to read transactions off the Oracle transaction log?
My purpose is to detect transactions going into a proprietary (vendor) database and react accordingly. We can't modify the database itself, as that would void our maintenance contract.
Please help!

There is LogMiner, which is SQL based (and so can be accessed through JDBC):
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/logminer.htm#sthref1875
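As a rough sketch of the JDBC route (the connection details, SCN range, and schema name are placeholders; the mining user needs EXECUTE on DBMS_LOGMNR, and note that CONTINUOUS_MINE used here exists in 10g-era releases but is desupported in recent ones, where you would add log files explicitly via DBMS_LOGMNR.ADD_LOGFILE):

    import java.sql.*;

    public class LogMinerTail {
        public static void main(String[] args) throws SQLException {
            // Placeholder connection details
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "miner", "secret")) {

                // Start a LogMiner session over an SCN range, reading the data
                // dictionary from the online catalog so SQL_REDO is human-readable
                try (CallableStatement start = con.prepareCall(
                        "BEGIN DBMS_LOGMNR.START_LOGMNR(" +
                        "  STARTSCN => ?, ENDSCN => ?," +
                        "  OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG" +
                        "           + DBMS_LOGMNR.CONTINUOUS_MINE); END;")) {
                    start.setLong(1, 1000000L); // placeholder SCN range
                    start.setLong(2, 1000500L);
                    start.execute();
                }

                // Each row of V$LOGMNR_CONTENTS is one mined change
                try (Statement stmt = con.createStatement();
                     ResultSet rs = stmt.executeQuery(
                         "SELECT scn, operation, seg_name, sql_redo " +
                         "FROM v$logmnr_contents " +
                         "WHERE seg_owner = 'VENDOR_SCHEMA'")) { // placeholder schema
                    while (rs.next()) {
                        System.out.printf("%d %s on %s: %s%n",
                            rs.getLong("scn"), rs.getString("operation"),
                            rs.getString("seg_name"), rs.getString("sql_redo"));
                    }
                }

                // End the LogMiner session
                try (CallableStatement end = con.prepareCall(
                        "BEGIN DBMS_LOGMNR.END_LOGMNR; END;")) {
                    end.execute();
                }
            }
        }
    }

Since nothing here modifies the vendor schema, this approach should not conflict with the no-changes constraint, though you still need mining privileges granted to your own user.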
Or you can look at Oracle Streams, which reads the logs and generates 'logical change records' (LCRs) into a queue from the log contents:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14229/strms_over.htm#i1006309

If you are running on *nix, there is a Perl module that you could use to tail the file; you would then break down the lines yourself.

Related

Manually logging database events in a DataStage job

I have a parallel job that writes to an Oracle table. I want to manually write warnings to DataStage's log if some event occurs. For example, if a certain value for a certain column is inserted, I want to track this information in the log. Could this be achieved somehow?
To write custom messages into the log for a particular job's data stream, you can use a combination of a Copy stage, a Transformer, and a Peek stage. The Peek stage is the one that writes to the log. I like to set the Peek stage to run in sequential mode, so that your messages are kept together in single log entries instead of spread across nodes.
Also, you can peek the rejects of the Oracle stage, and maybe combine this with the above option (using a Funnel stage and a standard column schema).
Lastly, if you'd actually like to query the logs themselves and write them out somewhere else or use them in a job (amongst all the other data kept about jobs in the repository), you can directly query the DSODB schema in the XMETA database, i.e. the DataStage repository (by default DB2).
You would need to have the DataStage Operations Console up and running for that (not sure what version of DataStage you're running). If DataStage is running on a single tier and using the default DB2 database, you can simply catalog the DSODB database so that it's available as a connection in the DB2 Connector. Otherwise you'd need to install a DB2 client on the DataStage engine tier and catalog the database there.
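For illustration only, querying DSODB over JDBC might look like the sketch below; the view and column names here are hypothetical placeholders (the actual DSODB tables vary by version, so check the schema in your installation):

    import java.sql.*;

    public class DsodbLogQuery {
        public static void main(String[] args) throws SQLException {
            // Placeholder host/port/credentials for the cataloged DSODB database
            try (Connection con = DriverManager.getConnection(
                     "jdbc:db2://xmeta-host:50000/DSODB", "dsodb_user", "secret");
                 PreparedStatement ps = con.prepareStatement(
                     // Hypothetical view/columns -- check your DSODB schema
                     "SELECT job_name, log_timestamp, message_text " +
                     "FROM DSODB.JOB_RUN_LOG WHERE job_name = ? " +
                     "ORDER BY log_timestamp")) {
                ps.setString(1, "MyParallelJob");
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.printf("%s %s%n",
                            rs.getTimestamp("log_timestamp"),
                            rs.getString("message_text"));
                    }
                }
            }
        }
    }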
All the best!

Tibco JDBC Update dry run

Is it possible to have a dry run in Tibco for the JDBC Update activities? Meaning that I want to run those activities, but not actually update the database.
Even running them in test mode, if that's possible, would be good.
The only option I see is to keep a copy of the targeted tables in a separate schema, duplicate the data, and temporarily point the JDBC connection of your activity at this secondary, temporary/test database.
Since you can use global variables, no code changes between test and delivery (a typical goal), and you can compare both DB tables to see if the update WOULD HAVE run well...
I think I found a way. I haven't tested it yet, but theoretically it should work.
The solution is to install P6Spy and create a custom module that throws an exception whenever an INSERT/UPDATE is about to execute.
You could wrap the activity in a transaction group and roll back whenever you only want to test the statement. Otherwise, just exit the group normally so the data gets committed.
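Outside of the BW designer, that rollback idea reduces to plain JDBC: disable auto-commit, run the statement, inspect the affected-row count, then roll back. A minimal sketch (the table and connection details are placeholders):

    import java.sql.*;

    public class DryRunUpdate {
        public static void main(String[] args) throws SQLException {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "app", "secret")) {
                con.setAutoCommit(false); // keep the change uncommitted
                try (PreparedStatement ps = con.prepareStatement(
                        "UPDATE orders SET status = ? WHERE order_id = ?")) { // placeholder table
                    ps.setString(1, "SHIPPED");
                    ps.setLong(2, 42L);
                    int rows = ps.executeUpdate();
                    System.out.println("Would have updated " + rows + " row(s)");
                } finally {
                    con.rollback(); // dry run: nothing is persisted
                }
            }
        }
    }

Be aware that the UPDATE still takes row locks until the rollback, so avoid long-running dry runs against a busy production table.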

Moving files from log servers to Oracle with Flume

I'm trying to collect log files from the log servers and push them into Oracle. Is there a way I could implement this using Flume (without an HDFS setup)? Any help would be greatly appreciated.
You will need to create a custom sink in order to persist data into Oracle. This is relatively easy: you only have to extend the AbstractSink class and implement the process() method (basically, take an event from the channel and use the Oracle API to persist the data).
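A minimal sketch of such a sink (the JDBC URL, credentials, and target table are placeholders; a production sink would open the connection once in start() and batch several events per channel transaction):

    import java.nio.charset.StandardCharsets;
    import java.sql.*;
    import org.apache.flume.*;
    import org.apache.flume.conf.Configurable;
    import org.apache.flume.sink.AbstractSink;

    public class OracleSink extends AbstractSink implements Configurable {
        private String jdbcUrl, user, password;

        @Override
        public void configure(Context context) {
            // Read connection settings from the agent configuration file
            jdbcUrl  = context.getString("jdbcUrl");
            user     = context.getString("user");
            password = context.getString("password");
        }

        @Override
        public Status process() throws EventDeliveryException {
            Channel channel = getChannel();
            Transaction txn = channel.getTransaction();
            txn.begin();
            try (Connection con = DriverManager.getConnection(jdbcUrl, user, password);
                 PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO log_lines (line) VALUES (?)")) { // placeholder table
                Event event = channel.take();
                if (event == null) {      // channel empty: tell Flume to back off
                    txn.commit();
                    return Status.BACKOFF;
                }
                ps.setString(1, new String(event.getBody(), StandardCharsets.UTF_8));
                ps.executeUpdate();
                txn.commit();
                return Status.READY;
            } catch (Exception e) {
                txn.rollback();           // leave the event in the channel for retry
                throw new EventDeliveryException("Failed to write event to Oracle", e);
            } finally {
                txn.close();
            }
        }
    }

You would then put the compiled class on the Flume classpath and reference its fully qualified class name in the sink's type property in the agent configuration.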

Oracle to Neo4j Sync

Is there any utility to sync data between an Oracle and a Neo4j database? I want to use Neo4j in read-only mode; all writes will happen to the Oracle DB.
I think this depends on how often you want to have the data synced. Are you looking for a periodic sync/ETL process (say hourly or daily), or are you looking for live updates into Neo4j?
I'm not aware of tools designed for this, but it's not terribly difficult to script yourself.
A periodic sync is obviously easiest. You can do that directly using the Java API, connecting to Oracle via JDBC (a sketch follows below). You could also just dump the data from Oracle as CSV and import it into Neo4j. This would be done similarly to how data is imported from PostgreSQL in this article: http://neo4j.com/developer/guide-importing-data-and-etl/
There is an SO answer on exporting data from Oracle using sqlplus/spool:
How do I spool to a CSV formatted file using SQLPLUS?
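For the direct JDBC route, here is a rough sketch of a periodic batch sync (Neo4j Java driver 4.x assumed; the Oracle table, node label, and all connection details are placeholders). MERGE keeps the load idempotent, so re-running the job updates nodes instead of duplicating them:

    import java.sql.*;
    import org.neo4j.driver.AuthTokens;
    import org.neo4j.driver.Driver;
    import org.neo4j.driver.GraphDatabase;
    import org.neo4j.driver.Session;
    import static org.neo4j.driver.Values.parameters;

    public class OracleToNeo4jSync {
        public static void main(String[] args) throws SQLException {
            try (Connection oracle = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "app", "secret");
                 Driver neo4j = GraphDatabase.driver("bolt://localhost:7687",
                     AuthTokens.basic("neo4j", "password"));
                 Session session = neo4j.session();
                 Statement stmt = oracle.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT customer_id, name FROM customers")) { // placeholder table
                while (rs.next()) {
                    // MERGE: create the node if missing, otherwise update it
                    session.run("MERGE (c:Customer {id: $id}) SET c.name = $name",
                        parameters("id", rs.getLong("customer_id"),
                                   "name", rs.getString("name")));
                }
            }
        }
    }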
If you're looking for live syncing, you'd probably do this either through monitoring the transaction log or by adding triggers onto your tables, depending on the complexity of your data.

Oracle: Getting a notification when a table gets new data

In PostgreSQL we have PL/Perl to communicate with an external program when a new row is inserted into our table. Is there any similar way (a procedural language) to communicate with an external program in Oracle? What should I do to achieve this?
Can anyone help me out with this problem?
Oracle offers packages to communicate externally via a file or a pipe. Create a trigger to write to one of these when a row is inserted or updated. Be careful how you deal with failures in this code so that you don't lock up the database or roll back the transaction if your external program is unavailable. Check out the UTL_* packages (e.g. UTL_FILE) and DBMS_PIPE.
The most suitable answer to your rather vague question depends on the kind of problem you want to solve when you mention communicating with external programs.
Please check the documentation on Oracle's Database Change Notification; you will find your answers there.
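For completeness, the Oracle JDBC driver (11g and later) exposes Database Change Notification directly to Java. A minimal sketch, assuming a user with the CHANGE NOTIFICATION privilege (connection details and table are placeholders):

    import java.sql.*;
    import java.util.Properties;
    import oracle.jdbc.OracleConnection;
    import oracle.jdbc.OracleStatement;
    import oracle.jdbc.dcn.DatabaseChangeEvent;
    import oracle.jdbc.dcn.DatabaseChangeListener;
    import oracle.jdbc.dcn.DatabaseChangeRegistration;

    public class TableChangeWatcher {
        public static void main(String[] args) throws SQLException {
            OracleConnection con = (OracleConnection) DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "app", "secret");

            Properties props = new Properties();
            props.setProperty(OracleConnection.DCN_NOTIFY_ROWIDS, "true");
            DatabaseChangeRegistration dcr =
                con.registerDatabaseChangeNotification(props);

            // Called by the driver in the background whenever a watched table changes
            dcr.addListener(new DatabaseChangeListener() {
                @Override
                public void onDatabaseChangeNotification(DatabaseChangeEvent e) {
                    System.out.println("Change detected: " + e);
                }
            });

            // Any table referenced by a query on this statement gets watched
            try (Statement stmt = con.createStatement()) {
                ((OracleStatement) stmt).setDatabaseChangeRegistration(dcr);
                try (ResultSet rs = stmt.executeQuery(
                        "SELECT id FROM mytable")) { // placeholder table
                    while (rs.next()) { /* drain the result set */ }
                }
            }
            // Keep the process alive to receive notifications; call
            // con.unregisterDatabaseChangeNotification(dcr) when finished.
        }
    }

Unlike the trigger-plus-pipe approach, this keeps the plumbing on the client side, so no objects need to be created in the monitored schema.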
