How to store BI Publisher report output in Oracle DB

How can I schedule BI Publisher reports to run and then store the output PDFs in the DB?

You can use an Event Trigger in BIP.
First of all, implement an Oracle package with a function that inserts a record into a table.
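A minimal sketch of such a package, with all object names hypothetical. In the BIP data model you would set this as the "Oracle DB Default Package" and point an event trigger (Before Data / After Data) at the function; event-trigger functions return a boolean, with TRUE meaning success:

    -- Sketch only: package, table, and sequence names are made up.
    CREATE OR REPLACE PACKAGE report_log_pkg AS
      FUNCTION log_run RETURN BOOLEAN;
    END report_log_pkg;
    /

    CREATE OR REPLACE PACKAGE BODY report_log_pkg AS
      FUNCTION log_run RETURN BOOLEAN IS
      BEGIN
        -- Record that the report ran; the job that stores the PDF
        -- can key off this row.
        INSERT INTO report_runs (run_id, run_time)
        VALUES (report_runs_seq.NEXTVAL, SYSTIMESTAMP);
        RETURN TRUE;  -- TRUE tells BIP the trigger succeeded
      END log_run;
    END report_log_pkg;
    /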

Related

Parameterized query in SSIS Oracle data load

I am learning to use SSIS to move data between our Oracle ERP system and my SQL Server data warehouse.
When loading transactional data (inventory transactions), I have a unique auto-increment integer Transaction_Id column that I would like to use in my SSIS data flow task.
I understand that I need to create a parameter and fill it using an "Execute SQL Task", where I look up the latest Transaction_Id from my SQL Server table, and then move on to the "Data Flow Task", where I would use the parameter in a SQL query.
My issue is that the Oracle connector I am using, the "MS Oracle Source" from Attunity, does not allow me to add parameters to my query.
How would I go about getting my parameter value into my SQL query?
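As a sketch of that flow (table and column names here are assumptions, not from the question): the Execute SQL Task reads the high-water mark on the SQL Server side, and since the Attunity source does not accept parameter markers, a common workaround is to build the source statement with an SSIS expression so the variable's value is spliced into the SQL text:

    -- Execute SQL Task, against SQL Server: fetch the latest key and map
    -- the result to an SSIS variable (e.g. User::LastTransactionId).
    SELECT MAX(Transaction_Id) AS LastTransactionId
    FROM dbo.InventoryTransactions;

    -- Data Flow Task: the expression-built Oracle query then looks like
    -- this at runtime, with 123456 substituted from the variable.
    SELECT *
    FROM inv_transactions          -- hypothetical Oracle table
    WHERE transaction_id > 123456
    ORDER BY transaction_id;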

Talend ETL Oracle error: 0 rows inserted

I am a newbie to Talend ETL and am using Talend Open Studio for Big Data version 6.2.
I have developed a simple Talend ETL job that picks up data from a tFileInputExcel and a tOracleInput (date dimension) and inserts the data into my local Oracle database.
Below is how my package looks:
The job runs, but I get 0 rows inserted into my local Oracle database.
Your picture shows that no rows come out of your tMap component. Verify that the links inside the tMap are correct.
It seems there is no data that matches between fgf.LIBELLE_MOIS and row2.B.
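A quick way to check, assuming fgf.LIBELLE_MOIS comes from the Oracle date dimension (the table name below is a guess): list the distinct values and compare them with what is actually in column B of the Excel file; a case or trailing-space difference is enough for the tMap join to reject every row.

    -- Hypothetical dimension table name; compare these values with row2.B.
    SELECT DISTINCT LIBELLE_MOIS
    FROM   dim_date
    ORDER  BY 1;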

Is it possible in Oracle to trigger a SAS program after an insert or update on an Oracle table?

I'm new to Oracle, and I saw that Oracle triggers can fire some action after an update or insert is done on an Oracle table.
Is it possible to trigger a SAS program after every update or insert on an Oracle table?
There are a few different ways to do this, but a problem like this is an example of the saying "Just because you can, doesn't mean you should".
So sure, your trigger can be fired on update or insert, and it can call a stored procedure in a package which can use the Oracle host command to call an operating system command, which in turn can call SAS.
Here are some questions:
Do you really want to install SAS on the same machine as your Oracle database?
Do you really want every transaction that inserts or updates to have to wait until the host command completes? What if SAS is down? Do you want the transaction to complete or.....?
Do you really want the account that runs the database to have privileges to start up or send information to other executables? Think security risks.
If an insert does one record, the action is clear. But what if an update affects a thousand records? What message do you want to send to SAS? One thousand update statements? One update statement?
There are better ways to do this, but a complete answer needs more details from you about the end goal and the business logic involved. Some ways I have used include:
a trigger that inserts the change data into an Oracle advanced queue; at predetermined intervals, take the changes off the queue and write them to a flat file, then have a file watcher look for the files and send the info to SAS (a sketch of the queueing piece follows below);
writing a Java program to take the changes and ship them;
using an APEX web service to expose the changes as a series of JSON or REST packets.
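A sketch of the queueing option, with all object names hypothetical: the trigger enqueues a small change message via DBMS_AQ, and a separate consumer dequeues at its own pace and writes the flat file for SAS.

    -- One-time setup: payload type, queue table, and queue (names made up).
    CREATE OR REPLACE TYPE emp_change_t AS OBJECT (
      emp_id     NUMBER,
      change_op  VARCHAR2(10),
      changed_at TIMESTAMP
    );
    /

    BEGIN
      DBMS_AQADM.CREATE_QUEUE_TABLE(
        queue_table        => 'emp_change_qt',
        queue_payload_type => 'EMP_CHANGE_T');
      DBMS_AQADM.CREATE_QUEUE(
        queue_name  => 'emp_change_q',
        queue_table => 'emp_change_qt');
      DBMS_AQADM.START_QUEUE('emp_change_q');
    END;
    /

    -- Row-level trigger: enqueue one message per affected row, so a
    -- thousand-row update becomes a thousand messages for the consumer.
    CREATE OR REPLACE TRIGGER emp_change_trg
      AFTER INSERT OR UPDATE ON employees
      FOR EACH ROW
    DECLARE
      l_opts   DBMS_AQ.ENQUEUE_OPTIONS_T;
      l_props  DBMS_AQ.MESSAGE_PROPERTIES_T;
      l_msgid  RAW(16);
    BEGIN
      DBMS_AQ.ENQUEUE(
        queue_name         => 'emp_change_q',
        enqueue_options    => l_opts,
        message_properties => l_props,
        payload            => emp_change_t(
                                :NEW.emp_id,
                                CASE WHEN INSERTING THEN 'INSERT'
                                     ELSE 'UPDATE' END,
                                SYSTIMESTAMP),
        msgid              => l_msgid);
    END;
    /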

How does Oracle CEP turn a command into an event?

Take an INSERT command, for example: when I insert something into a table, how does Oracle CEP know what I have done?
I have read the documentation about Oracle CEP, but it's too brief. I know only that an event is represented as a POJO in Oracle CEP.
Does Oracle CEP just parse the INSERT command and try to find whether it matches some event class?
Or is there any difference between CEP and an Active Database?

Oracle PL/SQL and Shell scripting: from one schema to another schema

I want to load data from a table in one schema to another schema on a daily basis.
The tables are in different databases, so creating a database link is not an option, for security reasons.
About a million records will get processed.
The databases are on different servers. From database "A" I am fetching employee presence details by combining the emp details and emp presence tables for a period of one month, and loading this data into another table on database "B". I need to run this activity on a daily basis.
I need to run a job daily at low-peak hours to get a complete copy of the table into the other DB.
Will Import/Export work, or should I load the data with the help of sqlldr?
Please let me know the correct way.
Thanks in advance.
What are my best options?
Well, it seems that using a database link would best fit your situation. If you want to read a table from a database, you need the read privilege on it. Perhaps you can ask the DBA to create an account (user) which only has the read privilege for the specific table. Then you can use a database link that connects as the new user.
You can't update or delete data in the table because the user you connect as doesn't have write privileges. This solves the security problem.
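A sketch of that setup, with all names and the TNS alias hypothetical:

    -- On database "A": a user that can only read the source data.
    CREATE USER ro_reader IDENTIFIED BY "a_strong_password";
    GRANT CREATE SESSION TO ro_reader;
    GRANT SELECT ON hr.emp_presence TO ro_reader;

    -- On database "B": a link that always connects as that read-only user.
    CREATE DATABASE LINK db_a_link
      CONNECT TO ro_reader IDENTIFIED BY "a_strong_password"
      USING 'DB_A_TNS_ALIAS';

    -- The daily copy then becomes a plain INSERT ... SELECT over the link.
    INSERT INTO emp_presence_copy
      SELECT * FROM hr.emp_presence@db_a_link;
    COMMIT;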
exp/imp and sqlldr are different tools, and they don't work together: you can only import data from an export file, and you can't load an export file with sqlldr.
If you want to run this periodically, it sounds like you might want to take a look at the Oracle Scheduler.
Overview: http://docs.oracle.com/cd/B28359_01/server.111/b28310/schedover001.htm
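For example, a daily job might look like the following sketch (the job name and the copy procedure are assumptions):

    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'DAILY_EMP_COPY',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN copy_emp_presence; END;',  -- your copy routine
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=DAILY;BYHOUR=2;BYMINUTE=0', -- 02:00, low peak
        enabled         => TRUE);
    END;
    /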
To export the data and add it into the new database, you might want to use Oracle Data Pump, which can do both the export and the import for you, securely.
Data Pump Export: http://docs.oracle.com/cd/B28359_01/server.111/b28319/dp_export.htm
So your best bet might be creating a shell script that uses Data Pump to create an export file from database number 2, and then uses Data Pump again to import said file into database number 1.
Once you have that script, you can schedule it to run at night or at any other time you have low traffic.
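The answer above suggests driving this from a shell script with the Data Pump command-line tools; as an alternative sketch, the export side can also run entirely inside the database through the DBMS_DATAPUMP PL/SQL API (the directory object and table name below are assumptions):

    DECLARE
      l_handle NUMBER;
      l_state  VARCHAR2(30);
    BEGIN
      -- Table-mode export of the presence table to a server directory object.
      l_handle := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                                     job_mode  => 'TABLE');
      DBMS_DATAPUMP.ADD_FILE(handle    => l_handle,
                             filename  => 'emp_presence.dmp',
                             directory => 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.METADATA_FILTER(handle => l_handle,
                                    name   => 'NAME_EXPR',
                                    value  => 'IN (''EMP_PRESENCE'')');
      DBMS_DATAPUMP.START_JOB(l_handle);
      DBMS_DATAPUMP.WAIT_FOR_JOB(l_handle, l_state);  -- blocks until done
    END;
    /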
Regards
