We have created ODI 12c (12.2.1.2.6) mappings to load data from an external table to a staging table and from the staging table to a fact table.
Our database is Oracle 19c.
Now we would like to automate the ETL process by scheduling the ODI mapping runs from the AIX 7.2 crontab.
Could you please explain how to run an ODI mapping from the AIX 7.2 crontab?
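One common approach, sketched below purely as an illustration, is to wrap the ODI agent's startscen.sh utility in a small shell script and schedule that script from cron. The domain path, agent name, scenario name, version and context used here are all placeholders, and the exact startscen.sh arguments differ between ODI releases, so check the usage shown by the copy in your own agent's bin directory.

#!/bin/sh
# run_load_fact.sh - hypothetical cron wrapper for an ODI scenario run.
# Every path and name below is an assumption; verify the startscen.sh
# argument order against the script shipped with your agent.

ODI_AGENT_BIN=/u01/odi/domains/odi_domain/bin   # placeholder agent bin directory
LOG_DIR=/home/odi/logs
LOG_FILE="$LOG_DIR/load_fact_$(date +%Y%m%d_%H%M%S).log"

cd "$ODI_AGENT_BIN" || exit 1

# Run scenario LOAD_FACT, version 001, in the GLOBAL context on agent OracleDIAgent1.
./startscen.sh -INSTANCE=OracleDIAgent1 LOAD_FACT 001 GLOBAL > "$LOG_FILE" 2>&1

The wrapper can then be scheduled from the AIX crontab (crontab -e), for example with a line such as 0 2 * * * /home/odi/scripts/run_load_fact.sh for a nightly 02:00 run.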
I am using an Always Free Autonomous Database on Oracle Cloud. I want to create a cron job that is executed for every new row inserted into a table, e.g. customers.
Can I run the script from a bucket, or where do I need to upload the script that needs to be executed?
Thanks in advance!
I am an old Informatica PowerCenter 8 guy and am heading up a team using Informatica Big Data Edition 9.5.1. I have a question regarding Hive: can Informatica build Hive tables, or do they have to be built separately? If they can be built when they do not exist, what are the steps?
Thanks!
If you enable the option below in PowerCenter, it generates/creates the Hive tables:
"Generate And Load Hive Table"
This is available only in PowerCenter with PowerExchange for Hadoop.
The way this works is that PowerCenter first loads the data into an HDFS file and then creates a Hive definition on top of it. Make sure you select "Externally Managed Hive Table" so that PowerCenter creates an external table.
Note that at the mapping level you still need to define a flat file as your target.
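To make the "external table on an HDFS file" idea concrete, the snippet below shows roughly the kind of DDL involved, issued here by hand from the Hive CLI; the table name, columns and HDFS location are purely illustrative and not what PowerCenter literally generates.

# Illustrative only: an external Hive table defined over a delimited HDFS file.
# Table name, columns and location are placeholders.
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS customers (
  customer_id   INT,
  customer_name STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/infa/targets/customers';
"

Because the table is EXTERNAL, dropping it in Hive removes only the metadata and leaves the underlying HDFS file in place.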
Is it possible to get the structure of a Derby database saved in the form of an SQL script that I can run to recreate the database along with the data in it?
The dblook tool (http://db.apache.org/derby/docs/10.8/tools/ctoolsdblook.html) will get the structure of the database and export it as an SQL script.
But it doesn't extract the data.
You could perhaps use the backup and restore utilities, but the format of a Derby backup is not an SQL script.
The Apache 'ddlutils' tool can extract and move the data, I believe. See: http://db.apache.org/derby/integrate/db_ddlutils.html and http://db.apache.org/ddlutils/
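For reference, a typical dblook invocation from a shell looks something like the sketch below; the classpath, database path and output file are assumptions.

# Export the DDL (schema only, no data) of a Derby database to an SQL script.
# Jar locations, database path and output file are placeholders.
export CLASSPATH=/opt/derby/lib/derby.jar:/opt/derby/lib/derbytools.jar

java org.apache.derby.tools.dblook \
    -d "jdbc:derby:/data/myDB" \
    -o /tmp/myDB_ddl.sql

The generated /tmp/myDB_ddl.sql can then be replayed with the ij tool against a freshly created database, after which the data still has to be moved separately (for example with ddlutils, as mentioned above).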
Now I'm keen to know how I can export a dmp file with Oracle jobs. I'm very new to Oracle and don't know how to back up Oracle with scheduled jobs, the way a scheduled MSSQL backup works. That's why I'm asking.
You can fairly easily set up a backup scheduled by the database. The best approach for this is to install the Oracle Scheduler Remote Job Agent, local to the database, and configure that agent in the database that holds your backup schedule. This can be the database itself, or a central backup-schedule database; it's all a matter of taste.
Oracle Scheduler is very powerful and can execute tasks in the local database, in remote databases, on the local server and on remote servers. For OS-type jobs, it's best to use the 11g Remote Scheduler Agent; don't use the old-fashioned 10g-style external jobs. Use remote jobs with defined credentials.
For help, take a look at my blog, where you will also find pointers to the documentation.
After you have installed and configured the job agent as a valid target for the database that performs the scheduling, the easiest option is to use dbconsole to define the jobs. If you configure dbconsole, it also offers an option to generate automatic backup jobs, which may already be enough for you. You asked for export, and there expdp driven by the Oracle Scheduler does a wonderful job.
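As a rough sketch only, an external Scheduler job that calls an export wrapper script could be created from sqlplus as shown below; it assumes OS authentication as a DBA on the database host, and the credential, OS account, script path and schedule are all placeholders.

# Sketch: create a nightly DBMS_SCHEDULER external job that runs an export
# wrapper script on the database server. Names, paths and passwords are
# placeholders; adapt them before use.
sqlplus -s / as sysdba <<'EOF'
BEGIN
  DBMS_SCHEDULER.CREATE_CREDENTIAL(
    credential_name => 'OS_BACKUP_CRED',
    username        => 'oracle',
    password        => 'os_password_here');

  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'NIGHTLY_EXPDP',
    job_type        => 'EXECUTABLE',
    job_action      => '/home/oracle/scripts/run_expdp.sh',
    credential_name => 'OS_BACKUP_CRED',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY;BYHOUR=2',
    enabled         => TRUE);
END;
/
EOF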
You can run an OS process from an Oracle job using a Java stored procedure or a C program.
See this blog entry.
Instead of exporting dumps with the old imp/exp utilities to generate dmp files, look at Oracle Data Pump, especially since from the tags I infer you're using Oracle 11g.
Data Pump supports table-level, tablespace-level, schema-level and full export modes and is known to be considerably faster than the previous imp/exp tools.
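For illustration, a schema-level export with expdp might look like the sketch below; the connect string, directory object, schema and file names are placeholders.

# Schema-level Data Pump export. DATA_PUMP_DIR must be an existing directory
# object the exporting user can write to; connect string and names are
# placeholders only.
expdp scott@ORCL schemas=SCOTT \
      directory=DATA_PUMP_DIR \
      dumpfile=scott.dmp \
      logfile=scott_exp.log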
Further reading:
Export/import with Oracle Data Pump
Oracle documentation on Data Pump
"how can I export a dmp file with Oracle jobs?"
That's not possible. The exp tools run outside the database, while jobs run within it. If you want scheduled exports, perhaps you could use a cron job/scheduled task.
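Such a scheduled export via cron could be as simple as the entry below, where the wrapper script path is a placeholder for your own expdp call.

# crontab entry (crontab -e as the oracle OS user): nightly export at 02:30.
# The wrapper script path is a placeholder.
30 2 * * * /home/oracle/scripts/run_expdp.sh >> /home/oracle/logs/expdp_cron.log 2>&1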
Just wondering how I can check whether a backup is running on an Oracle database. I am on version 11.2.0.2.
That depends on how you are taking backups. Are you using RMAN? If you are using Recovery Manager, you can query the corresponding database views such as V$RMAN_BACKUP_JOB_DETAILS or V$RMAN_BACKUP_SUBJOB_DETAILS.
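For instance, a quick check from the database host might look like the sketch below, which lists backup jobs that are still in a running state; it assumes OS authentication as a DBA user.

# List RMAN backup jobs that are currently running (status values such as
# 'RUNNING' and 'RUNNING WITH WARNINGS' are reported by the view).
sqlplus -s / as sysdba <<'EOF'
SET LINESIZE 200
COLUMN status FORMAT A25
SELECT session_key, input_type, status, start_time
FROM   v$rman_backup_job_details
WHERE  status LIKE 'RUNNING%'
ORDER  BY start_time;
EOF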