Scheduling Oracle .sql files using a Unix-based SAS environment - oracle

I have a bunch of SQL queries that run against an Oracle database. Is there a way to schedule these .sql files using UNIX-based SAS, so they execute one after another at a certain time of day?

If they are .sql files, why do you want to schedule them using SAS? Are they SAS programs? If not, I would do one of three things, depending on my constraints:
1) Convert the .sql files to stored procedures and call them from DBMS_SCHEDULER within Oracle, since Oracle has a fantastic job scheduling subsystem (actually multiple variants) that protects against duplicate jobs among other issues, and you get transactional control, auditing and logging. http://docs.oracle.com/cd/B19306_01/appdev.102/b14258/d_sched.htm
2) If converting them to stored procs is too much, then call the .sql scripts directly from DBMS_SCHEDULER with DBMS_SCHEDULER.CREATE_PROGRAM() and then schedule that program with DBMS_SCHEDULER.CREATE_JOB.
3) Use cron or at to schedule batch / shell script wrappers that call sqlplus to run the .sql files (a minimal wrapper sketch follows this answer).
If the question is specifically how to do this with SAS, then DBMS_SCHEDULER can still execute external SAS programs using option (2) above.
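For option 3, a minimal sketch of such a wrapper might look like the following; the directory, file names, TNS alias and the scott/tiger credentials are placeholders, not anything taken from the question:

#!/bin/ksh
# run_sql_batch.sh -- sketch only: run a fixed list of .sql files one after
# another through sqlplus, stopping at the first failure.
SQL_DIR=/path/to/sql                          # placeholder location of the .sql files
LOG=/path/to/log/run_$(date +%Y%m%d_%H%M%S).log

for f in query1.sql query2.sql query3.sql; do # placeholder file names
    echo "Running $f" >> "$LOG"
    sqlplus -s /nolog >> "$LOG" 2>&1 <<EOF
connect scott/tiger@ORCL
whenever sqlerror exit sql.sqlcode
@${SQL_DIR}/${f}
exit
EOF
    [ $? -ne 0 ] && { echo "$f failed, stopping batch" >> "$LOG"; exit 1; }
done

A crontab entry then runs the wrapper at the desired time of day, for example every night at 02:30:

30 2 * * * /path/to/run_sql_batch.sh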

Related

Packaging or automating execution of Hive queries

In Oracle and other DBs, we have the concept of a PL/SQL package, where we can package multiple queries/procedures and call them from a UNIX script. In the case of Hive queries, what is the process used to package and automate the query processing in actual production environments?
If you are looking to automate the execution of numerous Hive queries, the hive or beeline CLI (think sqlplus for Oracle) allows you to pass a file containing one or more commands, such as multiple inserts, selects, and create tables. The contents of that file can be generated programmatically using your favorite scripting language, like Python or shell.
See the "-i" option in this documentation: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Cli
In terms of a procedural language, please see:
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=59690156
HPL/SQL does have a Create Package option but if whatever you are trying to achieve is scripted outside of HPL/SQL (e.g. python, shell), you can 'package' your application in accordance with scripting best practices of your selected language.
To run multiple queries, simply write them one after another in a file (say 'hivescript.hql'); the file can then be run from bash by calling it through the beeline or hive shell:
beeline -u "jdbc:hive2://HOST_NAME:10000/DB" -f hivescript.hql
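If there are several such files to run in sequence, a small shell wrapper around beeline can drive them and stop at the first failure. This is only a sketch; the JDBC URL, the script names and the init file are assumptions, and --hivevar is needed only if your scripts use substitution variables:

#!/bin/bash
# Sketch: run several Hive scripts in order via beeline, aborting on error.
JDBC_URL="jdbc:hive2://HOST_NAME:10000/DB"

for hql in load_staging.hql transform.hql publish.hql; do  # placeholder scripts
    beeline -u "$JDBC_URL" -i init_settings.hql -f "$hql" \
            --hivevar run_date="$(date +%Y-%m-%d)"
    if [ $? -ne 0 ]; then
        echo "beeline failed on $hql" >&2
        exit 1
    fi
done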

Performing ETL Actions via sqlplus

I am looking into replacing our current ETL data warehouse program with a set of scripts that use sqlplus to complete the ETL actions. Loading and extracting are simple enough via spooled files and sqlldr with loader control files. Is there an industry standard, or an Oracle-backed solution, that I am just not finding?
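For context, the spool-plus-sqlldr pattern referred to above usually looks something like this sketch; the table, columns, paths, control file and credentials are all placeholders:

#!/bin/sh
# Sketch only: extract with a sqlplus spool, then load with SQL*Loader.

# 1) extract: spool a pipe-delimited file from the source schema
sqlplus -s src_user/src_pwd@SRCDB <<'EOF'
set pagesize 0 linesize 32767 feedback off heading off trimspool on
spool /path/to/stage/customers.dat
select cust_id || '|' || cust_name from customers;
spool off
exit
EOF

# 2) load: push the extract into the target schema with SQL*Loader
sqlldr tgt_user/tgt_pwd@TGTDB control=/path/to/ctl/customers.ctl \
       data=/path/to/stage/customers.dat log=/path/to/log/customers.log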

Teradata Jobs and KSH

I tried searching online but was unable to find anything pertaining to my requirements.
I am new to Teradata.
In our team, Teradata jobs are used to call a ksh script, which in turn calls a procedure to run at a scheduled time.
I want to understand how exactly this calling works. How does a job call a KSH script, and how does the KSH script in turn call a procedure?
Your help would be much appreciated.
At a very basic level UNIX has a scheduler mechanism called cron. Users with sufficient privilege on the UNIX server can use cron to run jobs at a scheduled time by defining a crontab. Your crontab can call UNIX commands or in many cases a shell script (ksh in your example) to perform a complex set of operations. In many production environments jobs may be scheduled using an enterprise platform instead of many independent crontab files across many users and many servers in the data center.
As this pertains to Teradata, the ksh script is likely invoking a Teradata utility such as BTEQ to log on to the database and execute a stored procedure, macro, or set of SQL statements contained within the BTEQ script. Once the BTEQ script has completed, a return code is passed back to the ksh script so that it can handle any error that occurred within the BTEQ script, or any handled/unhandled error raised by the stored procedure.
You can use your search engine of choice to read up on how to develop UNIX shell scripts (Korn, Bash, etc.) and how Teradata utilities such as BTEQ work. If you have a more discrete question about something in your environment feel free to post a separate question here with the appropriate tags in the question to target the audience who can best help you.
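As a sketch of how the pieces fit together: a crontab entry fires the ksh wrapper at the scheduled time, for example

0 1 * * * /path/to/nightly_load.ksh

and the wrapper itself calls BTEQ, which logs on, calls the procedure and hands a return code back to the shell. The TDPID, credentials and procedure name below are placeholders:

#!/bin/ksh
# nightly_load.ksh -- sketch: call a stored procedure through BTEQ and
# propagate the BTEQ return code for error handling.
bteq <<EOF
.LOGON tdpid/etl_user,etl_password;
CALL my_db.nightly_load_proc();
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
rc=$?
if [ $rc -ne 0 ]; then
    echo "BTEQ returned $rc - nightly load failed"
    exit $rc
fi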

Can I export dmp file with Oracle jobs?

I'm keen to know how I can export a dmp file with Oracle jobs. I'm very new to Oracle and don't know how to back up Oracle with scheduled jobs, the way MS SQL backups can be scheduled. That's why I'm asking.
You can fairly easily set up a backup scheduled by the database. The best approach is to install the Oracle Scheduler Remote Job Agent - local to the database - and configure that agent in the database that holds your backup schedule. This can be the database itself, or a central backup schedule database; it is all a matter of taste.
Oracle Scheduler is very powerful and can execute tasks in the local database, in remote databases, on the local server and on remote servers. If using OS-type jobs, it is best to use the 11g Remote Scheduler Agent; don't use the old-fashioned 10g-style external jobs. Use remote jobs with defined credentials.
For help, look at my blog, where you will also find pointers to the documentation.
After you have installed and configured the job agent as a valid target for the database that performs the scheduling, the easiest route is to use dbconsole to define the jobs. When you configure dbconsole, it also offers an option to generate automatic backup jobs, which may already be enough. Since you asked about exports, expdp driven by the Oracle Scheduler does a wonderful job there.
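As a hedged sketch of what that can look like, the following creates an external Scheduler job with a credential that runs an expdp wrapper script every night. The connect string, job name, credential, OS account, password, script path and schedule are all placeholders, and it must be run by a suitably privileged user:

sqlplus -s system/manager@ORCL <<'EOF'
begin
  dbms_scheduler.create_credential(
    credential_name => 'OS_BACKUP_CRED',
    username        => 'oracle',
    password        => 'os_password');
  dbms_scheduler.create_job(
    job_name        => 'NIGHTLY_EXPDP',
    job_type        => 'EXECUTABLE',
    job_action      => '/u01/app/oracle/scripts/run_expdp.sh', -- placeholder wrapper around expdp
    start_date      => systimestamp,
    repeat_interval => 'FREQ=DAILY;BYHOUR=2',
    enabled         => false);
  dbms_scheduler.set_attribute('NIGHTLY_EXPDP', 'credential_name', 'OS_BACKUP_CRED');
  dbms_scheduler.enable('NIGHTLY_EXPDP');
end;
/
exit
EOF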
You can run an OS process from an Oracle job using a Java stored procedure or a C program.
See this blog entry.
Instead of exporting a dump with the old imp/exp utilities to generate dmp files, look at Oracle Data Pump, especially since from the tags I infer you're using Oracle 11g.
Data Pump supports table-level, tablespace-level, schema-level and full export modes, and is known to be considerably faster than the previous imp/exp tools.
Further reading:
Export/import with Oracle Data pump
Oracle Documentation on Data pump
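A minimal schema-level Data Pump export from the command line looks like this sketch; the credentials, schema name and the DATA_PUMP_DIR directory object are placeholders, and the directory object must exist and be writable by the database:

expdp system/password@ORCL schemas=HR directory=DATA_PUMP_DIR dumpfile=hr.dmp logfile=hr_exp.log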
"know how can I export dmp file with Oracle jobs?"
That's not possible. The exp tool runs outside the database, while jobs run within it. If you want scheduled exports, perhaps you could use a cron job / scheduled task.
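For example, a crontab entry that runs an expdp wrapper script every night at 02:00 might look like this (the script path and log location are placeholders):

0 2 * * * /u01/app/oracle/scripts/run_expdp.sh >> /tmp/expdp_cron.log 2>&1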

Running RMAN Scripts with the job scheduler (Oracle)

Here's a good one for any Oracle gurus out there. I'm working on a web page that dynamically configures Oracle DB backup settings in a closed environment. Right now, I have everything set up to generate scheduled jobs that run pre-determined RMAN scripts that already exist on the Database server's disk. This works, but I want to go a step further.
Is there any way to create jobs with the scheduler that will run RMAN scripts which haven't first been written to disk? For example, is it possible to fire off an RMAN backup script directly from the scheduler by using a pipe of some sort? I've found some vague information on the RMAN Pipe Interface, but I can't see how I could create a private pipe, pack it with RMAN commands, and then feed it to RMAN all in one job run... Any thoughts would be very much appreciated.
For anything related to backup/restore of the database, I advise you to prefer the OS's means of executing scheduled jobs (cron/at on Unix, Scheduled Tasks on Windows). The advantage is that they are independent of the Oracle instance, so you can better handle cases where the instance is down or malfunctioning. The "RMAN pipe interface" is meant to be used together with the operating system's shell as well.
However, executing scripts directly from the database is also possible: AskTom
If you want to use DBMS_SCHEDULER then the script has to reside on the database server.
But if you install an Oracle client on the web server you can run RMAN from there and connect to the TARGET database. E.g.:
rman TARGET 'usr/pwd@conn_str AS SYSDBA' CMDFILE /home/www/db/backup-full.rman
In this case the script can reside on the web server.
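If you would rather not keep a command file anywhere at all, RMAN also reads commands from standard input, so a wrapper on the web server can feed the script inline. A sketch, with the connect string and backup commands as placeholders:

#!/bin/sh
# Sketch: pipe the RMAN commands in via a here-document, so no command
# file needs to exist on the database host.
rman TARGET 'usr/pwd@conn_str AS SYSDBA' <<'EOF'
run {
  backup database plus archivelog;
  delete noprompt obsolete;
}
exit
EOF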
Hope this helps.
