Using UTL_FILE in ADW - oracle

Is there any way I can use UTL_FILE in Autonomous Database to write to Object Storage or another cloud storage service? I am asking because we can't use OS files in ADB.
Here is an example of the line as it appears in an on-premises environment:
UTL_FILE.PUT_LINE(v_file, 'col1,col2,col3');

Filesystem directories are supported, and documented here. Just be aware that any file you put in the filesystem will count against your database storage.
You can use UTL_FILE to write to the database file system, followed by DBMS_CLOUD.PUT_OBJECT to copy that file to a bucket in Object Storage. Use DBMS_CLOUD.DELETE_FILE to delete the file from the database file system afterwards if needed. I do it all the time and it works well.
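A minimal sketch of that write-then-copy flow (the bucket URI, credential name MY_CRED, and file name are placeholders; the credential would have been created beforehand with DBMS_CLOUD.CREATE_CREDENTIAL):

```sql
DECLARE
  v_file UTL_FILE.FILE_TYPE;
BEGIN
  -- write the CSV to the database file system
  v_file := UTL_FILE.FOPEN('DATA_PUMP_DIR', 'my_file.csv', 'w');
  UTL_FILE.PUT_LINE(v_file, 'col1,col2,col3');
  UTL_FILE.FCLOSE(v_file);

  -- copy the file to a bucket in Object Storage
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'MY_CRED',
    object_uri      => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/mynamespace/b/mybucket/o/my_file.csv',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'my_file.csv');

  -- optionally remove the local copy
  DBMS_CLOUD.DELETE_FILE('DATA_PUMP_DIR', 'my_file.csv');
END;
/
```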
My guess is that p_dir is a parameter to a PL/SQL procedure. If that's the case, p_dir should be set to the name of a valid database directory object. DATA_PUMP_DIR exists by default in ADW, so it could be something like this:
BEGIN
  my_file_proc(
    p_file => 'my_file.csv',
    p_dir  => 'DATA_PUMP_DIR');
END;

Related

How to pass a text file or its path as a parameter and read the file in an Oracle stored procedure

I want to pass a file path (which might be on the C or D drive or anywhere on a server) as a parameter to an Oracle stored procedure and read the file inside it. Is it possible to do? How can I do that? What kinds of variables are used to do so? I am using Oracle 12c.
"Reading a file" - as far as I understood the question - means "using UTL_FILE package".
As UTL_FILE reads data from a directory (an Oracle object which points to a filesystem directory, the one you see on the hard disk), your DBA has to
create the directory
grant you (i.e. the user who will call that procedure) read and/or write privileges on that directory
The directory is usually located on the database server; it can be its C or D drive, but not exactly "anywhere" as you put it - unless the DBA is willing to create that many directory objects (I doubt it).
Then you'll be able to access that file and work on it.
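Those two DBA steps, plus a simple read loop, might look like this (the directory path, object name, and file name are illustrative):

```sql
-- as a privileged user
CREATE OR REPLACE DIRECTORY my_dir AS '/u01/app/files';
GRANT READ, WRITE ON DIRECTORY my_dir TO some_user;

-- then, in the procedure, read the file line by line
DECLARE
  v_file UTL_FILE.FILE_TYPE;
  v_line VARCHAR2(32767);
BEGIN
  v_file := UTL_FILE.FOPEN('MY_DIR', 'input.txt', 'r');
  BEGIN
    LOOP
      UTL_FILE.GET_LINE(v_file, v_line);
      DBMS_OUTPUT.PUT_LINE(v_line);
    END LOOP;
  EXCEPTION
    WHEN NO_DATA_FOUND THEN
      NULL;  -- GET_LINE raises NO_DATA_FOUND at end of file
  END;
  UTL_FILE.FCLOSE(v_file);
END;
/
```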
Alternatively, you might even use an external table - it works like an "ordinary" table, so you can write a SELECT statement directly against it. In the background, it uses SQL*Loader. Still, the file must be located in the previously mentioned directory.
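An external table over a two-column CSV could be sketched like this (the column list and file name are made up for illustration):

```sql
CREATE TABLE emp_ext (
  empno NUMBER,
  ename VARCHAR2(30)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY my_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('input.csv')
);

-- query it like any other table
SELECT * FROM emp_ext;
```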
Finally, SQL*Loader itself: its benefit is that the file you'd like to load resides on your PC; it doesn't have to be on the server. You'd create a control file (which says how to read the file), load the data into a table and do whatever you want with it.
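A minimal control file and client-side invocation for the same hypothetical two-column table (table name, file names, and connect string are illustrative):

```
-- input.ctl
LOAD DATA
INFILE 'input.csv'
INTO TABLE emp
FIELDS TERMINATED BY ','
(empno, ename)
```

Then, from the client machine: sqlldr scott/tiger@mydb control=input.ctl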

How to declare Windows path in Oracle PLSQL

I need to load an image that is saved on a Windows machine into a BLOB column.
How do I declare the image path in Oracle PL/SQL?
Ex:
DECLARE
  dest_lob BLOB;
  -- this line reports ORA-22285 "non-existent directory or file for %s operation"
  src_lob  BFILE := BFILENAME('MY_DIR', 'C:\Users\gus\Desktop\image.jpg');
BEGIN
  INSERT INTO teste_gustavo_blob VALUES (2, EMPTY_BLOB())
  RETURNING imagem INTO dest_lob;
  DBMS_LOB.OPEN(src_lob, DBMS_LOB.LOB_READONLY);
  DBMS_LOB.LOADFROMFILE(dest_lob => dest_lob,
                        src_lob  => src_lob,
                        amount   => DBMS_LOB.GETLENGTH(src_lob));
  DBMS_LOB.CLOSE(src_lob);
  COMMIT;
END;
Note: I'm trying to insert a record into a table from a Windows machine using SQL Developer. The database is on a remote server.
The path needs to be part of the directory object definition, and the file name is just that - just the file name. Example:
CREATE DIRECTORY MY_DIR AS 'C:\Users\gus\Desktop';
..
BFILENAME('MY_DIR', 'image.jpg');
Note that the directory is created on the server, not your local machine. So if you are trying to read a file on a local machine, this will not work in PL/SQL, which runs on the server, not the client. In that case, you need to code a client-side solution.
The database can only see directories which are local to it. It cannot see files on your PC (unless the database is running on that PC). So you cannot load your file through PL/SQL.
However, you say you are using SQL Developer. You can load a BLOB by editing the table's data in the table navigator. Click on the Data tab, then edit the cell (the BLOB column of the row you want to load the file into) and use the Local Data > Load option to upload your file. Jeff Smith has published a detailed step-by-step guide on his blog.
I had a similar problem recently: I needed to test a feature and wanted a PDF file in my DB. I read a few questions here and formulated an answer. I inserted a row manually with SQL and then replaced the file content manually.
The SQL:
insert into ATTACHMENT_TABLE (id, file_content) values
(1, utl_raw.cast_to_raw('some dummy text'));
Using DBeaver, I edited the row and loaded the file from my windows PC. The end result is a row which contains the PDF file I wanted.

dbms_datapump.get_dumpfile_info can't read directory when compiled in stored procedure

I'm creating a stored procedure to load (impdp) a Datapump database dump.
I am trying to get the dump file's creation date (to compare with the date of a previously loaded dump), using DBMS_DATAPUMP.GET_DUMPFILE_INFO, like in this example.
When running in an anonymous block (like below), it runs fine, outputting the dump file's creation date. However, when this same block is adapted and compiled in a stored procedure, I get the ORA-39087 error (Directory name is invalid).
DECLARE
  dumpfile VARCHAR2(256) := 'my_file.dp';
  dir      VARCHAR2(30)  := 'MY_DIR';
  info     ku$_dumpfile_info;
  ft       NUMBER;
BEGIN
  sys.dbms_datapump.get_dumpfile_info(dumpfile, dir, info, ft);
  FOR rec IN (SELECT * FROM TABLE(info) WHERE item_code = 6) LOOP
    dbms_output.put_line(rec.value);
  END LOOP;
END;
The directory exists. The name is valid. When I run
SELECT * FROM datapump_dir_objs;
with the same user, I can see that the user has READ and WRITE privileges on the directory. Oracle version is 11g Release 11.2.0.4.0.
Any light on what I am doing wrong?
Thanks in advance.
The problem was that the READ and WRITE privileges on the directory were granted via a role. Anonymous blocks are executed with the current user's privileges, including roles, but definer's-rights stored procedures (the default) are not - roles are disabled inside them.
I added AUTHID CURRENT_USER to the procedure's header and managed to access my directory.
Thanks to Alex Poole for the insight.
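The fix, sketched as an invoker's-rights procedure (the procedure name and parameters are illustrative):

```sql
CREATE OR REPLACE PROCEDURE show_dump_date(
  p_dumpfile VARCHAR2,
  p_dir      VARCHAR2)
  AUTHID CURRENT_USER  -- invoker's rights: role-granted directory privileges apply
AS
  info ku$_dumpfile_info;
  ft   NUMBER;
BEGIN
  sys.dbms_datapump.get_dumpfile_info(p_dumpfile, p_dir, info, ft);
  FOR rec IN (SELECT * FROM TABLE(info) WHERE item_code = 6) LOOP
    dbms_output.put_line(rec.value);
  END LOOP;
END;
/
```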

How to SFTP a CSV file from PLSQL on a Oracle 10g database (10.2.0.3)

My application generates a CSV file using UTL_FILE and writes the file to a location on the DB server; then SFTP should transfer that file to a desired shared location.
The first part is done; I need help with the second one, i.e. SFTP using PL/SQL.
Thanks
While it is entirely possible to write an SFTP client in PL/SQL using the UTL_TCP package, that is unlikely to be a practical approach. In general, you have a couple of options:
Create a Java stored procedure using one of the many Java SFTP libraries and call that Java stored procedure from PL/SQL.
Create a shell script that does the SFTP using the server's command-line utilities and call that shell script either using DBMS_SCHEDULER or via a Java stored procedure.
If your Oracle database is running on Windows, you could also write a .Net stored procedure rather than a Java stored procedure in either of the two options above. A Java stored procedure, however, would be much more common.
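The shell-script option could be sketched like this, assuming a script /home/oracle/push_csv.sh on the database server that runs a batch-mode sftp or scp with key-based authentication (the script path and job name are illustrative):

```sql
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name   => 'PUSH_CSV_JOB',
    job_type   => 'EXECUTABLE',
    job_action => '/home/oracle/push_csv.sh',
    enabled    => FALSE);
  DBMS_SCHEDULER.RUN_JOB('PUSH_CSV_JOB');
END;
/
```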
If you would like to try a commercial offering you can check ORA_SFTP
You can upload a file with it with a code block similar to this:
DECLARE
  connection_id        NUMBER;
  private_key_handle   BFILE;
  private_key          BLOB;
  private_key_password VARCHAR2(500);
BEGIN
  DBMS_LOB.CREATETEMPORARY(private_key, TRUE);
  private_key_handle := BFILENAME('PGP_KEYS_DIR', 'test_putty_private.ppk'); -- directory name must be upper case
  DBMS_LOB.OPEN(private_key_handle, DBMS_LOB.LOB_READONLY);
  DBMS_LOB.LOADFROMFILE(private_key, private_key_handle, DBMS_LOB.GETLENGTH(private_key_handle));
  DBMS_LOB.CLOSE(private_key_handle);
  private_key_password := 'changeit';
  connection_id := ORA_SFTP.CONNECT_HOST('localhost', 22, 'nasko', private_key, private_key_password);
  -- upload the private key just for a demo
  ORA_SFTP.UPLOAD(connection_id, private_key, 'data.csv');
  ORA_SFTP.DISCONNECT_HOST(connection_id);
END;
/
Disclaimer: I work for DidiSoft

Taking dump of tables in oracle 10g using PL/SQL procedure

Hi, I need a quick response.
I want to take a dump of some selected tables from a schema; can anybody tell me if that is possible?
Can anybody provide a procedure we can execute to take the dump?
E.g. I have a schema, testuser, and tables (T1, T2, T3, T5, T9); I want to take a dump of T1 & T5 only.
Thanks in advance
As you are on 10g, you can do this with the Data Pump API. You need read and write access on a directory object which maps to the destination OS directory.
In the following example I am exporting two tables, EMP and DEPT, to a file called EMP.DMP in a directory identified by DATA_PUMP_DIR.
declare
  dp_handle number;
begin
  dp_handle := dbms_datapump.open(
    operation => 'EXPORT',
    job_mode  => 'TABLE');

  dbms_datapump.add_file(
    handle    => dp_handle,
    filename  => 'emp.dmp',
    directory => 'DATA_PUMP_DIR');

  dbms_datapump.add_file(
    handle    => dp_handle,
    filename  => 'emp.log',
    directory => 'DATA_PUMP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  dbms_datapump.metadata_filter(
    handle => dp_handle,
    name   => 'NAME_LIST',
    value  => '''EMP'',''DEPT''');

  dbms_datapump.start_job(dp_handle);

  dbms_datapump.detach(dp_handle);
end;
/

PL/SQL procedure successfully completed.
Derek Mahar asks: "Is there a similar data pump tool or API available for execution from the client side?"
Data Pump, both the PL/SQL API and the OS utility, writes to Oracle directories. An Oracle directory must represent an OS directory which is visible to the database. Usually that is a directory on the server, although I suppose it is theoretically possible to map a PC drive over the network. You'd have to persuade your network admin that this is a good idea; it's a tough sell, because it isn't one...
The older IMP and EXP utilities read and wrote client-side files, so it is theoretically possible to IMP a local dump file into a remote database. But I don't think this is a practical approach. By their nature dump files tend to be big, so importing across a network is slow and prone to failure. It is a much better solution to zip the dump file, copy it to the server and import it from there.
You should try using the Data Pump API (expdp/impdp). It has a lot more capabilities than the old exp and imp utilities, which it replaces, it has a PL/SQL API, and it is supported in 10g.
http://www.orafaq.com/wiki/Datapump#Invoking_from_PL.2FSQL
With this command you'll get a binary Oracle dump:
exp scott/tiger file=mydump.dmp tables=(T1,T5)
I recommend this link: http://www.orafaq.com/wiki/Import_Export_FAQ
If you must use PL/SQL, and you're trying to create a file, then you'll need to have a directory defined with write access granted to your user. That's something your DBA can do. See the "create directory" command.
At that point, you can (1) call UTL_FILE to open a file and write rows to it or (2) create an "EXTERNAL TABLE" and copy the information to it or (3) use DBMS_XMLGEN or (4) use any of several other ways to actually write the data from the database to the file. All of these are in the Oracle docs. The PL/SQL Packages and Types manual is your friend for things like this.
Note that the actual file system directory has to be on the server where the database is located. So you may need to get access to that server to copy your file, or have somebody set up a mount or whatever.
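Option (1) above could look like this, assuming a directory object EXP_DIR and a table t1 with two columns (all names are illustrative):

```sql
DECLARE
  v_file UTL_FILE.FILE_TYPE;
BEGIN
  v_file := UTL_FILE.FOPEN('EXP_DIR', 't1.csv', 'w');
  -- write one CSV line per row
  FOR rec IN (SELECT col1, col2 FROM t1) LOOP
    UTL_FILE.PUT_LINE(v_file, rec.col1 || ',' || rec.col2);
  END LOOP;
  UTL_FILE.FCLOSE(v_file);
END;
/
```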
Alternatively, you could set up a PL/SQL web service that you could call to get your data.
But, personally, I'd just use exp. Or, if that's not available, Toad or some other front end tool (even SQL*Plus) where you can just write a simple SQL script and save the results.
If you're doing this for a homework assignment, my guess is they'll want a UTL_FILE solution.
