Load CSV from S3 to Oracle

I have a requirement to load a CSV file kept in an AWS S3 location into an Oracle table.
I saw the LOAD DATA statement in the Oracle documentation (code below), but there is no option to load data from AWS S3 into an Oracle table. Does anyone know if there is a feature in Oracle to do this?
LOAD DATA
[LOW_PRIORITY | CONCURRENT] [LOCAL]
INFILE 'file_name'
[REPLACE | IGNORE]
INTO TABLE tbl_name
[PARTITION (partition_name [, partition_name] ...)]
[CHARACTER SET charset_name]
[{FIELDS | COLUMNS}
[TERMINATED BY 'string']
[[OPTIONALLY] ENCLOSED BY 'char']
[ESCAPED BY 'char']
]
[LINES
[STARTING BY 'string']
[TERMINATED BY 'string']
]
[IGNORE number {LINES | ROWS}]
[(col_name_or_user_var
[, col_name_or_user_var] ...)]
[SET col_name={expr | DEFAULT}
[, col_name={expr | DEFAULT}] ...]

From my point of view, if you have a CSV file, then SQL*Loader is the tool to use.
It is installed along with every Oracle database (so, if you have access to an on-premise server, you have it). Even better, you can use the tool from your own PC: SQL*Loader is one of Oracle's utilities and can be installed from the Client software. Or, if you install Oracle XE (Express Edition) on your PC, that includes it too.
In short: you create a control file which specifies the data source (your CSV file), the target table, which columns you'll be loading, and so on. Then you run it from the operating system command prompt, and that's it.
SQL*Loader is really, really fast.
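As an illustration only (the table, columns, and file names below are invented, not from the question): suppose the target table is employees (id, name, hire_date) and the file is employees.csv. A minimal control file, employees.ctl, might look like:
LOAD DATA
INFILE 'employees.csv'
APPEND
INTO TABLE employees
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
id,
name,
hire_date DATE "DD-MM-YYYY"
)
You would then run it from the operating system command prompt (the credentials here are placeholders):
sqlldr userid=scott/tiger@db control=employees.ctl log=employees.log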
Another option is to use external tables feature; its drawback is that you do have to have access to the database server (actually, to a directory - an Oracle object which points to a file system directory which is usually located on the database server). If you aren't a DBA, you'll have to talk to one as they have to grant you required privileges.
If you use it, then you'd access the CSV file from SQL, i.e. you can query the file as if it was an ordinary Oracle table and use e.g.
insert into target_table (col1, col2, ...)
select col1, col2, ... from csv_file_as_external_table
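For completeness, here is a sketch of how such an external table might be defined; the directory path, columns, and file name are illustrative, and the CREATE DIRECTORY and GRANT are the part the DBA has to do:
-- done by the DBA: point an Oracle directory object at a file system directory
create directory csv_dir as '/data/csv_files';
grant read, write on directory csv_dir to your_user;

-- the external table itself, matching the query above
create table csv_file_as_external_table (
col1 number,
col2 varchar2(50)
)
organization external (
type oracle_loader
default directory csv_dir
access parameters (
records delimited by newline
fields terminated by ','
)
location ('your_file.csv')
);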

Related

How to analyze oracle dump file

I need to analyze a large Oracle DMP file. So far, I have no experience with Oracle.
I know that the database contains information about certain people, for example a person with the name Smith.
I don't know how the database is structured (which table contains which information, are there triggers, ...).
As long as I don't know which tables I have to search, the best way I have found to work with the database files is to use grep.
This way, I can at least verify that the database really does contain the name "Smith".
Ultimately, I would like to have an SQL dump that can be viewed, filtered and understood in a text editor.
The DMP file was created with
expdp system/[PW] directory=[expdp_dir] dumpfile=[dumpfile.dmp] full=yes logfile=[logfile.log] reuse_dumpfiles=y
I know that the name Smith occurs often in the Database. Running grep -ai smith dumpfile.dmp returns many hits.
To analyze the database further I installed oracle-database and sqldeveloper-20.2.0.175.1842-x64. I imported the DMP file with
impdp USERID=system/[PW] FULL=y FILE=[dumpfile.dmp]
The folder C:\app\[user]\oradata\orcl now contains the files SYSAUX01.DBF and SYSTEM01.DBF, among others.
I suspect that these are the database files.
The command grep -ai smith *.DBF does not return any hits.
Either the files SYSAUX01.DBF and SYSTEM01.DBF are not the database files, or something went wrong during the import.
Using the SQL developer, I log in with the following data:
User: system
Password: [PW] (= PW from the expdp command)
SDI: orcl
In SQL developer, I do not find Smith. SQL developer displays many tables, most of which seem
to be empty and none of which I understand. I suspect that these tables are not the tables I am looking for. Perhaps I need to log in a different way (different user, different SDI?).
I tried to export the database to an SQL dump file, trying out various options that SQL developer provides,
but the result does not contain the string "Smith".
Something is not right:
Import is faulty
wrong SDI
Export is faulty
anything else
What might have gone wrong along the way?
You have a lot of misconceptions in your question.
Oracle Data Pump is a database utility designed for exporting and importing. Its content, whether DDL commands (such as CREATE TABLE or CREATE INDEX) or data from the tables, is stored in a binary format, so you can't read the contents of those files directly. There are options to extract the DDL commands from the dump file and put them into a script.
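For example, Data Pump import has a SQLFILE option that writes the DDL from the dump file into a script without actually importing anything. A sketch using the placeholders from the question (ddl.sql is an arbitrary output name):
impdp system/[PW] directory=[expdp_dir] dumpfile=[dumpfile.dmp] sqlfile=ddl.sql full=y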
The datafiles you mentioned are part of the database itself; they have nothing to do with Data Pump. Do not touch those files.
I don't know what you mean by "Smith". If you mean a schema, then after importing, select from dba_users looking for username = 'SMITH'.
If you mean "Smith" as data in any of those tables, you will have to search every single table in the database (except those in schemas belonging to Oracle) and every column that is a string; see the sketch at the end of this answer.
"SDI" does not mean anything. I guess you meant SID, or Oracle System ID, a unique identifier for a database in a specific environment.
There is nothing wrong. The problem, I believe, is that you don't know exactly what you are looking for.
Check these:
A user/schema with the name SMITH:
SQL> SELECT USERNAME FROM DBA_USERS WHERE USERNAME = 'SMITH';
A table whose name contains the word SMITH (unlikely):
SQL> SELECT TABLE_NAME FROM DBA_TABLES WHERE TABLE_NAME LIKE '%SMITH%';
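If you really do need to scan the data for "Smith", here is a rough brute-force sketch (not from the original answer): run it as a privileged user with serveroutput on; the list of Oracle-maintained schemas to skip is illustrative, not complete.
set serveroutput on size unlimited
declare
  l_count integer;
begin
  -- loop over every string column outside Oracle-maintained schemas
  for c in (select owner, table_name, column_name
              from dba_tab_columns
             where data_type in ('CHAR', 'VARCHAR2', 'NVARCHAR2')
               and owner not in ('SYS', 'SYSTEM', 'XDB', 'MDSYS', 'ORDSYS', 'CTXSYS', 'WMSYS', 'OLAPSYS'))
  loop
    begin
      execute immediate
        'select count(*) from "' || c.owner || '"."' || c.table_name ||
        '" where upper("' || c.column_name || '") like ''%SMITH%'''
        into l_count;
      if l_count > 0 then
        dbms_output.put_line(c.owner || '.' || c.table_name || '.' ||
                             c.column_name || ': ' || l_count || ' row(s)');
      end if;
    exception
      when others then
        null; -- skip objects that cannot be queried (e.g. broken views)
    end;
  end loop;
end;
/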

read and insert data from text file to database table using oracle SQL Plus

I really need your help.
I always work on SQL Server, but now I am working on something else, and that is why I need your help.
I am working with Oracle SQL*Plus. I have a text file, let's say test.txt, and I just want to upload the data from this file to a database table using SQL*Plus.
Let's say the text file contains:
001,mike,1-1-2018
002,jon,20-12-2017
003,bill 25-5-2018
How can I write PL/SQL in SQL*Plus to upload the data from the text file to the table in my database?
On SQL Server I usually use BULK INSERT; what is the method here?
I tried many things from the internet but nothing solved it.
Please help me.
Thanks a lot
If the text file is on the same machine you're running SQL*Plus from, you can use the SQL*Loader utility.
As a simple example, let's say your table is:
create table your_table (id number, name varchar2(10), some_date date);
And you have a text file data.txt containing what you showed, but with a comma added on the third line:
001,mike,1-1-2018
002,jon,20-12-2017
003,bill,25-5-2018
You can create a basic SQL*Loader control file in the same directory, called say your_table.ctl, with something like:
LOAD DATA
INFILE 'data.txt'
APPEND
INTO TABLE your_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
ID,
NAME,
SOME_DATE DATE "DD-MM-YYYY"
)
Look at the documentation to see what all those mean, particularly what APPEND means; you may want to TRUNCATE instead - but be careful with that.
Then run SQL*Loader from the command line (not from within SQL*Plus), using the same credentials and connect string you normally use to connect to the database:
sqlldr userid=usr/pwd@tns control=your_table.ctl
Once that has completed - assuming there are no errors reported on the console or in the log file it creates - querying your table will show:
select * from your_table;
ID NAME SOME_DATE
---------- ---------- ----------
1 mike 2018-01-01
2 jon 2017-12-20
3 bill 2018-05-25
There are lots of other options and capabilities, but that might cover what you need at the moment.

Oracle: export a table with blobs to an .sql file that can be imported again

I have a table "Images" with two fields:
Name VARCHAR2
Data BLOB
I would like to export that table to a .sql file which I could import on another system. I tried to do so using the "Database unload" assistant of Oracle SQL Developer. However, the generated file just has the content for the names in it, not the data. Thus, after importing, I would end up with all the names, but the data field would be null everywhere.
I'd really prefer it to be just one file (I saw some examples that included dumping the data to one file per field on the file system...)
Is it possible to generate such a script with SQL Developer? or is there any other way/tool to do so?
I don't think this is possible with SQL Developer (but then I don't use it very often).
The SQL client I am using - SQL Workbench/J - can do this.
There are several ways to export this data.
Generate a proprietary script
It can create a SQL script that uses a special (tool specific) notation to reference an external file, something like:
INSERT INTO images
(name, data)
VALUES
('foobar', {$blobfile='blob_r1_c2.data'});
The above statement can only be executed with SQL Workbench again. It is not compatible with any other SQL client.
Use utl_raw
Another alternative is to use a "blob literal", but due to Oracle's limit on 4000 bytes for a character literal, this only works for really small blob values:
INSERT INTO images
(name, data)
VALUES
('foobar', to_blob(utl_raw.cast_to_raw('......')));
where the character literal for the cast_to_raw call would contain the hex values of the BLOB. As this requires 2 characters per "blob byte", you can't handle BLOBs larger than 2000 bytes with that. But that syntax would work for nearly all Oracle SQL tools (if they can handle scripts with very long lines).
SQL*Loader input file
The third alternative is to export the data into a text file that can be imported using SQL*Loader:
The text file would contain something like this:
NAME DATA
foobar blob_r1_c2.data
Together with the following SQL*Loader control file:
OPTIONS (skip=1)
LOAD DATA CHARACTERSET 'WE8ISO8859P15'
INFILE 'images.txt'
APPEND
INTO TABLE IMAGES
FIELDS TERMINATED BY '\t' TRAILING NULLCOLS
(
NAME,
lob_file_data FILLER,
DATA LOBFILE(lob_file_data) TERMINATED BY EOF
)
This can be loaded using SQL*Loader and thus doesn't need SQL Workbench to import the data.
More details are in the manual.
Edit
As Alex has pointed out in his comment, you can also use a DataPump export - but that requires that you have access to the file system on the server. The above solutions all store the data on the client.
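A rough sketch of that Data Pump route (the directory object dp_dir, credentials, and file names are made up; the directory must point to a location on the database server):
expdp your_user/your_password@db directory=dp_dir dumpfile=images.dmp logfile=images_exp.log tables=images

impdp your_user/your_password@other_db directory=dp_dir dumpfile=images.dmp logfile=images_imp.log tables=images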
If you absolutely need to use a single .sql file to import the BLOB, you can generate the script using PL/SQL:
set serveroutput on
declare
  lob_in      blob;
  lob_size    integer;
  buffer_size integer := 1000;
  buffer      raw(32767);
begin
  select data, dbms_lob.getlength(data)
    into lob_in, lob_size
    from images
   where name = 'example.png';

  for i in 0 .. (lob_size / buffer_size) loop
    buffer := dbms_lob.substr(lob_in, buffer_size, i * buffer_size + 1);
    dbms_output.put('dbms_lob.append(lob_out, hextoraw(''');
    dbms_output.put(rawtohex(buffer));
    dbms_output.put_line('''));');
  end loop;
end;
/
Its output will be the BLOB's content encoded like:
dbms_lob.append(lob_out, hextoraw('FFD8FFE0...0000'));
dbms_lob.append(lob_out, hextoraw('00000000...0000'));
...
dbms_lob.append(lob_out, hextoraw('007FFFD9'));
Which you can load into an already inserted row with PL/SQL:
declare
  lob_out blob;
begin
  select data
    into lob_out
    from images
   where name = 'example.png'
     for update;

  dbms_lob.append(lob_out, hextoraw('FFD8FFE0...0000'));
  dbms_lob.append(lob_out, hextoraw('00000000...0000'));
  ...
  dbms_lob.append(lob_out, hextoraw('007FFFD9'));
end;
/
Just remember the resulting .sql file will be huge.
Thanks for your answer. I used the third alternative.
First I downloaded SQL Workbench/J. Then I used the following command to make an export:
WbExport -type=text -file='c:\temp\Images' -delimiter='|' -decimal=',' -sourcetable=Images -formatfile=oracle;
This produced an Images.txt file, many Images_r*_c2.data files, and an Images.ctl file.
I could then use the following command to import:
sqlldr myuser/mypassword@myhost control=Images.ctl
This is definitely possible in SQL Developer.
First you need to export the table from the source location, choosing the appropriate table(s):
Tools > Database Export
Select the output format as loader, rather than the insert or excel formats we normally use.
Following these steps creates the sqlldr control files and data files, and also the CREATE TABLE DDL if you chose that option. You can use them to import (sqlldr) the data at the destination.
This is a better solution and is portable in terms of extraction and distribution. It gives the flexibility of delivering components to be deployed through code repositories.
Here is a link that explains it step by step.
Exporting Multiple BLOBs with Oracle SQL Developer
SQL Workbench uses a special file format for BLOB data, in addition to the .sql file. If you can accept such files, an even simpler solution is to use Oracle's original exp and imp utilities. (They are deprecated but, unlike Oracle Data Pump, they do not require access rights on the server.)
Here is a nice tutorial on the export part.
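A minimal sketch of that exp/imp route (the connection details and file names are made up):
exp myuser/mypassword@sourcedb tables=images file=images.dmp log=exp.log

imp myuser/mypassword@targetdb file=images.dmp log=imp.log full=y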

Insert a text file into Oracle with Bulk Insert

I have a text file, place.file;
place.file
New Hampshire
New Jersey
New Mexico
Nevada
New York
Ohio
Oklahoma
....
There are 4000 place names in this file. I will match my my_place table in Oracle against place.file, so I want to insert the contents of place.file into Oracle. Maybe I should use a bulk insert - how can I do a bulk insert?
You can use SQL*Loader from Oracle.
The syntax is:
sqlldr <connection_string> control=<control_file.ctl>
The control file contains:
LOAD DATA
INFILE place.file
INTO TABLE <table_name>
FIELDS TERMINATED BY <delimiter>
OPTIONALLY ENCLOSED BY <enclosing character>
(<column_name>[, <column_name>, <column_name>])
No mention of an Oracle version. (For the best possible answer, always include Oracle version, Oracle edition, OS, and OS version.)
However, you should investigate using an external table for this purpose. Once you have that set up correctly, you can do:
insert into db_table select ... from external_table;
Optionally, you could use the APPEND hint on the INSERT statement, to use direct load.
Also, optionally, you could set the NOLOGGING attribute on the table you're loading the data into, for best performance. But consider the recovery implications before you enable NOLOGGING.
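For example, a sketch combining both suggestions (db_table and external_table are the generic names used above; re-enable logging afterwards, and take a backup, since direct-path NOLOGGING loads are not recoverable from the redo):
alter table db_table nologging;

insert /*+ APPEND */ into db_table
select ... from external_table;

commit;

alter table db_table logging;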
Hope that helps,
-Mark

Oracle: Import CSV file

I've been searching for a while now but can't seem to find an answer, so here goes...
I've got a CSV file that I want to import into a table in Oracle (9i/10g).
Later on I plan to use this table as a lookup for another use.
This is actually a workaround I'm working on, since querying with an IN clause of more than 1000 values is not possible.
How is this done using SQL*Plus?
Thanks for your time! :)
SQL*Loader helps load CSV files into tables: SQL*Loader
If you want SQL*Plus only, then it gets a bit complicated. You need to locate your SQL*Loader control file and CSV file, then run the sqlldr command.
Another solution you can use is SQL Developer.
With it, you have the ability to import from a csv file (other delimited files are available).
Just open the table view, then:
choose actions
import data
find your file
choose your options.
You have the option to have SQL Developer do the inserts for you, create an sql insert script, or create the data for a SQL Loader script (have not tried this option myself).
Of course all that is moot if you can only use the command line, but if you are able to test it with SQL Developer locally, you can always deploy the generated insert scripts (for example).
Just adding another option to the 2 already very good answers.
An alternative solution is using an external table: http://www.orafaq.com/node/848
Use this when you have to do this import very often and very fast.
SQL*Loader is the way to go.
I recently loaded a table from a CSV file; I was new to this concept and would like to share an example.
LOAD DATA
infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
REPLACE
INTO TABLE LOAN_BALANCE_MASTER_INT
fields terminated by ',' optionally enclosed by '"'
(
ACCOUNT_NO,
CUSTOMER_NAME,
LIMIT,
REGION
)
Place the control file and the CSV at the same location on the server.
Locate the sqlldr executable and invoke it:
sqlldr userid/passwd@DBname control=
Ex: sqlldr abc/xyz@ora control=load.ctl
Hope it helps.
Somebody asked me to post a link to the framework that I presented at Open World 2012. This is the full blog post that demonstrates how to architect a solution with external tables.
I would like to share 2 tips: (tip 1) create a csv file; (tip 2) load rows from a csv file into a table.
====[ (tip 1) SQLPLUS to create a csv file from an Oracle table ]====
I use SQLPLUS with the following commands:
set markup csv on
set lines 1000
set pagesize 100000 linesize 1000
set feedback off
set trimspool on
spool /MyFolderAndFilename.csv
Select * from MYschema.MYTABLE where MyWhereConditions ;
spool off
exit
====[ (tip 2) SQLLDR to load a csv file into a table ]====
I use SQLLDR and a csv (comma separated) file to add (APPEND) rows from the csv file to a table.
The file has commas between fields; text fields have a double quote before and after the text.
CRITICAL: if the last column is null, there is a comma at the end of the line.
Example of data lines in the csv file:
11,"aa",1001
22,"bb',2002
33,"cc",
44,"dd",4004
55,"ee',
This is the control file:
LOAD DATA
APPEND
INTO TABLE MYSCHEMA.MYTABLE
fields terminated by ',' optionally enclosed by '"'
TRAILING NULLCOLS
(
ColumnName1,
ColumnName2,
ColumnName3
)
This is the command to execute sqlldr on Linux; if you run it on Windows, use \ instead of / in file paths (e.g. c:\...):
sqlldr userid=MyOracleUser/MyOraclePassword@MyOracleServerIPaddress:port/MyOracleSIDorService DATA=datafile.csv CONTROL=controlfile.ctl LOG=logfile.log BAD=notloadedrows.bad
Good luck!
From Oracle 18c you could use Inline External Tables:
Inline external tables enable the runtime definition of an external table as part of a SQL statement, without creating the external table as a persistent object in the data dictionary.
With inline external tables, the same syntax that is used to create an external table with a CREATE TABLE statement can be used in a SELECT statement at runtime. Specify inline external tables in the FROM clause of a query block. Queries that include inline external tables can also include regular tables for joins, aggregation, and so on.
INSERT INTO target_table(time_id, prod_id, quantity_sold, amount_sold)
SELECT time_id, prod_id, quantity_sold, amount_sold
FROM EXTERNAL (
(time_id DATE NOT NULL,
prod_id INTEGER NOT NULL,
quantity_sold NUMBER(10,2),
amount_sold NUMBER(10,2))
TYPE ORACLE_LOADER
DEFAULT DIRECTORY data_dir1
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY '|')
LOCATION ('sales_9.csv') REJECT LIMIT UNLIMITED) sales_external;
