How to use a default value in a LOAD statement in DB2?

I am currently using DB2 and do not know much about the LOAD statement.
I am using this query to load data:
LOAD FROM "IXAC.CSV" OF DEL METHOD P ('IX',1,2,3,4,) MESSAGES
"SYAC.MSG" INSERT INTO SYNC.AC_COUNT ( "TYPE", AC1, AC2, AC3,
AC4 ) ; COMMIT;
In "IXAC.CSV" there are 4 int values separated with comma. My problem is that how can i insert 'IX' with load statement as a constant with each row insert.
I tried this but not found any success. I am newer in database.
Help me ..
Thanks in advance ...

Change your table definition in the database so that the target column (it looks like you mean "TYPE"?) has a default value of 'IX'.
Then do the load as normal, leaving that column out of the column list.
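A minimal sketch of that approach, assuming DB2 for LUW and that "TYPE" is the column that should always receive 'IX' (check that LOAD applies column defaults for omitted columns in your version; an insert-based utility such as IMPORT applies them in any case):
-- give the column a default value
ALTER TABLE SYNC.AC_COUNT ALTER COLUMN "TYPE" SET DEFAULT 'IX';
-- load only the four integer fields; "TYPE" is left out of the column list
LOAD FROM "IXAC.CSV" OF DEL METHOD P (1, 2, 3, 4) MESSAGES "SYAC.MSG"
  INSERT INTO SYNC.AC_COUNT (AC1, AC2, AC3, AC4);
COMMIT;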

If you are able to edit the .csv file, a workaround is to use a text editor that supports wildcards or regular expressions in its find/replace feature (such as UltraEdit) and replace each carriage return/line feed with a CR/LF followed by "IX," (quotes optional, depending on whether you want to specify a text delimiter on insert). Then your .csv file will have all your data.
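For example, with hypothetical data the file would go from
1,2,3,4
5,6,7,8
to
IX,1,2,3,4
IX,5,6,7,8
(the first line has no preceding CR/LF, so it has to be prefixed by hand), and the LOAD can then pick up 'IX' as an ordinary first field.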

Related

How to load data into Oracle using SQL Loader with skipping and merging columns?

I am trying to load data into an Oracle database using SQL*Loader.
My data looks like the following:
1|2|3|4|5|6|7|8|9|10
I do not want to load the first and last columns into the table;
I want to load 2|3|4|5|6|7|8|9 into one field.
The table I am trying to load into has only one field, named 'field1'.
If anyone has this kind of experience, could you give some advice?
I tried BOUNDFILLER, FILLER and so on, but could not make it work.
Help me. :)
Load the entire row from the file into a BOUNDFILLER, then extract the part you need into the column. You have to tell sqlldr that the field is terminated by the carriage return/line feed (assuming a Windows OS) so it will read the entire line from the file as one field. In the control file below, the whole line from the file is read into "dummy" as a BOUNDFILLER. "dummy" does not match a column name, and it's defined as BOUNDFILLER anyway, so the whole row is "remembered". The next line in the control file starts with a column that DOES match a column name, so sqlldr evaluates the expression: it extracts a substring from the saved "dummy" and puts it into the "col_a" column.
The regular expression, in a nutshell, returns the part of the string after but not including the first pipe, and before but not including the last pipe. Note the double backslashes. A pipe symbol is normally a logical OR in a regex, so it has to be escaped (except between square brackets); in my environment, at least, a single backslash gets stripped when passing from sqlldr to the regex engine, so two backslashes are required in the control file for one to get through in the end. If you have trouble here, try one backslash and see what happens. Guess how long THAT took me to figure out!
load data
infile 'x_test.dat'
TRUNCATE
into table x_test
FIELDS TERMINATED BY x'0D0A'
(
dummy BOUNDFILLER,
col_a expression "regexp_substr(:dummy, '[^|]*\\|(.+)\\|.*', 1, 1, NULL, 1)"
)
EDIT: Use this to test the regular expression. For example, if there is an additional pipe at the end:
select regexp_substr('1|2|3|4|5|6|7|8|9|10|', '[^|]*\|(.+)\|.*\|', 1, 1, NULL, 1)
from dual;
2nd edit: For those uncomfortable with regular expressions, this method uses nested SUBSTR and INSTR functions:
SQL> with tbl(str) as (
select '1|2|3|4|5|6|7|8|9|10|' from dual
)
select substr(str,
              instr(str, '|') + 1,
              instr(str, '|', -1, 2) - 1 - instr(str, '|')) after
from tbl;
AFTER
---------------
2|3|4|5|6|7|8|9
Deciding which is easier to maintain is up to you. Think of the developer after you and comment at any rate! :-)

sqlldr WHEN clause

I am trying to code a WHEN clause in a sqlldr.ctl file to limit the records imported to those matching a portion of the current schema's name.
The code I have (which does NOT work) is:
LOAD DATA
TRUNCATE INTO TABLE TMP_PRIM_ACCTS
when REGION_NUM = substr(user,-3,3)
Fields terminated by "|" Optionally enclosed by '"'
Trailing NULLCOLS
( PORTFOLIO_ACCT,
PRIMARY_ACCT_ID NULLIF (PRIMARY_ASSET_ID="NULL"),
REGION_NUM NULLIF (PARTITION_NUM="NULL")
)
sqlldr returns:
SQL*Loader-350: Syntax error at line 3.
Expecting quoted string or hex identifier, found "substr".
when PARTITION_NUM = substr(user,-3,3)
I cannot put single quotes around "user", because that turns it into the literal string "user". Can anyone explain how I can reference the "active" User in this WHEN Clause?
Thank you!
Can you try something like this? (I can't test with SQL*Loader right now, but this is the syntax I have used for changing values):
when REGION_NUM = "substr(:user,-3,3)"
It doesn't look like you can. The documentation only shows comparisons against fixed values.
Trying to use an expression in the WHEN clause (or in NULLIF; I thought I'd try to see if you could cause a rejection based on a null PK value), you just see the literal value in the log:
Table TMP_PRIM_ACCTS, loaded when REGION_NUM = 0X73756273747228757365722c2d332c3329(character 'substr(user,-3,3)')
which is sort of what you referred to when you said you couldn't quote user, but you'd have to quote the whole thing anyway. Using :user doesn't work either; the colon is seen as just another character, so it doesn't try to find a column called user.
The simplest approach may be to pre-process the data file and remove any rows which don't match the pattern (e.g. via a regex). That would actually be slightly easier if you used an external table instead of SQL*Loader.
Alternatively, generate your control file and embed the correct literal value based on the user you'll connect as.
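A rough sketch of the external table route (the directory object data_dir, the file name and the external table name are assumptions for illustration; the columns follow the control file above):
-- external table over the same pipe-delimited file
CREATE TABLE prim_accts_ext (
  portfolio_acct  VARCHAR2(30),
  primary_acct_id VARCHAR2(30),
  region_num      VARCHAR2(10)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('prim_accts.dat')
)
REJECT LIMIT UNLIMITED;
-- the filter the WHEN clause could not express becomes an ordinary WHERE clause
INSERT INTO tmp_prim_accts (portfolio_acct, primary_acct_id, region_num)
SELECT portfolio_acct, primary_acct_id, region_num
FROM   prim_accts_ext
WHERE  region_num = SUBSTR(USER, -3, 3);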

How to perform addition in a SQL*Loader control file

I have a fixed-length data file a.dat with the below data in it:
1234544550002200011000330006600000
My focus is on two specific positions:
POSITION(1:4)
POSITION(5:8)
I want to add the values in these 2 positions and insert the sum into a field named Qty in XYZ_Table.
I am trying the following in my CTL file, but it fails and I don't know how to pursue it further.
LOAD DATA
INFILE '$SOME_DATA/a.dat'
APPEND
PRESERVE BLANKS
INTO TABLE XYZ_Table
(QTY POSITION(1:4)+POSITION(5:8) "to_number(:QTY)")
I need to achieve this addition in SQL*Loader only.
If the above methodology is not possible, it would be great if you could help me with a different approach.
P.S.: What I am trying to achieve is just one part of a bigger CTL file.
You need to identify the positions you want to add together, but not load into their own columns, as BOUNDFILLER, which means: don't load them, but remember them for use in an expression later. Then use them like this:
LOAD DATA
infile test.dat
append
preserve blanks
INTO TABLE X_test
TRAILING NULLCOLS
(val_1 BOUNDFILLER position(1:4)
,val_2 BOUNDFILLER position(5:8)
,qty ":val_1 + :val_2"
)
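For the sample record above, positions 1:4 hold 1234 and positions 5:8 hold 5445, so qty would be loaded as 6679 (Oracle converts the two character fields to numbers implicitly when it evaluates the expression).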

While exporting data from Oracle PL/SQL to Excel, a format issue occurs

I am trying to export data from Oracle PL/SQL to Excel.
The data type of one of the columns is VARCHAR2.
The column contains values like 00798019859217.
But after exporting, the value in Excel is something like 7.9802E+11.
Please let me know how to resolve this and the reason for this format issue.
Thanks in advance.
Do the following:
Select the column prefixed with a single quote, say:
SELECT ("'" || COLUMN_NAME) AS COLUMN_NAME, OTHER_COLUMNS FROM MY_TABLE
Output will be like:
'ABC0157976
'00798019859217
Export the output to Excel. In Excel, the column "A" values will be:
'ABC0157976
00798019859217 (the single quote will not be visible for number-only values)
Select the entire "A" column and clear all single quotes with the Replace All option. You will get the final Excel output as:
ABC0157976
00798019859217
Since it is a text field and non-numeric characters are also expected to be present, step 3 is required. If it will only ever contain numeric characters, then step 3 can be skipped.
This is happening because Excel auto-casts the value to a number. As #HamidP said, try exporting as CSV and open it in Notepad or another text editor to check whether the number is displayed fully. If so, you can open the same file in Excel after a small settings change:
Right-click the cell, choose the Format Cells option, set the cell format to Text, and then save the Excel file.

SQL*Loader - How can I ignore certain rows with a specific character

If I have a CSV file that is in the following format:
"fd!","sdf","dsfds","dsfd"
"fd!","asdf","dsfds","dsfd"
"fd","sdf","rdsfds","dsfd"
"fdd!","sdf","dsfds","fdsfd"
"fd!","sdf","dsfds","dsfd"
"fd","sdf","tdsfds","dsfd"
"fd!","sdf","dsfds","dsfd"
Is it possible to exclude any row where the first column has an exclamation mark at the end of the string?
i.e. it should only load the following rows:
"fd","sdf","rdsfds","dsfd"
"fd","sdf","tdsfds","dsfd"
Thanks
According to the Loading Records Based on a Condition section of the SQL*Loader Control File Reference (11g):
"You can choose to load or discard a logical record by using the WHEN clause to test a condition in the record."
So you'd need something like this:
LOAD DATA ... INSERT INTO TABLE mytable WHEN mycol1 NOT LIKE '%!'
(mycol1.. ,mycol2 ..)
But the LIKE operator is not available! You only have = and !=
Maybe you could try an External Table instead.
I'd stick a CONSTRAINT on the table and just let those rows be rejected, and maybe delete them after the load. Or use a unix "grep -v" to clear them out of the file.
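A minimal sketch of the constraint idea, reusing the hypothetical mytable/mycol1 names from the earlier snippet (with a conventional-path load, rows that violate the check go to the bad file; a direct-path load handles check constraints differently):
ALTER TABLE mytable
  ADD CONSTRAINT mycol1_no_bang CHECK (mycol1 NOT LIKE '%!');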
