Oracle insert statement with BLOB does not work

Hi, I got a DB dump from a MySQL test database and I want to insert it into an Oracle database with the same table structure, but I am getting the following error:
[42000][972] ORA-00972: identifier is too long
This is the statement:
INSERT INTO aic.keystore (id, alias, keystore_bytes, password, type, description) VALUES (61, 'test', hextoraw(0x30820B5534002010330820A4A06092A864886F70D010701A0820A3B04820A3730820A333082058806092A864886F70D010701A082057904820575308205713082056D060B2A864886F70D010C0A0102A08204FA308204F63028060A2A864886F70D010C0103301A0414DE03192F0E0A847E7E03FA8F08E27ADE383E2A4902020400048204C81BD11FB045C9043D446FB50851C0792667310D1EFF7CF87AB122B01448E73F30873182A22DACDC2D630CA0C2EDB79D6ACEDC4E1F69BE37E535BE7838B149685A661AA829A457D5FC87EEE7BA3D8ADD6E4D8996A258B16EECC9706085414832A49F60060E55CEF434DF30D58BC77F275F4EA9B1B52DD8A9CC7EC6911390EE716C5C31C512DA5947FFB8DBDB48240921DABE0487F79DEDFCC739EFF011A4672FA7DA9626C053BB5075A94F51BCB322CFF9AD6CCE99EC5AF09E6628A4AA9724A05208AB90530B963890D8BEA0146B7CD6B53C09866F808878AC98824A8A1489216EE2951FBF024BD364BD7385D5FF7552E9A0110FE37E38C3219EAA8D7EFB6570D07203D6E3F17214FD665DF18B875161E85B15EB378A557A1EC8A6C56B97E2FE24C1350A6C937C082EBC00D029D0C5C0791E0A45C9BF785EC7D02AF40DFF5014B500E1FB4E6893934231F0BBFDD3DB0B0C3D6B8595E5BC58DD28C209ECE234908DCAFFADCA1E9FFDECEBC02A195F63964287CFE542CF03E132410AD77923F480C5CF9F4F0107BE672AA06184C86ABD743A6D06F20A73E97EB7EEA9761C72338077B4A3A07089028D32909308BA87D2166EBD8BF4EB5261D33784F7C4D0DAF2FE69CF3E60A38887932EF6B227BCF56950B9FE74FD29E35968E5FFF74889413C56E47C20973DC13E8700C2570930594F211B93898BC8CB3AB4F602DD9C457DC9FDA57A14C257EC3D1454CBD7545092AB5C11C8670E82410973DB2E6DECB7AEBFDD0585665E7C24E5227536B154326AF3C6545EA01403555DC84CCC9B4714FB23308247455C814F293B5FAA6A5C48B20B038006D0B80923E7F609890350A25DEDC333F9C636A76360F226C13A8664B6E2DA44EBA4AFC45C32102A674B461CADC86B14C80AF7924E2F7BCC2F7FBFF1F73AA4B928EF7D4250468B6754C4639FD819C6BD5D411C423C8C6FB752AC244C5BD2D3B4609E60E277B1A827F1F337C398D663C7349044C49EC021F6382BF7CA90C80DDB59BB52E63D00C6302C9D4F876DCD605E138D60EA306A6EF4DF2B2CDF3B6092EA8D5B1DDF6B6C5ABF3A7A5A626CD5BC41040E61D3D2C79F1CE1963F5DB384323C0C429B414E7C81D86D9B68586A110BF3A6B429F867A7122326CF106FBD8EEB88621AEB030ACDB71BCED42A44F2A0C1B73DE68F6C7C12E015A4717B6AC664F280A47859CE7F16934173363EE1374ABB8CFE6849E621563239A7195605E2E7EA571686808609057E1AA02466AA5EB9A9DFB18693C2606D7FF7C4F3CA66C267F13EEBFEBA7C33BF199056706963A2E499FF26940DFD0A17462506DE0ACFDAA043829A732C1EF2ED12412743CD557A1261FC15F8B4FE374929D060FD15D7E032E6E743B3EC838AF5B99F9A1609457E064BE62DE0513F86F1D1FB9A39008B5E6BB5A60225F1D5915FE3E9E79661F1D73A10195985FFAEDFFCDF68C77BE3E6E46DDBF0B204D4377813566B7D695A822054D5DB052065A7C23A622A73208402DFD9C98CB1D785E239EE8B7FB8272374ECCE946128B74959E8ACAE9366773E4F2FD422F87AC71A30B0EDA25D3865DD33D84C6BF6C5B7893848FCEEDA666FE2558E2CBEAE41BB0A235926CFD5FA292C6510661487D08A0A475C0776D0D6CBDB3E1275DD42EB4A8B7C702C8D102576815CEB80434606B5EF4557A055C8FC8928228BD4472AFC3F2CDC87B828F6281C134E636DBA488EDC4AC38D177D033160303B06092A864886F70D010914312E1E2C0073006500720076006900630065006100630063006F0075006E0074002D00730061006D006C002D00700077302106092A864886F70D0109153114041254696D652031343836393937323733363833308204A306092A864886F70D010706A0820494308204900201003082048906092A864886F70D0107013028060A2A864886F70D010C0106301A0414167A515711D35DEE864529821AABC3A221E4F887020204008082045022F58215DA63413260D3B4F87D4BABC2D4CBC6A12AF3086AA0F7FA18A722C022A18B3D9F8AA12397EC427233EE8CD90B2350FF8DBC05228F1507E14B2F15F8C7AFC2372FD3E89B2898F327DD4D06D66459B3A8064AB0644195F253E76B7A4F5F6CAFB8564BCC159CE24F8BCA72951EC45008BEA430AB48B3221D368D3F3F5F64AAD0B84E1527181C11040D9E1A8BE737B4CA8CAA0BB3DA607DAAABCFC73586CC589D8DC052F700698B2A10227EDABBD4AF1D4
F93261D5B5C763EAAA00AB2EA7C912AFF5B8268769FD5ECAD7A9B17445774765AE8C1A52BC7CE11A7E1BA3C4B419C8EDA911A14BDFD171FAD16B8F825BAC398A0823D942BC555769301E68AF614C8F21E34B0B9006E8329EF039E04373FBA01840E02DBB60822780E59F13BC4C0A538906711819C5AEDE4E14704CC5AE0C94E9F7787962C672F0FAD9307A3CAB4EF0E2DF9A963A2975B787128187163CCF6D37325788422544F89FE73941756FA1D3BA261158B415677421586109B03B886BFA17ED79515E376D4F3C12BF917D88671AE2F0961042F839C938B0692AB09AD89B106ADF544C3D9688E2C2F4BE60BB53636AC067B32C47D696527E38734CF263DDBFD365569E2F9E59E6EDD668CCEB7451CD121FF87B983C51AE3913F711BBCC3029B54A2A8CCDB1A933EC554263636DD798BD997BEF9FBD1F158619BB85C326A0AAD6B1B38E4449BBB4A5654985095B57F5AF2E1AD89D73807EAB63F3D50475803E81B887F0B7D00AA618810A2E4D19A5122E03030D485811DA64365D008864681EF8DCECD70BB09EF0ACFD9F2017701BE71F152BE00091CAF13EA2B34060F25038A2EEA1891656D88B7F93596070EAC35132E98CEDD359773B255E39AA2F36EE802076E7214ED9A6D83081C4F81581F68D776DA30F57CD4364BF2F415795A7E9828465FEA0CF1C39CF8D21E3AC05319B804274C30DFA70DE8BB0262C52D940EB964DD0805FF1DEE3DC00DFEFD19E5657E2516F4E39A70CBD9D1EFD24EF3E4FA87F2C568A9703B00495B630822541B01E5C8AFC86944F8E4FD5D5BA3072010EBED3927DE4D26043AD21539D95A8E2F0AC1BD89C18D288F64E0F29B87CFA918BF96373C3C3CD792F35457D0CD15C5290385FDA57F0A3A219707F72E37943485AD4F181BA5E6A1EFD4CB58812CE9F8768621AEE4FCFB3905B178479F19C5DF94A2DA202D79719023052595595AEC3DDE1501F6ECCE2B32E9A1CD56F659FA0CCFC87DDC4DA44D6815148A56272AB692C48A962B1007710C60F5D0063B46EA011DC8662A1B060CF8CA8204BF4EFD90BB9CA2B1308268B924E5E7CBB48FD4C561D1148861A5D806FBF36E27708DD461AD60867952A2F35D8D74CCDBB86D81915E8A4AC5560D5D191BF48B3536D8FD2A51A6C6F048E3C06F9E9CA4E96BC513A6C9472368F0B03D35BE18B958EE7743ABC55A6B82F25D196C4B42BF00267CB53970544ADA6C89E3B6D2C49541F0A3CA857AE3C9B56ABDBE32791108DA35E989127028025871B4F0A15B1B86D1E210DB8A20660D3B2A64FD9EF19100A78A49139330303D3021300906052B0E03021A0500041494A0003D236C18865528381FA607BBBEB2E377F20414D8A7FC776095D6A60878C86ABCAB8498AC02C6F802020400), '123456', 'PKCS12', null);
If I remove the keystore_bytes value, the insert completes successfully. So why am I getting "identifier is too long"? I don't have any identifiers that exceed 30 characters...

Oracle does not support MySQL's 0x... hex-literal syntax; the parser ends up treating the huge unquoted token as an identifier, which is why you see ORA-00972 rather than a more helpful message. HEXTORAW expects a quoted string, and a value this long far exceeds what fits in a single SQL literal anyway, so you need to build the value up as a LOB with the dbms_lob package. You can generate the PL/SQL with logic like the following (Perl that splits the hex text into 32K chunks, PL/SQL's VARCHAR2 limit):
my $querylen = length($sql);    # $sql holds the long hex string to load
if ($querylen >= 32000) {
    # emit one dbms_lob.append() call per 32K chunk
    for (my $i = 1; $i <= int(($querylen / 32000) + 0.99); $i++) {
        my $temptext = substr($sql, 32000 * ($i - 1), 32000);
        $sqltext .= "dbms_lob.append(my_sqltext, '$temptext');" . "\n";
    }
}
else {
    $sqltext = "my_sqltext := '$sql';" . "\n";
}
Once you have built the LOB this way you can insert it using dbms_lob.write.
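If you are driving this from Perl anyway, an alternative to generating PL/SQL text is to bind the raw bytes directly as a BLOB parameter, which sidesteps both the 0x literal and the literal-length limits. Below is a minimal sketch with DBI/DBD::Oracle, reusing the table and values from the question; the connection details are placeholders and the hex string is truncated:
use strict;
use warnings;
use DBI;
use DBD::Oracle qw(:ora_types);

# placeholder connection details - adjust for your environment
my $dbh = DBI->connect('dbi:Oracle:ORCL', 'aic', 'secret',
                       { RaiseError => 1, AutoCommit => 0 });

# hex dump from the MySQL export, without the 0x prefix (truncated here)
my $hex = '30820B5534002010330820A4A0';
my $raw = pack('H*', $hex);    # convert the hex text to raw bytes

my $sth = $dbh->prepare(
    'INSERT INTO aic.keystore (id, alias, keystore_bytes, password, type, description)
     VALUES (?, ?, ?, ?, ?, ?)'
);
$sth->bind_param(1, 61);
$sth->bind_param(2, 'test');
$sth->bind_param(3, $raw, { ora_type => ORA_BLOB });    # bind the bytes as a BLOB
$sth->bind_param(4, '123456');
$sth->bind_param(5, 'PKCS12');
$sth->bind_param(6, undef);
$sth->execute;

$dbh->commit;
$dbh->disconnect;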

Related

How to import a huge TSV file into an H2 in-memory database with Spring Boot

I have huge TSV files and I need to import them into my H2 in-memory database.
I can read them with Scanner and import them line by line, but it takes hours!
Is there any faster way to import a TSV file into an H2 in-memory database?
Use INSERT INTO ... SELECT CONVERT(...) with CSVREAD to import directly from the file into your H2 table.
How to read a CSV file into an H2 database:
public static void main(String[] args) throws Exception {
    Class.forName("org.h2.Driver");
    Connection conn = DriverManager.getConnection("jdbc:h2:~/test", "", "");
    Statement stmt = conn.createStatement();
    stmt.execute("drop table if exists csvdata");
    stmt.execute("create table csvdata (id int primary key, name varchar(100), age int)");
    // bulk-load the whole file in a single statement instead of inserting row by row
    stmt.execute("insert into csvdata ( id, name, age ) select convert( \"id\",int ), \"name\", convert( \"age\", int) from CSVREAD( 'c:\\tmp\\sample.csv', 'id,name,age', null ) ");
    ResultSet rs = stmt.executeQuery("select * from csvdata");
    while (rs.next()) {
        System.out.println("id " + rs.getInt("id") + " name " + rs.getString("name") + " age " + rs.getInt("age"));
    }
    stmt.close();
    conn.close();
}
Or
SELECT * FROM CSVREAD('test.csv');
-- Read a file containing the columns ID, NAME, using UTF-8 and | as the field separator
SELECT * FROM CSVREAD('test2.csv', 'ID|NAME', 'charset=UTF-8 fieldSeparator=|');
SELECT * FROM CSVREAD('data/test.csv', null, 'rowSeparator=;');
-- Read a tab-separated file (tab as the field separator)
SELECT * FROM CSVREAD('data/test.tsv', null, 'fieldSeparator=' || CHAR(9));
SELECT "Last Name" FROM CSVREAD('address.csv');
SELECT "Last Name" FROM CSVREAD('classpath:/org/acme/data/address.csv');
See the H2 CSVREAD function documentation for details.
NOTE: You can specify the file's field separator in these options, so the same approach works for a TSV file.

Can't insert multiple INSERT queries via Laravel

I have a simple multi-statement insert:
$query = "
INSERT INTO `products` SET `code` = '0100130', `price` = '273.90', `brand` = 'Alpina', `supplier` = 'karat';
INSERT INTO `products` SET `code` = '0600075', `price` = '222.24', `brand` = 'Alpina', `supplier` = 'karat';
";
I have tried DB::raw($query), DB::query($query), DB::statement($query) - all three fail. But all three work if there is only one INSERT statement. If there is more than one, I get no error, but the inserts are not performed.
I'm looking for the fastest way to import 13 million inserts. Inserting them one by one would take the server 24 hours.
Laravel v7.12.0
Try it like this:
$query = "
INSERT INTO products (code, price, brand, supplier) VALUES
('0100130', 273.90, 'Alpina', 'karat'),
('0600075', 222.24, 'Alpina', 'karat')
";
You can't run two (or more) INSERT statements in a single call this way, so combine the rows into one multi-row INSERT. For 13 million rows, batch them into multi-row INSERTs of a few hundred rows each (for example with DB::table('products')->insert([...]) inside a loop over chunks) rather than one statement per row.

SSIS sending query with date parameter to Oracle cloud, VS-2019 and MS Oracle Source

Already checked this post: SSIS and sending query with date to Oracle
And I'm using variable query per below thread
SSIS - Using parameters in Oracle Query using Attunity Oracle Datasource
Tool used: VS-2019
Data flow: MS Oracle Source (for VS-2019)
My source is Snowflake cloud. I'm able to get the max date from the table and store it in an Object-type variable (named @var_Snowflake_Table_maxDate). Then I use a script task to convert the value to a String.
Code for script task is:
public void Main()
{
    // using System.Data.OleDb; added above in the namespaces
    OleDbDataAdapter A = new OleDbDataAdapter();
    System.Data.DataTable dt = new System.Data.DataTable();
    A.Fill(dt, Dts.Variables["User::var_Snowflake_Table_maxDate"].Value);

    foreach (DataRow row in dt.Rows)
    {
        object[] array = row.ItemArray;
        Dts.Variables["User::var_CreateDate"].Value = array[0].ToString();
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}
This sets my @var_CreateDate parameter (String type) correctly. I tried this on my local machine and was able to pass the value to a native SQL Server instance (yes, NOT Oracle), just to test that the parameters coming from the script task work.
Finally, I'm using VS-2019's MS Oracle Source to pass the value to the Oracle cloud server. Sample queries I have tried:
"select * from Table where rownum <= 5 and NVL(CREATE_DATE,UPDATE_DATE) = " +"'05-09-2020'"
which evaluates to:
select * from relate.awd_acct_activity where rownum <= 5 and NVL(CREATE_DATE,UPDATE_DATE) = '2020-05-09'
and this works, but the value is hard-coded.
Try 2:
"select * from table where rownum <= 50 and
NVL(CREATE_DATE,UPDATE_DATE) = " + "'@[User::var_CreateDate]'"
Try 3:
"select * from table where rownum <= 50 and
NVL(CREATE_DATE,UPDATE_DATE) = to_date(" + "'@[User::var_CreateDate]'" + ")"
Try 4:
"select * from table where rownum <= 50 and
NVL(CREATE_DATE,UPDATE_DATE) = to_date(" + "'@[User::var_CreateDate]'" + ",'YYYY-MM-DD')"
None of tries 2 through 4 evaluate correctly. Can I have some guidance on how to pass this parameter to the Oracle cloud server?
Thanks.
I'm assuming you're trying to figure out the syntax for a variable that would hold the query text. You can try something like this:
"select * from table where rownum <= 50 and
NVL(CREATE_DATE,UPDATE_DATE) = to_date(" + "'" + #[User::var_CreateDate] + "'" + ",'YYYY-MM-DD')"

Fetch all rows of an Oracle SQL query using a Perl script

I need to fetch all rows from an Oracle SQL query and then loop through each row using Perl.
Below are a sample table and data:
create table t1 (col1 varchar2(30));
insert into t1 values ('row1');
insert into t1 values ('row2');
insert into t1 values ('row3');
insert into t1 values ('row4');
insert into t1 values ('row5');
commit;
I have written a Perl script like the one below to fetch the above table:
# connect to the database
my $dbh = DBI->connect( 'dbi:Oracle:' . $dbname,
                        $dbusername,
                        $pass,
                        { PrintError => 0,
                          RaiseError => 1
                        }
) || die "Error connecting to the database: $DBI::errstr";
print("Connected to database $dbname as user $dbusername OK\n");
$requete = "select col1 from t1";
$sth_sql = $dbh->prepare($requete);
$sth_sql->execute();
@row = $sth_sql->fetchrow_array;
my $size = @row;
print $size;
$first = $row[0];
$sec   = $row[1];
print $sec;
print $first;
foreach $script_name (@row) {
    print "$script_name\n";
}
The above code is returning only one row, and the size of the array shows only one element in it.
I need to fetch all five rows and then loop through them one by one.
Please suggest what I am missing here!
I am using an Oracle database.
Thanks
EDIT:
I have made some changes and it is working fine now:
$requete = "select col1 from t1";
$sth_sql = $dbh->prepare($requete);
$sth_sql->execute();
##row=$sth_sql->fetchrow_array;
$sth_sql->bind_columns(undef, \$script_name);
print $sec;
print $first;
while ($sth_sql->fetch()) {
$script_sql=$script_name.".sql";
print "$script_sql\n";
}
The ->fetchrow_array method is documented in DBI. There you'll see that you can either use it within a loop:
$sth = $dbh->prepare("SELECT foo, bar FROM table WHERE baz=?");
$sth->execute( $baz );
while ( @row = $sth->fetchrow_array ) {
    print "@row\n";
}
to retrieve all rows sequentially, or that you can use the ->fetchall_arrayref method to retrieve the complete resultset in one go:
$sth = $dbh->prepare("SELECT foo, bar FROM table WHERE baz=?");
$sth->execute( $baz );
my $rows = $sth->fetchall_arrayref;
for my $row (@$rows) {
    print "@$row\n";
}
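Applied to the table in the question, a minimal sketch of the loop form (reusing the $dbh handle and the query from the question):
$sth_sql = $dbh->prepare('select col1 from t1');
$sth_sql->execute();
while ( my ($col1) = $sth_sql->fetchrow_array ) {
    print "$col1\n";    # prints row1 .. row5, one row per iteration
}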

Pass values to a query through a function's arguments in Perl

I have Perl code which queries an Oracle 11g database to pull the count of the 'values' from a table.
$sqlStatement="SELECT count(values) FROM Table WHERE values IN ('value1','value2','value3',.... 'valuen')"
How can I pass the values to the query through a subroutine's arguments?
For example:
$sqlStatement="SELECT count(values) FROM Table WHERE values IN ($values)
subroutine($values)
Wouldn't it make more sense to pass in the values using an array rather than a scalar?
count_values(@values);
Then the subroutine could start like this:
sub count_values {
    my @values = @_;

    # build "in (?,?,...,?)" with one placeholder per value
    my $sql = 'select count(values) from table where values in (';
    $sql .= join ',', ('?') x @values;
    $sql .= ')';

    my $sth = $dbh->prepare($sql);
    $sth->execute(@values);

    my $count = ($sth->fetchrow_array)[0];
    return $count;
}
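A usage sketch, assuming $dbh is an already-connected DBI handle visible to the subroutine and that the real table and column names replace the placeholders:
my @values = ('value1', 'value2', 'value3');
my $count  = count_values(@values);
print "Rows matching the IN list: $count\n";
Using one placeholder per value lets the driver handle the quoting, which also protects against SQL injection if the values come from user input.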
