Encoding German characters - Oracle

I need to import some Perl-generated files into an Oracle database with LOAD DATA.
The Perl script fetches a web page and writes a CSV file.
Here is a simplified script:
use LWP::Simple;   # provides get()
use File::Slurp;   # provides write_file()
my $c = ( $user && $passwd )
    ? get("$protocol://$user:$passwd\@$url")   # credentials embedded in the URL
    : get("$protocol://$url");
write_file( "$VZ_GET/$FileTS.$typ.csv", $c );
Here are some sample records from the web page:
5052;97;Jan;Ihrfelt 5053;97;Jari;Honko 5121;97;Katja;Keitaanniemi 5302;97;Ola;Södermark 5421;97;Sven;Sköld 5609;97;Peter;Näslund
The content of the web page is stored in the variable $c.
Here is a sample line from the CSV file:
5053;97;Jari;Honko
Here is the load command:
LOAD DATA
INTO TABLE LIQA
TRUNCATE
FIELDS TERMINATED BY ";"
(
LIQA_ANALYST_ID,
LIQA_FIRM_ID,
LIQA_ANALYST_FIRST_NAME,
LIQA_ANALYST_LAST_NAME,
LIQA_TS_INSERT DATE 'YYYYMMDDHH24MISS'
)
The command SELECT * FROM NLS_DATABASE_PARAMETERS WHERE PARAMETER = 'NLS_CHARACTERSET'; returns AL32UTF8.
The generated CSV file is recognized as UTF-8 Unicode text.
Nevertheless, I can't import the German characters. They are still correct in the CSV file, but not in the database.
I have also tried to convert $c like this:
$c = encode("iso-8859-1", $c);
The generated CSV file is still recognized as UTF-8 Unicode text.
I have no clue how to fix it.

I have solved it:
use Encode;   # provides decode() and encode()
$c = decode( 'utf-8', $c );
$c = encode( 'iso-8859-1', $c );
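For reference, the same two-step fix (decode the fetched bytes as UTF-8, then re-encode them as ISO-8859-1) can be sketched in Python; the sample string below is made up:

```python
# Bytes as fetched from the web page: UTF-8 encoded names with umlauts.
raw = "Sköld;Näslund".encode("utf-8")

# Step 1: interpret the raw bytes as UTF-8 text.
text = raw.decode("utf-8")

# Step 2: re-encode the text as ISO-8859-1 (Latin-1), one byte per character.
latin1 = text.encode("iso-8859-1")

# The UTF-8 form is longer because ö and ä take two bytes each.
print(len(raw), len(latin1))  # 15 13
```

A side benefit of the explicit round trip is that it fails loudly (UnicodeDecodeError/UnicodeEncodeError) if the input is not actually UTF-8 or contains characters outside Latin-1, which makes it a useful sanity check.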

Related

Oracle - utl_file.get_line not reading UTF-8 CSV lines properly

I have a UTF-8 CSV file from which I want to read.
It contains characters like ç, á, à, ã...
This is what I'm doing:
CSVFile := utl_file.fopen (UPPER('dummyDIR'), CSV_name, 'r', 32767);
utl_file.get_line(CSVFile,CSVLine);
DBMS_OUTPUT.PUT_LINE('line obtained with get_line: '||CSVLine);
And all the special characters come out garbled...
Any idea how I can read the file with the correct charset?
This is what I get on Linux when checking the charset:
echo $NLS_LANG
PORTUGUESE_PORTUGAL.UTF8
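As a quick illustration of the symptom (not the PL/SQL fix): if UTF-8 bytes are decoded under a single-byte charset, every accented letter turns into two wrong characters. A minimal Python sketch, with a made-up line:

```python
# A UTF-8 encoded line like the ones in the CSV file.
line = "ç, á, ã".encode("utf-8")

# Decoding with the encoding the file was written in recovers the text.
print(line.decode("utf-8"))       # ç, á, ã

# Decoding the same bytes as ISO-8859-1 produces two characters per
# accented letter - the classic "weird characters" symptom.
print(line.decode("iso-8859-1"))  # Ã§, Ã¡, Ã£
```

Seeing this two-characters-per-letter pattern in the DBMS_OUTPUT usually means the file's actual encoding (UTF-8 here) does not match the encoding the reading side assumes.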

Encoded CSV with sqltocsv package not working with non-ASCII characters

I am using the package "github.com/jeffyi/sqltocsv" to export MSSQL rows to CSV files.
My problem is that special characters come out the wrong way:
ü as Ć¼
ä as Ƥ
etc.
I have read through the sqltocsv package multiple times and I just don't see when and where it goes wrong.
I have logged output to the console: before exporting, the data comes out of the DB as UTF-8, but on adding it to the CSV it gets messed up.
I have also tried the package "encoding/csv" to convert my data to a CSV file, without any success.
Here's how I use the sqltocsv package:
rows, _ := db.Query(sqlQuery)
csvConverter := sqltocsv.New(rows)
csvConverter.Delimiter = ';'
csvConverter.TimeFormat = time.RFC822
csvConverter.WriteFile(directory + "/" + fileName)
So in the end, all characters should come out as they are:
ü as ü (not Ć¼)
ä as ä (not Ƥ)
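One way to narrow down where it breaks is to inspect the raw bytes on disk instead of trusting a viewer: if the file holds correct UTF-8 bytes, the writer is fine and whatever displays the CSV is decoding it with the wrong charset. A Python sketch of that check (file name and row are made up):

```python
import csv
import os
import tempfile

row = ["Müller", "Hämäläinen"]  # data with non-ASCII characters

# Write the row as a ;-delimited UTF-8 CSV, as the exporter should.
path = os.path.join(tempfile.mkdtemp(), "export.csv")
with open(path, "w", encoding="utf-8", newline="") as f:
    csv.writer(f, delimiter=";").writerow(row)

# Inspect the raw bytes: ü must appear as the UTF-8 pair 0xC3 0xBC.
with open(path, "rb") as f:
    data = f.read()
print(b"M\xc3\xbcller" in data)  # True if the UTF-8 bytes survived
```

If the bytes are right, fix the consumer (for example, force the editor or spreadsheet to open the CSV as UTF-8); if they are wrong, the corruption happens before or during the write.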

Import big CSV in Laravel

I'm using the following package to import a large CSV file into a MySQL database:
https://github.com/Maatwebsite/Laravel-Excel
Here is my controller code:
Excel::filter('chunk')->load($file)->chunk(250, function($results) use (&$count)
{
    ++$count;  // $count is captured by reference so the increment persists
    echo "<br/> Count = " . $count;
    foreach ($results as $row)
    {
        // do stuff
    }
});
Here is the relevant line from composer.json:
"maatwebsite/excel": "~2.1.0"
Here is my config/app.php file:
'providers' => [
....
....
Maatwebsite\Excel\ExcelServiceProvider::class,
],
'aliases' => [
....
....
'Excel' => Maatwebsite\Excel\Facades\Excel::class,
]
I am getting this error and I cannot find any solution:
InvalidArgumentException in Dispatcher.php line 333:
No handler registered for command [Maatwebsite\Excel\Readers\ChunkedReadJob]
I tried the following links for a solution, but no luck:
https://github.com/Maatwebsite/Laravel-Excel/issues/957
https://github.com/Maatwebsite/Laravel-Excel/issues/952
Are you doing some manipulation on each row of the CSV before inserting it into the database, or do you need to import the data directly?
Just a quick tip: if your CSV columns are in the same order as your database table columns, you can use the Ubuntu terminal to import larger files:
mysql -uroot -proot --local_infile=1 3parsfdb -e "LOAD DATA LOCAL INFILE '/logfiles/Bat_res.csv' INTO TABLE Bat_res FIELDS TERMINATED BY ','"
If this is something you want to do programmatically or in a cron job, then you need this package. Can you try clearing the Laravel cache and also running
composer dump-autoload
One more thing: ensure there are no special characters in the CSV that cannot be imported.
Instead of that package, try the LOAD DATA INFILE MySQL statement, using Laravel's DB object. Here is an example.
I used this for importing big CSV files (300-400 MB) into a MySQL database, and it worked for me.
LOAD DATA LOCAL INFILE! I really missed MySQL. I used this to ingest a huge CSV file, about 6 GB, on our server; it took about 20 minutes.
You can use something like this:
private function importData(int $ignoreLines, string $fileName) : void
{
    //$this->setLOG("Importing data, please wait", "i");
    $table = 'npdata_csvfile_temp';
    $importDB  = "LOAD DATA LOCAL INFILE '$fileName' ";
    $importDB .= "INTO TABLE $table ";
    $importDB .= "COLUMNS TERMINATED BY ',' ";
    $importDB .= "OPTIONALLY ENCLOSED BY '\"' ";
    $importDB .= "LINES TERMINATED BY '\\n' ";
    $importDB .= "IGNORE $ignoreLines LINES ";
    DB::connection()->getPdo()->exec($importDB);
    //$this->setLOG("Done with importing data", "i");
}
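For comparison, here is a hedged Python sketch that only assembles the same kind of LOAD DATA LOCAL INFILE statement; the table name and file path are placeholders, and the resulting string would still be executed through your DB driver:

```python
def build_load_data_sql(file_name: str, table: str, ignore_lines: int) -> str:
    """Assemble a MySQL LOAD DATA LOCAL INFILE statement for a CSV import."""
    return (
        f"LOAD DATA LOCAL INFILE '{file_name}' "
        f"INTO TABLE {table} "
        "COLUMNS TERMINATED BY ',' "
        "OPTIONALLY ENCLOSED BY '\"' "
        "LINES TERMINATED BY '\\n' "
        f"IGNORE {ignore_lines} LINES"
    )

sql = build_load_data_sql("/tmp/data.csv", "npdata_csvfile_temp", 1)
print(sql)
```

Note that file name and table name are interpolated directly, just as in the PHP version, so they must come from trusted configuration, not user input.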

How to use a file as the schema for Pig?

So I have the following code. My ultimate goal is to have the file it produces be the schema for another input file. Is this possible?
%declare INPUT '$input'
%declare SCHEMA '$schema'
%declare OUTPUT '$output'
%declare DEL '$del'
%declare COL ':'
%declare COM ','
A = LOAD '$SCHEMA' using PigStorage('$DEL') AS (field:chararray, dataType:chararray, flag:chararray, chars:chararray);
B = FOREACH A GENERATE CONCAT(field,CONCAT('$COL',CONCAT(REPLACE(REPLACE(dataType, 'decimal','double'), 'string', 'chararray'),'$COM')));
rmf $OUTPUT
STORE B INTO '$OUTPUT';
I'm not sure of the right approach.
Here is the output:
record_id:chararray,
offer_id:double,
decision_id:double,
offer_type_cd:integer,
promo_id:double,
pymt_method_type_cd:double,
cs_result_id:double,
cs_result_usage_type_cd:double,
rate_index_type_cd:double,
sub_product_id:double,
campaign_id:double,
market_cell_id:double,
assigned_offer_id:chararray,
accepted_offer_flag:chararray,
current_offer_flag:chararray,
offer_good_until_date:chararray,
Of course. You can use the run command of Pig to run the script and do the work you need; for more explanation of how to do it, refer to this link.
Hope it helps!
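To check the type mapping before wiring the generated file into another load, the script's REPLACE logic can be prototyped in Python; the two sample rows below are made up:

```python
# (field, dataType) pairs as PigStorage would split the schema file.
schema_rows = [
    ("record_id", "string"),
    ("offer_id", "decimal"),
]

def to_pig_type(data_type: str) -> str:
    # Mirror the script's REPLACE calls: decimal -> double, string -> chararray.
    return data_type.replace("decimal", "double").replace("string", "chararray")

for field, data_type in schema_rows:
    print(f"{field}:{to_pig_type(data_type)},")
# record_id:chararray,
# offer_id:double,
```

This reproduces the field:type, lines shown in the output above, so it is a quick way to eyeball the generated schema before feeding it to another LOAD.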

Matlab: insert an image (.jpg) file into a Word file

I want to insert an image into a Word file, but if I try the code below, the Word file shows unknown symbols like
"ΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰΰ"
My code:
figure,imshow(img1);
fid = fopen('mainfile.doc', 'a', 'n', 'UTF-8');
fwrite(fid, img1, 'char');
fclose(fid);
open('mainfile.doc');
fwrite won't do this directly. You could try the MATLAB Report Generator, if you have access to it, or a File Exchange submission such as OfficeDoc.
