I need to insert the currency Mongolian tögrög and its symbol ₮ into an Oracle database.
The insert query is:
INSERT INTO CURRENCY (CUR_ISO_ID, CUR_ISO_CODE, CUR_DESC, CUR_DECIMAL_PLACE, CUR_SYMBOL)
VALUES (496,'MNT','Mongolian tögrög',2,'₮');
The result is:
CUR_ISO_ID | CUR | CUR_DESC | CUR_DECIMAL_PLACE | CUR_SYMBOL |
-----------------------------------------------------------------------
496 | MNT | Mongolian t?gr?g | 2 | . |
How can I get the special characters inserted as-is into the database, i.e. the symbol stored as ₮ rather than ., and the description as Mongolian tögrög rather than Mongolian t?gr?g? Please help.
Before you launch SQL*Plus, enter these commands:
chcp 65001
set NLS_LANG=.AL32UTF8
The first command sets the codepage of cmd.exe to UTF-8.
The second command tells the Oracle database: "I am using UTF-8."
Then your SQL should work. I don't think there is any 8-bit Windows codepage 125x that supports the Mongolian tögrög symbol.
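The claim about 8-bit codepages is easy to check with iconv (a sketch assuming GNU od and iconv are available; CP1252 stands in for the Windows 125x family):

```shell
# The tugrik sign U+20AE needs three bytes in UTF-8 ...
printf '₮' | od -An -tx1    # e2 82 ae
# ... and has no slot in CP1252, so conversion fails:
printf '₮' | iconv -f UTF-8 -t CP1252 >/dev/null 2>&1 || echo 'no CP1252 mapping for ₮'
```

This is why a client running in a single-byte codepage turns the symbol into `.` or `?`: the byte sequence simply cannot be represented.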
See also this post for more information: NLS_LANG and others
Also check this discussion on how to use sqlplus with UTF-8 on the Windows command line; there is a known issue when using UTF-8 there.
I have a file:
%appdata%/postgresql/psqlrc.conf
and in the file is one single line:
set CLIENT_ENCODING to 'UTF8';
I was expecting that on running
psql -U postgres
on the command line, I would connect to the server and have client_encoding set to UTF8. However, I find:
postgres=# show client_encoding;
client_encoding
-----------------
WIN1252
(1 row)
I would very much like my client encoding to default to UTF8, to match the server:
postgres=# show server_encoding;
server_encoding
-----------------
UTF8
(1 row)
Does anyone know what I am doing wrong?
Thanks,
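One thing worth checking (an assumption on my part, not a confirmed diagnosis of the psqlrc issue): psql also honors libpq's documented PGCLIENTENCODING environment variable, so the encoding can be forced from the environment instead of the startup file:

```shell
# Sketch (not run against a live server): libpq reads PGCLIENTENCODING
# at connection time, overriding the locale-derived default (WIN1252).
# On cmd.exe the equivalent is:  set PGCLIENTENCODING=UTF8
export PGCLIENTENCODING=UTF8
psql -U postgres -c 'show client_encoding;'
```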
I am facing a scenario where sqlldr is unable to recognize special characters. I usually don't bother about this, as it's not important for me to have the exact same names, but this has led to another issue which is causing the system to malfunction.
unittesting.txt
8888888,John SMITÉ,12345678
unittesting.ctl
load data
CHARACTERSET UTF8
infile 'PATH/unittesting.txt'
INSERT
into table temp_table_name
Fields Terminated By ',' TRAILING NULLCOLS(
ID_NO CHAR(50) "TRIM(:ID_NO)" ,
NAME CHAR(50) "TRIM(:NAME)" ,
ID_NO2 CHAR(50) "TRIM(:ID_NO2)" )
SQLLDR command
sqlldr DB_ID/DB_PASS@TNS
control=PATH/unittesting.ctl
log=PATH/unittesting.log
bad=PATH/unittesting.bad
errors=100000000
OUTPUT from table
|ID_NO |NAME |ID_NO2 |
|8888888 |John SMIT�12345678 | |
Other information about system [RHEL 7.2, Oracle 11G]
export NLS_LANG=AMERICAN_AMERICA.AL32UTF8
select userenv('language') from dual
OUTPUT: AMERICAN_AMERICA.AL32UTF8
file -i unittesting.txt
OUTPUT: unittesting.txt: text/plain; charset=iso-8859-1
echo $LANG
OUTPUT: en_US.UTF-8
Edit:
So I tried changing the encoding of my file as advised by Cyrille MODIANO and using that. The issue got resolved.
iconv -f iso-8859-1 -t UTF-8 unittesting.txt -o unittesting_out.txt
My challenge now is that I don't know the character set of the incoming files, and they come from many different sources. The output of file -i I get for my source data file is:
: inode/x-empty; charset=binary
From my understanding, charset=binary means that the character set is unknown. Please advise what I can do in this case; any advice or idea is much appreciated.
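(Note that `inode/x-empty` in that output means `file` was run on an empty file, which is why no charset could be detected at all.) For non-empty files, a small wrapper can detect the encoding and normalize to UTF-8 before loading. A sketch assuming GNU `file` and `iconv`; the filenames are made up for illustration:

```shell
# Build a sample ISO-8859-1 file like unittesting.txt (0xC9 is É).
printf '8888888,John SMIT\xc9,12345678\n' > sample.txt

enc=$(file -b --mime-encoding sample.txt)   # e.g. "iso-8859-1"
case "$enc" in
  utf-8|us-ascii)
    cp sample.txt sample_utf8.txt ;;        # already safe to load
  binary)
    echo "empty or undetectable: sample.txt" ;;
  *)
    iconv -f "$enc" -t UTF-8 sample.txt -o sample_utf8.txt ;;
esac
file -b --mime-encoding sample_utf8.txt
```

Detection by `file` is heuristic, so for short or mixed files it can guess wrong; if the sources can be pinned to a known encoding per feed, that is always more reliable.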
I'm migrating a DB from Postgres to Oracle. I create CSV files with this command:
\copy ttt to 'C:\test\ttt.csv' CSV DELIMITER ',' HEADER encoding 'UTF8' quote as '"';
Then, with Oracle SQL*Loader, I put the data into Oracle tables.
It's all OK, but in some descriptions I have this character  that wasn't in the original DB.
The encoding of the Postgres DB is UTF8, and I'm on a Windows machine.
Thanks to all.
Gian Piero
Before you start sqlloader run
chcp 65001
set NLS_LANG=.AL32UTF8
chcp 65001 sets the codepage of your cmd.exe to UTF-8 (which is inherited by sqlldr and sqlplus).
With set NLS_LANG=.AL32UTF8 you tell the Oracle database: "The client uses UTF-8."
Without these commands you would have this situation (due to the defaults):
chcp 850
set NLS_LANG=AMERICAN_AMERICA.US7ASCII
Your PC may have codepage 437 instead of 850; it depends on whether the PC is set up for the U.S. or for Europe. See National Language Support (NLS) API Reference, column OEM codepage.
You can also set NLS_LANG as an environment variable in your PC settings, or define it in the Registry at HKLM\SOFTWARE\Wow6432Node\ORACLE\KEY_%ORACLE_HOME_NAME%\NLS_LANG (for a 32-bit client), or HKLM\SOFTWARE\ORACLE\KEY_%ORACLE_HOME_NAME%\NLS_LANG (for a 64-bit client).
You can also change the codepage of your cmd.exe persistently; see https://stackoverflow.com/a/33475373/3027266
For details about NLS_LANG see https://stackoverflow.com/a/33790600/3027266
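The kind of mojibake in the question (the stray ``) appears when bytes written in one encoding are decoded as another. The mechanism can be demonstrated with iconv (a sketch assuming GNU iconv; the exact bytes in the question's file are unknown, so 'é' is just an example):

```shell
# UTF-8 encodes 'é' as the two bytes C3 A9; a client decoding those
# bytes as Windows-1252 shows two characters instead of one.
printf 'é' | iconv -f CP1252 -t UTF-8    # prints "Ã©"
```

Fixing the client codepage (chcp 65001 + NLS_LANG, as above) removes the mismatch instead of patching the symptoms in the data.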
I am trying to run a MySQL script on CentOS. I have the following MySQL version installed:
mysql> SHOW VARIABLES LIKE "%version%";
+-------------------------+------------------------------------------------------+
| Variable_name | Value |
+-------------------------+------------------------------------------------------+
| innodb_version | 5.6.25-73.1 |
| protocol_version | 10 |
| slave_type_conversions | |
| version | 5.6.25-73.1 |
| version_comment | Percona Server (GPL) |
| version_compile_machine | x86_64 |
| version_compile_os | Linux |
+-------------------------+------------------------------------------------------+
My sample script looks like:
DELIMITER //
DROP TRIGGER IF EXISTS trg_table1_category_insert;
DROP TRIGGER IF EXISTS trg_table1_category_update;
CREATE TRIGGER trg_table1_category_insert
AFTER INSERT
ON
table1_category
FOR EACH ROW
BEGIN
insert into table1_category_history (
table1_category_history_id,
table1_id,
transaction_start_date
) values (
new.table1_category_id,
new.table1_id,
new.create_date
);
END;
//
CREATE TRIGGER trg_table1_category_update AFTER UPDATE on table1_category FOR EACH ROW
BEGIN
insert into table1_category_history (
table1_category_history_id,
table1_id,
transaction_start_date
) values (
new.table1_category_id,
new.table1_id,
new.create_date
);
END;
//
DELIMITER ;
My database uses utf8 encoding. While importing this file into the database with the mysql client, it keeps throwing:
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '//
CREATE TRIGGER trg_table1_category_update AFTER UPDATE on ta' at line 1
I do not see any syntax error in the delimiter usage, and the script works absolutely fine on some machines. I have googled hundreds of links and tried everything: downgrading/upgrading the MySQL server and client, my.cnf, charset settings, etc., but nothing helps. Can anyone please help me with this? Is there any setting at the client level to make it interpret the script correctly? I am using the same version of the client that comes with the MySQL server installation.
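One possible explanation for "works on some machines but not others" (an assumption, not confirmed in the question) is invisible bytes in the script file itself: a UTF-8 BOM or Windows CRLF line endings make the client read `DELIMITER //` as `DELIMITER //` plus a carriage return, so the bare `//` later is no longer recognized as the delimiter. A quick shell check and cleanup:

```shell
# Make a demo script with a BOM and CRLF endings to show the symptoms.
printf '\xef\xbb\xbfDELIMITER //\r\nSELECT 1;\r\n//\r\n' > script.sql

od -An -tx1 -N3 script.sql        # ef bb bf -> file starts with a BOM
cat -A script.sql | head -n 1     # CRLF shows up as a trailing ^M$

# Strip the 3-byte BOM and the carriage returns before running mysql.
tail -c +4 script.sql | tr -d '\r' > script_clean.sql
od -An -tx1 -N3 script_clean.sql  # 44 45 4c -> plain "DEL..."
```

(`dos2unix` does the same cleanup in one step, where it is installed.)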
I am using a SQL statement that includes query_to_xml:
select query_to_xml('select 1+1 answer', true, true, '') as_xml;
When any SQL with query_to_xml is executed in Squirrel SQL, the result is:
| as_xml |
+--------------------+
|<UnknownType (2009)>|
With the same JDBC driver and credentials, but from a Java class, the same SQL produces the expected XML output:
| as_xml |
+------------------------------------------------------------+
| <row xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">|
| |
| <answer>2</answer> |
| </row> |
I can execute other functions in Squirrel and they will respond, like select version().
Is this a known issue in Squirrel?
If you are using the latest Squirrel client, go to File -> Global Preferences -> Data Type Controls and check the box under the Unknown DataTypes section.
Then rerun the query; it might work.
It seems to be an issue with the text option output for SQL results. It works if you change to the tabular output. Go to the Session menu and click "Session Properties". On the "General" tab under "Output" change "SQL Results" from Text to Table and rerun the query. You may need to close the existing results tabs first.
If this fixes it please add a bug report so that it can be fixed in the future.