Encoding data in Joomla 1.0.0 - joomla

I'm writing a tool to migrate data from Excel to a Joomla
1.0.0 database.
My data contains Vietnamese characters
(e.g. "Thành Phố Hồ Chí Minh").
After migration, the data displays incorrectly on the page
("Th?nh Ph? H? Ch? Minh").
I think Joomla encodes the data before saving it to the database, but I don't know which encoding it uses.
How can I replicate that?
Thanks

After searching Google many times, I found the answer to my question.
The encoding is latin1.
Use the SQL expression below to convert the data before saving:
convert(cast(convert(name using latin1) as binary) using utf8)
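The same round-trip can be sketched in Python; this is a minimal illustration of what the convert/cast/convert expression undoes, under the assumption that the column holds UTF-8 bytes that went in over a latin1 connection:

```python
# UTF-8 bytes stored through a latin1 connection come back as mojibake:
mojibake = "Thành Phố Hồ Chí Minh".encode("utf-8").decode("latin-1")

# Re-encoding as latin1 recovers the raw bytes, which then decode as UTF-8:
fixed = mojibake.encode("latin-1").decode("utf-8")
print(fixed)  # Thành Phố Hồ Chí Minh
```

The SQL expression performs the same byte-level reinterpretation inside MySQL, so no client-side pass over the data is needed.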

Related

How to fix UTF-8 decoded with ISO-8859-1 in Redshift

I assumed a dataset was ISO-8859-1 encoded, while it was actually encoded in UTF-8.
I wrote a Python script where I decoded the data with ISO-8859-1 and wrote it into a Redshift SQL database.
I wrote the messed-up characters into the Redshift table; the decoding did not happen while writing into the table (I used Python and pandas with the wrong encoding).
Now the data source is not available anymore, but the data in the table has a lot of messed-up characters.
E.g. 'Hello Günter' -> 'Hello GĂŒnter'
What is the best way to resolve this issue?
Right now I can only think of collecting a complete list of messed-up characters and their translations, but maybe there is a way I have not thought of.
So my questions:
First of all, I would like to know if information was lost when the decoding happened.
Also, I would like to know if there might be a way in Redshift to solve such a decoding issue. Finally, I have been searching for a complete list so I do not have to create it myself; I could not find such a list.
Thank you
EDIT:
I pulled a part of the table and found out I have to do the following:
"Ð\x97амÑ\x83ж вÑ\x8bÑ\x85оди".encode('iso-8859-1').decode('utf8')
The table has billions of rows; would it be possible to do that in Redshift?
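I'm not aware of a Redshift SQL function that does a byte-level re-encode, so one practical route is to pull rows out in batches and repair them client-side before writing them back. A hedged sketch of the repair step, assuming the same latin1-vs-UTF-8 mix-up as in the EDIT above:

```python
def fix_mojibake(text: str) -> str:
    # Undo a UTF-8 string that was mis-decoded as ISO-8859-1.
    # If the value does not round-trip (it is already clean, or broken
    # in some other way), return it unchanged rather than corrupt it.
    try:
        return text.encode("iso-8859-1").decode("utf-8")
    except (UnicodeEncodeError, UnicodeDecodeError):
        return text

# A constructed example of the corruption and its repair:
broken = "Замуж выходи".encode("utf-8").decode("iso-8859-1")
print(fix_mojibake(broken))          # Замуж выходи
print(fix_mojibake("Hello Günter"))  # already clean -> left alone
```

The try/except guard matters for a table with billions of rows: rows that were inserted correctly (or through a different bad decode, like the latin2-style 'GĂŒnter' example) will not round-trip and are passed through untouched, so the fix can be run idempotently.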

PHP: wrong UTF-8 characters from MySQL when using Twig

I'm building a web app in PHP, using my own MVC pattern, including ActiveRecord and Twig templates.
I have some problems with the charset; here are the details of my encoding setup:
I'm using Polish characters.
The MySQL collation is set to utf8_unicode_ci (I also tried utf8_general_ci).
The Twig template has a standard HTML5 header with UTF-8 encoding.
I'm not sure about the files' encoding (using NetBeans), but the Sublime Text 2 console reports view.encoding() as u'Undefined'; I haven't tried to change it yet.
Problem description:
When I use Polish characters like ółąćź directly in a Twig template file, everything looks good; there is no problem. I tried:
echo $twig->render('hello.tpl', array('locations'=>"óóśąłłąś"));
and in this case there is no problem either.
But when I get my data from the database, the Polish characters show up as "�".
I tried to get the data with a plain procedural PHP MySQL call, and via ActiveRecord, e.g. Model::all().
There are always problems with characters from the database in the Twig template.
And yes, I set my ActiveRecord config like: dbname?charset=utf8
The answer is funny.
I tried the procedural approach again, this time running this query first:
mysql_query("SET NAMES 'utf8'", $dbLink);
It works; all characters are visible now.
With ActiveRecord the problem still appeared, so I updated ActiveRecord to the nightly build, and now everything works!
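The "�" glyph is U+FFFD, the replacement character substituted on a UTF-8 page for byte sequences that are not valid UTF-8; before SET NAMES, the server was handing back text in a single-byte charset. A minimal Python sketch of the symptom (the single-byte charset here, ISO-8859-2, is illustrative):

```python
# Polish letters encoded in a single-byte charset (ISO-8859-2 here)...
raw = "ółąćź".encode("iso-8859-2")

# ...then decoded as UTF-8, as a browser does for a UTF-8 page:
shown = raw.decode("utf-8", errors="replace")
print(shown)  # the invalid sequences come back as the U+FFFD "�" glyph
```

SET NAMES 'utf8' tells MySQL to send results encoded as UTF-8 in the first place, so the bytes decode cleanly and no replacement characters appear.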

Confused about conversion between windows-1252 and UTF-8 encoding

I have a legacy database that claims to have its collation set to windows-1252 and stores a text field's contents as
I’d
When it is displayed in the legacy web app it shows as I’d in the browser, and the browser reports a page encoding of UTF-8. I can't figure out how that conversion has been done (I'm almost certain it isn't via an on-the-fly search-and-replace). This is a problem for me because I am taking the text field (and many others like it) from the legacy database into a new UTF-8 database. A new web app displays the text from the new database as
I’d
and I would like it to show as I’d. I can't figure out how the legacy app could have achieved this (some fiddling in Ruby hasn't shown me a way to convert the string I’d to I’d).
I've tied myself in a knot here somewhere.
It probably means the previous developer screwed up data insertion (or you're screwing up somewhere). The scenario goes like this:
the database connection is set to latin1
app actually sends UTF-8 to database
database interprets received data as latin1, stores it as such (interprets ’ as ’)
app queries for the data again
database returns ’ encoded in latin1
app interprets the data as UTF-8, resulting in ’
You essentially need to do the same misinterpretation to get good data. Right now you may be querying the database through a utf8 connection, so the database returns ’ encoded in UTF-8. What you need to do is query through a latin1 connection and interpret the data as UTF-8 instead.
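The misinterpretation pair described above is easy to check in Python; this sketch reproduces both the corruption and the repair for the string in question:

```python
# What the legacy app effectively did on insert: UTF-8 bytes read as cp1252.
stored = "I’d".encode("utf-8").decode("windows-1252")
print(stored)  # Iâ€™d

# The repair: re-encode as cp1252 to recover the original UTF-8 bytes.
fixed = stored.encode("windows-1252").decode("utf-8")
print(fixed)   # I’d
```

In migration terms: read the legacy column over a latin1/cp1252 connection (so you get the raw bytes back unmangled) and decode those bytes as UTF-8 before inserting into the new database.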
See Handling Unicode Front To Back In A Web App for a more detailed explanation of all this.

setting encoding for CSV file generated by exportButton in Oracle Application Framework

Is it possible to set the encoding for data exported from an OAF page?
I have some Polish letters like 'ś' or 'ć'; the OAF page shows them correctly, but in the exported CSV file these letters appear as '?'.
Found this thread for you which might be relevant:
https://forums.oracle.com/forums/thread.jspa?messageID=3151892
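OAF's exportButton is configured on the server side (see the linked thread), but as a general illustration of the underlying fix: spreadsheet tools often need a UTF-8 byte-order mark to detect a CSV file's encoding, and without the right encoding non-ASCII letters degrade to '?'. A hypothetical Python sketch of producing such a file:

```python
import csv

# Write a CSV with a UTF-8 BOM ("utf-8-sig") so spreadsheet tools detect
# the encoding and render 'ś'/'ć' instead of '?'. Filename is illustrative.
rows = [["name"], ["ś"], ["ć"]]
with open("export.csv", "w", encoding="utf-8-sig", newline="") as f:
    csv.writer(f).writerows(rows)
```

Whether OAF exposes an equivalent setting depends on the framework version; the thread above is the place to check.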

Norwegian special characters not supported in CodeIgniter 2.0.2

I am trying to save some special Norwegian characters like æøå ÆØÅ, but they are not saved properly in the database. Sometimes such characters get trimmed, and sometimes they show up like æøå Ã.
I used htmlentities to support such characters in CodeIgniter 1.7 and it worked well.
So the problem came with the new version of CodeIgniter.
Any ideas?
I had a similar issue with one of our native languages in India (which has accented characters), and I resolved it by changing the charset values to utf8_unicode_ci in the database (table and field collations) and in the files related to data capture and display.
Let me know if that helps you.
