How to create a CSV file in ASCII format in Talend

I need to create a CSV file by fetching data from a database and send this file to another server. The client on the other server needs this file in ASCII format.
How can I create a CSV file in ASCII format?
By default Talend supports ISO and UTF-8 encodings; what about ASCII?

ASCII is effectively covered by the defaults: ASCII is a 7-bit subset of both the ISO Latin encodings and UTF-8, so as long as the data you write contains only ASCII characters, the file Talend produces is already a valid ASCII file. Please see this article for more information.
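Since Talend jobs compile to plain Java, you can also make the encoding explicit yourself. A minimal standalone sketch of the idea (file name and row data are hypothetical placeholders, not generated Talend job code) that writes with the US-ASCII charset so any non-ASCII character fails loudly instead of being silently substituted:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class AsciiCsvWriter {
    public static void main(String[] args) throws IOException {
        // Hypothetical rows already fetched from the database.
        List<String[]> rows = List.of(
                new String[] {"1", "Widget", "9.99"},
                new String[] {"2", "Gadget", "14.50"});

        // US-ASCII is a subset of both ISO-8859-1 and UTF-8, so a file written
        // this way is readable by any client that expects plain ASCII.
        // Files.newBufferedWriter uses a strict encoder, so a non-ASCII
        // character makes the write fail instead of being replaced with '?'.
        try (BufferedWriter out = Files.newBufferedWriter(
                Path.of("export.csv"), StandardCharsets.US_ASCII)) {
            for (String[] row : rows) {
                out.write(String.join(",", row));
                out.newLine();
            }
        }
    }
}
```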

Related

Save .txt file in IBM iSeries database from a procedure in GeneXus

I am using a procedure to generate a .txt file on my machine, but what I want is for the file, once generated, to be saved in an IBM iSeries database rather than on my machine.
I am new to GeneXus and to the IBM iSeries database, so I don't know how I can achieve that.
I am using .NET and GeneXus 10 V3.
You can generate the .txt file and then save it in the iSeries in a BLOB field. If you need to work with the file's content inside the database, you can convert it to Base64 before sending it to the iSeries and save it in a LONGVARCHAR field.
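Illustratively, the Base64 round trip itself is a one-liner in most languages; a plain-Java sketch (file name is a placeholder, and the same idea applies with the .NET/GeneXus built-ins):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class FileToBase64 {
    public static void main(String[] args) throws IOException {
        // Read the generated .txt file and encode it as Base64 text,
        // which can then be stored in a LONGVARCHAR column on the iSeries.
        byte[] raw = Files.readAllBytes(Path.of("report.txt"));
        String encoded = Base64.getEncoder().encodeToString(raw);

        // 'encoded' would be bound to an INSERT/UPDATE against the iSeries table;
        // decoding later restores the original bytes exactly.
        byte[] restored = Base64.getDecoder().decode(encoded);
        System.out.println("round-trip ok: " + (restored.length == raw.length));
    }
}
```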

How to convert a JSON file to Excel / SQL query

I have an ETL process where the supplemental/delta load is generated by the source system vendor team as a JSON file and given to the dev team to load into a table. The source system has agreed to provide the testing team with an Excel spreadsheet containing the changes/updates to the data. Is there a way the JSON file can be converted to Excel using some code/macro? There is no indicator or date field to track changes in the loaded table, so the inputs are the Excel sheet and the JSON file. Since it is a regulated industry, using online converters is not recommended. Any ideas on how to do this, or better, any other approach for testing the data, would be helpful.
Is there a way the JSON file can be converted to Excel using some code/macro?
Maybe, but asking for such "libraries" is off topic here.
You've listed Hadoop in your tags, so you are welcome to drop the JSON into HDFS and define a table schema over it in Hive (or Impala) using a JSON SerDe.
Then you can pull the data into Excel using an ODBC connector for Hive (or Impala).
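A rough sketch of that route through the Hive JDBC driver (host, credentials, HDFS path, and column names are assumptions for illustration; the SerDe class shown ships with hive-hcatalog-core):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class JsonOverHive {
    public static void main(String[] args) throws Exception {
        // Assumed HiveServer2 endpoint; replace with your cluster's host/port.
        String url = "jdbc:hive2://hive-host:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "user", "");
             Statement stmt = conn.createStatement()) {

            // External table over the JSON files already dropped into HDFS.
            // Column names are placeholders for whatever the vendor's JSON contains.
            stmt.execute(
                "CREATE EXTERNAL TABLE IF NOT EXISTS delta_load (" +
                "  record_id STRING, field_name STRING, new_value STRING)" +
                " ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'" +
                " LOCATION '/data/vendor/delta/'");

            // Once the table exists it can be queried here, or pulled straight
            // into Excel through the Hive ODBC connector.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT record_id, field_name, new_value FROM delta_load LIMIT 10")) {
                while (rs.next()) {
                    System.out.printf("%s %s %s%n",
                            rs.getString(1), rs.getString(2), rs.getString(3));
                }
            }
        }
    }
}
```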

Oracle - Access a remote file to read

I have this very specific requirement.
My database server is running on some Linux server X, where I have written a stored procedure that reads a file from a DIRECTORY object and creates an XML table based on the content of that XML file.
Now, the file in question can come from any machine, i.e. it is uploaded by a user in a browser, and then we need to process it with the stored procedure.
Is there a way I can access a file on my local machine from the database server without mount/FTP? I mean, is there any utility in Oracle which can access the client's file system to read the file content?
is there any utility in Oracle which can access the client's file system to read the file content?
No, there is not. A PL/SQL program cannot reach your client PC. You have to upload the file to the server first; then you can use UTL_FILE to read it.
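Since the upload has to happen from the client side, one way the client application could push the file content to the database as bind data over JDBC is sketched below (table name, column, and connection details are hypothetical); the server-side procedure then parses the stored content instead of reading from a DIRECTORY:

```java
import java.io.Reader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class UploadXmlToOracle {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details and staging table; the point is that
        // the file content travels to the server as bind data, so the database
        // never needs access to the client's file system.
        String url = "jdbc:oracle:thin:@dbhost:1521/ORCL";
        Path xmlFile = Path.of("C:/uploads/input.xml");

        try (Connection conn = DriverManager.getConnection(url, "app_user", "secret");
             Reader reader = Files.newBufferedReader(xmlFile, StandardCharsets.UTF_8);
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO xml_staging (file_name, file_content) VALUES (?, ?)")) {
            ps.setString(1, xmlFile.getFileName().toString());
            ps.setCharacterStream(2, reader);   // streamed into a CLOB column
            ps.executeUpdate();
            // A server-side procedure can now parse xml_staging.file_content
            // (e.g. with XMLTYPE) instead of reading a file from a DIRECTORY.
        }
    }
}
```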

SSIS Flat File Import into Oracle failing despite DT_WSTR definitions

I am getting error messages about being unable to convert between Unicode and non-Unicode character sets when trying to import a flat file via SSIS.
My flat file is UTF-8 encoded according to Notepad++. The file contains characters such as the "microgram" character (µg), for example.
My flat file connection manager is set up to use code page 65001 (UTF-8), all of its columns are defined as DT_WSTR, and the data previews OK there.
My database is Oracle 11g, and I am using the Microsoft OLE DB provider for Oracle.
I have tried VARCHAR2 and NVARCHAR data types in Oracle, but when I connect my DT_WSTR (65001) columns to my Oracle tables, I get the Unicode conversion error.
I have tried conversion steps in my SSIS package to convert to DT_WSTR and DT_STR. I had some success with DT_STR, but my microgram symbol got scrambled.
How hard should this be? I set up a 65001 text file connection with DT_WSTR column types, and I cannot for the life of me connect it to Oracle using VARCHAR2 or NVARCHAR.
Any advice appreciated.
Stephen
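As an aside on the scrambled symbol: µ is U+00B5, which takes two bytes in UTF-8 and has no representation in a pure ASCII path, so any non-Unicode hop mangles or drops it. A small illustrative sketch (Java, not SSIS-specific) of the two failure modes:

```java
import java.nio.charset.StandardCharsets;

public class MicrogramEncoding {
    public static void main(String[] args) {
        String value = "5 µg";   // the micro sign is U+00B5

        // In UTF-8 the micro sign is two bytes (0xC2 0xB5). If those bytes are
        // later interpreted as a single-byte code page, they show up as two
        // garbage characters, i.e. the "scrambled" symbol described above.
        byte[] utf8 = value.getBytes(StandardCharsets.UTF_8);
        String misread = new String(utf8, StandardCharsets.ISO_8859_1);
        System.out.println(misread);   // prints "5 Âµg"

        // Forcing the string into pure ASCII simply loses the character.
        byte[] ascii = value.getBytes(StandardCharsets.US_ASCII);
        System.out.println(new String(ascii, StandardCharsets.US_ASCII)); // "5 ?g"
    }
}
```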

Loading data from a flat file to a table using Informatica, with both English and foreign-language characters such as Chinese

I am loading data from a flat file to a table using Informatica. The file has both English and foreign-language characters, such as Chinese and others. The foreign-language characters are not displayed properly after loading. How can this problem be solved?
I could try to solve it by using UTF-16 encoding, but earlier I was using UTF-8.
Start with the source in the Designer. Are you able to see the data correctly in the source qualifier preview? If not, you might want to set the flat-file source definition encoding to UTF-8.
The Integration Service should be running in Unicode mode, not ASCII mode. You can check this in the Integration Service properties in the Admin Console.
The target should use UTF-8 encoding.
Check the relational connection encoding (if the target is a database) in Workflow Manager to ensure it is UTF-8.
If the problem persists, write the output to a UTF-8 flat file and check whether the data loads properly there. If it does, the issue is with writing to the database.
Check the database settings such as NLS_LANG and NLS_CHARACTERSET (for Oracle); see the sketch below for a quick way to query them.
Sadagopan
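For the Oracle check mentioned above, a quick sketch using plain JDBC (connection details are placeholders; NLS_DATABASE_PARAMETERS is the standard Oracle view to query):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CheckOracleCharset {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; NLS_DATABASE_PARAMETERS is a standard
        // Oracle view, so the query itself needs no special privileges beyond SELECT.
        String url = "jdbc:oracle:thin:@dbhost:1521/ORCL";
        try (Connection conn = DriverManager.getConnection(url, "app_user", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT parameter, value FROM nls_database_parameters " +
                 "WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET')")) {
            // A value such as AL32UTF8 can store Chinese text;
            // single-byte character sets like WE8ISO8859P1 cannot.
            while (rs.next()) {
                System.out.println(rs.getString(1) + " = " + rs.getString(2));
            }
        }
    }
}
```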
You need to find out the encoding of the Integration Service that runs the loading workflow. Informatica supports three different encodings for this (UTF-8, ASCII, and Windows-1252); you need to make sure yours is UTF-8. You also need to tell the source qualifier in the workflow to use the right encoding to read the file (which could be UTF-8 or UTF-16). Finally, you must make sure your database tables use a character set that supports Chinese.
