TOAD Schema Browser thousand separator - Oracle

Is it possible to display data in TOAD's Schema Browser with a thousands separator? I.e., is there one setting so that all tables are displayed with a thousands separator?
Thank you.

Not possible, as far as I can tell.
If you search TOAD's options for "thousand", you'll find the "Thousand Separator" setting in the "General" group of options.
However, it only specifies which character (comma or dot) acts as the thousands separator, so that you could use e.g.
select to_char(salary, '999G990D00') from employees
                           ^   ^
                           |   |
                           |   +-- decimal separator (D)
                           +------ thousands separator (G)
It doesn't affect data displayed in the "Data" tab of the Schema Browser (or Editor's "Data Grid").
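Outside the database, the same grouping behavior can be reproduced in plain code; a minimal Python sketch of what the '999G990D00' mask does (the helper name and sample value are made up for illustration):

```python
# Format a numeric value with thousands and decimal separators,
# analogous to Oracle's to_char(salary, '999G990D00') mask.
def format_salary(value: float) -> str:
    # "," inserts the grouping separator, ".2f" keeps two decimal places
    return f"{value:,.2f}"

print(format_salary(24000))       # 24,000.00
print(format_salary(10036356.51)) # 10,036,356.51
```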

Related

Text Fields Acceptable in SQL Loader

Are there any reserved text characters in SQL Loader?
Any special characters like &, _, " etc. which cannot be loaded into Oracle table columns?
My file column separator is a pipe (|) character, and I will escape it so it is accepted within my text columns too, but are there any other reserved characters which I cannot use in the data fields to be interfaced?
There are none, as far as I can tell.
However, I'd suggest choosing delimiters wisely, because if the text you're loading contains the delimiter, you'll have problems figuring out whether e.g. a pipe sign is a delimiter or part of the text to be loaded.
If you can prepare the input data so that values are optionally enclosed in double quotes, you'll be able to avoid such problems. However, why make it complicated if it can be simple?
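The ambiguity, and how quoting resolves it, can be shown with a quote-aware parser; a hedged Python sketch using the standard csv module (the sample row is made up):

```python
import csv
import io

# A pipe-delimited row where the middle field itself contains a pipe;
# enclosing values in double quotes keeps the parser from splitting it.
raw = '"first"|"a|b"|"last"\n'

reader = csv.reader(io.StringIO(raw), delimiter="|", quotechar='"')
row = next(reader)
print(row)  # the quoted pipe survives as data: ['first', 'a|b', 'last']
```

Without the quotes, the same line would parse into four fields instead of three.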

data factory special character in column headers

I have a file I am reading into blob storage via Data Factory.
It's formatted in Excel. Some of the column headers have special characters and spaces, which isn't good if I want to take it to CSV or Parquet and then SQL.
Is there a way to correct this in the pipeline?
Example
"Activations in last 15 seconds high+Low" "first entry speed (serial T/a)"
Thanks
Normally, Data Flow can handle this for you by adding a Select transformation with a Rule:
Uncheck "Auto mapping".
Click "+ Add mapping"
For the column name, enter "true()" to process all columns.
Enter an appropriate expression to rename the columns. This example uses regular expressions to remove any character that is not a letter.
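The renaming expression itself is written in Data Flow's expression syntax, but the idea translates directly; a Python sketch of the same regular-expression cleanup, using the header names from the question:

```python
import re

headers = ["Activations in last 15 seconds high+Low",
           "first entry speed (serial T/a)"]

# Remove any character that is not a letter, mirroring the rename rule.
cleaned = [re.sub(r"[^A-Za-z]", "", h) for h in headers]
print(cleaned)
```

In practice you might prefer to replace the offending characters with underscores rather than delete them, so the renamed columns stay readable.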
SPECIAL CASE
There may be an issue with this if the column name contains forward slashes ("/"). I accidentally came across this in my testing:
Every one of the columns not mapped contains forward slashes. Unfortunately, I cannot explain why this would be the case, as Data Flow is clearly aware of the column name. It can be addressed manually by adding a Fixed rule for EACH offending column, which is obviously less than ideal.
ANOTHER OPTION
The other thing you could try is to pre-process the text file with another Data Flow, using a Source dataset that has no delimiters. This would give you the contents of each row as a single column. If you could get a handle on just the first row, you could remove the special characters.

Thousand separator in SSRS

I have a column in an SSRS report which should be shown with a thousands separator. I have already tried the thousand separator check box in the text box properties, and it doesn't work.
The column contains decimal values like 10036356.51, -12345.75, etc.
What should be done to display these numbers with a thousands separator?
The thousand separator does not work for character data types like CHAR/VARCHAR.
Cast your column value to DECIMAL or NUMERIC and try again.
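The underlying issue is general: format masks apply to numbers, not strings. A small Python illustration of the same idea, with Decimal standing in for the cast (the sample value is taken from the question):

```python
from decimal import Decimal

raw = "10036356.51"    # value stored as text, like CHAR/VARCHAR
number = Decimal(raw)  # the "cast to DECIMAL" step

# A thousands-separator format only works once the value is numeric;
# applying it to the raw string would raise an error instead.
print(f"{number:,}")   # 10,036,356.51
```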

Send a Flat file attachment in the workflow in Informatica Developer

In a mapping we use a delimited flat file having 3 columns. The columns are separated by commas. But I have a requirement that one of the columns contains 2 commas within its value, so how should I process that column in the mapping?
You should have the information quoted with double quotes, so that whatever is within the quotes is skipped; this way you can differentiate between a comma that is part of the information and a comma acting as a column separator.
We don't know what you have tried, but count the number of commas on each line and separate accordingly (if possible).
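The quoting suggestion can be demonstrated with any quote-aware parser; a hedged Python sketch (the sample line is made up, with two commas inside the middle value):

```python
import csv
import io

# Three columns, comma-separated; the middle value contains two commas
# and is therefore enclosed in double quotes.
line = 'A,"x,y,z",C\n'

row = next(csv.reader(io.StringIO(line)))
print(row)  # ['A', 'x,y,z', 'C'] - still three columns
```

A naive split on "," would produce five fields here; the quotes are what keep the record at three.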

Access 2007 export table to csv decimal

I have some tables in Access that I'm trying to export to CSV so that I can import them into Oracle. I don't use the export via ODBC because I have 70K - 500K records in some of these tables and that feature takes way too long, and I have about 25 tables to do, so I want to export to CSV (which is much faster) and then load via sqlldr.
Some numeric columns can go out to 16 decimal places and I need them all. However, when I export, they only go out to 2. I've done some googling around this. Regional settings only allow 9 decimal places (Win XP), and formatting the column via a query changes it to text, which I don't want when I import into Oracle (maybe I can use to_number() in the control file?).
Why is this so difficult? Why can't Access just export numeric columns as they are?
In my Access 2007 test case, I'm not seeing quite the same result you described. When I export to CSV, I get all the decimal places.
Here is my sample table with decimal_field as decimal(18, 16).
id  some_text  decimal_field
--  ---------  ------------------
 1  a          1.0123456789012345
 2  b          2
Unfortunately, those exported decimal_field values are quoted in the CSV:
"id","some_text","decimal_field"
1,"a","1.0123456789012345"
2,"b","2"
The only way I could find to remove the quotes surrounding the decimal_field values also removed the quotes surrounding genuine text values.
If quoted numeric values are unworkable, perhaps you could create a VBA custom CSV export procedure, where you write your values to each file line formatted as you wish.
Regarding "Why is this so difficult?", I suspect the decimal data type is the culprit. I don't recall encountering this type of problem with other numeric data types. Unfortunately, that's only my speculation and won't help even if it's correct.
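The custom-export idea can be sketched outside VBA as well; a hedged Python version that quotes only text values and writes numerics unquoted at full precision (the table contents mirror the sample above, and the in-memory buffer stands in for the output file):

```python
import csv
import io
from decimal import Decimal

# Sample rows matching the test table: id, some_text, decimal_field
rows = [(1, "a", Decimal("1.0123456789012345")),
        (2, "b", Decimal("2"))]

buf = io.StringIO()
# QUOTE_NONNUMERIC quotes strings but leaves numeric values bare,
# so the 16 decimal places survive without surrounding quotes.
writer = csv.writer(buf, quoting=csv.QUOTE_NONNUMERIC)
writer.writerow(["id", "some_text", "decimal_field"])
for row in rows:
    writer.writerow(row)

print(buf.getvalue())
```

Replacing the buffer with an open file handle would give the actual CSV export.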
Create a query selecting all the records from your table. Format the troublesome column by using the format function:
Select Format(Fieldname,"0000.00000") AS FormattedField
Save this query and export the query instead of the table.
One disadvantage of this approach is that your numeric field is then treated as text, so you get quotes around the exported numbers; and if you use the option not to enclose text in quotes, then any actual text fields you export in the same query lose their quotes too.
The other (quicker, dirtier, bodge job) method is to export first into Excel and from there to text. This leaves decimal places intact, but obviously it's not very elegant.