dbsaint - Retrieve from Excel - Oracle

How can I retrieve data (using SQL) from Excel into a table in an Oracle database? I am using dbsaint.
Instead of DBSAINT, which developer tool should I use for this purpose?

The easiest way to do this is to export the data from Excel into a CSV file. Then use an external table to insert the data into your database table.
Exporting the CSV file can be as simple as "Save as ...". But watch out if your data contains commas. In that case you will need to ensure that the fields are delimited safely and/or that the separator is some other character which doesn't appear in your data: a set of characters like |~| (pipe tilde pipe) would work.
External tables were introduced in Oracle 9i. They are just like normal heap tables except their data is held in external OS files rather than inside the database. They are created using DDL statements and we can run SELECTs against them (they are read only).
Some additional DB infrastructure is required - the CSV files need to reside in an OS directory which is defined as an Oracle directory object. However, if this is a task you're going to be doing on a regular basis then the effort is very worthwhile.
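For illustration, here is a minimal sketch of the directory object and external table DDL. The directory path, file name and all table/column names are invented, and the target table EMPLOYEES is assumed to already exist:

CREATE DIRECTORY csv_dir AS '/data/csv';

CREATE TABLE employees_ext (
  emp_id   NUMBER,
  emp_name VARCHAR2(100),
  salary   NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1                            -- skip the header row
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('employees.csv')
)
REJECT LIMIT UNLIMITED;

-- copy the external data into an ordinary heap table
INSERT INTO employees SELECT * FROM employees_ext;
COMMIT;

If you use the |~| separator suggested above, change FIELDS TERMINATED BY ',' to FIELDS TERMINATED BY '|~|'.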
I don't know much about DbSaint; it's some kind of database IDE like TOAD or SQL Developer, but aimed at the cheap'n'cheerful end of the market. It probably doesn't support this exact activity, and it certainly can't handle the Excel-to-CSV export for you.

Related

Can we find a table that is used in a set of FMBs without opening each one individually in Oracle Forms?

I am using Oracle Forms versions 11g and 12c.
Is it possible to find where a table (e.g. table1) is used in the Oracle Forms application screens, including LOVs, without opening each FMB individually and searching in it?
In total there are around 50-75 FMBs in the application.
Thanks
Back when Forms was a new software product, in its 3.0 version (or even lower), you could choose whether to keep the form source in the database or in the file system.
If it was in the database, you could have written a query which selects data from the data dictionary and - hopefully - extracts the tables' names.
If it was in the file system, the file extension was .INP (not .FMB) and it was a textual file; that means you could even have created a form using a text editor! Nobody probably did that, but hey - you could have done it.
.FMB is no longer a textual file. Yes, you can open it in a text editor (such as Notepad++) and search for e.g. FROM (because any table used in the form's PL/SQL units or LoVs is part of a SELECT statement, which requires the FROM keyword) and get a list of matches.
Yes, you'll get "duplicates" if any table is referenced more than once.
Another option is to write a program which will parse the .FMB file and extract tables' names (I can't help with that, though).

Loading csv and writing bad records with individual errors

I am loading a csv file into my database using SQL*Loader. My requirement is to create an error file combining the error records from the .bad file and their individual errors from the log file. Meaning, if a record has failed because the date is invalid, then against that record, in a separate error description column, "Invalid date" should be written. Is there any way that SQL*Loader provides to combine the two? I am a newbie to SQL*Loader.
Database being used: Oracle 19c.
You might be expecting a little bit too much of SQL*Loader.
How about switching to an external table? In the background, it still uses SQL*Loader, but the source data (which resides in a CSV file) is accessible to you by means of a table.
What does it mean to you? You'd write some (PL/)SQL code to fetch data from it. Therefore, if you wrote a stored procedure, there are numerous options you can use - perform various validations, store valid data into one table and invalid data into another, decide what to do with invalid values (discard? Modify to something else? ...), handle exceptions - basically, everything PL/SQL offers.
Note that this option (generally speaking) requires the file to reside on the database server, in a directory which is the target of an Oracle directory object. The user who will be manipulating the CSV data (i.e. the external table) will have to be granted privileges on that directory by its owner - the SYS user.
SQL*Loader, on the other hand, runs on a local PC so you don't have to have access to the server itself but - as I said - doesn't provide that much flexibility.
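As a rough illustration of the external table approach: assuming you have already created an external table CSV_EXT over the CSV file with every field declared as VARCHAR2, plus a target table and an error table (all of these names and columns are invented), a stored block could validate row by row and record why each bad record failed:

BEGIN
  FOR r IN (SELECT * FROM csv_ext) LOOP
    BEGIN
      -- try to convert and store the row as "good" data
      INSERT INTO employees (id, hire_date)
      VALUES (TO_NUMBER(r.id), TO_DATE(r.hire_date, 'YYYY-MM-DD'));
    EXCEPTION
      WHEN OTHERS THEN
        -- keep the raw record together with the reason it failed
        INSERT INTO employees_err (id, hire_date, error_description)
        VALUES (r.id, r.hire_date, SQLERRM);
    END;
  END LOOP;
  COMMIT;
END;
/

Row-by-row processing like this is not the fastest option, but it gives you the per-record error description you asked for; for large volumes you could look at DML error logging (LOG ERRORS INTO) instead.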
It is hard to give you a code answer without an example.
If you want to do this, I can suggest two ways.
From Linux: if you loaded the data and skipped the errors, you would have to do two runs. That is not an easy way and not effective.
From Oracle: create a second table with VARCHAR2 columns of the same length as in the original. Adapt your CTL file so every field is loaded as plain text, and load the records from the bad file into that second table.
Finally, MERGE the columns into the original table.
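A rough sketch of that second approach - the table names, column names and date format are invented, and VALIDATE_CONVERSION needs Oracle 12.2 or later (so it is available in 19c):

CREATE TABLE emp_stg (
  id        VARCHAR2(38),
  hire_date VARCHAR2(20)
);

-- after loading the .bad records into EMP_STG with a text-only control file:
MERGE INTO emp e
USING (
  SELECT TO_NUMBER(id)                    AS id,
         TO_DATE(hire_date, 'YYYY-MM-DD') AS hire_date
  FROM   emp_stg
  WHERE  VALIDATE_CONVERSION(id AS NUMBER) = 1
  AND    VALIDATE_CONVERSION(hire_date AS DATE, 'YYYY-MM-DD') = 1
) s
ON (e.id = s.id)
WHEN MATCHED THEN
  UPDATE SET e.hire_date = s.hire_date
WHEN NOT MATCHED THEN
  INSERT (id, hire_date) VALUES (s.id, s.hire_date);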

Replace invalid character in Oracle (by editing dmp file)

We have a portal written in php/mysql and an enterprise application based on Java EE and Oracle. Recently we found out that a certain Unicode character (0643 to be precise) is invalid (due to improper data entry by end users) in text columns and must be changed to another character (06A9).
In MySQL I simply changed the export file using a text editor's find and replace tool. But in Oracle, the dmp file is a binary file and I have no idea how to edit it.
How can I change the invalid character?
Is there an alternative to iterating through all text columns in all tables?
(I have saved that as a last resort!)
Editing an Oracle dump file may be possible but isn't practical; even if you could get in and change something you'd risk corrupting it, and I doubt Oracle support would be impressed. (See this AskTom question for example).
If you're using data pump and you know which column(s) the data is in you might be able to use the REMAP_DATA parameter to change it on the fly, or the QUERY parameter to skip the data, but it doesn't sound like you're in that situation. You could potentially add temporary constraints to the relevant column(s) to block the value, so import would reject (and log) the affected rows, but that's painful and messy.
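If REMAP_DATA is an option, here is a minimal sketch of the kind of function it needs - the package name is invented; only the two code points come from your question:

CREATE OR REPLACE PACKAGE remap_pkg AS
  FUNCTION fix_char (p_text IN VARCHAR2) RETURN VARCHAR2;
END remap_pkg;
/
CREATE OR REPLACE PACKAGE BODY remap_pkg AS
  FUNCTION fix_char (p_text IN VARCHAR2) RETURN VARCHAR2 IS
  BEGIN
    -- replace U+0643 with U+06A9
    RETURN REPLACE(p_text, UNISTR('\0643'), UNISTR('\06A9'));
  END fix_char;
END remap_pkg;
/

You would then reference it on the import with something like impdp ... REMAP_DATA=my_schema.my_table.my_column:my_schema.remap_pkg.fix_char (the schema, table and column names there are placeholders).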
If you do have to check all columns on all tables, this link may be helpful.

csv viewer on Windows environment for 10MM-line file

We need a csv viewer which can look at 10MM-15MM rows in a Windows environment; some filtering capability on each column (some regex or text searching) is fine.
I strongly suggest using a database instead, and running queries (eg, with Access). With proper SQL queries you should be able to filter on the columns you need to see, without handling such huge files all at once. You may need to have someone write a script to input each row of the csv file (and future csv file changes) into the database.
I don't want to be the end user of that app. Store the data in SQL. Surely you can define criteria to query on before generating a .csv file. Give the user an online interface with the column headers and filters to apply. Then generate a query based on the selected filters, providing the user only with the lines they need.
This will save many people time, headaches and eye sores.
We had this same issue and used a 'report builder' to build the criteria for the reports prior to actually generating the downloadable csv/Excel file.
As others suggested, I would also choose an SQL database. It's already optimized to perform queries over large data sets. There are a couple of embedded databases like SQLite or FirebirdSQL (embedded).
http://www.sqlite.org/
http://www.firebirdsql.org/manual/ufb-cs-embedded.html
You can easily import a CSV into an SQL database with just a few lines of code and then build a SQL query, instead of writing your own solution to filter large tabular data.
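For example, with the sqlite3 command-line shell (the file, table and column names here are made up):

$ sqlite3 rows.db
sqlite> .mode csv
sqlite> .import big_file.csv rows
sqlite> CREATE INDEX rows_city_idx ON rows(city);
sqlite> SELECT * FROM rows WHERE city LIKE 'New%' LIMIT 100;

.import creates the table from the CSV header row if it doesn't already exist, and the index keeps repeated filtering on that column fast.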

Load data into text file from Oracle database views

I want to load into a text file the data that is generated after executing "views" in Oracle. How can I achieve this in Oracle using UNIX? For example -
I want the same in Oracle on a unix box. Please help me out as it already consumes lots of time.
Your early response is highly appreciated!!
As Thomas asked, we need to know what you are doing with the "flat file". For example, if you're loading it into a spreadsheet or doing some other processing that expects a defined format, then you need to use SQL*Plus and spool to a file. If you're looking to save a table (data + table definition) for moving it to another Oracle database then EXP/IMP is the tool to use.
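A minimal SQL*Plus sketch of the spooling option, assuming a view called MY_VIEW and an output path of /tmp (both are made-up names); save it as extract.sql and run it with sqlplus -s user/password@db @extract.sql:

SET PAGESIZE 0 LINESIZE 32767 FEEDBACK OFF TRIMSPOOL ON
SET COLSEP '|'
SPOOL /tmp/my_view.txt
SELECT * FROM my_view;
SPOOL OFF
EXIT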
We generally describe the data retrieval process as "selecting" from a table/view, not "executing" a table/view.
If you have access to directories on the database server, and authority to create "Directory" objects in Oracle, then you have lots of options.
For example, you can use the UTL_FILE package (part of the PL/SQL built-ins) to read or write files at the operating system level.
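A small PL/SQL sketch of the UTL_FILE option - it assumes a directory object OUT_DIR already exists and that the view is called MY_VIEW with two columns; all of those names are invented:

DECLARE
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  l_file := UTL_FILE.FOPEN('OUT_DIR', 'my_view.txt', 'w');
  FOR r IN (SELECT col1, col2 FROM my_view) LOOP
    UTL_FILE.PUT_LINE(l_file, r.col1 || '|' || r.col2);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/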
Or use the "external table" functionality to define objects that look like single tables to Oracle but are actually flat files at the OS level. Well documented in the Oracle docs.
Also, for one-time tasks, most of the tools for working with SQL and PL/SQL provide facilities for moving data to and from the database. In the Windows environment, Toad's good at that. So is Oracle's free SQL Developer, which runs on many platforms. You wouldn't want to use those for a process that runs every day, but they're fine for single moves. I've generally found these easier to use than SQL*Plus spooling, but that's a primitive version of the same functionality.
As stated by others, we need to know a bit more about what you're trying to do.
