Oracle's SQL Developer Database Export Results in 2 Digit Years

I'm using Oracle's SQL Developer to export data into CSVs. I found that Oracle spits out the dates as dd-MMM-yy. When I bulk insert these files into SQL Server it's interpreting some of the dates incorrectly. How do I change that?
I'm an Oracle neophyte, so I might be approaching this whole thing incorrectly. I need to transfer a lot of tables/rows from Oracle to SQL Server. I have a linked server set up in SQL Server pointing to Oracle, but that takes a really long time to transfer the data (about 18 hours, even though both databases are on the same server), although it does get the dates correct.
I didn't find any good way to accomplish this other than a couple of PL/SQL scripts I couldn't get to work for me. Is it really that rare that data gets migrated from Oracle to MS-SQL?

Well, dates across database vendors are just hell.
The default date format can be set in SQL Developer Preferences > Database > NLS > Date Format. You can also set it in the session as #belayer has commented.
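For example, to change it for the current session only (a standard ALTER SESSION call; the exact format mask is your choice):
ALTER SESSION SET NLS_DATE_FORMAT = 'YYYY-MM-DD HH24:MI:SS';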
For writing CSV files or for migration projects, I would always try to control the format directly, like
SELECT id, TO_CHAR(my_date,'YYYY-MM-DD') AS my_date, my_column
FROM my_table;
Having said that, there should be a better way to move the data out of Oracle into SQL Server...

Related

How to see the vertica's procedure in DBeaver?

Hi everyone,
Does anyone know how to see Vertica's procedures in DBeaver (without using a script)?
Here is where I can see the objects for Vertica.
Here is where I can see the objects for SQL Server.
In SQL Server we can see the 'Procedures' folder, but in Vertica we cannot.
I found an earlier discussion about this.
It said:
"DBeaver supports stored procedures/views source code view only for: MySQL, Oracle, DB2, PostgreSQL, MS SQL Server, Vertica, Firebird."
Am I missing something that needs to be installed?
Stored procedures in Vertica are brand new.
They became generally available with version 11.0.1, which came out just a few weeks ago.
DBeaver can only list, in the navigator, things that it "knows" are supported by the connected DB.
I doubt you can teach new stuff to a front end in a few weeks.
And if your Vertica version is below 11.0.1, all the worse...
What would be worth trying is opening a SQL window and running:
SELECT * FROM v_catalog.user_procedures
procedure_name|language|procedure_arguments |schema_name
pivot |PL/vSQL |idlist varchar, keyname varchar, valname varchar, tbname varchar, pvtbname varchar|public
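If that works, you can narrow the result down to the schema you care about, for example (just a sketch using the columns shown in the output above):
SELECT procedure_name, procedure_arguments
FROM v_catalog.user_procedures
WHERE schema_name = 'public';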

View SQL History by User [Oracle]

I'm trying to find all (available) SQL history by user in our Oracle 11g database. I've tried using some views such as v$sql_monitor, v$sqlarea, and dba_hist_active_sess_history to try to get usernames and their executed SQL statements (joined on SID and serial#), but I'm not having any luck. Our senior DBA and DBE said they've done it before, but just told me to look in sqlarea since that holds the longest history of SQL. I'm not having any luck with this. Is this possible to do in Oracle?
Edit: We use SQL Developer. I understand that TOAD may or may not have this feature built-in but I haven't been able to find anything that accomplishes this (other than view current sessions and current SQL) in SQL Developer.
You can try something like this if you have the proper licensing to query dba_hist_active_sess_history (you need a license for the Diagnostic Pack):
select trunc(hist.sample_time, 'DD'), u.name, hist.sql_id, sql.sql_text
  from dba_hist_active_sess_history hist,
       dba_hist_sqltext sql,
       user$ u
 where hist.sql_id = sql.sql_id
   and hist.user_id = u.user#;
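If you can't query user$ (it is owned by SYS), a similar join against dba_users should return the same information (an untested sketch):
select trunc(hist.sample_time, 'DD'), u.username, hist.sql_id, sql.sql_text
  from dba_hist_active_sess_history hist
  join dba_hist_sqltext sql on hist.sql_id = sql.sql_id
  join dba_users u on hist.user_id = u.user_id;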
I think the only way to fetch the exact SQL executions of a user (I think you mean session?) is by enabling tracing.
I think this article (https://oracle-base.com/articles/misc/sql-trace-10046-trcsess-and-tkprof) gives good instructions on how it is used.
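For reference, a session-level trace is typically switched on and off with DBMS_MONITOR; the SID and serial# values below are placeholders you would look up in v$session:
EXEC DBMS_MONITOR.SESSION_TRACE_ENABLE(session_id => 123, serial_num => 45678, waits => TRUE, binds => TRUE);
-- ... let the session run its workload, then:
EXEC DBMS_MONITOR.SESSION_TRACE_DISABLE(session_id => 123, serial_num => 45678);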

how to find the query used for creation of Temporary table in Oracle sql developer

I created a temporary table in Oracle SQL Developer, but I forgot to save the script and now I want to reuse the query, but I don't remember the code I used. Is there a way to get the query that was used to create the temp table?
You can use dbms_metadata.get_ddl()
select dbms_metadata.get_ddl('TABLE', 'YOUR_TABLE_NAME_HERE')
from dual;
The result is a CLOB with the complete DDL. You might need to adjust the display in SQL Developer to make the content of that value fully visible (I don't use SQL Developer, so I don't know if that is necessary and if so, what you would need to do)
Edit:
It seems SQL Developer can't display the result of this query properly unless you use the "Run Script" option. And with that you need to use a SET LONG 60000 (or some other big number) before you run it, to see the complete source code:
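A minimal example of what that script would look like (the table name is a placeholder):
set long 60000
select dbms_metadata.get_ddl('TABLE', 'YOUR_TABLE_NAME_HERE') from dual;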

How to export large amount of data using sql developer - Oracle

I want to upload some data from the UAT DB to the DEV DB. When I try to do this with the Export function in SQL Developer, I get the error "File C:\Users\xxx\export.sql was not opened because it exceeds the maximum automatic open size".
How can I copy the UAT data to DEV ?
ORACLE Version 12C
SQL Developer Version 4.0.0.13
I found the answer below on a SQL Developer forum:
It appears that the "maximum automatic open size" is hard-coded to a value of 500000 (bytes, I believe) with no way to override it. By limiting this, we nip in the bud any potential complaints of Java OutOfMemory upon trying to open a huge file.
To view the file from within SQL Developer despite this limitation, just use the File | Open menu. For those huge files, please use an external editor. And if you don't want to open files automatically in order to suppress the warning dialog, use Tools | Preferences | Database | Export/View DDL Options and un-check the "Open Sql File When Exported" box.
Are you certain the export file does not contain all the insert rows? That would be a bug unless you hit an OutOfMemory or disk full condition. I just tried your scenario on a 55000 row table that produced an export.sql of about 20MB. All rows were included.
Regards,
Gary Graham
SQL Developer Team
In summary, it suggests that SQL Developer is not the best tool for opening a large data file.
I hope Gary's answer guides you to some extent.
If you need some ideas for tools that can open large files, check this LINK
Solution 1:
Set these values to some higher value!
Solution 2:
Change "Save To" to "Worksheet"!
I was getting this error when exporting a database in insert format; selecting the loader format on the first Export wizard screen fixed the issue.
This is probably because the insert format creates a single SQL script with the DDL and the data as insert statements, so the whole database is dumped into one script file.
The loader option produces multiple files: a control file, data files, and SQL files, with separate files for each table. As a result, the export consists of hundreds of files and no single file reaches the size limit.
This may not work, however, for a single table with a very large amount of data, as that table's data file would still hit the limit.
You can try different options, like the ones below.
In SQL Developer, when you right-click on a table and click Export, the export wizard is launched. There you can change "Save As" to "Separate Files", which exports the data as SQL split across separate files, or you can change the format type in the same wizard to CSV, which exports the data in CSV format.
If you want to transfer large amounts of data (or small amounts, too) from one database to another, you should consider the tools that were specifically designed for such tasks.
First and foremost, look into data pump. It has a bit of a learning curve, though.
exp and imp (also by Oracle) are a bit easier to handle, but they're older and not nearly as powerful as data pump.
You might also want to look into the SQL*Plus copy command.
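For orientation, a schema-level Data Pump export/import typically looks something like this from the command line (the connect strings, schema name, directory object, and file names below are placeholders):
expdp system@UATDB schemas=MY_SCHEMA directory=DATA_PUMP_DIR dumpfile=my_schema.dmp logfile=my_schema_exp.log
impdp system@DEVDB schemas=MY_SCHEMA directory=DATA_PUMP_DIR dumpfile=my_schema.dmp logfile=my_schema_imp.log
Note that the dump file is written to the directory object on the database server, so it may need to be copied between servers before the import.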
There is a trick to copy a large chunk of data (from SQL Developer) into an Excel sheet.
Steps to follow: right-click ---> Export Data ---> select format type as 'Text' ---> select type as "Clipboard" ---> open an Excel sheet and paste, keeping the note below in mind :)
Then paste the data.
NOTE: Do not paste the data into the first cell of the Excel sheet. Ctrl+V into any of the other columns.
This will work.
Thanks
You can spool the query and save the results as CSV or XLSX files for larger result sets. For example:
spool "D:\Temp\Report.csv"
SELECT /*csv*/ id, name, age FROM EMP;
spool off;
1 - You can create a database link (db link) on the DEV DB pointing to the UAT DB and INSERT rows into the DEV DB over it (see the sketch after this list).
2 - Or you can build a PL/SQL procedure in the UAT DB to export the data to files in CSV format, and in the DEV DB use Oracle external tables to SELECT from those files.
Be careful with DATE columns; write them out using TO_CHAR.
3 - Use Data Pump to export the data from the UAT DB and then import it into the DEV DB; it's a bit tricky.
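A minimal sketch of option 1 (the link name, TNS alias, credentials, and table name are all placeholders):
-- On the DEV database
CREATE DATABASE LINK uat_link
  CONNECT TO uat_user IDENTIFIED BY uat_password
  USING 'UATDB';  -- TNS alias of the UAT database
INSERT INTO my_table
  SELECT * FROM my_table@uat_link;  -- pull the rows straight across the link
COMMIT;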
These commands run both in SQLcl (by Oracle) and in SQL Developer, so this is easy:
set feedback only -- for Oracle 12.2+, turn off terminal output
set sqlformat insert -- data in "insert into ..." format
-- set sqlformat csv -- data in csv format
spool /path/to/your/file.sql
select * from t; -- lines to export
spool off
set feedback off -- restore terminal output
The simplest way to do this is to change the "Save As" setting (shown below in the screenshot) to save to multiple files instead of a single file while exporting.

Same stored procedure acts differently on two/(three) different IDEs

I just created a stored procedure in an MS SQL DB using TOAD.
What it does is accept an ID that some records are associated with, and then insert those records into a table.
The next part of the stored procedure uses the ID input to search the table the items were inserted into and return them as the result set to the user, just to confirm that the information got inserted.
In TOAD, it does what is expected: it inserts the data and returns the information using just the stored procedure.
In Oracle SQL Developer, however, it does the insert and ends at that. It seems not to execute the second part of the stored procedure, which is a select statement.
I have a feeling this is because of the JDBC adapter. Another reason I'm asking is that I'm using the reporting tool Pentaho Report Designer, and it would really make things easier if I could do both things at the same time. Pentaho Report Designer also uses JDBC adapters; not a coincidence, maybe?
But if there are other things that I can tweak, I'd really appreciate it.
This is a guess, but worth considering...
There are things called "batches", which are sets of SQL statements that are all sent to the server at once and executed by the server as one set of statements, within a single server-side session. Sending a set of SQL statements to the server as a batch will often produce different results than sending them one at a time, where each statement is executed in its own session.
I haven't used TOAD (or Oracle) in a while, but as I recall, it dealt with batches differently than the other IDE I used. If the second statement in your set relies on being in the same session as the first, and in one IDE it ends up in a separate session, then this might explain what is happening.
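One way to picture that session dependence (an illustrative sketch, not the poster's actual code): with an Oracle global temporary table, rows inserted in one session are invisible to every other session, so an "insert then select" pair only returns data when both statements share a session.
-- Session-private scratch table: each session sees only its own rows
CREATE GLOBAL TEMPORARY TABLE staging_tmp (id NUMBER, val VARCHAR2(100))
  ON COMMIT PRESERVE ROWS;
INSERT INTO staging_tmp VALUES (1, 'hello');   -- run in session A
SELECT * FROM staging_tmp;  -- returns the row in session A, returns nothing in session B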