I want to upload some data from a UAT DB to a DEV DB. When I try to do this with the Export function in SQL Developer, I get an error: File C:\Users\xxx\export.sql was not opened because it exceeds the maximum automatic open size
How can I copy the UAT data to DEV?
Oracle version: 12c
SQL Developer version: 4.0.0.13
I found the answer below on a SQL Developer forum:
It appears that the "maximum automatic open size" is hard-coded to a value of 500000 (bytes, I believe) with no way to override it. By limiting this, we nip in the bud any potential complaints of Java OutOfMemory upon trying to open a huge file.
To view the file from within SQL Developer despite this limitation, just use the File|Open menu. For those huge files, please use an external editor. And if you don't want to open files automatically in order to suppress the warning dialog, use Tools|Preferences|Database|Export/View DDL Options and un-check the "Open Sql File When Exported" box.
Are you certain the export file does not contain all the insert rows? That would be a bug unless you hit an OutOfMemory or disk full condition. I just tried your scenario on a 55,000-row table that produced an export.sql of about 20 MB. All rows were included.
Regards,
Gary Graham
SQL Developer Team
In summary, it suggests that SQL Developer is not the best tool for opening a large data file.
I hope Gary's answer guides you to some extent.
If you need ideas for tools that can open large files, check this LINK
Solution 1:
Set these values to a higher value.
Solution 2:
Change "Save To" to "Worksheet".
I was getting this error when exporting a database in insert format; selecting the loader format on the first Export wizard screen fixed the issue.
This is probably because the insert format creates a single SQL script with the DDL and the data as insert statements, so the entire database is dumped into one script file.
The loader option produces multiple files: a control file, a data file, and SQL files, with separate files for each table. As a result, the export consists of many files and no single file reaches the size limit.
This may not work, however, for a single table with a very large amount of data, since that table's data file would still hit the limit.
You can try different options, like the ones below.
In SQL Developer, when you right-click a table and choose Export, the Export wizard launches. You can set "Save As" to "Separate Files", which exports each object's data into its own SQL file.
Or you can change the format type in the same wizard to CSV, which exports the data in CSV format.
If you want to transfer large amounts of data (or small amounts, too) from one database to another, you should consider the tools that were specifically designed for such tasks.
First and foremost, look into Data Pump. It has a bit of a learning curve, though.
exp and imp (also from Oracle) are a bit easier to handle, but they're older and not nearly as powerful as Data Pump.
You might also want to look into the SQL*Plus COPY command.
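As a rough illustration of the COPY command, here is a minimal sketch (the usernames, passwords, connect identifiers, and table name are placeholders, not values from your environment; INSERT assumes the table already exists in DEV, while CREATE would create it):
COPY FROM uat_user/uat_pwd@UATDB TO dev_user/dev_pwd@DEVDB -
INSERT my_table -
USING SELECT * FROM my_table
Keep in mind that COPY is a SQL*Plus command rather than a SQL statement, and it only handles the older scalar data types (CHAR, DATE, LONG, NUMBER, VARCHAR2), so it is not an option for tables with LOB columns.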
There is a trick for copying a large chunk of data (from SQL Developer) into an Excel sheet.
Steps to follow: right-click ---> Export Data ---> select format type 'Text' ---> select type "Clipboard" ---> open an Excel sheet and paste, keeping the note below in mind :)
Then paste the data.
NOTE: Do not paste the data into the first cell of the Excel sheet. Press Ctrl+V in any of the other columns.
This will work.
Thanks
You can spool the query and save the results as a CSV (or XLSX) file for larger result sets. For example:
spool "D:\Temp\Report.csv"
SELECT /*csv*/ id, name, age FROM emp;
spool off;
In SQL Developer, the /*csv*/ formatting hint only takes effect when the statement is run as a script.
1. You can create a database link (db link) in the DEV DB pointing to the UAT DB, and use it to INSERT the rows into the DEV DB (see the sketch after this list).
2. Or you can build a PL/SQL procedure in the UAT DB that exports the data to a file in CSV format, and in the DEV DB use Oracle external tables to SELECT from those files.
Be careful with DATE columns; write them out using TO_CHAR.
3. Use Data Pump to export the data from the UAT DB and then import it into the DEV DB; it's a bit tricky.
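For option 1, a minimal sketch (the link name, credentials, connect string, and table name are all placeholders):
-- run in the DEV database
CREATE DATABASE LINK uat_link
  CONNECT TO uat_user IDENTIFIED BY uat_password
  USING '//uat-host:1521/UATDB';

INSERT INTO my_table
SELECT * FROM my_table@uat_link;

COMMIT;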
These commands work both in SQLcl (from Oracle) and in SQL Developer, so this is easy:
set feedback only -- for Oracle 12.2+, turn off terminal output
set sqlformat insert -- data in "insert into ..." format
-- set sqlformat csv -- data in csv format
spool /path/to/your/file.sql
select * from t; -- lines to export
spool off
set feedback off -- restore terminal output
The simplest way to do this is to change the "Save As" option (shown in the screenshot below) to save to multiple files instead of a single file while exporting.
I want to create a backup of all the SQL scripts (views) that are saved in Snowflake (not the data). How can I do it? Obviously, manual copying and pasting is not an answer.
Expected result: I have all the views (SQL scripts) from the Snowflake database on my local machine, one file per view.
Expected result, perfect version: I have all the views (SQL scripts) from Snowflake on my local machine, where folders correspond to Snowflake schemas and files correspond to Snowflake views (and each file is placed in the correct folder).
SHOW VIEWS includes the DDL in the text column. To get all views in the database:
show views in database my_database;
select "text" from table(result_scan(-1));
You can invoke it from the CLI with SNOWSQL.
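If you prefer plain SQL over RESULT_SCAN, a similar sketch against INFORMATION_SCHEMA also returns one row per view together with its definition (my_database is a placeholder):
select table_schema, table_name, view_definition
from my_database.information_schema.views
where table_schema <> 'INFORMATION_SCHEMA';
Each row can then be written to its own local file, which maps directly to the one-file-per-view requirement.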
You can run:
SELECT get_ddl('schema', '{schema_name}');
This will get you the DDL of all objects in the schema, which you can then save to a file in a folder.
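If you want the DDL per object rather than the whole schema in one output, the same function also accepts individual object names; MY_SCHEMA.MY_VIEW below is a placeholder:
select get_ddl('view', 'MY_SCHEMA.MY_VIEW');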
You can just download the full database DDL.
Option 1
Run select GET_DDL('database', 'databasename') and copy and save the output to your machine.
Option 2
Write a Python script to get a list of schemas, views, tables, stored procedures, etc. and save each one to its corresponding folder on your local machine. Something like the code below; you just have to extend it to get your "perfect version" output. Install the Snowflake Python connector to run it.
import snowflake.connector

con = snowflake.connector.connect(
    user='YourUsername',
    password='YourPassword',
    account='your_snowflake_account',
    database='databasename',
    warehouse='datawarehousename',
    role='dbrole'
)

cur = con.cursor()
try:
    # List every table and view in the database along with its schema and type
    cur.execute("SELECT TABLE_SCHEMA, TABLE_NAME, TABLE_TYPE FROM information_schema.tables")
    for (TABLE_SCHEMA, TABLE_NAME, TABLE_TYPE) in cur:
        print('{0}, {1}, {2}'.format(TABLE_SCHEMA, TABLE_NAME, TABLE_TYPE))
        # Add another cursor/loop here to get the DDL for each object and save it
        # to a file/folder structure, something like:
        # cur2.execute("SELECT GET_DDL('object type from previous query', 'object name')")
finally:
    cur.close()
Great answers above, but you could take it a step further and manage all your DDL through a version controlled tool like DBT.
That way, you would have a mechanism not only to store your DDL in text files, but also to run that DDL (instead of relying on error-prone manual processes).
Otherwise, how would you know whether or not your text files are up-to-date?
I was wondering if we could use spool in PL/SQL Developer. I am currently running a query in PL/SQL Developer that returns a result set of 1 million rows. I need to export that data to an Excel file, but when I select the columns and right-click on the data to copy it to the Excel file, it copies only 10 or 11 rows. To load all the rows, I need to press the downward arrow shown in the image here,
and it takes a long time to load. Is there any easy way to export a huge amount of data directly to an Excel file in PL/SQL Developer?
Most SQL*Plus commands work in a PL/SQL Developer Command window, and spool does. Of course, spooling 1,000,000 rows of data is going to take a long time.
Plus, you'll need to handle the CSV formatting manually with other SQL*Plus commands. Find out more.
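As a rough sketch of what that could look like (the table and column names are hypothetical, and this assumes the SET commands used here are among those the Command window supports):
set heading off
set feedback off
set pagesize 0
set linesize 2000
set trimspool on
spool C:\temp\big_export.csv
select empno || ',' || ename || ',' || to_char(hiredate, 'YYYY-MM-DD') from emp;
spool off
The resulting .csv file opens directly in Excel.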
You can use the ORA_EXCEL package to export data from SQL Developer in Excel format.
Please visit this site:
https://www.oraexcel.com/examples
I am trying to copy a table's data from a dev box DB to a UAT DB, which are two different databases. I am trying this in Toad. All the connection details are correct, but it's not working and throws the following error.
[Error] Execution (12: 1): ORA-00900: invalid SQL statement
This is what I am trying:
copy from abc/cde#//abc.abc.com:1521/devbox to abc/cde#//abc.abc.com/uatbox
INSERT TOOL_SERVICE_MAPPING (*)
USING (SELECT * FROM TOOL_SERVICE_MAPPING)
If your table doesn't have a huge number of rows, you can use Toad's Export function: it creates an insert statement for each row. You can then run these statements in the destination DB to re-create your table's data.
Here are the steps:
A. Create a copy of the table in the destination DB
In the source DB, in a Schema Browser window, click the table you want to copy and select the "Script" tab in the right part of the window: you will find the script to re-create your table; copy this script
Paste the script into a new SQL editor window in the destination DB and run it. This should create the new table
B. Copy the data into the new table
In a Schema Browser window, right-click the table name in the source DB
Select "Export Data" from the context menu
Write the WHERE clause of your export query (leave it blank if you want to copy the entire table)
Select destination: Clipboard
Click "OK" (the insert statements are now stored in your clipboard)
Paste the insert statements into a new SQL editor window in the destination DB
Run the statements as a script (shortcut F5)
COPY is a SQL*Plus command, not a SQL statement. I would be surprised if Toad had implemented that particular SQL*Plus command (it does implement many of the simpler ones). If you want to use the COPY command, you would need to use SQL*Plus, not Toad.
If you want to use Toad, you would need to use a SQL statement to copy the data. You could create a database link in the destination database that points to the source database and then
INSERT INTO tool_service_mapping
SELECT *
FROM tool_service_mapping@<<db link to source database>>
The easiest and most error-free way I have experienced so far is: Database -> Compare -> Schemas
It's not as complicated as it looks (lots of checkboxes): you tick the boxes for the objects you need created in the empty database, and at the end of the comparison you end up with a SQL script including all the objects (triggers, views, sequences, packages) that you selected.
I can clearly see all the tables, triggers, data, etc. in the generated SQL script and can even deselect the ones I don't wish to create (if any)... Before executing the script, Toad asks you to confirm which database you are running the script against - this has saved me a few times... As awkward as it looks, it works perfectly.
I have around 200 tables; I don't know if this is suitable for huge databases.
I have an Oracle 10g database and need to write a fairly straightforward query that joins two tables and selects some data. However, I'd like to export the result list to Excel, so end users can use the .xls document to see the results and filter by one of the fields (location).
When I write the query, is there an easy way to generate/create an Excel document that holds these results as described above? The SQL doesn't need to run from within Excel, but I guess that would be a useful feature, now that I think about it!
Thanks.
There is a simple solution for your request.
By using ORA_EXCEL, a small PL/SQL package that generates Excel xlsx files, you can select data, export it to Excel, and set filtering.
Please see the following example:
BEGIN
ORA_EXCEL.new_document;
ORA_EXCEL.add_sheet('My sheet');
ORA_EXCEL.query_to_sheet('select * from employees'); -- Select data from database
ORA_EXCEL.set_cells_filter('A1', 'K1'); -- Add cell filtering from column A1 to column K1
-- Save the generated Excel file with the name example.xlsx to the Oracle directory EXPORT_DIR
ORA_EXCEL.save_to_file('EXPORT_DIR', 'example.xlsx');
END;
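Note that EXPORT_DIR must already exist as an Oracle directory object the database can write to. A minimal sketch of creating one (the path and grantee are placeholders, and CREATE DIRECTORY requires elevated privileges):
CREATE OR REPLACE DIRECTORY export_dir AS 'C:\exports';
GRANT READ, WRITE ON DIRECTORY export_dir TO your_user;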
For more details please check here
Cheers
Pretty easy to do in Excel; and when done, the user can right-click the data and choose "Refresh" to get the latest updates.
But why reinvent the wheel? Lots of online articles already explain how to do this... Here's one:
http://blog.mclaughlinsoftware.com/microsoft-excel/how-to-query-oracle-from-excel-2007/
After you've connected to a table, you can edit the properties of the connection and enter custom SQL (copy and paste from your developer tools).
Since you cannot use OLE DB in your version of Excel, use SPOOL to create a CSV file.
SQL> SET echo off
SQL> SET verify off
SQL> SET colsep ,
SQL> SET pagesize 0
SQL> SET trimspool on
SQL> SET feedback off
SQL> SPOOL C:\data.csv
SQL> SELECT COLUMN1,COLUMN2,COLUMN3....
FROM TABLE;
SQL> SPOOL OFF
The .csv file should open in Excel by default. Use proper column aliases so that users understand the column headers.
Quick way:
First, create a view that contains your query (the best approach, because you might need to change this query later).
Be sure the Oracle client is properly installed.
In Excel (2007 and above), on the Data tab, go this way:
From Other Sources -> From Data Connection Wizard -> Microsoft Data Access - OLE DB Provider for Oracle
Now enter your data source name (stored in tnsnames.ora) and the user password.
Find your view, and then you'll have what you need.
You can save the password and set the option to refresh automatically in the connection properties.
You are able to query an Oracle database directly from Excel 2003; however, your SQL statements are interpreted by MS Query, and because of this it can often be frustrating. I will assume the machine in question can already query your database and has the database naming properly configured.
To query your database from Excel 2003 you must:
Install and configure Oracle's ODBC driver (you must have the 32-bit drivers installed, since Excel 2003 is a 32-bit application). ODBC can be configured under Start > Administrative Tools > ODBC Data Source Administrator.
Open Excel 2003 and go to Data > Import External Data > New Database Query.
This should bring up MS Query, which is an Access-like interface.
Obviously this is a very brief starter to get you stepping in the right direction. If you have any specific questions, please comment and I will try to help you.
Step 1
Run your query, then right-click on the result.
Step 2
Click on Export.
Step 3
Select Excel as the format.
Enter the datasheet name and location.
Step 4
Click Next and then Finish.
You can do one thing.
First, generate the output in a form that uses a symbol (like , or #) as the column separator.
Import the data into Excel and then define that symbol as the column separator.
Does anyone know how to export results from more than one query into different sheets of the same Excel workbook using report automation in Toad for Data Analyst?
Thank you
I'm not sure you can do that automatically with Toad, but there is a little trick you can do with Excel.
Write the first query and execute it in Toad. After that, right-click on the query result data grid and choose "Export Dataset...", under Excel format choose "Excel instance", and click OK. It will open Excel and add one sheet with the data from your query.
Repeat the same process for the second query, and it will add another sheet to the same document, filled with data from the second query.
After you have executed all the queries and added them to Excel, save the Excel document.
If you want to do this completely automatically, there is another solution you can use to create a single Excel document with multiple sheets loaded with data from different queries: purchase the third-party PL/SQL package ORA_EXCEL.
Here is an example of how to do that:
BEGIN
ORA_EXCEL.new_document;
ORA_EXCEL.add_sheet('Employees');
ORA_EXCEL.query_to_sheet('select * from employees');
ORA_EXCEL.add_sheet('Departments');
ORA_EXCEL.query_to_sheet('select * from departments', FALSE);
ORA_EXCEL.add_sheet('Locations');
ORA_EXCEL.query_to_sheet('select * from locations');
-- EXPORT_DIR is an Oracle directory with at least
-- write permission
ORA_EXCEL.save_to_file('EXPORT_DIR', 'example.xlsx');
END;
It can generate the Excel file and store it in an Oracle directory, or you can get the generated Excel file into a PL/SQL BLOB variable so you can store it in a table or build your own process to distribute the file, such as sending it by email.
You can find more details on the product's documentation/examples page: http://www.oraexcel.com/examples
Cheers
I don't think this functionality exists in TOAD.
The usual solution for exporting straight from PL/SQL to Excel - Tom Kyte's OWA_SYLK wrapper for the SYLK API - only works with single worksheets. There are a couple of alternative solutions.
Sanjeev Sapre has his get_xl_xml package. As the name suggests, it uses XML to perform the transformation. Find out more.
Jason Bennett has written a PL/SQL object which generates an Excel XML document. Find out more.
You no longer need to write code to output data for multiple sheets.
As long as your SQL has each query clearly delimited (with semicolons), TDA (or now TDP) will automatically dump the data for different SQL statements into different worksheets.
I have Toad for Data Analyst 2.6. I use the keyword GO between queries.
Select * from tableA;
GO
Select * from tableB;
This creates two tabs in Excel.