Execute a .sql File from cmd to IBM server (db2) - cmd

I'm trying to import a .sql file into an IBM Db2 server using cmd.
Here is the command line I'm using:
cmd /c db2 -u<USER> -h<HOST> -p<PASSWORD> -D<DATABASE> <<My File Repository>/<FileName>.sql
Is this command correct or not?
I'm available to provide more information.

This question is frequently asked, so do your research.
With cmd.exe on MS-Windows, you cannot use such syntax, because the Db2 CLP does not accept arguments like -u..., -h..., -p..., -D....
Instead there are other ways to achieve what you need.
If the Db2 database is on the same hostname as your CLP (i.e. it is a local database), then you can use db2 connect to DATABASENAME and then db2 -tvf filename.sql. When the database is remote (on a different hostname from your CLP), you must supply a userid/password (or certificate, or token, depending on the server configuration) and the remote database has to be pre-catalogued locally (either via an XML file or via catalog commands). With local databases you do not need to specify either a userid or a password; the CLP will connect as the currently logged-in operating-system user.
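For the local case, a minimal sketch looks like the following (the database name and script path are placeholders; on MS-Windows, run this from a Db2 command window opened with db2cmd):

rem connect to the local database as the current OS user, run the script, then disconnect
db2 connect to SAMPLEDB
db2 -tvf C:\scripts\myscript.sql
db2 terminate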
If you prefer to use a connection string, you can rework your scripts to be compatible with the CLPPlus program, a Java application that accepts arguments on its command line. For people who are familiar with Oracle SQL*Plus, this is often their go-to tool when using Db2.
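For illustration only (the userid, hostname, port, and database name are placeholders; the exact invocation options are documented in the Db2 Knowledge Center), a CLPPlus session might look like:

clpplus myuser@myhost.example.com:50000/SAMPLEDB
SQL> @C:\scripts\myscript.sql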
If you really want to use cmd.exe with the Db2 CLP to run SQL scripts (for both local and remote databases), then you can do it if you are prepared to first perform some basic configuration steps. You have a choice of old-style catalog configuration steps or an XML file.
The old-style actions are: catalog tcpip node, then catalog database on that node, then db2 terminate, and finally connect to the remote catalogued database with db2 connect to $database user XXXX using YYYY and run db2 -tvf filename.sql. You can also put the CONNECT statement inside the SQL script. All of these actions are well documented in the Db2 Knowledge Center online.
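Put together, the sequence looks roughly like this (node name, hostname, port, database, credentials, and script path are all placeholders; adjust them for your environment):

rem catalog the remote node and database, terminate the CLP back-end process, then connect and run the script
db2 catalog tcpip node MYNODE remote myhost.example.com server 50000
db2 catalog database SAMPLEDB at node MYNODE
db2 terminate
db2 connect to SAMPLEDB user MYUSER using MYPASSWORD
db2 -tvf C:\scripts\myscript.sql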
If you do not wish to perform the catalog actions, you can instead use a pre-prepared XML file (called db2dsdriver.cfg) which contains the connection details for all Data-Source-Names and databases you use. You can create the XML file manually, or programmatically with the db2cli tool and its command line. The IBM-supplied CLI drivers for Db2 read and interpret that file at runtime. This lets you connect to local and remote DSNs without explicit catalog actions, because the XML file has all the details. The CLP will then let you run scripts against those DSNs. This file is documented in the Db2 Knowledge Center; it is located (by default) in the CLI driver cfg directory, or anywhere you choose via the environment variable DB2DSDRIVER_CFG_PATH. Refer to the documentation for all details.
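A minimal db2dsdriver.cfg sketch might look like the following (the alias, hostname, and port are placeholders; the file supports many more parameters, so check the documentation):

<configuration>
<dsncollection>
<!-- the alias is the DSN name you connect to -->
<dsn alias="SAMPLEDB" name="SAMPLEDB" host="myhost.example.com" port="50000"/>
</dsncollection>
<databases>
<database name="SAMPLEDB" host="myhost.example.com" port="50000"/>
</databases>
</configuration>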
If your remote database runs on Db2-for-i or Db2-for-z/OS, then different considerations apply. First, the IBM-supplied CLI drivers require a license before the connect will succeed; refer to the documentation. Second, for Db2-for-i (AS/400), you should only consider using the separate IBM product for IBM i series access and its ODBC option.
If your remote Db2-database is on Linux/Unix/Windows/zLinux/cloud then you do not need any license locally.

Related

Data Pump export using cloud shell

I am trying to export a schema using Data Pump on an Oracle Cloud Autonomous Database.
I am using Cloud Shell to export the schema.
When I tried to do the final step:
expdp admin/password#DB_HIGH schemas=SCHEMA_NAME directory=data_pump_dir dumpfile=exp%U.dmp filesize=1G logfile=expot.log
I got
UDE-12154: operation generated ORACLE error 12154 ORA-12154:
TNS:could not resolve the connect identifier specified
Do I need Oracle instant client to do export?
The Oracle client code uses one of three ways to look up connect data:
A flat file named tnsnames.ora
Oracle Names service
LDAP
When the complete ORA-12154 error appears, including the descriptive text line, your program has found a working Oracle client install. However, the specified Oracle service is not listed in tnsnames.ora, Oracle Names, or LDAP.
The first step in the troubleshooting process is to determine which name resolution method is deployed at your site. Most sites use tnsnames.ora, but enough use Oracle Names and LDAP, so it’s best to confirm this information.
If you are not the database administrator, get in touch with the people managing your Oracle systems and find out which method you should be using. They may be able to guide you in fixing the problem in accordance with your site’s standards.
The client code decides which mechanism to use based on the file sqlnet.ora. This file and tnsnames can usually both be found in the Oracle install directory (“ORACLE_HOME”), under network/admin/. This location may be overridden with the environment variable TNS_ADMIN.
If the sqlnet.ora file does not exist or does not specify a resolution method, then Oracle Net uses tnsnames.ora.
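For reference, a typical sqlnet.ora resolution setting and a matching tnsnames.ora entry look roughly like this (the alias, host, and service name below are placeholders, not your actual Autonomous Database values):

# sqlnet.ora - try local tnsnames.ora first, then LDAP
NAMES.DIRECTORY_PATH = (TNSNAMES, LDAP)

# tnsnames.ora - one entry per connect identifier
DB_HIGH =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCPS)(HOST = adb.example.oraclecloud.com)(PORT = 1522))
    (CONNECT_DATA = (SERVICE_NAME = db_high.adb.example.oraclecloud.com))
  )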
Example locations of Oracle networking files include:
Windows
ORANT\NET80\ADMIN
ORACLE\ORA81\NETWORK\ADMIN
ORAWIN95\NETWORK\ADMIN
ORAWIN\NETWORK\ADMIN
UNIX / Linux
$ORACLE_HOME/network/admin/
/etc/
/var/opt/oracle/
If you fix the naming issues, but you still see the ORA-12154 error, check the Oracle service to confirm that it’s available for connections. A power outage, server failure, or network connectivity issue will make this resource inaccessible. It’s also possible that scheduled maintenance or repairs of an unrelated Oracle issue may take that resource temporarily offline.

db2 load command fails using JDBC

I'm using JDBC to transfer data from a delimited file to a Db2 database table. Initially I encountered SQLCODE=-104, SQLSTATE=42601, and on further debugging I found this, which referred me to call the stored procedure SYSPROC.ADMIN_CMD.
I modified the call and tried running the procedure version, but I'm still getting the same error:
com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-104, SQLSTATE=42601, SQLERRMC=CLIENT;LOAD;FROM, DRIVER=4.26.14
    at com.ibm.db2.jcc.am.b7.a(b7.java:810)
    at com.ibm.db2.jcc.am.b7.a(b7.java:66)
I'm not sure what exactly I'm doing wrong.
Code:
CALL SYSPROC.ADMIN_CMD('LOAD CLIENT FROM "<PATH_TO_FILE>" OF DEL MODIFIED BY COLDEL0X09 INSERT INTO <SCHEMA_NAME>.<TABLE_NAME> NONRECOVERABLE')
I ran the LOAD command on the db2 command prompt and it ran without any issues.
Db2 version: 11.5
The LOAD CLIENT command is intended for use at a client workstation, not at a Db2 server, so SYSPROC.ADMIN_CMD will reject the CLIENT keyword.
The stored procedure executes at the Db2 server; it does not have access to files that are stored at the client workstation.
Therefore any file you mention in the parameters to the SYSPROC.ADMIN_CMD stored procedure must be a path on the Db2 server's file system and must be readable by the Db2 instance owner account.
If your data file is already located on the Db2 server, just reference its fully qualified filename and run the SYSPROC.ADMIN_CMD procedure with the LOAD command. Carefully check the documentation for the LOAD command to understand all of the implications of using LOAD, especially if the target Db2 server is highly available; this is an administration matter.
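For example, once the file has been copied to a server-side directory (the path, schema, and table names below are placeholders), the call becomes:

-- note: no CLIENT keyword, and the path is on the Db2 server, not the workstation
CALL SYSPROC.ADMIN_CMD('LOAD FROM "/db2/staging/mydata.del" OF DEL MODIFIED BY COLDEL0X09 INSERT INTO MYSCHEMA.MYTABLE NONRECOVERABLE')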
If your data file is not already located at a Db2 server, then first copy the file to the Db2 server and retry with LOAD (or the slower IMPORT).
You can also run the IMPORT command via SYSPROC.ADMIN_CMD when the data file is accessible to the Db2 instance owner on the Db2 server that runs the stored procedure.
If your Db2 version is recent, you can also consider the INGEST command; refer to the documentation for details.
If your data file is located on a client workstation, and you are unable or unwilling to transfer it to a Db2 server, then you can use your local workstation Db2 client (if you installed a suitable Db2 client that includes the db2 CLP) to run a script or batch file that performs the relevant commands. You cannot use JDBC for that specific purpose, although you can exec/shell out from Java to run the required commands in one script (db2 connect to ... user ... using ..., then db2 load client ..., db2 ingest ..., or db2 import ...), as sketched below.
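A minimal batch-file sketch of the LOAD CLIENT variant (the credentials, file path, and table names are placeholders; run it from a Db2 command window against the catalogued remote database):

rem connect to the remote database, bulk-load the client-side file, then disconnect
db2 connect to SAMPLEDB user MYUSER using MYPASSWORD
db2 "load client from C:\data\mydata.del of del modified by coldel0x09 insert into MYSCHEMA.MYTABLE nonrecoverable"
db2 terminate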
If your target Db2 server is already at version 11.5 or higher, then it should support INSERT from an external table (including remote external tables), and since INSERT is plain SQL you can do that via JDBC.
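As a rough sketch of that approach (the exact external-table options vary by version, so check the documentation; the path, schema, table, column list, and delimiter are placeholders, and REMOTESOURCE YES is the option that tells the server the file lives on the client):

-- transient external table read directly by an INSERT statement
INSERT INTO MYSCHEMA.MYTABLE
SELECT * FROM EXTERNAL 'C:\data\mydata.del' (COL1 INTEGER, COL2 VARCHAR(100))
USING (DELIMITER ',' REMOTESOURCE YES)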
Apart from the above, most DBAs would arrange for direct Db2-server to Db2-server transfers if both are Db2-LUW, have IP connectivity, and security rules permit; this avoids the slow and insecure business of taking the data outside of Db2. That is not a programming matter, more an administrative matter.

DB2 database in Oracle SQL developer with SSL option

Need help to connect to DB2 on cloud with Oracle SQL Developer.
I have configured SQL Developer with third-party JDBC drivers for other databases, but it does not work for DB2 with the SSL option.
There is no option to set "sslConnection=true" in the connection dialog. I have tried the DB2 type 2 and type 4 JDBC drivers, with the same result.
Oracle-SQL-Developer successfully lets me connect to Db2-on-cloud with SSL.
My version of Oracle-SQL-Developer is an old one (17.02), and yes, that version seems to lack a GUI way to add connection attributes for Db2 connections. I will update this answer for version 19.02 later.
A workaround is to append the required property to the database name field. Example: BLUDB:sslConnection=true; . Depending on the version, the GUI may misbehave, in which case do not try a connect or test at this time; instead save and then close Oracle-SQL-Developer, to cause it to update its connections.xml file.
The connection information is stored in the file connections.xml, which you can edit while Oracle-SQL-Developer is closed. The location of that file may depend on which operating system you are using; for Linux it is in the .sqldeveloper tree under the home directory of the user running SQL-Developer. First take a backup of that file before you change it. Search for your newly created connection name and look through the settings to find the customUrl for your Db2-on-cloud connection. You can edit it to look something like below:
<StringRefAddr addrType="customUrl">
<Contents>
jdbc:db2://dashdb-txn-sbox-***********.services.*****.bluemix.net:50001/BLUDB:sslConnection=true;
</Contents>
</StringRefAddr>
If you made changes, save the file, take another backup of the changed file (in case it gets overwritten next time), and restart SQL-Developer. Your connection should appear in the Connections pane, and the connect should succeed if you entered all other credentials and connection-parameters correctly. Works for me...

How does JDBC Connection to Sybase in Pentaho Kettle differ depending on the number of databases running in the SQL Anywhere server?

I'm trying to test a Database Connection in Pentaho Kettle to a SQL Anywhere database server on local with the Native (JDBC) connection using the default DBA/sql credentials. The .exe used to run the database server is the classic dbsrv10.exe for SQL Anywhere. It runs on port 2638. When I run dbsrv10.exe with only one database file, Kettle can connect just fine with the credentials on that port, and here's the interesting part, with ANY Database Name.
Once I include two database files, I get a Login Error from Kettle, even when using the appropriate names of each database in the Database Name field (I set the names dbsrv10 uses to refer to each database in the command line or I can omit that and SQL Anywhere refers to them by the file names without the .db). Any thoughts as to what's different and how I can actually connect when dbsrv is managing two databases?

Run commandline command at remote Oracle server using SQL*Plus

I have a machine running Oracle 10g server on Windows Server 2008. I want to take a backup of the database. I also want to take a backup of some files that users have uploaded through my website and that the Oracle server has saved on the hard disk.
I can connect to the Oracle server using SQL Developer and SQL*Plus. I can run SQL queries on the server.
In order to take a backup of the database I have to run the command "exp" (this is the only way of taking backups of databases that I know). There might be some other way, but there is another problem because of which I must run a DOS command: taking a backup of the files. These files are stored in c:\mydir. The folder mydir is not accessible through the web and is not a shared folder.
I have tried running "host" in SQL*Plus after connecting to the Oracle server, that is, at the "SQL>" prompt. The command ran successfully, but at the local machine, not at the Oracle server.
Edit: The "host" command is provided by SQL*Plus and is not an Oracle command, meaning it cannot be used in a query. SQL*Plus, even when connected to a remote machine, runs the "host" command at the local machine.
The target is either to make SQL*Plus run the "host" command at the remote machine, or to run the DOS command from inside a PL/SQL query (independent of SQL*Plus).
In addition to what Justin has written:
If you want to take a logical snapshot of the database, the newer Data Pump tool is preferred over the old (and deprecated) exp tool.
Data Pump is a command-line tool (expdp) but also has a SQL API through Oracle packages and procedures:
The Data Pump API (including examples)
DBMS_DATAPUMP (reference)
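As a minimal sketch of that SQL API (the schema name, dump-file names, and directory object below are placeholders; DATA_PUMP_DIR must exist and be writable by the database), a schema export can be started from PL/SQL roughly like this:

DECLARE
  h NUMBER;
  state VARCHAR2(30);
BEGIN
  -- open a schema-mode export job
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  -- dump file and log file are written to a server-side directory object
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'exp_myschema.dmp',
                         directory => 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'exp_myschema.log',
                         directory => 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- restrict the job to a single schema
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR',
                                value => 'IN (''MYSCHEMA'')');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(handle => h, job_state => state);
END;
/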
But if you want a "real" backup you should look into RMAN
It is possible to create a Java stored procedure on the database server that executes an operating system command on the Oracle server. But it would be extremely unusual to use the export utility to back up a database; that only creates a logical backup, not a more appropriate physical backup. And it would be extremely unusual to run a backup by connecting to the database via SQL*Plus and spawning a job on the server operating system. It would make much more sense to create a job, using the Windows scheduler on the database server, that runs whatever export commands you want to run.
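For completeness, a rough sketch of the Java stored procedure approach (the class name, function name, and command are illustrative; the calling schema also needs the relevant Java runtime permissions granted via DBMS_JAVA.GRANT_PERMISSION before this will execute):

CREATE OR REPLACE AND COMPILE JAVA SOURCE NAMED "OsCommand" AS
import java.io.BufferedReader;
import java.io.InputStreamReader;
public class OsCommand {
    // run an OS command on the database server host and return its standard output
    public static String run(String cmd) throws Exception {
        Process p = Runtime.getRuntime().exec(cmd);
        BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
        StringBuilder out = new StringBuilder();
        String line;
        while ((line = r.readLine()) != null) {
            out.append(line).append('\n');
        }
        p.waitFor();
        return out.toString();
    }
}
/

-- PL/SQL call specification exposing the Java method as a SQL function
CREATE OR REPLACE FUNCTION os_command(p_cmd VARCHAR2) RETURN VARCHAR2
AS LANGUAGE JAVA NAME 'OsCommand.run(java.lang.String) return java.lang.String';
/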
