db2 load command fails using JDBC

I'm using JDBC to transfer data from a delimited file to a Db2 database table. Initially I encountered SQLCODE=-104, SQLSTATE=42601; on further debugging I found this, which referred me to call the stored procedure SYSPROC.ADMIN_CMD.
I modified the call and tried running the procedure version, but I'm still getting the same error:
com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-104, SQLSTATE=42601, SQLERRMC=CLIENT;LOAD;FROM, DRIVER=4.26.14
  at com.ibm.db2.jcc.am.b7.a(b7.java:810)
  at com.ibm.db2.jcc.am.b7.a(b7.java:66)
I'm not sure what exactly I am doing wrong.
Code:
CALL SYSPROC.ADMIN_CMD('LOAD CLIENT FROM "<PATH_TO_FILE>" OF DEL MODIFIED BY COLDEL0X09 INSERT INTO <SCHEMA_NAME>.<TABLE_NAME> NONRECOVERABLE')
I ran the LOAD command on the db2 command prompt and it ran without any issues.
Db2 version: 11.5

The load client command is intended for use at a client workstation, not at a Db2-server, so sysproc.admin_cmd will reject the client keyword.
The stored procedure executes at the Db2-server; it does not have access to files that are stored at the client workstation.
Therefore any file you mention in parameters to sysproc.admin_cmd stored procedure must be relative to the Db2-server file-system and must be accessible (readable) to the Db2-instance owner account.
If your data-file is already located on the Db2-server, just reference its fully qualified filename and run the sysproc.admin_cmd procedure with the load command. Carefully check the documentation for the load command to understand all of the implications of using LOAD, especially if the target Db2-server is highly-available. This is an administration matter.
If your data-file is not already located at a Db2-server, then first copy the file to the Db2-server and retry with load (or slower import).
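As an illustration, here is a minimal JDBC sketch of that approach, assuming the file has already been copied to the Db2-server; the JDBC URL, credentials, the path /db2data/stage/data.del and MYSCHEMA.MYTABLE are all placeholders:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class AdminCmdLoad {
    public static void main(String[] args) throws Exception {
        // Placeholder URL/credentials; the IBM JCC driver jar must be on the classpath.
        try (Connection con = DriverManager.getConnection(
                "jdbc:db2://dbhost:50000/MYDB", "db2user", "secret")) {
            // Plain LOAD (no CLIENT keyword): the path is resolved on the Db2-server,
            // so the file must be readable by the Db2-instance owner there.
            String loadCmd = "LOAD FROM \"/db2data/stage/data.del\" OF DEL "
                    + "MODIFIED BY COLDEL0X09 "
                    + "INSERT INTO MYSCHEMA.MYTABLE NONRECOVERABLE";
            try (CallableStatement cs = con.prepareCall("CALL SYSPROC.ADMIN_CMD(?)")) {
                cs.setString(1, loadCmd);
                cs.execute(); // ADMIN_CMD returns a result set with the LOAD messages
            }
        }
    }
}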
You can also run the import command via sysproc.admin_cmd when the data file is accessible to the Db2-instance owner on the Db2-server that runs that stored-procedure.
If your Db2-version is recent you can also consider the ingest command, refer to the documentation for details.
If your data file is located on a client workstation, and you are unable or unwilling to transfer it to a Db2-server, then you can use your local workstation Db2-client (if you installed a suitable Db2-client that includes the db2 CLP) to run a script/batch-file to perform the relevant commands. You cannot use jdbc for that specific purpose, although you can exec/shell out from java to run the required commands in one script (db2 connect to ... user ... using ..., db2 load client ..., or db2 ingest ..., or db2 import ...).
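As a rough sketch of that shell-out approach, assuming a pre-written script (here the hypothetical load_client.bat) containing the db2 connect / db2 load client / db2 terminate steps, and a local Db2-client with the db2 CLP installed; on Windows the CLP has to run inside a db2cmd environment, on Linux/UNIX the instance or client profile must be sourced first:

import java.io.IOException;

public class RunDb2Script {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Windows example: db2cmd sets up the CLP environment, /c /w /i run the
        // script in this window and wait for it to finish.
        ProcessBuilder pb = new ProcessBuilder("db2cmd", "/c", "/w", "/i", "load_client.bat");
        pb.inheritIO();                 // show the CLP output on this console
        int rc = pb.start().waitFor();  // a non-zero return code signals a failed step
        System.out.println("db2 script finished with return code " + rc);
    }
}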
If your target Db2-server is already at version 11.5 or higher then it should support INSERT from an external table (including remote external tables), and since INSERT is plain SQL you can do that via jdbc.
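A hedged sketch of that idea via JDBC follows; the table and file names are placeholders, and the exact FROM EXTERNAL options (including how to express a tab delimiter and the remote-source behaviour for client-resident files) should be verified against the Db2 11.5 external-table documentation:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class ExternalTableInsert {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:db2://dbhost:50000/MYDB", "db2user", "secret");
             Statement st = con.createStatement()) {
            // Plain SQL INSERT that reads a delimited file through a transient
            // external table; the question's file is tab-delimited (COLDEL0X09 in
            // LOAD), so adjust the DELIMITER option accordingly.
            String sql = "INSERT INTO MYSCHEMA.MYTABLE "
                       + "SELECT * FROM EXTERNAL '/db2data/stage/data.del' "
                       + "LIKE MYSCHEMA.MYTABLE "
                       + "USING (DELIMITER ',')";
            int rows = st.executeUpdate(sql);
            System.out.println("Inserted " + rows + " rows");
        }
    }
}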
Apart from the above, most DBAs would arrange for direct Db2-server to Db2-server transfers if both are Db2-LUW, have IP-connectivity, and security rules permit; this avoids the slow and insecure business of taking the data outside of Db2. That is not a programming matter, more an administrative matter.

Related

Oracle sql or pl/sql command for checking if remote database is available i.e. similar to OS command tnsping

In my PL/SQL script, I have a requirement to test the connectivity of a remote database. Normally, in the OS shell, we use the command tnsping. Is there any command in SQL or PL/SQL that I can use to test connectivity?
Basically, I am getting some data using a database link, and I want my script to execute immediately. If the remote database is available then I want to run the query using the database link. If the remote database is unavailable then I want to skip getting the data via the database link. That is the reason I am asking.
Kindly guide me to the proper way, so that my script does not hang in case of unavailability of the remote database.
I am using Oracle 12.1C.

Data Pump export using cloud shell

I am trying to export schema using data pump on Oracle Cloud Autonomous database.
I am using cloud shell to export schema.
When I tried to do the final step:
expdp admin/password#DB_HIGH schemas=SCHEMA_NAME directory=data_pump_dir dumpfile=exp%U.dmp filesize=1G logfile=expot.log
I got
UDE-12154: operation generated ORACLE error 12154 ORA-12154:
TNS:could not resolve the connect identifier specified
Do I need Oracle instant client to do export?
The Oracle client code uses one of three ways to look up connect data:
A flat file named tnsnames.ora
Oracle Names service
LDAP
When the complete ORA-12154 error appears with its message text, your program has found a working Oracle client install. However, the specified Oracle service is not listed in tnsnames.ora, Oracle Names, or LDAP.
The first step in the troubleshooting process is to determine which name resolution method is deployed at your site. Most sites use tnsnames.ora, but enough use Oracle Names and LDAP that it's best to confirm this.
If you are not the database administrator, get in touch with the people managing your Oracle systems and find out which method you should be using. They may be able to guide you in fixing the problem in accordance with your site’s standards.
The client code decides which mechanism to use based on the file sqlnet.ora. This file and tnsnames can usually both be found in the Oracle install directory (“ORACLE_HOME”), under network/admin/. This location may be overridden with the environment variable TNS_ADMIN.
If the sqlnet.ora file does not exist or does not specify a resolution method, then Oracle Net uses tnsnames.ora.
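For illustration only, a minimal pair of entries might look like the following; the alias DB_HIGH, host name, and service name are made up and must match your own environment:

# sqlnet.ora
NAMES.DIRECTORY_PATH = (TNSNAMES, EZCONNECT)

# tnsnames.ora
DB_HIGH =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost.example.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = myservice_high.example.com))
  )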
Example locations of Oracle networking files include:
Windows
ORANT\NET80\ADMIN
ORACLE\ORA81\NETWORK\ADMIN
ORAWIN95\NETWORK\ADMIN
ORAWIN\NETWORK\ADMIN
UNIX / Linux
$ORACLE_HOME/network/admin/
/etc/
/var/opt/oracle/
If you fix the naming issues, but you still see the ORA-12154 error, check the Oracle service to confirm that it’s available for connections. A power outage, server failure, or network connectivity issue will make this resource inaccessible. It’s also possible that scheduled maintenance or repairs of an unrelated Oracle issue may take that resource temporarily offline.

Execute a .sql File from cmd to IBM server (db2)

I'm trying to import a .sql file into IBM db2 Server using cmd
Here is the command line that I'm using:
cmd /c db2 -u<USER> -h<HOST> -p<PASSWORD> -D<DATABASE> <<My File Repository>/<FileName>.sql
Is this command correct or not?
I'm available to give more information.
This question is frequently asked, so do your research.
With cmd.exe on MS-Windows, you cannot use such syntax, because the Db2 clp does not accept arguments like -u ... -h... -p... -D....
Instead there are other ways to achieve what you need.
If the Db2-database is on the same hostname as your clp (i.e. it is a local database), then you can use db2 connect to DATABASENAME and then use db2 -tvf filename.sql. When the database is remote (on a different hostname from your clp), you must supply a userid/password (or certificate, or token, depending on server configuration) and the remote database has to be pre-catalogued locally (either via an XML file or via catalog commands). With local databases you do not need to specify either a userid or a password; it will connect as the currently logged-in operating-system user.
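For example, for a local database (DATABASENAME and myscript.sql are placeholders):

db2 connect to DATABASENAME
db2 -tvf myscript.sql
db2 terminate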
If you prefer to use a connection-string, you can rework your scripts to be compatible with the CLPPlus program, which is a java app that accepts arguments on its command line. For people who are familiar with Oracle SQL*Plus, this is often their go-to tool when using Db2.
If you really want to use cmd.exe with the Db2 clp to run SQL scripts (for both local and remote databases), then you can do it if you are prepared to first perform some basic configuration steps. You have a choice of old-style catalog configuration steps, or an XML file.
The old-style actions are: catalog tcpip node, then catalog database on that node, then db2 terminate, and finally connect to the remote catalogued database with db2 connect to $database user XXXX using YYYY and db2 -tvf filename.sql. You can also have the connect statement inside the SQL script. All of these actions are well documented in the Db2 Knowledge Center online, so do your research.
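A sketch of that sequence, with made-up hostname, port, node, database and user names:

db2 catalog tcpip node MYNODE remote dbhost.example.com server 50000
db2 catalog database MYDB at node MYNODE
db2 terminate
db2 connect to MYDB user XXXX using YYYY
db2 -tvf filename.sql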
If you do not wish to perform the catalog actions, you can also have a pre-prepared XML file (called db2dsdriver.cfg) which contains all of the connection details for all Data-Source-Names and databases you use. You can either create the XML file manually, or programmatically with the db2cli tool and its command line. The IBM-supplied CLI drivers for Db2 read and interpret that file at runtime. This lets you connect to local and remote DSNs without the need for explicit catalog actions, because the XML file has all the details. The CLP will then let you run scripts against those DSNs. This file is documented in the Db2 Knowledge Center, and can be located (by default) in the CLI driver cfg directory, or anywhere via the environment variable DB2DSDRIVER_CFG_PATH. Refer to the documentation for all details.
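A minimal db2dsdriver.cfg sketch, with made-up alias, host, and port (check the documented schema for the full set of elements and parameters):

<configuration>
  <dsncollection>
    <dsn alias="MYDSN" name="MYDB" host="dbhost.example.com" port="50000"/>
  </dsncollection>
  <databases>
    <database name="MYDB" host="dbhost.example.com" port="50000"/>
  </databases>
</configuration>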
If your remote database runs on Db2-for-i or Db2-for-z/OS, then different considerations apply. First, the IBM-supplied CLI drivers require a license before the connect will succeed for either of those platforms. Refer to the documentation. Second, for Db2-for-i (AS/400), you should only consider using the separate IBM product for IBM i series access and its ODBC option.
If your remote Db2-database is on Linux/Unix/Windows/zLinux/cloud then you do not need any license locally.

What is EXTPROC in Oracle?

For security reasons I asked the DB team to add EXTPROC_DLLS:ONLY, but they said this:
"Please be informed that the KEY = EXTPROC1526 doesn’t refer to any
external process at all. This is just a key used by any process needs
to call Oraxxx via IPC protocol. The key can be any value and the same
key value should be passed via the tnsnames.ora"
To me, it seems wrong. Could you please help me on this? What is the exact use of EXTPROC and what happens if we don't add EXTPROC_DLLS:ONLY?
For a program to call external procedures from the Oracle database you need the extproc agent.
PL/SQL, for example, needs extproc to call external C libraries.
You can find more information about the security here.
I'll paste some of the link:
Description
***********
The Oracle database server supports PL/SQL, a programming language. PL/SQL can execute external procedures via extproc. Over the past few years there have been a number of vulnerabilities in this area.
Extproc is intended only to accept requests from the Oracle database server but local users can still execute commands bypassing this restriction.
Details
*******
No authentication takes place when extproc is asked to load a library and execute a function. This allows local users to run commands as the Oracle user (Oracle on unix and system on Windows). If configured properly, under 10g, extproc runs as nobody on *nix systems so the risk posed here is minimal but still present.
and an example here
Contrary to other databases, Oracle does NOT allow plugins to access its own memory address space. In the case of MySQL/PostgreSQL, a .dll plugin (C stored procedure) is loaded by the main database process.
Oracle lets the listener spawn a new process by calling extproc (or extproc32). This process loads the shared library and the rest of the database talks to this process via IPC.
This approach is safer, because the external library cannot crash the database nor corrupt data. On the other hand, C stored procedures can sometimes be slower than Java ones.
This option can restrict the paths of the .dlls loaded by extproc, i.e. those created by the CREATE LIBRARY statement.
PS: usage of C stored procedures is VERY rare; if you do not use them you can freely remove the whole extproc stanza from listener.ora.
PS1: there is a possible scenario for exploiting the extproc feature:
The user must have CREATE LIBRARY, which is usually NOT granted
extproc is not configured to run with nobody's privileges, but runs as oracle:dba
The user creates a malicious .so library which performs something "evil" during its initialization
The user puts this library into the /tmp directory
The user creates an Oracle LIBRARY pointing into /tmp by using the CREATE LIBRARY statement
The user forces extproc to dlopen this library
extproc will execute the evil code with the OS privileges oracle:dba
When using this EXTPROC_DLLS:ONLY restriction, developers have to cooperate with DBAs, and only white-listed libraries can be used and loaded.
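For illustration, the white-list is typically set in the extproc entry of listener.ora, roughly like this (the ORACLE_HOME and library paths are made up; check your version's documentation for the exact syntax):

SID_LIST_LISTENER =
  (SID_LIST =
    (SID_DESC =
      (SID_NAME = PLSExtProc)
      (ORACLE_HOME = /u01/app/oracle/product/19.0.0/dbhome_1)
      (PROGRAM = extproc)
      (ENVS = "EXTPROC_DLLS=ONLY:/u01/app/oracle/extproc_libs/approved.so")
    )
  )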

UTL_FILE server side usage/ client side usage

I've used the TEXT_IO package for creating files on the local (client) machine. From the documentation http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/u_file.htm#BABBBABB I see that it is "available for both client-side and server-side PL/SQL". What does this mean? Does it mean I can use it to create files on both the client and the server side? If so, which method/option should I use to create a file on the client side? Thanks.
UTL_FILE is a PL/SQL database package. It can read from or write to any directory which the oracle OS account has the matching privileges on. In practice this means directories on the database server, although directories on other servers - or even your local PC - can be shared with that server, through the good graces of your network administrator, and the DBA creating the appropriate Directory object.
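A minimal server-side sketch, assuming the DBA has created a directory object for a path the oracle OS account can write to (the path, directory name, file name and grantee are placeholders):

-- run as a privileged user / DBA
CREATE DIRECTORY out_dir AS '/u01/app/oracle/out';
GRANT READ, WRITE ON DIRECTORY out_dir TO some_user;

-- PL/SQL: the file is created on the database server, not on the client PC
DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  f := UTL_FILE.FOPEN('OUT_DIR', 'demo.txt', 'w');
  UTL_FILE.PUT_LINE(f, 'written by the database server process');
  UTL_FILE.FCLOSE(f);
END;
/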
TEXT_IO is an Oracle Forms package for writing to the client. Naturally it only works in client/server versions of the product, although the webutils library provides an implementation which can work in webform deployments.
The oracle OS account is the user which installed the Oracle software. We create the account before running the OUI. The oracle user has no direct relationship to any database accounts.
Processes inside the database can only read or write files in directories which the OS account can access. These processes include UTL_FILE, Data Pump, external tables, Java stored procedures running OS commands, and extprocs, as well as background things like the alert log, dumps and trace files.
No, I think it means that UTL_FILE and TEXT_IO provide equivalent functionality for server and client respectively.
