Can someone tell me how to password protect an existing SQL Server CE 4 file?
Specify a password in the connection string that you pass to the SqlCeEngine.Compact method. Alternatively, you can use SqlCeCmd from the command line:
sqlcecmd40 -d "Data Source=C:\test.sdf;" -z "Data Source=;Password=secret"
I'm trying to make a request to a function in a SAP RFC server hosted at 10.123.231.123 with user myuser, password mypass, sysnr 00, client 076, language E. The name of the function is My_Function_Nm with params: string Alternative, string Date, string Name.
I use the command line:
/usr/sap/nwrfcsdk/bin/startrfc -h 10.123.231.123 -s 00 -u myuser -p mypass -c 076 -l en -F My_Function_Nm
But it always shows me the help instructions.
I guess I'm not specifying the -E pathname=edifile option, and that is because I don't know how to create an EDI file containing the parameter values for the specified function. Maybe someone can help me with how to create this file and how to correctly invoke startrfc to call this function?
Thanks in advance.
If you actually check the help text the program shows, you should find the following passages:
RFC connection options:
[...]
-2 SNA mode on.
You must set this if you want to connect to R/2.
[...]
-3 R/3 mode on.
You must set this if you want to connect to R/3.
Apparently you forgot to specify -3. With that flag added, your original command becomes:
/usr/sap/nwrfcsdk/bin/startrfc -3 -h 10.123.231.123 -s 00 -u myuser -p mypass -c 076 -l en -F My_Function_Nm
You should use sapnwrfc.ini to store your connection parameters; it should be placed in the same directory as the client program.
A sample file for your app would look like the following:
DEST=TST1
ASHOST=10.123.231.123
USER=myuser
PASSWD=mypass
SYSNR=00
CLIENT=076
RFC_TRACE=0
Documentation on using this file is here.
To call the function you can create a shell script, but a Python script is the better option.
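For example, here is a minimal sketch using SAP's open-source PyRFC binding (my assumption; any NW RFC SDK wrapper will do). The connection values and parameter names are taken from the question; the parameter values are placeholders:

from pyrfc import Connection

# Connection values from the question; alternatively pass dest='TST1'
# and PyRFC will read the parameters from sapnwrfc.ini.
conn = Connection(ashost='10.123.231.123', sysnr='00',
                  client='076', user='myuser', passwd='mypass', lang='EN')

# Parameter names must match the function module's interface exactly;
# the values here are placeholders.
result = conn.call('My_Function_Nm',
                   Alternative='...', Date='...', Name='...')
print(result)
conn.close()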
I have to prepare a few scripts for importing data into an Oracle database, and I will have to run them against different databases.
For each table to be imported I have a data and control file:
table1.dat
table1.ctl
table2.dat
table2.ctl
etc..
For each table I have prepared a separate .bat file that runs SQL*Loader:
table1.bat:
sqlldr login/password@database control=table1.ctl log=table1.log
It is an easy and simple solution as long as I don't have to run it against different databases and change the login credentials.
What I would like to do is have one file with the login and password that runs the loading scripts for each table.
Have you got any suggestions on how it could be done?
Regards
Pawel
I hope I understood your question.
In your .bat file you can connect to any database, but your sqlldr login decides which database the import runs against.
I would call a start.sql in the .bat file where I do something like this:
-- database 1
host sqlldr login/password@database1 control=table1.ctl log=table1_db1.log
host sqlldr login/password@database1 control=table2.ctl log=table2_db1.log
-- database 2
host sqlldr login/password@database2 control=table1.ctl log=table1_db2.log
host sqlldr login/password@database2 control=table2.ctl log=table2_db2.log
Another option is to call import_db1.sql from your start file and write your code concerning database 1 there, etc.
start.sql
@@import_db1.sql
@@import_db2.sql
import_db1.sql
-- database 1
host sqlldr login/password@database1 control=table1.ctl log=table1_db1.log data=csvfile.csv
host sqlldr login/password@database1 control=table2.ctl log=table2_db1.log data=csvfile.csv
etc.
Your issue isn't very clear; however, it sounds like you just want to source a username/password per server, in which case in bash you can do:
. /dir/to/file/.sql_password_file
where .sql_password_file has the entry:
SQLLDRLOGON='user/pass'
then in your script you can do
sqlldr userid=$SQLLDRLOGON control=table1.ctl log=table1.log
I would also look into changing your script to use a loop, e.g.:
for load in table1 table2
do
  loads="control=${load}.ctl bad=${load}.bad log=${load}.log"
  sqlldr userid=$SQLLDRLOGON $loads
done
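If you need the same thing from the Windows side (where the question's .bat files live), here is a rough, portable sketch of the same loop in Python, assuming sqlldr is on the PATH and a hypothetical plain-text file holding just the user/password@database string:

import subprocess

# Hypothetical credentials file containing a single line such as
# login/password@database
with open("sqlldr_logon.txt") as f:
    logon = f.read().strip()

for table in ["table1", "table2"]:
    args = ["sqlldr", f"userid={logon}",
            f"control={table}.ctl", f"log={table}.log", f"bad={table}.bad"]
    subprocess.run(args, check=True)  # raises if a load fails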
I am running xemacs with a .sql-mode file containing the following:
(setq sql-association-alist
      '(
        ("XDBST (mis4) " ("XDBST" "xsius" "password"))
        ("dev " ("DEVTVAL1" "xsi" "password" "devbilling"))
        ))
When I log in to the database in xemacs by selecting Utilities->Interactive Mode->Use Association, it logs me in but it does not pick up the database parameter. For example, when I log in to "dev", it logs me in but then when I do "select db_name()" it yields csdb instead of devbilling. It appears that it is picking up the default database associated with the user and ignoring the database parameter. How do you configure xemacs so that it picks up the database parameter specified in .sql-mode when the option is selected?
Thanks,
Mike
I did some more research: xemacs uses sql-mode.el (on my system in /usr/local/xemacs/lisp/sql-mode.el) to log in with SQL Mode. The code in that file does not use the database specified in .sql-mode in Interactive Mode. It does, however, use it in Batch Mode. You can use Batch Mode as a workaround.
I am trying to check the availability of a DB2 instance via the db2cli utility, as follows:
db2cli execsql -user USER -passwd PASSWD -connstring DATABASE:HOST:PORT
(with actual values for the uppercased text). I would expect this to connect to HOST:PORT, using the credentials USER and PASSWD, and to switch to database DATABASE.
As a result I get:
SQLError: rc = 0 (SQL_SUCCESS)
SQLGetDiagRec: SQLState : 08001
fNativeError : -1024
szErrorMsg : [IBM][CLI Driver] SQL1024N A database connection does not exist. SQLSTATE=08003
cbErrorMsg : 82
But: these values WORK on the same machine if I use them as credentials in applications that connect to DB2, so I would expect the given command to connect as well.
My question is: am I using db2cli wrong?
You are using the wrong connection string as well as the wrong options. Check the correct command syntax by running the "db2cli execsql -help" command.
You can use the -user and -passwd options with the -dsn option only. If you are using a connection string, then uid and pwd should be part of the -connstring option value. Also, the syntax of your connection string is wrong. It must be keyword=value pairs separated by semicolons and enclosed in quotes, like "key1=val1;key2=val2;key3=val3". The correct command is:
db2cli execsql -connstring "DATABASE=dbname;HOSTNAME=hostname;PORT=portnumber;UID=userid;PWD=passwd"
The output for me is as below:
$ db2cli execsql -connstring "database=bluemix;hostname=192.168.1.20;port=50000;uid=myuid;pwd=mydbpassword"
IBM DATABASE 2 Interactive CLI Sample Program
(C) COPYRIGHT International Business Machines Corp. 1993,1996
All Rights Reserved
Licensed Materials - Property of IBM
US Government Users Restricted Rights - Use, duplication or
disclosure restricted by GSA ADP Schedule Contract with IBM Corp.
> select 'bluemix' from sysibm.sysdummy1
select 'bluemix' from sysibm.sysdummy1
FetchAll: Columns: 1
1
bluemix
FetchAll: 1 rows fetched.
> quit
$
To find out the instance name, you should run the db2level command.
$ db2level
DB21085I This instance or install (instance name, where applicable: "bimaljha") uses
"64" bits and DB2 code release "SQL10054" with level identifier "0605010E".
Informational tokens are "DB2 v10.5.0.4", "s140813", "IP23623", and Fix Pack "4".
Product is installed at "/home/bimaljha/sqllib".
You can try validating the connection as below (it will tell you whether the connection is successful):
db2cli validate -dsn sample -connect
db2cli.ini:
[sample]
hostname=host
pwd=password
port=portnumber
PROTOCOL=TCPIP
database=dbname
uid=username
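For completeness, the same keyword=value connection-string format also works from the programmatic drivers, so you can script the availability check. A minimal sketch with IBM's ibm_db Python driver (an assumption; use whichever driver your working applications already use), with placeholder values:

import ibm_db

# Same "key1=val1;key2=val2;..." syntax as db2cli -connstring.
conn_str = ("DATABASE=dbname;HOSTNAME=hostname;PORT=50000;"
            "PROTOCOL=TCPIP;UID=userid;PWD=passwd")

try:
    conn = ibm_db.connect(conn_str, "", "")  # uid/pwd are in the string
    print("instance reachable")
    ibm_db.close(conn)
except Exception as exc:
    print("connection failed:", exc)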
Is there a MySQL query command to upload/insert all the queries in a .sql file (generated from mysqldump) on a local server to a mysql database on a remote server?
I'd like to do this entirely with MySQL queries from within an application and avoid issuing command-line mysql commands, because I think there would be more overhead in parsing the output that way.
I'm looking for something like this, e.g. in Perl:
my $hostname = "remote_server_address";
my $dsn = "DBI:mysql:dbname:$hostname";
my $user = "user";
my $password = "password";
my $dbh = DBI->connect($dsn, $user, $password);
my $myquery = "SPECIAL_INSERT_QUERYCOMMAND my_local_mysql_query_file.sql";
my $execute = $dbh->prepare($myquery);
$execute->execute;
Update: An additional requirement: is there any "flow-control and resilience" whereby connection issues between the local and remote server are handled so that the entire set of queries gets transferred? And would there be a limit to the size of the file to be transferred? (I hope not.)
I'm not sure why you couldn't use the command-line tool mysql:
cat my_local_mysql_query_file.sql | mysql -uuser -ppassword dbname
or on windows
mysql -uuser -ppassword dbname < my_local_mysql_query_file.sql
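If you would still rather drive this from application code than from a shell, one option is simply to spawn the client and stream the file into it. Here is a minimal sketch using Python's subprocess module (placeholder host and credentials; the same idea works from Perl with open or system). The client executes the dump statement by statement, so the practical size limit is the server's max_allowed_packet per statement rather than the file size, and check=True gives you a hook for your own retry logic:

import subprocess

# Placeholder host/credentials -- substitute your own.
cmd = ["mysql", "-hremote_server_address", "-uuser", "-ppassword", "dbname"]

with open("my_local_mysql_query_file.sql", "rb") as sql:
    # Stream the dump into the client; check=True raises CalledProcessError
    # on failure, a natural place to add reconnect/retry handling.
    subprocess.run(cmd, stdin=sql, check=True)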