So this is a bit of an odd request, but I'm hoping someone on here knows some command-line fu. Might have to post to Server Fault too; we'll see.
I'm trying to figure out how I can pass the results of a curl request to the mysql command-line application. So basically something kinda like this:
mysql --user=root --password=my_pass < (curl http://localhost:3000/application.sql)
where that URL returns a plain-text response containing SQL statements.
Some context:
An application I am developing supports multiple installations; as part of the installation process for a new instance, we spin up a copy of our "data" database for it.
I'm trying to automate the deployment process as much as possible, so I built a small "dashboard" app in Rails that generates the SQL statements, config files, etc. for each instance and also helps us see stats about the instances and other fun stuff. Now I'm writing Capistrano tasks to actually do a deployment based on the ID of the installation, which I pass in as a variable.
The initial deployment setup includes creating the application's database, which the SQL from this request will do. I could in theory pull the file down with wget, execute it, and delete it, but I thought it would be cleaner to just tell the remote server to curl the SQL and execute it in one step.
So any ideas?
I'm fairly certain the syntax you have won't work, as '<' expects a file. Instead you want to pipe the output of curl, which prints to STDOUT by default, into mysql.
I believe the following will work for you.
curl http://localhost:3000/application.sql | mysql --user=root --password=my_pass
In Bash, you can do process substitution:
mysql ... < <(curl ...)
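If it helps, here's a slightly hardened sketch of the pipe form for the deployment use case; the -fsS flags and pipefail are my additions (not part of the original one-liner), and the URL and credentials are just the placeholders from the question:
set -o pipefail   # bash: fail the pipeline if curl fails, not only if mysql does
curl -fsS http://localhost:3000/application.sql | mysql --user=root --password=my_pass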
Related
I have a script on a Unix server which looks like this:
mainScript.sh
#some stuff here
emailScript.sh $PARAM_1 $PARAM_2
#some other stuff here
As you can see, mainScript.sh is calling another script called emailScript.sh.
emailScript.sh is supposed to perform a query via sqlplus, then parse the results and send them via email if there are any.
The interesting part of the code in emailScript.sh is this:
DB_SERVER=$1
USERNAME=$2
PASSWORD=$3
EVENT_DATE=$4
LIST_AUTHORIZED_USERS=$5
ENVID=$6
INTERESTED_PARTY=$7
RAW_LIST=$(echo "select distinct M_OS_USER from MX_USER_CONNECTION_DBF where M_EVENT_DATE >= to_date('$EVENT_DATE','DD-MM-YYYY') and M_OS_USER is not null and M_OS_USER not in $LIST_AUTHORIZED_USERS;" | sqlplus -s $USERNAME/$PASSWORD@$DB_SERVER)
As you can see, all I do is create the variable RAW_LIST by executing a query with sqlplus.
The problem is the following:
If I call mainScript.sh from the command line (PuTTY / KiTTY), the sqlplus command works fine and returns something.
If I call mainScript.sh from an external job (an SSH connection opened on the server by a Jenkins job), sqlplus returns nothing and takes 0 seconds, meaning it doesn't even try to execute.
In order to debug, I've printed all the variables and the query itself to check whether something wasn't properly set: everything is correctly set.
It really seems that the sqlplus command is not recognized, or something like that.
Would you have any idea how I can debug this? Where should I look for the issue?
You need to consider a few things here. When you run the script yourself, which directory are you executing it from? And when your external application runs it, which directory is it executing from? It is better to use the full path to the script, like /path/to/the/script/script.sh, or to cd /path/to/the/script/ first and then execute it. Also check the execute permissions for your application: you as a user might have permission to run the script or the sql command, but your application might not. Check the user ID your application runs as and add it to the proper group.
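As a rough debugging sketch along those lines (the paths are placeholders and the argument list just mirrors the snippet above; adjust both to your setup):
#!/bin/sh
# run from an absolute location so the Jenkins shell and your interactive shell agree
cd /path/to/the/script || exit 1
# show which user and working directory the job actually has,
# and whether sqlplus is even on the PATH for that user
pwd
id
command -v sqlplus || echo "sqlplus not found on PATH for this user" >&2
./emailScript.sh "$PARAM_1" "$PARAM_2"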
I'm working with and discovering the world of Zabbix. In particular I am trying to monitor an Oracle database from the Zabbix server through an external script. Other external scripts work; however, for the one I created with sqlplus, Zabbix reports "command not found". Can you tell me why?
The code is:
check.pl
#!/usr/bin/perl
use strict;
use warnings;
my $out = `echo "select * from v\$version;" | sqlplus user/password\@ip_database:port`;
print $out;
The code is very simple.
I created an item as usual, set the type to "external check", and entered my script as the key. Can anyone solve my problem? Also, if I was not clear, just ask for more information rather than "insulting" me on the forum. Thanks to everyone in advance!
I RESOLVED IT WITH:
echo "/usr/lib/oracle/11.2/client64/lib" > /etc/ld.so.conf.d/oracle.conf
echo "export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib/oracle/11.2/client64/lib" >> /etc/profile
THANKS TO ALL!!!!
Apparently, your Zabbix server does not have the necessary environment to find sqlplus. You could simply use the full path to sqlplus in your script (but that alone might not be enough), or create a wrapper script that sets all the necessary environment variables for your script.
From TFM:
The command will be executed as the user Zabbix server runs as, so any
access permissions or environment variables should be handled in a
wrapper script, if necessary, and permissions on the command should
allow that user to execute it.
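A rough sketch of such a wrapper; the Oracle paths mirror the ones from the fix above and are only an assumption for other installations:
#!/bin/sh
# wrapper the Zabbix external check calls instead of check.pl directly
export ORACLE_HOME=/usr/lib/oracle/11.2/client64    # assumed client location, as in the fix above
export PATH=$PATH:$ORACLE_HOME/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$ORACLE_HOME/lib
exec /path/to/check.pl "$@"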
You also have to configure the libraries required to run sqlplus. In the script you use to start the Zabbix server, you can set the Oracle variables below so that Zabbix can find all the libraries it needs.
export ORACLE_HOME={path to Oracle Client}
export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:${ORACLE_HOME}/lib
If there are still issues related to .so files, then there is probably a problem with your Oracle client installation.
Actually, I have a shell script which calls an Informatica workflow, but I want to add functionality to the script to catch data errors during workflow processing, if required, and show an on-screen message like "error is coming due to wrong data, please refer to the logs". Currently the log is generated, but I am unable to show the screen message from the shell script.
Below is the command to call the workflow:
pmcmd startworkflow -sv CSA_DEV_INT -d Domain_CSADevelopment -u Administrator -p Administrator -f Sumit -wait wf_ERROR_LOG_TESTING
pwc_status=$?
But the value of pwc_status comes back as 0 even though I processed wrong data, and the Informatica logs catch the error.
As long as the pmcmd call itself is successful (i.e. the server is found, the user can be authenticated, the workflow starts) it will return 0, even if there are errors while processing data. Use the getworkflowdetails or gettaskdetails commands of the pmcmd utility to obtain details related to the workflow execution.
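As a rough sketch of the idea (the connection options mirror the startworkflow call above; the exact status text to grep for is an assumption, so check the Command Reference for the wording your version prints):
details=$(pmcmd getworkflowdetails -sv CSA_DEV_INT -d Domain_CSADevelopment -u Administrator -p Administrator -f Sumit wf_ERROR_LOG_TESTING)
# treat anything that does not report success as a data/processing problem
echo "$details" | grep -qi "succeeded" || echo "error is coming due to wrong data, please refer to the logs" >&2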
For more information about these commands see the Command Reference - you can find it in the Informatica installation directory on your server or download it from the Informatica My Support site (you need to be a registered user).
32-bit mongo 2.0.1 on a Windows XP machine
//script filename: test.js (a one-line mongo shell script to store a person)
db.cTest.save({Name: "Fred", Age:21});
I run it against database dbTest by entering the following two shell commands:
> use dbTest
switched to db dbTest
> load("test.js")
So far, so good.
But if I try and include the "use" statement in the script it fails:
//script filename: test.js (including "use" statement)
use dbTest;
db.cTest.save({Name: "Fred", Age:21});
It fails with an error message as follows:
> load("test.js")
SyntaxError: missing ; before statement
Mon Dec 19 11:56:31: Error: error loading js file temp.js (shell):1
Adding or removing semicolons in test.js doesn't seem to matter.
So how do you put a "use" directive into a mongo shell script?
In a mongo script you can use db.getSiblingDB('new_db_name') to get a reference to another database, so it is not mandatory to give the database name on the command line. You can use this script.js:
db = db.getSiblingDB('new_db_name');
print(db);
// the rest of your code for database "new_db_name"
and the output of this script is (invoked with mongo script.js):
MongoDB shell version: 2.2.2
connecting to: test
new_db_name
http://www.mongodb.org/display/DOCS/Scripting+the+shell
use dbname
This command does not work in scripted mode. Instead you will need to explicitly define the database in the connection (/dbname in the example above).
Alternately, you can also create a connection within the script:
db2 = connect("server:27017/otherdbname")
Well, it still is unfortunate that "load('file.js')" and "mongo file.js" don't actually use the same script interpreter as the interactive mongo shell. Opening the connection explicitly in the script is potentially a violation of the DRY principle because mongo already knows that information. What does work, though, is piping the file into mongo rather than passing its name on the command line:
mongo <file.js
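For example, a quick sketch with a here-document, using the database and collection from the question; when the input comes from stdin like this, "use" is handled the same way as in the interactive shell:
mongo <<'EOF'
use dbTest
db.cTest.save({Name: "Fred", Age: 21})
EOF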
I'm trying to execute the command "file /directory/*" through the web, using Ajax that calls a Perl script.
When I run the script from the server I get the MIME type correctly, but when I use the web page that triggers the Ajax call, I get "application/x-empty".
If I run the command from the server using "sudo -u apache perl_script.pl", the result is correct.
Why do I get a different response from the Ajax call?
Try it without the asterisk, using a complete filename instead.
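For instance, something like the following from a shell, along the lines of your own sudo test; the concrete filename is just a placeholder:
sudo -u apache file /directory/*
sudo -u apache file /directory/some_known_file.txt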