The 2nd command is not executed until the previous one is terminated - bash

I'm very new to bash. I'm running a bash script. It is supposed to start Neo4j and then execute a series of queries located in a file called "cypher.ex1". Here is the code:
#!/bin/bash
./bin/neo4j console
./bin/cypher-shell -u neo4j -p 123456 --file cypher.ex1
In order to use Cypher-shell, we should start the Neo4j service first. So, this line:
./bin/neo4j console
starts Neo4j, so that cypher-shell can then be used via the following line:
./bin/cypher-shell -u neo4j -p 123456 --file cypher.ex1
The problem is that since ./bin/neo4j console starts the Neo4j service in the foreground, the next command (./bin/cypher-shell -u neo4j -p 123456 --file cypher.ex1) is not executed unless I press Ctrl+C. But if I press Ctrl+C, the Neo4j service is stopped, so the following command is not executed either (I get a "connection refused" error). What should I do in order to start the Neo4j service and then run the cypher shell in this bash script?
I tried the solutions given in Run a command in background and another command in frontground in the same line, but none of them worked for me. For example, when I execute the code as "(command1 &); command2" (as suggested in that topic), my script is executed twice automatically: the first time command2 runs and, since command1 has not run, I get a "connection refused" error; the second time command1 runs and command2 does not.

As mentioned in the first answer of that Q/A, you should execute the commands this way:
#!/bin/bash
./bin/neo4j console &
./bin/cypher-shell -u neo4j -p 123456 --file cypher.ex1
This will run ./bin/neo4j console in the background, so you need to take care of this process and stop it when needed:
PID=$(jobs -l | awk '{print $2}')
kill "$PID"
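A variant of the same idea (a sketch, with a sleep standing in for the Neo4j process, since the real paths come from the asker's setup) captures the background PID directly with $!, which is less fragile than parsing jobs -l:

```shell
#!/bin/bash
# Stand-in for "./bin/neo4j console": any long-running foreground process.
sleep 30 &
NEO4J_PID=$!   # $! is the PID of the most recently backgrounded job

# You may need to poll here until the service accepts connections
# before running the client command (stand-in for cypher-shell):
echo "running queries against PID $NEO4J_PID"

# Stop the background service when the queries are done.
kill "$NEO4J_PID"
wait "$NEO4J_PID" 2>/dev/null || true
```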

Related

Issue in running my interactive bash script on remote server by doing ssh and with sshpass if VPN disconnects

I am trying to run an interactive bash script kept on a remote machine from my local machine. The script can take 4-5 hours to run, so I need to run it under screen or nohup so that even if the VPN disconnects, the script keeps running.
Here is the command I run from my local machine:
sshpass -p <Mypassword> ssh -t user@hostname.com "cd /directory/ ; ./runscr.sh ; bash --login"
My runscr.sh contains the following:
now=$(date +"%s")
screen -t $now bash ./mainscript.sh bash
I am trying to run the main script in screen (I am open to trying nohup as well if it works).
mainscript.sh takes input for 2 parameters and then starts some further processing: for the given input it extracts data from the main 300 GB file, using zgrep, split and a for loop in bash. That part has no issue; it just takes some 3-4 hours.
read -p 'please enter Input 1 : ' cname
read -p 'please enter Input 2 : ' cid
echo "Thank you, entered cname is $cname and entered cid is $cid. We now have the details. Starting zgrep commands to extract the above cname and cid from the main 300gb file."
...
further code with grep, split and loop commands
This works fine as long as the VPN stays connected, but once the VPN disconnects the processing stops.
What can I do so that even if the VPN disconnects, the script keeps processing and generates the output?
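One approach is to feed the two interactive reads from a here-document and run the script under nohup, so the job no longer needs the terminal at all. This is only a sketch: the stand-in mainscript.sh below and the acme/42 input values are placeholders, not the asker's real script.

```shell
#!/bin/bash
# Create a stand-in for mainscript.sh that reads two values, like the real one.
cat > mainscript.sh <<'SCRIPT'
read -p 'please enter Input 1 : ' cname
read -p 'please enter Input 2 : ' cid
echo "processing $cname $cid"
SCRIPT

# Supply the answers via a here-document and detach with nohup, so a dropped
# VPN/SSH session does not kill the job; progress lands in main.log.
nohup bash ./mainscript.sh > main.log 2>&1 <<EOF &
acme
42
EOF
wait
```

The same here-document trick also works inside a detached screen session (screen -dmS name ...), if reattaching later is preferred over a plain log file.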

Loop through docker output until I find a String in bash

I am quite new to bash (barely any experience at all) and I need some help with a bash script.
I am using docker-compose to create multiple containers - for this example let's say 2 containers. The 2nd container will execute a bash command, but before that, I need to check that the 1st container is operational and fully configured. Instead of using a sleep command I want to create a bash script that will be located in the 2nd container and once executed do the following:
Execute a command and log the console output in a file
Read that file and check if a String is present. The command that I will execute in the previous step will take a few seconds (5-10) to complete, and I need to read the file after it has finished executing. I suppose I can add sleep to make sure the command has finished executing, or is there a better way to do this?
If the string is not present I want to execute the same command again until I find the String I am looking for
Once I find the string I am looking for I want to exit the loop and execute a different command
I found out how to do this in Java, but I need to do this in a bash script.
The docker-containers have alpine as an operating system, but I updated the Dockerfile to install bash.
I tried this solution, but it does not work.
#!/bin/bash
[command to be executed] > allout.txt 2>&1
until
    tail -n 0 -F /path/to/file | \
    while read LINE
    do
        if echo "$LINE" | grep -q "$string"
        then
            echo -e "$string found in the console output"
        fi
    done
do
    echo "String is not present. Executing command again"
    sleep 5
    [command to be executed] > allout.txt 2>&1
done
echo -e "String is found"
In your docker-compose file, make use of the depends_on option.
depends_on takes care of the startup and shutdown sequence of your multiple containers.
But it does not check whether a container is ready before moving on to the next container's startup. To handle this scenario, check this out.
As described in this link,
You can use tools such as wait-for-it, dockerize, or sh-compatible wait-for. These are small wrapper scripts which you can include in your application’s image to poll a given host and port until it’s accepting TCP connections.
OR
Alternatively, write your own wrapper script to perform a more application-specific health check.
In case you don't want to use the above tools, check this out. There they use a combination of HEALTHCHECK and the service_healthy condition, as shown here. For a complete example check this.
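A minimal sketch of that HEALTHCHECK plus service_healthy combination; the service names, images, and the curl probe below are assumptions for illustration, not taken from the asker's setup:

```yaml
services:
  first:
    image: first-image            # placeholder
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/health"]  # placeholder probe
      interval: 5s
      timeout: 3s
      retries: 10
  second:
    image: second-image           # placeholder
    depends_on:
      first:
        condition: service_healthy  # wait until "first" reports healthy
```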
Just:
while :; do
    # 1. Execute a command and log the console output in a file
    command > output.log
    # TODO: handle errors, etc.
    # 2. Read that file and check if a String is present.
    if grep -q "searched_string" output.log; then
        # Once I find the string I am looking for I want to exit the loop
        break
    fi
    # 3. If the string is not present, execute the same command again
    #    (add e.g. `sleep 0.1` here so the loop does not use 100% CPU)
done
# ...and execute a different command
different_command
You can timeout a command with timeout.
Notes:
colon (:) is a utility that returns a zero exit status, much like true; I prefer while : to while true, and they mean the same.
The code presented should work in any POSIX shell.
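To bound the polling, the whole loop can be wrapped in timeout (a sketch: the 60-second limit is arbitrary, and an echo stands in for the real command):

```shell
#!/bin/bash
# timeout kills the loop and exits with status 124 if the string never appears
# within 60 seconds; otherwise the loop's own exit status is returned.
timeout 60 bash -c '
  while :; do
    echo "some output with searched_string" > output.log  # stand-in command
    if grep -q "searched_string" output.log; then
      break
    fi
    sleep 0.1
  done
'
echo "poll finished with status $?"
```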

Why is this script doing nothing after the for loop?

I have this bash script. It uses cqlsh to wait for a cassandra schema to be set up. The arguments to the script are the cassandra db keyspaces. Essentially it is checking that there is at least 1 entry in the schema_updates table, which records all database migrations. If there is any entry in this table, it means the keyspace owning that table is now available.
The issue is that none of the commands after the for loop are being executed when I use this script as a docker-compose entrypoint. It works fine if I just call it up directly.
I don't think the problem has anything to do with cassandra or the cassandra query. I have tested each line individually, I have run the whole script on my local machine, and I have gone into the container started by the docker-compose file to run the script manually; it works exactly as expected in all three cases.
#!/usr/bin/env bash
for keyspace in "$@"; do
    KEYSPACEFOUND=1
    until [[ $KEYSPACEFOUND = 0 ]]; do
        cqlsh -u cassuser -p casspwd cassandra -e "select filename from $keyspace.schema_updates limit 1" 2>/dev/null | grep "(1 rows)" >/dev/null 2>&1
        KEYSPACEFOUND=$?
    done
done
echo "All keyspaces are available"
exec ./bin/applicationStartScript
It turns out that one of the arguments I was passing to the script was not a valid keyspace. This meant that the until loop would never terminate and therefore neither would the for loop.
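A defensive variant guards against exactly that failure mode by capping the attempts per keyspace. This is a sketch: check_keyspace_ready below is a stand-in for the real cqlsh ... | grep "(1 rows)" pipeline, and the cap of 5 is arbitrary.

```shell
#!/usr/bin/env bash
# Stand-in for the cqlsh | grep "(1 rows)" pipeline; replace with the real query.
check_keyspace_ready() { [ "$1" = "demo_ks" ]; }

MAX_TRIES=5
wait_for_keyspaces() {
  local keyspace tries
  for keyspace in "$@"; do
    tries=0
    until check_keyspace_ready "$keyspace"; do
      tries=$((tries + 1))
      if [ "$tries" -ge "$MAX_TRIES" ]; then
        echo "keyspace $keyspace not available after $MAX_TRIES tries" >&2
        return 1
      fi
      sleep 0.1
    done
  done
  echo "All keyspaces are available"
}

wait_for_keyspaces demo_ks
```

With the real cqlsh call substituted in, a bad keyspace argument now fails fast with a clear message instead of hanging the entrypoint.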

bash call is creating a new process. I want to run the next command in the same process

I am logging into a remote server using an SSH client. I have written a script that should execute two commands on the server. But the first command runs a bash script that calls the "bash" command at the end, so only the first command is executed, not the other.
I cannot edit the first script to comment out or remove the bash call.
I have written the following script:
abc.sh
#!/bin/bash
command1="sudo -u user_abc -H /abc/xyz/start_shell.sh"
command2="./try1.sh"
$command1 && $command2
Only command 1 is getting executed, not the second: the "bash" call creates a new interactive process, so the second command never runs.
Solution 1
Since you can execute start_shell.sh you must have read permissions. Therefore, you could copy the script, modify it such that it doesn't call bash anymore, and execute the modified version.
I think this would be the best solution. If you really really really have to use start_shell.sh as is, then you could try one of the following solutions.
Solution 2
Try closing stdin using <&-. An interactive bash session will exit immediately if there is no stdin.
sudo -u user_abc -H /abc/xyz/start_shell.sh <&-; ./try1.sh
Solution 3
Change the order if both commands are independent.
./try1.sh; sudo -u user_abc -H /abc/xyz/start_shell.sh
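The effect of closing stdin can be demonstrated with a stand-in for start_shell.sh (a sketch; the real script's trailing bash call may behave slightly differently, but the principle is the same):

```shell
#!/bin/bash
# A stand-in that, like start_shell.sh, ends by reading from stdin forever.
cat > fake_start_shell.sh <<'SCRIPT'
echo "service started"
while read -r line; do echo "got: $line"; done  # blocks until stdin is closed
SCRIPT

# With stdin closed via <&-, the read loop ends at once and the next command runs.
bash fake_start_shell.sh <&- 2>/dev/null ; echo "second command ran"
```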

Run bash script loop in background which will write result of jar command to file

I'm a novice at running bash scripts. (You can correct me if the title I've given is wrong.)
I want to run a jar file using a bash script in a loop, and it should write the output of the jar command into a file.
Bash file datagenerate.sh
#!/bin/bash
echo "Total iterations are 500"
for i in {1..500}
do
    the_output="$(java -jar data-generator.jar 10 1 mockData.csv data_200GB.csv)"
    echo "$the_output"
    echo "Iteration $i processed"
done
no_of_lines="$(wc -l data_200GB.csv)"
echo "${no_of_lines}"
I'm running the above script with the command nohup sh datagenerate.sh > datagenerate.log &. I want to run this script in the background so that even if I log out from SSH it keeps running, and the output goes into datagenerate.log.
But when I run the above command and press Enter or close the terminal, the process ends. Only Total iterations are 500 gets logged to the output file.
Let me know what I'm missing. I followed these two links to create the above shell script: link-1 & link2.
nohup sh datagenerate.sh > datagenerate.log &
nohup should work this way without using screen program, but depending on your distro your sh shell might be linked to dash.
Just make your script executable:
chmod +x datagenerate.sh
and run your command like this:
nohup ./datagenerate.sh > datagenerate.log &
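A quick end-to-end check of the pattern (a sketch: a three-iteration loop stands in for the 500 jar runs):

```shell
#!/bin/bash
# Stand-in for datagenerate.sh: a short loop instead of the jar invocations.
cat > datagenerate.sh <<'SCRIPT'
#!/bin/bash
echo "Total iterations are 3"
for i in 1 2 3; do
  echo "Iteration $i processed"
done
SCRIPT
chmod +x datagenerate.sh

# Run detached: output goes to the log, and the job survives the terminal closing.
nohup ./datagenerate.sh > datagenerate.log 2>&1 &
wait
```

After launching the real script this way, you can log out; tail -f datagenerate.log from a later session shows the progress.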
You should check this out:
https://linux.die.net/man/1/screen
With this program you can close your shell while a command or script is still running. It will not be aborted, and you can pick the session up again later.
