This question already has answers here:
Pass commands as input to another command (su, ssh, sh, etc)
(3 answers)
Closed 3 years ago.
I have a couple of commands that need to be run as root:
cd FolderName
sudo su
export VARIABLE_NAME=120
. install/setup.bash
ros2 run node node
I tried to create a script from these commands, but after the sudo su command, the script stops.
How can I run this whole set of commands as root from a bash script?
The best way to do this is to run the script itself as the root user, like so:
$ cat install.bash
#!/bin/bash
cd FolderName
export VARIABLE_NAME=120
. install/setup.bash
ros2 run node node
And then run it as root:
$ su root install.bash
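Alternatively, in the spirit of the linked duplicate (passing commands as input to another command), only the root-only portion can be piped into an elevated shell from a normal script. A minimal sketch, assuming sudo is available and FolderName exists:
#!/bin/bash
cd FolderName
# The quoted here-document keeps everything literal; all three commands
# run in one root shell, so the export and the sourced file stay in effect.
sudo bash <<'EOF'
export VARIABLE_NAME=120
. install/setup.bash
ros2 run node node
EOF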
This question already has an answer here:
Ansible inside script command not found
(1 answer)
Closed 3 years ago.
I am working on a script to get a list of files from remote servers; each remote server has its own password, IP, user, and path. That's why I am using sshpass to pass the password parameter dynamically. Here is my script, which for the time being just gets the list of files:
user='username'
pass='password'
ip='hostip'
PATH='hostpath'
data=`/usr/bin/sshpass -p $pass sftp -oBatchMode=no -b - -oStrictHostKeyChecking=no $user@$ip <<_EOF_
cd $PATH
ls
_EOF_`
echo $data >> list_of_files.txt
When I run this script, I get the following error:
ERROR: sshpass: Failed to run command: No such file or directory
while the exact same command works fine outside the script, on the command line. It just doesn't work from within the script. I tried running it both as root and as a non-root user. I don't know why it fails only inside the script.
It was a super silly mistake on my part. By defining a variable named PATH, I overwrote the shell's environment PATH, so commands could no longer be found, and that's why I was getting that error.
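The fix is simply a variable name that does not shadow PATH (LOCATION_OF_REMOTE is an arbitrary choice):
# Wrong: assigning to PATH clobbers the shell's executable search path,
# so sftp (looked up through PATH) can no longer be found.
PATH='hostpath'
# Fine: any name that does not collide with PATH.
LOCATION_OF_REMOTE='hostpath'
# ...and inside the sftp batch: cd $LOCATION_OF_REMOTE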
This question already has answers here:
Multiple commands on remote machine using shell script
(3 answers)
Why would SSH commands through Putty work differently to those via PHP's phpseclib?
(1 answer)
Paramiko: calling "cd" command with exec_command does nothing
(3 answers)
Closed 4 years ago.
I have an instance my-instance running on Google Cloud. I run the following commands on my local machine:
gcloud compute ssh my-instance --command 'pwd'
gcloud compute ssh my-instance --command 'cd ..'
gcloud compute ssh my-instance --command 'pwd'
My output is:
/home/pal
/home/pal
My expectation is:
/home/pal
/home
It seems that cd .. is not working. Why is that? It is surprising to me, especially since pwd works as shown above. How can I solve the issue?
I used this page as a source: https://cloud.google.com/sdk/gcloud/reference/compute/ssh
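As the linked duplicates explain, each gcloud compute ssh --command invocation starts a fresh shell on the instance, so a cd in one invocation cannot affect the next. The commands have to share a single invocation, for example (a sketch using the same instance name):
gcloud compute ssh my-instance --command 'cd .. && pwd'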
This question already has answers here:
Pass commands as input to another command (su, ssh, sh, etc)
(3 answers)
Closed 5 years ago.
I'm a complete noob to writing bash scripts. I'm trying to do the following:
#!/bin/bash
mkdir New_Project
cd New_Project
pipenv install ipykernel
pipenv shell
python -m ipykernel install --user --name=new-virtual-env
jupyter notebook
The problem I'm having is that after it executes pipenv shell, it starts the new shell and then doesn't execute the last two commands. When I exit the new shell, it then tries to execute the remaining lines. Is there any way to get a script to run all of these commands from start to finish?
As per the manual:
shell will spawn a shell with the virtualenv activated.
which is not what you need. Instead, use run:
run will run a given command from the virtualenv, with any arguments forwarded (e.g. $ pipenv run python).
In your case, something like
pipenv run python -m ipykernel install --user --name=new-virtual-env
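Applied to the original script, the whole thing might look like this (a sketch; it assumes jupyter is available inside the virtualenv, e.g. after pipenv install jupyter):
#!/bin/bash
mkdir New_Project
cd New_Project
pipenv install ipykernel
# Run the remaining commands inside the virtualenv without spawning an interactive shell
pipenv run python -m ipykernel install --user --name=new-virtual-env
pipenv run jupyter notebook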
Using the Linux terminal, I run bash scripts (.sh files) containing sequences of commands I want to execute.
The issue is that I am unable to run a Docker command from within my shell script. I can run this Docker command when it's typed directly at the terminal with root privileges but not when I include it in the shell script file.
My script, executed as a general user from the command line, looks like this:
#!/usr/bin/env bash
cd /home/user/docker_backup
# remove /home/user/docker_backup/data
rm -rf data
# Switch to root privileges. my system is set to only run Docker as root
su
# Copy a folder from Docker container to host OS
docker cp <container-name>:/home/user/data /home/user/docker_backup
# More general user commands
cd ..
My code only runs until the su line above. After I enter the root password, nothing happens. If I type exit, I get permission errors, meaning the docker cp command failed.
**This is my desired solution**
After thorough research, as I wanted to run my script as a general user and only run certain commands as root when necessary, I came up with a solution that works.
My script now looks like this (run with $ sh script_name.sh):
#!/usr/bin/env bash
cd /home/user/docker_backup
# remove /home/user/docker_backup/data
rm -rf data
# Switch to root privileges. my system is set to only run Docker as root
su - root -c "docker cp <container-name>:/home/user/data /home/user/docker_backup"
# More general user commands
cd ..
Run the shell script as a general user. For commands that require root privileges, I use su - root -c "<command>". The terminal prompts for the root password, executes the command in quotes as root, and the shell then proceeds as the general user.
Actually posting this as an answer:
You switch your current user to root during the script, but the script was executed by your own user.
So the docker cp command will also be executed as your own user, but you will be logged into the root account.
This results in you not seeing the output of docker cp (which might have given you insight into why it wasn't working; I suspect insufficient privileges).
A solution to this is either using sudo before docker cp, starting the script as root, or adding your user to the "docker" group, which authorizes your user to run docker commands.
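For the last option, adding the user to the docker group is typically done like this (a sketch; the new group membership only takes effect after logging out and back in, or after running newgrp docker):
# Append the current user to the supplementary group "docker"
sudo usermod -aG docker $USER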
I had a similar issue where the docker commands ran fine in the terminal but not when I put them into a bash script, and it came down to two reasons.
The docker commands need to be run with elevated privileges, that is, with the sudo command (e.g. sudo docker ps works but docker ps won't). You can add the current user to the docker group so that sudo is not needed with each docker command; please visit this link and follow section 2 to do the same.
Run the script in the correct way:
One should have #!/bin/bash at the start of the script. It is a shebang line, which every script needs.
One should save the file without the .sh extension.
One should give the script execute permission with chmod 777 script_name.
Run the script with bash script_name, as in the sketch below.
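Putting those points together, a minimal sketch (the sudo docker ps command stands in for whatever the real script does):
#!/bin/bash
# Prefix each docker command with sudo unless the user has been added to the docker group
sudo docker ps
Then give it execute permission with chmod 777 script_name and run it with bash script_name.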
I have a simple shell script that is run with sudo, as most of the script requires it, yet one of the commands in the script is a Homebrew install, which cannot be executed with sudo.
So, my question is when executing a shell script with sudo how do I execute sub commands as the current user and then continue the remainder of the script with sudo.
Prompting the user to enter his password again is not really practical as the script takes really long to execute and would require waiting 5-10 min for the prompt.
The easiest way is to run the subcommand via sudo from within the script. The user ID to run with can be obtained from $SUDO_USER (look at the output of sudo env):
sudo -u $SUDO_USER ./exec_as_normal_user.sh
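In context, a script combining root work with a user-level Homebrew install might look like this (a sketch; setup.sh, /opt/myapp, and the wget package are hypothetical placeholders):
#!/bin/bash
# Run with: sudo ./setup.sh
mkdir -p /opt/myapp                      # hypothetical task that needs root
# Homebrew refuses to run as root, so drop back to the invoking user:
sudo -u "$SUDO_USER" brew install wget   # package name is just an example
chown -R "$SUDO_USER" /opt/myapp         # back to root for the remainder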
Instantiate the shell using
sudo -u $USER_NAME bash
and execute the shell script by calling:
./program.sh