I'm trying to streamline the process of rebuilding my PHP/React app, but I'm having trouble with yarn/npm commands. Is there a way to write a script that understands PowerShell syntax? Right now I'm copy-pasting the commands manually, since yarn isn't reading them properly.
Example:
I want yarn copy to run: cp -r src/services build/services; cp -r vendor build/vendor
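For what it's worth, a portable way around this is to avoid shell-specific syntax in the scripts entry: chain the commands with && (which both cmd.exe and sh understand) and use a cross-platform copy helper, since cp isn't a cmd.exe builtin. A minimal package.json sketch, assuming the shx package is added as a devDependency:

{
  "scripts": {
    "copy": "shx cp -r src/services build/services && shx cp -r vendor build/vendor"
  }
}

With that in place, yarn copy behaves the same whether it's launched from PowerShell, cmd.exe, or bash.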
I'm trying to copy a bash script called setup_env.sh, which is in the same directory as my Dockerfile.
How can I run this bash script only once, after the container is created?
My code (at the end of the Dockerfile) is:
RUN mkdir -p /scripts
COPY setup_env.sh /scripts
WORKDIR /scripts
RUN chmod +x /scripts/setup_env.sh
CMD [./scripts/setup_env.sh]
Current error:
/bin/bash: [./scripts/setup_env.sh]: No such file or directory
I don't have a typo in the filename, by the way; I checked.
Moreover, after I solve this and run the image to create a container, how can I make sure this bash script is only called once? Should I just add a check to the script that tests whether some folder exists, and if it does, skip the installation?
Based on the various comments, including mine, here is what your Dockerfile extract should be replaced with:
COPY --chmod=755 setup_env.sh /scripts/
WORKDIR /scripts
CMD /scripts/setup_env.sh
Alternatively, you can use the exec form of CMD, but there is not much added value here since you're not passing any command-line parameters:
CMD ["/scripts/setup_env.sh"]
At this point, I'm not really sure the WORKDIR instruction is useful (it depends on the rest of your Dockerfile and the content of your script).
Regarding running the bash script only once, I think you need to give a bit more background on the exact goal you're targeting; I have the feeling this could be an X/Y problem. And since it's a totally different issue, it should go into a new question anyway, with all the required details.
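That said, purely to illustrate the guard the question describes: a common pattern is for the script to check for a marker file and exit early when it exists. A minimal sketch of setup_env.sh, where the marker path /scripts/.setup_done is an assumption:

#!/bin/bash
# skip everything if a previous run already completed
if [ -f /scripts/.setup_done ]; then
    exit 0
fi

# ... the actual setup work goes here ...

# record that setup ran, so later invocations exit early
touch /scripts/.setup_done

Note that this marker only survives restarts of the same container; a fresh container created from the image starts clean unless /scripts sits on a volume.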
I have two issues I need help with involving bash, Linux, and s3cmd.
First, I'm running into a Linux permission issue. I am trying to download archive files from an S3 bucket using s3cmd, with the following command in a bash script (script.sh):
/usr/bin/s3cmd get s3://<bucketname>/<folder>/zipfilename.tar.gz
I am seeing the following error: permission denied.
If I run the same command manually from the command line on the Linux machine, it works and downloads the file:
sudo /usr/bin/s3cmd get s3://<bucketname>/<folder>/zipfilename.tar.gz
I really don't want to put sudo in front of the command in the script. How do I get this command to work? Do I need to chown script.sh (which sits at /foldername/script.sh), or is there some other way to make this get command work?
Second: once I get this command to work, how do I get it to download from S3 to the Linux home directory (~/)? Do I have to issue cd ~/ in the bash script before the download command?
I really appreciate any help and guidance.
First, determine what's failing and why; otherwise you won't find the answer.
You can specify the destination explicitly to avoid permission problems when the script is invoked from a directory that isn't writable by that process:
/usr/bin/s3cmd get s3://<bucketname>/<folder>/zipfilename.tar.gz /path/to/writeable/destination/zipfilename.tar.gz
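For the second part of the question, the same idea covers downloading to the home directory: pass the destination explicitly instead of cd-ing first, for example:

/usr/bin/s3cmd get s3://<bucketname>/<folder>/zipfilename.tar.gz "$HOME/zipfilename.tar.gz"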
First of all, ask one question at a time.
For the first one, you can simply change the file's owner with chown:
chown "usertorunscript" filename
For the second:
If it is a user's home directory, you can specify it with
~user
as you said, but I think writing out the full directory path is safer, so it will work for more users (if you need that).
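For example, with a hypothetical user deploy whose home directory is /home/deploy, these two invocations download to the same place:

/usr/bin/s3cmd get s3://<bucketname>/<folder>/zipfilename.tar.gz ~deploy/
/usr/bin/s3cmd get s3://<bucketname>/<folder>/zipfilename.tar.gz /home/deploy/zipfilename.tar.gz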
zip --symlinks -q -r /cygdrive/d/folderA/folderA.zip folderA/
This is how I am using the zip command in a bash script, but after running the script, the resulting zip contains only one folder; it is not picking up all the data during the zip operation.
When run directly from the command prompt, it works as expected.
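Since the script itself isn't shown, this is only a guess, but a frequent cause of this symptom is that the script runs with a different working directory than your interactive shell, so the relative folderA/ resolves somewhere else. Pinning the working directory makes the script match the interactive case; a sketch, assuming folderA lives under /cygdrive/d:

#!/bin/bash
# run from the parent of folderA so the relative path matches the interactive case
cd /cygdrive/d || exit 1
zip --symlinks -q -r /cygdrive/d/folderA/folderA.zip folderA/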
So I'm trying to write a script that will let me run a command to initialize some things. To be more specific, let's say I start in my home directory, but to run the command I want, I must be in a directory three folders deep inside my home directory.
My script looks generically like this.
#!/bin/sh
cd home/path/to/final/directory/
command
Now usually, when I cd to this directory I can run the command on the command line and everything works fine.
When I tried to use a script to do it, the shell throws an error saying the command isn't recognized, as if it doesn't know where to look.
My temporary fix was to make a symbolic link to the directory I wanted, but I was hoping for a solution so that when I SSH into this node the script runs immediately, without my having to go into the deep directory, run the command, and leave again.
Try defining full paths, for example:
#!/bin/sh
cd $HOME/path/to/final/directory && /path/to/your/command
In this case, it will try to cd into the defined directory, but if it can't find the directory it will not run the command; this is because of the &&.
To test before running the command, you could do an ls first, for example:
cd $HOME/path/to/final/directory && ls
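The usual reason a command works interactively but not from a script is PATH: a non-interactive /bin/sh does not read your ~/.bashrc or ~/.bash_profile, so any PATH additions made there are missing when the script runs. A sketch, where the install location /opt/tool/bin is an assumption and command stands in for the real command name:

#!/bin/sh
# extend PATH explicitly; non-interactive sh ignores the login shell's dotfiles
PATH="/opt/tool/bin:$PATH"
export PATH
cd "$HOME/path/to/final/directory" && command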
On OS X, I'm writing a bash script to be run as a user, not root. Every so often I need to escalate to root to run certain commands.
In one section I'm trying to iterate over the contents of a directory owned by root with the permissions drwx------, which means I can't glob the contents of the directory as a normal user.
This doesn't work:
sudo for files in "/System/Library/User Template"/*
do
some command "$files"
done
What I would like to do is this:
for files in "/System/Library/User Template"/*
do
sudo some command "$files"
done
This is a new system bootstrap script, so I'd like to keep everything in one script and I definitely cannot run the whole thing as root. I'm wondering:
Whether there's a proper way to escalate in order to glob a directory in a for loop.
Whether I should change the permissions as root, run the code, then change the permissions back.
One option is to move this section into a separate script that is meant to run as root, and invoke that script with sudo.
Alternatively, you can run a root bash that reads its commands from standard input via a here-document:
sudo bash -s <<'EOD'
for files in "/System/Library/User Template"/*; do
some command "$files"
done
EOD
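An alternative that avoids a root subshell entirely is to let a single sudo'd find do the globbing and run the command on each entry; a sketch, with somecommand standing in for your real command:

sudo find "/System/Library/User Template" -mindepth 1 -maxdepth 1 -exec somecommand {} \;

Here find runs as root, so it can both read the directory and execute the command on each item it finds.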