I'm using Bash on Windows, and what I'm missing is a good IDE. I've tried running a GUI app by following a tutorial, but it doesn't work reliably and it's frustrating.
What I want is a script that copies files from a folder on Windows into a folder on the Unix subsystem, but only the files that have changed, and the same in the other direction (if I change something from the terminal, it should be updated in the Windows folder). I want that script to run every time I call ./SOME_EXECUTABLE in that folder. To check whether a file has changed or not, I can use hg status, because I mostly work with Mercurial.
Is there a way to do this without writing a separate shell script that combines those calls? Something like a macro.
You could use a function in .bashrc to achieve this, and have it run the script that copies the files between the two folders. Assuming you have that script in place, let's say copyScript.sh, you can add a function like:
function copyOnExecute() {
    ./copyScript.sh
    ./EXECUTABLE
}
This way you can call copyOnExecute every time you want to run your executable.
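If you don't have copyScript.sh yet, a minimal sketch of a two-way "copy only what changed" sync could use rsync with --update; both paths below are hypothetical, and the Windows side assumes the drive is mounted under /mnt/c as in WSL:

#!/bin/bash
# Hypothetical locations; adjust to your own project folders.
WIN_DIR="/mnt/c/Users/you/project"
UNIX_DIR="$HOME/project"

# -a preserves attributes, -u (--update) skips files that are newer
# on the receiving side, so each direction only copies files that
# actually changed there.
rsync -au "$WIN_DIR/" "$UNIX_DIR/"
rsync -au "$UNIX_DIR/" "$WIN_DIR/"

After adding the function to ~/.bashrc, reload it with source ~/.bashrc and type copyOnExecute in the project folder instead of ./SOME_EXECUTABLE.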
You could add an alias to .bash_aliases, such as:

alias execute="./copyScript.sh && ./SOME_EXECUTABLE"

Note that a plain Bash alias cannot take arguments; $1 is not expanded inside an alias, so the executable's name has to be fixed in the alias itself. Replace ./ with your script's path as needed. This runs the executable after your script has finished, and only if the script finished successfully.
It is also easier than writing a function in one of the rc files: if you wrote a function called execute, you might later forget where it was defined, whereas an alias helps you avoid this.
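After editing .bash_aliases, the alias is available in new shells, or immediately after reloading the file:

source ~/.bash_aliases
execute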
I have the following command in a file called $stat_val_result_command. I want to add the -Xms1g parameter at the end of the file, so that it looks like this:
<my command in the file> -Xms1g
However, I want to run this command after appending to it. I am running this in a workflow system called "nextflow". I tried many things, including the following, but it does not work. Check the script section, which runs in Bash by default:
process statisticalValidation {

    input:
    file stat_val_result_command from validation_results_command.flatten()

    output:
    file "*_${params.ticket}_statistical_validation.txt" into validation_results

    script:
    """
    echo " -Xms1g" >> $stat_val_result_command && `cat $stat_val_result_command`
    """
}
It's best to avoid appending to or otherwise manipulating input files localized in the workdir, as these can be, and by default are, symbolic links to the original files.
In your case, consider instead exporting the JAVA_TOOL_OPTIONS environment variable. This might or might not work for you, but it might give you some ideas if you have control over how the scripts are being generated:
export JAVA_TOOL_OPTIONS="-Xms1g"
bash "${stat_val_result_command}"
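A sketch of what the process might look like with this change, assuming the rest of the block stays as in the question:

process statisticalValidation {

    input:
    file stat_val_result_command from validation_results_command.flatten()

    output:
    file "*_${params.ticket}_statistical_validation.txt" into validation_results

    script:
    """
    export JAVA_TOOL_OPTIONS="-Xms1g"
    bash "${stat_val_result_command}"
    """
}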
Also, it's generally better to avoid localizing and running scripts like this. It might be unavoidable, but usually there are better options. For example, third-party scripts like your Bash script could be handled more simply, as the Nextflow documentation suggests:
Grant the execute permission to these files and copy them into a folder named bin/ in the root directory of your project repository. Nextflow will automatically add this folder to the PATH environment variable, and the scripts will automatically be accessible in your pipeline without the need to specify an absolute path to invoke them.
This of course assumes you can control and parameterize the process that creates your Bash scripts.
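As a sketch, with a hypothetical script name:

# From the root of the project repository:
mkdir -p bin
cp statistical_validation.sh bin/
chmod +x bin/statistical_validation.sh

The process's script block can then invoke statistical_validation.sh by name, without any path.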
I have a shell script and a tarball. The shell script unpacks the tarball and makes use of the files inside it to perform a task. I need to make this accessible on Mac laptops, in such a way that there is either a .app or a .dmg file that, when clicked, ultimately calls that shell script. I found several utilities that can do this (create such a .app file), such as Platypus or Appify. However, these require a Mac to build the file. The thing is, I must package the .app/.dmg file in an Ubuntu environment.
Is there any good software for creating a .dmg or .app file that calls a shell script when clicked, where that software can be run on Ubuntu (just for the purpose of creating the file)?
This is not an exact answer to your question, but a workaround that might be acceptable if you can't find a better solution.
First, an archive will automatically extract its contents if you double-click it in OS X, so

tar -cvzf your_filename.tar.gz ...

would create a file that can be easily extracted. (Note the .tar.gz extension: tar does not produce zip files, but Finder unpacks both zip archives and gzipped tarballs on double-click.)
Secondly, if you create a shell script with the extension .command (but otherwise like any shell script), it can be run from OS X by double-clicking on it, which opens a terminal and executes it there. That means an extra manual step for the user, but like I said, this is a workaround :)
If you create a .command file, remember to make it executable.
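Put together, a minimal sketch of such a launcher might look like this (the tarball and entry-point names are hypothetical):

#!/bin/bash
# Run from the directory this .command file lives in,
# since Finder starts it in the user's home directory.
cd "$(dirname "$0")" || exit 1

# Unpack the bundled tarball and run the task it contains.
tar -xzf payload.tar.gz
./run_task.sh

Saved as, say, install.command and marked executable with chmod +x install.command, it can then be double-clicked in Finder.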
I need to find a solution at work to back up specific folders daily, hopefully to a RAR or ZIP file. If it were on a PC, I would have done it already, but I don't have any idea how to approach it on a Mac.
What I basically want to achieve is an automated task that can be run with an executable, and that does the following:
Compress a specific directory (/Volumes/Audio/Shoko) to a rar or zip file
(in the zip file, exclude all *.wav files in all subdirectories, and a directory named "Videos").
Move it to a network share (/Volumes/Post Shared/Backup From Sound)
(or compress directly into that folder).
Automate the file name of the zip file with a dynamic date and time (so there are no duplicate file names).
Shut down the Mac when finished.
I want to say again: I don't usually use a Mac, so things like what kind of file to use for the script are not yet trivial for me. I have tried putting Mark's bash lines (from the first answer, below) in a txt file and executing it, but it had errors and didn't work. I also tried to use Automator, but it's too plain, with no advanced options.
How can I accomplish this?
I would love a working example :)
You can just make a bash script that does the backup, and then you can either double-click it or run it on a schedule. I don't know your paths and/or tools of choice, but something along these lines:
#!/bin/bash
FILENAME=$(date +"/Volumes/path/to/network/share/Backup/%Y-%m-%d.tgz")
cd /directory/to/backup || exit 1
tar -cvzf "$FILENAME" .
You can save that on your Desktop as backup and then go into Terminal and type:
chmod +x ~/Desktop/backup
to make it executable. Then you can just double-click on it, obviously after changing the paths to reflect what you want to back up and where to.
Also, you may prefer to use some other tools, such as rsync, but the method is the same.
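To cover the specific requirements in the question (excluding *.wav files and the Videos directory, a time-stamped name, and shutting down afterwards), a sketch along these lines might work; the paths are taken from the question, and the --exclude patterns assume the BSD tar that ships with macOS:

#!/bin/bash
# Time-stamped archive name (date and time, so repeated runs never collide).
FILENAME=$(date +"/Volumes/Post Shared/Backup From Sound/Shoko_%Y-%m-%d_%H-%M-%S.tgz")

cd "/Volumes/Audio/Shoko" || exit 1

# Archive everything except .wav files and the top-level Videos directory.
tar --exclude='*.wav' --exclude='./Videos' -cvzf "$FILENAME" .

# Shut the machine down once the backup succeeds; this prompts for an
# admin password unless sudo is configured to allow it without one.
sudo shutdown -h now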
I've always mentally regarded the current directory as something for users, not scripts, since it is dependent on the user's location and can be different each time the script is executed.
So when I came across the Java jar utility's -C option I was a little puzzled.
For those who don't know, the -C option is used before specifying a file/folder to include in a jar. Since the path to the file/folder is replicated in the jar, the -C option changes directory before including the file.
In other words (using app.jar as a placeholder output name):
jar cf app.jar -C flower lily.class

will make a jar containing the lily.class file at its top level, whereas:

jar cf app.jar flower/lily.class

will make a flower folder in the jar which contains lily.class.
For a jar-ing script I'm making, I want to use Bourne shell wildcards (folder/*), but that would make using -C impossible, since it only applies to the next immediate argument.
So the only way to use wildcards is to run from the current directory; but I still feel uneasy about changing and using the current directory in a script.
Is there any downside to using the current directory in scripts? Is it frowned upon for some reason perhaps?
I don't think there's anything inherently wrong with changing the current directory from a shell script. Certainly it won't cause anything bad to happen, if taken by itself.
In fact, I have a standard script that I use for starting up a Java-based server, and the very first line is:
cd "$(dirname "$0")"
This ensures that the rest of the commands in the script are executed in the directory that contains the script file itself (useful when a single machine is hosting multiple server instances), regardless of where the shell script was actually invoked from. Without changing the current directory in the script, it would only work correctly if the user remembered to manually cd into the corresponding directory before running the script.
In this case, performing the cd operation from within the script removes a manual step from the server startup/shutdown process, and makes things slightly less error-prone as a result.
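If, like the asker, you're uneasy about letting a cd affect the rest of a script, running it in a subshell confines the directory change, and it also sidesteps the wildcard-vs--C problem (the directory and jar names here are hypothetical):

# The parentheses run cd in a subshell, so the script's own
# working directory is untouched afterwards.
( cd flower && jar cf ../app.jar *.class )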
So as with most things, there are legitimate uses for this sort of thing, and I'm sure there are some questionable ones as well. It really depends upon what's most appropriate for your specific use case. Which is something I can't really comment on... I always just let Maven build my JARs for me.
I have seen that I can use the chmod +x command on a Mac to make a shell script executable. This works fine, but I have noticed that I have to do the same thing every time the shell script file is copied to another Mac computer.
Is there a way to make the shell script executable by default when double-clicked, without such a command? The shell script file will be given to many users, and doing this will be hard for some of them.
If you pack your whole program in a .tar file (or in a .tar.gz file, which is the same but compressed), the executable permission will be preserved.
Give it the '.command' extension and it can be executed from the Finder.
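A minimal sketch of the round trip, with a hypothetical script name:

# On the packaging machine: mark the script executable, then archive it.
chmod +x myscript.command
tar -czf package.tar.gz myscript.command

# On the receiving Mac: unpacking restores the execute bit,
# so the script can be double-clicked in Finder right away.
tar -xzf package.tar.gz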