changing directories by using cd - shell

I executed the following command:
cd /mnt/c/Users/Daniel/Documents/Assg/ | cat file.txt
My question is: why doesn't it change the directory? The contents of file.txt are displayed, but the directory is not changed. I understand that if we execute the same command in the order shown below, it won't work, because cd changes the directory in a child process, so the net result is the same.
cat file.txt | cd /mnt/c/Users/Daniel/Documents/Assg/

Try just cd /mnt/c/Users/Daniel/Documents/Assg/

As was already stated, the following:
cd /mnt/c/Users/Daniel/Documents/Assg/
should do the trick, but I'd like to go a bit more into why the command you presented doesn't work as expected. In Bash (and other shells), you can have multiple "subshells" running under a parent shell. Each of these subshells has its own working directory. When you run commands in a pipeline, as you have done, a subshell is created. The working directory of that subshell was changed, but that had no effect on the shell you were working in.
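You can see this for yourself; the cd on the left of the pipe runs in a subshell, so the parent shell's working directory is untouched (the paths shown are just illustrative):
$ pwd
/home/daniel
$ cd /tmp | cat
$ pwd
/home/daniel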

It depends on the shell you use
When you run two commands in a pipeline, typically one or both of them are run in a separate child process.
In older shells it was both; in later shells it can be either the first or the last.
At one point, the ksh93 team decided to make the last command in the pipeline the parent. This prevents race conditions, and if the command is a builtin, it allows it to run inside the current shell
process and preserve the results of the pipeline.
Nevertheless, cd is a command that does not consume or produce any input or output (except for diagnostics on stderr), and using it in a pipeline
by itself is just silly. A better, more predictable command line would be:
cd /mnt/c/Users/Daniel/Documents/Assg/ && cat file.txt
This ensures that cat runs only if cd succeeds, and then
shows the contents of file.txt from the given directory.
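As an illustration of the "last command runs in the current shell" behaviour, bash offers an opt-in: in a script, where job control is off, shopt -s lastpipe makes the last element of a pipeline run in the current shell process:
#!/bin/bash
shopt -s lastpipe
true | cd /tmp    # with lastpipe, the cd builtin runs in this shell process
pwd               # prints /tmp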

You have different options.
Perform cat after trying to change dir
cd /mnt/c/Users/Daniel/Documents/Assg/ ; cat file.txt
Perform cat only when change dir worked
cd /mnt/c/Users/Daniel/Documents/Assg/ && cat file.txt
Perform cat in the other directory, but return to the current dir when finished.
(cd /mnt/c/Users/Daniel/Documents/Assg/ && cat file.txt)
# or
cat /mnt/c/Users/Daniel/Documents/Assg/file.txt
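To convince yourself that the parenthesized form leaves your shell where it started (the starting directory is just illustrative):
$ pwd
/home/daniel
$ (cd /mnt/c/Users/Daniel/Documents/Assg/ && pwd)
/mnt/c/Users/Daniel/Documents/Assg
$ pwd
/home/daniel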
EDIT:
Your question, "why doesn't cd /mnt/c/Users/Daniel/Documents/Assg/ | cat file.txt change the directory?", can be answered in two ways.
The technical explanation is given by @Henk (the pipe introduces a subshell, and environment settings made in a subshell are lost when the subshell exits).
The functional explanation is that you used the wrong syntax for what you are trying to accomplish.

Related

Fish shell - advanced control flow

Normally, fish shell processes commands like this:
1 && 3 (2)
This is perfectly useful and it mirrors the order of execution that I would want most of the time.
I was wondering if a different syntax exists to get a slightly different order of execution?
Sometimes I want this:
2 && 3 (1)
Is that possible without using multiple lines?
This is a trivial example:
cd ~ && cat (pwd | psub)
In this example I want to run pwd first, then cd, and then cat.
Edit: oh! This seems to work:
cat (pwd | psub && cd ~)
This is one of those cases where I'm going to recommend just using multiple lines [0].
It's cleaner and clearer:
set -l dir (pwd)
cd ~ && cat (printf '%s\n' $dir | psub)
This is completely ordinary and straightforward, and that's a good thing. It's also easily extensible - want to run the cd only if the pwd succeeded?
set -l dir (pwd)
and cd ~ && cat (printf '%s\n' $dir | psub)
This works because set passes on the previous $status, so here it passes on the status of pwd.
The underlying philosophy here is that fish script isn't built for code golf. It doesn't have many shortcuts, even ones that posix shell script or especially shells like bash and zsh have. The fish way is to simply write the code.
Your answer of
cat (pwd | psub && cd ~)
doesn't work because that way the cat is no longer executed only if the cd succeeds - command substitutions can fail. Instead, the cd is now only done if the psub succeeded - notably, this also happens if pwd fails.
(of course that cat (pwd | psub) is fairly meaningless and could just be pwd, I'm assuming you have some actual code you want to run like this)
[0]: Technically this doesn't have to be multiple lines; you can write it as set -l dir (pwd); cd ~ && cat (printf '%s\n' $dir | psub). I would, however, still recommend using multiple lines.

gnu parallel to parallelize a for loop

I have seen several questions about this topic, but I lack the ability to translate them to my specific problem. I have a for loop that iterates through subdirectories and then executes a .sh script on a compressed text file inside each directory. I want to parallelize this process, but I'm struggling to apply GNU parallel.
Here is my loop:
for d in ./*/ ; do (cd "$d" && script.sh); done
I understand I need to input a list into parallel, so I have been trying this:
ls -d */ | parallel cd && script.sh
While this appears to get started, I get an error when gzip tries to unzip one of the txt files inside the directory, saying the file does not exist:
gzip: *.txt.gz: No such file or directory
However, when I run the original for loop, I have no issues aside from it taking a century to finish. Also, I only get the gzip error once when using parallel, which is so weird considering I have over 1000 sub-directories.
My questions are:
How do I get parallel to work in my case? How do I get it to parallelize the application of a .sh script to thousands of files in their own subdirectories, i.e. what is the solution to my problem? I need to make progress.
What am I missing? Syntax, loop, bad script? I want to learn.
Is parallel actually attempting to run all these .sh scripts in parallel? Why don't I get an error for every .txt.gz file?
Is parallel the best option for the application? Is there another option that is better suited to my needs?
Two problems:
In:
ls -d */ | parallel cd && script.sh
what is parallelized is just cd, not script.sh. script.sh is executed only once, after all the parallel cd jobs have run, and only if there was no error. It is the same as:
ls -d */ | parallel cd
if [ $? -eq 0 ]; then script.sh; fi
You do not pass the target directory to cd. So what parallel executes is a bare cd, which simply changes the current directory to your home directory. The final script.sh is executed in the current directory (the one from which you invoked the command), where there are probably no *.txt.gz files, hence the error.
You can check yourself the effect of the first problem with:
$ mkdir /tmp/foobar && cd /tmp/foobar && mkdir a b c
$ ls -d */ | parallel cd && pwd
/tmp/foobar
The output of pwd is printed only once, even if you have more than one input directory. You can fix it by quoting the command and then check the second problem with:
$ ls -d */ | parallel 'cd && pwd'
/homes/myself
/homes/myself
/homes/myself
You should see as many pwd outputs as there are input directories but it is always the same output: your home directory. You can fix the second problem by using the {} replacement string that is substituted with the current input. Check it with:
$ ls -d */ | parallel 'cd {} && pwd'
/tmp/foobar/a
/tmp/foobar/b
/tmp/foobar/c
Now, you should have all input directories properly listed in the output.
For your specific problem this should work:
ls -d */ | parallel 'cd {} && script.sh'
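If you prefer not to parse ls, the same command can also be written with parallel's ::: argument syntax, and --dry-run lets you preview the generated commands before running them:
parallel --dry-run 'cd {} && script.sh' ::: */
parallel 'cd {} && script.sh' ::: */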

xargs Behavior : Semantics or Bug?

Wanting to push a number of directories onto the stack, I ran:
echo ~/{Desktop,Downloads,Movies} | xargs pushd
and encountered xargs: pushd: No such file or directory
Brace expansion is not the cause of the mismatch between what I have in mind and what happens, because echo ~/Desktop | xargs pushd results in the same error.
As a point of comparison, echo ~/Desktop | xargs cd changes directory as one would expect.
What's going on here?
It's semantics; the equivalent statement should be:
pushd $(echo ~/{Desktop,Downloads,Movies})
From my experiments, the behavior of a builtin command is like this:
#!/bin/sh
pushd() {
    # accepts its arguments from $1, $2, $3, ...
    # a builtin will not read from stdin, so you can't feed it through a pipe
    :
}
A builtin command should be viewed as a shell function.
[Edit]
The 'pushd' command in zsh is implemented together with 'cd'; it only accepts one argument.
So you can't push a number of directories in a single statement.
The source is there.
Are you sure xargs cd does what you expect? I'd be surprised! xargs will call a binary, but pushd is not one - run type pushd if you want confirmation. cd and pushd don't make much sense as external binaries.
You'll need to capture the directories in a variable and call pushd in a for loop in the shell process itself, rather than from xargs. xargs is a child process of the shell, so any directory state modified by xargs or its children won't propagate up to the parent shell.
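For example, a minimal sketch (the array name is arbitrary):
dirs_to_push=( ~/{Desktop,Downloads,Movies} )
for d in "${dirs_to_push[@]}"; do
    pushd "$d"
done
dirs -v    # show the resulting directory stack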

Shell/Bash - pipe output into another script's input via a variable

Normally I would break things into separate actions and copy and paste the output into another input:
$ which git
/usr/local/bin/git
$ sudo mv git-credential-osxkeychain /usr/local/bin/git
Any quick hack to get output into input?
something like:
$echo which wget | sudo mv git-credential-osxkeychain
set -vx
myGit=$(which git)
gitDir=${myGit%/git} ; gitDir=${gitDir%/bin}/git
echo sudo mv git-credential-osxkeychain ${gitDir}
Remove the set -vx and the echo on the last line when you're sure this performs the action that you require.
It's probably possible to reduce the number of keystrokes required, but I think this version makes it easier to understand which techniques are being used and how they work.
IHTH
Use command substitution with $(command):
sudo mv git-credential-osxkeychain $(which git)
This replaces the command with its output. You can read all about it at http://tldp.org/LDP/abs/html/commandsub.html
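For example, with the path from the question:
$ echo "git is at $(which git)"
git is at /usr/local/bin/git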
The answer would be what Chirlo and shellter said.
The reason $echo which wget | sudo mv git-credential-osxkeychain wouldn't work is that a pipe redirects the stdout of the previous command to the stdin of the next command, and in this case mv doesn't take input from stdin.
A curious thing is that which git returns
/usr/local/bin/git
but you are moving git-credential-osxkeychain to
/usr/local/git/bin/
Those two don't match. Is there a typo or something?
If you want to use the pipe syntax, then you should look at xargs.
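For instance, a sketch (dirname is used so the target is the directory containing the git binary rather than the binary itself):
dirname "$(which git)" | xargs -I{} sudo mv git-credential-osxkeychain {}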

How to get parent folder of executing script in zsh?

In bash I get the executing script's parent folder name with this line:
SCRIPT_PARENT=`readlink -f ${BASH_SOURCE%/*}/..`
Is there a way to achieve this that works in both zsh and bash?
Assume I have a file /some/folder/rootfolder/subfolder/script with the contents:
echo `magic-i-am-looking-for`
I want it to behave this way:
$ cd /some/other/folder
$ . /some/folder/rootfolder/subfolder/script
/some/folder/rootfolder
$ . ../../folder/rootfolder/subfolder/script
/some/folder/rootfolder
$ cd /some/folder/rootfolder
$ . subfolder/script
/some/folder/rootfolder
$ cd subfolder
$ . script
/some/folder/rootfolder
This should work in both bash and zsh. My first line implements this behavior, but due to $BASH_SOURCE it does not work in zsh.
So basically the question is:
Is there a way to emulate $BASH_SOURCE in zsh, that also works in bash?
I have now realized that $0 in zsh behaves like $BASH_SOURCE in bash. So using $BASH_SOURCE when available and falling back to $0 solves my problem:
${BASH_SOURCE:-$0}
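Plugged into the original line, that becomes (an intermediate variable is needed, since bash cannot nest the two expansions):
SCRIPT_SOURCE=${BASH_SOURCE:-$0}
SCRIPT_PARENT=$(readlink -f "${SCRIPT_SOURCE%/*}/..")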
There is a little zsh edge case left, when sourcing from $PATH like:
zsh> cat ../script
echo \$0: $0
echo \$BASH_SOURCE: $BASH_SOURCE
echo '${BASH_SOURCE:-$0}:' ${BASH_SOURCE:-$0}
zsh> . script
$0: script
$BASH_SOURCE:
${BASH_SOURCE:-$0}: script
bash> . script
$0: bash
$BASH_SOURCE: /home/me/script
${BASH_SOURCE:-$0}: /home/me/script
I could do a which script, but that would not play nicely with the other cases.
While it would be easy to do this in zsh, it is just as easy to use pure bash syntax that zsh can also evaluate. If you cannot rely on any command that may or may not be on your path, then you can only use parameter expansion to achieve what you want:
SCRIPT_SOURCE=${0%/*}
This is likely to be a relative path. If you really want the full path, then you will have to resort to an external command (you could implement it yourself, but it would be a lot of work just to avoid a widely available command):
SCRIPT_SOURCE=$(/bin/readlink -f "${0%/*}")
This doesn't depend on your $PATH, it just depends on /bin/readlink being present. Which it almost certainly is.
Now, you wanted this to be a sourced file. This is fine, as you can just export any variable you set; however, if you execute the above, then $0 will be the location of the sourced file and not the location of the calling script.
This just means you need to set a variable, known to the sourced script, to hold the caller's $0 value. For example:
The script you will source:
echo ${LOCATION%/*}
The script that sources that script:
LOCATION=$0
<source script here>
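Put together, a minimal sketch (the file names are placeholders):
# sourced.sh -- the file that gets sourced; prints the caller's directory
echo "${LOCATION%/*}"
# caller.sh -- the script that sources it
LOCATION=$0
. ./sourced.sh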
But given that the ${0%/*} expansion is so compact, you could just use that in place of the script.
Since you were able to source the script from your $PATH, I'd do something like this:
SCRIPT_PARENT=$(readlink -f "$(which $0)/..")
Is that your desired output?
