I want to check some files on a remote website.
Here is a bash command that generates the commands to calculate each file's MD5 sum:
[root]# head -n 3 zrcpathAll | awk '{print $3}' | xargs -I {} echo wget -q -O - -i {}e \| md5sum\;
wget -q -O - -i https://example.com/zrc/3d2f0e76e04444f4ec456ef9f11289ec.zrce | md5sum;
wget -q -O - -i https://example.com/zrc/e1bd7171263adb95fb6f732864ceb556.zrce | md5sum;
wget -q -O - -i https://example.com/zrc/5300b80d194f677226c4dc6e17ba3b85.zrce | md5sum;
Then I pipe the generated commands to bash, but only the first command is executed.
[root]# head -n 3 zrcpathAll | awk '{print $3}' | xargs -I {} echo wget -q -O - -i {}e \| md5sum\; | bash -v
wget -q -O - -i https://example.com/zrc/3d2f0e76e04444f4ec456ef9f11289ec.zrce | md5sum;
3d2f0e76e04444f4ec456ef9f11289ec -
[root]#
Try the following instead:
while read -r _ _ url _; do
    wget -q -O - "$url"e | md5sum
done < <(head -n 3 zrcpathAll)
Note that we should not put -i in front of "$url" here.
[Explanation of the -i option]
The wget manpage says:
-i file
--input-file=file
Read URLs from a local or external file. [snip]
If this function is used, no URLs need be present on the command line. [snip]
If the file is an external one, the document will be automatically treated as html if the Content-Type matches text/html.
Furthermore, the file's location will be implicitly used as base href if none was specified.
where the file contains lines of URLs such as:
https://example.com/zrc/3d2f0e76e04444f4ec456ef9f11289ec.zrce
https://example.com/zrc/e1bd7171263adb95fb6f732864ceb556.zrce
https://example.com/zrc/5300b80d194f677226c4dc6e17ba3b85.zrce
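To illustrate, a minimal sketch of the intended usage, with the list saved to a local file (note that this pipes all three downloads through a single md5sum, so it is not a substitute for the per-URL loop above):
head -n 3 zrcpathAll | awk '{print $3 "e"}' > urls.txt
wget -q -O - -i urls.txt | md5sum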
Whereas if we use the option as -i url, wget first downloads the URL as a file which it expects to contain lines of URLs as above. In our case, the URL is itself the target to download, not a list of URLs, so wget reports an error: No URLs found in url.
Even granting that wget fails, why does the command output just one line, not three lines of md5sum results?
This seems to be because the head command immediately flushes the remaining
lines when the piped subprocess fails.
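For reference, here is a variant of the loop above that labels each hash with its URL (a sketch, assuming the same column layout of zrcpathAll):
while read -r _ _ url _; do
    printf '%s  ' "$url"
    wget -q -O - "$url"e | md5sum
done < <(head -n 3 zrcpathAll)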
Say I have a file at the URL http://mywebsite.example/myscript.txt that contains a script:
#!/bin/bash
echo "Hello, world!"
read -p "What is your name? " name
echo "Hello, ${name}!"
And I'd like to run this script without first saving it to a file. How do I do this?
Now, I've seen the syntax:
bash < <(curl -s http://mywebsite.example/myscript.txt)
But this doesn't seem to work as it would if I saved it to a file and then executed it. For example, readline doesn't work, and the output is just:
$ bash < <(curl -s http://mywebsite.example/myscript.txt)
Hello, world!
Similarly, I've tried:
curl -s http://mywebsite.example/myscript.txt | bash -s --
With the same results.
Originally I had a solution like:
timestamp=`date +%Y%m%d%H%M%S`
curl -s http://mywebsite.example/myscript.txt -o /tmp/.myscript.${timestamp}.tmp
bash /tmp/.myscript.${timestamp}.tmp
rm -f /tmp/.myscript.${timestamp}.tmp
But this seems sloppy, and I'd like a more elegant solution.
I'm aware of the security issues regarding running a shell script from a URL, but let's ignore all of that for right now.
source <(curl -s http://mywebsite.example/myscript.txt)
ought to do it. Alternately, leave off the initial redirection on yours, which is redirecting standard input; bash takes a filename to execute just fine without redirection, and <(command) syntax provides a path.
bash <(curl -s http://mywebsite.example/myscript.txt)
It may be clearer if you look at the output of echo <(cat /dev/null)
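On a typical Linux system that prints the path process substitution supplies (the descriptor number may vary):
$ echo <(cat /dev/null)
/dev/fd/63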
This is the way to execute a remote script, passing it some arguments (arg1 arg2):
curl -s http://server/path/script.sh | bash /dev/stdin arg1 arg2
For bash, Bourne shell and fish:
curl -s http://server/path/script.sh | bash -s arg1 arg2
Flag "-s" makes shell read from stdin.
Use:
curl -s -L URL_TO_SCRIPT_HERE | bash
For example:
curl -s -L http://bitly/10hA8iC | bash
Using wget, which is usually part of a default system installation:
bash <(wget -qO- http://mywebsite.example/myscript.txt)
You can also do this:
wget -O - https://raw.github.com/luismartingil/commands/master/101_remote2local_wireshark.sh | bash
The best way to do it is
curl http://domain/path/to/script.sh | bash -s arg1 arg2
which is a slight change of the answer by @user77115.
You can use curl and send it to bash like this:
bash <(curl -s http://mywebsite.example/myscript.txt)
I often find that the following is enough:
curl -s http://mywebsite.example/myscript.txt | sh
But on an old system (kernel 2.4) it ran into problems; I tried many alternatives, and only the following works:
curl -s http://mywebsite.example/myscript.txt -o a.sh && sh a.sh && rm -f a.sh
Examples
$ curl -s someurl | sh
Starting to insert crontab
sh: _name}.sh: command not found
sh: line 208: syntax error near unexpected token `then'
sh: line 208: ` -eq 0 ]]; then'
$
The problem may be caused by a slow network, or by a shell version too old to handle a slow network gracefully.
However, the following solves the problem:
$ curl -s someurl -o a.sh && sh a.sh && rm -f a.sh
Starting to insert crontab
Insert crontab entry is ok.
Insert crontab is done.
okay
$
Also:
curl -sL https://.... | sudo bash -
Just combining amra's and user77115's answers:
wget -qO- https://raw.githubusercontent.com/lingtalfi/TheScientist/master/_bb_autoload/bbstart.sh | bash -s -- -v -v
It executes the remote bbstart.sh script, passing it the -v -v options.
In some unattended scripts I use the following command:
sh -c "$(curl -fsSL <URL>)"
I recommend avoiding the execution of scripts directly from URLs. You should be sure the URL is safe, and you should check the content of the script before executing; you can use a SHA256 checksum to validate the file first.
Instead of executing the script directly, download it first and then execute it:
SOURCE='https://gist.githubusercontent.com/cci-emciftci/123123/raw/123123/sample.sh'
curl $SOURCE -o ./my_sample.sh
chmod +x my_sample.sh
./my_sample.sh
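Building on that, here is a minimal sketch of the SHA256 validation mentioned above (EXPECTED_SHA256 is a placeholder for a checksum you have pinned in advance):
SOURCE='https://gist.githubusercontent.com/cci-emciftci/123123/raw/123123/sample.sh'
EXPECTED_SHA256='<known-good-checksum>'
curl -s $SOURCE -o ./my_sample.sh
echo "${EXPECTED_SHA256}  ./my_sample.sh" | sha256sum -c - && sh ./my_sample.sh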
This way is good and conventional:
17:04:59#itqx|~
qx>source <(curl -Ls http://192.168.80.154/cent74/just4Test) Lord Jesus Loves YOU
Remote script test...
Param size: 4
---------
17:19:31#node7|/var/www/html/cent74
arch>cat just4Test
echo Remote script test...
echo Param size: $#
If you want the script run using the current shell, regardless of what it is, use:
${SHELL:-sh} -c "$(wget -qO - http://mywebsite.example/myscript.txt)"
if you have wget, or:
${SHELL:-sh} -c "$(curl -Ls http://mywebsite.example/myscript.txt)"
if you have curl.
This command will still work if the script is interactive, i.e., it asks the user for input: because the script text is passed as an argument to -c rather than on standard input, stdin stays attached to the terminal and read keeps working.
Note: OpenWRT has a wget clone but not curl, by default.
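To see the difference (a sketch reusing the example URL from the question):
# interactive prompts work: the script text is a -c argument
${SHELL:-sh} -c "$(curl -Ls http://mywebsite.example/myscript.txt)"
# interactive prompts break: the script itself occupies stdin
curl -Ls http://mywebsite.example/myscript.txt | ${SHELL:-sh}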
curl -s http://your.url.here/script.txt | bash
actual example:
juan@juan-MS-7808:~$ curl -s https://raw.githubusercontent.com/JPHACKER2k18/markwe/master/testapp.sh | bash
Oh, wow im alive
juan@juan-MS-7808:~$
Within a gitlab-ci job, I want to create a directory (on a remote server) with the job id and perform some actions more or less as follows:
- ssh user@server mkdir -p /user/$CI_JOB_ID
- ssh user@server <perform_some_action_that_creates_several_zip_files>
- LAST_MODIFIED_FILE=$(ssh user@server bash -c 'find /user/$CI_JOB_ID -iname "*.zip" | tail -n 1 | xargs readlink -f')
The directory does get created and the action that creates several zips works out.
However, the last command that I use for getting the last modified/created .zip does not work, because $CI_JOB_ID does not seem to get expanded.
Any suggestions?
This issue is due to your ssh call. The way you do it now, you are mixing contexts:
ssh user@server bash -c 'find /user/$CI_JOB_ID -iname "*.zip" | tail -n 1 | xargs readlink -f'
bash -c 'my_commands': with single quotes, nothing is expanded locally, so ssh sends the instruction verbatim; the remote shell then looks up the remote value of $CI_JOB_ID instead of using the local one.
ssh user@server bash -c "my_commands": with ssh, the commands to execute on the remote shell are sent as a single string, which the remote shell re-splits. With this quotation it would try to run bash -c find ... | tail ... | xargs ..., where only find is run through bash -c.
In your case, simply writing the following statement directly should do the trick:
ssh user@server "find /user/$CI_JOB_ID -iname '*.zip' | tail -n 1 | xargs readlink -f"
Otherwise, if you want to keep using the bash -c syntax, you'd have to escape the quotes so that they are propagated to the remote machine:
ssh user@server bash -c \"find /user/$CI_JOB_ID -iname '*.zip' | tail -n 1 | xargs readlink -f\"
When I want to run Wireshark locally to display a packet capture running on another machine, this works on bash, using input redirection from the output of a subshell:
wireshark -k -i <(ssh user@machine "sudo dumpcap -P -w - -f '<filter>' -i eth0")
From what I could find, the syntax for similar behavior in the fish shell is the same, but when I run that command in fish, I get the Wireshark output on the terminal and never see the Wireshark window.
Is there something I'm missing?
What you're using there in bash is process substitution (the <() syntax). It is a bash specific syntax (although zsh adopted this same syntax along with its own =()).
fish does have process substitution under a different syntax ((process | psub)). For example:
wireshark -k -i (ssh user@machine "sudo dumpcap -P -w - -f '<filter>' -i eth0" | psub)
bash        | equivalent in fish
----------- | ------------------
cat <(ls)   | cat (ls|psub)
ls > >(cat) | N/A (need to find a way to use a pipe, e.g. ls|cat)
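For a quick local sanity check of psub (a minimal sketch, run inside fish):
cat (printf 'hello\n' | psub)    # prints: hello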
The fish equivalent of <() isn't well suited to this use case. Is there some reason you can't use this simpler and more portable formulation?
ssh user@machine "sudo dumpcap -P -w - -f '<filter>' -i eth0" | wireshark -k -i -
For example, I grep these lines from a URL:
wget -qO- http://superman.com/installer/ | grep "something"
href="Superman-Linux-20.0.1222.x64.sh"
href="Superman-Linux-10.0.1222.x64.sh/*fingerprint*/"
href="Superman-Linux-10.0.1222.x64.sh/*view*/"
I want to grep only
Superman-Linux-20.0.1222.x64.sh
Then I need to wget that:
wget http://superman.com/installer/Superman-Linux-20.0.1222.x64.sh
Instead, grep something like this: egrep -o "Superman-Linux.*\.sh"
All in one line, do this:
wget http://superman.com/installer/$(wget -qO- http://superman.com/installer/ | egrep -o "Superman-Linux.*\.sh")
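Note that the greedy .* also matches the /*fingerprint*/ and /*view*/ lines (each of those also contains .x64.sh), so the command substitution can expand to several names. A tighter sketch that anchors on the closing quote and keeps only the first match:
url=$(wget -qO- http://superman.com/installer/ | grep -o 'Superman-Linux-[0-9.]*x64\.sh"' | tr -d '"' | head -n 1)
wget "http://superman.com/installer/$url"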