wget prints an error - why? (terminal)

I want to get weather information from weatherstack.com using wget,
but when I run wget on my Mac I get an error:
[1] 7943
zsh: no matches found: http://api.weatherstack.com/current?access_key=ACCESS_KEY
[1]  + exit 1     wget
The command: wget http://api.weatherstack.com/current?access_key=ACCESS_KEY&query=London

This has nothing to do with wget in particular; it is how the shell parses the command line. The unquoted & in the URL is taken as the "run this in the background" operator, so the command is cut off at that point (hence the [1] 7943 job notice). In zsh, the unquoted ? is also treated as a filename wildcard, and because no file matches the pattern you get the "no matches found" error.
To avoid this, put the URL in quotes to "hide" the special characters from the shell. Try
wget "http://api.weatherstack.com/current?access_key=YOUR_KEY&query=London"
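You can see the split without touching the network by substituting echo for wget (a minimal sketch run under bash; zsh would additionally abort on the unquoted ? with "no matches found"):

```shell
# Unquoted: the shell backgrounds everything before '&' and parses
# 'query=London' as a separate statement (a plain variable assignment),
# so the command only ever sees the URL up to the '&'.
unquoted=$(echo http://api.weatherstack.com/current?access_key=KEY & query=London)

# Quoted: the entire URL reaches the command as a single argument.
quoted=$(echo "http://api.weatherstack.com/current?access_key=KEY&query=London")

printf '%s\n' "$unquoted"
printf '%s\n' "$quoted"
```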


spaces,',`,/,\,<,>,?,&,| are filtered - how to bypass them in Bash commands

I have PHP code that runs some bash commands, and it has a bug that allows RCE in bash.
A command like "$(id)" is executed fine,
but if I execute any other command containing a space, like "ls -la",
the space is automatically replaced with "-".
I checked the source and found that the following characters are filtered: spaces,',`,/,\,<,>,?,&,|
How can I bypass the filter and run a command like "wget link" successfully?
****UPDATE****
I've added the following code as a live example;
send the command via the sendcmd function:
`https://pastebin.com/raw/1MfR6aic`
This is (example) output from id
uid=1000(ibug) gid=1000(ibug)
Since $, (, ), : and digits aren't filtered, you can extract an unfiltered space from existing output like this:
ID=$(id)
echo${ID:14:1}foo
Now you have a space. You can obtain virtually any character with echo -e and then eval the resulting expression.
I tried your PHP code and found this working:
sendcmd("http://52.27.167.139", "{echo,hello}");
Just wrap them in braces and use commas. The shell will expand the braces to
echo hello
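Both tricks can be reproduced in a plain bash shell (the id output below is hard-coded to match the example above, so the sketch runs anywhere):

```shell
# Brace expansion turns commas into word breaks before the command
# runs, so no literal space is ever typed.
braced=$( {echo,hello,world} )

# Index 14 of the example id output is the space between the two
# fields; slicing one character out gives an unfiltered space.
ID="uid=1000(ibug) gid=1000(ibug)"   # stands in for ID=$(id)
spaced=$(echo${ID:14:1}hello)

printf '%s\n' "$braced"   # hello world
printf '%s\n' "$spaced"   # hello
```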

cURL Scraping Wrong Webpage

I am attempting to scrape a webpage that requires a login using curl in the Mac Terminal but can't seem to get it right. I have a cookies.txt file with my login info that I am reading into the command, but I can't get it to scrape the intended page. When I run
curl -b /Users/dwm8/Desktop/cookies.txt -o /Users/dwm8/Desktop/file.txt https://kenpom.com/team.php?team=Duke&y=2002
the contents of file.txt are the webpage data from https://kenpom.com/team.php?team=Duke instead of https://kenpom.com/team.php?team=Duke&y=2002. Is there a fix for this? Thanks for the help.
& is a shell metacharacter that separates commands and indicates the command before it should be run in the background. So, your command:
curl ... https://kenpom.com/team.php?team=Duke&y=2002
gets parsed as two separate commands:
curl ... https://kenpom.com/team.php?team=Duke & # The & means run curl in the background
y=2002 # This just sets a shell variable
In order to get the shell to treat & as part of the argument to curl rather than a command separator, you need to quote it (either single- or double-quotes would work) or escape it with a backslash:
curl ... 'https://kenpom.com/team.php?team=Duke&y=2002'
curl ... "https://kenpom.com/team.php?team=Duke&y=2002"
curl ... https://kenpom.com/team.php\?team=Duke\&y=2002
Oh, and notice that I also escaped the ? in that last example? That's because ? is also a shell metacharacter (specifically, a wildcard). In this case it probably wouldn't cause any trouble, but it's safest to quote or escape it just in case. And since it's hard to keep track of exactly which characters can cause trouble, I'll recommend quoting instead of escaping, and just quoting everything that you're at all unsure about.
You need to wrap the URL in quotes.

"Escaping" nightmare

I have been trying to put the following into a bash script:
scp /cygdrive/c/Program\ Files\ \(x86\)/Spiceworks/data/configurations/LA_$(date +%F)-firewall-5520 "sf-mlawler@10.21.1.212:/home/sf-mlawler/SBOX_Automation/SBOX_Dumps/08\ -\ Security/Firewalls"
...which works as expected via command line.
When I try to place it in a script, I assume I'm going to have to double-escape the spaces; anything else? I've been unable to get any of the multitude of variations I've tried to work.
I would think:
scp /cygdrive/c/Program\\ Files\\ \(x86\)/Spiceworks/data/configurations/LA_\$(date\ +%F)-firewall-5520 "sf-mlawler@10.21.1.212:/home/sf-mlawler/SBOX_Automation/SBOX_Dumps/08\\ \\-\\ Security/Firewalls"
...would work, but it doesn't. Any suggestions, before I go grey and bald? Thanks
I won't waste space with all the variations I've tried, but I will say I have tried escaping almost everything, escaping basically nothing, and many combinations in between, with no success. When I get a "No such file or directory" error I escape until the path actually resolves, but even when I don't get a path error, the command doesn't complete successfully.
I understand this is quite a specific case, but I imagine it will help others in the future: escaping spaces in a bash script that uses embedded expect. (I have tested both a #!/bin/bash shebang with embedded expect via expect -c ' ... ', and #!/usr/bin/expect with a simple spawn scp command, with no change.)
EDIT (Based on responses):
I should have mentioned I have tried quoting it...
Quoting the first (source) part gives me the error
can't read "(date +%F)": no such variable
    while executing
"spawn scp "/cygdrive/c/Program Files (x86)/Spiceworks/data/configurations/LA_$(date +%F)-firewall-5520" "sf-mlawler@10.21.1.212:/home/sf-mlawler/SBOX_..."
...it is not a variable, it is command substitution giving me the current date in year-month-day format.
Quoting the destination part without escaping anything gives me the error
scp: ambiguous target
The same command that worked in your interactive bash shell works identically in a bash script. You shouldn't, and must not, escape it a second time.
If it doesn't work, make sure you're testing the script from the same terminal, ensure that the script contents and your command is identical, then paste the output from your terminal when you do:
$ scp /cygdrive/c/Program\ Files\ \(x86\)/Spiceworks/data/configurations/LA_$(date +%F)-firewall-5520 "sf-mlawler@10.21.1.212:/home/sf-mlawler/SBOX_Automation/SBOX_Dumps/08\ -\ Security/Firewalls"
(successful scp output here)
$ cat testscript
(testscript contents here, should be identical to above scp command)
$ bash testscript
(unsuccessful script/scp output here)
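Why double-escaping breaks can be shown in two lines: in bash (script or interactive), `\\` is a literal backslash, so the space after it goes back to being a word separator:

```shell
# Escaped once: the backslash removes the space's special meaning,
# so printf receives a single argument and prints one line.
once=$(printf '%s\n' Program\ Files)

# Escaped twice: '\\' becomes a literal backslash, the space then
# splits the words, and printf receives two arguments (two lines).
twice=$(printf '%s\n' Program\\ Files)

printf '%s\n' "$once"
printf '%s\n' "$twice"
```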
Just stick it in quotes and stop hitting yourself. As @that_other_guy said, you cannot, and shouldn't try to, escape twice; wherever you got that idea, disregard it as a source of information (my guess is it came from thin air).
scp "/cygdrive/c/Program Files (x86)/Spiceworks/data/configurations/LA_$(date +%F)-firewall-5520" "sf-mlawler@10.21.1.212:/home/sf-mlawler/SBOX_Automation/SBOX_Dumps/08 - Security/Firewalls"
You could even give yourself some helpers:
export PROGFILESx86="/cygdrive/c/Program Files (x86)"
export SPICEWORKS_CONFIGS="${PROGFILESx86}/Spiceworks/data/configurations"
(Add them to your .bashrc and source it.) Then do:
scp "${SPICEWORKS_CONFIGS}/LA_$(date +%F)-firewall-5520" "sf-mlawler@10.21.1.212:/home/sf-mlawler/SBOX_Automation/SBOX_Dumps/08 - Security/Firewalls"
I'd also want to be sure the file you are attempting to scp actually exists; wrap it in a condition that verifies it's there, which only makes sense when you're auto-generating the filename.
firewall_config="${SPICEWORKS_CONFIGS}/LA_$(date +%F)-firewall-5520"
if [ -e "${firewall_config}" ]
then
scp "${firewall_config}" "sf-mlawler@10.21.1.212:/home/sf-mlawler/SBOX_Automation/SBOX_Dumps/08 - Security/Firewalls"
else
echo "File doesn't exist: ${firewall_config}"
fi
Update:
Since you've updated your question, it's clearly the $(date) call that's giving you the biggest problem. Again, indirection saves you from bizarre escape-character yoga: just get the result into a variable and use that.
fw_date=$(date +%F)
firewall_config="${SPICEWORKS_CONFIGS}/LA_${fw_date}-firewall-5520"

How to format a Windows path to a Unix path on Cygwin command line

When using Cygwin, I frequently copy a Windows path and manually edit all of the slashes to Unix format. For example, if I am using Cygwin and need to change directory I enter:
cd C:\windows\path
then edit this to
cd C:/windows/path
(Typically, the path is much longer than that). Is there a way to use sed, or something else to do this automatically? For example, I tried:
echo C:\windows\path|sed 's|\\|g'
but got the following error
sed: -e expression #1, char 7: unterminated `s' command
My goal is to reduce the typing, so maybe I could write a program which I could call. Ideally I would type:
conversionScript cd C:\windows\path
and this would be equivalent to typing:
cd C:/windows/path
Thanks all. Apparently all I need are single quotes around the path:
cd 'C:\windows\path'
and Cygwin will convert it. Cygpath would work too, but it also needs the single quotes to prevent the shell from eating the backslash characters.
Read about the cygpath command, e.g.:
somecommand `cygpath -u WIN_PATH`
cmd.exe doesn't like single quotes. You should use double quotes
C:\test>echo C:\windows\path|sed "s|\\|/|g"
C:/windows/path
You can replace backslashes with slashes using Unix sed.
Below I use a star "*" to separate the fields of the s command:
sed "s*\\\*/*g"
The trick is to use one more backslash than you might think is needed.
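For comparison, in a bash shell single quotes keep the backslash count down to the two that sed itself needs, with no extra one for the shell:

```shell
# Single quotes pass the sed script through untouched, so sed sees
# s|\\|/|g : replace every literal backslash with a slash.
printf '%s\n' 'C:\windows\path' | sed 's|\\|/|g'
# -> C:/windows/path
```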
To answer your question: to achieve
cd C:\windows\path
since you are in bash, this just works as you want, but add single quotes:
cd 'C:\windows\path'
As noted by @bmargulies and @Jennette, cygpath is your friend; it's worth reading its man page:
man cygpath
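For the wrapper described in the question, a small function will do. The sketch below uses pure-bash substitution so it runs anywhere; on an actual Cygwin box, `cygpath -u` is better because it also maps drive letters (C:\ becomes /cygdrive/c/). The function name w2u is made up:

```shell
# Print a copied Windows path with the backslashes flipped to slashes.
w2u() {
    printf '%s\n' "${1//\\//}"   # replace every \ with /
}

# Usage - single quotes stop the shell from eating the backslashes:
#   cd "$(w2u 'C:\windows\path')"
w2u 'C:\windows\path'   # -> C:/windows/path
```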

wget errors breaks shell script - how to prevent that?

I have a huge file with lots of links to files of various types to download. Each line is one download command like:
wget 'URL1'
wget 'URL2'
...
and there are thousands of those.
Unfortunately some URLs look really ugly, like for example:
http://www.cepa.org.gh/archives/research-working-papers/WTO4%20(1)-charles.doc
It opens OK in a browser, but confuses wget.
I'm getting an error:
./tasks001.sh: line 35: syntax error near unexpected token `1'
./tasks001.sh: line 35: `wget 'http://www.cepa.org.gh/archives/research-working-papers/WTO4%20(1)-charles.doc''
I've tried both URL and 'URL' ways of specifying what to download.
Is there a way to make a script like that running unattended?
I'm OK if it'll just skip the file it couldn't download.
Do not (ab)use the shell.
Save your URLs to some file (let's say my_urls.lst) and do:
wget -i my_urls.lst
Wget will handle quoting etc. on its own.
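If the big script really is one `wget 'URL'` per line, converting it into a plain URL list for -i is a one-liner (a sketch; tasks001.sh and my_urls.lst are the names from the question, and the sample file below stands in for the real one):

```shell
# Sample of the existing script: one single-quoted wget per line.
cat > tasks001.sh <<'EOF'
wget 'http://example.com/a.doc'
wget 'http://www.cepa.org.gh/archives/research-working-papers/WTO4%20(1)-charles.doc'
EOF

# Strip the leading "wget '" and the trailing "'", leaving bare URLs.
sed -e "s/^wget '//" -e "s/'\$//" tasks001.sh > my_urls.lst

cat my_urls.lst
# Then let wget read the list itself - no shell quoting involved:
# wget -i my_urls.lst
```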
I think you need to use double quotes (") and not single quotes (') around the URL.
If that still doesn't work, try escaping the paren characters ( and ) with a backslash: \( and \)
Which shell are you using? Bash? zsh?
This doesn't exactly answer your question but:
Both of the following commands work directly in a bash shell:
wget "http://www.cepa.org.gh/archives/research-working-papers/WTO4%20(1)-charles.doc"
and
wget 'http://www.cepa.org.gh/archives/research-working-papers/WTO4%20(1)-charles.doc'
Can you check to see if either of those work for you?
What seems to be happening is that your shell is doing something with the ( characters. I would try using double quotes " instead of single quotes ' around your URL.
If you wish to suppress errors, on Unix you can append >/dev/null to redirect standard output, or 2>/dev/null to redirect standard error. Under other operating systems it may be something else.
