I am trying to download a file from this link with wget.
https://sfirmware.com/downloads/downloader.php?fileid=257457&hash=458cd40aa7824c3d25fe096c0b01d4b7
I am aware that & is a special character in the shell.
So far I've tried double quotes, single quotes, and putting %26 in place of '&' (as per some suggestions), but none of these work; I get this error:
Hash not exist in db!Back to Home Page
It seems the hash parameter from the link is being ignored in the request.
How do I modify the code to download this file with wget?
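For reference, the two usual ways to keep the shell from treating & as a background operator are quoting the whole URL or escaping just the ampersand; a minimal sketch (the -O output filename is a made-up example, and this assumes the download link itself is still valid):
# Quote the entire URL so the shell passes it to wget unchanged
wget -O firmware.bin "https://sfirmware.com/downloads/downloader.php?fileid=257457&hash=458cd40aa7824c3d25fe096c0b01d4b7"
# Equivalent: escape only the ampersand
wget -O firmware.bin https://sfirmware.com/downloads/downloader.php?fileid=257457\&hash=458cd40aa7824c3d25fe096c0b01d4b7
Single quotes around the URL work the same way; the only point is that & and ? must reach wget unmodified.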
In an AppleScript I am trying to pass a URL, which I receive as an argument, to a do shell script command so I can use it with curl.
With regular characters the procedure works fine, but as soon as my argument contains special characters like umlauts, it gets all funky.
curl does download something, but replaces the letter Ü with Ã etc., which of course will not get me the correct result.
What do I need to do to get this to work? I am not very skilled with either AppleScript or encoding issues.
My setup at the moment is as follows:
set download_URL to item 1 of arguments
do shell script "curl " & download_URL & " > targetFile.html"
Some examples of what happens:
Äquivozität ---> Ãquivozität
Ökolikör ---> Ãkolikör
Übermütigkeit ---> Ãbermütigkeit
Schweißfuß ---> SchweiÃfuÃ
Which makes my confusion even greater. All Ä, Ö, Ü and ß render as Ã, but in both the edit box here and the one on the site in question they render as shown in this image.
Also, through some amateurish digging in the HTML file, I figured out that instead of the letter Ü, I would need to pass the characters %C3%9C. So the whole procedure does work if I pass %C3%9Cbermut instead of Übermut. However, I would of course like to avoid creating a translation table for all diacritics.
Can somebody figure out, what specific encoding problem is happening here?
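As an aside, the %C3%9C mentioned above is simply the two UTF-8 bytes of Ü written in percent-encoded form; assuming a UTF-8 terminal, this is easy to confirm from the shell:
# Ü encodes to the UTF-8 bytes C3 9C, which percent-encode to %C3%9C
printf 'Ü' | xxd -p
# prints: c39c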
After some more research, I found out that what I need is to URL-encode my string. That way, the letter Ü is replaced with %C3%9C, and that works for my purposes.
AppleScript does not seem to support this natively, but one can use PHP to do the conversion. I found the method here: https://discussions.apple.com/message/9801376#9801376
So, in my case I used it like this:
set keyword to item 1 of arguments
set encodedKeyword to do shell script "php -r 'echo trim(urlencode(\"" & keyword & "\"));'"
do shell script "curl https://www.myUrl.com/" & encodedKeyword & ".html > targetFile"
This way, it works for me.
In case there is a better way - maybe something that works in Applescript directly - feel free to post another answer, then I'll change the accepted answer.
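For what it's worth, the conversion step can be sanity-checked directly in Terminal, outside AppleScript (this assumes php is installed, as it was on macOS at the time, and a UTF-8 locale):
# urlencode turns Ü into %C3%9C and leaves plain ASCII alone
php -r 'echo trim(urlencode("Übermut"));'
# prints: %C3%9Cbermut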
I am reading a file (with URL's) line by line:
#!/bin/bash
while read line
do
url=$line
wget $url
wget $url_{001..005}.jpg
done < $1
First, I want to download the primary URL, as you can see with wget $url. After that I want to append a sequence of numbers to the URL (_001.jpg, _002.jpg, _003.jpg, _004.jpg, _005.jpg):
wget $url_{001..005}.jpg
...but for some reason it's not working.
Sorry, I missed one thing: the URLs look like http://xy.com/052914.jpg. Is there an easy way to add _001 before the extension, i.e. http://xy.com/052914_001.jpg? Or do I have to remove ".jpg" from the file containing the URLs and then simply add it back to the variable later?
Another way is to escape the underscore character:
wget $url\_{001..005}.jpg
Try encapsulating your variable name:
wget ${url}_{001..005}.jpg
Bash is trying to expand the variable $url_ in your command.
As for your jpg within the URL followup, see substring expansion in the bash manual.
wget ${url:0: -4}_{001..005}.jpg
The :0: -4 means: expand the variable from position zero (the first character), leaving off the last 4 characters.
Or from this answer:
wget ${url%.jpg}_{001..005}.jpg
The %.jpg removes a trailing .jpg specifically, and this form also works on older versions of bash (the negative-length substring above needs bash 4.2 or newer).
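Putting the pieces together, a minimal corrected sketch of the loop (assuming, as in the question, that the file passed as $1 lists one URL per line):
#!/bin/bash
# "$1" is the file containing one URL per line
while read -r url
do
    wget "$url"                          # download the primary URL first
    wget "${url%.jpg}"_{001..005}.jpg    # 052914.jpg -> 052914_001.jpg ... 052914_005.jpg
done < "$1"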
I've uploaded a large number of files, including their folder structure, to my Ubuntu 12.04 LTS server using WinSCP.
The goal is to access these files in Owncloud.
However, all files that contain special characters like German umlauts cause problems. In ownCloud's view, their names are cut off at the special character, and trying to view such a folder or file sends you back to the folder root.
Using ls, the special character is always displayed as a question mark, e.g. "Motorschwei?en1.jpg".
What works is manually renaming them with "mv" in the shell. Inserting the special character properly, e.g. "Motorschweißen1.jpg" for this example, does work, but doing this for all of them would take ages.
Using find . -name "?" will not yield any hits.
Is there any way to replace all of those special characters, e.g. with an underscore?
Try the command rename:
rename 'y/\W/_/' *
The above command will replace all non-alphanumeric characters with _. See http://perldoc.perl.org/perlop.html#Regexp-Quote-Like-Operators and http://perldoc.perl.org/perlre.html#Special-Backtracking-Control-Verbs for the documentation of Perl's quote-like operators and regular expressions.
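If you would rather preview the changes first, the Perl-based rename (sometimes installed as prename or file-rename; the -n flag and the find invocation below assume that variant) supports a dry run, and find can apply it to subfolders as well:
# Preview the renames without performing them
rename -n 'y/\W/_/' *
# Apply to regular files in all subdirectories too
find . -type f -exec rename 'y/\W/_/' {} +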
I'm using the latest stable version of Smarty and can't get this string to work. I've looked at other questions for solution to do this but none seem to work.
This is a template file (TPL) and doesn't contain any PHP at all. Note that the TPL file is compiled to a PHP script whose output is then sent to the browser; it's not a PHP file.
Current code:
'foo{$bar}'
which outputs as:
'foo{$bar}'
instead of the value of $bar.
What am I doing wrong?
If you use variables within a quoted string, you have to use double quotes " instead of single quotes ', e.g. "foo{$bar}" rather than 'foo{$bar}'. Text within single quotes is not parsed for variables; Smarty follows the same rule as PHP here.
I already looked through other topics, but I still couldn't find a solution. I'm trying to install the "nxhtml" plugin for Emacs on Windows 7. I already set up my "HOME" environment variable to "C:\". So my .emacs.d folder is there; I put nxhtml in it and added the following line to my "_emacs.d" file, as the readme says:
(load "C:\.emacs.d\nxhtml\autostart.el")
But it doesn't load.
I also tried putting:
(add-to-list 'load-path "C:\.emacs.d\nxhtml")
(load "autostart.el")
But to no avail... Can anyone shed some light here? Thanks.
A number of points here:
Firstly, _emacs.d is not a default file name for your init file, i.e. Emacs will not load it automatically. Try ~/.emacs.d/init.el or ~/.emacs instead.
Secondly, Windows 7 has a feature where it prevents programs from writing to certain system directories, but for backwards compatibility for the many old programs that do this, rather than causing them to fail, it silently redirects the write elsewhere, in an application specific directory. C:\ is one of those directories, so setting your HOME to point there is asking for trouble.
Thirdly, see the other response about backslash being an escape character in Lisp strings.
\ is special in the (double-quote) read syntax for strings, as certain characters take on a new meaning when prefixed by a backslash (e.g. \n is a newline, \t is a tab, and \" is a double-quote character). When the following character does not have any special meaning in conjunction with the backslash, that character is used verbatim, and the backslash is ignored.
"C:\.emacs.d\nxhtml\autostart.el" is actually the string:
C:.emacs.d
xhtml^Gutostart.el
To include a \ in the string you need to write \\
However, although it will understand the backslashes, Emacs is nowadays consistent across all platforms in allowing / as a directory separator [1]; so just do that instead, e.g. (load "C:/.emacs.d/nxhtml/autostart.el").
[1] And the obsolete directory-sep-char variable has been removed entirely.