I have a cURL request as follows.
$(curl --request PUT --upload-file "<path to catalog file on your local machine>" "<presigned URL>")
Let's say that I have to upload a bin/test.txt file with the presigned URL being https://www.someurl.com
I execute the command in my terminal
curl --request PUT --upload-file "bin/test.txt" "https://www.someurl.com" and it works fine.
How do I write a piece of Golang which does the same? I have tried
cmd := exec.Command("curl", "--request", "PUT", "--upload-file", fmt.Sprintf("\"%s\"", catalogPath), fmt.Sprintf("\"%s\"", presignedURL))
err = cmd.Run()
but found no success.
I see three problems preventing that curl call from working properly: one obvious, one quite possible, and another that is also possible.
The obvious problem is that string quoting, such as in curl … --upload-file "bin/test.txt" …, is interpreted by the shell that is to execute the command. Quoting, using either double or single quotes, is used to inhibit interpretation of otherwise special characters by the shell; chiefly it's used to prevent the shell from splitting a string into separate "words" on whitespace characters or runs of them.
The key takeaway is that the command run by the shell after it's fully parsed the command to be executed (and interpreted the quotes) does not "see" these quotes because they are removed by the shell.
os/exec.Cmd calls the specified program directly and does not "pass it through" the shell. Hence if you include double quotes in the command-line parameters of the program to execute, they are passed to that program unchanged. This means curl would try to find the file named test.txt" located in the directory named "bin, which is most probably not what you expected.
The same applies to the URL.
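For illustration, a minimal sketch of that call without the extra quoting (catalogPath and presignedURL are the question's own variables); each argument is handed to curl exactly as written:

cmd := exec.Command("curl", "--request", "PUT", "--upload-file", catalogPath, presignedURL)
if err := cmd.Run(); err != nil {
    // handle the error
}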
The second — possible — problem is that your call relies on the current directory of your Go program because you pass a relative path to curl.
This might or might not be a problem but you might check this anyway.
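If the relative path turns out to be the issue, one possible way around it (a sketch, not the only option; the directory in the comment is hypothetical) is to resolve the path yourself or pin the command's working directory:

absPath, err := filepath.Abs(catalogPath) // requires "path/filepath"
if err != nil {
    // handle the error
}
cmd := exec.Command("curl", "--request", "PUT", "--upload-file", absPath, presignedURL)
// or, alternatively, set the working directory the relative path is meant against:
// cmd.Dir = "/directory/the/relative/path/is/meant/against"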
The third problem is that you might want to pass your URL through the "percent escaping" algorithm before passing it to curl.
You might look at PathEscape and QueryEscape functions of the net/url package.
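For example, a quick sketch of what those two functions do (illustrative values only):

// requires "fmt" and "net/url"
fmt.Println(url.PathEscape("some path/with spaces")) // some%20path%2Fwith%20spaces
fmt.Println(url.QueryEscape("a b&c"))                // a+b%26c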
Two pieces of advice follow.
First, I would try very hard not to call out to curl to perform such a ridiculously simple task. Go has excellent support for making HTTP requests (and serving them, FWIW) in its standard library, and PUTting a file is really a no-brainer with solutions googleable in, like, five minutes.
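A sketch of such a PUT using only the standard library (catalogPath and presignedURL are taken from the question; error handling trimmed to comments for brevity):

f, err := os.Open(catalogPath) // requires "net/http" and "os"
if err != nil {
    // handle the error
}
defer f.Close()

req, err := http.NewRequest(http.MethodPut, presignedURL, f)
if err != nil {
    // handle the error
}
// Some servers want an explicit length; stat the file and set it if needed.
if fi, err := f.Stat(); err == nil {
    req.ContentLength = fi.Size()
}

resp, err := http.DefaultClient.Do(req)
if err != nil {
    // handle the error
}
defer resp.Body.Close()
// Check resp.StatusCode here; presigned uploads typically return 200 on success.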
Second, if, for some reason, you intend to stick with calling curl, consider passing it some options to make it fail loudly on errors — otherwise you're doomed to be in that „but found no success” situation in your attempts. For instance, consider passing curl the -s and -S command line options (together).
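For instance, a sketch of wiring that up so failures actually surface in the Go program (the flags are curl's own -s and -S; the rest is illustrative):

cmd := exec.Command("curl", "-sS", "--request", "PUT", "--upload-file", catalogPath, presignedURL)
cmd.Stdout = os.Stdout // requires "os"
cmd.Stderr = os.Stderr // -S makes curl print an error message even in silent mode
if err := cmd.Run(); err != nil {
    log.Fatalf("curl failed: %v", err) // requires "log"
}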
That's not how you quote shell arguments; that would break if your argument starts with or ends with \ or ". The proper way to quote shell arguments on Unix would be:
// requires "strings"
func quoteshellarg(str string) string {
    if strings.Contains(str, "\x00") {
        panic("argument contains null bytes, it is impossible to escape null bytes in shell arguments!")
    }
    // wrap in single quotes and escape each embedded single quote as '\''
    return "'" + strings.ReplaceAll(str, "'", "'\\''") + "'"
}
and with that, since the quoted string is only meaningful to a shell, just run it through one:
cmd := exec.Command("/bin/sh", "-c", "curl --request PUT --upload-file "+quoteshellarg(catalogPath)+" "+quoteshellarg(presignedURL))
... at least that's how to do it on Unix systems. As for how to do it on Windows, it seems nobody knows for sure, not even Microsoft.
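As a quick illustration of what the quoting produces (hypothetical input, just to show the single-quote escaping):

fmt.Println(quoteshellarg("bin/it's a test.txt"))
// prints: 'bin/it'\''s a test.txt'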
To get into detail, I am writing a shell script to automate usage of a plugin.
To do so, I am using xclip to pull a url from the x clipboard, then append it at the end of a command with arguments and execute the combined command.
I use url="$(xclip -o)" to get the url from the clipboard, then com='youtube-dl -x --audio-format mp3 ' to set the initial string.
I've been clumsily stumbling through attempts at printf and defining new strings as str=$com $url (and many variants of such). I haven't written anything in a long time and know I'm screwing up something pretty basic. Anybody able to help?
to concatenate two strings with a space in an assignment
str=$com' '$url
can be also written
str=$com" "$url
or
str="$com $url"
then the command can just be launched
$str
however
str=$com $url
is the syntax to call $url while passing it the environment variable str=$com
Also, if url were a string which could contain spaces or tabs, an array should be used instead to avoid splitting when called:
str=( $com "$url" )
"{str[#]}"
For my interest, I am trying to use the below curl query:
Delphi$ curl -o /dev/null -s -w %{time_total}
http://localhost:8080/getquery?db=EmpDt&col=HRDt&query=select * from
emp where id=1111
but unable to execute it:
[1] 2784
[2] 2785
0.003799invalid option or syntax: 10
[1]- Done curl -o /dev/null -s -w %{time_total}
http://localhost:8080/getquery?db=EmpDt
[2]+ Done col=HRDt
Something is not correct here but not able to get what? Any help would be really helpful. Thanks
In shell, an unquoted & terminates the command and runs the command to its left in the background; thus your post contains three separate commands run concurrently. Either quote each & individually with a backslash as \& or surround at least the &s, and usually the whole string, with either single quotes 'http://host/q?x&y&z' or double quotes "http://host/q?x&y&z".
? and * are also special in shell, although not command terminators, and in general must also be quoted, although in your case after fixing the spaces (below) this becomes less critical
A URL cannot contain a space; it must be encoded as + (preferred) or %20. Other special characters (here * and =) may not work depending on how your server handles URL parsing, which in turn depends on what your server is, and you didn't give any hint; in that case they too must be percent-encoded. (If you want an actual +, which you don't here, it is encoded as %2B.)
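If it helps to see what a properly encoded query could look like, here is a small sketch in Go (chosen only for illustration; the values are the question's own, and note that url.Values sorts the keys):

base := "http://localhost:8080/getquery"
q := url.Values{} // requires "fmt" and "net/url"
q.Set("db", "EmpDt")
q.Set("col", "HRDt")
q.Set("query", "select * from emp where id=1111")
fmt.Println(base + "?" + q.Encode())
// prints: http://localhost:8080/getquery?col=HRDt&db=EmpDt&query=select+%2A+from+emp+where+id%3D1111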
In an Applescript I am trying to pass on a URL that I receive as an argument to a do shell script command to use it with curl.
With regular characters the procedure works fine, but as soon as my argument contains special characters like Umlauts, it gets all funky.
curl does download something, but replaces the letter Ü with à etc., which of course will not get me the correct result.
What do I need to do, to get this to work? I am neither very skilled with Applescript nor with encoding issues.
My setup at the moment is as follows:
set download_URL to item 1 of arguments
do shell script "curl " & download_URL & " > targetFile.html"
Some examples of what happens:
Äquivozität ---> Ãquivozität
Ökolikör ---> Ãkolikör
Übermütigkeit ---> Ãbermütigkeit
Schweißfuß ---> SchweiÃfuÃ
Which makes my confusion even greater. All Ä, Ö, Ü and ß render as Ã, but both in the editing mask here and in the one of the site in question they render as shown in this image.
Also, through some amateurish digging in the html-File, I figured out that instead of the letter Ü, I would need to pass the letters %C3%9C. So the whole procedure does work, if I pass %C3%9Cbermut instead of Übermut. However, I would of course like to avoid creating a translation table for all diacritics.
Can somebody figure out, what specific encoding problem is happening here?
After some more researching, I found out that I need to urlEncode my string. That way, the letter Ü will be replaced with %C3%9C and it works for my purposes.
AppleScript does not seem to support this natively, but one can use PHP to do the conversion. I found the method here: https://discussions.apple.com/message/9801376#9801376
So, in my case I used it like this:
set keyword to item 1 of arguments
set encodedKeyword to do shell script "php -r 'echo trim(urlencode(" & "\"" & keyword & "" & "\"));'"
do shell script "curl https://www.myUrl.com/" & encodedKeyword & ".html > targetFile"
This way, it works for me.
In case there is a better way - maybe something that works in Applescript directly - feel free to post another answer, then I'll change the accepted answer.
I'm using jamplus to build a vendor's cross-platform project. On osx, the C tool's command line (fed via clang to ld) is too long.
Response files are the classic answer to command lines that are too long: jamplus states in the manual that one can generate them on the fly.
The example in the manual looks like this:
actions response C++
{
$(C++) ##(-filelist #($(2)))
}
Almost there! If I specifically blow out the C.Link command, like this:
actions response C.Link
{
"$(C.LINK)" $(LINKFLAGS) -o $(<[1]:C) -Wl,-filelist,#($(2:TC)) $(NEEDLIBS:TC) $(LINKLIBS:TC))
}
in my jamfile, I get the command line I need that passes through to the linker, but the response file isn't newline terminated, so link fails (osx ld requires newline-separated entries).
Is there a way to expand a jamplus list joined with newlines? I've tried using the join expansion $(LIST:TCJ=\n) without luck. $(LIST:TCJ=#(\n)) doesn't work either. If I can do this, the generated file would hopefully be correct.
If not, what jamplus code can I use to override the link command for clang, and generate the contents on the fly from a list? I'm looking for the least invasive way of handling this - ideally, modifying/overriding the tool directly, instead of adding new indirect targets wherever a link is required - since it's our vendor's codebase, as little editing as possible is desired.
The syntax you are looking for is:
newLine = "
" ;
actions response C.Link
{
"$(C.LINK)" $(LINKFLAGS) -o $(<[1]:C) -Wl,-filelist,#($(2:TCJ=$(newLine))) $(NEEDLIBS:TC) $(LINKLIBS:TC))
}
To be clear (I'm not sure how StackOverflow will format the above), the newLine variable should be defined by typing:
newLine = "" ;
And then placing the caret between the two quotes and hitting Enter. You can use this same technique for certain other characters, e.g.
tab = " " ;
Again, start with tab = "" and then place the caret between the quotes and hit Tab. In the above it is actually 4 spaces, which is wrong, but hopefully you get the idea. Another useful one to have is:
dollar = "$" ;
The last one is useful as $ is used to specify variables typically, so having a dollar variable is useful when you actually want to specify a dollar literal. For what it is worth, the Jambase I am using (the one that ships with the JamPlus I am using), has this:
SPACE = " " ;
TAB = " " ;
NEWLINE = "
" ;
Around line 28...
I gave up on trying to use escaped newlines and other language-specific characters within string joins. Maybe there's an awesome way to do that, but it was too thorny to discover.
Use a multi-step shell command with multiple temp files.
For jamplus (and maybe other jam variants), the section of the actions response {} between the curly braces becomes an inline shell script. And the response file syntax #(<value>) returns a filename that can be assigned within the shell script, with the contents set to <value>.
Thus, code like:
actions response C.Link
{
_RESP1=#($(2:TCJ=#)#$(NEEDLIBS:TCJ=#)#$(LINKLIBS:TCJ=#))
_RESP2=#()
perl -pe "s/[#]/\n/g" < $_RESP1 > $_RESP2
"$(C.LINK)" $(LINKFLAGS) -o $(<[1]:C) -Wl,-filelist,$_RESP2
}
creates a pair of temp files, assigned to shell variable names _RESP1 and _RESP2. File at path _RESP1 is assigned the contents of the expanded sequence joined with a # character. Search and replace is done with a perl one liner into _RESP2. And link proceeds as planned, and jamplus cleans up the intermediate files.
I wasn't able to do this with characters like :;\n, but # worked as long as it had no adjacent whitespace. Not completely satisfied, but moving on.
I need to read certain data using curl. I'm basically reading keywords from a file:
while read line
do
curl 'https://gdata.youtube.com/feeds/api/users/'"${line}"'/subscriptions?v=2&alt=json' \
> '/home/user/archive/'"$line"
done < textfile.txt
Anyway, I haven't found a way to form the URL for curl so that it would work. I've tried, like, every possible single- and double-quoted version. I've tried basically:
'...'"$line"'...'
"..."${line}"..."
'...'$line'...'
and so on.. Just name it and I'm pretty sure that I've tried it.
When I'm printing out the URL in the best case it will be formed as:
/subscriptions?v=2&alt=jsoneeds/api/users/KEYWORD FROM FILE
or something similar. If you know what could be the cause of this I would appreciate the information. Thanks!
It's not a quoting issue. The problem is that your keyword file is in DOS format -- that is, each line ends with carriage return & linefeed (\r\n) rather than just linefeed (\n). The carriage return is getting read into the line variable, and included in the URL. The giveaway is that when you echo it, it appears to print:
/subscriptions?v=2&alt=jsoneeds/api/users/KEYWORD FROM FILE"
but it's really printing:
https://gdata.youtube.com/feeds/api/users/KEYWORD FROM FILE
/subscriptions?v=2&alt=json
...with just a carriage return between them, so the second overwrites the first.
So what can you do about it? Here's a fairly easy way to trim the cr at the end of the line:
cr=$'\r'
while read line
do
line="${line%$cr}"
curl "https://gdata.youtube.com/feeds/api/users/${line}/subscriptions?v=2&alt=json" \
> "/home/user/archive/$line"
done < textfile.txt
Your current version should work, I think. More elegant is to use a single pair of double quotes around the whole URL with the variable in ${}:
"https://gdata.youtube.com/feeds/api/users/${line}/subscriptions?v=2&alt=json"
Just use it like this; it should be sufficient:
curl "https://gdata.youtube.com/feeds/api/users/${line}/subscriptions?v=2&alt=json" > "/home/user/archive/${line}"
If your shell gives you issues with & just put \&, but it works fine for me without it.
If the data from the file can contain spaces and you have no objection to spaces in the file name in the /home/user/archive directory, then what you've got should be OK.
Given the contents of the rest of the URL, you could even just write:
while read line
do
curl "https://gdata.youtube.com/feeds/api/users/${line}/subscriptions?v=2&alt=json" \
> "/home/user/archive/${line}"
done < textfile.txt
where strictly the ${line} could be just $line in both places. This works because the strings are fixed and don't contain shell metacharacters.
Since your code is close to this, but you claim that you're seeing the keywords from the file in the wrong place, maybe a little rewriting for ease of debugging is in order:
while read line
do
url="https://gdata.youtube.com/feeds/api/users/${line}/subscriptions?v=2&alt=json"
file="/home/user/archive/${line}"
curl "$url" > "$file"
done < textfile.txt
Since the strings may end up containing spaces, it seems (do you need to expand spaces to + in the URL?), the quotes around the variables are strongly recommended. You can now run the script with sh -x (or add a line set -x to the script) and see what the shell thinks it is doing as it is doing it.