I am working on a device with an embedded OS that has limited features and space. I can't install any scripting language like PHP, or tools like curl. I would like to know whether there is any way to call an HTTPS URL with some header values (e.g. Content-Type) using a shell script, and, once I get the response, display it on a web page or write it to a file.
Regards
curl and wget, as already suggested, are usually available on most systems. If those are not available, you'll have to give more details about your platform to figure out what else might be available.
You can probably try wget like this:
wget --http-user=user --http-password=password http://domain.com/dir/file
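If your wget build supports it, you can also set request headers and control where the response body goes. A minimal sketch, assuming a full-featured GNU wget and a placeholder HTTPS endpoint (stripped-down busybox builds may lack some of these options):

# Send a custom header over HTTPS and write the response body to a file.
# --no-check-certificate skips TLS verification, which is often necessary
# on embedded systems without a CA bundle (at the cost of security).
wget --header='Content-Type: application/json' \
     --no-check-certificate \
     -O response.txt \
     https://example.com/api/status

# Or print the response to stdout, e.g. to embed it in a web page:
wget -qO- --header='Accept: application/json' https://example.com/api/status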
I am using the Dredd tool to test my API (which resides on apiary.io).
Question
I would like to provide Dredd with a path to my documentation (it even asks for it); however, my API doc is on apiary.io and I don't know the exact URL that points to it. What would be the correct way to provide Dredd with the API path?
What did work (but not what I'm looking for)
Note: I tried downloading the API doc to my local drive and providing Dredd with a local path to the file (yml or apib), which works fine (yay!), but I would like to avoid keeping a local copy and simply provide Dredd with the location of my real API doc, which is being maintained on the Apiary server.
How do I do this (without first fetching the file to local drive)?
Attempts to solve this that failed
I also read about (and tried) the following topics; they may be relevant, but I wasn't successful in resolving the issue:
- Using authentication token as environment variable
- Providing the domain provided by apiary.io//settings to dredd
- Providing the in the dredd command
All of these attempts still produce the same result: Dredd has no idea where to find the API document unless I provide a path to a file on my local computer (which I have to download or create manually first).
Any help is appreciated, Thanks!
If I understand it correctly, you would like to use Dredd and feed it the API description document residing on the Apiary.io platform, right?
If so, you should be able to do that simply calling the init command with the right options:
dredd init -r apiary -j apiaryApiKey:privateToken -j apiaryApiName:sasdasdasd
You can find the private token by going into the Test section of the target API (you'll find the button on the application header).
Let me know if this solves the problem for you - I'll make sure to propagate this and document it accordingly on our help page.
P.S.: You can also use your own reporter - in that case, simply omit -r apiary when writing the command line parameters.
You can feed Dredd not only with a path to a file on your disk, but also with a URL.
If your API in Apiary is public, the API description document (in this case API Blueprint) should have a public URL. For example, if you go to http://docs.apiblueprintapi.apiary.io/, you can see there is a Download link on the left. Unfortunately, the link is visible only to users who do not have access to the editor of the API, so you can't see the link if you're the owner of the API. Try to log out from Apiary and the link should appear.
Then you can feed Dredd with the link:
$ dredd 'http://docs.apiblueprintapi.apiary.io/api-description-document' 'http://example.com:8080/api'
I agree this isn't very intuitive, and since you're not the first one to come up with this, I think we'll look into ways to make it easier.
If your API isn't public then unfortunately there's no way to get the URL as of now. However, you can either use GitHub Sync or Apiary CLI to get the file on your disk in an automated manner.
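For example, with the Apiary CLI you can script the fetch step. This is just a sketch with a placeholder API name; check apiary help for the exact flags your version supports:

# Assumes the apiaryio gem is installed and APIARY_API_KEY holds your token.
gem install apiaryio
apiary fetch --api-name=yourapiname > apiary.apib

# Then run Dredd against the freshly fetched local copy:
dredd apiary.apib http://example.com:8080/api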
I have written a Bash SOAP library that uses wget as the interface to HTTP servers. It intentionally avoids curl, since curl is not available or not installed by default on the systems where this library is used.
The basis of the library is to query the WSDL, determine the parameters, and allow functions/methods to be invoked from the command line, through a simple wrapper that sets up the SOAP URLs:
$ ./mysoap.sh MyMethod sKey=1234 bAnotherParameter=False sAnotherParam="Hello"
However, when wget receives a 500 response, it doesn't write the response body to the output document defined by -O. The response contains the SOAP errors that the server generated, which is useful to the client. Is there a way to force wget to write the response to the output document, regardless of the status code? The documentation seems to be unclear about the behavior of -O in the event of an error, so to me, it's not working as intended.
This is the option:
Parameter: --content-on-error, available from wget 1.14:
https://superuser.com/a/648466
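As a sketch of how it fits into a wget-based SOAP call (the endpoint and header values below are placeholders):

# --content-on-error (wget >= 1.14) saves the response body even when the
# server answers with an HTTP error such as 500, so the SOAP fault is kept.
wget --content-on-error \
     --header='Content-Type: text/xml; charset=utf-8' \
     --header='SOAPAction: "MyMethod"' \
     --post-file=request.xml \
     -O response.xml \
     http://example.com/soap/endpoint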
I am currently trying to port a Chrome extension to Firefox (Add-on SDK). However, I ran into a few problems porting the PAC script functionality.
When setting a proxy through Chrome's extension API, you can set a PAC script as a string inside the PacScript object (ref: http://developer.chrome.com/extensions/proxy.html#type-PacScript).
Looking at Firefox, there is nothing like that. The only option I see is to pull the script from a URL (http://kb.mozillazine.org/Firefox_:FAQs:_About:config_Entries). My problem with this is that the PAC script has to change and react when the user adjusts the add-on settings.
Is there a (hacky) way to accomplish something like that in Firefox?
The only solution I came up with is encoding the user's options and posting them to a PAC script server; the server parses them and creates a script matching the needs. I want to avoid using servers at all costs, as this results in another dependency!
You can use a data: URI for your PAC file. Generating it dynamically is easy:
var pacScript = "function FindProxyForURL(url, host){return 'DIRECT';}";
var uri = "data:text/javascript," + encodeURIComponent(pacScript);
alert(uri);
I have a bash script (it supports Linux/Unix) that installs an application.
Instead of executing the script in the terminal, I want to deploy it as a web application.
I want to make a graphical web interface for this script, so that the user can give the necessary inputs in web forms and, when ready, pass these variables into the bash script to be executed.
This script obviously needs root privileges.
I plan to make it with Tomcat 7 / servlets / JSP and deploy it as a .war file.
First, can this be done? Is it possible?
Second, is there any example? I didn't find anything.
Third, any alternative method/better idea?
I'd try Tomcat's own CGI support.
http://tomcat.apache.org/tomcat-6.0-doc/cgi-howto.html
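Once CGI support is enabled in Tomcat's web.xml as the how-to describes, a CGI script is just an executable that prints headers, a blank line, and then the body. A minimal sketch (the script name is made up; by default scripts live under WEB-INF/cgi/):

#!/bin/bash
# install.cgi -- hypothetical wrapper around the install script.
# Tomcat's CGIServlet sets QUERY_STRING from the submitted form.
echo "Content-Type: text/plain"
echo ""
echo "Received parameters: ${QUERY_STRING}"
# Validate QUERY_STRING carefully before passing anything on to the real
# install script, especially since it needs elevated privileges.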
Well, it's possible, but keep in mind that sanitizing user input is hard.
What you want to do is use a scripting language or framework (I recommend Sinatra), and use an HTML form to pass arguments to the backend. In the backend, you call your script, passing whatever arguments you want.
Example with Sinatra:
post '/whatever' do
  # This is dangerous!
  `myscript #{params[...]}`
end
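Whichever frontend you pick, it is safer to validate inside the script itself than to trust interpolated parameters. A minimal bash sketch (the script and its parameter are made up):

#!/bin/bash
# myscript -- hypothetical installer that refuses unexpected input.
key="$1"
case "$key" in
  [0-9][0-9][0-9][0-9]) ;;                  # accept only a 4-digit key
  *) echo "invalid key: $key" >&2; exit 1 ;;
esac
echo "installing with key $key"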
Err, but you want this to run on the client side, right?
So you don't really run it as bash on your system, you just template it within your web framework.
If the browser can display the result, it won't just download it as a file, so you will need to set a Content-Disposition: attachment header in the response to force a download.
You will naturally need the user's cooperation to run this as root on his or her system...
I was wondering about the best way to upload a file to a web server in Cocoa. I can't seem to get my curl code to work, even though it works when run from the terminal.
curl code:
system("curl -T /file.txt http://webserevertouploadto.com");
Thanks for any help
Try using NSTask instead of system() to execute curl. If you're looking for a native Cocoa solution for uploading files via FTP, take a look at ConnectionKit.
One negative of using curl is that it won't respect the user's proxy settings.
I prefer to use the NSURLConnection API.
Check this out:
http://www.cocoadev.com/index.pl?HTTPFileUploadSample