Sending HTTPS request using inetc - download

So I tried to send some POST data to a URL using this code:
StrCpy $PostStr "a=input1&c=input2"
inetc::post $PostStr "https://url/index.php" "$INSTDIR\result.html" /END
Pop $0
StrCmpS $0 "OK" success failedToSubmit
failedToSubmit:
MessageBox MB_OK|MB_ICONEXCLAMATION "There was an error submitting information: $0"
Abort
success:
MessageBox MB_OK|MB_ICONINFORMATION "Your information was successfully received"
but when the URL is HTTPS, the following message always appears:
There was an error submitting information: SendRequest Error
I tried this with HTTP, and it ran smoothly. The server-side PHP script does nothing but echo the POST variables.
Have I missed something in dealing with HTTPS in inetc?
Thanks

INetC should be using the SECURITY_FLAG_IGNORE_UNKNOWN_CA + SECURITY_FLAG_IGNORE_REVOCATION flags on HTTPS URLs, and there seems to be some kind of auth retry code in there, so I'm not sure why it is not working.
There are other flags, like SECURITY_FLAG_IGNORE_CERT_CN_INVALID, that it is not using; maybe you could request a new /nosecurity switch here...
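As a quick sanity check outside the installer, you could try the same POST from the command line; this is just a sketch using the URL and fields from the question, where curl's -k option roughly stands in for the ignore-unknown-CA behaviour:
curl -k --data "a=input1&c=input2" "https://url/index.php" -o result.html
If this also fails, the problem is more likely the server's TLS setup than the plugin.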

Related

Flock webhook token

I am trying to create a webhook as per this document, and it doesn't include any clue about where the token comes from.
https://docs.flock.com/display/flockos/Create+An+Incoming+Webhook
My curl command is below:
curl -X POST -H "Content-Type: application/json" https://api.flock.com/hooks/sendMessage/guid-guid -d '{"text": "This is a test message.","token":"test"}'
Error message:
{"error":"InvalidParameter","description":"A required parameter for the method call is missing or invalid","parameter":"token"}
Can someone point out what's missing here?
Flock gives you the token for the webhook when you finish adding a new one at https://dev.flock.com/webhooks.
You can look it up again when you're done by going to the edit option for the webhook you've added; at the moment the token is given at the bottom of the page:
Webhook URL
Send your JSON payload to this URL
[your-token-here]
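Reading that, the token appears to be embedded in the webhook URL you post to rather than passed as a JSON field, so a hedged rewrite of the curl command from the question would be (your-token-here being the value copied from the webhook's edit page):
curl -X POST -H "Content-Type: application/json" https://api.flock.com/hooks/sendMessage/your-token-here -d '{"text": "This is a test message."}'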

How to get custom header in bash

I'm adding a custom header in an ASP.NET app:
context.Response.Headers.Add("X-Date", DateTime.Now.ToString());
context.Response.Redirect(redirectUrl, false);
When I'm using Fiddler I can see the "X-Date" header in the response.
I need to receive it by using bash.
I tried curl -i https://my.site.com and also wget -O - -o /dev/null --save-headers https://my.site.com with no success.
In both cases I see just the regular headers like: Content-Type, Server, Date, etc...
How can I receive the "X-Date" header?
Thanks,
Lev
Protocol headers are different from file headers (just as an HTTP header and a TCP header are different). When you create a protocol header, you will need a server to resolve it and use the associated environment variables. Example:
#!/bin/bash
# Apache - CGI
echo "text/plain"
echo ""
echo "$CONTENT_TYPE"
echo "$HTTP_ACCEPT"
echo "$SERVER_PROTOCOL"
When calling this script via the web, the response in my browser was:
text/html
text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
HTTP/1.1
What you're looking for are environment variables called $HTTP_ACCEPT, $CONTENT_TYPE, and maybe $SERVER_PROTOCOL too.
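For completeness, on the client side a minimal sketch for dumping every response header that curl receives (using the URL from the question) is:
# -s: no progress output, -D -: write received headers to stdout, -o /dev/null: discard the body
curl -sD - -o /dev/null https://my.site.com
Since curl does not follow redirects unless you pass -L, this shows the headers of the initial response, which is the one the ASP.NET code above adds "X-Date" to before redirecting.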

+CMS ERROR: 304, sending message using at command

I'm new to GSM and AT command sets.
Based on the error +CMS ERROR: 304, it's an "Invalid PDU mode parameter" according to ActiveXperts.
I'm trying to send a message using AT commands to a GSM modem in PDU mode:
AT+CMGF=0
OK
AT+CMGS=31
> 079136190700203911000C913639350768260000AA13C834A89D07B9C3ED32283D0751C3F3F41B
+CMS ERROR: 304
Can you guys help me with what I missed or got wrong? I used this to encode the PDU.
Thanks guys.
Try changing the mode from PDU to text mode. Use the following command:
AT+CMGF=1
If you are trying to send an SMS, try AT+CMGF=1 first (this switches to text mode), then AT+CMGS="NUMBER", and after typing the message at the > prompt, make sure you use the Ctrl+Z key combination to send the SMS.
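For example, a text-mode session might look like this (the phone number and the message reference in the +CMGS reply are placeholders):
AT+CMGF=1
OK
AT+CMGS="+1234567890"
> Hello from the modem    (finish with Ctrl+Z, not Enter)
+CMGS: 12
OK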
Instead of hitting Enter after placing the PDU, use Ctrl+Z to send the message.
Credits to #user2543882

POST receiver (server)

I find myself in need of a way to test the POST requests made by my application.
I'm sending my request to http://macbook-pro.local/, and I'm trying to figure out what service I can run to read and display the request.
I looked into RestKit without any luck.
Using SignalR could work, but the port to macOS isn't working as expected.
What you want is basically a very simple web server, but if all you want is to print out what comes in an HTTP POST request, you can get away with using the built-in 'nc' command. You can use Terminal to print out the contents of incoming requests on local port 10000 by running 'nc' in a loop like this:
while true; do nc -l 10000 < /dev/null ; printf '\n\n\n'; done
You can then go to http://localhost:10000 in your browser and see the HTTP request appear in your Terminal window. The web browser will give an error message since 'nc' isn't smart enough to reply.
To test an HTTP POST request you can use 'curl':
curl --data "this-is-POST-data" http://localhost:10000
Again, curl will give an error message because 'nc' simply closes the connection without giving a proper HTTP reply. You can have 'nc' reply with a static HTTP response to all requests like this:
while true; do printf 'HTTP/1.0 200 OK\r\nContent-type: text/plain\r\n\r\nHello, world!' | nc -l 10000 ; printf '\n\n\n'; done
If you need to use port 80 you'll need to run 'nc' as root (e.g. using 'sudo').
If you need to have any kind of real HTTP traffic going on, however, you will need to get a proper web server. OS X comes with the Apache web server which can be started with the command "apachectl start" ("apachectl stop" to stop it). CGI is enabled so you can put executables into /Library/WebServer/CGI-Executables and access them using http://localhost/cgi-bin/filename. For example, if you create the following CGI script:
#!/bin/sh
printf 'Content-type: text/plain\r\n\r\n'
cat > /tmp/post-data
echo OK
call it, say, "test.sh" and place it in the CGI-Executables folder, and run:
chmod +x /Library/WebServer/CGI-Executables/test.sh
Then, whenever you send a POST request to http://localhost/cgi-bin/test.sh it will save the contents of the POST data to the file /tmp/post-data on your computer.
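A hypothetical test run against that script could look like this:
curl --data "name=value&x=1" http://localhost/cgi-bin/test.sh
cat /tmp/post-data
The curl call prints the script's "OK" response, and the cat shows the saved POST body (name=value&x=1).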
Note: in all examples, "localhost" can be replaced with "macbook-pro.local" for access over the network, if that is your computer's hostname.
Also note that your OS X firewall permissions may block 'nc' and other software from listening to TCP ports. Usually you should get a permission dialog, but if you simply get "permission denied", tweak your firewall settings in System Preferences -> Firewall -> Firewall Options.
Look at the SBJson framework.
These are sample lines you can write to parse the GET data:
SBJsonParser *parser = [[SBJsonParser alloc] init];
NSDictionary *dict = [parser objectWithData:urlData];
[dictionary setDictionary:dict];
[parser release];
These are sample lines you can write to POST data:
SBJsonWriter *writer = [[SBJsonWriter alloc] init];
jsonStr = [writer stringWithObject:dictionary];
[writer release];
There are many more methods in the framework to do useful stuff.

How do I execute an HTTP PUT in bash?

I'm sending requests to a third-party API. It says I must send an HTTP PUT to http://example.com/project?id=projectId
I tried doing this with PHP curl, but I'm not getting a response from the server. Maybe something is wrong with my code, because I've never used PUT before. Is there a way for me to execute an HTTP PUT from the bash command line? If so, what is the command?
With curl it would be something like
curl --request PUT --header "Content-Length: 0" http://website.com/project?id=1
but like Mattias said, you'd probably want some data in the body as well, so you'd want the Content-Type and the data too (plus the Content-Length would be larger).
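For instance, a sketch with a made-up JSON body (curl computes the Content-Length for you):
curl --request PUT --header "Content-Type: application/json" --data '{"name": "test"}' "http://website.com/project?id=1"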
If you really want to use only bash, it actually has some networking support.
echo -e "PUT /project?id=123 HTTP/1.1\r\nHost: website.com\r\n\r\n" > \
/dev/tcp/website.com/80
But I guess you also want to send some data in the body?
Like Mattias suggested, bash can do the job without further tools. If you want to send data, you have to set at least the Content-Length header. With the variables "host", "port", "resource", and "data" defined, you can do an HTTP PUT with
echo -e "PUT /$resource HTTP/1.1\r\nHost: $host:$port\r\nContent-Length: ${#data}\r\n\r\n$data\r\n" > /dev/tcp/$host/$port
I tested this with a REST API and it works fine.
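Putting it together, here is a minimal self-contained sketch (host, resource, and data are placeholder values) that also reads back the server's response over the same connection:
#!/bin/bash
# Placeholder values - adjust to the API you are calling.
host="website.com"
port=80
resource="project?id=1"
data='{"name": "test"}'

# Open a bidirectional TCP connection on file descriptor 3.
exec 3<>/dev/tcp/"$host"/"$port"

# Send the PUT request with a Content-Length that matches the body
# (${#data} counts characters, which equals bytes for an ASCII body).
printf 'PUT /%s HTTP/1.1\r\nHost: %s:%s\r\nContent-Length: %s\r\nConnection: close\r\n\r\n%s' \
  "$resource" "$host" "$port" "${#data}" "$data" >&3

# Print whatever the server sends back, then close the connection.
cat <&3
exec 3>&-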
