I want to put all arriving http.Requests into a queue and have a separate thread (goroutine?) process these requests and return the appropriate status.
However, the main HTTP request handler completes the response as soon as it returns, even when the http.Request object has been handed off asynchronously to a goroutine.
Is there a way to control when the http.Request is completed, so that it can be processed asynchronously?
[Update]
I want to implement a producer-consumer model. The main request handler produces the requests and puts them into a queue. A consumer goroutine (or several) will read these requests, consume their bodies, and complete the responses.
HTTP handlers are already executed in a separate goroutine per request, so if you are simply trying to free up the main serve loop, that isn't necessary.
If you are looking to serialize processing of requests, you could use a sync.Mutex and have your handlers lock on it. This would have a similar effect in that requests would be handled one at a time.
I don't think sync.Mutex is fair, so it may not meet your needs.
Also, if you want to be stateful between requests, this is probably not the right solution.
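For illustration, a minimal sketch of the mutex approach (the handler name and port are made up): each request still gets its own goroutine, but only one at a time gets past Lock().

package main

import (
	"fmt"
	"net/http"
	"sync"
)

// One shared mutex serializes the handler bodies.
var mu sync.Mutex

func serializedHandler(w http.ResponseWriter, r *http.Request) {
	mu.Lock()
	defer mu.Unlock()
	// Only one request is processed past this point at a time.
	fmt.Fprintln(w, "handled", r.URL.Path)
}

func main() {
	http.HandleFunc("/", serializedHandler)
	http.ListenAndServe(":8080", nil)
}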
As Jorge Marey mentioned, channels would work as well.
Though I'd suggest you look at golang.org/x/net/context, as it is a package specifically designed for multi-stage processing with timeouts and the like.
My guess is you will end up with a channel that passes structs that look like:
type Req struct {
	ctx context.Context
	w   http.ResponseWriter
	r   *http.Request
}
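To make that concrete, here's a rough sketch of that producer/consumer shape (the names are made up, and it uses the standard context package, which has since absorbed golang.org/x/net/context). The key point is that the handler only returns, and therefore only completes the response, after the consumer signals it is done:

package main

import (
	"context"
	"fmt"
	"net/http"
)

type Req struct {
	ctx  context.Context
	w    http.ResponseWriter
	r    *http.Request
	done chan struct{} // closed by the consumer when the response has been written
}

var queue = make(chan Req, 100) // buffered queue of pending requests

func consumer() {
	// A single consumer processes queued requests one at a time.
	for req := range queue {
		// Read the request body and write the response here.
		fmt.Fprintln(req.w, "processed", req.r.URL.Path)
		close(req.done)
	}
}

func handler(w http.ResponseWriter, r *http.Request) {
	req := Req{ctx: r.Context(), w: w, r: r, done: make(chan struct{})}
	queue <- req
	<-req.done // keep the handler (and therefore the response) open until consumed
}

func main() {
	go consumer()
	http.HandleFunc("/", handler)
	http.ListenAndServe(":8080", nil)
}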
Related
I'm writing a Slack bot with Go and AWS Lambda. Slack requires the bot to reply within 3 seconds. However, sometimes I can't make it reply that fast, because it's "talking" to other serverless applications to request data or dispatch tasks. I have never worked with goroutines before, but I was hoping I could implement something like this:
Lambda receives a request
The bot creates a goroutine that will process this request and act accordingly on it
The handler doesn't wait for all these actions to complete but replies right away with 200.
Lambda continues to run until the goroutine is finished.
I'm not sure if that's even possible.
I've read about sync.WaitGroup, but I'm not sure how to use it together with the main function. Should I use it inside the handler? But I need to return a response, and that's not something I can wrap in a goroutine.
Ideally, I would like the handler to reply right away and then let the goroutine run in the background.
Don't try to do anything in your Lambda handler after the request finishes; Lambda may freeze the execution environment as soon as the handler returns, so a background goroutine is not guaranteed to run to completion.
A more reliable approach:
Accept the call and record whatever input data is needed.
Put the data in SQS
Respond with HTTP 200
Another (SQS-triggered) function does the processing and, if needed, calls Slack back on the recorded response_url. A sketch of the first Lambda is below.
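For illustration, a rough sketch of that first Lambda using the aws-sdk-go v1 SQS client and an API Gateway-style event (the queue URL is a placeholder and error handling is minimal; adjust the event type to however Slack actually reaches your function):

package main

import (
	"context"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/sqs"
)

// Placeholder; use your real queue URL.
const queueURL = "https://sqs.REGION.amazonaws.com/ACCOUNT_ID/your-queue"

var sqsClient = sqs.New(session.Must(session.NewSession()))

func handler(ctx context.Context, req events.APIGatewayProxyRequest) (events.APIGatewayProxyResponse, error) {
	// Record the raw Slack payload in SQS for a separate worker function to process.
	_, err := sqsClient.SendMessageWithContext(ctx, &sqs.SendMessageInput{
		QueueUrl:    aws.String(queueURL),
		MessageBody: aws.String(req.Body),
	})
	if err != nil {
		return events.APIGatewayProxyResponse{StatusCode: 500}, err
	}

	// Reply immediately so Slack gets its response within 3 seconds.
	return events.APIGatewayProxyResponse{StatusCode: 200}, nil
}

func main() {
	lambda.Start(handler)
}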
Hi guys, I'm moving from Python 3 to Go, so I'm trying to rewrite a library I've created to get better performance.
I'm facing a problem because I'm a noob in Go. I'm using a limited API to download hundreds of JSONs, and I want to make as few requests as I can.
While downloading those JSONs, some of the URLs are duplicated. My first idea was to pass a map[stringLink]*myJsonReceived between my downloading goroutines; before downloading, each goroutine checks whether the link is already being processed by another one. Instead of requesting it again and wasting bandwidth and API calls, it should just wait for the other goroutine to finish downloading it and get it from the map.
I have a few options:
1) The goroutine checks whether the link is already in the map; if so, it polls every 0.05s to see whether the pointer in the map is still nil or already contains the JSON. (Probably the worst way, but it works.)
2) Change the map passed between goroutines to map[stringlink]chan myjson. It's probably the most efficient way, but I have no idea how to send a single message on a channel and have it received by multiple waiting goroutines.
3) I can use option (2) with a counter added to the struct: each time a goroutine finds that the URL is already requested, it adds 1 to the counter and awaits the response from the channel; when the downloading goroutine completes, it sends X messages to the channel. But this way I would have to add too many locks around the map, which wastes performance.
Note: I need the map at the end of execution to save the downloaded JSONs into my database so I never download them again.
Thank you all in advance for your help.
What I would do to solve your task is use a goroutine pool. There would be a producer which sends URLs on a channel, and the worker goroutines would range over this channel to receive URLs to handle (fetch). Once a URL is "done", the same worker goroutine could save it into the database, or deliver the result on a result channel to a "collector" goroutine which could do the saving sequentially, should that be a requirement.
This construction by design makes sure every URL sent on the channel is received by only one worker goroutine, so you do not need any other synchronization (which you would need if you used a shared map). For more about channels, see What are golang channels used for?
Go favors communication between goroutines (channels) over shared variables. Quoting from Effective Go: Share by communicating:
Do not communicate by sharing memory; instead, share memory by communicating.
For an example of how you can create worker pools, see Is this an idiomatic worker thread pool in Go?
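To illustrate, here's a rough sketch of that shape (the worker count, URLs, and result struct are made up; real code would add timeouts, retries, and the actual database save in the collector). The producer deduplicates before sending, and only the collector touches the final map, so no locking is needed:

package main

import (
	"fmt"
	"io"
	"net/http"
	"sync"
)

type result struct {
	url  string
	body []byte
	err  error
}

func worker(urls <-chan string, results chan<- result, wg *sync.WaitGroup) {
	defer wg.Done()
	for u := range urls {
		resp, err := http.Get(u)
		if err != nil {
			results <- result{url: u, err: err}
			continue
		}
		body, err := io.ReadAll(resp.Body)
		resp.Body.Close()
		results <- result{url: u, body: body, err: err}
	}
}

func main() {
	urls := []string{"https://example.com/a.json", "https://example.com/b.json", "https://example.com/a.json"}

	jobs := make(chan string)
	results := make(chan result)

	var wg sync.WaitGroup
	for i := 0; i < 4; i++ { // 4 workers; tune to your API limits
		wg.Add(1)
		go worker(jobs, results, &wg)
	}

	// Producer: deduplicate before sending, so each URL is fetched exactly once.
	go func() {
		seen := make(map[string]bool)
		for _, u := range urls {
			if !seen[u] {
				seen[u] = true
				jobs <- u
			}
		}
		close(jobs)
		wg.Wait()
		close(results)
	}()

	// Collector: the only goroutine that touches the final map.
	downloaded := make(map[string][]byte)
	for r := range results {
		if r.err != nil {
			fmt.Println("fetch failed:", r.url, r.err)
			continue
		}
		downloaded[r.url] = r.body
	}
	fmt.Println("downloaded", len(downloaded), "unique URLs")
}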
Can I pass the requestBody(), headers(), or anything else I retrieve from a finished OkHttp Call<> around to other threads, or is it necessary to copy the relevant data first?
You can pass the RequestBody to another thread, but only one thread is allowed to read the body. If multiple threads attempt to read it, you’re going to have a bad time.
Request and response headers are immutable.
Is it more idiomatic to have an async API, with a blocking function as the synchronous API that simply calls the async API and waits for an answer before returning, rather than providing a non-concurrent API and letting the caller run it in their own goroutine if they want it async?
In my current case I have a worker goroutine that reads from a request channel and sends the return value down the response channel (which it got in a request struct from the request channel).
This seems to differ from the linked question, since I need the return values, or need to synchronize so that I can be sure the API call finishes before I do something else, to avoid race conditions.
For Go, I recommend the concurrency section of Effective Go. In particular, I think everyone using Go needs to know the basics of goroutines and parallelization:
Goroutines are multiplexed onto multiple OS threads so if one should block, such as while waiting for I/O, others continue to run. Their design hides many of the complexities of thread creation and management.
The current implementation of the Go runtime dedicates only a single core to user-level processing. An arbitrary number of goroutines can be blocked in system calls, but by default only one can be executing user-level code at any time.
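As a concrete example of the pattern described in the question, here is a minimal sketch of a blocking synchronous wrapper over a request/response-channel worker (the names and the computation are placeholders):

package main

import "fmt"

// request carries the input and a channel for the worker to send the result back on.
type request struct {
	input int
	resp  chan int
}

var requests = make(chan request)

// worker is the asynchronous side: it reads requests and replies on each request's channel.
func worker() {
	for req := range requests {
		req.resp <- req.input * 2 // placeholder computation
	}
}

// Double is the blocking, synchronous wrapper: it sends a request and waits for the answer.
func Double(n int) int {
	resp := make(chan int, 1)
	requests <- request{input: n, resp: resp}
	return <-resp
}

func main() {
	go worker()
	fmt.Println(Double(21)) // prints 42
}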
Just starting out with Go and hoping to create a simple Web API. I'm looking into using Gorilla mux (http://www.gorillatoolkit.org/pkg/mux) to handle web requests.
I'm not sure how best to use Go's concurrency options to handle the requests. Did I read somewhere that the main function is actually a goroutine, or should I dispatch each request to a goroutine as it is received? Apologies if I'm "way off".
Assuming you're using Go's http.ListenAndServe to serve your HTTP requests, the documentation clearly states that each incoming connection is handled in a separate goroutine for you: http://golang.org/pkg/net/http/#Server.Serve
You would usually call ListenAndServe from your main function.
Gorilla mux is simply a package for more flexible routing of requests to your handlers than the http.DefaultServeMux. It doesn't actually handle the incoming connection or request; it simply relays it to your handler.
I highly suggest you read a bit of the documentation, specifically this guide https://golang.org/doc/articles/wiki/#tmp_3 on writing web applications.
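For illustration, a minimal sketch with gorilla/mux (the route and port are made up); note there is no explicit goroutine code, because the server already runs each request's handler in its own goroutine:

package main

import (
	"fmt"
	"log"
	"net/http"

	"github.com/gorilla/mux"
)

func itemHandler(w http.ResponseWriter, r *http.Request) {
	// This handler is already running in its own goroutine; just write the response.
	vars := mux.Vars(r)
	fmt.Fprintf(w, "item %s\n", vars["id"])
}

func main() {
	r := mux.NewRouter()
	r.HandleFunc("/items/{id}", itemHandler).Methods("GET")

	// ListenAndServe accepts connections and serves each one on its own goroutine.
	log.Fatal(http.ListenAndServe(":8080", r))
}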
I'm providing an answer even though I voted to close for being too broad.
Anyway, none of that is really necessary; you're overthinking it. If you haven't read this, it looks like a decent tutorial: http://thenewstack.io/make-a-restful-json-api-go/
You can really just set up routes like you would with most typical REST frameworks and let the web server/framework worry about concurrency at the request-handling level. You would only employ goroutines to generate the response of a request, say if you needed to aggregate data from 10 files that are all in a folder. It's a contrived example, but that is where you would spin off one goroutine per file, aggregate all the information by reading off a channel in a non-blocking select, and then return the result (see the sketch below). You can expect all entry points to your code to be called in an asynchronous, non-blocking fashion, if that makes sense...
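To make that contrived example concrete, here is a rough sketch (the file paths are placeholders; for simplicity it collects results over a buffered channel with a WaitGroup rather than a select loop):

package main

import (
	"fmt"
	"net/http"
	"os"
	"sync"
)

func aggregateHandler(w http.ResponseWriter, r *http.Request) {
	files := []string{"data/a.json", "data/b.json", "data/c.json"} // placeholder paths

	type part struct {
		name string
		data []byte
		err  error
	}
	results := make(chan part, len(files))

	// One goroutine per file.
	var wg sync.WaitGroup
	for _, f := range files {
		wg.Add(1)
		go func(name string) {
			defer wg.Done()
			data, err := os.ReadFile(name)
			results <- part{name: name, data: data, err: err}
		}(f)
	}
	wg.Wait()
	close(results)

	// Aggregate everything before writing the response.
	total := 0
	for p := range results {
		if p.err != nil {
			http.Error(w, p.err.Error(), http.StatusInternalServerError)
			return
		}
		total += len(p.data)
	}
	fmt.Fprintf(w, "aggregated %d bytes from %d files\n", total, len(files))
}

func main() {
	http.HandleFunc("/aggregate", aggregateHandler)
	http.ListenAndServe(":8080", nil)
}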