I have a Spring controller with a request-mapped method.
@RequestMapping(value = { "/doSomething.do" })
public ModelAndView emailToNonBelievers() {
    // .....
    // do something which takes lot of time
    // send response..
    return modelAndView;
}
This method takes a very long time, about an hour. (It is for an admin, not users, and the admin doesn't need to wait an hour; fire and forget is fine. This is not a batch job.)
But I found that the client sends the request repeatedly at a 3-minute interval (I observed 6 retries and then stopped the Spring service).
I guess this is because the client didn't get any response from the server.
If my guess is right, the server should respond with something like "Your request is accepted, just shut up and wait!!".
But how can I send a response (200 OK) before the job finishes in Spring?
Or am I missing something?
In this situation it is recommended to use asynchronous task processing. Spring comes with out-of-the-box support for it via the @Async annotation. Consult my answer to a similar question for a detailed setup, and see the docs here.
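As a minimal sketch of that setup (assuming a standard Spring configuration; the service, method and view names below are illustrative, and each class would normally live in its own file), the controller hands the work to an @Async service method and returns immediately:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.Async;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.stereotype.Controller;
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.servlet.ModelAndView;

@Configuration
@EnableAsync
class AsyncConfig {
    // enables @Async processing; a custom TaskExecutor bean could be declared here
}

@Service
class EmailService {

    @Async
    public void emailNonBelievers() {
        // the hour-long work runs here, on a thread from the async executor
    }
}

@Controller
class EmailController {

    @Autowired
    private EmailService emailService;

    @RequestMapping(value = { "/doSomething.do" })
    public ModelAndView emailToNonBelievers() {
        emailService.emailNonBelievers();    // fire and forget: returns immediately
        return new ModelAndView("accepted"); // hypothetical view; the client gets 200 OK right away
    }
}

Note that the @Async method must be called from a different bean (here, the controller calling the service), otherwise the call bypasses Spring's proxy and runs synchronously.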
I have two Spring apps, and I send requests from the first app to the second one with Unirest, something like the code below; as you can see, I am using basic auth. So far it works. I suddenly had this thought: will this create a session on each request? If so, can I immediately end the session after the response is sent? I would rather not change the current implementation much.
HttpRequest jsonResponse = Unirest.get(beeUrl + "/getPresentById/" + memId + "/" + role)
        .basicAuth(bid.getPvalue(), bpwd.getPvalue())
        .header("Content-Type", "application/json");
Assume we have two REST services:
// a rest controller
@GetMapping
private List<Employee> getAllEmployees() {
    return employeeRepository.findAllEmployees();
}
// another controller
@GetMapping
private Flux<Employee> getAllEmployees() {
    return employeeRepository.findAllEmployees(); // suppose reactive db driver here
}
Is there any difference, from the client's web browser's point of view, between these two pieces of code?
Normal REST controller: suppose the server is going to return some 10,000 records. In this case the server waits for the database to return all the data, and only then is the data forwarded as the response. So you receive the whole response in one shot, and in the meantime the browser keeps loading a blank page, which is a bad experience.
Reactive controller: in a Spring WebFlux reactive controller there is a concept of back pressure. With back pressure there is an open connection between the server and the database, so whatever records have been received so far are continuously emitted as the response. Hence no blank screen and a better user experience.
Note: the connection between the server and the browser will remain open until all the data has been loaded in the browser.
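For the streaming behaviour to actually be visible in a browser, the reactive endpoint is typically exposed with a streaming media type such as server-sent events. A small sketch, assuming the reactive repository from the question (the controller name and "/employees" path are illustrative):

@RestController
public class EmployeeStreamController {

    @Autowired
    private EmployeeRepository employeeRepository; // reactive repository, as in the question

    @GetMapping(value = "/employees", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<Employee> getAllEmployees() {
        // each Employee is written to the response as soon as the reactive driver emits it
        return employeeRepository.findAllEmployees();
    }
}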
I'm new to .NET technology and have run into a problem. Currently I'm trying to build a REST API that does some long processing before sending the result to the client.
What I'm trying to achieve is: I would like to do some background processing after receiving the request from the client, but I would also like to send a response to the client right away.
In short, it would be something like this.
Client request -> handled by controller (doing some processing) -> send the response directly, ignoring the background work that is still running.
In Java, I can do this using a Runnable on a separate thread. How can I achieve this in a C# Web API?
Thank you.
In short, don't do this.
The job of an API is not to perform heavy-duty, long-running tasks.
You could simply let the API receive the request to perform something, then delegate that to another service. The API can then send a 200 response to show it received the request and maybe a URL to another resource which allows a user to track the progress.
The API needs to be available and responsive at all times. It needs to serve many users, and if several of them all request something that uses a lot of resources and takes a lot of time, chances are the API will simply go down and serve no one.
This is why you do not do such things in an API. Let other services do the heavy lifting.
Your API can call another async method and return a 200/OK response without waiting for the work to complete.
You can learn more about async programming in C#.
static async Task Main(string[] args)
{
    Console.WriteLine("coffee is ready");

    // start the long-running work without awaiting it (fire and forget)
    var toastTask = MakeToastWithButterAndJamAsync(2);

    async Task<Toast> MakeToastWithButterAndJamAsync(int number)
    {
        // Do the long-running work here; 'Toast' stands in for whatever result type it produces.
        await Task.Delay(1000);
        return new Toast();
    }
}
This can be achieved with a loosely coupled architecture by introducing a service bus or blob storage: once you receive the request in the Web API, save it to the blob/service bus and return an acknowledgement response from the Web API. From the service bus/blob storage, use a WebJob, Function, or Durable Function app to process the message, triggered by the event.
I have several OkHttp requests that I need to send, and even if there is no internet for some time, they eventually have to be delivered. It seems that WorkManager is a good API to handle this.
But how to integrate OkHttp requests and WorkManager?
And (second question) how can I get notifications?
I have recently used the same approach you are looking for.
You can do it like this:
@NonNull
@Override
public Result doWork() {
    // DO OKHTTP REQUEST HERE........
    return Result.success();
}
Note: do not touch the UI here (e.g. a progress bar); doWork() does not run on the UI thread.
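To make the integration concrete, here is a rough sketch, assuming the androidx.work and OkHttp dependencies are on the classpath (the class name and URL are illustrative only): a Worker performs the OkHttp call synchronously and returns retry() so that WorkManager re-runs it, with a network constraint so it only runs when there is connectivity.

import android.content.Context;
import androidx.annotation.NonNull;
import androidx.work.Constraints;
import androidx.work.NetworkType;
import androidx.work.OneTimeWorkRequest;
import androidx.work.WorkManager;
import androidx.work.Worker;
import androidx.work.WorkerParameters;
import java.io.IOException;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

public class SendRequestWorker extends Worker {

    public SendRequestWorker(@NonNull Context context, @NonNull WorkerParameters params) {
        super(context, params);
    }

    @NonNull
    @Override
    public Result doWork() {
        OkHttpClient client = new OkHttpClient();
        Request request = new Request.Builder()
                .url("https://example.com/api/ping") // hypothetical endpoint
                .build();
        try (Response response = client.newCall(request).execute()) {
            // retry later if the server did not accept the request
            return response.isSuccessful() ? Result.success() : Result.retry();
        } catch (IOException e) {
            // no network right now: WorkManager will retry with backoff
            return Result.retry();
        }
    }

    // Enqueue the work so it runs only when the device has connectivity.
    public static void enqueue(Context context) {
        Constraints constraints = new Constraints.Builder()
                .setRequiredNetworkType(NetworkType.CONNECTED)
                .build();
        OneTimeWorkRequest work = new OneTimeWorkRequest.Builder(SendRequestWorker.class)
                .setConstraints(constraints)
                .build();
        WorkManager.getInstance(context).enqueue(work);
    }
}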
I have a REST controller which accepts POST requests and returns statuses indicating whether the operations were successful or not. It works fine at 100 requests per second, as I have multiple underlying operations which at the end send the response.
There could be hundreds of users trying to send requests to the controller, so it is all done using a CompletableFuture and an HTTP async invoker. The problem happens when there are 1000 requests per second: the controller threads are exhausted, because there are already multiple threads processing multiple requests, all waiting for their futures to complete before sending the response.
How can I make my REST controller able to handle 1000 requests per second without breaking?
there are already multiple threads processing multiple requests, all waiting for their futures to complete before sending the response.
You can actually make your controllers asynchronous by making them return a CompletableFuture. Just chain the calls on the CompletableFuture returned by your service to convert it into an appropriate response, instead of using get() or join():
@RequestMapping
public CompletableFuture<ResponseEntity<…>> processRequest() {
    return myService.getStatusFuture()
            .thenApply(status -> convertToResponseEntity(status));
}
Of course, for this to work properly, your service must be truly asynchronous. If you are using @Async or submitting tasks with CompletableFuture.supplyAsync(), this just moves the problem from the HTTP thread pool to another thread pool.
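As a sketch of what a truly asynchronous service could look like (assuming, purely for illustration, that the status comes from a downstream HTTP call made with Spring's non-blocking WebClient from spring-webflux; the class name, Status type and URL are placeholders):

import java.util.concurrent.CompletableFuture;
import org.springframework.stereotype.Service;
import org.springframework.web.reactive.function.client.WebClient;

@Service
public class MyService {

    private final WebClient webClient = WebClient.create("https://downstream.example.com");

    public CompletableFuture<Status> getStatusFuture() {
        // no controller thread is blocked while waiting for the downstream response
        return webClient.get()
                .uri("/status")
                .retrieve()
                .bodyToMono(Status.class)
                .toFuture();
    }
}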
It depends on the servlet container you are using. In the application.properties file, you can use the server.* properties to set the parameters you need.
In this link you can find these properties under the EMBEDDED SERVER CONFIGURATION section. If you are using the default embedded Tomcat server, check out the server.tomcat.* properties, especially server.tomcat.accept-count, server.tomcat.max-connections and server.tomcat.max-threads.
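For example, in application.properties (the values below are illustrative only and must be tuned to your hardware and workload; note that newer Spring Boot versions rename some of these keys, e.g. server.tomcat.max-threads became server.tomcat.threads.max):

server.tomcat.max-threads=400
server.tomcat.max-connections=10000
server.tomcat.accept-count=200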