How to delay the agent response in Dialogflow CX

I am using Dialogflow with the AudioCodes integration. How do I make the agent wait before responding, so that the end user can finish speaking?

You may want to try Dasha. If I understood you correctly, you can add a delay like this using DashaScript:
#waitForSpeech(1000);
https://docs.dasha.ai/en-us/default/dasha-script-language/built-in-functions/#waitforspeech-blocking-call

Related

API Gateway proxy integration: return the response of a second Lambda function

The premise is pretty simple.
The usual application flow is as follows:
1. API Gateway receives a request.
2. API Gateway triggers the Lambda function with parameters.
3. The Lambda function runs the logic.
4. The Lambda function's response is automatically forwarded to API Gateway as the response to step 1 (the received API request).
Here's the issue I'm having: I need to run two functions before returning the response to the received API request, and I need the return value of the second function to be the response sent back to the client in step 4.
There are more cases where this is necessary. In the future, we might need to run a few services (such as Lambda > Lambda > PostgreSQL > API response) before responding to the request.
Is there a way to receive a request from a client, run a host of tasks, assemble the necessary data, and then use this data as the response to the original API request? So far Step Functions seemed a likely solution, but I don't know if it can do this.
Until recently this would've been a pain with Step Functions, but around re:Invent last year AWS announced the ability to orchestrate synchronous Express Workflows: https://aws.amazon.com/blogs/compute/new-synchronous-express-workflows-for-aws-step-functions/
IMO, this is the best and easiest way to implement what you're looking for.
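For illustration, here is a minimal sketch of a wrapper Lambda behind API Gateway that starts such a workflow synchronously and returns its output as the API response. It assumes the AWS SDK for JavaScript v3; the STATE_MACHINE_ARN variable and the input shape are hypothetical:

const { SFNClient, StartSyncExecutionCommand } = require('@aws-sdk/client-sfn');

const sfn = new SFNClient({});

exports.handler = async (event) => {
    // Start the Express workflow and block until its final state
    // (the second function) has finished.
    const result = await sfn.send(new StartSyncExecutionCommand({
        stateMachineArn: process.env.STATE_MACHINE_ARN, // hypothetical: the workflow chaining your two Lambdas
        input: JSON.stringify({ body: event.body }),
    }));
    // result.output is the output of the last state, i.e. the second
    // function's return value.
    return { statusCode: 200, body: result.output };
};

API Gateway can also call StartSyncExecution directly through an AWS service integration, in which case no wrapper function is needed at all.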

How to configure a rate limit for a specific API action based on the server response?

Is there a way to configure a rate limit rule for a specific API action in Azure API Management? The desired behavior is to return HTTP 429 (Too Many Requests) if the end user receives a certain response from the backend after X attempts.
See increment-condition on the rate-limit-by-key policy: https://learn.microsoft.com/en-us/azure/api-management/api-management-access-restriction-policies#LimitCallRateByKey
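For illustration, an inbound policy along these lines counts a call against the limit only when the backend response matches your condition; the limits, counter key, and status code below are placeholders:

<rate-limit-by-key
    calls="5"
    renewal-period="60"
    counter-key="@(context.Subscription.Id)"
    increment-condition="@(context.Response.StatusCode == 400)" />

Once the limit is exceeded, APIM rejects further calls with 429 for the rest of the renewal period.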
In Azure APIM, rate limiting is keyed on a key or subscription, so rate limiting based on the backend response over a number of attempts is not possible out of the box.
It is also not good practice to keep track of response status between calls.
You could instead make use of the retry policy to inspect the response from the backend service: https://learn.microsoft.com/en-us/azure/api-management/api-management-advanced-policies#Retry. You can then use the retry result to send back a 429 response.
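A rough sketch of that approach; the 500 status, retry settings, and section placement are assumptions on my part:

<!-- backend section: retry the forward when the backend returns 500 -->
<retry condition="@(context.Response.StatusCode == 500)" count="3" interval="1">
    <forward-request buffer-request-body="true" />
</retry>

<!-- outbound section: translate a still-failing response into a 429 -->
<choose>
    <when condition="@(context.Response.StatusCode == 500)">
        <return-response>
            <set-status code="429" reason="Too Many Requests" />
        </return-response>
    </when>
</choose>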

Multiple subscribe not working with enableProtocol:true

I am using Atmosphere Jersey with the Redis broadcaster.
When I set enableProtocol: true in the JavaScript client, the first subscribe request succeeds.
But when I send the next subscribe request, I get a Continuation Frame warning. I tested on Google Chrome. I have attached a snapshot.
What could the issue be?
It works when I set enableProtocol: false, but then onDisconnect is not called with long-polling.
After some observation, I found that X-Atmosphere-tracking-id=0 in the first request, while each subsequent request carries the tracking id of the previous request.
How do I avoid this?
You can try manually resetting the tracking id to 0 before each subscribe:
$.atmosphere.uuid = 0
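For example, a small wrapper along these lines (assuming the atmosphere-jquery client; the URL and transports are placeholders):

var socket = $.atmosphere;

function subscribeFresh() {
    // Reset the tracking id so the next subscribe starts from 0 instead of
    // reusing the id of the previous request.
    $.atmosphere.uuid = 0;
    return socket.subscribe({
        url: '/atmosphere/chat',           // placeholder
        transport: 'websocket',
        fallbackTransport: 'long-polling',
        enableProtocol: true
    });
}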

In Heroku, is there a way to add multiple HTTP post hooks?

I am currently using Heroku's HTTP post hook add-on to send a message to my company's chat client, but now I want to do more with it. Is it possible to add multiple HTTP post hooks, so that more than one service can be notified when a deploy happens? (No, I don't want to receive emails.)
Thanks!
-Doug
My friend Jared made an app for this: https://github.com/deadlyicon/deploy-hook-forker
Not right now. A workaround would be a tiny app that receives your HTTP post hook and triggers all the HTTP hooks you need.
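A minimal sketch of such a forwarder, assuming Node with Express and the built-in fetch of Node 18+; the endpoint path and target URLs are placeholders:

const express = require('express');
const app = express();
app.use(express.urlencoded({ extended: true })); // Heroku posts form-encoded params

const targets = [
    'https://chat.example.com/hook',    // placeholder
    'https://other.example.com/hook',   // placeholder
];

app.post('/deploy', (req, res) => {
    // Fan the deploy notification out to every downstream hook.
    Promise.allSettled(targets.map((url) =>
        fetch(url, {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(req.body),
        })
    )).then(() => res.sendStatus(200));
});

app.listen(process.env.PORT || 3000);

Point Heroku's post hook at the /deploy endpoint of this app, and add new targets to the list as needed.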

How can I cancel a request in Django?

I have written search for my site, and now I am trying to make it search as the user types. So I am sending many AJAX requests, each containing different text to search for, one after another, and every new request has to wait until the previous one is finished. I don't need the old requests to be answered; I only need the response to the last request.
How can I kill the queue of stale requests in Django?
Does anybody know the answer?
On the server side, it's probably too late to cancel requests, but you can ignore the responses on the client side. I would suggest aborting a pending AJAX request before sending a new one.
Here is how:
Abort Ajax requests using jQuery
An easier way could be to wait a bit before sending your request to the server: after each input, set a timer that cancels the previous one (setTimeout) and only send the request once the timeout elapses.
If a request has already been sent and has not returned, you can still abort it as suggested in the other answer.
I'm not aware of any way to stop other requests from within Django, and I hope it's not even possible: it would be a security threat if requests could be killed by others.
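A sketch combining both client-side suggestions (debounce the input, then abort any stale in-flight request), assuming jQuery; the selector, endpoint, and delay are placeholders:

var pending = null;
var timer = null;

$('#search').on('input', function () {
    var query = $(this).val();
    clearTimeout(timer);                   // restart the debounce window
    timer = setTimeout(function () {
        if (pending) {
            pending.abort();               // drop the stale in-flight request
        }
        pending = $.ajax({
            url: '/search/',               // placeholder
            data: { q: query }
        }).done(function (results) {
            pending = null;
            // render results here
        });
    }, 300);
});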
