Terminating Dialogflow Application from Custom Client - go

I created a custom Golang server to handle Dialogflow's fulfillment. I want my fulfillment server to tell Dialogflow (which would be running the compiled version on a Google Home) to terminate my action after a certain period of inactivity. Is this possible in the current architecture?

To mark the end of an Action and close it, you can return false for the expectUserResponse field, like:
{
  "payload": {
    "google": {
      "expectUserResponse": false,
      "richResponse": {
        "items": [
          {
            "simpleResponse": {
              "textToSpeech": "Goodbye!"
            }
          }
        ]
      }
    }
  }
}
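In a custom Go server, this is just the JSON body your fulfillment handler writes back. A minimal sketch of such a handler, assuming plain net/http and hand-rolled response types rather than any official SDK (the route and type names are illustrative):

package main

import (
    "encoding/json"
    "net/http"
)

// Only the fields needed to close the conversation are modeled here.
type googlePayload struct {
    ExpectUserResponse bool `json:"expectUserResponse"`
    RichResponse       struct {
        Items []map[string]interface{} `json:"items"`
    } `json:"richResponse"`
}

type webhookResponse struct {
    Payload struct {
        Google googlePayload `json:"google"`
    } `json:"payload"`
}

func fulfillment(w http.ResponseWriter, r *http.Request) {
    var res webhookResponse
    // Setting expectUserResponse to false tells the Assistant to end the conversation.
    res.Payload.Google.ExpectUserResponse = false
    res.Payload.Google.RichResponse.Items = []map[string]interface{}{
        {"simpleResponse": map[string]string{"textToSpeech": "Goodbye!"}},
    }
    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(res)
}

func main() {
    http.HandleFunc("/fulfillment", fulfillment)
    http.ListenAndServe(":8080", nil)
}

Tracking the inactivity period itself (e.g. with a per-conversation timer) is up to your server; when it expires, the handler can return this closing payload on the next request it receives.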

Related

Pass data to start of session (IVR)

Looking into starting a project using Dialogflow CX. Seems rather promising, but I have one issue I cannot seem to find an answer for. The agent will be connected to via IVR (from Flex/call center). I need to gather some information at the start so that I can identify the hotel/property that will be referenced in the conversation. I found session parameters, but those are scoped to the session from start to finish and are not passed in at the start of a session. We are starting with about 60 properties, and when the agent starts, it needs to "know" which property it is dealing with.
Another quick question - will I need a separate telephony integration number to run multiple concurrent instances?
I am really new to all this so my language may be off. Thanks in advance!!
Robert
Passing parameters to Dialogflow CX depends on your integration, but a general way to do this is to call a webhook at the Welcome node, with the parameters you want set to null, and have the backend return them with updated values.
Return the response from the webhook in the following form and it will work:
Link to Google Cloud Functions Webhook Example
{
  "fulfillment_response": {
    "messages": [
      {
        "text": {
          "text": [
            "Test Response"
          ]
        }
      }
    ]
  },
  "session_info": {
    "session": "projects/project-name/locations/your-agent-location/agents/a6ed61ef-008b-49bb-8526-d8e68982d2b4/sessions/d3bd9c-9d9-8c8-4a8-f623bdb25",
    "parameters": {
      "property1": 746932,
      "property2": 34123
    }
  }
}
Python Flask Solution:
from flask import Flask, request

app = Flask(__name__)

@app.route('/test', methods=['POST'])
def test():
    req = request.get_json()
    session_name = req['sessionInfo']['session']
    print(req)
    res = {
        "fulfillment_response": {
            "messages": [
                {
                    "text": {
                        "text": [
                            "Test Response"
                        ]
                    }
                }
            ]
        },
        "session_info": {
            "session": session_name,
            "parameters": {
                "property1": 746932,
                "property2": 34123
            }
        }
    }
    return res

if __name__ == "__main__":
    app.run(port=5000, debug=True)

Use Postman to test Appsync Subscription

I have been able to successfully execute AppSync GraphQL queries and mutations from Postman. However, I'm struggling to connect to subscriptions, which are WebSocket URLs.
How can I achieve the same?
Since Postman supports WebSockets, testing GraphQL subscriptions is achievable as well. Such testing requires two steps:
connection to a server,
sending a start message.
Establishing a connection:
Create a new WebSocket request.
Enter your server URL (ws:// or wss://).
Add custom header parameter Sec-WebSocket-Protocol: graphql-ws. Other headers may depend on your server configuration.
Press the "Connect" button.
When the connection is established we may start a subscription.
In the "New message" field put the command.
Press the "Send" button.
The start message should look like this:
{
  "id": "1",
  "payload": {
    "operationName": "MySubscription",
    "query": "subscription MySubscription {
      someSubscription {
        __typename
        someField1
        someField2 {
          __typename
          someField21
          someField22
        }
      }
    }",
    "variables": null
  },
  "type": "start"
}
operationName is just the name of your subscription; I guess it's optional. And someSubscription must be a subscription type from your schema.
query resembles regular GraphQL syntax, with one difference: the __typename keyword precedes every field list.
For example, the query from the payload in regular syntax looks like the following:
subscription MySubscription {
  someSubscription {
    someField1
    someField2 {
      someField21
      someField22
    }
  }
}
Example message with parameters (variables):
{
  "id": "1",
  "payload": {
    "operationName": "MySubscription",
    "query": "subscription MySubscription($param1: String!) {
      someSubscription(param1: $param1) {
        __typename
        someField
      }
    }",
    "variables": {
      "param1": "MyValue"
    }
  },
  "type": "start"
}
It also resembles regular GraphQL syntax, as described above.
variables is an object with your parameters.
@Vladimir's answer is spot on. Adding a few notes for folks still having trouble.
Full documentation here: https://docs.aws.amazon.com/appsync/latest/devguide/real-time-websocket-client.html
Step 1 - establish connection:
Make sure to base64-encode the values in the "header" and "payload" query strings.
header example:
{
  "host": "example1234567890000.appsync-api.us-east-1.amazonaws.com",
  "x-api-key": "da2-12345678901234567890123456"
}
payload: you can pass in an empty payload:
{}
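If you want to build the connection URL programmatically rather than by hand, here is a rough Go sketch. The host and API key are the placeholders from above, and per the AWS doc linked earlier the realtime host typically replaces appsync-api with appsync-realtime-api:

package main

import (
    "encoding/base64"
    "fmt"
)

func main() {
    // Authorization header JSON from the example above (placeholder values).
    header := `{"host":"example1234567890000.appsync-api.us-east-1.amazonaws.com","x-api-key":"da2-12345678901234567890123456"}`
    payload := `{}` // an empty payload is accepted

    // Both query string values must be base64 encoded.
    h := base64.StdEncoding.EncodeToString([]byte(header))
    p := base64.StdEncoding.EncodeToString([]byte(payload))

    // Note the realtime host: appsync-realtime-api instead of appsync-api.
    url := fmt.Sprintf(
        "wss://example1234567890000.appsync-realtime-api.us-east-1.amazonaws.com/graphql?header=%s&payload=%s",
        h, p)
    fmt.Println(url) // paste this into Postman's WebSocket request URL
}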
Step 2 - register subscription:
Include the authorization in the message. Escape line feeds properly: "\n" throws an error, but "\\n" works; the error it throws (below) is misleading. Also, don't forget to stringify the value in the "data" field.
{
  "type": "error",
  "payload": {
    "errors": [
      {
        "errorType": "UnsupportedOperation",
        "message": "unknown not supported through the realtime channel"
      }
    ]
  }
}
{
  "id": "2",
  "payload": {
    "data": "{\"query\":\"subscription onCreateMessage { changeNotification{ __typename changeType from } }\",\"variables\":{}}",
    "extensions": {
      "authorization": {
        "host": "example1234567890000.appsync-api.us-east-1.amazonaws.com",
        "x-api-key": "da2-12345678901234567890123456"
      }
    }
  },
  "type": "start"
}
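If you are generating that start message from code instead of hand-escaping it, letting a JSON encoder produce the "data" string sidesteps both pitfalls (line feeds and stringification). A small Go sketch, reusing the placeholder query, host, and API key from the example above:

package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    // Inner GraphQL request: json.Marshal escapes quotes and line feeds,
    // yielding a valid value for the "data" field.
    inner, _ := json.Marshal(map[string]interface{}{
        "query":     "subscription onCreateMessage { changeNotification { __typename changeType from } }",
        "variables": map[string]interface{}{},
    })

    // Outer start message (host and x-api-key are placeholders).
    msg, _ := json.Marshal(map[string]interface{}{
        "id":   "2",
        "type": "start",
        "payload": map[string]interface{}{
            "data": string(inner), // stringified, as the example above shows
            "extensions": map[string]interface{}{
                "authorization": map[string]string{
                    "host":      "example1234567890000.appsync-api.us-east-1.amazonaws.com",
                    "x-api-key": "da2-12345678901234567890123456",
                },
            },
        },
    })

    fmt.Println(string(msg)) // paste into Postman's "New message" field
}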

Make an API call to service once the specific terraform module is destroyed using custom terraform provider

I am using a custom Terraform provider to make an HTTP API call to a service when a Terraform module is completed or destroyed.
I can successfully make the API call when module-a is provisioned: after it completes, custom_resource_complete is created, which makes the API call.
In the destroy scenario, I use the option -target="module.module-a" to destroy the module. I expected that once the module is destroyed it would also delete custom_resource_destroy, which would eventually make the API call. But what actually happens is that custom_resource_destroy is deleted first and only then is module-a destroyed, so I am unable to capture the event.
I tried other options like -target="custom_resource_destroy.module-a-destroy", but no luck; that destroys only this resource, not the module.
My requirement is that I need to make an API call to the service once the specific Terraform module is destroyed. Below is my Terraform main.tf (JSON syntax):
{
  "provider": {
    "aws": {},
    "customprovider": {
      "url": "some url"
    }
  },
  "module": {
    "module-a": {
      "source": "./module-a"
    }
  },
  "resource": {
    "custom_resource_complete": {
      "module-a-complete": {
        "output": "${module.module-a.output}"
      }
    },
    "custom_resource_destroy": {
      "module-a-destroy": {
        "output": "${module.module-a.output}"
      }
    }
  }
}
Custom provider Go snippet:
func resourceCustomResourceDestroy() *schema.Resource {
    return &schema.Resource{
        Create: resourceCustomResourceDestroyCreate,
        Read:   resourceCustomResourceDestroyRead,
        Update: resourceCustomResourceDestroyUpdate,
        Delete: resourceCustomResourceDestroyDelete,
    }
}

func resourceCustomResourceDestroyDelete(d *schema.ResourceData, m interface{}) error {
    // HTTP Request to service
}
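The Delete function is where that final call can live. A rough sketch (imports: fmt, net/http, strings), assuming the service only needs a POST with the module's output value; the endpoint path, the payload shape, and how the provider hands its configured URL to the resource (the meta value) are all placeholders, not part of the real provider:

func resourceCustomResourceDestroyDelete(d *schema.ResourceData, m interface{}) error {
    // Hypothetical: assumes the provider's ConfigureFunc returned the
    // configured URL as a plain string; adapt to your actual client/config.
    url := m.(string)

    body := strings.NewReader(fmt.Sprintf(`{"output": %q}`, d.Get("output").(string)))
    resp, err := http.Post(url+"/module-destroyed", "application/json", body) // hypothetical path
    if err != nil {
        return err
    }
    defer resp.Body.Close()

    if resp.StatusCode >= 300 {
        return fmt.Errorf("destroy notification failed: %s", resp.Status)
    }

    // Clearing the ID tells Terraform the resource has been deleted.
    d.SetId("")
    return nil
}

Note that this alone does not change the ordering problem described above: with -target="module.module-a", this resource depends on the module's output, so it is destroyed before the module itself.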

Why doesn't my MassTransit Consumer consume a message that exists in AzureServiceBus?

We're using MassTransit to build microservices on Azure Service Bus. Up until now we've only used topics (as opposed to queues), since (I guess) we had an easier time getting topics up and running.
I would like to use queues, but can't seem to get it to work. I managed to Send a message - no problem there, and I can see the message in the Azure Service Bus Explorer, but I just can't seem to make my consumer eat the message. The same consumer had no problem eating the message when we subscribed to a topic.
This is how I try to set up the consumer:
services.AddMassTransit(c =>
{
    c.AddConsumer<MyConsumer>();
    c.UsingAzureServiceBus((context, cfg) =>
    {
        cfg.Host(connectionString, host => { host.OperationTimeout = new System.TimeSpan(0, 0, 60); });
        cfg.ReceiveEndpoint("super-queue", e =>
        {
            e.ConfigureConsumer<MyConsumer>(context);
            e.RequiresSession = true;
        });
        // ...other configs
    });
});
services.AddMassTransitHostedService();
And this is what the message looks like in Azure ServiceBus Explorer
{
  "messageId": "00010000-0faa-0009-8bac-08d8f9a4cd7c",
  "conversationId": "00010000-0faa-0009-0077-08d8f9a4cd82",
  "sourceAddress": "sb://slvstmessaging.servicebus.windows.net/IT003477_iisexpress_bus_yyyoyyyxieyy19rjbdcxujf6nr?autodelete=300",
  "destinationAddress": "sb://slvstmessaging.servicebus.windows.net/super-queue",
  "messageType": [
    "urn:message:SLV.MassTransit.Meddelanden:IAnsokanInkommen"
  ],
  "message": {
    "korrelationsId": "a5cd9107-fb75-4b4a-a41e-ad729834d4d2",
    "ursprung": "Samtjänsten"
  },
  "sentTime": "2021-04-07T09:09:09.0369452Z",
  "headers": {
    "MT-Activity-Id": "00-978c69d24cc6e94798a901645e370f81-b5b1e47ba0c16e4b-00"
  },
  "host": {
    "machineName": "IT003477",
    "processName": "iisexpress",
    "processId": 54288,
    "assembly": "SLV.Samtjänst-API",
    "assemblyVersion": "1.0.0.0",
    "frameworkVersion": "5.0.1",
    "massTransitVersion": "7.1.4.0",
    "operatingSystemVersion": "Microsoft Windows NT 10.0.18363.0"
  }
}
And this is what the consumer looks like
public class MyConsumer : IConsumer<IAnsokanInkommen>
{
    public async Task Consume(ConsumeContext<IAnsokanInkommen> context)
    {
        Log.Information("Got it!");
    }
}
PS: IAnsokanInkommen is an interface in a shared NuGet package.

gactions test: caller has no permission

I am trying to set up custom commands on my Raspberry Pi 3 with the Google Assistant SDK. I was following this guide to set up my custom command. Whenever I run gactions test, I get the following error:
Pushing the app for the Assistant for testing...
ERROR: Failed to test the app for the Assistant
ERROR: The caller does not have permission
2019/05/16 17:50:23 Server did not return HTTP 200
I have used gactions update to upload my action definition JSON file, with the Google account that owns the Actions project, and it was updated successfully. Therefore, I'm not sure why I would have no permission on a project I own and with a successfully updated action JSON.
Here is the JSON I had for my custom action:
{
  "manifest": {
    "displayName": "DJ Roomba",
    "invocationName": "DJ Roomba",
    "category": "PRODUCTIVITY"
  },
  "actions": [
    {
      "description": "Thanos Snap",
      "name": "djroomba.name.ThanosSnap",
      "availability": {
        "deviceClasses": [
          {
            "assistantSdkDevice": {}
          }
        ]
      },
      "fulfillment": {
        "staticFulfillment": {
          "templatedResponse": {
            "items": [
              {
                "simpleResponse": {
                  "textToSpeech": "You should have gone for the head"
                }
              },
              {
                "deviceExecution": {
                  "command": "action.devices.commands.ThanosSnap"
                }
              }
            ]
          }
        }
      },
      "intent": {
        "name": "djroomba.intent.ThanosSnap",
        "trigger": {
          "queryPatterns": [
            "Thanos snap"
          ]
        }
      }
    }
  ],
  "locale": "en"
}
I'm not sure if this will help, but I'm using Raspbian Jessie (since Snowboy only supports up to that).
So, it turns out that I have to publish my action as at least an alpha or beta version, otherwise gactions test will not work.
