Does Azure LUIS support multiple intents in one message?

If I give the message "open the door and turn on the lights", I would like two intents to come out:
"OpenDoor"
"TurnOnLights"
Does LUIS support this?
Thanks

The short answer is "no, you cannot be sure of the number of valid intents found".
When you test a sentence, LUIS returns a score for every intent in your project; it does not return just one or two intents. What you do with these scores is what lets you decide whether one, two, or three valid intents are present.
For example, if I define a model with 2 intents:
Turn lights On, with the example "turn on the lights"
Open doors, with the example "open the door"
Then train and test:
{
    "query": "open the door and turn on the lights",
    "topScoringIntent": {
        "intent": "Turn lights On",
        "score": 0.9421587
    },
    "intents": [
        {
            "intent": "Turn lights On",
            "score": 0.9421587
        },
        {
            "intent": "Open doors",
            "score": 0.1412498
        },
        {
            "intent": "None",
            "score": 0.109745957
        }
    ],
    "entities": []
}
You may be surprised that the "Open doors" score is quite low even though the query contains that intent's example utterance.
In my experience with LUIS, you should not try to detect several intents in one query.
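That said, since LUIS scores every intent, you can inspect the full list yourself and apply a threshold. Here is a minimal Python sketch of that approach against the v2 prediction endpoint; the region, app ID, key, and the 0.5 threshold are placeholder assumptions, not values from the question.

import requests

# Assumed values: replace with your own region, app ID and key.
LUIS_ENDPOINT = "https://westeurope.api.cognitive.microsoft.com/luis/v2.0/apps/<app-id>"
LUIS_KEY = "<subscription-key>"

def predict_intents(query, threshold=0.5):
    """Return every intent whose score clears the threshold."""
    response = requests.get(
        LUIS_ENDPOINT,
        params={"subscription-key": LUIS_KEY, "q": query, "verbose": "true"},
    )
    response.raise_for_status()
    result = response.json()
    # LUIS scores *all* intents; we decide what counts as "valid".
    return [i for i in result["intents"] if i["score"] >= threshold]

print(predict_intents("open the door and turn on the lights"))

With the scores from the sample response above, a 0.5 threshold would still return only "Turn lights On", which illustrates the point that LUIS does not reliably split a multi-intent query.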

How can I get user response from adaptive card using Adaptive Cards Action.Submit action from MS Teams channel using Microsoft Bot Framework?

Here is my sample Adaptive Card with two buttons, Yes and No. Once the user clicks either button, I need to capture the response in the backend of the bot application, implemented using Microsoft Bot Framework 4.
{
    "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
    "type": "AdaptiveCard",
    "version": "1.2",
    "body": [
        {
            "type": "TextBlock",
            "text": "Does this information help you?"
        },
        {
            "type": "ActionSet",
            "actions": [
                {
                    "type": "Action.Execute",
                    "title": "Yes",
                    "verb": "personalDetailsFormSubmit",
                    "id": "surveyReplyYes",
                    "userIds": "",
                    "data": {
                        "key1": true,
                        "key2": "okay"
                    },
                    "fallback": {
                        "type": "Action.Submit",
                        "title": "Yes"
                    }
                },
                {
                    "type": "Action.Execute",
                    "title": "No",
                    "verb": "personalDetailsFormSubmit",
                    "id": "surveyReplyNo",
                    "userIds": "",
                    "data": {
                        "key1": false,
                        "key2": "np"
                    },
                    "fallback": {
                        "type": "Action.Submit",
                        "title": "No"
                    }
                }
            ]
        }
    ]
}
Every channel has additional requirements for this kind of scenario. For the MS Teams channel, the Adaptive Card needs a special property named "msteams" as an object inside the submit action's data property. Your Adaptive Card only contains a plain 'data' property, so change it a little and try it out.
Example:
{
    "type": "Action.Submit",
    "title": "Click me for messageBack",
    "data": {
        "msteams": {
            "type": "messageBack",
            "displayText": "I clicked this button",
            "text": "text to bots",
            "value": "{\"bfKey\": \"bfVal\", \"conflictKey\": \"from value\"}"
        },
        "extraData": {}
    }
}
Reference: Adaptive Cards in Teams
Essentially, your bot is a service waiting to be called by the user. When the user sends a regular text message, that comes into your bot as a "MessageActivity" event. However, if they click a button in an Adaptive Card, that arrives as an "InvokeActivity" event, so you can hook into that, check whether the user clicked one of your buttons, and respond appropriately. Here's an example of a bot doing that based on one of its cards. See in particular OnMessageActivityAsync vs OnInvokeActivityAsync (C# only - see below for Node).
Here's another very detailed blog on working with this, covering both DotNet and Node, from the Microsoft Bot Framework team. That post is a bit old, so it doesn't cover what you're using in your sample, which is quite new: Universal Actions. That is just a slightly newer way of specifying the JSON for the Action.
This is totally optional, but there's also a way to make the card buttons behave a little differently. For instance, when the user clicks a button you can make it appear as if the user typed that text into the bot. See here for more on that.
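If you happen to be on the Python SDK rather than C# or Node, here is a minimal sketch of the same idea. It assumes the botbuilder ActivityHandler and simply reads whatever payload the card submits from activity.value; the class name and reply text are illustrative, not from the question.

from botbuilder.core import ActivityHandler, TurnContext

class SurveyBot(ActivityHandler):
    async def on_message_activity(self, turn_context: TurnContext):
        # Card submissions arrive with the data payload in activity.value;
        # regular typed messages arrive with text only.
        submitted = turn_context.activity.value
        if submitted:
            await turn_context.send_activity(f"Card response received: {submitted}")
        else:
            await turn_context.send_activity("Please answer using the card buttons.")

Whether the click arrives as a message or an invoke depends on the channel and the action type, so in practice you would cover both paths, as described above.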

Thumb up/down satisfaction emoji UI on Bot Framework Composer with adaptive card

Going through the issues page of Bot Framework Composer, I stumbled on this issue, which shows an interesting UI card for getting the user's satisfaction:
Does this look like an adaptive card? How can we reproduce it?
The easiest thing to do is to go to the designer on Adaptivecards.io. They have a super simple Adaptive Cards Designer experience that is drag and drop.
On other tabs they have example templates which you can launch in the designer to see how they work and play with them.
In your example above, the card is simply a text field with two image links, and the images are clickable.
You can use the container and/or column components in the designer to lay out the items.
You can use a Hero card to get a user satisfaction response with thumbs up/thumbs down emoji:
[HeroCard
text = Are you satisfied?
buttons = 👍 | 👎
]
The following JSON is a sample for a thumbs up and down input choice:
{
    "type": "TextBlock",
    "text": "Do you like the product"
},
{
    "type": "Input.ChoiceSet",
    "style": "expanded",
    "isMultiSelect": false,
    "choices": [
        {
            "title": "👍",
            "value": "yes"
        },
        {
            "title": "👎",
            "value": "no"
        }
    ],
    "placeholder": "Placeholder text",
    "spacing": "None"
},
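For completeness, here is a hedged Python sketch of how a snippet like that could be wrapped in a complete card and sent from a Bot Framework bot. Only CardFactory.adaptive_card and MessageFactory.attachment come from the botbuilder SDK; the card body, the Action.Submit title, and the helper name are illustrative assumptions.

from botbuilder.core import CardFactory, MessageFactory, TurnContext

# Illustrative: the choice-set snippet above embedded in a full card.
SATISFACTION_CARD = {
    "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
    "type": "AdaptiveCard",
    "version": "1.2",
    "body": [
        {"type": "TextBlock", "text": "Do you like the product"},
        {
            "type": "Input.ChoiceSet",
            "id": "satisfaction",
            "style": "expanded",
            "choices": [
                {"title": "👍", "value": "yes"},
                {"title": "👎", "value": "no"},
            ],
        },
    ],
    "actions": [{"type": "Action.Submit", "title": "Send"}],
}

async def ask_for_feedback(turn_context: TurnContext):
    attachment = CardFactory.adaptive_card(SATISFACTION_CARD)
    await turn_context.send_activity(MessageFactory.attachment(attachment))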

How to make LUIS respond with the matched entity

I am setting up a LUIS service for Dutch.
I have this sentence:
Hi, ik ben igor -> meaning Hi, I'm igor
Here Hi is a simple entity called Hey that can have multiple different values such as (hey, hello, ..), which I specified as a list in the phrases.
And Igor is a simple entity called Name.
In the dashboard I can see that Igor has been correctly mapped as a Name entity, but the retrieved result is the following:
{
    "query": "Hi, ik ben igor",
    "topScoringIntent": {
        "intent": "Greeting",
        "score": 0.462906122
    },
    "intents": [
        {
            "intent": "Greeting",
            "score": 0.462906122
        },
        {
            "intent": "None",
            "score": 0.41605103
        }
    ],
    "entities": [
        {
            "entity": "hi",
            "type": "Hey",
            "startIndex": 0,
            "endIndex": 1,
            "score": 0.9947428
        }
    ]
}
Is it possible to solve this? I do not want to make a phrase list of all the names that exist.
I managed to train LUIS to even recognize asdaasdasd:
{
    "query": "Heey, ik ben asdaasdasd",
    "topScoringIntent": {
        "intent": "Greeting",
        "score": 0.5320666
    },
    "intents": [
        {
            "intent": "Greeting",
            "score": 0.5320666
        },
        {
            "intent": "None",
            "score": 0.236944184
        }
    ],
    "entities": [
        {
            "entity": "asdaasdasd",
            "type": "Name",
            "startIndex": 13,
            "endIndex": 22,
            "score": 0.8811139
        }
    ]
}
To be honest, I do not have a great guide on how to do this:
Add multiple example utterances with the entity labeled at its position in each example
I did this for about 5 utterances
No phrase list was necessary
I'm going to accept this as an answer, but once someone explains in depth and technically what is happening behind the covers, I will accept that answer.
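If you prefer to add those labeled example utterances programmatically instead of through the portal, a sketch along these lines should work. It targets the LUIS v2.0 authoring REST API's batch examples route; the region, app ID, version, and key are placeholders, and the exact route is worth double-checking against the current authoring docs.

import requests

AUTHORING_URL = (
    "https://westeurope.api.cognitive.microsoft.com"
    "/luis/api/v2.0/apps/<app-id>/versions/0.1/examples"
)  # assumed region, app ID and version
HEADERS = {"Ocp-Apim-Subscription-Key": "<authoring-key>"}

# Each example carries the intent plus the character span of the Name entity.
examples = [
    {
        "text": "Heey, ik ben asdaasdasd",
        "intentName": "Greeting",
        "entityLabels": [
            {"entityName": "Name", "startCharIndex": 13, "endCharIndex": 22}
        ],
    }
]

resp = requests.post(AUTHORING_URL, headers=HEADERS, json=examples)
print(resp.status_code, resp.json())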

Slack API - Create a button that returns text from a variable

I am posting a message (chat.postMessage) to Slack through Python and want to add a button feature. I want the button to provide a list of serial numbers that are represented by low2 = low["serials"]. This is the code I currently have; it adds the button to the Slack message, but when I click the button I get an error from slackbot saying "Oh no, something went wrong. Please try that again." I saw posts saying that most people have to create a bot to fix their problems with buttons, but if the button just has to read this variable, I assume there is a way around that. Thanks for the help!
"fields": [
{
"title": "Amount Used:",
"value": low1,
"short": 'true'
},{
"title": "Distinct Device ID's:",
"value": out1,
"short": 'true'
},
{
"title": "Total Connection Time (hr):",
"value": data2,
"short": 'true'
}
],
"actions": [
{
"name": "game",
"text": "Serials",
"type": "button",
"value": "serials",
}
],
No, there is no way around it. You must create a Slack App (or "Internal Integration") in order to use buttons in your message. One reason is that you need to tell Slack what URL to call when someone clicks a button (your "Action URL"), and that can only be configured as part of a Slack app. Check out this documentation on interactive messages for details.
Regarding your approach: a button will only display one value to the user. If your aim is to let the user choose from a list of serial numbers, you have two options in my opinion:
a) Create a group of buttons, one for each serial number
b) Use an interactive menu to create a drop-down menu for your list (see the sketch after this list)
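As a rough illustration of option b), here is a Python sketch that builds a legacy interactive-message attachment with a drop-down (select) menu from the serials list and posts it with slack_sdk. The channel, callback_id, and action name are assumptions; clicking still requires a Slack app with an Action URL configured, as noted above.

from slack_sdk import WebClient

client = WebClient(token="xoxb-...")  # your bot token

def post_serial_menu(channel, serials):
    # Legacy "interactive message" attachment with a select menu.
    attachment = {
        "text": "Pick a serial number",
        "callback_id": "serial_picker",  # illustrative id
        "actions": [
            {
                "name": "serial",
                "text": "Serials",
                "type": "select",
                "options": [{"text": s, "value": s} for s in serials],
            }
        ],
    }
    client.chat_postMessage(
        channel=channel, text="Serial numbers", attachments=[attachment]
    )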
I solved my problem by converting the confirm action on the button to display the values I wanted.
import json

with open('Count_BB_Serial_weekly.json', 'r') as lowfile:
    low = json.load(lowfile)

low1 = low["total_serials"]
low2 = low["serials"]
low3 = '\r\n'.join(low2)
Above is my script that imports the array and reads the values. Below I put the results into the "confirm" pop-up on the button.
],
"actions": [
    {
        "name": "game",
        "text": "Serials",
        "type": "button",
        "value": "serials",
        "confirm": {
            "title": "Serial Numbers",
            "text": low3,
            "ok_text": "Yes",
            "dismiss_text": "No"
        }
    }
],

Bot Framework named entities

In Google's api.ai, to process a sentence such as:
"What is John Doe's email?"
I create prebuilt entities called "given-name" and "last-name" to get the name "John Doe".
How do I do the same with the Microsoft Bot Framework/LUIS?
In MS LUIS you need to add utterances based on your questions, and assign the entity within each phrase.
You can refer to the links below:
http://aihelpwebsite.com/Blog/EntryId/4/Creating-Intelligent-Web-Applications-With-LUIS
https://learn.microsoft.com/en-us/azure/cognitive-services/luis/home
I hope this answer helps you.
You don't do it with the Bot Framework, not directly. The Bot Framework helps you build your conversation flow but doesn't come with built-in NLU. You are likely to use LUIS (also luis.ai), which it supports natively, and do your intent detection and entity extraction there. You can also consume your api.ai agent from the Bot Framework if you like. I did that to support a language that LUIS doesn't speak yet (more details - http://www.pveller.com/integrating-bot-framework-with-api-ai/)
UPDATE
Expanding on my comment. Here's how I approached extracting a contact entity in one of my bot prototypes. These are JSON snippets from the exported LUIS model:
"entities": [
{
"name": "Contact"
}
],
"model_features": [
{
"name": "Contact",
"mode": true,
"words": "John Smith,John Doe,Mary Jay,Robin Smith",
"activated": true
}
],
"utterances": [
{
"text": "please email to john smith and robin smith",
"intent": "Email",
"entities": [
{
"entity": "Contact",
"startPos": 16,
"endPos": 25
},
{
"entity": "Contact",
"startPos": 31,
"endPos": 41
}
]
}
]
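To round this out, here is a small hedged Python sketch of what consuming that model could look like at prediction time: it reuses the v2 prediction endpoint shape shown in the first answer above and simply pulls out the entities typed "Contact" from the response JSON. The endpoint, key, and helper name are assumptions, and the printed output only appears once the model above is trained and published.

import requests

LUIS_ENDPOINT = "https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/<app-id>"  # assumed
LUIS_KEY = "<subscription-key>"  # assumed

def extract_contacts(query):
    """Return the text of every entity LUIS tagged as 'Contact'."""
    result = requests.get(
        LUIS_ENDPOINT,
        params={"subscription-key": LUIS_KEY, "q": query},
    ).json()
    return [e["entity"] for e in result.get("entities", []) if e["type"] == "Contact"]

print(extract_contacts("please email to john smith and robin smith"))
# e.g. ['john smith', 'robin smith']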
