LUIS intents not relating to entities - azure-language-understanding

We want to create a chatbot with about 80 intents, where most of them are just questions with direct answers related to HR (benefits, payroll, etc.). It could be done with QnA Maker, but we decided to use LUIS with utterances so we have the option to query entities in the future if we need to.
I have tried to create some entities to lift the score of some intents. For example, we have a benefit question about signing up for sports activities in our company:
Utterances:
How can I enroll in soccer?
What steps do I have to register in tennis lessons?
...
So I have created two entities: one for "ActivityType" (soccer, tennis, etc.) and another for "Enrollment" (register, enrollment, etc.). The latter is more of a synonym variable, so I don't have to write that many utterances.
Then the utterances translate into:
How can I {Enrollment} in {ActivityType}
What steps do I have to {Enrollment} in {ActivityType} lessons?
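The `{Enrollment}`/`{ActivityType}` placeholders above correspond to LUIS Patterns backed by entities. As a rough sketch (entity names and synonyms are illustrative, loosely following the shape of a LUIS app-export JSON, where list entities live under `closedLists`), the two entities plus one pattern could look like:

```python
# Sketch of two list entities in the LUIS app-export JSON shape; the
# names, canonical forms, and synonym lists are illustrative.
activity_type = {
    "name": "ActivityType",
    "subLists": [
        {"canonicalForm": "soccer", "list": ["football", "futbol"]},
        {"canonicalForm": "tennis", "list": ["tennis lessons"]},
    ],
}
enrollment = {
    "name": "Enrollment",
    "subLists": [
        {"canonicalForm": "enroll", "list": ["register", "sign up", "inscribe"]},
    ],
}
# A single pattern template then covers many surface forms:
pattern = {"pattern": "how can I {Enrollment} in {ActivityType}", "intent": "EnrollActivity"}
```

Each new synonym added to a sublist extends coverage without adding a new utterance.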
These are my questions:
1) The "Enrollment" entity is used to avoid creating that many utterances; does it make sense here, or is there something better in LUIS for that?
2) I have tested the system and, in some questions, it picks up the two entities (Enrollment and ActivityType), which are only present in one specific intent, but then it ranks at the top another intent that doesn't have those entities at all. I would expect the entities to somewhat lift the probability of the intents that use them (in this case there is only one intent using them, so it should be pretty obvious, but it is not being selected).
Any help will be greatly appreciated :)

Related

Entity groups in Dialogflow CX

How would you approach a problem where a user can order multiple objects, each of which can have entities associated with it?
For example with a user utterance
"I want to order a large pizza with pepperoni and a small pizza with ham and pineapple"
I would want to
Recognise two distinct pizzas
The different size for each pizza
The topping associated with each pizza
I know Rasa has an option called entity groups which can handle this, but does Dialogflow CX? Or is it better to design a conversation flow that manages the conversation in a way that doesn't allow this sort of input?
You would have to use a form parameter in the page collecting your pizza order. You can see in the form parameter documentation that there's a boolean option for each parameter of the form named isList that collects multiple instances of a particular entity type you specify, which in your case I assume would be the pizza entity.
For the example you provided ("I want to order a large pizza with pepperoni and a small pizza with ham and pineapple"), you can use numerical indexes in the intent parameter names in the training phrase annotations.
The annotations may look like this:
Numerical indexes will allow you to understand how many pizzas were ordered.
Extracted parameter values for this annotation will look like this:
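Since the original screenshots are not available here, a hypothetical sketch of what the extracted values might look like for the two-pizza utterance (the parameter names `size1`, `topping1`, etc. are assumptions following the numerical-index convention described above):

```python
# Hypothetical extracted parameters for the two-pizza utterance; the
# indexed parameter names are illustrative, not the official schema.
params = {
    "size1": "large",
    "topping1": ["pepperoni"],
    "size2": "small",
    "topping2": ["ham", "pineapple"],
}
# Counting the indexed size keys tells you how many pizzas were ordered.
num_pizzas = sum(1 for k in params if k.startswith("size"))
```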
For this proof of concept, the fulfillment is defined as a conditional response:
You can define more sophisticated dynamic responses in your webhook.
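A minimal webhook-style sketch of such a dynamic response, assuming indexed parameter names like `size1`/`topping1` (these names are assumptions, not a fixed Dialogflow CX schema):

```python
# Minimal sketch of webhook fulfillment logic that builds a dynamic
# confirmation from indexed session parameters; names are assumptions.
def build_confirmation(params: dict) -> str:
    lines = []
    i = 1
    while f"size{i}" in params:
        size = params[f"size{i}"]
        toppings = " and ".join(params.get(f"topping{i}", []))
        lines.append(f"one {size} pizza with {toppings}")
        i += 1
    return "You ordered " + ", ".join(lines) + "."

msg = build_confirmation({
    "size1": "large", "topping1": ["pepperoni"],
    "size2": "small", "topping2": ["ham", "pineapple"],
})
```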
Note that if you opt for this approach, you will need to add multiple diverse training phrases and annotate all of them consistently. Check out the agent design best practices.
An alternative approach – collecting parameter values one by one via required form parameters – has various advantages:
you don't need to add and annotate training phrases
parameter value extraction may be more accurate.
In the form parameter prompts, you may need to instruct the end-user to respond with only one piece of information at a time.

What is the difference between a machine-learned entity with a list entity constraint and using a list entity by itself when working with LUIS NLU entities?

In the V3 API for building LUIS apps, I notice an emphasis on machine-learned entities. When working with them I noticed something that concerns me, and I was hoping to get more insight into the matter.
The idea is that when using a machine-learned entity, you can bind it to descriptors such as phrase lists, or to other entities or list entities as a constraint on that machine-learned entity. Why not just aim to extract the list entity by itself? What is the purpose of wrapping it in a machine-learned object?
I ask this because I have always had great success with lists. They are very controllable, albeit you need to watch for spelling mistakes and variations to ensure accuracy. However, when I use machine-learned entities, I notice you have to be more careful with word order. If there is a variation, the machine-learned entity may not be picked up.
Now, training would fix this, but in reality, if I know I have the intent I want and I just need the entities from it, what does the machine-learned entity really provide?
It seems you need to be more careful with it.
Now, I say this with a suspicion: would the answer lie in the fact that a machine-learned entity increases intent detection, whereas a list entity only serves to increase entity detection? If that is the answer that fits best, I think I can see the solution to what I am looking for.
EDITED:
I haven't been keeping up with LUIS ever since I went on maternity leave, and lo and behold, it's moving from V2 to V3!
The following is from an email conversation with a writer of the LUIS team's documentation.
LUIS is moving away from different types of entities toward a single ML entity to encapsulate a concept. The ML entity can have children which are ML entities themselves. An ML entity can have a feature directly connected to it, instead of acting as a global feature.
This feature can be a phrase list, or it can be another model such as a prebuilt entity, regex entity, or list entity.
So a year ago a customer might have built a composite entity and thrown features into the app. Now they should create an ML entity with children, and these children should have features.
Until now (before the //MS Build conference), any non-phrase-list feature could be a constraint (required), so a child entity with a constrained regex entity wouldn't fire until the regex matched.
At/after //Build, this concept has been reworked in the UI as a required feature – same idea, different terminology.
...
This is about understanding a whole concept that has parts, so an address is a typical example. An address has street number, street name, street type (street/court/blvd), city, state/province, country, postal code.
Each of the subparts is a feature (strong indicator) that an address is in the utterance.
If you used a list entity, but not as a required feature of the address, yes, it would trigger, but that wouldn't help the address entity, which is what you are really trying to get.
If, however, you really just want to match a list, go ahead. But when the customer says the app isn't predicting as well as they expected, the team will come back to this concept of the ML entity parent and its parts and suggest changes to the entities.
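The address decomposition described above can be sketched as an ML parent entity with children, each backed by a required feature. This is an illustrative sketch only, loosely following the V3 authoring JSON; the field names and model names are assumptions:

```python
# Illustrative sketch of an ML "Address" entity whose child parts each
# carry a feature (list or regex entity) marked as required.
address_entity = {
    "name": "Address",
    "children": [
        {"name": "StreetType",
         "features": [{"modelName": "StreetTypeList", "isRequired": True}]},
        {"name": "PostalCode",
         "features": [{"modelName": "PostalCodeRegex", "isRequired": True}]},
        {"name": "City",
         "features": [{"modelName": "CityPhraseList", "isRequired": False}]},
    ],
}
```

A required feature means the child only fires when its backing model matches, while the parent still learns the overall "address" concept from context.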

LUIS Intent prediction

I'm having trouble with a LUIS intent prediction. The utterance "consumo" should trigger the intent "1529_CONSULTAR_CONSUMO", but LUIS keeps assigning it to a wrong intent, even though the exact utterance is registered as an example for the right intent. How can I fix this?
The solution to this problem depends on your model, because the intent that LUIS associates with an utterance, and its score, can be affected by any entity or utterance added to any intent.
You can try :
- Look at the wrong intent "consumo" was associated with; if some of its utterances are similar to "consumo", maybe the two intents should be merged
- Create a list entity with "consumo" and other related values in the list

LUIS: How to have a generic entity that matches any word

Suppose I have an intent "findStuff" that takes things of the form
find xxx
list xxx where something = somevalue
find xxx where something = somevalue
Getting LUIS to understand that "xxx" can be any word seems hard. I defined a "plainWord" entity and a pattern feature with the same name and the value "\w+". I thought that used to work, but it doesn't seem to any more. It recognizes some words it has seen, but it can never seem to deal with "find junk" – "junk" is never recognized as any entity.
The system for which this is intended is open-ended. Users can add their own types of things that we may "find."...
How extensively trained is your model? You should update your model by labeling users' utterances. I recommend against using generalized entities like a "plainWord" entity; from your description, it sounds like this entity is supposed to be applied only to words that occur after "find" and "list". If an entity has not been labeled in many utterances, your model will not catch the words you want it to catch.
If you post your LUIS model I might be able to better help you. You can export the JSON model or provide your application ID to share it.
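For reference, labeling an utterance programmatically uses a JSON body along these lines. This is a sketch of the LUIS authoring example format; the intent and entity names come from the question, and the character indexes (which LUIS treats as inclusive) are illustrative:

```python
# Sketch of a labeled example utterance in the LUIS authoring JSON shape;
# "plainWord" is labeled over the characters of "junk".
labeled_example = {
    "text": "find junk",
    "intentName": "findStuff",
    "entityLabels": [
        {"entityName": "plainWord", "startCharIndex": 5, "endCharIndex": 8}
    ],
}
# Slicing with an exclusive end index recovers the labeled span:
span = labeled_example["text"][5:9]
```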

Microsoft LUIS: Prebuilt entities and Intent

If I have the prebuilt entity temperature, how can I match it to my intent AskTemperature?
In AskTemperature, I can't add the prebuilt entity.
So how do I make the temperature entity belong to the intent AskTemperature?
Thank you!!!
If you provide an example of the labeled utterances, the community may be better equipped to answer your question. That said... once you train your model, it should automatically assign the prebuilt entity to the recognized tokens (words, numbers, etc.).
After training your model, if this recognition doesn't occur, you may need to train with more straightforward utterances (e.g. "Is the temperature in Redmond right now 16 celsius?").
If all of this doesn't work, then posting relevant portions of your LUIS model may aid in getting help from the SO community.
After you have trained your model/intents, the prebuilt entities should be assigned to those utterances.
I have this utterance: are there room available on 12/21/2016?
When I tried to assign 12/21/2016 to the "datetimeV2" entity directly, I could not find that prebuilt entity; but after I trained the model, it showed up automatically.
Hope this helps!
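After training, a prediction for that utterance should include the prebuilt entity in the response. An illustrative sketch of a V2-style prediction response fragment (the intent name and score are made up; the datetimeV2 type string follows the builtin naming scheme):

```python
# Illustrative fragment of a LUIS V2 prediction response containing a
# prebuilt datetimeV2 entity; intent name and score are made up.
response = {
    "query": "are there room available on 12/21/2016?",
    "topScoringIntent": {"intent": "CheckAvailability", "score": 0.92},
    "entities": [
        {
            "entity": "12/21/2016",
            "type": "builtin.datetimeV2.date",
            "startIndex": 28,
            "endIndex": 37,
        }
    ],
}
```

The key point is that the prebuilt entity appears in the entities array of the response without ever being manually attached to the intent.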