Mapping LUIS Utterance with Intent - azure-language-understanding

Framework: Microsoft Bot Framework
I have a requirement in LUIS to get the entity details based on an intent.
Intent name: SingleWord
Entity name: Book
Attributes: Version, Book, Chapter, Word
Example Utterance: 2 turningpoints 3:1
Here:
2 - Version
turningpoints - Book
3 - Chapter
1 - Word
I am unable to map the utterance below to the SingleWord intent, as there is no space in between:
2turningpoints3:1
I have also tried the Phrase lists option, but no luck.
Thoughts?

The Patterns feature with the Patterns.any entity should fix this.
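For illustration only (the pattern below is an assumption based on the entity names in the question, and whether LUIS can split the unspaced string "2turningpoints3:1" into separate entities still needs to be verified in the portal), such a pattern could look like:
{Version}{Book}{Chapter}:{Word}
with Book created as a Pattern.any entity if the machine-learned model does not pick it up reliably.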

Related

LUIS utterance with example of words not to be considered

I am using LUIS in Composer, and I use a phrase like the one below to give examples for the ML entity "Location":
What is the weather in the {# location = London}
Is there a way in Composer to tell LUIS not to pick "moon" as a location in the utterance "what is the weather in the moon"?

How to do intent classification with similar examples in nlu.md in Rasa

I am developing a chatbot using Rasa for a Contract Manager Organisation. I am facing a few issues, and after reading a lot on the forums and the Rasa blog, I am unable to come to a solution. I have several similar intents with similar examples, like:
“inform_supplier_start_date” and “inform_contract_start_date”.
“inform_supplier_email” and “inform_customer_email” and “inform_reviewer_email”
Now the issue is that for both categories of intents, the example sentences in nlu.md are the same. What I mean exactly is:
## intent:inform_suppler_start_date
- what is the supplier [Microsoft](supplier_name) start date
- [EON Digital](supplier_name) start date
## intent:inform_contract_start
- start-date of [O2 Mobile phones](contract_name)
- [O2 Mobile phones](contract_name) start date
The model isn't able to differentiate and identify the correct intent. It is getting confused and picking the wrong intent, since the words in these intents are similar.
I need the correct intent to be recognised so that, accordingly, in a custom action I can query the database and get the corresponding result for supplier or contract.
I have many fields like this for which the example data and user queries will be the same. For example:
customer_email & supplier_email & reviewer_email
total_spend_contract & total_spend_supplier & total_spend_customer
contract_number_for_supplier & contract_number_of_contract & contract_number_organisation
What exactly should I be doing to get correct classification? One solution I am thinking of is merging intents like “supplier_start_date” and “contract_start_date” into one “start_date” intent and checking the extracted entity inside custom actions against both the supplier and contract databases. But I don't think that would be proper usage of natural language.
Please suggest; I shall be highly grateful. Regards.
As the examples for your intents are very similar, the model will not be able to differentiate between them. Also, the intent is actually the same: inform_suppler_start_date and inform_contract_start both inform the bot about a start date. What kind of start date it is should be figured out via entity recognition. So I would propose merging the similar intents and checking what the entity recognition detected as entities; see the sketch below. Depending on whether a supplier or a contract was found, you can execute query A or B.
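As a rough sketch in the nlu.md style used above (the merged intent name inform_start_date is an assumption), the training data could look like:
## intent:inform_start_date
- what is the supplier [Microsoft](supplier_name) start date
- [EON Digital](supplier_name) start date
- start-date of [O2 Mobile phones](contract_name)
- [O2 Mobile phones](contract_name) start date
The custom action can then branch on whether supplier_name or contract_name was extracted and query the corresponding database.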

LUIS - Can I have 2 languages (Chinese and English) in the same app and still have good results?

I am currently using MS LUIS for a chatbot.
Our country usually talks and chats using 2 languages, English and Chinese.
However, in LUIS I can only define one culture.
As a result, when my culture is set to English and I import Chinese text, the confidence level is very low (e.g. English - 0.88, Chinese - 0.1). The other way round is the same.
The situation is the same even after I tokenized the Chinese text using a library like JieBa or THULAC.
Therefore, when I tested, it was very easy to fall into an unrelated intent.
I would like to make LUIS recognize both English AND Chinese easily. Is there any way to solve this problem?
Thank you very much for your help.
I would like to make LUIS recognize both English AND Chinese easily. Is there any way to solve this problem?
Yes, the way is to separate your LUIS apps/projects, one for each language, and use language detection before calling LUIS.
That's the official approach from LUIS docs (see here):
If you need a multi-language LUIS client application such as a chat bot, you have a few options. If LUIS supports all the languages, you develop a LUIS app for each language. Each LUIS app has a unique app ID, and endpoint log. If you need to provide language understanding for a language LUIS does not support, you can use Microsoft Translator API to translate the utterance into a supported language, submit the utterance to the LUIS endpoint, and receive the resulting scores.
For the language detection you can use, for example, the Text Analytics API from Microsoft Cognitive Services to get the text language, and then use that result to query the right LUIS app.
How to use it?
Documentation of Language detection in Text Analytics API here
Text analytics API: here
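As a rough sketch of that routing in Node.js (the Text Analytics endpoint version, region, and response shape are assumptions to check against the current docs; the two LUIS endpoint URLs are built the same way as in the luis.js code further down):
const axios = require("axios");
// Full LUIS v2.0 endpoint URLs for the two apps, e.g. built as in luis.js below.
const LuisModelUrl_Chi = process.env.LuisModelUrl_Chi;
const LuisModelUrl_Eng = process.env.LuisModelUrl_Eng;
// Detect the language of an utterance with the Text Analytics API (v2.1-style endpoint assumed).
async function detectLanguage(text) {
    const res = await axios.post(
        "https://westus.api.cognitive.microsoft.com/text/analytics/v2.1/languages",
        { documents: [{ id: "1", text: text }] },
        { headers: { "Ocp-Apim-Subscription-Key": process.env.TextAnalyticsKey } }
    );
    return res.data.documents[0].detectedLanguages[0].iso6391Name; // e.g. "en" or "zh"
}
// Query whichever LUIS app matches the detected language.
async function queryLuis(text) {
    const lang = await detectLanguage(text);
    const base = lang.indexOf("zh") === 0 ? LuisModelUrl_Chi : LuisModelUrl_Eng;
    const res = await axios.get(base + "&q=" + encodeURIComponent(text));
    return res.data; // intents and entities from the selected app
}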
As Nicolas mentioned above, you can build a multilanguage chat application with separate LUIS apps for each culture.
In order to have a single LUIS application, you could use the Translator Text API to translate all incoming messages before they're sent to LUIS. In this case you'll want to use middleware to handle the translation before your LUIS Recognizer is called. You can also use middleware to translate your bot's response, so you don't have to use additional localization inside of your bot; a rough sketch of such middleware follows below.
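A minimal sketch of that middleware idea for the Node.js SDK v3 used later in this thread (the receive hook, the Translator Text v3.0 endpoint, and the response shape are assumptions to verify against the docs; bot is an existing builder.UniversalBot):
const axios = require("axios");
// Translate incoming text to English with the Translator Text API (v3.0 endpoint assumed).
// Depending on the resource, an Ocp-Apim-Subscription-Region header may also be required.
function translateToEnglish(text) {
    return axios.post(
        "https://api.cognitive.microsofttranslator.com/translate?api-version=3.0&to=en",
        [{ Text: text }],
        { headers: {
            "Ocp-Apim-Subscription-Key": process.env.TranslatorKey,
            "Content-Type": "application/json"
        } }
    ).then(function (res) { return res.data[0].translations[0].text; });
}
// Translate every incoming message before the LUIS recognizer sees it.
bot.use({
    receive: function (event, next) {
        if (event.type === "message" && event.text) {
            translateToEnglish(event.text)
                .then(function (translated) { event.text = translated; next(); })
                .catch(function () { next(); }); // fall back to the original text on failure
        } else {
            next();
        }
    }
});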
Tokenization in LUIS is different for each language.
In the zh-cn culture, LUIS expects the simplified Chinese character set instead of the traditional character set.
Here is one more sample where you can select a language from the bot and continue the conversation as required.
After investigation and several tests, I think I found a way to do this properly:
First of all, I am currently using the MS Bot Framework Node.js SDK (version 3.14.0) to create my bot.
In the botbuilder module there is an IntentDialog class that accepts a list of recognizers, so I wrote something like this:
In luis.js
const builder = require("botbuilder");
// Setting for LUIS.ai
// Universal API key for both apps
let luisAPIKey = process.env.LuisAPIKey;
// First assign variables for Chinese LUIS app
let luisAppId_Chi = process.env.LuisAppId_Chi;
let luisAPIHostName_Chi = process.env.LuisAPIHostName_Chi || 'westus.api.cognitive.microsoft.com';
let LuisModelUrl_Chi = 'https://' + luisAPIHostName_Chi + '/luis/v2.0/apps/' + luisAppId_Chi + '?subscription-key=' + luisAPIKey + '&verbose=true';
// Then assign variables for English LUIS app
let luisAppId_Eng = process.env.LuisAppId_Eng;
let luisAPIHostName_Eng = process.env.LuisAPIHostName_Eng || 'westus.api.cognitive.microsoft.com';
let LuisModelUrl_Eng = 'https://' + luisAPIHostName_Eng + '/luis/v2.0/apps/' + luisAppId_Eng + '?subscription-key=' + luisAPIKey + '&verbose=true';
// Return an object with 2 attributes: Chi for Chinese LUIS and Eng for English LUIS
let luis = {};
luis.chi = new builder.LuisRecognizer(LuisModelUrl_Chi);
luis.eng = new builder.LuisRecognizer(LuisModelUrl_Eng);
module.exports = luis;
In app.js
const builder = require("botbuilder");
const luis = require("./luis");
// IntentDialog must be constructed with `new`; it tries each recognizer in order.
const intents = new builder.IntentDialog({ recognizers: [luis.chi, luis.eng] });
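For completeness, here is a sketch of how the dialog might be wired up (the connector setup and the intent names are assumptions, not part of my original code):
const connector = new builder.ChatConnector({
    appId: process.env.MicrosoftAppId,
    appPassword: process.env.MicrosoftAppPassword
});
const bot = new builder.UniversalBot(connector);
bot.dialog("/", intents);
// A handler fires for whichever app returned the top-scoring intent.
intents.matches("Greeting", function (session) {
    session.send("Hi! / 你好!");
});
intents.onDefault(function (session) {
    session.send("Sorry, I didn't get that.");
});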
And when I tested in the Bot Framework Emulator, it seems that it first checks the Chinese LUIS app and then goes to the English LUIS app.
I don't know what criteria/threshold this recognizer uses to decide whether to jump to the other app or not, but at present it works for me to a certain extent. It is not accurate, but it's at least a good(?) start. :D
No MS text translation API needed.
By the way, the code would look nicer if I could get the topIntent and the LUIS path right in the session variable.
Hope it helps someone.

Rasa-core, Slots not getting Populated

I am trying to create a simple printer-support chatbot using rasa-core with an NLU interpreter; the bot should get the printer model and printer type and post an issue.
I have used the printermodel and printertype variables as both slots and entities, but the slots are not getting populated from the chat string.
Please help me with this.
Not very much information to go off of, but here are several things I would check if my slots weren't being filled correctly:
Is NLU parsing the entities correctly? Slots are usually filled from NLU entities. Send your text directly to the NLU and see if the entities are found.
Are the entity and slot names consistent? The default method of filling slots without custom programming expects the slot name to match the entity name.
Are the slots defined correctly in the domain information?
If you're still having trouble I encourage you to create an issue or join us on gitter.
For example, say we have to design a simple conversation:
User: I am Shivam
Bot: Hello Shivam
Here, we have to extract the name and respond using it.
Step 1: In nlu.md file
## intent:told_name
- i am [shivam](name)
- my name is [shivam](name)
- hi, i am [shivam](name)
Step 2: In domain.yml file
intents:
  - told_name
actions:
  - utter_greet
entities:
  - name
slots:
  name:
    type: text
templates:
  utter_greet:
    - text: "Hello {name}"
    - text: "Hello {name}, happy to meet you."
Step 3: In stories.md file
## story_01
* told_name{"name": "Mayank"}
  - utter_greet
I think you are missing something in Step 3.

Luis intents with composite terms

I am developing a bot using the LUIS API by Microsoft. This bot must understand queries related to shopping, like "I want potatoes".
The only thing my application is struggling with is detecting intents from composite terms:
For the query "I want chopped steak", LUIS will detect "chopped" as an intent but not "chopped steak". I tried creating a productName phrase list feature and setting different composite terms, including "chopped steak", but this doesn't seem to work.
What could I do to achieve that?
