Can anyone clarify how to configure follow-up intents or prompts like in api.ai? I am trying to create an application similar to api.ai using Rasa and spaCy as the backend.
Rasa NLU is just for the intent and entity classification. From their website:
Rasa NLU is an open source tool for intent classification and entity extraction. You can think of it as a set of high level APIs for building your own language parser using existing NLP and ML libraries.
To implement conversation or dialogue you need a different tool or to program your own solution.
Popular ones in the Rasa community are:
botkit.ai
Rasa Core
Articulate
As Keller said, it can be done with Rasa Core. DialogFlow supports both (input) parameters and “contexts”; Rasa supports both as well, via “Rasa slots”.
There are three steps:
1) In the slots section of domain.yml, add a context slot, for example:
slots:
  zipcode:
    type: text
  request_user_affirm:
    type: text
2) request_user_affirm is the context slot, which will be filled by a custom action
3) use the context in your stories:
* inform{"zipcode": "78733"}
  - bot_request_affirm
* deny{"request_user_affirm": "yes"}
  - utter_request_info
bot_request_affirm is the custom action that fills the request_user_affirm slot. If the next user intent is deny and request_user_affirm is set, then the bot will respond with the utter_request_info action.
Have fun with Rasa Core.
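The branching logic of the custom action in step 2 can be sketched as follows. This is only an illustration: in a real bot, bot_request_affirm would be a rasa_sdk.Action subclass whose run method returns [SlotSet("request_user_affirm", "yes")]; here it is modeled as a plain function so the flow is easy to follow, and the slot/action names come from the example above.

```python
# Sketch of the bot_request_affirm custom action from the steps above.
# In a real bot this would subclass rasa_sdk.Action and return
# [SlotSet("request_user_affirm", "yes")]; here the slot event is
# modeled as a plain dict for illustration.

def bot_request_affirm(slots):
    """Ask the user to confirm the zipcode and set the context slot."""
    zipcode = slots.get("zipcode")
    message = f"You entered zipcode {zipcode}. Is that correct?"
    # Equivalent of returning SlotSet("request_user_affirm", "yes")
    events = [{"event": "slot", "name": "request_user_affirm", "value": "yes"}]
    return message, events

message, events = bot_request_affirm({"zipcode": "78733"})
print(message)            # → You entered zipcode 78733. Is that correct?
print(events[0]["name"])  # → request_user_affirm
```

With the slot set, the deny{"request_user_affirm": "yes"} turn in the story above can match and trigger utter_request_info.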
Rasa Core was specifically built for this: rather than creating a dialog flow with simple if-else statements, Rasa Core uses machine learning to decide the flow.
More Information here
Trying to figure out how I can take the intent from LUIS and insert it into the response from the bot. I cannot seem to find the variable/path to this value. I've tried:
#intentName
turn.recognized.intents.intentName
luisResult.intents[0]
The docs are not helping since they just show Java classes, which doesn't directly translate to the Composer app afaik.
You can pull the first intent via the following adaptive expression:
${foreach(turn.recognized.intents, x, x.key)[0]}
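This works because turn.recognized.intents is an object keyed by intent name (each with a score), so foreach over it yields the keys and [0] takes the first. A rough Python analogue of what the expression does (the recognizer result and scores here are made-up sample values, not real Composer output):

```python
# Hypothetical recognizer result shaped like turn.recognized:
# intents are keyed by name, each with a confidence score
# (the names and scores are made-up sample values).
recognized = {
    "intents": {
        "BookFlight": {"score": 0.92},
        "Cancel": {"score": 0.03},
    }
}

# Equivalent of ${foreach(turn.recognized.intents, x, x.key)[0]}:
# iterate over the intent entries, keep each key, take the first.
top_intent = [key for key in recognized["intents"]][0]
print(top_intent)  # → BookFlight
```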
I have a page with an intent and a static fulfillment response; sys.no-match also has a static response.
(screenshot: training phrases for the intent)
Using one of the training phrases triggers sys.no-match instead of the intent.
How do I get it to match the intent correctly?
Edit: I ended up changing back from advanced NLU to standard NLU and now it works fine. I initially changed to advanced NLU because I thought it was better, but it turns out I was wrong.
Normally the Standard NLU is automatically trained after any intent changes, however, switching to the Advanced NLU disables automatic training and the flow needs to be actively trained via the ‘Train’ button in the ML settings. Note that Auto Train is not available for the Advanced NLU setting.
In your situation, although the intent was created, the training may not have been run afterwards because the Advanced NLU setting does not train automatically, resulting in the outdated behavior.
In short: to properly use the Advanced NLU setting, you must press the ‘Train’ button before testing and wait for the ‘Training in progress’ status to finish, so that the behavior of your Agent reflects the changes made since the last training session.
In my training data, I want to extract the following values from user input: phone number, location, reference_number. Which entity extractor do I need to use?
I think you should go with the Duckling entity extractor. It can extract phone numbers, reference numbers, and similar structured values; for locations you would typically pair it with another extractor, such as SpacyEntityExtractor. Note that Duckling runs as a separate service, which you connect to through the config.yml file.
Refer to the Rasa components documentation and forum for more information.
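For example, the pipeline entry might look like the excerpt below. The component name varies with your Rasa version (DucklingEntityExtractor in Rasa 2.x/3.x, DucklingHTTPExtractor in 1.x), the dimensions list is an illustrative guess for this use case, and the URL assumes you run the Duckling server locally, e.g. via docker run -p 8000:8000 rasa/duckling:

```yml
# config.yml — excerpt; assumes a Duckling server listening on port 8000
pipeline:
  - name: DucklingEntityExtractor
    url: "http://localhost:8000"
    dimensions: ["phone-number", "number"]
```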
Can aws-lex be used to build a conversation flow bot?
For example:
Thank you very much!
Reason for all this: we have our own "dialogue builder" and "bot-service".
Our "Dialogue Builder" is similar to the Amazon Connect dialogue builder, and our "Bot-service" is similar to the Microsoft Bot Framework. Previously we used microsoft-luis to get the "intention" of a sentence, while using our own dialogue builder and bot-service to build the conversation/dialogue flow, e.g. if a user says "yes" go to one flow and if a user says "no" go to a different flow (can this be done in slots?), essentially a binary tree :)
So now we are switching from luis to aws-lex and trying to decide whether we can just use the aws-lex UI and drop our own dialogue builder/bot-service. From what I understand, to use aws-lex without some kind of dialogue builder we would need to write a lot of if/case statements if the flow is large, right? What is your suggestion? One option would be to use "Amazon Connect" for its dialogue builder so we don't have to write a lot of if statements, but then if we are using a dialogue builder anyway, we could just keep our own (old one). What do you think?
Questions:
1) Is there a way to do something like this in aws-lex or not? I tried using slots/prompts/lambda but I am not able to go to the 2nd or 3rd level of depth in the diagram. Can this be done somehow?
2) Do I have to use Lambda with switch/if conditions each time the flow has to change (e.g. if the answer is yes reply this, and if no reply that)?
3) If #2 is true, is it possible for a non-developer to use this? Even if I write ~1k-2k if conditions, a non-developer trying to edit a dialogue through the UI won't be able to, right? (So doesn't that mean we aren't really using the aws-lex UI; we are just writing if conditions in code and using aws-lex only to get the intention?)
4) Would it be possible to give an example of how to make such a flow? So far with slots the replies/responses don't change based on the user's input: whether the user says "no" or "yes", the bot follows the same path/answer. Is there a way to change the reply based on the user's input?
5) If #3 is not possible and a non-developer can't use the aws-lex UI to build something like this, should we use a custom dialogue builder that does?
Thank you very very much!
It sounds like you're switching away from the Microsoft Bot Framework, looking for a simpler solution for structured flows without entity recognition.
You may want to research Microsoft's QnA Maker multi-turn ability. It's supported in the QnA Maker online editor, but not in the Bot Framework SDK (yet). They do have an example bot that uses it through the Web API.
https://learn.microsoft.com/en-us/azure/cognitive-services/qnamaker/how-to/multiturn-conversation
I realize this doesn't answer your Lex question, but it might address your concern.
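On the Lambda side of question #2 above: branching on the user's answer does come down to conditional logic in the fulfillment function. A minimal sketch follows, using the Lex V1 Lambda event/response shape; the intent name ("UserConfirm") and slot name ("answer") are made up for illustration:

```python
# Minimal Lex (V1) fulfillment handler sketch. The intent and slot
# names ("UserConfirm", "answer") are hypothetical; the event and
# response shapes follow the Lex V1 Lambda contract.

def close(message):
    """Build a Lex 'Close' response with a plain-text message."""
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {"contentType": "PlainText", "content": message},
        }
    }

def lambda_handler(event, context):
    slots = event["currentIntent"]["slots"]
    # Branch the flow on the user's answer -- this is the if/else
    # logic the question asks about.
    if slots.get("answer") == "yes":
        return close("Great, let's continue to the next step.")
    return close("No problem, let's try something else.")

# Example invocation with a fake event
fake_event = {"currentIntent": {"name": "UserConfirm", "slots": {"answer": "yes"}}}
print(lambda_handler(fake_event, None)["dialogAction"]["message"]["content"])
```

Each "level" of the flow is then another slot or another intent feeding this handler, which is why deep trees tend to accumulate conditions unless a dialogue builder generates them for you.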
I have created a custom entity in MS CRM 4.0 and am trying to update a couple of its attributes via a custom workflow in .NET. I have read through several forums and blog posts and am still confused about how to access the custom entity and update some of its attributes.
I created a custom entity to replace how CRM was doing allotments as our company has some specific business rules that CRM wasn't doing. When a task is completed on an incident I want to update an attribute in the custom entity with the task duration. Any help would be greatly appreciated.
Thanks
When using the CRM web service in a custom workflow, you'll need to use DynamicEntity objects. The workflow context web service is just an ICrmService, so it doesn't know about your specific customizations. There's a pretty good sample here: http://www.stunnware.com/crm2/topic.aspx?id=CustomWorkflowActivity
I imagine you could also add the CRM web services as a web reference to your workflow project. Then you'd have strongly typed objects for your custom entities. I've never done this for my custom workflows, but it works for other custom apps accessing CRM.
Choosing Dynamic Entities over the WSDL-generated types is usually the better choice.
When you develop a piece of code, you stay more flexible with your classes: you can use your software in different contexts for different systems. That's the reason Dynamic Entities were invented.
It's very easy and you don't have to use DynamicEntity. Go to Settings -> Customization -> Download WSDL, then use the WSDL in your project. Now you have all your custom entities strongly typed. All you have to do is write something like this:
// Assumes the WSDL-generated proxy classes; the CrmService instance
// still needs its Url and Credentials configured as usual.
Guid entityId = getEntityId();
new_yourCustomEntity entity = new new_yourCustomEntity();
entity.new_yourCustomEntityid = entityId;
entity.new_customProperty = "value";
CrmService crmService = new CrmService();
crmService.Update(entity);
Maybe what you really mean is a Custom Workflow Activity? This involves writing your own .NET class to add functionality to the standard CRM workflow in the form of new step types. If all you want to do is update an attribute, you don't really need this, even if it is on a custom entity. The Update record step does just this and allows dynamic values (coming from other entities) to be specified.
Hope it helps
Daniel