I am currently working on a project where visitors normally use both English and Chinese to talk to each other.
Since LUIS does not support multiple languages very well (yes, I know it can to some extent, but I want a better service), I would like to build my own neural network and expose it as a REST API so that, when someone submits text, we can predict the "Intent" while still using the MS Bot Framework (Node.js).
By doing this we can bypass MS LUIS and use our own language-understanding service.
Here are my two questions:
Has anyone done this before? Is there a GitHub link I can reference?
If I do this, which Bot Framework API should I use? There is a concept called a "custom recognizer" and I wonder if it really works.
Thank you very much in advance for all your help.
Another option, apart from Alexandru's suggestions, is to add middleware that calls the NLP service of your choice every time the bot receives a message/request.
Botbuilder allows middleware functions to run before any dialogs are handled. I created sample code below for a better understanding.
const bot = new builder.UniversalBot(connector, function (session) {
    // pass to root
    session.replaceDialog('root_dialog');
});

// dummy call to the NLP service
let callNLP = (text) => {
    return new Promise((resolve, reject) => {
        // do your NLP service API call here and resolve the result
        resolve({});
    });
};

// custom middleware handler
let specialCommandHandler = (session, next) => {
    // user message here
    let userMessage = session.message.text;
    callNLP(userMessage).then(nlpResult => {
        // you can save your NLP result to the session
        session.conversationData.nlpResult = nlpResult;
        // this continues on to the bot dialogs, in this case the root dialog
        next();
    }).catch(err => {
        // handle errors
        console.error(err);
    });
};

// register the custom middleware
bot.use({
    botbuilder: specialCommandHandler
});

// root dialog
bot.dialog('root_dialog', [(session, args, next) => {
    // your NLP call result
    let nlpResult = session.conversationData.nlpResult;
    // do any operations with the result here, e.g. redirect to a
    // specific dialog based on the intent/entities
}]);
For a Node.js Bot Framework implementation you have at least two options:
Use LuisRecognizer as a starting point to create your own recognizer. This approach works with single-intent NLUs and entity arrays (just like LUIS); a sketch is shown after this list.
Create a SimpleDialog with a single handler function that calls the desired NLU API.
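For the first option, here is a minimal sketch of a custom recognizer (botbuilder v3 style). The callNlpService helper and the shape of its result are placeholders for your own REST call, not a real library API:
// minimal custom recognizer sketch; callNlpService is a placeholder for
// your own NLU REST call, assumed to resolve to { intent, score, entities }
var customRecognizer = {
    recognize: function (context, done) {
        callNlpService(context.message.text)
            .then(function (result) {
                // botbuilder expects { score, intent, entities }
                done(null, {
                    score: result.score || 0.0,
                    intent: result.intent || 'None',
                    entities: result.entities || []
                });
            })
            .catch(function (err) {
                done(err, { score: 0.0, intent: 'None', entities: [] });
            });
    }
};

// either register it globally on the bot...
bot.recognizer(customRecognizer);
// ...or pass it to an IntentDialog, just like a LuisRecognizer
var intents = new builder.IntentDialog({ recognizers: [customRecognizer] });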
Related
I am building a Slack bot that will remind people in my organisation to perform certain admin tasks (hours, expenses, etc.) every week. I know this can easily be done by each person creating a recurring reminder. What I want is to create a bot that will send a preconfigured message to people every week. I've looked online extensively and haven't yet found out how a Slack bot can send a message without an event or being otherwise prompted.
I'm currently testing this on a local ngrok server with the following backend:
const { WebClient } = require('@slack/web-api');
const { createEventAdapter } = require('@slack/events-api');
const slackSigningSecret = process.env.SLACK_SIGNING_SECRET;
const slackToken = process.env.SLACK_TOKEN;
const port = process.env.SLACK_PORT || 3000;
const slackEvents = createEventAdapter(slackSigningSecret);
const slackClient = new WebClient(slackToken);
slackEvents.on('app_mention', (event) => {
console.log(`Got message from user ${event.user}: ${event.text}`);
(async () => {
try {
await slackClient.chat.postMessage({ channel: event.channel, text: `Hello <@${event.user}>! Have you completed your Time sheets for this week yet?` })
} catch (error) {
console.log(error.data)
}
})();
});
slackEvents.on('error', console.error);
slackEvents.start(port).then(() => {
console.log(`Server started on port ${port}`)
});
Once this reminder is done, I intend to build upon it (more features; I just need a starting point), so please don't recommend alternative ways my organisation can send reminders to people.
You can try using the chat.scheduleMessage method instead (https://api.slack.com/methods/chat.scheduleMessage). Since you won't be relying on an event, you may want to store the necessary conversation IDs so that they're ready when the app needs to call the method.
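A minimal sketch of what that could look like with the slackClient from your existing code (the channel ID, message text, and timestamp are placeholders):
// rough sketch: schedule the reminder for a future time instead of waiting
// for an app_mention event; channel ID and text are placeholders
const scheduleWeeklyReminder = async (channelId) => {
  // post_at is a Unix timestamp in seconds; here: 7 days from now
  const postAt = Math.floor(Date.now() / 1000) + 7 * 24 * 60 * 60;
  try {
    const result = await slackClient.chat.scheduleMessage({
      channel: channelId,
      text: 'Have you completed your timesheets for this week yet?',
      post_at: postAt
    });
    console.log(`Scheduled message ${result.scheduled_message_id} for ${postAt}`);
  } catch (error) {
    console.log(error.data);
  }
};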
I have created a Teams bot in .NET Core from following the sample found here: https://github.com/microsoft/BotBuilder-Samples/tree/master/samples/csharp_dotnetcore/57.teams-conversation-bot
This is working and is running locally with ngrok. I have a controller with a route of api/messages:
[Route("api/messages")]
[ApiController]
public class BotController : ControllerBase
{
private readonly IBotFrameworkHttpAdapter Adapter;
private readonly IBot Bot;
public BotController(IBotFrameworkHttpAdapter adapter, IBot bot)
{
Adapter = adapter;
Bot = bot;
}
[HttpPost]
public async Task PostAsync()
{
// Delegate the processing of the HTTP POST to the adapter.
// The adapter will invoke the bot.
await Adapter.ProcessAsync(Request, Response, Bot);
}
}
I now want to call a POST to api/messages from my Angular client using TypeScript to send a proactive message to a specific Teams user.
I did figure out how to set the ConversationParameters in TeamsConversationBot.cs to a specific Teams user by doing the following:
var conversationParameters = new ConversationParameters
{
    IsGroup = false,
    Bot = turnContext.Activity.Recipient,
    Members = new[] { new ChannelAccount("[insert unique Teams user guid here]") },
    TenantId = turnContext.Activity.Conversation.TenantId,
};
What I'm struggling with is how to build a JSON request that sends the Teams user GUID (and maybe a couple of other details) to my api/messages route from TypeScript.
How do I go about doing this? What parameters/body do I need to send? I haven't been able to find samples online that show how to do this.
Update below for added clarification
I am building a web chat app using Angular for our customers. What I'm trying to do is send a proactive message to our internal employees, who are using Microsoft Teams, when a customer performs some action via the chat app (initiates a conversation, sends a message, etc.).
I've built a Teams bot in .NET Core using this sample: https://kutt.it/ZCftjJ. Modifying that sample, I can hardcode my Teams user ID, and the proactive message shows up successfully in Teams:
var proactiveMessage = MessageFactory.Text($"This is a proactive message.");

var conversationParameters = new ConversationParameters
{
    IsGroup = false,
    Bot = turnContext.Activity.Recipient,
    Members = new[] { new ChannelAccount("insert Teams ID here") },
    TenantId = turnContext.Activity.Conversation.TenantId,
};

await ((BotFrameworkAdapter)turnContext.Adapter).CreateConversationAsync(teamsChannelId, serviceUrl, credentials, conversationParameters,
    async (t1, c1) =>
    {
        conversationReference = t1.Activity.GetConversationReference();
        await ((BotFrameworkAdapter)turnContext.Adapter).ContinueConversationAsync(_appId, conversationReference,
            async (t2, c2) =>
            {
                await t2.SendActivityAsync(proactiveMessage, c2);
            },
            cancellationToken);
    },
    cancellationToken);
What I'm struggling with is:
How to configure my Angular app to notify my bot of a new proactive message I want to send.
How to configure the bot to accept some custom parameters (Teams user ID, message).
It sounds like you've already made some progress with proactive messaging. Is it working 100%? If not, I've covered the topic a few times here on Stack Overflow; here's an example that might help: Programmatically sending a message to a bot in Microsoft Teams
However, with regard to triggering the proactive message, the truth is you can do it from anywhere and in any way. For instance, I have Azure Functions that run on their own schedules and proactively send messages as if they're from the bot, even though the code isn't running inside the bot at all. You haven't fully described where the Angular app fits into the picture (like who's using it for what), but as an example in your scenario, you could create another endpoint inside your bot controller and do the work directly in there (e.g. add something like the below):
[HttpPost]
public async Task ProActiveMessage([FromQuery]string conversationId)
{
    // retrieve conversation details by id from storage (e.g. database)
    // send pro-active message
    // respond with something back to the Angular client
}
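On the Angular/TypeScript side, a minimal sketch of calling such an endpoint could look like the following. The host, route, and query parameter simply mirror the hypothetical ProActiveMessage action above; they are assumptions, not a fixed contract:
// hypothetical client-side call from the Angular app; plain fetch is used
// here, but Angular's HttpClient would look very similar
async function notifyBot(conversationId) {
  // bot host and route are placeholders matching the sketch above
  const url = 'https://<your-bot-host>/api/proactive?conversationId='
    + encodeURIComponent(conversationId);
  const response = await fetch(url, { method: 'POST' });
  if (!response.ok) {
    throw new Error(`Bot endpoint returned ${response.status}`);
  }
}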
hope that helps,
Hilton's answer is still good, but the part about proactively messaging users without prior interaction is too long to cover in a comment. So, responding to your latest comments:
Yes, the bot needs to be installed for whatever team the user resides in that you want to proactively message. It won't have permissions to do so, otherwise.
You don't need to override OnMembersAddedAsync; just query the roster (see below).
You don't need a conversation ID to do this. I'd make your API, instead, accept their Teams ID. You can get this by querying the Teams Roster, which you'll need to do in advance and store in a hash table or something...maybe a database if your team size is sufficiently large.
As far as required information goes, you need enough to build the ConversationParameters:
var conversationParameters = new ConversationParameters
{
    IsGroup = false,
    Bot = turnContext.Activity.Recipient,
    Members = new ChannelAccount[] { teamMember },
    TenantId = turnContext.Activity.Conversation.TenantId,
};
...which you then use to CreateConversationAsync:
await ((BotFrameworkAdapter)turnContext.Adapter).CreateConversationAsync(
    teamsChannelId,
    serviceUrl,
    credentials,
    conversationParameters,
    async (t1, c1) =>
    {
        conversationReference = t1.Activity.GetConversationReference();
        await ((BotFrameworkAdapter)turnContext.Adapter).ContinueConversationAsync(
            _appId,
            conversationReference,
            async (t2, c2) =>
            {
                await t2.SendActivityAsync(proactiveMessage, c2);
            },
            cancellationToken);
    },
    cancellationToken);
Yes, you can modify that sample. It returns a Bad Request because only a particular schema is allowed on /api/messages. You'll need to add your own endpoint. Here's an example of NotifyController, which one of our other samples uses. You can see that it accepts GET requests. You'd just need to modify that or build your own that accepts POST requests.
All of this being said, it seems like this may be a bigger task than you're ready for. Nothing wrong with that; that's how we learn. Instead of jumping straight into it, I'd start with the following:
Get the Proactive Sample working and dig through the code until you really understand how the API part works.
Get the Teams Sample working, then try to make it message individual users.
Then build your bot that messages users without prior interaction.
If you run into trouble feel free to browse my answers. I've answered similar questions to this, a lot. Be aware, however, that we've switched from the Teams Middleware that I mention in some of my answers to something more integrated into the SDK. Our Teams Samples (samples 50-60) show how to do just about everything.
Does Microsoft publish Bot Skills as an API for consumption? I need to invoke the botskills command programmatically in order to publish a skill to my VA. How can I do this?
botskills is a Command Line Tool, so no, there's no direct API for it.
That being said, most programming languages allow you to execute shell commands. Based on your post history, it looks like you're using Node. So, you can do something like this:
const { exec } = require("child_process");

exec('botskills connect --localManifest "./skills/customSkill/customSkillManifest.json" --skillsFile "./skills.json" --cs --verbose', (error, stdout, stderr) => {
    if (error) {
        console.log(`error: ${error.message}`);
        return;
    }
    if (stderr) {
        console.log(`stderr: ${stderr}`);
        return;
    }
    console.log(`stdout: ${stdout}`);
});
If botskills is hosted somewhere and you need to use it like an API, you can always add an endpoint to your bot:
server.post('/api/botskills', (req, res) => {
    // 1. Do some kind of conditional check on the request to make sure it is allowed to do this
    // 2. Execute the command in the previous code block
    // 3. Return a response
});
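As a rough sketch of those three steps (assuming a restify-style server and a hypothetical x-api-key header for the conditional check; adapt this to whatever auth you actually use):
// hypothetical endpoint that shells out to the botskills CLI; the API-key
// check and the hard-coded manifest paths are placeholders
const { exec } = require("child_process");

server.post('/api/botskills', (req, res) => {
    // 1. Conditional check (placeholder: shared secret in a header)
    if (req.headers['x-api-key'] !== process.env.BOTSKILLS_API_KEY) {
        res.send(401, { error: 'Not authorized' });
        return;
    }
    // 2. Execute the botskills command
    exec('botskills connect --localManifest "./skills/customSkill/customSkillManifest.json" --skillsFile "./skills.json" --cs --verbose', (error, stdout, stderr) => {
        // 3. Return a response based on the outcome
        if (error || stderr) {
            res.send(500, { error: error ? error.message : stderr });
            return;
        }
        res.send(200, { output: stdout });
    });
});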
If your question is not about the botskills package/CLI and is actually about "Bot Skills", you interact with a skill just like a bot. They're basically the same thing, and you'd use the same REST API.
In actions-on-google, both the request and the response object need to be provided as input to the library, but in an AWS Lambda function only the request (event) object exists.
So how can I work around this?
In AWS Lambda the handler format is:
exports.handler = function (event, context, callback) {
    // event is the request object; the response is provided via the callback() function
}
The actions-on-google object is created as:
const DialogflowApp = require('actions-on-google').DialogflowApp;
const app = new DialogflowApp({ request: request, response: response });
To get a Google Action to work on AWS Lambda, you need to do 2 things:
Code your app in a way that it's executable on Lambda
Create an API Gateway to your Lambda Function which you can then use for Dialogflow Fulfillment
I believe the first step can't be done off-the-shelf with the Actions SDK. If you're using a framework like Jovo, you can create code that works for both Amazon Alexa and Google Assistant, and host it on AWS Lambda.
You can find a step-by-step tutorial about setting up a "Hello World" Google Action, hosting it on Lambda, and creating an API Gateway here: https://www.jovo.tech/blog/google-action-tutorial-nodejs/
Disclaimer: I'm one of the founders of Jovo. Happy to answer any further questions.
This is only a half answer:
OK, so I don't think I can tell you how to get the Actions on Google SDK working correctly on AWS Lambda.
Maybe it's easy; I just don't know, and I would need to read through everything to find out.
My "quick to get going" solution (which may end up being more work in the long run) would be to interpret the request JSON yourself and respond with a message, as shown below.
Below is an extremely trivial JavaScript function that creates an extremely trivial JSON response.
Parameters:
message is the string you would like to send as the answer.
slots should be an array that can be used to bias the speech recognition (you can pass an empty array if you don't want to bias the speech).
state is any serializable JavaScript object; it is yours to use for maintaining state or anything else, and it is carried across all the intents.
This is a standard response to a speech request.
You can support platforms other than speech by adding different initial prompts; see the JSON tabs in the documentation:
https://developers.google.com/actions/assistant/responses#json
function answerWithMessage(message, slots, state) {
    let newmessage = message.toLowerCase();
    let jsonResponse = {
        conversationToken: JSON.stringify(state),
        expectUserResponse: true,
        expectedInputs: [
            {
                inputPrompt: {
                    initialPrompts: [
                        {
                            textToSpeech: newmessage
                        }
                    ],
                    noInputPrompts: []
                },
                possibleIntents: [
                    {
                        intent: "actions.intent.TEXT"
                    }
                ],
                speechBiasingHints: slots
            }
        ]
    };
    return JSON.stringify(jsonResponse, null, 4);
}
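To tie this back to the Lambda handler format from the question, a rough sketch of the wiring could look like this (assuming an API Gateway Lambda proxy integration, where the incoming request arrives as a JSON string in event.body):
// hypothetical Lambda handler using answerWithMessage() above; assumes a
// Lambda proxy integration, so the Assistant request is in event.body
exports.handler = function (event, context, callback) {
    const request = JSON.parse(event.body || '{}');
    // restore any state carried in the conversation token
    const state = request.conversationToken ? JSON.parse(request.conversationToken) : {};
    const body = answerWithMessage('Hello from Lambda!', [], state);
    callback(null, {
        statusCode: 200,
        headers: { 'Content-Type': 'application/json' },
        body: body
    });
};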
I am having a problem similar to Botframework findEntity() issue.
I have created a Node.js Bot Framework app using the Azure interface. I am using the Azure IDE for development (to keep things simple).
The relevant code is:
// Main dialog with LUIS
var recognizer = new builder.LuisRecognizer(LuisModelUrl);
var intents = new builder.IntentDialog({ recognizers: [recognizer] })
    /*
    .matches('<yourIntent>')... See details at http://docs.botframework.com/builder/node/guides/understanding-natural-language/
    */
    .matches('Help', (session, args) => {
        var entities = args.entities;
        var itype = builder.EntityRecognizer.findEntity(args.entities, 'ItemTypes');
        session.send(args.entities[0]["entity"]);
        session.send(args.entities[0]["type"]);
        session.send('How may I assist you? ' + JSON.stringify(args));
        session.send('Value of entity (didnt match) you said: \'%s\'.', itype);
    })
The findEntity function returns null for itype (at least, that is what I see in the session.send results).
I tried using both args.entities and args.intents.entities, with no change.
When I look at the results of args.entities[0]["entity"] and ["type"], I do get values. The result of the JSON.stringify is below (also showing that it is finding the entity).
How may I assist you?
{"score":0.970185757,"intent":"Help","intents":[{"intent":"Help","score":0.970185757},{"intent":"Joke","score":0.0711096451},{"intent":"Greeting","score":0.0438234434},{"intent":"None","score":0.0408537947},{"intent":"Goodbye","score":0.04074517}],"entities":[{"entity":"stapler","type":"ItemTypes","
I'm assuming there is more but that it was cut off by the chat window.
I'm new to every technology involved and will take any help I can get.