How to create training data for Rasa NLU programmatically from Node.js (MEAN stack)

How do I create training data for Rasa NLU programmatically?
I am developing an application with the MEAN stack, and this application prepares the data that needs to be trained with Rasa NLU.
But I don't know how to pass this data from my Node.js server to Rasa NLU. Are there any supported APIs to achieve this?

Rasa has a highly functional HTTP API, documented in the Rasa NLU docs.
To answer the specific question: you can pass training data to the Rasa NLU API with the commands below.
If your training data is in a file:
curl -XPOST 'localhost:5000/train?project=my_project' -d @data/examples/rasa/demo-rasa.json
If your training data is in JSON format:
curl --request POST \
  --url 'http://localhost:5000/train?project=test&fixed_model_name=tested-project' \
  --header 'content-type: application/json' \
  --data '{
    "rasa_nlu_data": {
      "regex_features": [
        {
          "name": "zipcode",
          "pattern": "[0-9]{5}"
        }
      ],
      "entity_synonyms": [
        {
          "value": "chinese",
          "synonyms": ["Chinese", "Chines", "chines"]
        },
        {
          "value": "vegetarian",
          "synonyms": ["veggie", "vegg"]
        }
      ],
      "common_examples": []
    }
  }'
Obviously you'll need to create the JSON file or payload yourself, and in Node you wouldn't be using curl but an HTTP client library such as request.
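For example, a minimal sketch of the same training request from Node with the request library, assuming Rasa NLU is running on localhost:5000 as in the curl examples above and that common_examples is filled with whatever your MEAN app has prepared:

var request = require('request');

// Same payload shape as the curl example above; populate common_examples
// with the utterances and entities your application collects.
var trainingData = {
  rasa_nlu_data: {
    regex_features: [],
    entity_synonyms: [],
    common_examples: []
  }
};

request.post({
  url: 'http://localhost:5000/train?project=test&fixed_model_name=tested-project',
  json: trainingData   // request serializes this and sets the JSON content type
}, function (err, response, body) {
  if (err) { return console.error('Training request failed:', err); }
  console.log('Rasa NLU responded with status', response.statusCode, body);
});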
I've written a series of tutorials that may help you get started interacting with Rasa's API.

I use a Python library that is pretty good for powering conversational software, based on the latest machine-learning research.
To use it, you have to build a Python service that your Node.js server interacts with.
That also makes both parts easy to scale and maintain in the future.
Or you can check this open-source example app: https://github.com/aashreys/chatbot-example
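A minimal sketch of what that Node-to-Python interaction could look like, assuming the Python service is the Rasa NLU server from the question and exposes the legacy /parse endpoint on localhost:5000 (the port, path, and project name are assumptions):

var request = require('request');

// Hypothetical helper: forward a user message from the Node/MEAN app to the
// Python NLU service and hand back the parsed intent and entities.
function parseMessage(text, callback) {
  request.post({
    url: 'http://localhost:5000/parse',   // assumed address of the Python service
    json: { q: text, project: 'test' }    // legacy Rasa NLU /parse payload shape
  }, function (err, response, body) {
    if (err) { return callback(err); }
    callback(null, body);                 // body contains the intent and entities
  });
}

parseMessage('I want to order chinese food', function (err, result) {
  if (err) { return console.error(err); }
  console.log(result.intent, result.entities);
});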

Related

How to publish an Extension to Azure FHIR repository?

I am using Azure API for FHIR. I have a Claims payload that requires some additional fields, which I am adding to the extension structure like:
extension: [
  {
    "url": "ROW_ID",
    "valueString": "1"
  },
  {
    "url": "LOB",
    "valueString": "MAPD"
  }
]
To perform a search on ROW_ID and LOB, I need to publish this extension, which I would then use in my SearchParameter.
How and where do I publish the extension?
To publish an extension, you POST the StructureDefinition to your FHIR server. In addition, if you need to search on it, you will need to create a custom SearchParameter and reindex the database. You can read more about that here: https://learn.microsoft.com/en-us/azure/healthcare-apis/fhir/how-to-do-custom-search
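A minimal sketch of those steps from Node, assuming a placeholder Azure API for FHIR endpoint and abridged, illustrative resource bodies (check the FHIR spec and the linked doc for the full set of required elements):

var request = require('request');

// Placeholders: your FHIR endpoint and an access token with write permission.
var fhirBase = 'https://<your-service>.azurehealthcareapis.com';
var headers = {
  'Content-Type': 'application/fhir+json',
  'Authorization': 'Bearer <access-token>'
};

// 1. Publish the extension by POSTing its StructureDefinition (abridged).
var rowIdExtension = {
  resourceType: 'StructureDefinition',
  url: 'http://example.org/fhir/StructureDefinition/ROW_ID', // illustrative canonical URL
  name: 'RowId',
  status: 'active',
  kind: 'complex-type',
  abstract: false,
  type: 'Extension',
  baseDefinition: 'http://hl7.org/fhir/StructureDefinition/Extension',
  derivation: 'constraint',
  context: [{ type: 'element', expression: 'Claim' }]
};

// 2. Create a custom SearchParameter that points at the extension (abridged).
var rowIdSearchParam = {
  resourceType: 'SearchParameter',
  url: 'http://example.org/fhir/SearchParameter/claim-row-id', // illustrative
  name: 'row-id',
  status: 'active',
  description: 'Search Claims by the ROW_ID extension',
  code: 'row-id',
  base: ['Claim'],
  type: 'string',
  expression: "Claim.extension.where(url = 'ROW_ID').value"   // illustrative FHIRPath
};

request.post({ url: fhirBase + '/StructureDefinition', headers: headers, json: rowIdExtension },
  function (err, res) { console.log('StructureDefinition:', err || res.statusCode); });

request.post({ url: fhirBase + '/SearchParameter', headers: headers, json: rowIdSearchParam },
  function (err, res) { console.log('SearchParameter:', err || res.statusCode); });

// 3. Then start a reindex job so the new parameter becomes searchable;
// see the linked Azure doc for the $reindex operation and its status handling.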

Send pdf with smba trafficmanager or botframework

I'm trying to proactively send a pdf file as an attachment to a Teams user via https://smba.trafficmanager.net/in/v3/conversations/ with the attachment format below:
"attachments": [
{
"contentType": "application/pdf",
"contentUrl": "http://www.africau.edu/images/default/sample.pdf",
"name": "sample.pdf",
"content": {
"uniqueId": "1150D938-8870-4044-9F2C-1213213123",
"fileType": "pdf"
}
}
]
I'm able to send txt files but not pdf; every time I get
{"error":{"code":"BadArgument","message":"Unknown attachment type"}}
Is there another contentType I should be using instead of application/pdf?
I agree with Hilton and Dev here. I think it's more convenient to post messages with card attachments referencing existing SharePoint files using the Microsoft Graph APIs.
The Microsoft documentation describes two ways for bots to send attachments: the Microsoft Graph APIs work for bots in all scopes in Teams, while the Teams APIs work only in the personal context.
By the way, the documentation says pdf is an acceptable file type.
#kiran, the payload below works for me, so I am adding it here for your convenience so that you can copy and test it:
{ "body": { "contentType": "html", "content": "Here's the latest budget. <attachment id=\"153fa47d-18c9-4179-be08-9879815a9f90\"></attachment>" }, "attachments": [ { "id": "153fa47d-18c9-4179-be08-9879815a9f90", "contentType": "reference", "contentUrl": "m365x987948.sharepoint.com/sites/test/Shared%20Documents/…", "name": "Budget.pdf" } ] }
Based on the discussion in the comments, it is definitely better to provide a link to the document hosted in SharePoint (the Files tab) instead; that's exactly what the Files tab is intended for, rather than every user having to download their own copy. In addition, have a look at Link Unfurling to see how to provide a better embedded experience for the posted file.

Azure IoT Edge module logs - Task upload logs failed because of error

I was following the experimental built-in log pull feature:
https://github.com/Azure/iotedge/blob/master/doc/built-in-logs-pull.md
When I try to upload logs using the following payload from the Azure portal (using Direct Method under each module):
PAYLOAD:
{
  "schemaVersion": "1.0",
  "sasUrl": "https://veeaiotcentralstorage.blob.core.windows.net/iotedgeruntimelogs/iotedgeruntimelogs.txt?sv=2019-02-02&st=2020-08-08T08%3A56%3A00Z&se=2020-08-14T08%3A56%3A00Z&sr=b&sp=rw&sig=xyz",
  "items": [
    {
      "id": "zigbee_template-arm64v8",
      "filter": {
        "tail": 10
      }
    }
  ],
  "encoding": "none",
  "contentType": "text"
}
I am getting the error mentioned below after checking the task status
ERROR:
{"status":200,"payload":{"status":"Failed",
"message":"Task upload logs failed because of error Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.",
"correlationId":"b85002d8-d8f9-49d5-851d-9123a8d7d740"}}
Please let me know where the issue is.
Digging into the code some more, I noticed the UploadLogs implementation doesn't create a container, but rather a folder structure within the container that you supply. As far as I can tell, the casing restriction applies when creating a blob container, but there is no such restriction on creating folders within the container.
So check the SAS URL that you supplied, or something on the storage end: double-check that your SAS URL is generated for a pre-existing blob container rather than for an individual blob.
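If the SAS URL is indeed the problem, here is a minimal sketch of generating a container-level SAS in Node with @azure/storage-blob, assuming placeholder account credentials, an existing container, and illustrative permissions and expiry:

var storage = require('@azure/storage-blob');

// Placeholders: your storage account name/key and a container that already exists.
var accountName = '<storage-account>';
var accountKey = '<account-key>';
var containerName = 'iotedgeruntimelogs';

var credential = new storage.StorageSharedKeyCredential(accountName, accountKey);

// Container-level SAS (no blob name), so the upload task can create blobs inside it.
var sasToken = storage.generateBlobSASQueryParameters({
  containerName: containerName,
  permissions: storage.ContainerSASPermissions.parse('racwl'), // read/add/create/write/list (assumption)
  startsOn: new Date(),
  expiresOn: new Date(Date.now() + 24 * 60 * 60 * 1000)
}, credential).toString();

var sasUrl = 'https://' + accountName + '.blob.core.windows.net/' + containerName + '?' + sasToken;
console.log(sasUrl); // use this as "sasUrl" in the UploadLogs payload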

Filter AWS resources using regex in aws-sdk-go

I have several different types of AWS resources tagged as xxx/yyy/<generated_id>, and I need to fetch them using aws-sdk-go.
Here is a sample for subnets; the filters look the same for every other resource.
This doesn't work:
var resp *ec2.DescribeSubnetsOutput
resp, err = d.ec2Client().DescribeSubnets(&ec2.DescribeSubnetsInput{
  Filters: []*ec2.Filter{
    {
      Name:   aws.String("vpc-id"),
      Values: []*string{&d.VpcId},
    },
    {
      Name:   aws.String(`tag:"xxx/yyy.[*]"`),
      Values: []*string{aws.String("owned")},
    },
  },
})
This does:
aws ec2 describe-subnets --filters 'Name=tag:"xxx/yyy.[*]",Values=owned'
I'm obviously doing something wrong; can someone point out what?
There is nothing in the API documentation to suggest that DescribeSubnets accepts a regular expression in filter names: https://docs.aws.amazon.com/AWSEC2/latest/APIReference/API_DescribeSubnets.html
If it works in the CLI, that's likely something the CLI is doing on top of what the SDK offers. The Go SDK is like any other AWS SDK; it exposes the AWS API in a language-specific way. The AWS CLI adds convenience features on top of the API to make it more useful on the command line, but that doesn't mean those features are exposed by the API or any published SDK.
I ran into this problem recently; my issue was the version of the SDK I was using.
Filters: []*ec2.Filter{
is for the v1 SDK module and was not working because I was importing github.com/aws/aws-sdk-go-v2/aws, while
Filters: []types.Filter{
is for v2, and that one worked in my case.
https://aws.amazon.com/blogs/developer/aws-sdk-for-go-version-2-general-availability/

How to create an app with Parse Server?

On parse.com, when I want to create a new app, I use:
curl -X POST \
-H "X-Parse-Email: <PARSE_ACCOUNT_EMAIL>" \
-H "X-Parse-Password: <PARSE_ACCOUNT_PASSWORD>" \
-H "Content-Type: application/json" \
-d '{"appName":"my new app","clientClassCreationEnabled":false}' \
https://api.parse.com/1/apps
But when I deployed Parse Server to Heroku and DigitalOcean, I didn't know how to create a new app, because my server doesn't have PARSE_ACCOUNT_EMAIL and PARSE_ACCOUNT_PASSWORD. When I deployed Parse Dashboard, it didn't have a "Create a new app" option like parse.com.
How can I create a new app with my self-hosted Parse Server?
A self-hosted Parse Server can only handle one app per server, at least for now.
This means you will have to use several installations of Parse, one app per installation, either on multiple servers or as multiple instances of Parse on the same server, with each instance configured to use a different port.
To answer your question: no, you do not need to use parse.com to create new apps.
To create a new app, you set the appId and master key in the Parse config/startup file on your DigitalOcean (or other) server.
The appId and key can be anything you make up; they do not need to come from parse.com.
Below is an example of the environment settings in a startup file:
Example file: ~/parse-server-example/my_app.js
var express = require('express');
var ParseServer = require('parse-server').ParseServer;

// Configure the Parse API
var api = new ParseServer({
  databaseURI: 'mongodb://localhost:27017/dev',
  cloud: __dirname + '/cloud/main.js',
  appId: 'myOtherAppId',
  masterKey: 'myMasterKey'
});

var app = express();

// Serve the Parse API on the /myparseapp URL prefix
app.use('/myparseapp', api);

// Listen for connections on port 9999
var port = 9999;
app.listen(port, function() {
  console.log('parse-server-example running on port ' + port + '.');
});
Then run the file with:
node my_app.js
You can read more here: Parse Server at Digital Ocean
There is an open issue for that: https://github.com/ParsePlatform/parse-dashboard/issues/188
For the moment, I just use Parse's hosted dashboard to create new apps. They say that on January 28th calls to their API will cease to function; they don't say that the hosted dashboard will be going away. I imagine that, if they don't get this feature into the self-hosted version, you'll still be able to create new apps within the hosted dashboard.
In any case, for now what I am doing is creating the app as I normally would in the hosted dashboard. I then run the migration tool at app > app settings > general > Migrate to external database. You have to add at least one class to the database in order for the migration tool to work; it will fail with an ambiguous error message if the app is completely fresh with a clean database.
Once the migration is done and reads/writes are hooked up to my self-hosted Parse Server, I provide the app's keys, etc. in the parse-dashboard-config.json file of my self-hosted Parse Dashboard. You can add multiple apps to this config file and thus manage all of your apps from a single self-hosted Parse Dashboard.
Here's an example of that config file with two apps:
{
  "apps": [
    {
      "serverURL": "https://my-parse-server-1.herokuapp.com/parse",
      "appId": "b44gL7uAB1z...lwUJneaoKdX9",
      "masterKey": "HrSqFbH...hfiwuCCOLDvHF",
      "appName": "parse-server-1"
    },
    {
      "serverURL": "https://my-parse-server-2.herokuapp.com/parse",
      "appId": "b44gL7uAB1z...lwUJneaoKdX9",
      "masterKey": "HrSqFbH...hfiwuCCOLDvHF",
      "appName": "parse-server-2"
    }
  ],
  "users": [
    {
      "user": "admin",
      "pass": "somePasswordHere"
    }
  ]
}
This seems to be the only way currently to create apps that you can connect to your self-hosted Parse Dashboard.
It's also important to note that, at the moment, it appears as though the self-hosted Parse Server package only supports a single app. I have no idea if there are any plans to support multiple apps as they have done with Parse Dashboard.
And finally, you can use the Parse Command Line tool to create new apps as well: https://parse.com/docs/cloudcode/guide#command-line-creating-a-parse-app
They also have some interesting integrations with Heroku that facilitate the entire process; that might be worth looking into. You could also create a simple Node app of your own with a GUI for creating new Parse apps: a simple form that, when submitted, is validated and then runs the command-line methods to create a new app via the ShellJS Node package. You could even modify the Parse Dashboard package to include this feature yourself within the self-hosted Dashboard.
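A minimal sketch of that idea, assuming Express, ShellJS, and a hypothetical CLI invocation for creating the app (substitute whatever command actually applies to your setup):

var express = require('express');
var bodyParser = require('body-parser');
var shell = require('shelljs');

var app = express();
app.use(bodyParser.urlencoded({ extended: false }));

app.post('/create-app', function (req, res) {
  var appName = (req.body.appName || '').trim();
  // Basic validation before touching the shell.
  if (!/^[\w-]{3,40}$/.test(appName)) {
    return res.status(400).send('Invalid app name');
  }
  // Hypothetical CLI call; replace with the command your setup uses.
  var result = shell.exec('parse new ' + appName, { silent: true });
  if (result.code !== 0) {
    return res.status(500).send(result.stderr);
  }
  res.send('Created app ' + appName);
});

app.listen(3000);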
