Is it possible to query the cost of an SMS or call in Sinch?

Hi, currently our project is using the Sinch iOS SDK and implements sending SMS and calls through verifyCode:completionHandler and initiateWithCompletionHandler.
As a backend developer, I need to get the price/cost of each SMS and call through the REST API.
How can I do this?

You should use callbacks, for several reasons including security. Take advantage of that callback and also log the price. The Verification Request Event body looks like this:
[RequestBody]
{
string - id
string - event
string - method
identity - identity
money - price
string? - reference
string? - custom
string[]? - acceptLanguage
}
https://www.sinch.com/docs/verification/rest/#VerificationRequestEvent
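A minimal sketch of a callback handler that pulls the price out of that body. The handler name, the storage shape, and the money-object field names (amount/currencyId) are assumptions for illustration, not part of the Sinch SDK:

```javascript
// Extract and log the price from a Verification Request Event callback body.
// The body shape follows the schema above; amount/currencyId are assumed names.
function handleVerificationEvent(body) {
  const { id, event, method, price } = body;
  const record = {
    verificationId: id,
    event,
    method,
    amount: price ? price.amount : null,
    currency: price ? price.currencyId : null,
  };
  console.log('verification cost', record);
  return record;
}

// Example callback payload (values invented for illustration)
const sample = {
  id: '123',
  event: 'VerificationRequestEvent',
  method: 'sms',
  price: { amount: 0.005, currencyId: 'USD' },
};
const logged = handleVerificationEvent(sample);
```

In a real backend this function would be the body of the webhook route that Sinch calls, and `record` would be persisted rather than just logged.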

Related

Can I query users by lastLoginTime in the Google Admin SDK?

In my organization, we want to filter users who have not logged in for the last 30 days.
Can I use query=lastLoginTime<2022-03-15 in the Google Admin SDK API?
I am using this API: https://admin.googleapis.com/admin/directory/v1/users?query=lastLoginTime<2022-03-15
Or is there any other API which returns the expected result?
Thanks,
I am afraid it is not possible at the moment. This is the list of query parameters you can use with that method so far:
name
email
givenName
familyName
isAdmin
isDelegatedAdmin
isSuspended
im
externalId
manager
managerId
directManager
directManagerId
address
addressPoBox
addressExtended
addressStreet
addressLocality
addressRegion
addressPostalCode
addressCountry
orgName
orgTitle
orgDepartment
orgDescription
orgCostCenter
phone
orgUnitPath
isEnrolledIn2Sv
isEnforcedIn2Sv
schemaName.fieldName
You can find additional info about each parameter in the official documentation here.
In addition, you can request this as a feature for this API here.
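For completeness, a sketch of building a users.list request with one of the supported query fields from the list above (the customer value is a placeholder, and the fetch line is only indicative; real calls need an OAuth bearer token):

```javascript
// Build a Directory API users.list URL using a supported query field
// (isSuspended here). 'my_customer' is the usual alias for the own account.
function buildUsersListUrl(query) {
  const base = 'https://admin.googleapis.com/admin/directory/v1/users';
  const params = new URLSearchParams({ customer: 'my_customer', query });
  return `${base}?${params.toString()}`;
}

const url = buildUsersListUrl('isSuspended=false');
console.log(url);
// The request itself would be sent with an OAuth bearer token, e.g.:
// fetch(url, { headers: { Authorization: `Bearer ${token}` } })
```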

bot framework and Microsoft teams - how to get all channels associated to a team?

I am trying to get all channels associated with a specific team so that my bot can send proactive messages. Based on the reading I've done, I need to use the FetchChannelList method in the Microsoft.Bot.Connector.Teams namespace, in the TeamsOperationsExtensions class.
If I do this:
var connector = new ConnectorClient(new Uri(activity.ServiceUrl));
ConversationList channels = connector.GetTeamsConnectorClient().Teams.FetchChannelList(activity.GetChannelData<TeamsChannelData>().Team.Id);
channels is null. If I break it down to only connector.GetTeamsConnectorClient(), that is not null, but connector.GetTeamsConnectorClient().Teams.FetchChannelList(activity.GetChannelData<TeamsChannelData>().Team.Id) is.
To break it down further, I tried to get activity.GetChannelData<TeamsChannelData>(). Only the Tenant property is not null. All the others (Channel, Team, EventType and Notification) are null.
I am using Tunnel Relay, which forwards messages sent to the bot's public endpoint to a private endpoint, and am using tenant filter authentication in the messages controller. Not sure if that would cause any problems? (When I watch messages coming in through Tunnel Relay, I see there too that Tenant is the only channelData property which is not null.) Here's what I see in Tunnel Relay:
"entities":[{"locale":"en- US","country":"US","platform":"Windows","type":"clientInfo"}],"channelData":{"tenant":{"id":"our_tenant_id"}}}
Also, regarding the teamId expected as a parameter to the FetchChannelList method, how do I find out what that is for a given team, other than via the GetChannelData() method? I tried the PowerShell cmdlet Get-Team (for example: Get-Team -User me@abc.com). It returns a distinct groupId for each team I am a part of, but I'm assuming groupId != teamId. Is that correct? And where can I find the teamId that FetchChannelList is expecting, other than through the GetChannelData method?
Thanks in advance for any help!
The problem here was that the message to the bot (the activity) was a direct message, not a part of a channel conversation. Apparently, the Channel and Team properties are only available in a channel conversation.
Also, regarding the team ID, one way to get it outside of code is to click the "..." next to the team and click "get link to team". You will see something like:
https://teams.microsoft.com/l/team/19%3a813345c7fafe437e8737057505224dc3%40thread.skype/conversations?groupId=Some_GUID&tenantId=Some_GUID
The segment after team/ (19%3a813345c7fafe437e871111115934th3%40thread.skype) contains the teamId, but URL-encoded: %3a is the encoding of : and %40 is the encoding of @. So if you replace the first % and the two characters immediately following it with :, and the second % and the two characters immediately following it with @, that is your teamId. So, from:
19%3a813345c7fafe437e871111115934th3%40thread.skype
the team ID is:
19:813345c7fafe437e871111115934th3@thread.skype
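That replacement is just URL-decoding, so decodeURIComponent does it in one step (the encoded segment is copied from the example link above):

```javascript
// Decode the team ID segment taken from the "get link to team" URL
const encoded = '19%3a813345c7fafe437e871111115934th3%40thread.skype';
const teamId = decodeURIComponent(encoded);
console.log(teamId); // 19:813345c7fafe437e871111115934th3@thread.skype
```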

LUIS - Can I have 2 languages (Chinese and English) in the same app and still get good results?

I am currently using MS LUIS for Chatbot.
Our country usually talks and chats using 2 languages, English and Chinese.
However, in LUIS I can only define one culture.
As a result, when my culture is set to English and I import Chinese text, the confidence level is very low (e.g. English - 0.88, Chinese - 0.1). The other way round is the same.
The situation is the same even after I tokenized the Chinese text using a library like Jieba or THULAC.
Therefore, when I tested, it was very easy to fall into an unrelated intent.
I would like to make LUIS recognize both English AND Chinese easily. Is there any way to solve this problem?
Thank you very much for your help.
I would like to make LUIS recognize both English AND Chinese easily. Is there any way to solve this problem?
Yes: separate your LUIS apps/projects, one per language, and use language detection before calling LUIS.
That's the official approach from LUIS docs (see here):
If you need a multi-language LUIS client application such as a chat bot, you have a few options. If LUIS supports all the languages, you develop a LUIS app for each language. Each LUIS app has a unique app ID and endpoint log. If you need to provide language understanding for a language LUIS does not support, you can use the Microsoft Translator API to translate the utterance into a supported language, submit the utterance to the LUIS endpoint, and receive the resulting scores.
For the language detection you can use, for example, the Text Analytics API from Microsoft Cognitive Services to get the text language, and then use that result to query the right LUIS app.
How to use it?
Documentation of language detection in the Text Analytics API: here
Text Analytics API: here
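A sketch of the routing step: detect the language, then pick the matching LUIS app. In production the detection would be a call to the Text Analytics language-detection endpoint; here a simple CJK-character heuristic stands in for it, and the endpoint URLs are placeholders:

```javascript
// Stand-in for language detection (an assumption, not the Text Analytics API):
// treat any text containing CJK Unified Ideographs as Chinese.
function detectLanguage(text) {
  return /[\u4e00-\u9fff]/.test(text) ? 'zh' : 'en';
}

// Map language code to the endpoint of the corresponding LUIS app
// (app IDs are placeholders)
const luisApps = {
  zh: 'https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/<chinese-app-id>',
  en: 'https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/<english-app-id>',
};

function pickLuisApp(utterance) {
  return luisApps[detectLanguage(utterance)];
}

console.log(pickLuisApp('book a flight')); // English app endpoint
console.log(pickLuisApp('我想订机票'));      // Chinese app endpoint
```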
As Nicolas mentioned above, you can build a multilanguage chat application with separate LUIS apps for each culture.
In order to have a single LUIS application, you could use the Translator Text API to translate all incoming messages before they're sent to LUIS. In this case you'll want to use middleware to handle the translation before your LUIS recognizer is called. You can also use middleware to translate your bot's responses, so you don't have to use additional localization inside of your bot.
Tokenization in LUIS is different for each language.
In the zh-cn culture, LUIS expects the simplified Chinese character set instead of the traditional character set.
Here is one more sample where you can select a language from the bot and continue the conversation as required.
After investigation and several tests, I think I found a way to do this properly:
First of all, I am currently using the MS Bot Framework NodeJS SDK (3.14.0) to create my bot.
The botbuilder module has a class called IntentDialog which accepts a list of recognizers. So I wrote something like this:
In luis.js
const builder = require("botbuilder")
// Setting for LUIS.ai
// Universal API key for both apps
let luisAPIKey = process.env.LuisAPIKey;
// First assign variables for Chinese LUIS app
let luisAppId_Chi = process.env.LuisAppId_Chi;
let luisAPIHostName_Chi = process.env.LuisAPIHostName_Chi || 'westus.api.cognitive.microsoft.com';
let LuisModelUrl_Chi = 'https://' + luisAPIHostName_Chi + '/luis/v2.0/apps/' + luisAppId_Chi + '?subscription-key=' + luisAPIKey + '&verbose=true';
// Then assign variables for English LUIS app
let luisAppId_Eng = process.env.LuisAppId_Eng;
let luisAPIHostName_Eng = process.env.LuisAPIHostName_Eng || 'westus.api.cognitive.microsoft.com';
let LuisModelUrl_Eng = 'https://' + luisAPIHostName_Eng + '/luis/v2.0/apps/' + luisAppId_Eng + '?subscription-key=' + luisAPIKey + '&verbose=true';
// Return an object with 2 attributes: Chi for Chinese LUIS and Eng for English LUIS
let luis = {};
luis.chi = new builder.LuisRecognizer(LuisModelUrl_Chi);
luis.eng = new builder.LuisRecognizer(LuisModelUrl_Eng);
module.exports = luis;
In app.js
const luis = require("./luis")
builder.IntentDialog({ recognizers: [luis.chi, luis.eng] });
And when I tested in the Bot Emulator, it seems that it will first check the LUIS Chi app and then go to the LUIS Eng app.
I don't know what criteria/threshold this recognizer uses to control whether it jumps to the other app or not, but at present it works for me to a certain extent. It is not perfectly accurate, but it is at least a good start. :D
No MS Text Translation API needed.
By the way, the code would look nicer if I could get the topIntent and LUIS path right in the session variable.
Hope it helps someone.

Recognize the user input inside the handler function using dialogflow.ai/api.ai and Microsoft Bot Builder

I am building a bot using the given technology stacks:
Microsoft Bot Builder
Node.js
Dialogflow.ai (api.ai)
We used the waterfall model to implement a matched-intent dialog, which includes a couple of handler functions and prompts. Within this scenario, I need to identify the entities for a user input inside an inner handler function.
e.g.:
Bot: Where do you want to fly?
User: Singapore. (For this we added entities like SIN - Singapore, with SIN as a synonym, so I need to resolve the value as SIN.)
Any help on this scenario is much appreciated.
Here is a post, Using api.ai with microsoft bot framework, that you can refer to for your requirement, with a sample at https://github.com/GanadiniAkshay/weatherBot/blob/master/api.ai/index.js. The weather API key leveraged in this sample is out of date, but the waterfall and recognizer usage still works.
Generally speaking:
Use api-ai-recognizer
Instantiate the apiairecognizer and leverage builder.IntentDialog to include the recognizer:
var recognizer = new apiairecognizer("<api_key>");
var intents = new builder.IntentDialog({
recognizers: [recognizer]
});
In IntentDialogs, use builder.EntityRecognizer.findEntity(args.entities,'<entity>'); to recognize the intent entities.
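A runnable sketch of what that lookup does. The entity type 'destination' and the payload shape are illustrative assumptions; with botbuilder v3 you would call builder.EntityRecognizer.findEntity(args.entities, 'destination') inside the intent handler instead of the local stand-in below:

```javascript
// Minimal stand-in for builder.EntityRecognizer.findEntity: return the first
// entity whose type matches, or null. This copy just makes the example
// runnable without the botbuilder package.
function findEntity(entities, type) {
  return entities.find((e) => e.type === type) || null;
}

// args.entities roughly as a recognizer might return it for "Singapore"
// (shape assumed for illustration)
const args = {
  entities: [{ type: 'destination', entity: 'SIN' }],
};

const dest = findEntity(args.entities, 'destination');
console.log(dest ? dest.entity : 'no destination found'); // SIN
```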

Aurelia Validation - possible to validate one by one?

I have a simple login form - username and password. I'd like to validate each field but only upon success of the previous one i.e. only validate password if email has succeeded. This is because I'm targeting mobile and I really only want to display the first error message due to limited real estate.
Is this possible? It seems like something that ought to be possible with the fluent API:
this.validator = validation.on(this)
.ensure("email")
.isNotEmpty()
.isEmail()
.isBlocking() //something like
.ensure("password").isNotEmpty();
It is possible to create more than one validator.
this.emailValidator = validation.on(this)
.ensure("email")
.isNotEmpty()
.isEmail()
.isBlocking();
this.passwordValidator = validation.on(this)
.ensure("password")
.isNotEmpty();
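With two validators, you can chain them so the password rules only run once email validation succeeds, which gives the one-error-at-a-time behavior on mobile. The validators are simulated here as promise-returning objects; with aurelia-validation you would call this.emailValidator.validate() and this.passwordValidator.validate() in their place:

```javascript
// Simulated validators standing in for the two aurelia-validation instances
const emailValidator = { validate: () => Promise.resolve('email ok') };
const passwordValidator = { validate: () => Promise.resolve('password ok') };

// Run the validators in sequence; the chain short-circuits on the first
// rejection, so only the first failing validator's message is surfaced.
function validateInSequence() {
  return emailValidator.validate()
    .then(() => passwordValidator.validate())
    .then(() => 'all valid')
    .catch((err) => `first failure: ${err}`);
}

validateInSequence().then(console.log);
```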
