Following the steps here (https://learn.microsoft.com/ja-jp/azure/bot-service/bot-service-manage-analytics?view=azure-bot-service-4.0), I can check the bot analytics. I think this data comes from Azure Application Insights. But how could I export it using Power BI, or any other tool? I need the same charts.
By the way, I'm using Bot Framework V3.
Yes, this is possible.
First, I would recommend reading this blog article as it shows a bit of how the bot analytics are structured and might give insight into what you might want to query for.
Once you have your query returning the correct data you want in Application Insights Logs (Analytics), you can query it from Power BI by exporting the M language script from Logs and opening it in Power BI, which will then pull the data you need. Please see how to do that step here.
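If you prefer to pull the data programmatically instead of (or in addition to) the M-script export, the Application Insights REST API can run the same query. Below is a minimal sketch, assuming the requests library; the app ID, API key, and the Kusto query (including the customEvents event name) are placeholders you would adapt to whatever you built in Logs (Analytics).

```python
# Sketch: run a Kusto query against the Application Insights REST API and
# print the resulting rows. App ID, API key, and event name are placeholders.
import requests

APP_ID = "<your-application-insights-app-id>"
API_KEY = "<your-api-key>"  # created under "API Access" in the App Insights resource

# Assumed event name; check what your bot's telemetry actually logs.
QUERY = """
customEvents
| where name == "Activity"
| summarize messages = count() by bin(timestamp, 1d)
| order by timestamp asc
"""

resp = requests.get(
    f"https://api.applicationinsights.io/v1/apps/{APP_ID}/query",
    headers={"x-api-key": API_KEY},
    params={"query": QUERY},
    timeout=30,
)
resp.raise_for_status()

# The response is a set of tables (columns + rows) that you can feed into
# Power BI, pandas, or any other charting tool.
table = resp.json()["tables"][0]
for row in table["rows"]:
    print(row)
```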
Related
I have a Java application where users with different roles log in and perform various activities. I am tracking each user by their user ID and role, and creating custom reports in Google Analytics through GTM. In those reports I display which user with which role logged in, how many times, on which dates they visited, etc. In GA I created custom reports that give charts and a table. The charts GA produces are shown below.
In the admin section of my Java Spring application I want to display the above graph. Please suggest the steps and actions I need to follow to integrate the GA report graphs into my Java application.
Regards,
Prabhash Mishra
GA4 has a reporting API (the Data API) you can use to pull the data. It may be quite cumbersome to pull the data through it, so there's another option: you can export your data to BigQuery and then ETL it from there using public libraries for working with BigQuery.
Going the BigQuery route also makes debugging much easier, since you can see what the data really looks like directly in BigQuery.
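To illustrate the first option, here is a minimal sketch using the GA4 Data API's Python client (google-analytics-data). The property ID and the custom dimension names for user ID and role are assumptions; the Java and Node clients follow the same request shape, and the BigQuery route would replace this with a SQL query over the exported events tables.

```python
# Sketch: pull per-user, per-role, per-day event counts from the GA4 Data API.
# Property ID and custom dimension names are placeholders.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

client = BetaAnalyticsDataClient()  # authenticates via GOOGLE_APPLICATION_CREDENTIALS

request = RunReportRequest(
    property="properties/<GA4_PROPERTY_ID>",
    dimensions=[
        Dimension(name="customUser:userId"),    # assumed user-scoped custom dimension
        Dimension(name="customUser:userRole"),  # assumed user-scoped custom dimension
        Dimension(name="date"),
    ],
    metrics=[Metric(name="eventCount")],
    date_ranges=[DateRange(start_date="30daysAgo", end_date="today")],
)

response = client.run_report(request)
for row in response.rows:
    dims = [d.value for d in row.dimension_values]
    print(dims, row.metric_values[0].value)
```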
For a given Google API, is there any way to dynamically check usage against any of the current limits for that API?
For example, this page https://developers.google.com/classroom/limits?hl=en shows that I can query the Classroom API 4,000,000 times per client per day. At midday, without going to the API Console, how could I know that I've already hit 3 million queries?
I'm hoping that there's a billing or usage API that covers this, but can't see it.
Note: I'm not having any issue right now with a specific call, just anticipating that my usage will scale up significantly in the next few months, so I am looking for a monitoring solution rather than advice on not hitting the limits at all. My specific use case is Google Classroom, but reading more widely around this I can't see a general solution either.
Answer:
No, you can't retrieve this information dynamically.
Feature Request:
You can however let Google know that this is a feature that is important for the Google Workspace APIs to have, and that you would like to request they implement it.
The page to file a feature request for the Google Classroom API is here; as there is no specific component for Google Workspace APIs in general, I would suggest filing it there.
You can use Google's Cloud Monitoring API to achieve this. This is the documentation page for the API:
https://cloud.google.com/monitoring/api/v3
These are the documentation pages for the relevant metrics:
https://cloud.google.com/monitoring/api/metrics_gcp#serviceruntime/quota/allocation/usage
https://cloud.google.com/monitoring/api/metrics_gcp#serviceruntime/quota/exceeded
https://cloud.google.com/monitoring/api/metrics_gcp#serviceruntime/quota/limit
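For illustration, here is a minimal sketch using the Cloud Monitoring Python client (google-cloud-monitoring) to read one of the quota metrics above for a given service over the last day. The project ID and service name are placeholders, and depending on whether the quota is a rate or an allocation quota you may need a different metric type than the one shown.

```python
# Sketch: list quota-usage time series for a service via Cloud Monitoring.
# Project ID and service filter are placeholders.
import time
from google.cloud import monitoring_v3

PROJECT = "projects/<your-project-id>"

client = monitoring_v3.MetricServiceClient()

now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {"start_time": {"seconds": now - 24 * 3600}, "end_time": {"seconds": now}}
)

results = client.list_time_series(
    request={
        "name": PROJECT,
        "filter": (
            'metric.type = "serviceruntime.googleapis.com/quota/allocation/usage" '
            'AND resource.labels.service = "classroom.googleapis.com"'
        ),
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)

for series in results:
    quota_name = series.metric.labels.get("quota_metric", "")
    for point in series.points:
        print(quota_name, point.value.int64_value)
```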
My chatbot has been down for the last few days (the quota for free conversations has been exceeded), and I am seriously thinking of moving to a paid plan.
Please see the attached error returned when initiating a conversation from the emulator.
The documentation to move to a paid plan is not clear at all.
Can someone please list the steps I have to take, or point me to the proper documentation?
Technology: MS Bot Framework with Node.js (LUIS, QnA, language detection)
I suppose I have to create a key in Azure and insert it in the luis.ai app, but searching for LUIS in Azure returns no results.
Grateful if you guys can help.
It seems you are trying to set up LUIS through the Azure portal and want to know more about pricing, but cannot find LUIS in the portal.
Follow the steps below to see how to do that.
LUIS:
Step 1:
For LUIS, go to the Azure portal, click Create a resource, and search for Language Understanding. See the screenshot below:
Step 2:
Once you type Language Understanding you will get the option below. See the screenshot below:
Once you are done with the setup, click Keys and Endpoint to get the key and endpoint to call from your application. See the screenshot below:
The correct official documentation references are also given here:
1. Luis Official Document
2. Pricing
For your information, pricing varies by region. The usual price is 50 transactions per second, at $1.50 per 1,000 (one thousand) transactions.
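Once you have the key and endpoint from the Keys and Endpoint blade, calling LUIS from your bot is a plain HTTP request. Here is a minimal sketch against the v3 prediction endpoint (shown in Python for brevity; the same request works from Node.js with any HTTP client). The endpoint host, app ID, and key are placeholders.

```python
# Sketch: query the LUIS v3 prediction endpoint with the resource key.
import requests

ENDPOINT = "https://<your-resource-name>.cognitiveservices.azure.com"
APP_ID = "<your-luis-app-id>"
PREDICTION_KEY = "<your-prediction-key>"

resp = requests.get(
    f"{ENDPOINT}/luis/prediction/v3.0/apps/{APP_ID}/slots/production/predict",
    params={
        "subscription-key": PREDICTION_KEY,
        "query": "book a flight to Paris",
        "show-all-intents": "true",
    },
    timeout=10,
)
resp.raise_for_status()

prediction = resp.json()["prediction"]
print(prediction["topIntent"], prediction["intents"])
```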
QnA Maker:
For QnA Maker you can follow the same steps as above. See the following screenshot:
QnA Maker Official Document
Pricing
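To sketch how the QnA Maker runtime is called once the resource and knowledge base are set up (the runtime host, knowledge base ID, and endpoint key below are placeholders; an equivalent call can be made from Node.js):

```python
# Sketch: query a published QnA Maker knowledge base.
import requests

RUNTIME_HOST = "https://<your-qna-resource>.azurewebsites.net"
KB_ID = "<your-knowledge-base-id>"
ENDPOINT_KEY = "<your-endpoint-key>"

resp = requests.post(
    f"{RUNTIME_HOST}/qnamaker/knowledgebases/{KB_ID}/generateAnswer",
    headers={"Authorization": f"EndpointKey {ENDPOINT_KEY}"},
    json={"question": "How do I reset my password?", "top": 1},
    timeout=10,
)
resp.raise_for_status()

for answer in resp.json()["answers"]:
    print(answer["score"], answer["answer"])
```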
Language Detection:
If you want to detect the language of text, you can use Text Analytics, which provides advanced natural language processing and can identify the language of the text with language detection. See the screenshot below:
Language Detection Official Document
Pricing
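Here is a minimal sketch of language detection with the Text Analytics client library (shown with the Python package azure-ai-textanalytics; the endpoint and key are placeholders, and a Node.js client is also available):

```python
# Sketch: detect the language of a few documents with Text Analytics.
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

client = TextAnalyticsClient(
    endpoint="https://<your-text-analytics-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

documents = ["Bonjour tout le monde", "Hello world"]
for doc in client.detect_language(documents):
    if not doc.is_error:
        print(doc.primary_language.name, doc.primary_language.iso6391_name)
```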
Note: all of these services have language support for Node.js. Hope this helps.
In LUIS, we have text/synonym recommendation in the entity/phrase list generation. Does anyone have an idea of the API that is used for the synonym recommendation? We wanted to integrate a Cognitive Services API to get synonyms of the input text in one of our client applications, but we could not find any relevant Microsoft service for this.
[I am unable to provide sample code because of the nature of the question]
Luis Text Analytics
Text Analytics API
Thanks
Unfortunately, we don't have any API for synonym recommendation yet. You can request this as a feature here.
To give some context, we are trying to build reporting functionality based on Yammer activity/usage information.
Questions
1. We are not able to find any analytics API at https://developer.yammer.com/documentation/ except the Data Export API. Please let us know if there is any other API related to usage analytics.
2. Is there any way to execute our own queries against Yammer's big data to get the usage information?
   • The data export does not provide information such as 'like', 'share' and 'followed by'. Is there any other way to export Yammer data including these missing items?
3. How can we remove deleted entries from previously collected data if we do periodic data exports?
4. How do third-party tool companies like GoodData connect to Yammer for analytics data? This will help us understand the approach involved.
Can someone help us with this?
Have you tried this: http://blogs.msdn.com/b/richard_dizeregas_blog/archive/2014/04/09/yammer-analytics-with-excel-and-power-bi.aspx (Yammer API and Excel 2013 for bigger data)?
Having said that, I think the tool is currently broken, as I seem to be able to download 114% of my stats and then get some blank CSV files which then fail in Power Pivot.
You need to be a verified admin to access this tool.
Cheers
Rich
About Question 2
There are good ways of doing it through the API. You can use Excel and Power Query or a home-made .NET data extractor. Like, Share and Followed By are possible with the API but not with the useless Data Export feature. Be aware the API has some issues when trying to retrieve all data; I'm currently trying to get these issues fixed.
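As a rough illustration of that extractor approach (sketched here in Python rather than .NET), you can page through the Yammer messages REST endpoint and read the like counts directly; the OAuth token is a placeholder, and the field names should be checked against the current API responses:

```python
# Sketch: walk the Yammer messages feed and print like counts per message.
import requests

TOKEN = "<your-yammer-oauth-token>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

params = {}
while True:
    resp = requests.get(
        "https://www.yammer.com/api/v1/messages.json",
        headers=HEADERS,
        params=params,
        timeout=30,
    )
    resp.raise_for_status()
    messages = resp.json().get("messages", [])
    if not messages:
        break
    for m in messages:
        likes = m.get("liked_by", {}).get("count", 0)
        print(m["id"], m["created_at"], likes)
    params["older_than"] = messages[-1]["id"]  # page backwards through history
```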
About Question 4
They need the account of a trusted admin. Thanks to a token they will be able to access all your network's messages and provide you with most of the analytics you need.
They have good webinars for their tool. The only issue is that they copy your data into their environment in the US (which may not be acceptable for European customers).