Azure IoT Edge module logs - Task upload logs failed because of error

I was following the experimental built-in log pull feature described here:
https://github.com/Azure/iotedge/blob/master/doc/built-in-logs-pull.md
I am trying to upload logs from the Azure portal (using a direct method under each module) with the following payload.
PAYLOAD:
{
  "schemaVersion": "1.0",
  "sasUrl": "https://veeaiotcentralstorage.blob.core.windows.net/iotedgeruntimelogs/iotedgeruntimelogs.txt?sv=2019-02-02&st=2020-08-08T08%3A56%3A00Z&se=2020-08-14T08%3A56%3A00Z&sr=b&sp=rw&sig=xyz",
  "items": [
    {
      "id": "zigbee_template-arm64v8",
      "filter": {
        "tail": 10
      }
    }
  ],
  "encoding": "none",
  "contentType": "text"
}
After checking the task status, I am getting the error below.
ERROR:
{"status":200,"payload":{"status":"Failed",
"message":"Task upload logs failed because of error Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.",
"correlationId":"b85002d8-d8f9-49d5-851d-9123a8d7d740"}}
Please let me know where the issue is.

Digging into the code some more, I noticed the UploadLogs implementation doesn't create a container, but rather a folder structure within the container that you supply. As far as I can tell, the casing restriction applies when creating a blob container, but there is no such restriction on creating folders within the container.
Please check the SAS URL that you supplied, or the configuration on the storage end. Double-check that your SAS URL is generated for a pre-existing blob container.
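If the container already exists, a container-level SAS (sr=c) rather than the blob-level SAS (sr=b) in the failing payload should work. Here is a minimal sketch of generating one with the Azure.Storage.Blobs SDK; the account key is a placeholder, and the read/write permissions are an assumption mirroring the sp=rw in the question:

using System;
using Azure.Storage;
using Azure.Storage.Sas;

// Placeholder values: substitute your own storage account name and key.
var accountName = "veeaiotcentralstorage";
var accountKey = "<account-key>";
var containerName = "iotedgeruntimelogs";

var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = containerName,
    Resource = "c",                               // "c" = container-level SAS (sr=c), not "b"
    ExpiresOn = DateTimeOffset.UtcNow.AddDays(7)
};
// Assumed permissions: read + write, matching sp=rw in the original payload.
sasBuilder.SetPermissions(BlobContainerSasPermissions.Read | BlobContainerSasPermissions.Write);

var credential = new StorageSharedKeyCredential(accountName, accountKey);
var sasUrl = $"https://{accountName}.blob.core.windows.net/{containerName}" +
             $"?{sasBuilder.ToSasQueryParameters(credential)}";

// Use this as the "sasUrl" value in the upload-logs payload (no blob name in the path).
Console.WriteLine(sasUrl);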

Unable to save configuration of Teams connector

For the past two days, I have been trying to configure an MS Teams connector following this tutorial:
https://learn.microsoft.com/en-us/learn/modules/msteams-webhooks-connectors/7-exercise-o365-connectors
I configured the connector via the Connectors Developer Dashboard.
Then I tried both cloning and reconfiguring this sample:
https://github.com/OfficeDev/TrainingContent/tree/master/Teams/60%20Webhooks%20O365%20Connectors/Demos/03-o365-connector
and also bootstrapping the project via yo teams, following the tutorial step-by-step.
After building the project and serving it via ngrok, I can sideload the connector into Teams (I tried both the desktop app and the web client). It successfully brings me to the configuration page but never allows me to save the connector settings. I always get this error:
Unable to save “My First Teams Connector” connector configuration. Please try again.
I adapted the code and debugged it to see that the call to /api/connector/connect succeeds and saveEvent.notifySuccess() is called.
Then I noticed that, right after saving the connector via the browser, this error appears in the console:
{
  "seq": 1597590187271,
  "timestamp": 1597593891957,
  "flightSettings": {
    "Name": "ConnectorFrontEndSettings",
    "AriaSDKToken": "d127f72a3abd41c9b9dd94faca947689-d58285e6-3a68-4cab-a458-37b9d9761d35-7033",
    "SPAEnabled": true,
    "ClassificationFilterEnabled": true,
    "ClientRoutingEnabled": true,
    "EnableYammerGroupOption": true,
    "EnableFadeMessage": false,
    "EnableDomainBasedOwaConnectorList": false,
    "EnableDomainBasedTeamsConnectorList": false,
    "DevPortalSPAEnabled": true,
    "ShowHomeNavigationButtonOnConfigurationPage": false,
    "DisableConnectToO365InlineDeleteFeedbackPage": true
  },
  "status": 500,
  "clientType": "SkypeSpaces",
  "connectorType": "f39fe17c-6452-4879-b692-a93d73684348",
  "name": "handleMessageError"
}
Any idea what could be incorrectly configured, or whether there is a place to check for a more descriptive error? The desktop Teams log was not helpful either.
ConnectorID: f39fe17c-6452-4879-b692-a93d73684348
So, what really helped me in the end for that particular tutorial:
1. Run gulp ngrok-serve.
2. Configure the connector following the tutorial (with valid domains, excluding the protocol) on the Connectors Developer Dashboard.
3. Extract the content of the packaged connector.
4. Adapt the extracted manifest.json with the newly created connector ID (both occurrences; see the sketch below).
5. Repackage the connector as a zip.
6. Upload it to Teams and configure it.
I filled the valid domains field with xxxxxxxx.ngrok.io, which is the domain of my configuration page.
Be careful: if you update an existing connector, these changes apparently need time to be taken into account. To be sure, you can create a fresh new connector.
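For illustration, a minimal sketch of where the connector ID typically lives in a yo teams manifest.json; the GUID, schema version, and configuration URL below are placeholders, and the exact field set depends on your manifest version:

{
  "$schema": "https://developer.microsoft.com/en-us/json-schemas/teams/v1.5/MicrosoftTeams.schema.json",
  "manifestVersion": "1.5",
  "id": "00000000-0000-0000-0000-000000000000",
  "connectors": [
    {
      "connectorId": "00000000-0000-0000-0000-000000000000",
      "configurationUrl": "https://xxxxxxxx.ngrok.io/config.html",
      "scopes": [ "team" ]
    }
  ]
}

Both the top-level id and connectors[0].connectorId should carry the ID generated on the Connectors Developer Dashboard; those are the "both occurrences" mentioned in step 4.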

Grant Google App Maker Full Google Drive API Scope / Server Rejected Error

I have an App Maker application using the Google Drive file picker widget, but uploading files does not work; I get a server rejected error.
API Upload Response
{
  "errorMessage": {
    "reason": "REQUEST_REJECTED",
    "additionalInfo": {
      "uploader_service.GoogleRupioAdditionalInfo": {
        "completionInfo": {
          "status": "REJECTED"
        },
        "requestRejectedInfo": {
          "reasonDescription": "agent_rejected"
        }
      }
    },
    "upload_id": "UrqXRaPdnG_DZ0L-iDX5BJsG4XEg"
  }
}
I believe the issue is the app's OAuth scopes. It appears to only have read access, which is insufficient for uploading new files.
I'd like to grant it the full access scope (https://www.googleapis.com/auth/drive), but I'm not clear how to do this.
This behavior is a bug, and it was introduced recently as far as I'm aware. To be honest, the full Drive scope should be granted when the simple uploads feature is enabled in the Drive picker, but for some reason it is not. As a workaround, you can put the following in any server script:
/***
 Dummy function for full Drive permissions.
 It is never invoked; its mere presence makes App Maker request the full Drive scope.
****/
function dummyForDrivePermission() {
  // Referencing a DriveApp write method forces the https://www.googleapis.com/auth/drive scope.
  var file = DriveApp.createFile('dummy.txt', 'dummy content');
}
The above code will force App Maker to recognize that it needs to upload files. You will never actually invoke the function, but it will serve the purpose of granting the scope you need.

How to send a local file path through the QnA Create Knowledgebase API?

I am referring to this QnA Create Knowledgebase API documentation. I want to create the knowledge base from an Excel file that is stored on a local path; I don't have a URL for the Excel file, only the local path.
I followed the code given at the GitHub link, removed the unnecessary parts, and kept the variable "kb" as shown below:
static string kb = $@"
{{
  'name': 'VivekKB',
  'qnaList': [],
  'urls': [],
  'files': [ '{DBFile}' ]
}}";
DBFile is the filename with the full path. When I run the code, it creates an empty knowledge base; it doesn't upload the Excel file I specified. Can you please help me figure out how to upload local Excel QnA data directly to the QnA store? I want to avoid manually uploading the Excel file to knowledge bases at https://www.qnamaker.ai.
Thanks in advance.
Vivek
Connected with the QnA team directly, asking the following:
Question: Is it possible to use a local file path to create KB through the programmatic API?
I know the programmatic QnA API docs demonstrate how to add a PDF file that is also available online:
"files": [
{
"fileName": "SurfaceManual.pdf",
"fileUri": "https://download.microsoft.com/download/2/9/B/29B20383-302C-4517-A006-B0186F04BE28/surface-pro-4-user-guide-EN.pdf"
}
However, when I tried "POST Create Knowledgebase" in Postman using a local file path in the "fileUri", I get an "invalid uri" error.
Request Body
{
  "name": "Simple QnA",
  "files": [
    {
      "fileName": "simpleQnaSource.docx",
      "fileUri": "C:\\Users\\v-asho\\Documents\\RandomWordDocs"
    }
  ]
}
Response
{
  "error": {
    "code": "BadArgument",
    "message": "Invalid input. See details.",
    "details": [
      {
        "code": "ValidationFailure",
        "message": "File Uri has one or more invalid uri.",
        "target": "Files[0].FileUri"
      }
    ]
  }
}
Uploading the .docx file through the qnamaker.ai portal successfully creates a KB; it's specifically through the programmatic API that I'm having issues.
QnA Team's Answer:
"fileUri" must be a publicly available and downloadable URI.
When using the API, first upload the contents of the local file to a publicly accessible location (for example, an Azure storage blob with public access).
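A minimal sketch of that workaround with the Azure.Storage.Blobs SDK; the connection string and container name are placeholders, and note that PublicAccessType.Blob makes the file readable by anyone who has the URL:

using System;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

// Placeholder values: substitute your own storage connection string and container.
var container = new BlobContainerClient("<storage-connection-string>", "qna-sources");
container.CreateIfNotExists(PublicAccessType.Blob);   // blobs are publicly downloadable

var blob = container.GetBlobClient("simpleQnaSource.docx");
blob.Upload(@"C:\Users\v-asho\Documents\RandomWordDocs\simpleQnaSource.docx", overwrite: true);

// blob.Uri is now a public URI that can go in the "fileUri" field of the request body.
Console.WriteLine(blob.Uri);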

Visual Studio diagnostics configuration error in event hub set up

I am trying to set up streaming from an Azure VM scale set to an event hub via Diagnostics configuration.
I have my public config, which includes the SinksConfig as follows (I have omitted the rest of the config for brevity):
{
  "WadCfg": {
    "DiagnosticMonitorConfiguration": {
      *** config for performance counters and ETW ***
      "SinksConfig": {
        "Sink": [
          {
            "name": "eventhub",
            "EventHub": {
              "Url": "sb://myhub.servicebus.windows.net/mycompanyapplication",
              "SharedAccessKeyName": "RootManageSharedAccessKey"
            }
          }
        ]
      }
    }
  },
  "StorageAccount": "<storageaccount>"
}
and the private config:
{
  "storageAccountName": "<storageaccountname>",
  "storageAccountKey": "<storageaccountkey>",
  "storageAccountEndPoint": "https://core.windows.net",
  "EventHub": {
    "Url": "sb://myhub.servicebus.windows.net/mycompanyapplication",
    "SharedAccessKeyName": "RootManageSharedAccessKey",
    "SharedAccessKey": "<sharedaccesskey>"
  }
}
However, nothing is being received by the event hub. I can see in the storage account logs that the Diagnostics extension is running, but in the substatus there are many errors around the SAS key and the event hub (screenshots omitted). When I check back in the Visual Studio diagnostics configuration on the scale set, I see an error there as well.
I have checked the naming convention on the SharedAccessKeyName (which is the default provided when the event hub was set up) and I know the SAS key works, as I wrote a console app that sends messages to the same event hub with the same credentials, and it worked fine.
So there is obviously a problem with the authentication to the event hub, as it can't read the access key from the config file. However, I can't see any other way of providing it.
Am I missing something obvious here in my config?
Turns out the problem was quite simple: I had grabbed the URL from the connection string in the portal, which was
sb://myhub.servicebus.windows.net/mycompanyapplication
when it should have been
https://myhub.servicebus.windows.net/mycompanyapplication
Now the data is flowing freely into the event hub.
However, the diagnostics config in VS still shows the warning about not being able to read the SAS key, which now looks like a red herring that ended up costing me a lot of time :(
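For reference, the working sink section differs from the original public config only in the URL scheme; presumably the matching Url in the private config needs the same change:

"SinksConfig": {
  "Sink": [
    {
      "name": "eventhub",
      "EventHub": {
        "Url": "https://myhub.servicebus.windows.net/mycompanyapplication",
        "SharedAccessKeyName": "RootManageSharedAccessKey"
      }
    }
  ]
}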

Downloading native documents from Google Drive

I'd like to download files from Google Drive in their native format, using the Google Drive SDK (in order to perform some manipulation on them, and upload them back to Google Drive).
Obviously, I can use the export links to convert to another format (Office, etc.), but that means the file will be converted from native to Office and then back to native format during upload. I'm trying to avoid this, as I expect it will not maintain 100% fidelity.
I've tried the following request:
GET /drive/v2/files/1q7BYvDDYWwXXXXXXXxIxsuby0IrXe5L4?alt=media
Authorization: Bearer ya29.XXXXXXXXXXXXXtKVg3P3zg
But the response I got was:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "badRequest",
        "message": "The specified file does not support the requested alternate representation.",
        "locationType": "parameter",
        "location": "alt"
      }
    ],
    "code": 400,
    "message": "The specified file does not support the requested alternate representation."
  }
}
Not being able to get the content of a file in its native format is a big "hole" in the Drive API.
It's not possible to download Google Docs in their native format. Your only choice is to export to HTML, Word, ODT, etc., and then re-import.
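To make the export route concrete, here is a sketch against the v2 API; the file ID, token, and the export link shown are placeholders. Request the file's metadata, read its exportLinks map, then GET the link for the format you want:

GET /drive/v2/files/FILE_ID HTTP/1.1
Authorization: Bearer ACCESS_TOKEN

The metadata response includes an exportLinks map along the lines of:

"exportLinks": {
  "application/vnd.openxmlformats-officedocument.wordprocessingml.document":
    "https://docs.google.com/feeds/download/documents/export/Export?id=FILE_ID&exportFormat=docx"
}

A plain GET on that link with the same Authorization header downloads the converted copy; re-uploading it with the v2 convert=true parameter turns it back into a Google Doc, with the fidelity caveats described in the question.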
