Synapse pipeline - Blob storage event trigger - Pipeline failing with Microsoft.DataTransfer.Common.Shared.HybridDeliveryException - azure-blob-storage

I have a copy activity on a storage event trigger. The pipeline is triggered by a blob being added to storage (ADLS Gen2). However, the pipeline's copy activity fails with the error below when run by the storage trigger. The pipeline runs successfully using Run/Debug, but fails on StorageEventTrigger (and ManualTrigger).
Operation on target Copy failed: ErrorCode=UnsupportedDataStoreEndpoint,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The data store endpoint is not supported in 'AzureBlobFS' connector. Error message : 'The domain of this endpoint is not in allow list. Original endpoint: '::redacted::.blob.core.windows.net'',Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.DataTransfer.SecurityValidation.Exceptions.UrlValidationException,Message=The domain of this endpoint is not in allow list. Original endpoint: '::redacted::.blob.core.windows.net',Source=Microsoft.DataTransfer.SecurityValidation,'
Trigger payload:
{
  "topic": "/subscriptions/<redacted>/resourceGroups/<redacted>/providers/Microsoft.Storage/storageAccounts/<redacted>",
  "subject": "/blobServices/default/containers/<redacted>/blobs/<redacted>",
  "eventType": "Microsoft.Storage.BlobCreated",
  "id": "<redacted>",
  "data": {
    "api": "PutBlob",
    "clientRequestId": "<redacted>",
    "requestId": "<redacted>",
    "eTag": "<redacted>",
    "contentType": "text/plain",
    "contentLength": 214,
    "blobType": "BlockBlob",
    "blobUrl": "https://<redacted>.blob.core.windows.net/<redacted>",
    "url": "https://<redacted>.blob.core.windows.net/<redacted>",
    "sequencer": "<redacted>",
    "identity": "$superuser",
    "storageDiagnostics": {
      "batchId": "<redacted>"
    }
  },
  "dataVersion": "",
  "metadataVersion": "1",
  "eventTime": "2022-10-05T22:31:48.3346541Z"
}
UPDATE: According to Copy and transform data in Azure Data Lake Storage Gen2 using Azure Data Factory or Azure Synapse Analytics, for system-assigned managed identity authentication the AzureBlobFS connector must have:
Property: url
Description: Endpoint for Data Lake Storage Gen2 with the pattern of https://<accountname>.dfs.core.windows.net
Required: Yes
My trigger payload returns blob.core.windows.net, which seems to interfere with the copy activity.

Resolved by creating a new linked service to the Blob storage account. The only difference was using the account selection method "From Azure subscription" instead of "Enter manually".
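Based on the UPDATE above, the difference presumably comes down to the linked service ending up with the dfs endpoint rather than the blob endpoint. A minimal sketch of what such an AzureBlobFS linked service definition looks like, with <accountname> as a placeholder and "AdlsGen2LinkedService" as a hypothetical name (exact properties depend on your authentication method):

{
  "name": "AdlsGen2LinkedService",
  "properties": {
    "type": "AzureBlobFS",
    "typeProperties": {
      "url": "https://<accountname>.dfs.core.windows.net"
    }
  }
}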

Related

How to connect internal private DB2 to Cognos Dynamic Dashboard Embedded on IBM Cloud

I'm working on Cognos Dashboard Embedded using the reference from Cognos Dashboard embedded, but instead of CSV I'm working with JDBC data sources.
I'm trying to connect to the JDBC data source as follows:
"module": {
"xsd": "https://ibm.com/daas/module/1.0/module.xsd",
"source": {
"id": "StringID",
"jdbc": {
"jdbcUrl": "jdbcUrl: `jdbc:db2://DATABASE-HOST:50000/YOURDB`",
"driverClassName": "com.ibm.db2.jcc.DB2Driver",
"schema": "DEFAULTSCHEMA"
},
"user": "user_name",
"password": "password"
},
"table": {
"name": "ROLE",
"description": "description of the table for visual hints ",
"column": [
{
"name": "ID",
"description": "String",
"datatype": "BIGINT",
"nullable": false,
"label": "ID",
"usage": "identifier",
"regularAggregate": "countDistinct",
},
{
"name": "NAME",
"description": "String",
"datatype": "VARCHAR(100)",
"nullable": true,
"label": "Name",
"usage": "identifier",
"regularAggregate": "countDistinct"
}
]
},
"label": "Module Name",
"identifier": "moduleId"
}
Note: my database is hosted on a private network, not on a public IP address.
So when I add the above code to add data sources, the data does not load from my DB.
Even though I provided the correct user and password for the JDBC connection in the code above, when I drag and drop any field from the data source a pop-up opens asking me for a user ID and password, and even after I fill in the user ID and password in the pop-up I'm still unable to load the data.
Errors:
1. When any module tries to fetch data, it calls the API
'https://dde-us-south.analytics.ibm.com/daas/v1/data?moduleUrl=%2Fda......'
but in my case this API fails with Status Code: 403 Forbidden.
2. In SignOnDialog.js, at line 98, the call to the saveDataSourceCredential method fails with "saveDataSourceCredential is not a function".
Expectation:
It should not open a pop-up asking for a user ID and password; the data should load directly, just as it does for databases hosted on public IP domains.
This does not work in general. If you are using any type of functionality hosted outside your network that needs to access an API or data on your private network, there needs to be some communication channel.
That channel could be established by setting up a VPN, by using products like IBM Secure Gateway to create a client/server connection between IBM Cloud and your Db2 host, or even by setting up a direct link between your company network and the (IBM) cloud.

Bot framework direct line using POST with JSON data

I currently use Bot Framework with Azure Functions.
It works like this: when the user sends a message, it is written to queue storage, picked up by an Azure Function, and sent back to the bot via the function's built-in Direct Line connector.
I want to change this functionality to a Logic App and return the answer to the user over Direct Line using HTTP REST.
I have a key, and the JSON input the function receives looks like this:
{
  "relatesTo": {
    "user": {
      "id": "default-user",
      "name": "User"
    },
    "bot": {
      "id": "b5023440-b1ce-11e8-9ad8-f5b615a4c6c3",
      "name": "Bot"
    },
    "conversation": {
      "id": "33cd0410-bf46-11e8-a228-a5c7cd21a798|livechat"
    },
    "channelId": "emulator",
    "serviceUrl": "https://0a87dff1.ngrok.io"
  },
  "text": "example",
  "isTrustedServiceUrl": true
}
I try to answer the chat using
https://directline.botframework.com/v3/directline/conversations/{conversationId}/activities
I can't make it work; the conversation id looks different, it's like a GUID instead of an id.
Who can help me with the right POST syntax for the JSON provided?
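For reference, the Direct Line 3.0 REST flow works in two steps: first a conversation is started with POST https://directline.botframework.com/v3/directline/conversations (Authorization: Bearer <Direct Line secret or token>), which returns the GUID-like conversationId, and then activities are posted to the .../conversations/{conversationId}/activities endpoint above. A minimal activity body, reusing the user id from the payload in the question, would look roughly like this:

{
  "type": "message",
  "from": {
    "id": "default-user"
  },
  "text": "example"
}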

Update Azure Event Grid function subscription with dead-letter storage

I have successfully created an event trigger on storage blob creation, on a storage account called receivingtestwesteurope, under resource group omni-test, which is received via a function called ValidateMetadata. I created this via the portal GUI. However I now want to add deadletter/retry policies, which can only be done via the CLI.
The working trigger is like this:
{
  "destination": {
    "endpointBaseUrl": "https://omnireceivingprocesstest.azurewebsites.net/admin/extensions/EventGridExtensionConfig",
    "endpointType": "WebHook",
    "endpointUrl": null
  },
  "filter": {
    "includedEventTypes": [
      "Microsoft.Storage.BlobCreated"
    ],
    "isSubjectCaseSensitive": null,
    "subjectBeginsWith": "/blobServices/default/containers/snapshots/blobs/",
    "subjectEndsWith": ".png"
  },
  "id": "/subscriptions/fa6409ab-1234-1234-1234-85dd2b3ceab4/resourceGroups/omni-test/providers/Microsoft.Storage/StorageAccounts/receivingtestwesteurope/providers/Microsoft.EventGrid/eventSubscriptions/png",
  "labels": [
    ""
  ],
  "name": "png",
  "provisioningState": "Succeeded",
  "resourceGroup": "omni-test",
  "topic": "/subscriptions/fa6409ab-1234-1234-1234-85dd2b3ceab4/resourceGroups/omni-test/providers/microsoft.storage/storageaccounts/receivingtestwesteurope",
  "type": "Microsoft.EventGrid/eventSubscriptions"
}
First I thought I could update the existing event with a deadletter queue:
az eventgrid event-subscription update --name png --deadletter-endpoint receivingtestwesteurope/blobServices/default/containers/eventgrid
Which returns:
az: error: unrecognized arguments: --deadletter-endpoint
receivingtestwesteurope/blobServices/default/containers/eventgrid
Then I tried via REST Patch:
https://learn.microsoft.com/en-us/rest/api/eventgrid/eventsubscriptions/update
scope: /subscriptions/fa6409ab-1234-1234-1234-85dd2b3ceab4/resourceGroups/omni-test/providers/microsoft.storage/storageaccounts/receivingtestwesteurope
eventSubscriptionName: png
api-version: 2018-05-01-preview
Body:
"deadletterdestination": {
"endpointType": "StorageBlob",
"properties": {
"blobContainerName": "eventgrid",
"resourceId": "/subscriptions/fa6409ab-1234-1234-1234-85dd2b3ceab4/resourceGroups/omni-test/providers/microsoft.storage/storageaccounts/receivingtestwesteurope"
}}
Which returns
"Model state is invalid."
===================
Final working solution:
{
  "deadletterdestination": {
    "endpointType": "StorageBlob",
    "properties": {
      "blobContainerName": "eventgrid",
      "resourceId": "/subscriptions/fa6409ab-1234-1234-1234-85dd2b3ceab4/resourceGroups/omni-test/providers/microsoft.storage/storageaccounts/receivingtestwesteurope"
    }
  }
}
Have a look at Manage Event Grid delivery settings, where turning on dead-lettering is described in detail. Note that you have to install the eventgrid extension:
az extension add --name eventgrid
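With the extension installed, the dead-letter destination can typically be set in a single update call. A sketch under the assumption of a current CLI version (parameter names have changed between versions, so treat this as an illustration rather than the exact command; the --deadletter-endpoint value is the resource ID of the target blob container):

az eventgrid event-subscription update \
  --name png \
  --source-resource-id "/subscriptions/fa6409ab-1234-1234-1234-85dd2b3ceab4/resourceGroups/omni-test/providers/Microsoft.Storage/storageAccounts/receivingtestwesteurope" \
  --deadletter-endpoint "/subscriptions/fa6409ab-1234-1234-1234-85dd2b3ceab4/resourceGroups/omni-test/providers/Microsoft.Storage/storageAccounts/receivingtestwesteurope/blobServices/default/containers/eventgrid"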
You can also use the REST API for updating your event subscription for dead-lettering.
Besides that, I have just released my tiny tool Azure Event Grid Tester, which helps with an Azure Event Grid model on the local machine.
Update:
The following is a deadletterdestination property:
"deadletterdestination": {
"endpointType": "StorageBlob",
"properties": {
"blobContainerName": "{containerName}",
"resourceId": "/subscriptions/{subscriptionId}/resourceGroups/{resgroup}/providers/Microsoft.Storage/storageAccounts/{storageAccount}"
}
}
You can use Event Subscriptions - Update (REST API PATCH) with the above property. Note that api-version=2018-05-01-preview must be used.

Microsoft Graph - Can't read/write the calendar of other users

I have a web app registered on Azure with the goal of being able to read and write the calendars of other users. To do so, I set these permissions for this app on Azure.
However, when I try to, for example, create a new event for a given user, I get an error message. Here's what I'm using:
Endpoint
https://graph.microsoft.com/v1.0/users/${requester}/calendar/events
HTTP Header
Content-Type: application/json
Request Body
{
  "subject": "${subject}",
  "body": {
    "contentType": "HTML",
    "content": "${remarks}"
  },
  "start": {
    "dateTime": "${startTime}",
    "timeZone": "${timezone}"
  },
  "end": {
    "dateTime": "${endTime}",
    "timeZone": "${timezone}"
  },
  "location": {
    "displayName": "${spaceName}",
    "locationEmailAddress": "${spaceEmail}"
  },
  "attendees": [
    {
      "emailAddress": {
        "address": "${spaceEmail}",
        "name": "${spaceName}"
      },
      "type": "resource"
    }
  ]
}
Error message
{
  "error": {
    "code": "ErrorItemNotFound",
    "message": "The specified object was not found in the store.",
    "innerError": {
      "request-id": "XXXXXXXXXXXXXXXX",
      "date": "2018-07-11T09:16:19"
    }
  }
}
Is there something I'm missing? Thanks in advance for any help!
Solution update
I managed to solve the problem by following the steps described in this link:
https://developer.microsoft.com/en-us/graph/docs/concepts/auth_v2_service
From your screenshot it's visible that you used application permissions (although it would be nice to include this information in your question).
Depending on the kind of permission you have granted, you need to use the proper flow to obtain an access token (on behalf of a user, or as a service). For application permissions you have to use the flow for a service, not on behalf of a user.
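For reference, the service (client credentials) token request described in the linked auth_v2_service documentation looks roughly like this, with {tenant}, {appId} and {secret} as placeholders for your own values:

POST https://login.microsoftonline.com/{tenant}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

client_id={appId}
&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default
&client_secret={secret}
&grant_type=client_credentials

The token returned by this flow carries the application's roles rather than delegated scopes, which is what the /users/{id}/calendar/events call needs when application permissions are used.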
You can also check your token using jwt.io and make sure its payload contains the appropriate role. If it doesn't, it's very likely you used the incorrect flow.
Regarding its expiration time, you may have found information about the refresh token (for example here). Keep in mind that it applies only to rights granted on behalf of a user. For access without a user you should make sure you know when your token is going to expire and request a new one accordingly.

Google Logging API - What service name to use when writing entries from non-Google application?

I am trying to use Google Cloud Logging API to write log entries from a web application I'm developing (happens to be .net).
To do this, I must use the logging.projects.logs.entries.write request. This request dictates that I provide a serviceName argument:
{
  "entries": [
    {
      "textPayload": "test",
      "metadata": {
        "serviceName": "compute.googleapis.com",
        "projectId": "...",
        "region": "us-central1",
        "zone": "us-central1-a",
        "severity": "DEFAULT",
        "timestamp": "2015-01-13T19:17:01Z",
        "userId": ""
      }
    }
  ]
}
Unless I specify "compute.googleapis.com" as the serviceName, I get an error 400 response:
{
  "error": {
    "code": 400,
    "message": "Unsupported service specified",
    "status": "INVALID_ARGUMENT"
  }
}
For now, using "compute.googleapis.com" seems to work, but I'm asking: what service name should I give, given that I'm not using Google Compute Engine or Google App Engine here?
The Cloud Logging API currently only officially supports Google resources, so the best course of action is to continue to use "compute.googleapis.com" as the service and supply the labels "compute.googleapis.com/resource_type" and "compute.googleapis.com/resource_id", which are used for indexing and visible in the UI drop-downs.
We also currently permit the service name "custom.googleapis.com" with index labels "custom.googleapis.com/primary_key" and "custom.googleapis.com/secondary_key" but that is not officially supported and subject to change in a future release.
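To illustrate, the entry from the question with those labels attached might look like the following. The placement of the labels on the entry metadata is an assumption based on the v1beta3 LogEntryMetadata labels field (double-check it against the API reference), and "my-app" and "instance-1" are hypothetical values:

{
  "entries": [
    {
      "textPayload": "test",
      "metadata": {
        "serviceName": "compute.googleapis.com",
        "projectId": "...",
        "severity": "DEFAULT",
        "timestamp": "2015-01-13T19:17:01Z",
        "labels": {
          "compute.googleapis.com/resource_type": "my-app",
          "compute.googleapis.com/resource_id": "instance-1"
        }
      }
    }
  ]
}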
