Error creating a Databricks datasource using the Power BI REST API - azure-databricks

I successfully used the "Gateways - Create Datasource" method (https://learn.microsoft.com/en-us/rest/api/power-bi/gateways/create-datasource) from the Power BI REST API to create a SQL datasource, but I'm stuck when I try to create a Databricks datasource.
I saw that there is no Databricks datasource kind, but it seemed possible to use the kind "Extension", so I tried the payload below.
Note: I generated the credentials part using the data gateway public key and a Databricks key.
What am I missing, or doing wrong?
{
    "datasourceName": "Databricks AIDA Teste",
    "datasourceType": "Extension",
    "connectionDetails": {
        "path": "{\"host\":\"adb-xxx.azuredatabricks.net\",\"httpPath\":\"\\/sql\\/1.0\\/warehouses\\/xxx\"}",
        "kind": "Databricks"
    },
    "credentialDetails": {
        "credentialType": "Key",
        "credentials": "xxx",
        "privacyLevel": "Organizational"
    }
}
The request fails with:
{
    "error": {
        "code": "BadRequest",
        "message": "Bad Request",
        "details": [
            {
                "message": "Unexpected character encountered while parsing value: {. Path 'connectionDetails', line 4, position 30.",
                "target": "datasourceToGatewayRequest.connectionDetails"
            },
            {
                "message": "'datasourceToGatewayRequest' is a required parameter",
                "target": "datasourceToGatewayRequest"
            }
        ]
    }
}
Many thanks!
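Not a confirmed fix, but the first parser error hints that connectionDetails needs to be a JSON-encoded string rather than a nested object. A minimal Python sketch under that assumption (the gateway ID, token, and encrypted credentials blob are placeholders):

import json
import requests

# Placeholders - use your real gateway ID and an AAD access token
GATEWAY_ID = "xxx"
ACCESS_TOKEN = "xxx"

# Serialize connectionDetails as a *string*: the parser error above chokes
# on the raw "{" at 'connectionDetails', which suggests the API expects the
# object JSON-encoded rather than nested.
connection_details = json.dumps({
    "path": json.dumps({
        "host": "adb-xxx.azuredatabricks.net",
        "httpPath": "/sql/1.0/warehouses/xxx",
    }),
    "kind": "Databricks",
})

payload = {
    "datasourceName": "Databricks AIDA Teste",
    "datasourceType": "Extension",
    "connectionDetails": connection_details,
    "credentialDetails": {
        "credentialType": "Key",
        "credentials": "xxx",  # encrypted with the gateway public key, as above
        "privacyLevel": "Organizational",
    },
}

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/gateways/{GATEWAY_ID}/datasources",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=payload,
)
print(resp.status_code, resp.text)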

Related

Microsoft graph API - unable to use $filter operation in "toRecipients" array

Hello everyone, I hope you're doing well.
I want to filter the Graph API response based on "toRecipients", which is an array, so I used a lambda expression, but it gives an error:
"error": {
"code": "ErrorInvalidUrlQueryFilter",
"message": "The query filter contains one or more invalid nodes.",
"innerError": {
"date": "2021-09-22T06:04:17",
"request-id": "c6077cd4-dbec-4671-9c11-10e547917d29",
"client-request-id": "66dfbc92-2482-11f3-86f9-22652a4e4e00"
}
}
My actual response is:
"toRecipients": [
    {
        "emailAddress": {
            "name": "abc",
            "address": "abc@abc.com"
        }
    }
],
I used this request to filter:
https://graph.microsoft.com/v1.0/me/mailFolders/sentItems/messages?&$top=1000&$search="abc@abc.com"
The Graph API calls the underlying Office 365 API.
According to the documentation, the toRecipients property is not filterable.
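Since toRecipients is not filterable, one workaround is $search with the "recipients" KQL property. A minimal sketch (assuming the Python requests library; the token is a placeholder and the address is the example one above):

import requests

ACCESS_TOKEN = "xxx"  # placeholder - a valid Microsoft Graph access token

# toRecipients is not filterable, but messages support $search with KQL
# mail properties; "recipients" matches the to/cc/bcc addresses.
resp = requests.get(
    "https://graph.microsoft.com/v1.0/me/mailFolders/sentItems/messages",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"$search": '"recipients:abc@abc.com"'},
)
print(resp.json())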

Problem creating "Global-OptionSet" attribute using CRM Dynamics WebApi

I'm trying to create a "Global OptionSet"-attribute (sd_MyAttribute) for an existing entity (entity ID = 70816501-edb9-4740-a16c-6a5efbc05d84) via Dynamics CRM WebAPI.
The JSON I send via POST is:
{
    "@odata.type": "Microsoft.Dynamics.CRM.PicklistAttributeMetadata",
    "OptionSet": {
        "@odata.type": "Microsoft.Dynamics.CRM.OptionSetMetadata",
        "IsGlobal": true,
        "Name": "sd_MyPickList",
        "OptionSetType": "Picklist",
        "MetadataId": "a50cfc0a-e206-ea11-a811-000d3ab82e70"
    },
    "AttributeType": "Picklist",
    "SchemaName": "sd_MyAttribute",
    "Description": {
        "@odata.type": "Microsoft.Dynamics.CRM.Label",
        "LocalizedLabels": [
            {
                "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel",
                "Label": "This is the attribute I want to create.",
                "LanguageCode": 1033
            }
        ]
    },
    "DisplayName": {
        "@odata.type": "Microsoft.Dynamics.CRM.Label",
        "LocalizedLabels": [
            {
                "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel",
                "Label": "This is the attribute I want to create.",
                "LanguageCode": 1033
            }
        ]
    },
    "RequiredLevel": {
        "Value": "None",
        "CanBeChanged": true
    }
}
I expected to get a status 204 response, indicating that a new Picklist attribute on the entity using the sd_MyPickList option set has been created.
Unfortunately, the response is:
{
    "error": {
        "code": "0x80048403",
        "message": "Only Local option set can be created through the attribute create. IsGlobal flag must be set to 'false'.",
        "innererror": {
            "message": "Only Local option set can be created through the attribute create. IsGlobal flag must be set to 'false'.",
            "type": "Microsoft.Crm.CrmException",
            "stacktrace": " ...)"
        }
    }
}
There is already an issue in the GitHub project (see https://github.com/MicrosoftDocs/dynamics-365-customer-engagement/issues/601), but I wonder whether there is a way around this problem: what JSON do I need to send to create an attribute addressing a global option set? Has anyone successfully created such an entity attribute via the Web API?
In my use case I don't have the option of using an existing library, and importing a solution is not an option either.
It would be perfect if someone could provide a simple JSON payload that can be sent, e.g. for the Contact entity and any global option set.
Finally, I found a way to accomplish what I need. To reference the global option set, I need to use the "@odata.bind" annotation in the JSON data. For an attribute "sd_MyAttribute" that uses the global option set with the MetadataId "62654906-7A0b-ea11-a817-000d3ab826fd", I need to POST:
{
    "@odata.type": "Microsoft.Dynamics.CRM.PicklistAttributeMetadata",
    "GlobalOptionSet@odata.bind": "/GlobalOptionSetDefinitions(62654906-7A0b-ea11-a817-000d3ab826fd)",
    "AttributeType": "Picklist",
    "SchemaName": "sd_MyAttribute",
    "Description": { ... },
    "DisplayName": { ... },
    "RequiredLevel": { ... }
}
If the entity has the MetadataId "70916b01-edb2-4840-a16b-6a2efbc75d84", the URI for the POST is "/api/data/v9.0/EntityDefinitions(70916b01-edb2-4840-a16b-6a2efbc75d84)/Attributes" (logical or schema names are not supported).
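Putting it together as a runnable sketch (assuming the Python requests library; the org URL, token, and GUIDs are the placeholders from above):

import requests

# Hypothetical placeholders - substitute your org URL, token, and GUIDs
ORG_URL = "https://yourorg.crm.dynamics.com"
ACCESS_TOKEN = "xxx"
ENTITY_METADATA_ID = "70916b01-edb2-4840-a16b-6a2efbc75d84"
GLOBAL_OPTION_SET_ID = "62654906-7A0b-ea11-a817-000d3ab826fd"

def label(text):
    # Helper to build a localized CRM label (LCID 1033 = English)
    return {
        "@odata.type": "Microsoft.Dynamics.CRM.Label",
        "LocalizedLabels": [{
            "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel",
            "Label": text,
            "LanguageCode": 1033,
        }],
    }

payload = {
    "@odata.type": "Microsoft.Dynamics.CRM.PicklistAttributeMetadata",
    # Bind to the existing global option set instead of defining a local one
    "GlobalOptionSet@odata.bind":
        f"/GlobalOptionSetDefinitions({GLOBAL_OPTION_SET_ID})",
    "AttributeType": "Picklist",
    "SchemaName": "sd_MyAttribute",
    "Description": label("This is the attribute I want to create."),
    "DisplayName": label("This is the attribute I want to create."),
    "RequiredLevel": {"Value": "None", "CanBeChanged": True},
}

resp = requests.post(
    f"{ORG_URL}/api/data/v9.0/EntityDefinitions({ENTITY_METADATA_ID})/Attributes",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
             "OData-Version": "4.0"},
    json=payload,
)
print(resp.status_code)  # expect 204 No Content on success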
I hope my question and answer help someone who gets the same error message.

API Discovery Service returns an error only for BigQuery

The API Discovery Service for BigQuery had been working well, but recently it suddenly started returning an error.
Case 1 (fails): https://www.googleapis.com/discovery/v1/apis/bigquery/v2/rest?fields=kind
Case 2 (works): https://www.googleapis.com/discovery/v1/apis/bigquery/v2/rest
Case 3 (works): https://www.googleapis.com/discovery/v1/apis/discovery/v1/rest?fields=kind
Google's API Discovery Service has a fields parameter. It works for some APIs, such as discovery (case 3), but not for bigquery (case 1).
Case 1 returns:
{
    "error": {
        "code": 400,
        "message": "Request contains an invalid argument.",
        "status": "INVALID_ARGUMENT",
        "details": [
            {
                "@type": "type.googleapis.com/google.rpc.BadRequest",
                "fieldViolations": [
                    {
                        "field": "kind",
                        "description": "Error expanding 'fields' parameter. Cannot find matching fields for path 'kind'."
                    }
                ]
            }
        ]
    }
}
It works if the fields parameter is removed (case 2).
google-api-javascript-client has the same issue.
I think this is a bug on Google's side, or am I making a mistake somewhere?
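For reference, a minimal Python script that reproduces the three cases (just the URLs above):

import requests

BASE = "https://www.googleapis.com/discovery/v1/apis"

# Case 1: bigquery discovery doc restricted to "kind" - returns 400
print(requests.get(f"{BASE}/bigquery/v2/rest", params={"fields": "kind"}).status_code)

# Case 2: the same document without the fields parameter - returns 200
print(requests.get(f"{BASE}/bigquery/v2/rest").status_code)

# Case 3: discovery's own document with fields=kind - returns 200
print(requests.get(f"{BASE}/discovery/v1/rest", params={"fields": "kind"}).status_code)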
This was indeed a Google issue and is now fixed.

Data Factory Copy Activity met an internal service error

I have an ADF pipeline that copies 34 tables from an on-premises Oracle database to an Azure Data Lake Store; 32 of these copy just fine on a daily basis, while the other 2 consistently fail with...
Copy activity met an internal service error.
For more information, provide this message to customer support. ErrorCode: 8601 GatewayNodeName=XXXXXXXX,
ErrorCode=SystemErrorOdbcWrapperError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,
Message=Unknown error from wrapper.,
Source=Microsoft.DataTransfer.ClientLibrary.Odbc.OdbcConnector,
''Type=Microsoft.DataTransfer.ClientLibrary.Odbc.Runtime.ValueException,Message=[DataSource.Error] The ODBC driver returned an invalid value.,Source=Microsoft.DataTransfer.ClientLibrary.Odbc.Wrapper,'.
The activity JSON is templated, so it is identical for all 34 activities. I can run the oracleReaderQuery in Oracle SQL Developer using the same connection details and credentials and get results.
Searches for this have turned up one unanswered question here on Stack Overflow and another on a Microsoft forum with a response that says "We will get back to you ASAP when we have new updates"... but there are no updates.
It seems I am not the only one having this issue; has anyone found a solution?
I have tried a one-off copy in ADF and get the same result; I have tried copying the table to blob storage and get the same result.
Can anyone help me work out what is wrong with this, please?
The activity JSON is as follows...
{
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "OracleSource",
            "oracleReaderQuery": "SELECT stuff FROM <source table>"
        },
        "sink": {
            "type": "AzureDataLakeStoreSink",
            "writeBatchSize": 0,
            "writeBatchTimeout": "00:00:00"
        }
    },
    "inputs": [
        { "name": "<source table dataset>" },
        { "name": "<scheduling dependency dataset>" }
    ],
    "outputs": [
        { "name": "<destination dataset>" }
    ],
    "policy": {
        "timeout": "02:00:00",
        "concurrency": 1,
        "retry": 3,
        "longRetry": 2,
        "longRetryInterval": "03:00:00",
        "executionPriorityOrder": "OldestFirst"
    },
    "scheduler": {
        "frequency": "Day",
        "interval": 1
    },
    "name": "Copy Activity 34",
    "description": "copy activity"
}
As I said though, this is identical, apart from the table it is accessing, to the 32 activities that work perfectly fine.
What's the data type of stuff in your table?
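One way to answer that question outside ADF is to inspect the column metadata the driver reports. A hypothetical sketch using the python-oracledb driver (not part of ADF; connection details and table name are placeholders):

import oracledb  # pip install oracledb

# Placeholders - the same connection details the gateway uses
conn = oracledb.connect(user="xxx", password="xxx", dsn="host:1521/service")

with conn.cursor() as cur:
    cur.execute("SELECT stuff FROM source_table FETCH FIRST 1 ROWS ONLY")
    # cursor.description holds name, type, precision, and scale per column;
    # NUMBER columns with undefined precision/scale and exotic types are
    # common suspects for ODBC mapping errors like the one above.
    for col in cur.description:
        print(col.name, col.type_code, col.precision, col.scale)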

Non-unique query with the Freebase MQL read Google API

It seems I can only run unique queries (i.e. ones that include an entity id) with the new Freebase MQL read API.
The following searches on id and type:
https://www.googleapis.com/freebase/v1/mqlread?query={"name":null,"id":"/en/bob_dylan","type":"/people/person"}
and successfully returns:
{
    "result": {
        "type": "/people/person",
        "id": "/en/bob_dylan",
        "name": "Bob Dylan"
    }
}
The following searches with type only:
https://www.googleapis.com/freebase/v1/mqlread?query={"name":null,"type":"/people/person"}
or
https://www.googleapis.com/freebase/v1/mqlread?query={"name":[],"type":"/people/person"}
and returns the following error:
{
    "error": {
        "errors": [
            {
                "domain": "global",
                "reason": "badRequest",
                "message": "Unique query may have at most one result. Got 100"
            }
        ],
        "code": 400,
        "message": "Unique query may have at most one result. Got 100"
    }
}
I expected it to return a list of people's names.
You have to wrap your query in [ ], as in the following example:
https://www.googleapis.com/freebase/v1/mqlread?query=[{"name":[],"type":"/people/person"}]
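For completeness, a small Python sketch that issues the list-wrapped query (standard library only):

import json
import urllib.parse
import urllib.request

# Wrapping the query dict in a list tells mqlread to return multiple results
query = [{"name": [], "type": "/people/person"}]
url = ("https://www.googleapis.com/freebase/v1/mqlread?query="
       + urllib.parse.quote(json.dumps(query)))

with urllib.request.urlopen(url) as resp:
    print(json.load(resp))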
I too faced a similar problem recently. The best way to make sure you get a single result is to use the "limit": 1 parameter in your MQL query.
For example:
https://www.googleapis.com/freebase/v1/mqlread?query={"type":[],"name":"india","limit":1}
