Azure Data Factory with Blob SFTP giving access error in integration runtime - azure-blob-storage

I am working on a POC where I am creating an ADF pipeline that needs to pick up a file from a source SFTP and move it to a target SFTP. As this is a POC, I am using the Azure Blob Storage SFTP feature, which is in preview for now. But when I create the linked service using the SFTP connection string, I get the error below:
Error code: 9978
Details: Access 'storageacc.container.user@acc.blob.core.windows.net' is not allowed on the Azure integration runtime.
I searched a lot but couldn't find any solution. If you know the issue, please help me. Thanks in advance.

Your Host field can't accept 'storageacc.container.user@acc.blob.core.windows.net'. It has to be divided into two parts:
acc.blob.core.windows.net - for the [Host] field
storageacc.container.user - for the [User name] field
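If it helps to sanity-check the split values outside ADF first, here is a minimal sketch using the paramiko Python library (the host and username are the ones from the error message above; the SSH password is a placeholder):

import paramiko

# Hedged sketch: verify the Blob SFTP credentials outside ADF.
# Host and username are the values from the error message; replace the password
# with the SSH password generated for the container's local user.
transport = paramiko.Transport(("acc.blob.core.windows.net", 22))
transport.connect(username="storageacc.container.user", password="<ssh-password>")

sftp = paramiko.SFTPClient.from_transport(transport)
print(sftp.listdir("."))  # list the container root to prove the connection works

sftp.close()
transport.close()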

Related

How to overcome error 400 in Watson Discovery Upload Data

I am new to IBM Cloud. I deleted my Watson Discovery service by mistake. Afterwards, I created a new service and there was no issue. But when I try to upload data to Watson Discovery, I get error 400 "Only one free environment is allowed per resource group". I'm on the Lite plan.
Any help?
Log in to your IBM Cloud account, go to https://cloud.ibm.com/shell, and run the following commands:
ibmcloud resource reclamations
The above command lists all resource reclamations under your account. To know which resource to delete, check the Entity CRN and copy its ID, then use the command below to delete the resource:
ibmcloud resource reclamation-delete [ID] --force
Replace [ID] with the ID of the resource to delete.
Maybe it is too late, but I found some information under this link: https://cloud.ibm.com/docs/discovery?topic=discovery-gs-api.
It mentions something like: "If you have recently deleted a Lite instance and then receive a 400 - Only one free environment is allowed per resource group error message when creating a new environment in a new Lite instance, you need to finish deleting the original Lite instance. See ibmcloud resource reclamations and follow the reclamation-delete instructions."
Further information can also be found here: https://cloud.ibm.com/docs/cli?topic=cloud-cli-ibmcloud_commands_resource#ibmcloud_resource_reclamations

Azure Storage Explorer - Inadequate resource type access

I am attempting to use the Microsoft Azure Storage Explorer, attaching with a SAS URI. But I always get the error:
Inadequate resource type access. At least service-level ('s') access is required.
Here is my SAS URI with portions obfuscated:
https://ti<...>hare.blob.core.windows.net/?sv=2018-03-28&ss=b&srt=co&sp=rwdl&se=2027-07-01T00:00:00Z&st=2019-07-01T00:00:00Z&sip=52.<...>.235&spr=https&sig=yD%2FRUD<...>U0%3D
And here is my connection string with portions obfuscated:
BlobEndpoint=https://tidi<...>are.blob.core.windows.net/;QueueEndpoint=https://tidi<...>hare.queue.core.windows.net/;FileEndpoint=https://ti<...>are.file.core.windows.net/;TableEndpoint=https://tid<...>hare.table.core.windows.net/;SharedAccessSignature=sv=2018-03-28&ss=b&srt=co&sp=rwdl&se=2027-07-01T00:00:00Z&st=2019-07-01T00:00:00Z&sip=52.<...>.235&spr=https&sig=yD%2FRU<...>YU0%3D
It seems like the problem is with the construction of my URI/endpoints/connection string/etc., rather than with the permissions granted to me on the server, because when I click Next, the error displays instantaneously. I don't believe it even tried to reach the server.
What am I doing wrong? (As soon as I get this working, I'll be using the URI/etc to embed in my C# app for programmatic access.)
What you are missing is service-level access, which is controlled by the "srt" (signed resource types) part of the URI.
Your URI has srt=co (container and object) but also needs "s" (service). You need to create a new SAS key; it can be generated in the portal, Azure CLI, or PowerShell.
In the portal, go into the storage account and select what you need:
Allowed services (if you are looking for blob)
Blob
Allowed resource types
Service (make sure this one is activated)
Container
Object
Allowed permissions (select all of these to do everything)
Read
Write
Delete
List
Add
Create
If you need more info look here:
https://learn.microsoft.com/en-us/rest/api/storageservices/create-account-sas?redirectedfrom=MSDN
If you'd like to create the SAS key with the CLI, use this:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-user-delegation-sas-create-cli
If you'd like to create the SAS key with PowerShell, use this:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-user-delegation-sas-create-powershell
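If you'd rather generate the account SAS programmatically, here is a minimal sketch assuming the azure-storage-blob Python SDK (account name and key are placeholders); ResourceTypes(service=True, ...) is what adds the required "s" to srt:

from datetime import datetime, timedelta
from azure.storage.blob import generate_account_sas, ResourceTypes, AccountSasPermissions

# Hedged sketch: build an account-level SAS whose srt is "sco".
sas_token = generate_account_sas(
    account_name="<storage-account-name>",
    account_key="<storage-account-key>",
    resource_types=ResourceTypes(service=True, container=True, object=True),  # srt=sco
    permission=AccountSasPermissions(
        read=True, write=True, delete=True, list=True, add=True, create=True
    ),
    expiry=datetime.utcnow() + timedelta(days=365),
)

# Paste this URI into Storage Explorer when attaching with a SAS.
print("https://<storage-account-name>.blob.core.windows.net/?" + sas_token)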
I had a similar issue trying to connect to the blob container using a Shared Access Signature (SAS) URL, and this worked for me:
Instead of generating the SAS URL in Azure Portal, I used Azure Storage Explorer.
Right click the container that you want to share -> "Get Shared Access Signature"
Select the Expiry time and permissions and click create
This URL should work when your client/user tries to connect to the container.
Cheers
I had the same problem and managed to get this to work by hacking the URL and changing "srt=co" to "srt=sco". It seems to need the "s".

MS Integration Runtime data factory

I'm kind of new to the integration runtime.
I had a pipeline running with no issues, but recently we had an AD upgrade and my user for the local on-premises SQL DB changed from 'bluecompany\joe' to 'redcompany\joe'.
This has caused my Data Factory to stop working properly, as it can't connect to the on-premises SQL Server.
I can't seem to find the place where I can update this.
Error:
Copy activity encountered a user error at Source side: Integration Runtime (Self-hosted) Node Name=ORG200016,ErrorCode=UserErrorFailedToConnectToSqlServer,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Cannot connect to SQL Server: 'org200016.bluecompany.com.au', Database: 'GroupRisk', User: 'bluecompany\joe'.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.ComponentModel.Win32Exception,Message=This user can't sign in because this account is currently disabled,Source=Microsoft.DataTransfer.ClientLibrary,'.
Any ideas would be very welcome. Thank you.
As your login account has changed, I think you will need to update the account in the corresponding linked service, where you entered your credentials for this database previously.
Be sure the test connection succeeds after you edit the linked service. Then the pipeline should be able to connect to your database again.
Depending on which version of ADF you're using, there are different ways to update your linked service:
Log in to https://portal.azure.com/ and find your data factory (if you don't have an account to log in to the portal, you need to find the admin who created this linked service and ask them to update it for you).
If you're using a v1 data factory, go to "Author and Deploy", where you should be able to find the linked service corresponding to your on-premises SQL Server.
If you're using a v2 data factory, go to "Author and Monitor" and click the pencil icon; you should be able to find your linked service in the "Connections" tab, which will let you edit it.
Thanks,
Eva
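If you'd rather script the credential change than edit it in the UI, a rough sketch with the azure-mgmt-datafactory Python SDK could look like the one below. All resource names, the connection string, and the assumption that the linked service is of type SqlServer are placeholders inferred from the error message; adjust them to your actual setup:

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeReference,
    LinkedServiceResource,
    SecureString,
    SqlServerLinkedService,
)

# Hedged sketch: overwrite the existing SQL Server linked service with the new
# 'redcompany\joe' credentials. Subscription, resource group, factory, linked
# service and integration runtime names are placeholders.
client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

linked_service = LinkedServiceResource(
    properties=SqlServerLinkedService(
        connection_string="Data Source=org200016.bluecompany.com.au;Initial Catalog=GroupRisk;Integrated Security=False",
        user_name="redcompany\\joe",
        password=SecureString(value="<password>"),
        connect_via=IntegrationRuntimeReference(reference_name="<self-hosted-ir-name>"),
    )
)

client.linked_services.create_or_update(
    "<resource-group>", "<data-factory-name>", "<linked-service-name>", linked_service
)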

Mule Connect to remote flat files

I am new to Mule and I have been struggling with a simple issue for a while now. I am trying to connect to flat files (.MDB, .DBF) located on a remote desktop through my Mule application using the generic database connector of Mule. I have tried different things here:
I am using the StelsDBF and StelsMDB drivers for the JDBC connectivity. I tried connecting directly using a JDBC URL - jdbc:jstels:mdb:host/path
I have also tried to access the files through FTP by running a FileZilla server on the remote desktop and using this JDBC URL in my app - jdbc:jstels:dbf:ftp://user:password@host:21/path
None of these seem to be working, as I always get connection exceptions. If anyone has tried this before, what is the best way to go about connecting to a remote flat file with Mule? Your response will be greatly appreciated!
If you want to load the contents of the file inside a Mule flow, you should use the File or FTP connector; I don't know for sure about your JDBC option.
With the File connector you can access local files (files on the server where Mule is running), so you could try to mount the folders as a share.
Or run an FTP server like you already tried; that should work.
There is probably an error in your syntax / connection.
Please paste the complete XML of your Mule flow so we can see what you are trying to do.
Your use case is still not really clear to me; are you really planning to use HTTP to trigger the DB every time? Anyway, did you try putting the file on a local path and using that path in your database URL? Here is someone who says he had it working; he created a separate bean:
http://forums.mulesoft.com/questions/6422/setting_property_dynamically_on_jdbcdatasource.html
I think a local path may be possible, and it's better to test that first.
Also take note of how to refer to a file path; look at the examples for the File connector: https://docs.mulesoft.com/mule-user-guide/v/3.7/file-transport-reference#namespace-and-syntax
If you manage to get it working and you can use the path directly in the JDBC URL, you should have a look at the poll scope.
https://docs.mulesoft.com/mule-user-guide/v/3.7/poll-reference
You can use your DB connector as an inbound endpoint when wrapped in a poll scope.
I experienced the same issue when connecting to a Microsoft Access database (*.mdb, *.accdb) using the Mule Database connector. After further investigation, it was solved by installing the Microsoft Access Database Engine.
Another issue: I couldn't pass a parameter to construct a query the same way I do for other databases, e.g.: SELECT * FROM emplcopy WHERE id = #[payload.id]
To solve this issue:
I changed the Query type from Parameterized to Dynamic.
I generated the query inside a Set Payload transformer (build the query as a String, e.g.: SELECT * FROM emplcopy WHERE id = '1').
Finally, I put #[payload] into the Dynamic query area.

Configuration Issue for IBM FileNet 5.2

I installed IBM FileNet Content Engine 5.2 on my machine. I am getting a problem while configuring GCD data sources for a new profile.
Let me first explain the steps I did, then I will mention the problem I am getting.
First, I created the GCD database in DB2, then I created the data sources required for configuring the profile in the WAS Admin Console. I created a J2C authentication alias for the user which has access to the GCD database and configured it with the data sources. The test database connection is successful, but when I run the task of configuring the GCD data sources, it fails with the following error:
Starting to run Configure GCD JDBC Data Sources
Configure GCD JDBC Data Sources ******
Finished running Configure GCD JDBC Data Sources
An error occurred while running Configure GCD JDBC Data Sources
Running the task failed with the following message: The data source configuration failed:
WASX7209I: Connected to process "server1" on node Poonam-PcNode01 using SOAP connector; The type of process is: UnManagedProcess
testing Database connection
DSRA8040I: Failed to connect to the DataSource. Encountered java.sql.SQLException: [jcc][t4][2013][11249][3.62.56] Connection authorization failure occurred. Reason: User ID or Password invalid. ERRORCODE=-4214, SQLSTATE=28000 DSRA0010E: SQL State = 28000, Error Code = -4,214.
It looks like a simple error of an invalid user ID or password. I am using the same alias for other data sources as well and they work fine, so I am not sure why I am getting this error. I have also tried changing the scope of the data sources, but with no success. Can somebody please help?
running "FileNet Configuration Manager" task of configuring GCD datasources will create all the needs things in WAS (including Alias), do not created it before manually.
I suspect it had an issue with exciting JDBC data sources/different names Alias
It seems from your message that you are running it from FileNet Configuration Manager. Could you please double-check in your database whether the user ID is authorized to execute queries in the GCD database? It definitely looks like a permission issue.
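One way to rule WebSphere in or out is to test the same user ID and password directly against DB2. A minimal sketch using the ibm_db Python driver (hostname, port, and credentials are placeholders; use the same values as in the J2C alias):

import ibm_db

# Hedged sketch: try the GCD credentials straight against DB2.
conn_str = (
    "DATABASE=GCD;HOSTNAME=<db2-host>;PORT=50000;"
    "PROTOCOL=TCPIP;UID=<gcd-user>;PWD=<gcd-password>;"
)

try:
    conn = ibm_db.connect(conn_str, "", "")
    print("Connection succeeded")
    ibm_db.close(conn)
except Exception:
    # conn_errormsg() returns the DB2 error, e.g. an invalid user/password message.
    print("Connection failed:", ibm_db.conn_errormsg())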
