Self-Hosted Integration Runtime times out after ADF pipeline loads a few tables

I have recently installed the Integration Runtime on a local server which hosts an Access DB. The idea is to pull data from it and store it in an Azure SQL DB. I have done the following:
1. Installed the Integration Runtime service on the local machine hosting the MS Access DB and connected to the DB using ODBC
2. Created linked services in ADF to connect to the DB
3. Created datasets for the source and destination for each table required: one for the source MS Access DB and one for the target in Azure SQL DB
4. Created a pipeline to copy the data from the source and sink it into the Azure SQL DB dataset mentioned in step 3
Basically, all the connections work. However, when I trigger my pipeline to load around 10 of these tables, it runs, loads the first two, and then fails by timing out. I have to restart the Integration Runtime every time to get it back up and running, otherwise I can no longer query the tables.
To mitigate this, I figured there was too much traffic and the server needed to rest between calls, so I added wait timers between each step of the pipeline, but without much success. It may have helped a bit, but that might be coincidence.
The error that the monitor log spits out at the failed step is:
Error: 2200
ErrorCode=UserErrorFailedToConnectOdbcSource,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=,Source=Microsoft.DataTransfer.Runtime.GenericOdbcConnectors,''Type=System.Data.Odbc.OdbcException,Message=,Source=,'
By the way, the Integration Runtime service version is 5.12.7984.1 and Access is installed through Office 365 x64. The exact MS Access driver version is 16.14430.20006. The OS is Windows Server 2019.

I am getting the exact same error. To start, I checked the Event Viewer logs and saw some errors related to access. So I gave the user running the IR more access to the registry keys and general "log on as a service" rights. This helped a little, but I am still stuck with the same problem.

When copying from an Access DB located on the SHIR itself to flat files in lake storage, I encountered the same error.
Removing Office 365 from the machine, then re-installing the Access runtime solved the problem.
This answer is based on @ezaidi's comments above.
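If you want to confirm that the ODBC driver itself is at fault before reinstalling, a minimal standalone test along the following lines exercises the Access (ACE) driver outside of ADF. This is only a sketch: the file path and table name are placeholders, and the driver name assumes the 64-bit driver installed with Office or the Access runtime.

using System;
using System.Data.Odbc;

class AccessOdbcSmokeTest
{
    static void Main()
    {
        // Placeholder path to the Access database on the SHIR machine.
        string connStr = @"Driver={Microsoft Access Driver (*.mdb, *.accdb)};Dbq=C:\Data\MyDatabase.accdb;";

        // Open and close the connection several times in a row to mimic
        // consecutive copy activities hitting the driver.
        for (int i = 1; i <= 10; i++)
        {
            using (var conn = new OdbcConnection(connStr))
            {
                conn.Open();
                // "SomeTable" is a placeholder; query any small table.
                using (var cmd = new OdbcCommand("SELECT COUNT(*) FROM SomeTable", conn))
                {
                    Console.WriteLine($"Run {i}: {cmd.ExecuteScalar()} rows");
                }
            }
        }
    }
}

If this loop also starts failing after a few iterations, the driver rather than the pipeline is the likely culprit.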

Related

SSAS deployment fails on metadata due to using wrong data source

When I try to deploy an SSAS Tabular project from my local machine to the integration server, I get an error in the metadata. It's a many-to-many error that I had previously solved, and I know I solved it because the Tabular project can be processed locally. That is the crux of the issue: when I process the project, it uses the correct data source, but when I try to deploy the project, it uses a previous data source that is no longer valid.
Steps I've taken:
I went into the model, selected the existing connections, and modified them to connect to the correct environment (they were previously pointed at the invalid environment). This allowed me to process the data, but I still cannot deploy correctly.
I created a new configuration in Configuration Manager, to no avail.
I looked for other areas where the source DB is listed; I haven't found any, which is frustrating because I know it's somewhere.
Is there a cache I'm missing or a setting I didn't check? Any help would be appreciated, thank you.

SQL Server 2016 PolyBase install

I'm trying to install and run PolyBase in a brand new SQL Server environment. The installation runs fine, but I can't get the PolyBase services to start.
In SQL Server Configuration Manager I see the state as 'Change Pending...', and Task Manager shows the services as 'Starting'.
If I try to configure PolyBase in SSMS, it sits there for a while and then bombs out with a null reference exception.
I can kill the processes in Task Manager, but still no dice when I try to restart them - they sit there for ages before finally returning 'the request failed or the service did not respond in a timely fashion'.
This is on a newly provisioned Azure VM using the Developer edition of SQL Server 2016, so as far as I can tell the environment is completely clean. It's worth noting that I've tried this on two separate environments with the same result.
I've got plenty of SQL Server experience, but I've never used PolyBase before, so I accept that I might be missing something. Still, I would expect a fresh out-of-the-box install of SQL Server to at least let me run the services...
Hopefully I'm doing something stupid and one of you can steer me in the right direction. Thanks in advance for any help.
CMR
Try enabling TCP/IP in SQL Server Network Configuration (by default, only Shared Memory is enabled).
PolyBase head nodes are only supported in SQL Server 2016 Enterprise Edition.
https://msdn.microsoft.com/en-us/library/cc645993.aspx
It might be worth installing Service Pack 1 and trying again, because PolyBase is now a feature in Standard Edition. https://blogs.msdn.microsoft.com/sqlreleaseservices/sql-server-2016-service-pack-1-sp1-released/

Using a SQLite database with a Windows Store app

I guess I am not the first one who has encountered this issue, but I can't find much information after a bit of research. Here is my situation:
A Windows Store app accesses a SQLite database; the database contains a few tables and is read-only. The size of the database is 20 MB. At app startup, it copies the database to the application folder (if it is not already there). It works fine when I test it manually (although it is not lightning fast), but it always fails badly when tested against the certification test toolkit, failing the performance test with an "app crash" or "app can't start" error.
So my questions are:
1) Is this the correct way of using a SQLite database in a Windows Store app (I mean using a 20 MB database locally), or should I port the data to the cloud?
2) Does the failure of the certification toolkit really matter? (Will it also mean failure of the publishing process?)
Thanks in advance
You are going about it the right way. If your app doesn't need Internet connectivity at all, then don't go for a cloud database. You should use an extended splash screen to copy the database; you should not do that in App.xaml.cs. If you use a cloud database, it will require more time for request and response. I think a SQLite transaction is faster than that.
The certification may fail if you are not using the latest version of the WACK. If your app fails the WACK test, it won't be published.
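As a rough illustration of the copy-on-first-launch approach described above (a sketch only, not the asker's code: the file name and Assets path are assumptions, and TryGetItemAsync requires the Windows 8.1 storage APIs):

using System.Threading.Tasks;
using Windows.ApplicationModel;
using Windows.Storage;

public static class DatabaseBootstrapper
{
    // Copies the packaged, read-only SQLite file to the local app data folder
    // the first time the app runs. "Assets\catalog.db" is a hypothetical path.
    public static async Task EnsureDatabaseAsync()
    {
        const string dbName = "catalog.db";
        StorageFolder local = ApplicationData.Current.LocalFolder;

        // TryGetItemAsync returns null when the file has not been copied yet.
        if (await local.TryGetItemAsync(dbName) == null)
        {
            StorageFile packaged = await Package.Current.InstalledLocation
                .GetFileAsync(@"Assets\" + dbName);
            await packaged.CopyAsync(local, dbName, NameCollisionOption.FailIfExists);
        }
    }
}

Calling this from an extended splash screen page, and only navigating to the main page once the task completes, keeps the 20 MB copy out of App.xaml.cs as suggested above.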

Exception while using Windows Azure Caching: No such host is known

I am trying to get started with Azure and am trying to use the Caching feature. I created a cloud service project and added a cache worker role and a web role. I installed the "Windows Azure Caching" NuGet package into the projects for both roles and added the name of the cache worker role as the identifier in the dataCacheClients element in the web.config of the web role.
I added the following code into the web role:
DataCacheFactory cf = new DataCacheFactory();
DataCache c = cf.GetDefaultCache();
When I try to run this locally on the emulator, I get the following exception:
ErrorCode<ERRCA0017>:SubStatus<ES0006>:There is a temporary failure.
Please retry later. (One or more specified cache servers are unavailable,
which could be caused by busy network or servers. For on-premises cache clusters,
also verify the following conditions. Ensure that security permission has been granted
for this client account, and check that the AppFabric Caching Service is allowed through
the firewall on all cache hosts. Also the MaxBufferSize on the server must be greater
than or equal to the serialized object size sent from the client.).
Additional Information : The client was trying to communicate with the server: net.tcp://MvcWebRole1:24233.
Inner Exception : No such host is known
Can you please tell me what I am missing here?
Azure SDK used : v2.0
Timing of your question couldn't be better. We also faced exactly the same issue and were scratching our heads as to what the problem could be. We had one project where everything worked perfectly fine and one where we were getting the same error. Based on our research, we identified the problem as the NuGet package for caching. It seems a new version (2.1.0.0) was released yesterday, and we found that if we install that package, we get this error. Can you check the package version in your case? The documentation states that this new version can only be used with the latest SDK (2.1) released today.
One solution would be to uninstall version 2.1.0.0 and install version 2.0.0.0. To install version 2.0.0.0, open the Package Manager Console (View --> Other Windows --> Package Manager Console) and type the following command there:
Install-Package Microsoft.WindowsAzure.Caching -Version 2.0.0.0
This fixed our problem. Hopefully it should fix yours too.
Here is a link to the Windows Azure Cloud Integration Engineering blog on how to deal with this same issue. They recommend upgrading to Azure SDK v 2.1 or rolling back as the accepted answer states.
http://blogs.msdn.com/b/cie/archive/2013/08/08/windows-azure-caching-2-1-0-0-no-such-host-is-known.aspx
This exception can also occur under the compute emulator if no cache is configured for the role in the client library configuration. In my case this happened on purpose, since the cache emulator has some problems that can slow down testing and debugging of the service.
In previous versions of Windows Azure Caching, in this scenario the construction of DataCacheFactory would fail with an exception (handled by my code); with Windows Azure Caching 2.1 (and Azure SDK 2.1), in the same scenario the code treats the role name as a server address, and so on DataCache construction it tries to communicate with the non-existent cache -- this leads to a 3-minute wait and the exception reported in the question.
I have changed my code to detect the new situation for this scenario -- you can find more detail in this SO question.
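Not the exact code referred to above, but a minimal sketch of that kind of guard: check whether the role named in the dataCacheClients configuration is actually part of the current deployment before constructing the factory, and skip caching otherwise. The role name "CacheWorkerRole1" is a placeholder for whatever identifier you configured.

using Microsoft.ApplicationServer.Caching;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class CacheClientFactory
{
    // Returns null instead of hanging when the configured cache role is not
    // deployed (e.g. running under the compute emulator without the cache role).
    public static DataCache TryGetDefaultCache(string cacheRoleName = "CacheWorkerRole1")
    {
        if (!RoleEnvironment.IsAvailable || !RoleEnvironment.Roles.ContainsKey(cacheRoleName))
            return null;

        var factory = new DataCacheFactory();
        return factory.GetDefaultCache();
    }
}

With a guard like this, an emulator run simply gets a null cache back instead of waiting out the 3-minute timeout.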

Getting an error when installing Tableau Server

When I am installing Tableau Server I am seeing this error. I am the administrator of the system, but I still see the message. Please help me in this regard.
It's often safer to use the default NT AUTHORITY\NetworkService account if you're installing Tableau Server for the first time, since this is (almost always) guaranteed to work and can always be changed later.
If you do want to proceed with using SriHarsha-PC\SriHarsha as the Run As account, then take a look at the following link from the Tableau Software Knowledge Base which lists all of the permissions that your chosen account will need in order to run Tableau Server correctly.
Tableau Server Run As account permissions
If that does not provide sufficient information, then create a support request and Tableau Technical Support will try and help resolve the issue.
