How to use Azurite for integration tests with blobs - azure-blob-storage

Azurite does not connect with my local project for saving blobs inside the local emulator.
I am trying to connect Azurite to my integration tests for blob storage. Is there any guide?

If you need a way to write integration tests for blobs (upload, delete, get), you can use Azurite for that.
Azurite comes built in with Visual Studio 2022, so you can install and use it by setting your connection string as "ConnectionString": "UseDevelopmentStorage=true;"
Start Azurite before running your tests. You can find the application in
C:\Program Files\Microsoft Visual Studio\2022\Professional\Common7\IDE\Extensions\Microsoft\Azure Storage Emulator (run as admin)
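For illustration, here is a minimal sketch of such a test in C#, assuming the Azure.Storage.Blobs NuGet package and xUnit; the container and blob names ("test-container", "hello.txt") are placeholders, not anything Azurite requires:

using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Xunit;

public class BlobIntegrationTests
{
    // "UseDevelopmentStorage=true" resolves to the local Azurite endpoints.
    private const string ConnectionString = "UseDevelopmentStorage=true";

    [Fact]
    public async Task Upload_Get_Delete_Blob_Works()
    {
        var service = new BlobServiceClient(ConnectionString);
        var container = service.GetBlobContainerClient("test-container");
        await container.CreateIfNotExistsAsync();

        var blob = container.GetBlobClient("hello.txt");

        // Upload
        await blob.UploadAsync(new BinaryData("hello azurite"), overwrite: true);

        // Get
        var content = await blob.DownloadContentAsync();
        Assert.Equal("hello azurite", content.Value.Content.ToString());

        // Delete
        await blob.DeleteAsync();
        Assert.False((await blob.ExistsAsync()).Value);
    }
}

Remember to start Azurite before the test run, or the client will fail to connect.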

Related

Self-Hosted Integration Runtime times out after ADF pipeline loads a few tables

I have recently installed the Integration Runtime for a local server which hosts an Access DB. The idea is to pull data from it and store it in an Azure SQL DB. I have done the following:
Installed the Integration Runtime service on the local machine hosting the MS Access DB and connected to it using ODBC (see the example connection string after this list)
Created linked services in ADF to connect to the DB
Created datasets for the source and destination DBs for each table required: one for the source MS Access DB and one for the target Azure SQL DB
Created a pipeline to copy the data from the source and sink it into the Azure SQL DB mentioned in step 3
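For reference, the ODBC connection string for such a linked service typically looks like the line below; this is a hedged sketch assuming the 64-bit Access ODBC driver, and the file path is a placeholder:

Driver={Microsoft Access Driver (*.mdb, *.accdb)};Dbq=C:\Data\MyDatabase.accdb;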
Basically, all the connections work; however, when I trigger my pipeline to load around 10 of these tables, it runs, loads the first two, and then fails by timing out. I must restart the Integration Runtime every time to get it back up and running; otherwise I can no longer query the tables.
To mitigate this, I figured there was too much traffic and the server needed to rest between calls, so I added wait timers between each step of the pipeline, but without much success. It did seem to help a bit, but that might be coincidence.
The error the monitor log spits out at the failed step is:
Error: 2200
ErrorCode=UserErrorFailedToConnectOdbcSource,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=,Source=Microsoft.DataTransfer.Runtime.GenericOdbcConnectors,''Type=System.Data.Odbc.OdbcException,Message=,Source=,'
By the way, the Integration Runtime service version is 5.12.7984.1, Access is installed through Office 365 x64, the exact MS Access driver version is 16.14430.20006, and the OS is Windows Server 2019.
I am getting the exact same error. To start, I checked the Event Viewer logs and saw some errors to do with access rights, so I gave the user running the IR more access to the registry keys and general 'log on as a service' rights. This helped a little, but I am still stuck with the same problem.
When copying from an Access DB located on the SHIR itself to flat files in lake storage, I encountered the same error.
Removing Office 365 from the machine, then re-installing the Access runtime solved the problem.
This answer is from #ezaidi's comments above.

The Google Cloud SDK is installed on my EC2 instance and I can access gcloud, but bq is not available even though I see it in the list of components

I installed the google-cloud-sdk on my Matillion instance hosted on EC2. I am able to use the gcloud command in the SSH session and also via a Bash component in Matillion.
However, I am not able to run bq commands. I see bq has been installed as part of the Cloud SDK, and I was able to configure my account and everything, but it doesn't work.
Can someone help me with this?
As per the documentation, you must activate the BigQuery API in order to use the bq command-line tool.
These are all the steps that you need to follow:
In the Cloud Console, on the project selector page, select or create a Cloud project.
Install and initialize the Cloud SDK.
BigQuery is automatically enabled in new projects. To activate BigQuery in a preexisting project, go to Enable the BigQuery API.
I was also getting the same error as you, and activating the API was the solution.
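If it helps, the activation and a quick check can also be done from the same SSH session; a hedged sketch, assuming your project is already set as the gcloud default:

gcloud services enable bigquery.googleapis.com
bq ls

If bq ls lists your datasets (or returns cleanly on an empty project), the tool is working.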

Oracle object deployment using an Azure DevOps pipeline

I have a requirement to automate deployment to different environments (dev, stage, and prod) using Azure DevOps. I am not able to find a task for this. Azure DevOps has tasks for SQL Server database deploy and MySQL database deploy, but not for Oracle database deploy.
I am very new to Azure DevOps. Please guide me on how I can achieve this.
For this issue: Red Gate has a set of deployment tools for Oracle, but the extension integrated into Azure DevOps is SQL Change Automation, which only applies to SQL Server databases.
So AFAIK, there is currently no task for Oracle database deploy. You could add your request for this feature on our UserVoice site, which is our main forum for product suggestions. You could also vote for the suggestion ticket and share your comments there, so the product team will provide updates if they review it.
As a workaround, you could try to use the PowerShell on Target Machines task to run your Oracle changes and place it in an Azure DevOps CI/CD pipeline without having to install an extension from the Marketplace; a sketch of such a step follows. For details, please refer to this blog.
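For illustration, a minimal sketch of the kind of command such a step could run on the target machine, assuming SQL*Plus is installed there and that $(OracleUser), $(OraclePassword), $(TnsAlias), and deploy_changes.sql are pipeline variables and a script you define yourself:

sqlplus $(OracleUser)/$(OraclePassword)@$(TnsAlias) @deploy_changes.sql

The script would contain your DDL/DML changes; adding WHENEVER SQLERROR EXIT FAILURE at the top of the script lets the pipeline step fail when the deployment fails.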

How to use archiveDir with Web Deploy 3

I am currently using Web Deploy 3 with automatic backup. Yesterday, the backup failed with the error message 'ERROR_PACKAGE_TOO_LARGE'.
As per MSDN:
Resolution - Use the archiveDir provider when creating a package instead. Currently there is no solution for this limit with respect to automatic backups.
Now, archiveDir works fine on the command line:
msdeploy -verb:sync -source:apphostconfig="Site" -dest:archivedir=c:\archive
I am pretty new to Web Deploy. Can someone please help me automate this with Web Deploy using VS2013? I don't want to log in to the server to manually back up my site and then come back to VS to publish. Is there any setting in .pubxml (publish profile) to achieve it?
Thanks
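One hedged idea, rather than a documented setting: a .pubxml file is an MSBuild file, so the same archiveDir command can be hooked in before the publish with a custom target. The target name, msdeploy path, and archive path below are assumptions you would adjust:

<Target Name="BackupBeforePublish" BeforeTargets="MSDeployPublish">
  <Exec Command="&quot;C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe&quot; -verb:sync -source:apphostconfig=&quot;Site&quot; -dest:archivedir=c:\archive" />
</Target>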

Azure migration from the GTE CyberTrust Global Root to the Baltimore CyberTrust Root

I am new to Azure. I am running a web application (ASP.NET 4.0, C#) using SQL Azure as the database on one small VM instance of a cloud service, and I also use Azure Blob Storage to save my files and images. I want to know how this migration will affect my application and what precautions I have to take so that my application will not go down after the migration.
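One precaution you can check yourself: connections only break if a client does not trust the Baltimore CyberTrust Root. A hedged sketch for verifying it is present in the machine store on your VM (standard PowerShell, assuming the default LocalMachine\Root store):

Get-ChildItem Cert:\LocalMachine\Root | Where-Object { $_.Subject -like '*Baltimore CyberTrust Root*' }

If the command returns a certificate, the root is already trusted and that part of the migration should be transparent to your application.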
