How do I set up CI/CD pipelines for Azure Data Factory in GitLab? Can we deploy Data Factory through GitLab? Can someone please share useful links/documents for this? TIA!
Unfortunately, Azure Data Factory doesn't support GitLab.
Currently, Azure Data Factory allows you to configure a Git repository with either Azure DevOps or GitHub.
Reference: Continuous integration and delivery in Azure Data Factory
I would suggest voting up an idea submitted by another Azure customer:
https://feedback.azure.com/d365community/idea/aa3d7da4-6f26-ec11-b6e6-000d3a4f032c
All of the feedback you share in these forums will be monitored and reviewed by the Microsoft engineering teams responsible for building Azure.
Related
Long-time lurker, first-time question, so apologies if I do this wrong.
I have successfully used the following to create a continuous deployment pipeline in Azure DevOps:
Composer CICD Pipeline Sample
However, I would like to use additional pipeline variables to insert into the appsettings.json file, such as additional API keys and the ApplicationInsights connectionString.
Does anyone have experience of doing this or can someone point me in the right direction?
Google has shed no light on this, and unfortunately I have found the Bot Framework documentation to be lacking.
Azure deployments made by the pipeline you reference do not use the appsettings.json file; those settings are ignored.
Instead, the pipeline applies pipeline variable values in Azure as App Service Configuration Application Settings, using the "Configure App Service Settings" task. You might start there; a sketch of the equivalent operation follows.
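For illustration, here is a minimal sketch of writing pipeline variables as App Service Application Settings with the Az PowerShell module. The resource group, app name, and setting keys are assumptions; the merge step is needed because Set-AzWebApp -AppSettings replaces the entire settings collection:

```powershell
# Minimal sketch, assuming the Az module and placeholder names
# ("my-rg", "my-bot-app", the setting keys). Adjust to your pipeline.
$app = Get-AzWebApp -ResourceGroupName "my-rg" -Name "my-bot-app"

# Set-AzWebApp -AppSettings replaces the whole collection, so merge first.
$settings = @{}
foreach ($s in $app.SiteConfig.AppSettings) { $settings[$s.Name] = $s.Value }

# Values would typically be mapped in from secret pipeline variables.
$settings["MyApiKey"] = $env:MYAPIKEY
$settings["ApplicationInsights:ConnectionString"] = $env:APPINSIGHTS_CONNECTIONSTRING

Set-AzWebApp -ResourceGroupName "my-rg" -Name "my-bot-app" -AppSettings $settings | Out-Null
```

In a pipeline, the secret variables would be mapped to the environment variables above in the task's env section, which keeps them out of source control and out of appsettings.json entirely.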
I have a requirement to automate deployment to different environments (dev, stage, and prod) using Azure DevOps, but I am not able to find a task for this. Azure DevOps has tasks for SQL Server database deploy and MySQL database deploy, but not for Oracle database deploy.
I am very new to Azure DevOps. Please guide me on how I can achieve this.
Red Gate has a set of deployment tools for Oracle, but the extension integrated into Azure DevOps is SQL Change Automation, which applies only to SQL Server databases.
So AFAIK, there is currently no built-in task for Oracle database deployment. You could add a request for this feature on our UserVoice site, which is our main forum for product suggestions. You could also vote for the suggestion ticket and share your comments there, so the product team can provide updates as they review it.
As a workaround, you could use the PowerShell on Target Machines task to run your Oracle changes and place it in an Azure DevOps CI/CD pipeline without having to install an extension from the Marketplace. For details, please refer to this blog.
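As an illustration only, such a PowerShell step might shell out to SQL*Plus. The environment variable names and script path below are assumptions, not part of any built-in task:

```powershell
# Hypothetical sketch: run an Oracle change script with SQL*Plus from a
# pipeline PowerShell step. Connection details come from secret variables.
$connection = "$env:ORA_USER/$env:ORA_PASSWORD@//$env:ORA_HOST:1521/$env:ORA_SERVICE"
$script     = "$env:BUILD_SOURCESDIRECTORY\db\changes.sql"   # path is an assumption

# sqlplus exits non-zero on SQL errors if the script sets WHENEVER SQLERROR.
& sqlplus -S $connection "@$script"
if ($LASTEXITCODE -ne 0) {
    Write-Error "Oracle deployment failed with exit code $LASTEXITCODE"
    exit 1
}
```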
We have enabled a CI/CD pipeline using Azure Pipelines. Whenever someone checks in to master, the build should run and deployment should follow. I want to understand how I can disallow someone from deploying to the Azure Function app from local Visual Studio.
You could use RBAC rules, though this may require a lot of configuration work.
Once you have the CI/CD pipeline enabled, setting up RBAC (Role-Based Access Control) helps prevent users from retrieving the publishing profile, setting deployment credentials, and so on.
There will definitely be some configuration work involved, because you would need to grant permission to only one user (so that user can set up the service principal connection between Azure and DevOps) while also preventing other users from creating a deployment user. A sketch of one possible setup follows.
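As one possible sketch: a custom role modeled on Contributor, minus the operations Visual Studio uses to fetch publish credentials. The role name is an assumption, and the subscription ID is a placeholder:

```powershell
# Hypothetical custom role: Contributor without access to publishing credentials.
$role = Get-AzRoleDefinition "Contributor"
$role.Id = $null
$role.Name = "Web App Contributor (No Publish)"   # name is an assumption
$role.Description = "Contributor without access to publishing credentials."

# Exclude the operations used to download the publish profile and list
# security-sensitive settings such as deployment credentials.
$role.NotActions.Add("Microsoft.Web/sites/publishxml/action")
$role.NotActions.Add("Microsoft.Web/sites/config/list/action")

$role.AssignableScopes.Clear()
$role.AssignableScopes.Add("/subscriptions/<subscription-id>")
New-AzRoleDefinition -Role $role
```

You would then assign this role to the other users in place of Contributor, leaving full rights only to the one user who configures the service connection.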
I am working in Azure DevOps, building CI/CD pipelines for Azure Data Factory using the approach provided by Microsoft. This creates the whole ADF and all its entities in the test/prod environments wherever we deploy the ARM templates, but we just want to deploy the changed pipelines, not the whole ADF.
The approach I know is: configure the ADF with Git --> merge to master --> publish to the adf_publish branch --> set up a CI/CD pipeline that uses the template and parameter JSONs for the respective test/prod environments (a sketch of that deployment step is shown below).
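For context, here is a minimal sketch of the deployment step in that approach, assuming the standard adf_publish layout (ARMTemplateForFactory.json plus per-environment parameter files). The resource group and file names are assumptions. Note that this still deploys every entity defined in the template, which is exactly the behavior in question:

```powershell
# Minimal sketch of the ARM deployment step against a test environment.
$deployment = @{
    ResourceGroupName     = "rg-adf-test"                                  # assumption
    TemplateFile          = ".\ARMTemplateForFactory.json"
    TemplateParameterFile = ".\ARMTemplateParametersForFactory.test.json"  # assumption
    Mode                  = "Incremental"   # adds/updates only what the template defines
}
New-AzResourceGroupDeployment @deployment
```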
The Ask is"How to deploy just the ADF Pipelines/Datasets/linked Services /Triggers that has changed. if there is no changes to other entities don't deploy.
can someone kindly suggest me the best approach .
I'm using Jenkins to produce .cspkg files using msbuild. It stores build results in Azure Blob Storage. Then I use the management portal to deploy them.
The biggest drawbacks I see are:
1. Deployments can be accidentally deleted easily.
2. There is no straightforward way to check which version the cloud service is running.
Is there a better way to manage deployments?
It's definitely not the best experience, is it?
The approach I tend to use is as follows (a rough script covering these steps is sketched after the list):
1. Build the deployment package and add the version number to the package filename (taken from AssemblyInfo.cs), e.g. MyCloudService-1.2.0.0.cspkg; this should be trivial using msbuild.
2. Push the package to cloud storage.
3. Perform the deployment of the package from storage, with the Deployment Label '[CLOUD SERVICE NAME]-[VERSION] # [DATE & TIME]', e.g. 'MyCloudService-1.2.0.0 # 10-09-2015 16:30'.
4. Check the deployment package into a 'Packages' directory in source control.
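A rough sketch of those steps, assuming the classic Azure Service Management (ASM) PowerShell cmdlets that matched cloud services of this era; the paths, storage account, and container names are placeholders:

```powershell
# 1. Extract the version from AssemblyInfo.cs and rename the package.
$version = ([regex]::Match((Get-Content ".\Properties\AssemblyInfo.cs" -Raw),
            'AssemblyVersion\("([\d\.]+)"\)')).Groups[1].Value
$package = "MyCloudService-$version.cspkg"
Rename-Item ".\bin\Release\app.publish\MyCloudService.cspkg" -NewName $package

# 2. Push the versioned package to blob storage.
$ctx = New-AzureStorageContext -StorageAccountName "mystorage" -StorageAccountKey $key
Set-AzureStorageBlobContent -File ".\bin\Release\app.publish\$package" `
    -Container "packages" -Blob $package -Context $ctx

# 3. Deploy from storage with a label encoding service name, version, and time.
$label = "MyCloudService-$version # $(Get-Date -Format 'dd-MM-yyyy HH:mm')"
New-AzureDeployment -ServiceName "MyCloudService" -Slot "Production" -Label $label `
    -Package "https://mystorage.blob.core.windows.net/packages/$package" `
    -Configuration ".\bin\Release\app.publish\ServiceConfiguration.Cloud.cscfg"
```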
If you need to identify the version of the package deployed to the cloud service, you can see the Deployment Label in both the 'old' portal (manage.windowsazure.com) and the 'new' portal (portal.azure.com).