Copy files from a Linux VM to an Azure repository - bash

I don't know if this is possible.
I have a Linux VM with some files on it; Git is not installed on the VM. On the other hand, we have an Azure repository.
Using an Azure Pipelines job, we want to copy all the files from the Linux VM to the Azure repository and start a build; if the build succeeds, we then want to merge the copied code into the Azure repo.
Please let me know if this is possible.
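One way to sketch this: the pipeline job copies the files off the VM (e.g. with scp, since the VM itself has no Git) and the agent commits them to a candidate branch; only on a successful build would that branch be merged. The host name, paths and branch name below are placeholders, and the Azure repo is simulated with a local bare repository so the git part can actually be exercised:

```shell
set -e

# In the real job this line would be something like:
#   scp -r user@linuxvm:/data/files ./incoming
mkdir -p incoming && echo "hello" > incoming/app.conf   # simulated copy

# Stand-in for the Azure repo (a local bare repo) for this demo.
git init --bare azure-repo.git
git clone azure-repo.git work
cd work
git config user.email ci@example.com && git config user.name ci

# Commit the copied files to a candidate branch; the build runs against it,
# and only on success would it be merged to the main branch (e.g. via a PR).
git checkout -b vm-import
cp -r ../incoming/. .
git add -A && git commit -m "Import files from Linux VM"
git push origin vm-import
cd ..
```

In a real pipeline the merge-on-success step would typically be a pull request completed by the pipeline's service account after the build stage passes.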

Related

Copy deployment result from an Azure Mac hosted agent to a private Windows agent

In Azure DevOps I have a release definition that executes the command productbuild --component $(System.DefaultWorkingDirectory)/$(RELEASE.PRIMARYARTIFACTSOURCEALIAS)/My/Folder.app/ /Applications My.pkg to create a new pkg file from the built artifact. This command is executed on a Mac hosted agent.
Now I need to put the pkg on a specific path of a Windows machine on which I have an Azure DevOps private agent. My problem is the copy operation from the Mac hosted machine to the private machine that runs the agent. Is there any way to accomplish this task?
Thank you
Since you can't move the pkg creation to a build pipeline, you need to upload the file somewhere first, for instance to Blob Storage (if you already use Azure, this should not be a problem) or to an FTP server (which could be on your host agent or not), and then trigger a pipeline/release (using this extension), passing the URL/location of the uploaded pkg file.
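A minimal sketch of the Blob Storage variant, assuming the az CLI is available on the Mac agent. The storage account, container and file names are placeholders, and the upload call is shown commented because it needs real credentials; the URL construction is the part the follow-up release would consume:

```shell
ACCOUNT="mystorageacct"   # placeholder storage account name
CONTAINER="artifacts"     # placeholder container
PKG="My.pkg"

# Upload step on the Mac hosted agent (requires az login / credentials):
# az storage blob upload --account-name "$ACCOUNT" \
#    --container-name "$CONTAINER" --name "$PKG" --file "$PKG"

# URL the private-agent job would be handed to download the pkg:
BLOB_URL="https://$ACCOUNT.blob.core.windows.net/$CONTAINER/$PKG"
echo "$BLOB_URL"
```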
You can publish the pkg file to an Azure Artifacts feed in your release pipeline using the Universal Package task, and then download the pkg file to your private machine. See the steps below:
1. Create an Azure Artifacts feed from the Azure DevOps portal.
2. Add a Universal Package task in your release pipeline to publish your pkg file as a universal package to the above artifacts feed.
- task: UniversalPackages@0
  displayName: 'Universal publish'
  inputs:
    command: publish
    publishDirectory: '$(Build.ArtifactStagingDirectory)/package.pkg'
    vstsFeedPublish: 'FeedId'
    vstsFeedPackagePublish: 'package_name'
3. Add an agent job in your release pipeline stage and configure it to run on your private agent.
Then you can add a Universal Package task in this agent job to download the pkg file to your private machine.
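A matching download step on the private agent could look roughly like this (the feed and package names mirror the publish example above and are placeholders; `*` asks for the latest version):

```yaml
- task: UniversalPackages@0
  displayName: 'Universal download'
  inputs:
    command: download
    downloadDirectory: '$(System.DefaultWorkingDirectory)'
    feedsToUse: internal
    vstsFeed: 'FeedId'
    vstsPackage: 'package_name'
    vstsPackageVersion: '*'
```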

How to update a Windows machine with changes done in a git repository

I am planning to do the following:
Copy from a git repository to a Windows machine each time a commit/update is made to that folder only. Maybe something like Jenkins can be used for this, but I am unable to determine how.
Check for commits made to the repo (this I have done).
As soon as a commit is made to the repo, trigger a Jenkins job that applies the change to a Windows server (how to do this?)
If the repository is local, it would be easier to push directly to the Windows machine, assuming it has an SSH server (which recent Windows 10 builds can now include as an optional feature).
If the repository is remote, you can configure a webhook to call a Jenkins server for a specific job.
See for instance "Triggering a Jenkins build every time changes are pushed to a Git branch on GitHub" by David Luet
Or you can define a Jenkins pipeline that GitLab-CI can execute.
In both cases, your Jenkins job will have to copy the checked out repository.
I would use git bundle to compress the repository into one file (or a simple tar), copy it over to the Windows server, and decompress it there.
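The bundle-copy-unbundle flow might look like the sketch below. The scp hop to the Windows server is shown as a comment (the hostname and path are placeholders); here the source and destination are two local directories so the git commands can be verified end to end:

```shell
set -e
git init src
cd src
git config user.email ci@example.com && git config user.name ci
echo "v1" > app.txt
git add app.txt
git commit -m "initial"
BRANCH=$(git symbolic-ref --short HEAD)   # master or main, depending on git config

# Pack the repository history into a single file.
git bundle create ../repo.bundle HEAD "$BRANCH"
cd ..

# The real transfer would be something like:
#   scp repo.bundle user@winserver:C:/drop/
# On the receiving side a bundle can be cloned like any remote:
git clone repo.bundle dst
cat dst/app.txt   # -> v1
```

A bundle is handy precisely because it is one opaque file: it survives copy mechanisms (SMB shares, FTP, USB) that mangle the many small files inside a .git directory.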

How to download files from remote windows share in jenkins?

I am using Publish Over CIFS plugin in Jenkins to transfer the files over to remote windows share folder.
But it seems there is no option in this plugin to download files from a remote Windows share to local.
Is there any Jenkins plugin that can achieve this?
Does the "File Operations Plugin"'s fileDownloadOperation support downloading from a Windows share folder?
Thanks.
The File Operations Plugin does work with network shares. Just use the fileCopyOperation and give the path to your share like this: \\ShareServer\ShareName\path\to\your\file

Maven artifacts are deployed to the wrong location when invoking the Gradle install task on TeamCity CI

I'm trying to set up a simple continuous integration system on my local PC. I use Gradle as my build system (gradle wrapper option). One of the steps in the build process is to deploy build artifacts to a local repository (located at "{user_dir}/.m2/repository"). It works fine when I run it from my local PC, but when it runs on TeamCity CI (version 9) it deploys them to "{windows_dir}\System32\config\systemprofile\.m2\repository". This is probably a configuration issue, but I couldn't manage to solve it. In the build logs I saw that it can't find the local repository in the settings.xml file. I tried to add it, but it didn't help. How can I configure TeamCity to use the local repository folder in the user directory?
I found out what the issue was. If you install the TeamCity system service to run under the admin account, it will always use the Windows directory. In order to use the user's directory, you need to install the service under that user account.
Source: https://confluence.jetbrains.com/display/TCD9/Maven+Server-Side+Settings.

Copying files between two windows servers through jenkins

I am working on a project where I want to copy the compiled files (built through Jenkins) from one Windows server to another through Jenkins. Jenkins is installed on a Windows server, and after building the code, those compiled files should be copied to another Windows server through Jenkins. Is there any way to achieve it?
Jenkins might be able to do it via script steps running the scp command; however, if this is part of a build, I would suggest attaching the file(s) to a project and distributing them through the Maven repository.
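A script step for the scp route might look like the sketch below. The scp line assumes an SSH server is running on the target Windows machine (hostname and path are placeholders); the tar pipeline underneath performs the same copy between two local directories so the step can be verified without a second host:

```shell
set -e
SRC="build/output"   # placeholder: where Jenkins put the compiled files
DST="deploy"         # placeholder: destination directory
mkdir -p "$SRC" "$DST"
echo "compiled" > "$SRC/app.dll"   # simulated build output

# Remote variant (requires sshd on the target Windows server):
# scp -r "$SRC" builduser@winserver2:C:/deploy

# Local stand-in used here so the copy can be checked:
tar -C "$SRC" -cf - . | tar -C "$DST" -xf -
ls "$DST"   # -> app.dll
```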
