How to modify the TeamCity artifact S3 storage upload path

I want to publish my project's build artifact directly to an S3 bucket via TeamCity. To do so, I am using the "Artifact Storage" section in the left administration panel.
I have linked the S3 storage to my S3 bucket. When I run the build, the artifact is uploaded to S3, but not to the target folder that I want.
For example, if I want to publish file.js to {my S3 bucket name}/{my path}/file.js, the end result is always {my S3 bucket name}/{TeamCity project name}/{TeamCity build configuration ID}/{build number}/{my path}/file.js
Is there any way to configure the default target path so that it does not include the TeamCity project details and only uses the path I want?

Related

What command will accomplish a successful push to an S3 bucket using maven-publish

I am publishing a Maven repo to S3, assuming I have credentials and an existing bucket. What command do I call in order to run the publishing part of the build? With mavenDeployer I was using gradle uploadArchives, but now I can't figure out how to get this binary to S3. I've looked all over and nobody shows the CLI command to run it, just the example setup in build.gradle.
Is doing this in two steps an option for you?
Step 1 - Maven/Gradle builds the artifact/binary
Step 2 - use the AWS CLI to upload it, e.g. aws s3 cp /tmp/binary s3://bucket/
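The two steps above can be sketched as a small shell function. The artifact path and bucket name here are hypothetical placeholders, not values from the question:

```shell
#!/bin/sh
# Two-step publish sketch. ARTIFACT and BUCKET are hypothetical placeholders;
# substitute the real output path of your build and your own bucket.
ARTIFACT="build/libs/app.jar"
BUCKET="s3://my-artifact-bucket/releases/"

publish_to_s3() {
  # Step 1: let Gradle build the artifact.
  ./gradlew build &&
  # Step 2: copy the resulting binary to the bucket with the AWS CLI.
  aws s3 cp "$ARTIFACT" "$BUCKET"
}

# Call publish_to_s3 from your CI step once AWS credentials are configured.
```

For whole directories, `aws s3 cp` also accepts `--recursive`, or you can use `aws s3 sync`.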

Jenkins Azure Blob Storage plugin: copy the contents of the build folder, not the build folder itself, to the $web Azure Blob static container

We are using the Jenkins Azure Blob plugin to deploy into Azure Blob Storage and have to use the $web static hosting feature in Azure. When we specify the folder as build/*, the plugin copies the entire build folder into the $web container instead of just the files inside it. Per Azure static web hosting, the index document must sit at the root of $web, i.e. $web/index.html, but in our case it ends up as $web/build/index.html, which we don't want. The build folder can't be configured in Azure static web hosting; the files have to be at the root, since several files are referenced by relative paths after the build.
https://azure.microsoft.com/en-us/blog/azure-storage-static-web-hosting-public-preview/
https://learn.microsoft.com/en-us/azure/storage/common/storage-java-jenkins-continuous-integration-solution
Can someone please take a look and respond?
Thanks
Just use the dir step to run the Azure upload from inside your build folder, and remove the folder from the parameter:
dir('build') {
    azureUpload blobProperties: … filesPath: '**/'
}
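If you would rather sidestep the plugin, the same "contents, not folder" upload can be done from a shell step with the Azure CLI's upload-batch command, which uploads the contents of a source directory into the container root. The storage account name below is a hypothetical placeholder, and authentication (e.g. az login or a SAS token) is assumed to be configured already:

```shell
#!/bin/sh
# Alternative sketch using the Azure CLI instead of the Jenkins plugin.
# ACCOUNT is a hypothetical placeholder.
ACCOUNT="mystorageacct"

deploy_static_site() {
  # upload-batch copies the *contents* of --source into the container,
  # so index.html lands at $web/index.html, not $web/build/index.html.
  az storage blob upload-batch \
    --destination '$web' \
    --source build \
    --account-name "$ACCOUNT"
}

# Call deploy_static_site from the Jenkins shell step after the build.
```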

How to upload an artifact to Jfrog Artifactory using Jfrog CLI?

I'm trying to upload an artifact from Bamboo CI using the JFrog CLI for Artifactory.
I need to upload .p2 plugins and I have two options:
Upload the .zip and deploy it as a bundle artifact
Upload the uncompressed folder with all subfolders and data
I'm trying to upload the uncompressed folder with all the subfolders and data using this command:
jfrog rt upload --include-dirs=true ${bamboo.build.working.directory}/unzip/${bamboo.public.name-update-site}/* p2-release-local/${bamboo.public.name-update-site}/
But the problem is that the subfolders are uploaded empty.
I also tried this command:
jfrog rt upload --flat=false ${bamboo.build.working.directory}/unzip/${bamboo.public.name-update-site}/* p2-release-local/${bamboo.public.name-update-site}/
This command uploads all the subfolders with all the data, but the resulting path isn't correct, because it is:
/name-update-site/datos/agents-home/xml-data/build-dir/PREDEL-RELPLU-JOB1/unzip/name-update-site
The content of the variable ${bamboo.build.working.directory} is
/datos/agents-home/xml-data/build-dir/PREDEL-RELPLU-JOB1/
EDITED: Log info:
INFO: Listing Bamboo directory
prueba-update-site.zip
unzip
INFO: Listing files from unzip folder
prueba-update-site
INFO: Listing files from custom folder
artifacts.jar
content.jar
features
plugins
site.xml
uninstall_fortify_plugins.cmd
Any help?
Thanks.
Solved!
The solution is:
Unzip the artifact .zip archive
Upload with this command:
jfrog rt upload --flat=false "${bamboo.public.name-update-site}/*" p2-release-local/
Thanks.
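The fix works because --flat=false preserves the source path as given on the command line, so passing a path relative to the working directory (rather than the absolute ${bamboo.build.working.directory} path) keeps the build-dir prefix out of the target repo. A sketch of the accepted solution, with the Bamboo variables replaced by shell placeholders (the values are hypothetical stand-ins for ${bamboo.build.working.directory} and ${bamboo.public.name-update-site}):

```shell
#!/bin/sh
# Sketch of the accepted fix. WORK_DIR and SITE are placeholder stand-ins
# for the Bamboo variables used in the question.
WORK_DIR="/datos/agents-home/xml-data/build-dir/PREDEL-RELPLU-JOB1"
SITE="name-update-site"

upload_p2_site() {
  # 1. unzip the artifact into an unzip/ folder under the working dir
  unzip -o "$WORK_DIR/$SITE.zip" -d "$WORK_DIR/unzip" &&
  # 2. upload from inside the unzip dir, so --flat=false keeps only the
  #    site-relative path (SITE/...) instead of the absolute build-dir path
  cd "$WORK_DIR/unzip" &&
  jfrog rt upload --flat=false "$SITE/*" p2-release-local/
}

# Call upload_p2_site from the Bamboo script task.
```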

AutoDeploy a Laravel app from GitHub branch to AWS EC2 or Elastic Beanstalk

I'm trying to auto-deploy a Laravel app from a GitHub branch to AWS EC2 or Elastic Beanstalk (preferred), but I haven't found the right solution; one of the tutorials I have followed is the one below. Does anyone have a solution for this?
Thank you in advance!
https://aws.amazon.com/blogs/devops/building-continuous-deployment-on-aws-with-aws-codepipeline-jenkins-and-aws-elastic-beanstalk/
You can do this with the following steps:
Set up Jenkins with the GitHub plugin
Install the AWS Elastic Beanstalk CLI
Create an IAM user with Elastic Beanstalk deployment privileges and add its access keys to the AWS CLI (if Jenkins runs inside an EC2 instance, you can create a role with the required permissions and attach it to the instance instead of creating a user)
In the Jenkins project, clone the branch, go to the project directory, and execute eb deploy in a shell script to deploy it to Elastic Beanstalk. (You can automate this with a build trigger when new code is pushed to the branch.)
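The last step above (running eb deploy from a shell script) can be sketched as a Jenkins "Execute shell" step. The environment name is a hypothetical placeholder:

```shell
#!/bin/sh
# Jenkins "Execute shell" build-step sketch for the deploy step.
# EB_ENV is a hypothetical Elastic Beanstalk environment name.
EB_ENV="my-laravel-env"

deploy_to_beanstalk() {
  # Jenkins has already cloned the branch into the workspace;
  # eb deploy packages the current directory and deploys it.
  eb deploy "$EB_ENV"
}

# Call deploy_to_beanstalk once the EB CLI is initialised (eb init) in the
# project directory and the IAM credentials are available to Jenkins.
```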
Alternatively, there are other approaches, for example:
Blue/green deployment with Elastic Beanstalk
Deploying a Git branch to a specific environment
Using AWS CodeStar to set up the deployment from templates (internally it uses AWS CodePipeline, CodeDeploy, etc.)
An alternative to using eb deploy is the Jenkins AWS Beanstalk Publisher plugin: https://wiki.jenkins.io/display/JENKINS/AWS+Beanstalk+Publisher+Plugin
It can be installed by going to Manage Jenkins > Manage Plugins and searching for AWS Beanstalk Publisher. The root object is the zip file of the project that you want to deploy to EB, and the build steps can include a step that zips up the files in your repo.
You will still need to fill in the Source Code Management section of the Jenkins job configuration. This must contain the URL of your GitHub repo and the credentials used to access it.
Add an Execute Shell build step that zips up the files from the repo that you want to deploy to EB. For example, zip -r myfiles.zip * will zip up all the files in your GitHub repo.
Then use the AWS Beanstalk Publisher plugin and specify myfiles.zip as the value of the Root Object (File / Directory).
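That zip build step, slightly hardened as a sketch (myfiles.zip is the same example name as above; the exclusions are an added precaution, not part of the original answer):

```shell
#!/bin/sh
# Build-step sketch: package the workspace for the Beanstalk Publisher plugin.
make_zip() {
  # -r recurses into subfolders; -x excludes the archive itself and VCS
  # metadata, so a re-run doesn't zip the previous archive back into the
  # new one and .git contents stay out of the deployment bundle.
  zip -r myfiles.zip . -x "myfiles.zip" -x ".git/*"
}

# Call make_zip from the Execute Shell step, then point the plugin's
# Root Object at myfiles.zip.
```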

Advice needed with Teamcity artifact paths

For a .NET developer, TeamCity artifact paths are not very straightforward.
For each project, I have a folder called BuildTools and, within it, folders called Drops and Inputs (Drops holds the reports and outputs; Inputs holds the config files for various command-line apps).
BuildTools/Drops/NDependOut => GenericSolution/Drops/NDepend
Is this correct? BuildTools is relative to the root of the (custom) checkout dir, and GenericSolution is relative to the root of the artifacts path (the "Artifacts" folder).
The other problem I have is that the NDepend report has a lot of images etc. in the same folder as the .html file. How would I upload this? Do I upload the entire folder (and if so, is the syntax above correct)?
In general this is right. TeamCity has an option to zip artifacts before publishing. For that, use the following syntax:
folder/folder/** => destfolder/archive.zip
Another trick is to use a TeamCity service message to publish artifacts dynamically from the build script.
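The service-message trick can be emitted from any build step by printing to stdout. The path mapping below reuses the NDepend folders from the question; the ##teamcity[publishArtifacts …] format is TeamCity's standard service-message syntax, and zipping the whole folder keeps the report's .html and its images together:

```shell
#!/bin/sh
# Emit a TeamCity service message that publishes the whole NDepend report
# folder (the .html plus its images) as a single zip. The source and
# target paths are the ones from the question.
MSG="##teamcity[publishArtifacts 'BuildTools/Drops/NDependOut => GenericSolution/Drops/NDepend.zip']"
echo "$MSG"
```

TeamCity watches the build log for these messages and publishes the artifact as soon as the line appears, without touching the build configuration's artifact-paths setting.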
