Bitbucket Pipeline Execute step multiple times or for loop through step - yaml

Hey, I'm trying to find the best way to write this Bitbucket pipeline. It has multiple 'build' steps, each set up with a condition so it builds only if there's a modification to a certain folder in the repo. Once a step finishes building its app, that app needs to be deployed. The deploy step is the same for each of the builds except for one variable. Each build is about 10 shell commands.
Looking to do either of the following:
Put the script section of the deploy step in a for loop that loops through an artifact.txt file I've created
OR
tell the deploy step to execute multiple times, based upon a variable/conditional statement.
Thanks
Tried putting the script section like this (the condition needs a command substitution, $(cat deploy.txt), rather than a bare cat):
if [[ $(cat deploy.txt) == "api" ]]; then
  source gitversion.properties
  echo "Octopus will pack, push and deploy package as version $GITVERSION_FULLSEMVER"
  export VERSION=$GITVERSION_FULLSEMVER.$BITBUCKET_BUILD_NUMBER
fi
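One way to collapse the deploys into a single step is to have each build append its app name to the artifact file, then loop over it in the deploy script. A minimal sketch, assuming a deploy.txt artifact with one app name per line (the sample contents and per-app commands here are placeholders for the real pipeline):

```shell
# deploy.txt is assumed to be an artifact written by whichever build steps
# actually ran: one app name per line. Sample contents for illustration:
printf 'api\nweb\n' > deploy.txt

while IFS= read -r app; do
  echo "Deploying ${app}"
  # Per-app deploy commands go here, e.g. (from the question):
  #   source gitversion.properties
  #   export VERSION=$GITVERSION_FULLSEMVER.$BITBUCKET_BUILD_NUMBER
done < deploy.txt
```

This keeps a single deploy step in the YAML; if no build step ran, deploy.txt is empty and the loop simply does nothing.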


How to iterate through different sub directories using for loop

I am trying to implement a Python linter using pylint in a GitHub Actions job. I get the score of each Python file, along with suggestions to improve it, but I also want the job to fail if the pylint score is below 7.0; currently it does not fail. I found a way to fail the build, but it only works for one directory level. If a subdirectory contains a Python file, it does not get linted.
for file in */*.py; do pylint --disable=E0401 "$file" --fail-under=7.0; done
This is the for loop I have used, but if there is a nested directory with another Python file, I have to write another for loop to lint it, and it would look like this:
for file in */*/*.py; do pylint --disable=E0401 "$file" --fail-under=7.0; done
Is there a way for the loop to lint all the files even when there are subdirectories? If a developer adds a new directory, this solution breaks. I have tried the find command, but it does not fail the GitHub Actions workflow when a file's pylint score is less than 7.0.
The snippet below is able to find .py files even in subdirectories. This is actually working for me.
- name: Checks if pylint score is above 7.0
  run: |
    for file in $(find . -name '*.py')
    do
      pylint --disable=E0401,W0611 "$file" --fail-under=7.0
    done
For a detailed explanation, refer to: Lint python files in subdirectories using Github Workflow
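One caveat: the loop's overall exit status depends on how the runner shell handles errors (with bash -e it stops at the first failing file; without -e, only the last file's status survives). A variant that lints every file before failing once at the end can be sketched like this, with check_file as a stand-in so the sketch runs anywhere; in CI its body would be the real call, pylint --disable=E0401,W0611 "$1" --fail-under=7.0:

```shell
# check_file stands in for the real pylint invocation (an assumption for
# this sketch): here, a file "passes" if it is non-empty.
check_file() { test -s "$1"; }

# Demo tree: one passing file, one failing (empty) file in a subdirectory.
mkdir -p demo/sub
echo "x = 1" > demo/a.py
: > demo/sub/empty.py

status=0
while IFS= read -r -d '' file; do
  check_file "$file" || { echo "FAILED: $file"; status=1; }
done < <(find demo -name '*.py' -print0)
echo "overall status: $status"
rm -rf demo
```

Using find -print0 with read -d '' also keeps filenames with spaces intact, which the for file in $(find …) form does not.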

Is there a way, or a bash script through Jenkins, to get the name of the folder in which files were modified after a push?

I am a beginner in the DevOps field and I have a "trial" repo on GitHub which I'm tracking through Jenkins.
Here is the folder.
I would like to ask if there is any way (Jenkins plugin or bash script) so that if I modify a file in the ok1, ok2, or ok3 folders and push, I can find out which folder was modified.
The console output should be something like this:
"folder ok1 was changed"
or
"folder ok1 , ok2 were changed"
...etc
thank you
Yes, you can. You just need to set up the Git repo to trigger a build job on your Jenkins machine every time you push, using the GitHub Jenkins webhook. You can read how to do this here.
Once the job builds with every push, you can just do a git diff to get the files that were changed. After that, it's just a matter of extracting the folder names using regex or string manipulation.
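The extraction step can be sketched like this. The changed variable stands in for real output of git diff --name-only HEAD~1 HEAD; the HEAD~1..HEAD range is an assumption and may need adjusting (e.g. against the previous successful build's commit):

```shell
# Stand-in for: changed=$(git diff --name-only HEAD~1 HEAD)
changed='ok1/file-a.txt
ok2/file-b.txt
ok1/file-c.txt'

# Keep only the first path component of each changed file, de-duplicated.
folders=$(printf '%s\n' "$changed" | cut -d/ -f1 | sort -u)
for f in $folders; do
  echo "folder $f was changed"
done
```

This prints one "folder … was changed" line per distinct top-level folder, matching the console output the question asks for.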

Xcode appears to be modifying paths inside bash scripts during run script build phase

I have a simple script that looks like this:
#!/bin/sh
set -eux
install_folder="${HOME}/Library/MobileDevice/Provisioning Profiles"
mkdir -p "${install_folder}"
if [[ $? != 0 ]]; then
  echo "Unable to create destination directory: ${install_folder}"
  exit 1
fi
If I run this script from the command line by doing ./my_script.sh everything works as expected. Things go wrong though when I call from Xcode as part of a run script build phase. I currently call it by having "${SRCROOT}/path/to/my_script.sh" in the run script build phase, but the same issue occurs even if I copy and paste the code above in directly.
So what's the issue? Well, it seems Xcode is causing the wrong folder to be created. When I run from the command line, I get a folder named Provisioning Profiles inside ~/Library/MobileDevice/ as expected. When I run from Xcode, the folder is named Provisioning\ Profiles (that \ is literally part of the name).
But it gets weirder. If I change the mkdir line to mkdir -p $install_folder then I'd expect to get a folder called Provisioning inside the MobileDevice folder and a folder called Profiles wherever I ran the command. That's what happens when I run from the command line. If I run from Xcode however, I get a folder Profiles inside MobileDevice but I also get a folder called Provisioning\ Profiles.
I cannot explain this behavior at all. It seems totally counter to everything I (thought) knew about shell scripts.
How is Xcode influencing this? How do I make it stop?
The trick, as always, was realising that there was more to this than I had considered. Xcode wasn't just running the script in this build phase; it was running it with a list of output files set, and then creating the paths for those files after the script ran. The script was behaving exactly as it should have; it was the extra step that made it appear to be broken.
Lesson learned: Xcode will create a folder for output files if it doesn't exist.

Execute shell script via Jenkins

I have a very simple shell script, build-dev.sh. This is what it looks like:
#artifact build script
echo "Running application build for DEV environment"
ng build --deploy-url "js/" --base-href "/my-app-ui/" --configuration=dev
mkdir dist/my-app-ui/js
mv ./dist/my-app-ui/*.{js,svg,css} ./dist/my-app-ui/js
It builds the Angular application, creates a js folder, and then moves files with the extensions js, svg and css into that folder.
When I execute this script directly myself, it works perfectly.
The issue is that I want the script to be executed by Jenkins, so I have configured an "Execute shell" step in my build. When the Jenkins job runs, it fails on the third line of the script (the mv command):
mv: cannot stat './dist/my-app-ui/*.{js,svg,css}': No such file or directory
Build step 'Execute shell' marked build as failure
I think it might be related to the fact that I have *.{js,svg,css} in my script.
Can you please tell me what I am doing wrong?
Well, I am still not sure why it does not work, but the problem is in the use of braces: {js,svg,css}.
I have replaced the mv command with three lines:
mv ./dist/my-app-ui/*.js ./dist/my-app-ui/js
mv ./dist/my-app-ui/*.svg ./dist/my-app-ui/js
mv ./dist/my-app-ui/*.css ./dist/my-app-ui/js
This works perfectly. It is still a kind of workaround, but it does exactly the same thing, so it works fine for me.
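A likely explanation, though it depends on the Jenkins box: brace expansion is a bash feature, and the "Execute shell" step runs /bin/sh by default, which on many Linux distributions is dash and does not expand braces, so the literal pattern *.{js,svg,css} matches nothing. Starting the step's script with a bash shebang should make the original one-liner work; a self-contained sketch with stand-in files:

```shell
#!/bin/bash
# Under bash, *.{js,svg,css} expands to *.js *.svg *.css before globbing;
# under dash it stays literal and mv reports "cannot stat".
mkdir -p dist/my-app-ui/js
touch dist/my-app-ui/app.js dist/my-app-ui/icon.svg dist/my-app-ui/style.css
mv ./dist/my-app-ui/*.{js,svg,css} ./dist/my-app-ui/js
moved=$(ls dist/my-app-ui/js)
echo "$moved"
rm -rf dist
```

Jenkins honors a shebang on the first line of an "Execute shell" script, so #!/bin/bash there avoids rewriting the mv into three commands.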

Access Jenkins workspace files in pipeline script

I'm quite new to Jenkins, so apologies if the question is not detailed enough, but I swear I've done my own searching first.
I have a pipeline script that needs to process files that have been pulled from a SCM (git) in a previous step.
One of the parameters passed to the pipeline is a folder where all these files reside. There may be subfolders contained in this folder and I need to process those as well.
So, for example, I may pass a parameter ./my-folder to the pipeline and my-folder may contain the following:
./my-folder/file1.json
./my-folder/file2.json
./my-folder/subfolder/file3.json
The my-folder directory will be part of the repository cloned during the build phase.
While I was developing my Groovy script locally I was doing something similar to this:
def f = new File(folder)
but this doesn't work in Jenkins given the code is running on the master while the folder is on a different node.
After extensive research, I now know that there are two ways to read files in Jenkins:
1. Use readFile. This would be OK, but I haven't found an easy way to scan an entire folder and its subfolders to load all the files.
2. Use FilePath. This would be my preferred way since it's more OO, but I haven't found a way to create an instance of this class. All the approaches I've seen while searching the internet refer to the build variable which, I'm not entirely sure why, is not defined in the script; in fact I'm getting groovy.lang.MissingPropertyException: No such property: build for class: WorkflowScript.
I hope the question makes sense otherwise I'd be happy to add more details.
Thanks,
Nico
I've managed to scan the contents of a folder using the following approach:
sh "find ${baseFolder} -name '*.*' > files.txt"
def files = readFile "files.txt"
and then loop through the lines in files.txt to open each file.
The problem is this works only for text files. I'm still unable to open a binary file (e.g. a zip file) using readFile.
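One workaround is to base64-encode the binary on the agent with a plain sh step, since readFile can then return the encoded text unmangled and the Groovy side can decode it (newer versions of the readFile step also accept an encoding: 'Base64' parameter that does this in one call). The round trip itself, sketched with a stand-in artifact:

```shell
# Stand-in binary artifact (contains NUL bytes, which a text read would mangle)
printf '\000\001\002binary-payload' > artifact.zip

base64 artifact.zip > artifact.b64      # what the sh step on the agent runs
base64 -d artifact.b64 > roundtrip.zip  # what the decoding side does

result=$(cmp -s artifact.zip roundtrip.zip && echo "round trip OK")
echo "$result"
rm -f artifact.zip artifact.b64 roundtrip.zip
```

The file names and the .zip artifact here are hypothetical; the point is only that the base64 text survives a text-oriented read while the raw bytes would not.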
