I have a user-defined variable in my Xcode project - MY_VARIABLE:
I also referenced MY_VARIABLE in my .plist file:
And then I use it in my code:
NSString *myVariable = [[NSBundle mainBundle] objectForInfoDictionaryKey:@"MY_VARIABLE"];
In my Fastfile I have an App Store lane, and only in that case I would like to change the value of MY_VARIABLE.
I'm currently using:
ENV["MY_VARIABLE"] = "appStoreValue"
but this doesn't work.
After a bit of research I found a solution.
I'm using xcargs in the gym action to pass the value as a build setting to xcodebuild:
gym(
  scheme: "MyScheme",
  configuration: "Release",
  use_legacy_build_api: 1,
  xcargs: "MY_VARIABLE=appStoreValue"
)
Thanks to https://stackoverflow.com/a/56179405/5790492 and https://nshipster.com/xcconfig/
I've created the xcconfig file and added it to the project in the Info tab. For fastlane I added a plugin that works with xcconfig files (it provides get_xcconfig_value and update_xcconfig_value). Now it looks like this:
def bumpMinorVersionNumber
  currentVersion = get_xcconfig_value(path: 'fastlane/VersionsConfig.xcconfig',
                                      name: 'FC_VERSION')
  versionArray = currentVersion.split(".").map(&:to_i)
  versionArray[2] = (versionArray[2] || 0) + 1
  newVersion = versionArray.join(".")
  update_xcconfig_value(path: 'fastlane/VersionsConfig.xcconfig',
                        name: 'FC_VERSION',
                        value: newVersion.to_s)
  UI.important("Old version: #{currentVersion}. Version bumped to: #{newVersion}")
end

def bumpBuildNumber
  currentBuildNumber = get_xcconfig_value(path: 'fastlane/VersionsConfig.xcconfig',
                                          name: 'FC_BUILD')
  newBuildNumber = currentBuildNumber.to_i + 1
  update_xcconfig_value(path: 'fastlane/VersionsConfig.xcconfig',
                        name: 'FC_BUILD',
                        value: newBuildNumber.to_s)
  UI.important("Old build number: #{currentBuildNumber}. Build number bumped to: #{newBuildNumber}")
end
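Independent of fastlane and the xcconfig plugin, the version arithmetic used above can be checked in plain Ruby. The method name bump_patch below is mine; note that the helper above bumps the third (patch) component despite its "minor" name:

```ruby
# Bump the patch (third) component of a dotted version string,
# mirroring the logic of bumpMinorVersionNumber above.
def bump_patch(version)
  parts = version.split(".").map(&:to_i)
  parts[2] = (parts[2] || 0) + 1   # a missing patch component is treated as 0
  parts.join(".")
end

puts bump_patch("1.4.2")  # => "1.4.3"
puts bump_patch("2.0")    # => "2.0.1"
```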
I'm using Gradle. I have a project like this:
project/
--- sub1/
--- sub2/
I want the artifacts uploaded as two different files (i.e. sub1.jar and sub2.jar separately).
Currently, I'm using this step:
- uses: actions/upload-artifact@v3
  with:
    name: Artifacts
    path: project*/build/libs/*.jar
But only a single artifact is uploaded, containing the subfolders and their files.
I tried running the same upload-artifact step again with different arguments, but that doesn't work.
I don't want to copy/paste the same job, because in the future I will have many subprojects and I don't want 50 lines of the same code.
How can I upload my generated files, or run the same job multiple times?
Using a matrix strategy allows you to run the same steps for each value in a list of inputs.
You can add a job like this to your workflow; it performs the same steps for each value in the matrix.
some-job:
  name: Job 1
  runs-on: ubuntu-latest
  strategy:
    matrix:
      subdir: [sub1, sub2]
  steps:
    - name: Create some files
      run: |
        mkdir -p /tmp/${{ matrix.subdir }}
        echo "test data" > /tmp/${{ matrix.subdir }}/test.jar
    - uses: actions/upload-artifact@v3
      with:
        name: Artifacts-${{ matrix.subdir }}  # one artifact per matrix value
        path: /tmp/${{ matrix.subdir }}/*.jar
It doesn't seem to be possible, so I made my own script. I'm using the same code as actions/upload-artifact for the upload itself.
We need to run a JS script with the required dependency @actions/artifact, so there are two actions to set up Node and the dependency.
My workflow looks like this:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 16
      - name: Install NPM package
        run: npm install @actions/artifact
      - uses: actions/github-script@v6
        name: Artifact script
        with:
          script: CHECK MY SCRIPT BELOW
I'm using this script to upload all the files from every subfolder:
let artifact = require('@actions/artifact');
const fs = require('fs');

// List the entries of `path`, keeping directories when `check` is true
// and plain files when it is false.
function getContentFrom(path, check) {
    return fs.readdirSync(path).filter(function (file) {
        return check == fs.statSync(path + '/' + file).isDirectory();
    });
}

function getDirectories(path) {
    return getContentFrom(path, true);
}

function getFiles(path) {
    return getContentFrom(path, false);
}

const artifactClient = artifact.create();

for (let sub of getDirectories("./")) { // get all folders
    console.log("Checking for", sub);
    let filesDir = "./" + sub; // if you are using multiple folders
    let files = [];
    for (let build of getFiles(filesDir)) {
        // here you can filter which files to upload
        files.push(filesDir + "/" + build);
    }
    console.log("Uploading", files);
    await artifactClient.uploadArtifact(
        "Project " + sub,  // one artifact per subfolder
        files,
        filesDir,
        { continueOnError: false }
    );
}
I went through the following link and successfully implemented a task which calls the build.gradle file of another project, i.e. the solution provided by @karl worked for me.
But I need something on top of that.
Can somebody help me understand how I can pass command-line arguments while calling another build.gradle? The command-line argument should be a variable that I have generated in my current build.gradle file.
In my case, I am defining a buildNumber and doing something like this:
def buildNumber = '10.0.0.1'
def projectToBuild = 'projectName'
def projectPath = "Path_till_my_Project_Dir"
task executeSubProj << {
    def tempTask = tasks.create(name: "execute_$projectToBuild", type: GradleBuild)
    // ****** I need to pass buildNumber as a command line argument to "$projectPath/$projectToBuild/build.gradle" ******
    tempTask.tasks = ['build']
    tempTask.buildFile = "$projectPath/$projectToBuild/build.gradle"
    tempTask.execute()
}
You should never call execute directly on any Gradle object. The fact that it's feasible doesn't mean it should be done; it's highly discouraged, since you intrude on Gradle's internal execution graph.
What you need is a task of type GradleBuild, which has a startParameter field that can be used to carry build options.
So:
task buildP2(type: GradleBuild) {
    buildFile = '../p2/build.gradle'
    startParameter.projectProperties = [lol: 'lol']
}
A full demo can be found here; navigate to the p1 directory and run gradle buildP2.
You should modify your script in the following way:
def buildNumber = '10.0.0.1'
def projectToBuild = 'projectName'
def projectPath = "Path_till_my_Project_Dir"
task executeSubProj(type: GradleBuild) {
    buildFile = "$projectPath/$projectToBuild/build.gradle"
    tasks = ['build']
    startParameter.projectProperties = [buildNumber: buildNumber]
}
In the project being executed, use project.findProperty('buildNumber') to get the required value.
I am trying to use the Ruby Google API Client to create a deployment on Google Cloud Platform (GCP).
I have a YAML file for the configuration:
resources:
- name: my-vm
  type: compute.v1.instance
  properties:
    zone: europe-west1-b
    machineType: zones/europe-west1-b/machineTypes/f1-micro
    disks:
    - deviceName: boot
      type: PERSISTENT
      boot: true
      autoDelete: true
      initializeParams:
        sourceImage: global/images/myvm-1487178154
    networkInterfaces:
    - network: $(ref.my-subnet.selfLink)
      networkIP: 172.31.54.11
# Create the network for the machines
- name: my-subnet
  type: compute.v1.network
  properties:
    IPv4Range: 172.31.54.0/24
I have tested that this works using the gcloud command line tool.
I now want to do this in Ruby using the API. I have the following code:
require 'google/apis/deploymentmanager_v2'
require 'googleauth'
require 'googleauth/stores/file_token_store'
SCOPE = Google::Apis::DeploymentmanagerV2::AUTH_CLOUD_PLATFORM
PROJECT_ID = "my-project"
ENV['GOOGLE_APPLICATION_CREDENTIALS'] = "./service_account.json"
deployment_manager = Google::Apis::DeploymentmanagerV2::DeploymentManagerService.new
deployment_manager.authorization = Google::Auth.get_application_default([SCOPE])
All of this is working in that I am authenticated and I have a deployment_manager object I can work with.
I want to use the insert_deployment method which has the following signature:
#insert_deployment(project, deployment_object = nil, preview: nil, fields: nil, quota_user: nil, user_ip: nil, options: nil) {|result, err| ... } ⇒ Google::Apis::DeploymentmanagerV2::Operation
The deployment_object type is Google::Apis::DeploymentmanagerV2::Deployment. I can create this object, but I do not know how to import my YAML file into it so that I can perform the deployment programmatically.
There is another class called ConfigFile which seems akin to the command-line option of specifying --config, but again I do not know how to load the file into it, nor how to turn it into the correct object for insert_deployment.
I have worked this out.
Different classes need to be nested so that the configuration is picked up. For example:
require 'google/apis/deploymentmanager_v2'
require 'googleauth'

SCOPE = Google::Apis::DeploymentmanagerV2::AUTH_CLOUD_PLATFORM
PROJECT_ID = "my-project"
ENV['GOOGLE_APPLICATION_CREDENTIALS'] = "./service_account.json"

# Set up the authorized service (as in the question)
deployment_manager = Google::Apis::DeploymentmanagerV2::DeploymentManagerService.new
deployment_manager.authorization = Google::Auth.get_application_default([SCOPE])

# Create a target configuration from the YAML file
target_configuration = Google::Apis::DeploymentmanagerV2::TargetConfiguration.new(
  config: { content: File.read('gcp.yaml') }
)

# Now create a deployment object
deployment = Google::Apis::DeploymentmanagerV2::Deployment.new(
  target: target_configuration,
  name: 'ruby-api-deployment'
)

# Attempt the deployment
response = deployment_manager.insert_deployment(PROJECT_ID, deployment)
Hope this helps someone
I have a YAML file like this:
---
name: dummy
version: 0.2.0
title: dummy
summary: dummy
Now I try to get the version number:
config = YAML.load_file('Index.yml')
oldversion = config[0]['version']
Why do I get the following error?
NoMethodError: undefined method `[]' for nil:NilClass
Try with this:
config = YAML.load_file('Index.yml')
oldversion = config['version']
With config[0] you are indexing into a sequence (array), but at the top level of your YAML file you have a mapping, not a sequence. So drop the index (oldversion = config['version']) or change your YAML file to:
---
- name: dummy
  version: 0.2.0
  title: dummy
  summary: dummy
if you eventually want a list of such objects (with name, version, etc.) in your configuration.
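The difference is easy to verify with Ruby's YAML module; a minimal sketch, with inline YAML strings standing in for Index.yml:

```ruby
require 'yaml'

# A top-level mapping parses to a Hash: index by key.
mapping = YAML.load("name: dummy\nversion: 0.2.0\n")
puts mapping['version']      # => "0.2.0"

# A top-level sequence parses to an Array: index by position first.
sequence = YAML.load("- name: dummy\n  version: 0.2.0\n")
puts sequence[0]['version']  # => "0.2.0"

# Indexing the Hash with 0 returns nil, which is why the question's
# config[0]['version'] raises NoMethodError on nil.
p mapping[0]                 # => nil
```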
Gradle is giving me fits. I have a build.gradle that includes the Java plug-in, and I have written a task within it that generates a properties file and drops it into the build/classes directory. This all works like a champ. Here is my custom task:
task generateBuildSignature << {
    def whoami = System.getProperty( 'user.name' );
    def hostname = InetAddress.getLocalHost().getHostName();
    def buildTag = System.env.BUILD_TAG ?: "dev"
    ant.propertyfile( file: "${buildDir}/buildsignature.properties",
                      comment: "This file is automatically generated - DO NOT EDIT!" ) {
        entry( key: "version", value: "${project.version}" )
        entry( key: "buildTimestamp", value: "${new Date().format('yyyy-MM-dd HH:mm:ss z')}" )
        entry( key: "buildUser", value: "$whoami" )
        entry( key: "buildSystem", value: "$hostname" )
        entry( key: "buildTag", value: "$buildTag" )
    }
}
Now I want to integrate the execution of this task into the Java build lifecycle, preferably immediately after processResources or as a dependency of classes. I'm at Chapter 23 of the documentation (http://gradle.org/docs/current/userguide/java_plugin.html), but it is not yet clear how to get my task into the dependency chain of the tasks that come from the Java plug-in. Any advice for a fledgling Gradle user?
All you have to (and can) do is to add a task dependency. For example:
classes.dependsOn(generateBuildSignature)