I'm using Gradle. My project looks like this:
project/
--- sub1/
--- sub2/
I want the artifacts uploaded as two different files (i.e. sub1.jar and sub2.jar separately).
Currently I'm using this step:
- uses: actions/upload-artifact@v3
with:
name: Artifacts
path: project*/build/libs/*.jar
But only one artifact gets uploaded, containing subfolders for each project's files.
I tried to run the same upload-artifact step again with different arguments, but I can't get that to work.
I don't want to copy/paste the same step, because in the future I will have many subprojects, and I don't want 50 lines of the same code.
How can I upload my generated files separately, or run the same step multiple times?
A matrix strategy would allow you to do this for a list of inputs.
You can add something like this as a job in a workflow; it runs the same steps for each value in the matrix.
some-job:
name: Job 1
runs-on: ubuntu-latest
strategy:
matrix:
subdir: [sub1, sub2]
steps:
- name: Create some files
run: mkdir -p /tmp/${{ matrix.subdir }} && echo "test data" > /tmp/${{ matrix.subdir }}/test.jar
- uses: actions/upload-artifact@v3
with:
name: Artifacts-${{ matrix.subdir }} # a distinct name per matrix value keeps the uploads separate
path: /tmp/${{ matrix.subdir }}/*.jar
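Applied to the Gradle layout in the question, a sketch could look like the following (untested; it assumes each subproject's jars end up in <subdir>/build/libs and gives each artifact a distinct name so the uploads stay separate):
build-job:
  name: Build and upload per subproject
  runs-on: ubuntu-latest
  strategy:
    matrix:
      subdir: [sub1, sub2]   # extend this list as subprojects are added
  steps:
    - uses: actions/checkout@v3
    - name: Build one subproject
      run: ./gradlew :${{ matrix.subdir }}:build
    - uses: actions/upload-artifact@v3
      with:
        name: Artifacts-${{ matrix.subdir }}   # one artifact per subproject
        path: ${{ matrix.subdir }}/build/libs/*.jar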
It doesn't seem to be possible out of the box, so I made my own script. I'm using the same code as actions/upload-artifact for the upload itself.
We need to run a JS script with the required dependency @actions/artifact, so there are two actions to set up Node and the dependency.
My code looks like this:
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: 16
- name: Install NPM package
run: npm install @actions/artifact
- uses: actions/github-script@v6
name: Artifact script
with:
script: CHECK MY SCRIPT BELOW
I'm using this script to upload all files in all subfolders:
let artifact = require('@actions/artifact');
const fs = require('fs');
// list entries under path: directories when check is true, plain files when false
function getContentFrom(path, check) {
return fs.readdirSync(path).filter(function (file) {
return check == fs.statSync(path+'/'+file).isDirectory();
});
}
function getDirectories(path) {
return getContentFrom(path, true);
}
function getFiles(path) {
return getContentFrom(path, false);
}
const artifactClient = artifact.create();
for(let sub of getDirectories("./")) { // get all folders
console.log("Checking for", sub);
let filesDir = "./" + sub; // if you are using multiples folder
let files = [];
for(let build of getFiles(filesDir)) {
// here you can filter which files to upload
files.push(filesDir + "/" + build);
}
console.log("Uploading", files);
await artifactClient.uploadArtifact(
"Project " + sub,
files,
filesDir,
{ continueOnError: false }
)
}
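One caveat: getDirectories("./") also returns hidden folders such as .git, so you may want a filter before the loop. A hedged one-liner, assuming hidden folders should simply be skipped:
// skip hidden folders (e.g. .git, .github) before uploading
const subdirs = getDirectories("./").filter(dir => !dir.startsWith("."));
Then iterate over subdirs instead of calling getDirectories directly in the loop.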
I have an SAP CAP Node.js application and I'm trying to send emails from it using the sap-cf-mailer package.
I've created a destination service in BTP as mentioned in the sample, but when I try to deploy the application to BTP it fails.
When I run the application locally using cds watch, it gives the following error:
VError: No service matches destination
This is my mta.yaml
## Generated mta.yaml based on template version 0.4.0
## appName = CapTest
## language=nodejs; multitenant=false
## approuter=
_schema-version: '3.1'
ID: CapTest
version: 1.0.0
description: "A simple CAP project."
parameters:
enable-parallel-deployments: true
build-parameters:
before-all:
- builder: custom
commands:
- npm install --production
- npx -p @sap/cds-dk cds build --production
modules:
# --------------------- SERVER MODULE ------------------------
- name: CapTest-srv
# ------------------------------------------------------------
type: nodejs
path: gen/srv
parameters:
buildpack: nodejs_buildpack
requires:
# Resources extracted from CAP configuration
- name: CapTest-db
- name: captest-destination-srv
provides:
- name: srv-api # required by consumers of CAP services (e.g. approuter)
properties:
srv-url: ${default-url}
# -------------------- SIDECAR MODULE ------------------------
- name: CapTest-db-deployer
# ------------------------------------------------------------
type: hdb
path: gen/db
parameters:
buildpack: nodejs_buildpack
requires:
# 'hana' and 'xsuaa' resources extracted from CAP configuration
- name: CapTest-db
resources:
# services extracted from CAP configuration
# 'service-plan' can be configured via 'cds.requires.<name>.vcap.plan'
# ------------------------------------------------------------
- name: CapTest-db
# ------------------------------------------------------------
type: com.sap.xs.hdi-container
parameters:
service: hana # or 'hanatrial' on trial landscapes
service-plan: hdi-shared
properties:
hdi-service-name: ${service-name}
- name: captest-destination-srv
type: org.cloudfoundry.existing-service
This is the js file of the CDS service
const cds = require('@sap/cds')
const SapCfMailer = require('sap-cf-mailer').default;
const transporter = new SapCfMailer("MAILTRAP");
module.exports = cds.service.impl(function () {
this.on('sendmail', sendmail);
});
async function sendmail(req) {
try {
const result = await transporter.sendMail({
to: 'someoneimportant@sap.com',
subject: `This is the mail subject`,
text: `body of the email`
});
return JSON.stringify(result);
}
catch (err) {
// at minimum, surface the failure instead of silently swallowing it
req.error(500, err.message);
}
};
I'm following the samples below for this:
Send an email from a nodejs app
Integrate email to CAP application
Did you create your default-env.json files? They are required to connect to remote services on your BTP tenant. You can find more info about this in SAP blogs like this one:
https://blogs.sap.com/2020/04/03/sap-application-router/
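For illustration, a minimal default-env.json sketch (every credential value is a placeholder to be copied from a service key of your destination service instance; the service name matches the mta.yaml above):
{
  "VCAP_SERVICES": {
    "destination": [
      {
        "label": "destination",
        "name": "captest-destination-srv",
        "tags": ["destination"],
        "credentials": {
          "clientid": "<clientid from the service key>",
          "clientsecret": "<clientsecret from the service key>",
          "url": "<xsuaa url from the service key>",
          "uri": "<destination service uri from the service key>"
        }
      }
    ]
  }
}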
You could also use the sap-cf-localenv command:
https://github.com/jowavp/sap-cf-localenv
This tool is experimental; as far as I know, it only works with CF CLI v6. Higher versions fetch the service keys in another format, which makes the command fail.
Kind regards,
Thomas
I wanted to try automated testing on each push to GitHub, so I wrote a basic API and test cases. On my local machine, where I am using Ubuntu, everything works fine. I uploaded the code to a GitHub repository and wrote the GitHub workflow below:
name: testing
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
test-code:
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v2
- name: Set up python
uses: actions/setup-python@v2
with:
python-version: 3.8.8
- name: Caching
uses: actions/cache@v2
with:
path: ${{ env.pythonLocation }}
key: ${{ env.pythonLocation }}-${{ hashFiles('setup.py') }}-${{ hashFiles('requirements.txt') }}
- name: Install dependencies
run: python -m pip install -r requirements.txt
- name: Run Django integration tests
run: python manage.py test
My test code is like below:
import requests
from django.test import TestCase, tag

PREDICT_API_URL = 'http://127.0.0.1:8000/predict/'
class PredictApi(TestCase):
'''
Using postman these can be tested as well.
'''
@tag('important')
def test_predict_negative(self):
# python manage.py test predict.tests.PredictApi.test_predict_negative
data = {
"cough": [0],
"fever": [0],
"sore_throat": [0],
"shortness_of_breath": [0],
"head_ache": [0],
"age_60_and_above": [0],
"gender": [0],
"test_indication": [0]
}
response = requests.post(PREDICT_API_URL, json=data)
results = response.json()
assert results['corona'] == 0
But when I push to GitHub, I get a connection error.
I am learning GitHub Actions, and it would be good to get this running. The GitHub link is: https://github.com/bikashckarmokar/covid_prediction
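For what it's worth, the connection error is consistent with nothing listening on 127.0.0.1:8000 inside the runner: the workflow runs python manage.py test but never starts a server, so requests.post has nothing to connect to. A hedged sketch of the usual approach is Django's built-in test client, which calls the view in-process with no server at all (the URL path is taken from the original test):
from django.test import TestCase

class PredictApi(TestCase):
    def test_predict_negative(self):
        data = {
            "cough": [0], "fever": [0], "sore_throat": [0],
            "shortness_of_breath": [0], "head_ache": [0],
            "age_60_and_above": [0], "gender": [0], "test_indication": [0]
        }
        # the test client serializes dicts itself when content_type is JSON
        response = self.client.post('/predict/', data,
                                    content_type='application/json')
        assert response.json()['corona'] == 0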
My question: how can CodePipeline read the value of a field in a JSON file which is in SourceCodeArtifact?
I have a GitHub repo that contains a file imageManifest.json which looks like this:
{
"image_id": "docker.pkg.github.com/my-org/my-repo/my-app",
"image_version": "1.0.1"
}
I want my AWS Codepipeline Source stage to be able to read the value of image_version from imageManifest.json and pass it as a parameter to a CloudFormation action in a subsequent stage of my pipeline.
For reference, here is my source stage.
Stages:
- Name: GitHubSource
Actions:
- Name: SourceAction
ActionTypeId:
Category: Source
Owner: ThirdParty
Version: '1'
Provider: GitHub
OutputArtifacts:
- Name: SourceCodeArtifact
Configuration:
Owner: !Ref GitHubOwner
Repo: !Ref GitHubRepo
OAuthToken: !Ref GitHubAuthToken
And here is my deploy stage:
- Name: DevQA
Actions:
- Name: DeployInfrastructure
InputArtifacts:
- Name: SourceCodeArtifact
ActionTypeId:
Category: Deploy
Owner: AWS
Provider: CloudFormation
Version: '1'
Configuration:
StackName: !Ref AppName
Capabilities: CAPABILITY_NAMED_IAM
RoleArn: !GetAtt [CloudFormationRole, Arn]
ParameterOverrides: !Sub '{"ImageId": "${image_version??}"}'
Note that image_version in the last line above is just an aspirational placeholder to illustrate how I hope to use the image_version JSON value.
How can CodePipeline read the value of a field in a JSON file which is in SourceCodeArtifact?
StepFunctions? Lambda? CodeBuild?
You can use a CodeBuild step in between the Source and Deploy stages.
In the CodeBuild step, read image_version from SourceCodeArtifact (the artifact produced by the source stage) and write it to a 'template configuration' file, which is a configuration property of the CloudFormation action. This file can hold parameter values for your CloudFormation stack. Use this file instead of the ParameterOverrides you are currently using.
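For illustration, a hedged buildspec sketch for that CodeBuild step (the output file name is an assumption; jq ships with the standard CodeBuild images):
version: 0.2
phases:
  build:
    commands:
      # read the version from the manifest in the source artifact
      - IMAGE_VERSION=$(jq -r '.image_version' imageManifest.json)
      # write a CloudFormation template configuration file holding it
      - printf '{"Parameters":{"ImageId":"%s"}}' "$IMAGE_VERSION" > template-config.json
artifacts:
  files:
    - template-config.json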
Fn::GetParam is what you want. It returns a value from a key-value pair in a JSON-formatted file, and the JSON file must be included in an artifact.
Here is the documentation and it gives you some examples: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/continuous-delivery-codepipeline-parameter-override-functions.html#w2ab1c13c20b9
It should be something like:
ParameterOverrides: |
{
"ImageId" : { "Fn::GetParam" : ["SourceCodeArtifact", "imageManifest.json", "image_id"]}
}
I'm getting the following error while trying to state.apply an SLS on a Windows machine:
ID: ProvisionADDC
Function: module.run
Name: dsc.apply_config
Result: False
Comment: Module function dsc.apply_config threw an exception. Exception: No JSON results from powershell. Additional info follows:
retcode:
0
stderr:
stdout:
Started: 12:06:08.044000
Duration: 2684.0 ms
Changes:
Since win_dsc is an execution module, I'm forced to use the module.run state to run this function on the minion:
C:\DSC:
file.directory:
- makedirs: True
allprofiles:
win_firewall.disabled
CopyDSCModules:
file.recurse:
- name: 'C:\Program Files\WindowsPowerShell\Modules'
- source: salt://windows/dsc/
InstallADDomainServices:
win_servermanager.installed:
- name: AD-Domain-Services
- restart: True
- require:
- file: CopyDSCModules
ProvisionADDC:
module.run:
- name: dsc.apply_config
- path: C:\DSC\
- source: salt://windows/mof
- require:
- file: 'C:\DSC'
- file: CopyDSCModules
- win_servermanager: InstallADDomainServices
Does anybody have experience with win_dsc and SaltStack?
I think it's a case of the documentation lacking a bit, but you need to actually run the configuration in the same ps1 file, e.g.:
Configuration myconfig {
Node 'localhost' {
WindowsFeature 'DNS' {
Name = 'DNS'
Ensure = 'Present'
}
}
}
myconfig # invoking the configuration here is what compiles the MOF
I'm playing with this a little at the moment and hopefully I can come up with a helpful issue/PR, because it is lacking a bit (even if just for better error logging).
I'm not sure how this works in terms of determining a specific config, as I've not tested that yet (using the config_name param).
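For what it's worth, a hedged sketch of how selecting a specific configuration might look (untested; it assumes your Salt version ships win_dsc.run_config, which accepts a config_name argument, and the ps1 file name here is an assumption):
ProvisionADDC:
  module.run:
    - name: dsc.run_config
    - path: C:\DSC\ProvisionADDC.ps1
    - config_name: myconfig   # which Configuration block to compile and apply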
I have a user-defined variable in my Xcode project, MY_VARIABLE.
I also linked MY_VARIABLE in my .plist file.
And then I use it in my code:
NSString *myVariable = [[NSBundle mainBundle] objectForInfoDictionaryKey:@"MY_VARIABLE"];
In the Fastfile I have my App Store lane and, only in that case, I would like to change the value of MY_VARIABLE.
I'm currently using:
ENV["MY_VARIABLE"] = "appStoreValue"
but this doesn't work.
After a bit of research I found a solution to this.
I'm using xcargs in the gym action, like:
gym(
scheme: "MyScheme",
configuration: "Release",
use_legacy_build_api: 1,
xcargs: "MY_VARIABLE=appStoreValue"
)
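Since the requirement is to change the value only for the App Store lane, a minimal sketch (the lane name is an assumption) keeps the xcargs override inside that one lane:
lane :appstore do
  gym(
    scheme: "MyScheme",
    configuration: "Release",
    xcargs: "MY_VARIABLE=appStoreValue"   # applied only when this lane runs
  )
end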
Thanks to https://stackoverflow.com/a/56179405/5790492 and https://nshipster.com/xcconfig/
I've created the xcconfig file and added it to the project in the Info tab. For fastlane I added the xcconfig plugin (fastlane-plugin-xcconfig) to work with xcconfig values. And now it looks like this:
def bumpMinorVersionNumber
currentVersion = get_xcconfig_value(path: 'fastlane/VersionsConfig.xcconfig',
name: 'FC_VERSION')
versionArray = currentVersion.split(".").map(&:to_i)
versionArray[2] = (versionArray[2] || 0) + 1
newVersion = versionArray.join(".")
update_xcconfig_value(path: 'fastlane/VersionsConfig.xcconfig',
name: 'FC_VERSION',
value: newVersion.to_s)
UI.important("Old version: #{currentVersion}. Version bumped to: #{newVersion}")
end
def bumpBuildNumber
currentBuildNumber = get_xcconfig_value(path: 'fastlane/VersionsConfig.xcconfig',
name: 'FC_BUILD')
newBuildNumber = currentBuildNumber.to_i + 1
update_xcconfig_value(path: 'fastlane/VersionsConfig.xcconfig',
name: 'FC_BUILD',
value: newBuildNumber.to_s)
UI.important("Old build number: #{currentBuildNumber}. Build number bumped to: #{newBuildNumber}")
end
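A hedged usage sketch, calling these helpers from a lane (the lane name and gym options are assumptions):
lane :release do
  bumpMinorVersionNumber   # FC_VERSION x.y.z -> x.y.(z+1)
  bumpBuildNumber          # FC_BUILD n -> n+1
  gym(scheme: "MyScheme", configuration: "Release")
end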