I'm using Lerna whilst building a component library using a mono-repo structure. I'd like to create a few new files in a large number of packages, but crucially, not EVERY package.
Ordinarily I would run:
lerna exec -- touch docs/readme.md
or something similar. However, I'd like to scope this command to just the packages in a sub-directory, e.g. packages/molecules.
Any ideas?
You can use the --scope filter option; lerna exec accepts all of the filter flags.
--scope Include only packages with names matching the given glob.
E.g.
Create a readme.md file only in the packages/pkg-a/docs directory:
⚡ npx lerna exec --scope pkg-a -- touch docs/readme.md
lerna notice cli v3.22.1
lerna notice filter including "pkg-a"
lerna info filter [ 'pkg-a' ]
lerna info Executing command in 1 package: "touch docs/readme.md"
lerna success exec Executed command in 1 package: "touch docs/readme.md"
lerna version:
⚡ npx lerna -v
3.22.1
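Note that --scope matches package names, not directory paths, so a glob only helps here if your package names mirror the folder layout. A sketch, assuming the molecule packages share a hypothetical @myorg/molecule- name prefix:
⚡ npx lerna exec --scope '@myorg/molecule-*' -- touch docs/readme.md
If the names don't follow a common pattern, the shell-based approach in the next answer filters on the directory path instead.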
In a shell:
# This returns all packages with directory and package name
lerna list --all --parseable --long
# We then grep for your directory name
lerna list --all --parseable --long | grep packages/molecules
# We get just the package name
lerna list --all --parseable --long | grep packages/molecules | cut -d ':' -f 2
# We add the --scope flag to each package name and put it all in one line
lerna list --all --parseable --long | grep packages/molecules | cut -d ':' -f 2 | sed 's/^/--scope=/' | xargs
# We now pass all of the above into `lerna exec` (the scope flags must come before the `--`,
# since everything after `--` is handed to the spawned command)
lerna exec $(lerna list --all --parseable --long | grep packages/molecules | cut -d ':' -f 2 | sed 's/^/--scope=/' | xargs) -- touch docs/readme.md
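For reference, each line of the --parseable --long output has the form path:name:version, which is why field 2 of the cut gives the package name. A hypothetical line (the path and name are made up):
/home/me/repo/packages/molecules/molecule-button:molecule-button:1.0.0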
References:
list
exec
filters
Related
I'm using a GitLab runner on a Mac mini server.
While logged in as the user named "runner" I managed to run this command:
gsutil ls -l gs://tests/ | grep staging | sort -k 2 | tail -n 3 | head -n 2 | awk '{print $3}' | gsutil -m cp -I .
I managed to get the files, but when I use the same command in gitlab-ci.yml like this:
stages:
  - test

test:
  stage: test
  when: always
  script:
    - gsutil ls -l gs://tests/ | grep staging | sort -k 2 | tail -n 3 | head -n 2 | awk '{print $3}' | gsutil -m cp -I .
I get the error:
bash: line 141: gsutil: command not found
I also checked, and the GitLab runner is using the same user.
The GitLab runner is configured with the shell executor.
Changing the command to use the full path of gsutil didn't help either.
I added whoami to the gitlab-ci.yml and it printed the same user, "runner".
I managed to solve this issue by using this solution:
gcloud-command-not-found-while-installing-google-cloud-sdk
I included these two lines in my gitlab-ci.yml before using the gsutil command.
source '[path-to-my-home]/google-cloud-sdk/path.bash.inc'
source '[path-to-my-home]/google-cloud-sdk/completion.bash.inc'
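Putting it together, the job from the question would look roughly like this; [path-to-my-home] is still a placeholder for the runner user's home directory:
test:
  stage: test
  when: always
  script:
    - source '[path-to-my-home]/google-cloud-sdk/path.bash.inc'
    - source '[path-to-my-home]/google-cloud-sdk/completion.bash.inc'
    - gsutil ls -l gs://tests/ | grep staging | sort -k 2 | tail -n 3 | head -n 2 | awk '{print $3}' | gsutil -m cp -I .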
I have this command
clear; sudo kubectl exec -it $(kubectl get pods | grep 'app' | cut -d ' ' -f 1) ash
and I land here:
/src #
I also want to run the command ls:
/src # ls
Procfile composer.lock phpunit.xml server.php
app config public storage
artisan database resources tests
benu.code-workspace heroku.sh routes vendor
bootstrap package-lock.json run.sh webpack.mix.js
composer.json package.json scripts
/src #
I've tried
clear; sudo kubectl exec -it $(kubectl get pods | grep 'app' | cut -d ' ' -f 1) ash echo "ls"
and
clear; sudo kubectl exec -it $(kubectl get pods | grep 'app' | cut -d ' ' -f 1) echo "ls"
Please correct me
If you want to run the command, you want kubectl exec -it $podname ls. If you put echo "ls", then that is the command which runs, i.e. it prints "ls".
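Applied to the command from the question, keeping the same grep-based pod lookup (the -- tells kubectl that everything after it is the command to run inside the container, which newer kubectl versions require):
clear; sudo kubectl exec -it $(kubectl get pods | grep 'app' | cut -d ' ' -f 1) -- ls
For a one-off, non-interactive command like ls, the -it flags are not strictly needed.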
In Xcode 10 and below, the stdout and stderr of the test suite run could be found and extracted from inside the specified -resultBundlePath using this shell command:
cp -fv $(find $resultBundlePath/1_Test -print | grep TestSuite1 | grep 'StandardOutputAndStandardError.txt' | xargs) TestSuite1Xctestrun.log
With Xcode 11+, this file no longer is found in the bundle.
Where and how do I extract it from the xcbundle?
xcrun xcresulttool --formatDescription shows that there is a logRef key with a unique value pointing to the log that we can query for if we know the item's ID.
Using jq, I was able to accomplish this task.
First we get the id of the logRef, then we extract its value from the .xcresult bundle into a text file.
# find the id that points to the location of the encoded file in the .xcresult bundle
id=$(xcrun xcresulttool get --format json --path Tests.xcresult | jq '.actions._values[]' | jq -r '.actionResult.logRef.id._value')
# export the log found at that id in the .xcresult bundle
xcrun xcresulttool export --path Tests.xcresult --id $id --output-path TestsStdErrorStdout.log --type file
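Since .actions._values[] is an array, a bundle that contains more than one action can return more than one id. A sketch that exports each log to its own numbered file, under that assumption (the // empty filter simply skips actions without a logRef):
i=0
xcrun xcresulttool get --format json --path Tests.xcresult \
  | jq -r '.actions._values[].actionResult.logRef.id._value // empty' \
  | while read -r id; do
      i=$((i + 1))
      # reuse the same export call as above, one output file per action
      xcrun xcresulttool export --path Tests.xcresult --id "$id" --output-path "TestsStdErrorStdout_$i.log" --type file
    done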
How do I use this command on Windows 10 Home:
docker-compose run api composer install --no-interaction
Example:
docker-compose run api composer install --no-interaction
Interactive mode is not yet supported on Windows.
Please pass the -d flag when using `docker-compose run`.
Is it possible?
Do you have an example?
The interactive mode support for docker-compose on Windows is tracked by issue 2836, which proposes some alternatives:
Use bash from within the container:
docker exec -it MY_CONTAINER bash
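For the original command, that could look like this (MY_CONTAINER is a placeholder for whatever docker ps reports as the running api container), either opening a shell and running composer from it, or calling composer directly:
docker exec -it MY_CONTAINER composer install --no-interaction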
Use a docker-compose-run script by Rodrigo Baron:
Script (put the function in ~/.zshrc or ~/.bashrc, in a Windows Git Bash shell for instance):
#!/bin/bash
function docker-compose-run() {
  if [ "$1" = "-f" ] || [ "$1" = "--file" ]; then
    # explicit compose file: $2 is the file, $3 the service, the rest is the command
    docker exec -i $(docker-compose -f "$2" ps "$3" | grep -m 1 "$3" | cut -d ' ' -f1) "${@:4}"
  else
    # default compose file: $1 is the service, the rest is the command
    docker exec -i $(docker-compose ps "$1" | grep -m 1 "$1" | cut -d ' ' -f1) "${@:2}"
  fi
}
docker-compose-run "$@"
Usage:
docker-compose-run web rspec
# or:
docker-compose-run -f docker-compose.development.yml web rspec
A simpler alternative is to use the -d option and then fetch the logs:
docker-compose run --rm <service> <command>
is replaced by:
docker-compose-run <service> <command>
For this to work, add this snippet to your ~/.bashrc:
docker-compose-run() {
  CONTAINER_NAME=$(docker-compose run -d "$@")
  docker logs -f "$CONTAINER_NAME"
  docker rm "$CONTAINER_NAME"
}
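With that function sourced, the command from the question would become (assuming the compose service is named api, as it is in the question):
docker-compose-run api composer install --no-interaction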
I want to take the following "Names" from this command and put them in an array for a bash script. Any ideas on the best way to do this?
$ virsh -r -c qemu:///system list --all
 Id    Name              State
----------------------------------------------------
 3     KVM_Win7-KVM      running
EDIT:
The final result was this:
declare -a kvm_list=( $(virsh -r -c qemu:///system list --all --name) )
First consider using the --name option to virsh list, so you end up with:
virsh -r -c qemu:///system list --all --name
And then read the output of that command into an array in bash, as sketched below.
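A minimal sketch, assuming bash 4+ for mapfile; the sed guards against the trailing blank line virsh tends to print after the last name:
mapfile -t kvm_list < <(virsh -r -c qemu:///system list --all --name | sed '/^$/d')
printf '%s\n' "${kvm_list[@]}"   # one VM name per line, to check the array contents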