We have multiple repositories that are added locally in one yarn workspace
```
packages
- Repo1
- Repo2
- Repo3
package.json
yarn.lock
```
Team 1 only needs Repo1 and Repo2, so they do not check out Repo3.
Team 2 only needs Repo1 and Repo3, so they do not check out Repo2.
This works fine, but we have one issue with the lockfile. If I am on team 1 and run yarn install, only the dependencies of Repo1 and Repo2 are included; if I am on team 2, only those of Repo1 and Repo3 are included.
So a shared lockfile is not possible, as we would constantly overwrite it.
Our idea now would be to have multiple lockfiles, maybe like this:
```
packages
- Repo1
- Repo2
- Repo3
package.json
team1-yarn.lock
team2-yarn.lock
```
Is there a yarn configuration that lets me change the name of the lockfile? Or do you see any other way this could be achieved?
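For what it's worth, Yarn 2/3 (Berry) had a `lockfileFilename` setting in `.yarnrc.yml` that could point at a per-team file; treat this as a version-dependent sketch, since the setting was removed again in Yarn 4:

```yaml
# .yarnrc.yml (Yarn 2/3 only; this setting no longer exists in Yarn 4)
lockfileFilename: team1-yarn.lock
```

Each team would need its own checked-in `.yarnrc.yml` variant (or an override via the `YARN_LOCKFILE_FILENAME` environment variable), which reintroduces some of the same divergence problem, so it is at best a partial workaround.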
I have a GitLab pipeline with a package step that does a Maven build on the tag event, and a release step that uploads the jar to the GitLab generic package registry using curl and the release-cli tool.
What I expect to happen is that the .m2 cache is loaded into the package step so mvn clean package can do its thing, and then only the created jar and the test results are archived.
The release step should begin clean: no git clone, no cache, and only the jar and test results.
Instead, 'find .' shows the release step contains everything, including:
- the Git directory (.git)
- the fully checked-out repository
- the .m2 cache
- target (fully built, as the package step produced it)
The GitLab caching documentation (https://docs.gitlab.com/ee/ci/caching/) states:
- archiving: the 'dependencies' keyword controls which jobs fetch the artifacts
- the cache is disabled with 'cache: []'
Why is GitLab putting so much content into the release job? The release job sometimes fails because it finds multiple jar files from previous tags (i.e., the clean and the archiving are holding on to past versions).
gitlab-ci.yml
```yaml
variables:
  MAVEN_CLI_OPTS: "-s $CI_PROJECT_DIR/.m2/settings.xml"
  MAVEN_VERSION_PLUGIN_VERSION: 2.11.0
  MAVEN_ARTIFACT_NAME: test-component
  GIT_CLEAN_FLAGS: -ffd
  PACKAGE_REGISTRY_URL: "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/${MAVEN_ARTIFACT_NAME}"

cache:
  key: primary
  paths:
    - .m2/repository

stages:
  - package
  - release

package:
  stage: package
  image: maven:latest
  script:
    - mvn ${MAVEN_CLI_OPTS} clean package
  artifacts:
    paths:
      - target/*.jar
      - target/surefire-reports
  only:
    - tags
    - merge_requests
    - branches
  except:
    - main

release:
  stage: release
  image: alpine:latest
  cache: []
  variables:
    GIT_STRATEGY: none
  dependencies:
    - package
  script:
    - |
      apk add curl gitlab-release-cli
      find .
      JAR_NAME=$(basename target/${MAVEN_ARTIFACT_NAME}-${CI_COMMIT_TAG}.jar)
      curl --header "JOB-TOKEN: ${CI_JOB_TOKEN}" --upload-file target/${JAR_NAME} ${PACKAGE_REGISTRY_URL}/${CI_COMMIT_TAG}/${JAR_NAME}
      release-cli create --name "Release $CI_COMMIT_TAG" --description "$TAG_MESSAGE" --tag-name ${CI_COMMIT_TAG} --assets-link "{\"name\":\"jar\",\"url\":\"${PACKAGE_REGISTRY_URL}/${CI_COMMIT_TAG}/${JAR_NAME}\"}"
  only:
    - tags
```
See the GitLab docs on GIT_STRATEGY:
A Git strategy of none also re-uses the local working copy, but skips all Git operations normally done by GitLab. GitLab Runner pre-clone scripts are also skipped, if present. This strategy could mean you need to add fetch and checkout commands to your .gitlab-ci.yml script.
It can be used for jobs that operate exclusively on artifacts, like a deployment job. Git repository data may be present, but it’s likely out of date. You should only rely on files brought into the local working copy from cache or artifacts.
So the GitLab documentation is pretty clear that you should always expect the git repository to be present. When you want to work exclusively with artifacts, you can create a new temporary directory and reference the path to the artifacts explicitly, rather than relying on a totally clean working directory.
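A minimal sketch of that approach (the jar name is illustrative, not taken from the pipeline above):

```shell
# Sketch: copy only the artifact into a fresh temp directory and work
# from there, instead of trusting the re-used working copy to be clean.
mkdir -p target && touch target/app-1.0.0.jar  # stand-in for the built jar
WORKDIR=$(mktemp -d)
cp target/*.jar "$WORKDIR"/
cd "$WORKDIR"
ls  # only the freshly copied jar is visible here, no stale files
```

This way globs like `target/*.jar` can never pick up leftovers from previous tags, because the temp directory starts empty.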
I have a mono-repo that contains the main code and some custom packages as workspaces.
So my directory structure is like this:
```
mainRepo
- directories of main project
- node_modules
- packages
  - my-foo-package
    - node_modules
```
Since the workspace (my-foo-package in this case) has its own node_modules, every dependency it has goes there.
But the main repo also needs the workspace's dependencies, to be able to import code from the workspace package.
After I added the workspace package, linked it, added it to the main repo's dependencies, and ran yarn install, I still cannot see the dependencies of the workspace package in the root node_modules.
Is there any step missing?
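For reference, a minimal root package.json for the setup described above might look like this (name and version are illustrative assumptions); with classic Yarn, workspace dependencies are hoisted into the root node_modules only when the root declares the workspaces like so:

```json
{
  "name": "mainRepo",
  "private": true,
  "workspaces": ["packages/*"],
  "dependencies": {
    "my-foo-package": "1.0.0"
  }
}
```

The version in "dependencies" has to match the workspace's own package.json version, otherwise Yarn fetches the package from the registry instead of linking the local workspace.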
I'm using yarn with monorepos that contain several packages. For example, packages foo and bar might be located in repo/foo and repo/bar within the monorepo root repo. The problem is that I sometimes accidentally run yarn without parameters in the repository root instead of the package directories. This creates a repo/node_modules directory and a repo/yarn.lock file. Can I somehow prevent yarn from creating node_modules and yarn.lock in the repository root directory?
I configured Google Cloud Build (GCB) to trigger a build on one of my repositories on GitHub. This repository requires another git repository in order to be built; that other repository is configured as a git submodule.
I searched, and at the moment it looks like GCB does not support submodules. So I am trying to run git submodule update --init manually on the source code that GCB downloaded, but there is no .git directory in it and the command fails.
What am I missing here?
I am using this issue as reference: https://github.com/GoogleCloudPlatform/cloud-builders/issues/435
If you trigger your build from GitHub, it will not work because of the missing .git folder. For it to work, all repositories need to be mirrored in Cloud Source Repositories. Then the submodule can be updated like this:
```yaml
- name: 'gcr.io/cloud-builders/git'
  entrypoint: 'bash'
  args:
    - '-c'
    - |
      git config -f .gitmodules submodule.[my-sub-repo-name].url https://source.developers.google.com/p/[my-project]/r/github_[my-sub-repo-name]
      git submodule init
      git submodule update
```
Ref: https://github.com/GoogleCloudPlatform/cloud-builders/issues/26
Sometimes you get an error with GIT_DISCOVERY_ACROSS_FILESYSTEM or the missing .git folder. The following worked for me:
```yaml
- id: git-submodule
  name: 'gcr.io/cloud-builders/git'
  entrypoint: 'bash'
  env: ['GIT_DISCOVERY_ACROSS_FILESYSTEM=1']
  args:
    - '-c'
    - |
      git init
      git config -f .gitmodules submodule.[my-sub-repo-name].url https://source.developers.google.com/p/[my-project]/r/github_[my-sub-repo-name]
      git submodule init
      git submodule update
```
For someone like me, who is using submodules in Bitbucket and ran into similar problems with Cloud Build: as soon as you mirror your repositories into Cloud Source Repositories, the submodule URLs must not have the .git extension.
For example, for the main repository https://source.cloud.google.com/Project_ID/main_repo and submodules a and b in the "submodules" folder, the .gitmodules configuration might look similar to this:
```
[submodule "submodules/a"]
    path = submodules/a
    url = ../a
[submodule "submodules/b"]
    path = submodules/b
    url = ../b
```
Previously, I used ../a.git and ../b.git as URLs, which works fine in Bitbucket but not in Cloud Source Repositories.
I configured two workspaces in package.json, e.g. example and gatsby-theme, but later I found I was actually developing a gatsby-starter, so I removed the example workspace, which depended on the latter workspace, from package.json.
I wonder: if I moved all files from gatsby-theme/ to the project root directory and overwrote package.json and the other files with gatsby-theme's, would it become a project that could be managed with both npm and yarn?