poetry install multiple projects in the same environment - python-poetry

I'm trying to solve what I assume is a common problem with Poetry but am unable to find the relevant documentation. My project includes multiple packages and uses pyproject.toml and Poetry to manage dependencies, with this structure:
/pyproject.toml
/poetry.lock
/package1/pyproject.toml
/package1/poetry.lock
/package1/src/package1/...
/package1/pyproject.toml includes PyPI dependencies in [tool.poetry.dependencies] and defines the buildable package as
packages = [
{ include = "package1", from = "./src" },
]
/pyproject.toml references package1 with
[tool.poetry.dependencies]
package1 = { path = "./package1", develop = true }
Finally, my Dockerfile installs the application with
WORKDIR /app/package1
RUN poetry install
WORKDIR /app
RUN poetry install
The problem is that Poetry installs each "project" (identified by the pyproject.toml file) in a separate virtual environment, and doesn't seem to support installing both in the same environment. When I execute the application it can find package1 but none of package1's dependencies.
How can I get everything installed in the same environment?
How am I supposed to handle this situation?
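For reference, a hedged sketch of one way this is often resolved (assuming the root /pyproject.toml declares package1 as a path dependency with develop = true, as above): install only the root project, and let Poetry resolve package1's own dependencies through the root lock file, so everything lands in one environment.

```dockerfile
# Sketch, not verified against every Poetry version: a single install
# from the root project. Poetry resolves package1 (a path dependency)
# together with package1's own PyPI dependencies into one environment.
WORKDIR /app
RUN poetry install
```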

Related

Installing a library from codeartifact with poetry without moving all libraries to be installed from there

My situation: I have a specific library in CodeArtifact, and a project (not a library) that manages its dependencies with Poetry and consumes this library.
Now I need to install this library in the project, but I can't find instructions that:
don't change the actual PyPI index URL for all projects on my computer;
allow installing that specific library from CodeArtifact without CodeArtifact becoming a proxy for PyPI for ALL libraries.
Any ideas or clues on how to achieve this?
In order to do that you need to add a source to your pyproject.toml and mark it as secondary:
[[tool.poetry.source]]
name = "my-repo"
url = "your-pypi-url"
secondary = true
And then point your library to this source.
With CLI:
poetry add my-library --source my-repo
Or directly in pyproject.toml:
[tool.poetry.dependencies]
…
my-library = {version = "1.0.0", source = "my-repo"}
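Putting both fragments together, the relevant pyproject.toml sections might look like this (the source name, URL, and version are placeholders from the answer above):

```toml
[[tool.poetry.source]]
name = "my-repo"
url = "your-pypi-url"
secondary = true   # resolved after the default PyPI index

[tool.poetry.dependencies]
my-library = { version = "1.0.0", source = "my-repo" }
```

Note that newer Poetry releases deprecate secondary = true in favour of a priority field (e.g. priority = "supplemental"), so check which form your Poetry version expects.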

How can we manage front-end projects' dependency packages like Maven in IDEA

There are more and more front-end projects, and each project has its own node_modules folder, which means a lot of duplicate files across those folders.
How can we manage the dependency packages of all front-end projects in one folder, the way Maven does in IDEA?
Requirements:
When running and packaging different projects, WebStorm can reference the dependency packages in a specified folder.
When npm install runs, the computer checks whether the shared dependency folder already contains the version the current project needs.
If so, it is not downloaded and installed again.
If not, the dependency is downloaded into the shared folder.
When multiple versions of the same dependency exist, each project automatically references the correct version.
Maybe after reading my question, you know my actual needs better than I do. Thank you.
If you look at the package.json file in any npm-based front-end project, you will see all the dependencies of the current project and can manage their versions there. npm install installs the dependencies listed in that file.
Read more about package.json here: package.json
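As a minimal illustration (the package names and versions are made up), dependency versions live under dependencies and devDependencies, and npm install reads them from there:

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "lodash": "^4.17.21"
  },
  "devDependencies": {
    "eslint": "^8.0.0"
  }
}
```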
Using Yarn workspaces
Yarn's workspace feature solves the disk "black hole" problem of many projects each duplicating large node_modules folders: normally, when npm install is executed in a project, all of its dependency packages are installed again under that project's own node_modules folder.
Install the yarn tool first
npm i yarn -g
If there are projects project-a and project-b in the root folder, the directory structure is as follows:
root
project-a
project-b
create package.json in the root folder, with the following contents:
{
"private": true,
"workspaces": ["project-a", "project-b"]
}
Ensure that the name fields in the package.json of project-a and project-b match the workspace entries:
package.json in project-a:
{
...
"name": "project-a"
...
}
package.json in project-b:
{
...
"name": "project-b"
...
}
From the command line, enter the root folder and run yarn install.
3.1 After installation, you can start each project as usual.
Tips:
4.1 All dependency packages are installed under the root/node_modules folder.
4.2 Link files are generated under each subproject's node_modules folder; do not delete them.
4.3 When installing a new dependency package, update the subproject's package.json and then run yarn install in the root directory.

Yarn How to Specify Install Path for a Package

I migrated away from Bower, and now my dev dependencies are in node_modules/ while my prod dependencies are in node_modules/#bower_components/. This is good because I can easily grab all the front-end dependencies in gulp with node_modules/#bower_components/**/*. Now I want to install new "bower" packages and have them also appear in node_modules/#bower_components/, but they are installed into node_modules/. I tried yarn add <package> --modules-folder node_modules/#bower_components, but this moves all my packages into the subfolder.

How can `setuptools` `dependency_links` be used with the latest master branch of a Git repository?

I want to be able to pip install a package that installs a dependency package from GitHub. I want the version of that dependency package it installs to be the latest code in the master branch of the repository (i.e. I am not referencing a release of the package) (and there is a different version of the package for Python 2 and for Python 3). When I attempt to do this, the dependency is ignored. How can I get the dependency to be picked up and installed?
In setup.py I have lines like the following:
dependency_links = [
"git+https://github.com/veox/python2-krakenex.git;python_version<'3.0'",
"git+https://github.com/veox/python3-krakenex.git;python_version>='3.0'",
],
When I run pip, I do it using commands of the following form:
sudo pip install package_name --upgrade --process-dependency-links
I don't think it's possible. dependency_links aren't versioned; they're simply a list of URLs for packages listed in install_requires. Those packages could be versioned, but not in your case: you're trying to provide two URLs for one package, which would confuse pip.
Perhaps you could rename one of the packages and provide package names in the URLs:
install_requires=[
'krakenex;python_version<3',
'krakenex3;python_version>=3',
],
dependency_links = [
"git+https://github.com/veox/python2-krakenex.git#egg=krakenex;python_version<'3.0'",
"git+https://github.com/veox/python3-krakenex.git#egg=krakenex3;python_version>='3.0'",
],

Package nodejs application with global packages

We have a project which has to be packaged as a zip so we can distribute it to our clients. With the normal node_modules directory I have no problems: I just put the directory and node.exe together in my project folder and can start our project on every other computer without installing Node or running any npm command.
But now I have a dependency on PhantomJS, which needs to be installed as a global package: npm install -g phantomjs.
How do I pack modules like this into our project? I first thought of copying PhantomJS into the local node_modules directory and setting the NODE_PATH environment variable to this directory, but it doesn't find PhantomJS.
Development and client platforms are both Windows.
Well, generally it is fine to install global dependencies with the --save flag and call their bins like ./node_modules/phantomjs/bin/phantomjs /*now executes*/ (just as an illustrative example).
However, with Phantom it's not that simple, since it downloads binaries and/or even compiles. You have three options:
ssh into the target and just npm install -g phantomjs beforehand, or declare it in a manifest, e.g. a Dockerfile, if you are using containers.
Compile it from source as advised here.
If you are using the CLI, then just the --save approach.
So I would strongly advise making a Docker image out of it and shipping it as a tarball. You can't zip the platform-dependent Phantom installation, unfortunately.
Also, lots of dependencies like karma-runner-phantomjs look for the path of the global dependency to resolve it for their use.
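A small illustration of the --save approach mentioned above (the version is a placeholder): install phantomjs as a local dependency and call it through an npm script, since npm puts node_modules/.bin on the PATH when running scripts:

```json
{
  "dependencies": {
    "phantomjs": "^2.1.7"
  },
  "scripts": {
    "render": "phantomjs render.js"
  }
}
```

With this, npm run render resolves the locally installed binary without any global install, though the binary-download caveat from the answer still applies.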
