How to convert requirements.txt file to environment.yml - pip

I have a requirements.txt file which is updated with pip freeze. I want to transfer all of this information to a conda environment, and for that I need a .yml file. Is there a good way to convert the txt file to yml?
This is pretty much how my txt file looks (just some of the libs)
alembic==1.8.1
argon2-cffi==21.3.0
argon2-cffi-bindings==21.2.0
astor==0.8.1
asttokens==2.0.5
attrdict==2.0.1
attrs==22.1.0
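One common approach is to wrap the pip pins in a conda environment file under a `pip:` subsection, so conda delegates those packages to pip. A minimal sketch (the environment name `myenv` and the helper function are my own, assuming a requirements file of plain `name==version` pins):

```python
def requirements_to_yml(requirements: str, name: str = "myenv") -> str:
    """Wrap pinned pip requirements in a conda environment.yml skeleton."""
    pins = [ln.strip() for ln in requirements.splitlines()
            if ln.strip() and not ln.startswith("#")]
    lines = [f"name: {name}", "dependencies:", "  - pip", "  - pip:"]
    lines += [f"      - {pin}" for pin in pins]
    return "\n".join(lines) + "\n"

print(requirements_to_yml("alembic==1.8.1\nattrs==22.1.0"))
```

The resulting file can then be fed to `conda env create -f environment.yml`; conda installs pip first and pip installs the pinned packages.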

Related

Where is the location of homebrew installed youtube-dl configuration? [macOS]

I am new to youtube-dl and programming in general, so this has been a lot for me to get even this far. Homebrew is installed on my MacBook, and Homebrew was used to install youtube-dl and ffmpeg. I read about a configuration file located at ~/.config/youtube-dl/config in
https://github.com/ytdl-org/youtube-dl/blob/master/README.md#options
So in my user directory I opened .config (which turned out to be hidden) and found no youtube-dl folder, let alone the config file mentioned in the link.
Why is that?
How do I make a configuration file to use?
As written on the GitHub page:
Note that by default configuration file may not exist so you may need to create it yourself.
You can simply create the directory ~/.config/youtube-dl yourself, then use your favorite text editor to place your options in a new file and save it as ~/.config/youtube-dl/config.
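The steps above can also be scripted. A minimal sketch (the helper function is my own; the example option contents are placeholders, not required settings):

```python
from pathlib import Path

def write_config(base: Path, options: str) -> Path:
    """Create <base>/.config/youtube-dl/config with the given options."""
    cfg_dir = base / ".config" / "youtube-dl"
    cfg_dir.mkdir(parents=True, exist_ok=True)  # create missing directories
    cfg_file = cfg_dir / "config"
    cfg_file.write_text(options)
    return cfg_file

# Point it at your real home directory to match what youtube-dl reads:
# write_config(Path.home(), "# always extract audio\n-x\n")
```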

is it possible to get a gemspec from a .gem archive?

This might be really naive, as I have never had to do this before, but is there a way to generate a .gemspec file from a .gem? Like the opposite of gem build xxx.
A .gem file is basically a .tar archive. Inside this .tar archive you will have 2 important files:
data.tar.gz - containing the source code and possibly a .gemspec file (this is not a guarantee though)
metadata.gz - basically a .gemspec, but in YAML format, which you won't easily be able to convert back
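Since a .gem is a plain tar archive, the YAML gemspec can be pulled out with the standard library alone. A sketch (the function name is my own, and the usage file name is hypothetical):

```python
import gzip
import tarfile

def read_gem_metadata(gem_path: str) -> str:
    """Extract metadata.gz from a .gem and return the YAML gemspec text.

    A .gem is a plain tar archive; metadata.gz inside it is the
    gemspec serialized as YAML.
    """
    with tarfile.open(gem_path, "r") as gem:
        member = gem.extractfile("metadata.gz")
        return gzip.decompress(member.read()).decode("utf-8")

# Hypothetical usage:
# print(read_gem_metadata("some-gem-1.0.0.gem"))
```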

Export conda yml environment file without package version alphanumeric text

As a newbie in Python, I have successfully exported my project environment into a yml file so that I can share it. See a sample here:
name: climate
channels:
  - conda-forge
  - defaults
dependencies:
  - affine=2.3.0=py_0
  - bokeh=2.4.2=py310h5588dad_0
However, I wish to have the dependencies without the build string that follows each package version.
I'd like to have bokeh=2.4.2, not bokeh=2.4.2=py310h5588dad_0.
Add the --no-builds flag when you're exporting the environment.
conda env export --no-builds > environment.yml
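If you already have an exported file with build strings, they can also be stripped after the fact. A minimal sketch (the function name and regex are my own, assuming the standard `- name=version=build` layout that conda emits):

```python
import re

def strip_builds(yml_text: str) -> str:
    """Drop the trailing build string from conda dependency lines,
    e.g. '  - bokeh=2.4.2=py310h5588dad_0' -> '  - bokeh=2.4.2'."""
    return re.sub(r"^(\s*- [A-Za-z0-9._-]+=[^=\s]+)=\S+$", r"\1",
                  yml_text, flags=re.MULTILINE)

print(strip_builds("  - bokeh=2.4.2=py310h5588dad_0"))
```

Lines without a build string (channel names, `name:`, plain `pip` entries) do not match the pattern and pass through unchanged.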

How to pip install interdependent packages from a local directory in editable mode using a requirements file

I'm having issues with pip failing to install editable packages from a local directory. I was able to install the packages manually using commands like pip install -e pkg1. I wanted to use a requirements.txt file to automate future installs, because my coworkers will be working on the same packages. My ideal development workflow is for each developer to checkout the source from version control and run pip install -r requirements.txt. The requirements file would designate all the packages as editable so we can import our code without the need for .pth files but we wouldn't have to keep updating our environments. And by using namespace packages, we can decouple the import semantics from the file structures.
But it's not working out.
I have a directory with packages like so:
index/
    pkg1/
        src/
            pkg1/
                __init__.py
                pkg1.py
        setup.py
    pkg2/
        src/
    ...etc.
Each setup.py file contains something like:
from setuptools import setup, find_packages

setup(
    name="pkg1",
    version="0.1",
    packages=find_packages('src'),
    package_dir={'': 'src'},
)
I generated my requirements.txt file using pip freeze, which yielded something like this:
# Editable install with no version control (pkg1==0.1)
-e c:\source\pkg1
# Editable install with no version control (pkg2==0.1)
-e c:\source\pkg2
...etc...
I was surprised when pip choked on the requirements file that it created for itself:
(venv) C:\Source>pip install -r requirements.txt
c:sourcepkg1 should either be a path to a local project or a VCS url beginning with svn+, git+, hg+, or bzr+
Also, some of our packages depend on others of our packages, and pip has been absolutely useless at identifying these dependencies. I have resorted to manually installing packages in dependency order.
Maybe I'm pushing pip to its limits here. The documentation and online help have not been helpful so far. Most sources discuss editable installation, installation from requirements files, package dependencies, or namespace packages, but never all of these concepts at once. Usually when online help is scarce, it means that I'm trying to use a tool for something it wasn't intended to do, or I've discovered a bug.
Is this development process viable? Do I need to make a private package index or something?
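One likely culprit for the mangled path in the error (`c:sourcepkg1`): pip appears to treat backslashes in requirements files as escape characters, so `c:\source\pkg1` loses its separators. A sketch of a workaround, assuming the requirements file sits in the directory that contains the package folders, is to hand-write the editable entries with relative POSIX-style paths instead of regenerating them with pip freeze:

```
# requirements.txt -- editable installs by relative path (assumed layout)
-e ./pkg1
-e ./pkg2
```

Relative forward-slash paths work on Windows as well and sidestep the backslash handling entirely.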

Automatically read requirements.txt in fabric or deploy

I have a flask app where I'm trying to automate deployment to EC2.
Not a big deal, but is there a setting in either Fabric or Distribute that reads the requirements.txt file directly for the setup.py, so I don't have to spell everything out in the setup(install_requires=[]) list, rather than writing a file reader for my requirements.txt? If not, do people have recommendations or suggestions on auto-deployment and with pip?
I'm reviewing from here and here.
Not a big deal, but is there a setting in either Fabric or Distribute
that reads the requirements.txt file directly for the setup.py, so I
don't have to spell everything out in the setup(install_requires=[])
list, rather than writing a file reader for my requirements.txt?
You might still want to check out frb's answer to the duplicate question How can I reference requirements.txt for the install_requires kwarg in setuptools.setup?, which provides a straightforward two-line solution for writing a file reader.
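A sketch of that file-reader approach (the helper name is my own; it assumes the requirements file holds only plain pins, since setuptools would not understand `-e` or `-r` lines):

```python
from pathlib import Path

def read_requirements(path: str = "requirements.txt") -> list[str]:
    """Read pinned requirements for setup(install_requires=...),
    skipping blanks, comments, and pip-only flags like -e / -r."""
    lines = Path(path).read_text().splitlines()
    return [ln.strip() for ln in lines
            if ln.strip() and not ln.startswith(("#", "-"))]

# In setup.py:
# setup(..., install_requires=read_requirements())
```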
If you really want to avoid this, you could alternatively add the common pip install -r requirements.txt to your fabfile.py, e.g.:
# ...
# create a place where we can unzip the tarball, then enter
# that directory and unzip it
run('mkdir /tmp/yourapplication')
with cd('/tmp/yourapplication'):
    run('tar xzf /tmp/yourapplication.tar.gz')
    # now install the requirements with our virtual environment's
    # pip installer
    run('/var/www/yourapplication/env/bin/pip install -r requirements.txt')
    # now set up the package with our virtual environment's
    # python interpreter
    run('/var/www/yourapplication/env/bin/python setup.py install')
