When I run the command
conda info --envs
it lists three environments, but I know I have more, because when I activate other environments that I remember creating, the activation works.
Is this an indication that something is wrong with my conda environments?
Is there a way to fix it?
I'm running a Windows 10 system with Python 3.5 installed.
If the environments that are not being listed are in some non-standard location, you can always add their parent directory to the list of directories Conda searches:
conda config --append envs_dirs /path/to/directory/containing/other/environments
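To confirm the directory was registered and the environments are now discoverable (the directory path above is a placeholder), you can re-check the config and the listing:

conda config --show envs_dirs
conda info --envs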
I created Anaconda environments in a previous session, but after starting a new session in my terminal, it seems like Anaconda forgot the names of my old environments. Can anyone tell me why this is?
I tried accessing my old environments by doing conda activate qts1, but it gave me this error:
EnvironmentNameNotFound: Could not find conda environment: qts1
You can list all discoverable environments with `conda info --envs`.
When I run conda info --envs, I get a list showing the environment paths but not their names.
So my previously defined environments exist, but their names have all been deleted? How do I fix this so I can use my old environments?
Stupid question... I forgot I had installed Miniconda, and that was the problem. Specifically, the previous environments were created under Anaconda, not Miniconda.
To access previously defined Anaconda environments, just activate them by path, for example:
conda activate /Users/usr/opt/anaconda3/envs/qts1
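Alternatively, if you want the names to resolve again without typing full paths, you can append the Anaconda envs directory to Miniconda's search path (adjust the path to your Anaconda location):

conda config --append envs_dirs /Users/usr/opt/anaconda3/envs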
conda 4.10.1
airflow 2.2.2
I normally run a script in the following manner:
conda activate env
python /path to script/script.py
So I put those two commands into a bash script and used the BashOperator like so:
t1 = BashOperator(
    task_id='testtask',
    depends_on_past=False,
    bash_command='/path to bash/script.bash ',
    retries=0,
)
and got the dreaded "conda is not setup to activate environments" error.
Then I did:
conda init bash
conda activate env
python /path to script/script.py
but of course the shell has to be restarted, and I don't know how to do that in Apache Airflow. There has to be a default arg or some secret with .bashrc etc. to activate Anaconda environments in non-interactive mode, but I'm a Windows-to-conda transplant and a tutorial is not handy.
There's this other solution, which basically does a bunch of tricky things to start Python in the environment of your choice:
How to run Airflow PythonOperator in a virtual environment
That secret hack is to just run the Python interpreter from inside the environment:
bash_command='~/anaconda3/envs/env_of_choice/bin/python /python_files/python_task1.py',
This guy was able to do it on anaconda 3.9!
How to change working directory and specify conda environment in Apache Airflow
But mysteriously, my environment and my base environment report the same Python. When I run env in both environments, the differences are the following:
CONDA_SHLVL=2 instead of 1
CONDA_PREFIX_1=/Users/me/opt/anaconda3
PATH includes /Users/me/opt/anaconda3/envs/env_of_choice/bin
CONDA_PREFIX=/Users/me/opt/anaconda3/envs/env_of_choice
CONDA_DEFAULT_ENV=sfdc
There are a few ways to go. Maybe I didn't set up the environment correctly and it's using the base Python instead of creating its own Python in the virtual environment; I used a yml file. It's also really tempting to just set these environment variables in the DAG, but maybe that's not the accepted way? I couldn't find a tutorial. What's the right path? Or maybe my version, 4.10.1, is too advanced and I should downgrade to 3.9. Too many options. Advice?
The way I ended up doing this was to use the conda run command (inspired by this answer). conda run lets you execute a command in a conda environment programmatically, without needing to activate it, and this works within Airflow.
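A minimal sketch of what that looks like in a DAG (the environment name env_of_choice and the script path are placeholders):

t2 = BashOperator(
    task_id='testtask_conda_run',
    bash_command='conda run -n env_of_choice python /python_files/python_task1.py',
    retries=0,
)

If your script streams logs, recent conda versions also offer conda run --no-capture-output, which stops conda from buffering the script's output until it exits.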
I'm new to Anaconda and conda. I have created an identical environment in two different directories. Is it safe to just delete the env folder for the environment I no longer need, or do I need to do something in the Anaconda Prompt to remove the environment thoroughly? I'm not sure whether creating an environment in a local folder leaves a trace in the registry or somewhere else on the computer that needs to be removed too.
conda remove --name myenv --all
Another option is
conda env remove --name myenv
Effectively no difference from the accepted answer, but personally I prefer to use conda env commands when operating on whole envs, and reserve conda remove for managing individual packages.
The difference between these and deleting the folder manually is that Conda provides action hooks for when packages are removed, and so allows packages to execute pre-unlink and post-unlink scripts.
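For example, on Unix those hooks are shipped as hidden scripts inside the package's bin/ directory. A rough way to check whether anything in an env uses them (a sketch, assuming a Miniconda install at ~/miniconda3 and an env named myenv):

ls ~/miniconda3/envs/myenv/bin/.*-pre-unlink.sh

Deleting the folder by hand would skip any such scripts, which is why the conda commands above are preferable.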
I use conda 4.7.11 with auto_activate_base: false in ~/.condarc. I installed htop using conda install -c conda-forge htop, and it was installed at ~/miniconda3/bin/htop. When I am in the base environment I am able to use htop, because ~/miniconda3/bin is prepended to the PATH variable. But when I am outside all environments, only ~/miniconda3/condabin is prepended to PATH. When I am in any environment other than base, ~/miniconda3/envs/CUSTOM_ENV/bin and ~/miniconda3/condabin are prepended to PATH, but not ~/miniconda3/bin; that's why I can use htop only from the base environment. So my question is: how can I use htop, installed via conda, from all environments, including when all environments are deactivated?
Please don't suggest using package managers like apt or yum in my case (CentOS), because I have no root access to use them. Thank you in advance.
Conda environments aren't nested, so what is in base is not inherited by the others. Isolation of environments is the imperative requirement, so it should make sense that the content in base env is not accessible when it isn't activated.
Option 1: Environment Stacking
However, there is an option to explicitly stack environments, which at this point literally means what you're asking for, namely, keeping the previous environment's bin/ in the PATH variable. So, if you have htop installed only in base, then you can retain access to it in other envs like so:
conda activate base
conda activate --stack my_env
If you decide to go this route, I think it would be prudent to be very minimal about what you install in base. Of course, you could also create a non-base env to stack on, but then it might be a bother to have to always activate this env, whereas in default installs, base auto-activates.
Starting with Conda v4.8 there will be an auto_stack configuration option:
conda config --set auto_stack true
See the documentation on environment stacking for details.
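A quick way to verify that stacking kept base's bin/ on PATH (a sketch, assuming htop lives in base under ~/miniconda3):

conda activate base
conda activate --stack my_env
echo $CONDA_SHLVL   # 2, i.e., two environments are active
which htop          # resolves to ~/miniconda3/bin/htop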
Option 2: Install by Default
If you want to have htop in every env but not outside of Conda envs, then the naive solution is to install it in every env. Conda offers a simple solution to this called Default Packages, and is in the Conda config under the key create_default_packages. Running the following will tell Conda to always install htop when creating a new env:
conda config --add create_default_packages htop
Unfortunately that won't update any existing envs, so you'd still have to go back and do that (e.g., Install a package into all envs). There's also a --no-default-packages flag for ignoring default packages when creating new envs.
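For example, after setting the option above, a plain create picks the default package up, while the flag opts out (the env names and Python version here are placeholders):

conda create -n with_htop python=3.9
conda create -n bare --no-default-packages python=3.9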
Option 3: Global Installs
A Word of Caution
The following two options are not official recommendations, so caveat emptor and, if you do ever use them, be sure to report such a non-standard manipulation of $PATH when reporting problems/troubleshooting in the future.
Linking
Another option (although more manual) is to create a folder in your user directory (e.g., ~/.local/bin) that you add to $PATH in your .bashrc and create links in there to the binaries that you wish to "export" globally. I do this with a handful of programs that I wanted to use independently of Conda (e.g., emacs) even though they are installed and managed by Conda.
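A sketch of that setup, assuming a Miniconda install at ~/miniconda3 with htop installed in base:

mkdir -p ~/.local/bin
ln -s ~/miniconda3/bin/htop ~/.local/bin/htop
# then in ~/.bashrc:
export PATH="$HOME/.local/bin:$PATH"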
Dedicated Env
If you plan to do this with a bunch of software, then it might work to dedicate an env to such global software and just add its whole ./bin dir to $PATH. Do not do this with base - Conda wants to strictly manage that itself since Conda v4.4. Furthermore, do not do this with anything Python-related: stick strictly to native (compiled) software (e.g., htop is a good example). If an additional Python of the same version ends up on your $PATH this can create a mess in library loading. I've never attempted this and prefer the manual linking because I know exactly what I'm exporting.
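If you do try it, the shape would be something like this (the env name global_tools is a placeholder, and again, native tools only):

conda create -n global_tools -c conda-forge htop
# then in ~/.bashrc:
export PATH="$HOME/miniconda3/envs/global_tools/bin:$PATH"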
I have been using Miniconda for a while and have set up conda environments for each of my projects. What I can't figure out after looking through the documentation is whether there is a way to bind/associate a conda environment with its project folder, so that when I activate a specific conda environment it moves directly into the associated project directory. virtualenvwrapper, for example, can do this. Is conda able to do this?
As mentioned in Activating an environment, conda automatically executes "activation scripts" when an environment is activated. These scripts are typically provided by conda packages installed in the environment.
Just add a script of your own with a cd command. See here for details:
https://stackoverflow.com/a/43415167/11451509
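A minimal sketch of such a script (the env name and project path are placeholders): any *.sh file placed in the environment's etc/conda/activate.d/ directory is sourced on every conda activate, so a one-line cd is enough.

conda activate my_env
mkdir -p "$CONDA_PREFIX/etc/conda/activate.d"
echo 'cd /path/to/project' > "$CONDA_PREFIX/etc/conda/activate.d/cd_project.sh"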