Superuser for Google Colab Notebooks - terminal

I want to add some Python packages using pip in my Google Colab notebook terminal and also set my environment variable (a JSON file).
Where can I find a terminal so that I can do these tasks?
Is it even possible to run Google Colab Notebook as a superuser?

Colab notebooks execute as root on ephemeral VMs. You can run pip install and configure environment variables like so:
Pip install:
!pip install -q matplotlib-venn
Environment variables:
import os
os.environ['YOUR_VARIABLE'] = 'value'
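If the value comes from a JSON file, a minimal sketch (the file path, variable name, and key below are just placeholders) would be:
import json, os
# point a variable at the JSON file itself (e.g. for service-account credentials) ...
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/content/credentials.json'
# ... or read a single value out of the file
with open('/content/credentials.json') as f:
    os.environ['YOUR_VARIABLE'] = str(json.load(f)['your_key'])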
Did you encounter a particular error attempting to do something specific?

Related

How do I clone a Google Colab environment to a local Anaconda environment on Windows?

I have a notebook that works fine in Google Colab. I am not able to properly create an Anaconda environment with the packages due to dependency issues. Is there a way to install all required packages from Colab into a local Windows Anaconda environment? Using pip freeze gives a list that is appropriate for Linux but not for Windows.
In Google Colab, run the following code:
!pip freeze >> requirements.txt
All the installed packages will be written to this file, ready to be downloaded.
Later, on your local machine, run:
pip install -r requirements.txt
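Because pip freeze on Colab records Linux-specific builds and exact version pins, the file often fails to install as-is on Windows. One rough workaround (a sketch, not a complete solution; some Colab-only packages may still need removing by hand) is to strip the pins and let pip resolve versions for your platform:
# strip exact version pins from a pip freeze dump
with open('requirements.txt') as f:
    names = [line.split('==')[0].split(' @ ')[0].strip() for line in f
             if line.strip() and not line.startswith('#')]
with open('requirements_unpinned.txt', 'w') as f:
    f.write('\n'.join(names) + '\n')
# then: pip install -r requirements_unpinned.txt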

Best way to copy python environment of Google colab

I have code which was developed on Google Colab and now I want to run it on a local machine or on a server. The problem is my code has a lot of dependencies and it's getting difficult to prepare a virtual/conda environment. The code works perfectly on Colab. So is there any way that I can have an image of that environment so it's easier for me to run it wherever I want?
You can run the command !pip freeze and you will get all installed packages and their versions:
You can copy-paste this into a requirements.txt on your local machine, or you can download the file with these steps:
Execute in one cell !pip freeze >> requirements.txt
Go to the left panel:
Right-click on the file created and select download
The last step is to create the environment; you can use conda or virtualenv. Then use pip with this command: pip install -r requirements.txt
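Alternatively, you can skip the right-click step and trigger the download from a Colab cell (this assumes the file was written as requirements.txt):
from google.colab import files
files.download('requirements.txt')  # prompts the browser to save the file locally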

difference between '%pip' and '!pip' in python jupyter notebook and colab

what is the difference between '!pip' and '%pip'? We can use both of them in jupyter notebook and google colab. But we can not use the simple 'pip' there, can we?
%pip is a magic command and works mostly the same as pip, but it installs into the current kernel's Python environment. You can list the other magic functions available in Jupyter with %lsmagic
(https://ipython.readthedocs.io/en/stable/interactive/magics.html#magic-lsmagic)
Using ! lets you run shell commands such as ls or pip, or whatever else is available on your OS.
Colab works like Jupyter here, so you can also run plain pip in a cell, e.g. pip install tensorflow.
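The practical difference is that %pip always installs into the Python environment of the running kernel, while !pip runs whichever pip is first on the system PATH, which may belong to a different interpreter. A quick illustration (the package name is arbitrary):
!pip install requests   # shell escape: uses the pip found on the OS PATH
%pip install requests   # magic: targets the current kernel's environment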
See related answers:
What is the meaning of exclamation and question marks in Jupyter notebook?
What is %pylab?
How do I install Python packages in Google's Colab?

How to resolve ModuleNotFoundError: No module named 'twitter' [duplicate]

I installed Anaconda (with Python 2.7), and installed Tensorflow in an environment called tensorflow. I can import Tensorflow successfully in that environment.
The problem is that Jupyter Notebook does not recognize the new environment I just created. Whether I start Jupyter Notebook from the GUI Navigator or from the command line within the tensorflow env, there is only one kernel in the menu called Python [Root], and Tensorflow cannot be imported. Of course, I clicked on that option multiple times, saved the file, re-opened it, but none of this helped.
Strangely, I can see the two environments when I open the Conda tab on the front page of Jupyter. But when I open the Files tab and try to create a new notebook, I still end up with only one kernel.
I looked at this question:
Link Conda environment with Jupyter Notebook
But there isn't such a directory as ~/Library/Jupyter/kernels on my computer! This Jupyter directory only has one sub-directory called runtime.
I am really confused. Are Conda environments supposed to become kernels automatically? (I followed https://ipython.readthedocs.io/en/stable/install/kernel_install.html to manually set up the kernels, but was told that ipykernel was not found.)
I don't think the other answers are working any more, as conda stopped automatically setting environments up as jupyter kernels. You need to manually add kernels for each environment in the following way:
source activate myenv
python -m ipykernel install --user --name myenv --display-name "Python (myenv)"
As documented here: http://ipython.readthedocs.io/en/stable/install/kernel_install.html#kernels-for-different-environments
Also see this issue.
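To confirm the new kernel was registered, you can list the installed kernelspecs; the entry for myenv should appear under your user kernels directory:
jupyter kernelspec list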
Addendum:
You should be able to install the nb_conda_kernels package with conda install nb_conda_kernels to add all environments automatically, see https://github.com/Anaconda-Platform/nb_conda_kernels
If your environments are not showing up, make sure you have installed
nb_conda_kernels in the environment with Jupyter
ipykernel and ipywidgets in the Python environment you want to access (note that ipywidgets is to enable some Jupyter functionality, not environment visibility, see related docs).
Anaconda's documentation states that
nb_conda_kernels should be installed in the environment from which
you run Jupyter Notebook or JupyterLab. This might be your base conda
environment, but it need not be. For instance, if the environment
notebook_env contains the notebook package, then you would run
conda install -n notebook_env nb_conda_kernels
Any other environments you wish to access in your notebooks must have
an appropriate kernel package installed. For instance, to access a
Python environment, it must have the ipykernel package; e.g.
conda install -n python_env ipykernel
To utilize an R environment, it must have the r-irkernel package; e.g.
conda install -n r_env r-irkernel
For other languages, their corresponding kernels must be installed.
In addition to Python, by installing the appropriate kernel package, Jupyter can access kernels for a ton of other languages including R, Julia, Scala/Spark, JavaScript, bash, Octave, and even MATLAB.
Note that at the time of originally posting this, there was a possible cause from nb_conda not yet supporting Python 3.6 environments.
If other solutions fail to get Jupyter to recognize other conda environments, you can always install and run jupyter from within a specific environment. You may not be able to see or switch to other environments from within Jupyter though.
$ conda create -n py36_test -y python=3.6 jupyter
$ source activate py36_test
(py36_test) $ which jupyter
/home/schowell/anaconda3/envs/py36_test/bin/jupyter
(py36_test) $ jupyter notebook
Notice that I am running Python 3.6.1 in this notebook.
Note that if you do this with many environments, the added storage space from installing Jupyter into every environment may be undesirable (depending on your system).
One annoying thing is that in your tensorflow environment you may be able to launch jupyter notebook even without jupyter installed in that environment. To make the environment visible as a kernel, just run
(tensorflow) $ conda install jupyter
and the tensorflow environment should now be visible in Jupyter Notebooks started in any of your conda environments as something like Python [conda env:tensorflow].
I had to run all the commands mentioned in the top 3 answers to get this working:
conda install jupyter
conda install nb_conda
conda install ipykernel
python -m ipykernel install --user --name mykernel
Just run conda install ipykernel in your new environment; only then will you get a kernel for this env. This works even if you have different versions installed in each env, and it doesn't install jupyter notebook again. You can start your notebook from any env and you will be able to see the newly added kernels.
Summary (tldr)
If you want the 'python3' kernel to always run the Python installation from the environment where it is launched, delete the User 'python3' kernel, which is taking precedence over whatever the current environment is with:
jupyter kernelspec remove python3
Full Solution
I am going to post an alternative and simpler solution for the following case:
You have created a conda environment
This environment has jupyter installed (which also installs ipykernel)
When you run the command jupyter notebook and create a new notebook by clicking 'python3' in the 'New' dropdown menu, that notebook executes python from the base environment and not from the current environment.
You would like it so that launching a new notebook with 'python3' within any environment executes the Python version from that environment and NOT the base
I am going to use the name 'test_env' for the environment for the rest of the solution. Also, note that 'python3' is the name of the kernel.
The currently top-voted answer does work, but there is an alternative. It says to do the following:
python -m ipykernel install --user --name test_env --display-name "Python (test_env)"
This will give you the option of using the test_env environment regardless of what environment you launch jupyter notebook from. But, launching a notebook with 'python3' will still use the Python installation from the base environment.
What likely is happening is that there is a user python3 kernel that exists. Run the command jupyter kernelspec list to list all of your kernels. For instance, on a Mac you will be returned something like the following (my user name is Ted).
python3 /Users/Ted/Library/Jupyter/kernels/python3
What Jupyter is doing here is searching through three different paths looking for kernels. It goes from User, to Env, to System. See this document for more details on the paths it searches for each operating system.
Kernels in the User path are available regardless of the environment from which you launch a jupyter notebook. This also means that if there is another 'python3' kernel at the environment level, then you will never be able to access it.
To me, it makes more sense that choosing the 'python3' kernel from the environment you launched the notebook from should execute Python from that environment.
You can check to see if you have another 'python3' environment by looking in the Env search path for your OS (see the link to the docs above). For me (on my mac), I issued the following command:
ls /Users/Ted/anaconda3/envs/test_env/share/jupyter/kernels
And I indeed had a 'python3' kernel listed there.
Thanks to this GitHub issue comment (look at the first response), you can remove the User 'python3' environment with the following command:
jupyter kernelspec remove python3
Now when you run jupyter kernelspec list, assuming the test_env is still active, you will get the following:
python3 /Users/Ted/anaconda3/envs/test_env/share/jupyter/kernels/python3
Notice that this path is located within the test_env directory. If you create a new environment, install jupyter, activate it, and list the kernels, you will get another 'python3' kernel located in its environment path.
The User 'python3' kernel was taking precedence over any of the Env 'python3' kernels. By removing it, the active environment 'python3' kernel was exposed and able to be chosen every time. This eliminates the need to manually create kernels. It also makes more sense in terms of software development where one would want to isolate themselves into a single environment. Running a kernel that is different from the host environment doesn't seem natural.
It also seems that this User 'python3' is not installed for everyone by default, so not everyone is confronted by this issue.
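Putting the steps of this answer together as one shell sequence (the environment name and the exact paths in the output will differ on your machine):
conda activate test_env              # the env whose Python you want notebooks to use
jupyter kernelspec list              # shows the User-level 'python3' kernel taking precedence
jupyter kernelspec remove python3    # remove the User-level kernel
jupyter kernelspec list              # the remaining 'python3' now resolves inside test_env
jupyter notebook                     # new 'python3' notebooks use test_env's Python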
To add a conda environment to Jupyter:
In Anaconda Prompt:
run conda activate <env name>
run conda install -c anaconda ipykernel
run python -m ipykernel install --user --name=<env name>
** Tested on conda 4.8.3 and 4.11.0
$ conda install nb_conda_kernels
(in the conda environment where you run jupyter notebook) will make all conda envs available automatically. For access to other environments, the respective kernels must be installed. Here's the ref.
This worked for me on Windows 10 (latest solution):
1) Go inside that conda environment (activate your_env_name)
2) conda install -n your_env_name ipykernel
3) python -m ipykernel install --user --name your_env_name --display-name "your_env_name"
(NOTE: Include the quotes around "your_env_name" in step 3)
The nb_conda_kernels package is the best way to use jupyter with conda. With minimal dependencies and configuration, it allows you to use other conda environments from a jupyter notebook running in a different environment. Quoting its documentation:
Installation
This package is designed to be managed solely using conda. It should be installed in the environment from which you run Jupyter Notebook or JupyterLab. This might be your base conda environment, but it need not be. For instance, if the environment notebook_env contains the notebook package, then you would run
conda install -n notebook_env nb_conda_kernels
Any other environments you wish to access in your notebooks must have an appropriate kernel package installed. For instance, to access a Python environment, it must have the ipykernel package; e.g.
conda install -n python_env ipykernel
To utilize an R environment, it must have the r-irkernel package; e.g.
conda install -n r_env r-irkernel
For other languages, their corresponding kernels must be installed.
Then all you need to do is start the jupyter notebook server:
conda activate notebook_env # only needed if you are not using the base environment for the server
# conda install jupyter # in case you have not installed it already
jupyter notebook
Despite the plethora of answers and #merv's efforts to improve them, it is still hard to find a good one. I made this one CW, so please vote it to the top or improve it!
This is an old thread, but running this in Anaconda prompt, in my environment of interest, worked for me:
ipython kernel install --name "myenvname" --user
We have struggled a lot with this issue, and here's what works for us. If you use the conda-forge channel, it's important to make sure you are using updated packages from conda-forge, even in your Miniconda root environment.
So install Miniconda, and then do:
conda config --add channels conda-forge --force
conda update --all -y
conda install nb_conda_kernels -y
conda env create -f custom_env.yml -q --force
jupyter notebook
and your custom environment will show up in Jupyter as an available kernel, as long as ipykernel was listed for installation in your custom_env.yml file, like this example:
name: bqplot
channels:
- conda-forge
- defaults
dependencies:
- python>=3.6
- bqplot
- ipykernel
Just to prove it working with a bunch of custom environments, here's a screen grab from Windows:
I ran into this same problem where my new conda environment, myenv, couldn't be selected as a kernel for a new notebook. And running jupyter notebook from within the env gave the same result.
My solution, and what I learned about how Jupyter notebooks recognizes conda-envs and kernels:
Installing jupyter and ipython to myenv with conda:
conda install -n myenv ipython jupyter
After that, running jupyter notebook outside any env listed myenv as a kernel along with my previous environments.
Python [conda env:old]
Python [conda env:myenv]
Running the notebook once I activated the environment:
source activate myenv
jupyter notebook
hides all my other environment-kernels and only shows my language kernels:
python 2
python 3
R
This has been so frustrating. My problem was that within a newly constructed conda python36 environment, jupyter refused to load "seaborn", even though seaborn was installed within that environment. It seemed able to import plenty of other packages from the same environment, for example numpy and pandas, but just not seaborn. I tried many of the fixes suggested here and on other threads without success, until I realised that Jupyter was not running the Python from within that environment as its kernel but the system Python, even though a decent-looking kernel and kernel.json were already present in the environment. It was only after reading this part of the ipython documentation:
https://ipython.readthedocs.io/en/latest/install/kernel_install.html#kernels-for-different-environments
and using these commands:
source activate other-env
python -m ipykernel install --user --name other-env --display-name "Python (other-env)"
I was able to get everything going nicely. (I didn't actually use the --user flag.)
One thing I have not yet figured out is how to set the default python to be the "Python (other-env)" one. At present an existing .ipynb file opened from the Home screen will use the system python. I have to use the Kernel menu "Change kernel" to select the environment python.
I had a similar issue and I found a solution that works for Mac, Windows and Linux. It takes a few key ingredients that are in the answers above:
To be able to see conda envs in Jupyter notebook, you need:
the following package in your base env:
conda install nb_conda
the following package in each env you create:
conda install ipykernel
check the configuration of jupyter_notebook_config.py
first check whether you have a jupyter_notebook_config.py in one of the locations given by jupyter --paths
if it doesn't exist, create it by running jupyter notebook --generate-config
add or be sure you have the following: c.NotebookApp.kernel_spec_manager_class='nb_conda_kernels.manager.CondaKernelSpecManager'
The envs you can see in your terminal:
In Jupyter Lab you can see the same envs as above, for both the Notebook and the Console:
And you can choose your env when you have a notebook open:
The safe way is to create a specific env from which you will run your jupyter lab command. Activate your env, then add any jupyter lab extensions you need, and then you can run jupyter lab.
While #coolscitist's answer worked for me, there is also a way that does not clutter your kernel environment with the complete jupyter package+deps.
It is described in the ipython docs and is (I suspect) only necessary if you run the notebook server in a non-base environment.
conda activate name_of_your_kernel_env
conda install ipykernel
python -m ipykernel install --prefix=/home/your_username/.conda/envs/name_of_your_jupyter_server_env --name 'name_of_your_kernel_env'
You can check if it works using
conda activate name_of_your_jupyter_server_env
jupyter kernelspec list
First you need to activate your environment.
pip install ipykernel
Next you can add your virtual environment to Jupyter by typing:
python -m ipykernel install --name=my_env
Follow the instructions in the iPython documentation for adding different conda environments to the list of kernels to choose from in Jupyter Notebook. In summary, after installing ipykernel, you must activate each conda environment one by one in a terminal and run the command python -m ipykernel install --user --name myenv --display-name "Python (myenv)", where myenv is the environment (kernel) you want to add.
Possible Channel-Specific Issue
I had this issue (again) and it turned out I installed from the conda-forge channel; removing it and reinstalling from anaconda channel instead fixed it for me.
Update: I again had the same problem with a new env, this time I did install nb_conda_kernels from anaconda channel, but my jupyter_client was from the conda-forge channel. Uninstalling nb_conda_kernels and reinstalling updated that to a higher-priority channel.
So make sure you've installed from the correct channels :)
I encountered this problem when using vscode server.
In the conda environment named "base", I installed the 1.2.0 version of opennmt-py, but I want to run jupyter notebook in the conda environment "opennmt2", which contains code using opennmt-py 2.0.
I solved the problem by reinstalling jupyter in conda(opennmt2).
conda install jupyter
After reinstalling, executing jupyter notebook in the opennmt2 environment will execute the newly installed jupyter
where jupyter
/root/miniconda3/envs/opennmt2/bin/jupyter
/root/miniconda3/bin/jupyter
For conda 4.5.12, what works for me is (my virtual env is called nwt)
conda create --name nwt python=3
after that I need to activate the virtual environment and install the ipykernel
activate nwt
pip install ipykernel
then what works for me is:
python -m ipykernel install --user --name env_name --display-name "name of your choosing."
As an example, I am using 'nwt' as the display name for the virtual env. After running the commands above, run 'jupyter notebook' in Anaconda Prompt again. What I get is:
Using only environment variables:
python -m ipykernel install --user --name $(basename $VIRTUAL_ENV)
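A variant of the same idea that also sets a readable display name (this assumes a virtualenv is currently active so that $VIRTUAL_ENV is defined):
python -m ipykernel install --user --name "$(basename $VIRTUAL_ENV)" --display-name "Python ($(basename $VIRTUAL_ENV))"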
I just wanted to add to the previous answers: in case installing nb_conda_kernels, ipywidgets and ipykernel doesn't work, make sure your version of Jupyter is up to date. My envs suddenly stopped showing up after a period of everything working fine, and it resumed working after I simply updated jupyter through the Anaconda Navigator.
In my case, using Windows 10 and conda 4.6.11, by running the commands
conda install nb_conda
conda install -c conda-forge nb_conda_kernels
from the terminal while having the environment active didn't do the job after I opened Jupyter from the same command line using jupyter notebook.
The solution was apparently to open Jupyter from the Anaconda Navigator by going to my environment in Environments: open Anaconda Navigator, select the environment in Environments, press the "play" button on the chosen environment, and select 'Open with Jupyter Notebook'.
Environments in Anaconda Navigator to run Jupyter from the selected environment

Jupyter Notebook set-up

The Jupyter notebook worked initially, but I tried importing tensorflow and that would not work, so that led to me messing up everything.
I basically messed everything up, and I feel like the only way out now is to just nuke my device and restart. I had no idea what pip and anaconda are (still don't really), tried a bunch of funky updates and installations and whatever and now everything is just dead. My jupyter notebook cannot even run the normal python kernel.
How can I hard reset everything?
As a bonus, it would help if someone could ELI5 what the differences between conda, pip, gitbash, and PowerShell are. And what versions of stuff does Jupyter run on (since my conda and device had different versions of things, I think)? I use Windows 10.
My first piece of advice is to not use Windows, though I'll probably get downvote spam for that. On Ubuntu, I could stuff Jupyter setup into one line:
# update, install python3, python3-dev, and pip3; get pip packages
sudo apt-get update && sudo apt-get install -y python3 python3-dev python3-pip && sudo -H python3 -m pip install jupyter notebook ipykernel tensorflow
Once the packages are installed, it's as easy as running jupyter notebook in the terminal.
Anaconda is a distribution of Python that includes a ton of pre-built packages, including Jupyter and scipy, numpy, pandas, etc. It's an "out of the box" solution basically, that comes with most of the tools you need. "Pip" is a package manager for Python; pip install [package] lets you use a package in your script, like import [package]. In this case, that's tensorflow.
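For example, once pip install tensorflow has finished, the package is available to import in any script or notebook cell run by that Python (the version printed is just whatever got installed):
import tensorflow as tf
print(tf.__version__)  # confirms the install is visible to this interpreter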
ipykernel is a package that will open up a Python kernel for Jupyter. You could run a Jupyter notebook on a Python3.7 backend but do stuff with Python2 code by installing ipykernel with Python2's pip, usually (on Ubuntu) sudo apt-get update && sudo apt-get install -y python-pip && sudo -H python -m pip install ipykernel.
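To make that second interpreter show up as a selectable kernel, you register it with ipykernel; the kernel name and display name below are just examples:
python2 -m ipykernel install --user --name python2 --display-name "Python 2"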
What happens when you run jupyter notebook? Do you get errors? Can you get the notebook to open, but there's just no kernel to attach to a notebook?
I have just set up a new Windows 10 machine for Python, Jupyter, and Tensorflow. I did the set-up without anaconda. I did the normal set-up procedure with some special steps:
1) Python 3.8 and Jupyter as installed by "pip install" do not work out of the box. You need to add three lines of code to a module that is installed as a dependency when you install Jupyter: change asyncio.py
2) Current Tensorflow does not work with Python 3.8. You need to install Python 3.7. You don't need to delete your Python 3.8 if you have one. Create a virtual environment with virtualenv as described here and give it the path to your Python 3.7: Special Python in virtualenv
3) If you want to use an NVIDIA GPU in Tensorflow, you need to deal with the fact that two things do not fit together: current Tensorflow and the current version of the 'NVIDIA GPU Computing Toolkit' (a tool you need for GPU support). Take a look here for the fix: cudart64_XYZ.dll not found
Let's start with the basics:
As a bonus, if someone were to ELI5 the difference between conda, pip, gitbash, and powershell are
You probably know the classical cmd.exe which opens a basic terminal where you can use different commands and call programs from. It is basically a text based way to interact with your operating system.
PowerShell is, in my understanding, just an extension of this (I don't use it myself), with more capabilities and better scripting support.
gitbash is an optional tool that you probably installed when you installed git on your computer. It emulates the bash shell that many people know from operating systems like Ubuntu, where bash is often the default terminal, and therefore makes things easier, as the syntax and commands are the same ones those people are used to.
None of these is directly related to using Python on your computer, other than being able to type python or jupyter notebook into these terminals to start the applications.
To the more python specific questions:
conda is a package and virtual environment management tool. It can be used to install a variety of software and also to create virtual environments to keep different setups separate from one another (e.g. different python versions on the same machine). But it is not limited to python. It comes pre-installed when you download and install miniconda or anaconda, which are two python distributions.
pip is a package manager only for python packages and comes pre-installed with most python distributions.
anaconda/miniconda, oftentimes confused with conda, are two python distributions, i.e. what you would consider as "I installed python on my system", that come with the conda package manager pre-installed. miniconda does not ship any other packages, while anaconda comes with a long list of useful packages pre-installed and is therefore a popular choice when you want easy access into using python for your research.
For more info, you can also read understanding-conda-and-pip
How can you save your system now
I basically fucked everything up
It's difficult to assess the current state of your system, but I would suggest you try the following steps to get to a working condition again:
Go into Settings -> Apps and remove everything that is related to python or anaconda. Make sure that everything is deleted by also searching (using the Windows search feature) for python or conda folders somewhere in C:\Users. This should make sure that everything about your setup is purged.
Make sure that the python, pip and jupyter commands no longer work in your cmd (confirming the purge)
Download and install miniconda
Now create a virtual environment and install tf. This is a good way to go because if you should manage to f*k up the environment, you can just delete and recreate it without much trouble:
conda create -n venv pip python=3.7 #create environment
conda activate venv #activate the environment
conda install jupyter #for jupyter notebook
pip install https://storage.googleapis.com/tensorflow/windows/gpu/tensorflow_gpu-2.1.0-cp37-cp37m-win_amd64.whl
Start the notebook server with jupyter notebook. Since it only exists in this environment, same as tensorflow, there should be no more issues using tensorflow normally.
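A quick sanity check inside the new notebook, assuming the GPU wheel above installed cleanly, is to import TensorFlow and ask which devices it can see:
import tensorflow as tf
print(tf.__version__)                          # should report 2.1.0 for the wheel above
print(tf.config.list_physical_devices('GPU'))  # an empty list means no usable GPU/driver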
