In JupyterLab, how do I clear previous TensorBoard graphs? (macOS)

I am using JupyterLab on a Mac to run TensorFlow, and I am wondering how I can clear all the previous graphs every time I rerun my code. Whenever I rerun, TensorBoard overlays all the previous graphs on top of the new one.
Thanks

You should restart the kernel via the Kernel tab at the top of the notebook.
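If restarting the kernel alone doesn't help, the overlaid curves usually come from stale event files that TensorBoard keeps reading from its log directory. A minimal sketch, assuming a hypothetical ./logs directory (adjust to whatever --logdir you pass to TensorBoard):

```shell
# Simulate a stale run, then wipe the whole log directory
# before re-running the code that writes new event files.
mkdir -p ./logs/old_run
rm -rf ./logs
```

Restart TensorBoard afterwards so it stops showing the deleted runs.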

Related

Jupyter Notebook became extremely slow after a conda update and installation

Recently I've been trying to set up an environment to use the GPU in Jupyter Notebook, and ran a couple of commands:
'''
pip install tensorflow-gpu
conda update -n base conda
'''
In the end I didn't get what I wanted; instead, it ruined my existing Jupyter Notebook performance. The commands printed a few messages about downloading and removing packages I don't know well.
Now the kernel is noticeably slow. It is bearable when I'm running simple pandas or numpy code, but it has clearly slowed down for libraries with heavier computations, like sklearn. Code that used to take 3 to 5 minutes now takes around 30 minutes.
I've tried reinstalling Anaconda3 multiple times, but that doesn't take me back to the default settings. Can someone help me fix this, or just reset my Jupyter Notebook setup (including packages) to the state when I first installed Anaconda3?
I'm quite new to software, so I'm not very familiar with certain terms and concepts. If someone can explain the solution step by step, I'd really appreciate it.
Thank you
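One common way out of a broken base environment, rather than reinstalling Anaconda repeatedly, is to leave base alone and work from a fresh environment. A hedged sketch (the environment name and package list here are arbitrary examples, not from the question):

```shell
# Create a clean environment instead of trying to repair base.
conda create -n fresh-jupyter -y python=3.10 jupyter numpy pandas scikit-learn
conda activate fresh-jupyter
jupyter notebook
```

If the new environment performs normally, the slowdown was almost certainly caused by the packages the update pulled into base, not by Jupyter itself.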

DataSpell running very slow

My projects run much more slowly in JetBrains DataSpell than in Jupyter Notebook. When I hyper-tune my parameters using sklearn, it takes nearly 50 times longer than in Jupyter Notebook.
Does anyone have any solutions?

Julia .jl in Jupyter notebook

I'm new to Julia. I have a file called "example.jl" and I want to open it in Jupyter. I've added the Julia kernel to my Jupyter kernels.
My question is:
Is there a terminal command, like
jupyter notebook blabla.ipynb [which I use to open my notebooks]
that opens my "example.jl" script in Jupyter with the right Julia kernel?
I looked into many pages and couldn't find an answer.
P.S.: What I do now is open a notebook with the Julia kernel and copy the Julia script into it. But I'd like to know if there are more elegant ways to open .jl files.
Generally you need to create a new empty Jupyter notebook with Julia kernel and copy-paste your code there.
There is also a nice Julia implementation, Weave.jl. Since Jupyter's format is more complex, special code formatting is required (for hints, see the pictures at https://github.com/JunoLab/Weave.jl). Once that is done, you can do the conversion like this:
'''
convert_doc("examples/some_code.jl", "some_notebook.ipynb")
'''
There are other (usually Python-based) tools that can, under some circumstances, split a source-code file into several Jupyter cells, but again this always assumes some specific code formatting.
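One such Python-based tool is Jupytext, which recognizes Julia scripts annotated with its "percent" cell markers (# %%). A hedged sketch, assuming Jupytext is installed (pip install jupytext) and that example.jl uses those markers:

```shell
# Hypothetical file name; Jupytext splits cells on "# %%" markers
# and writes example.ipynb next to the script.
jupytext --to notebook example.jl
```

The resulting .ipynb will use whatever Julia kernel you have registered with Jupyter.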
P.S. If you are looking for an IDE, try Visual Studio Code with the Julia extension, which is great.

Jupyter shows no output

I am running a few cells in my Jupyter notebook. After running a cell, I don't get any output the second time. I have to restart the kernel, after which I can see the output when running the cell, but when I rerun the cell there is no output again. I'm not sure of the reason. I am using Jupyter with Anaconda.
I read a thread that said the output was going to the IPython terminal in their case; I'm not sure whether something similar is happening in my case or whether it's something else.
I know the question is a little vague, but I can't think of any more info I can provide.
Thanks

PDB debugger stability questions

I am using either pdb or ipdb to debug my Python code. However, whenever I use set_trace() I can typically run a handful of lines to test, but it eventually freezes while I am typing. I then kill the Python process and have to re-run everything from the start, which usually costs about 5-10 minutes of data-processing time to get back to where I was.
I am using an anaconda build with python 2.7.
The only anomaly is that I needed to run
'''
conda install -c conda-forge psycopg2=2.6.2
'''
in order to be able to use psycopg2. I have been ignoring this issue for the last two months but realize it isn't an acceptable workflow.
Any thoughts to help resolve would be appreciated.
Resolved it.
I still don't know why this happens, but if I press Caps Lock twice while it is frozen, it unlocks set_trace(). Don't ask me why, but it works.