yarn cache size on Mac OS too big - yarnpkg

I just used CleanMyMac's Space Lens feature to understand what was eating my disk space, and I found this under ~/Library/Caches.
Even with a lot of imagination, I can't think of a reason for that folder being so big. Is it possible to safely (and periodically) delete this folder?
Thank you

Yes, you can delete that directory (or run yarn cache clean -- see How to clear cache in Yarn?).
Yarn, by default, caches the packages it downloads (including different versions). If you delete this cache, the main side effect you'll see is that yarn install may take longer, because Yarn will need to fetch the necessary packages again.
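If you want to check where the cache lives or clear it from the command line, Yarn 1 provides both (a minimal sketch; the exact path it prints varies by OS and Yarn version):
yarn cache dir
yarn cache clean
The first prints the cache location (on macOS it is typically under ~/Library/Caches/Yarn), the second removes the cached packages.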

Related

Miniconda environments taking up ~2 GB each

I am using SpaceSniffer to try to clean up some space on my drive. I noticed my miniconda folder was about 25 GB. The "pkgs" subfolder is about 11 GB, which I think makes sense given that I use tensorflow, keras, etc.
What doesn't make sense to me is why each separate environment takes up about 2 GBs of space. Has anyone else run into this issue?
This depends on how often you update your packages. However, you can always use
conda clean --all
to remove the index cache, lock files, unused cache packages, and tarballs.
https://docs.conda.io/projects/conda/en/latest/commands/clean.html
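If you want to preview what conda clean would remove before committing, it supports a dry run (nothing is deleted):
conda clean --all --dry-run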

Why are there multiple copies of conda files?

I installed Miniconda a while ago, and since then I've noticed there seem to be several copies of the same files (or files with very similar names) in different locations on my computer.
For example, almost exactly the same files in my folder "C:/ProgramData/Miniconda/pkgs" are also in the folder "C:/Users/me/.conda/pkgs". I should note that the only other things in the ".conda" folder are an "environments.txt" file and an "envs" folder with a file called "conda_envs_dir_test".
I've also noticed that the folder "C:/ProgramData/Miniconda/Lib/site-packages" also contains files with very similar names.
Anyway, I wanted to ask whether all this is necessary, and why. Sorry if this seems like a weird question; I'm still relatively new to programming.
Conda Package Caching
Conda downloads and unpacks packages into a package cache, and then uses hardlinking to install those packages into environments. One can freely delete the files in the package caches, though this undermines Conda's ability to minimize redundancy across environments going forward. The safest way to clear the package cache is to use the command
conda clean -tp
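If you want to verify the hardlinking for yourself on Windows, fsutil can list every hardlink pointing at a given file (the path below is a placeholder; pick any file under pkgs):
fsutil hardlink list "C:\ProgramData\Miniconda\pkgs\<package>\<file>"
Each environment the package is linked into shows up as an additional path in the output.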
Multiple Package Caches
It should be noted that you appear to have two package caches: a system-level cache at C:/ProgramData/Miniconda/pkgs and a user-level cache at C:/Users/me/.conda/pkgs. This occurs when users install with the "Install for All Users" option. That option is typically not recommended for regular end users; it is meant more for system administrators managing a multi-user installation. Conda functions perfectly well (and arguably with less hassle) without ever needing elevated privileges.
All that to say, you may need to elevate your privileges for the conda clean command to also clear out the system-level cache. Additionally, if you haven't been using it too long, you may consider uninstalling the system-level install and reinstalling at the user level.
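To see which package caches your Conda is actually configured to use (and in which priority order), you can ask Conda directly:
conda config --show pkgs_dirs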

Is it safe to delete ~/.conda/pkgs [duplicate]

I ran this command to release disk space on anaconda
$ conda clean --all
However, there are still some big files remaining in the pkgs folder in Anaconda Python.
Is it safe to manually delete all the files in the pkgs folder? Is there any risk of corrupting my Anaconda environment? What are the side effects, if any?
I am using Anaconda 2018 on Windows 10.
Actually, under certain conditions it is an option to have the pkgs subdirs removed. As stated here by Anaconda Community Support: "the pkgs directory is only a cache. You can remove it completely if you want to.
However, when creating new environments, it is more efficient to leave whatever packages are in the cache around."
According to the documentation you can use conda clean --packages to remove unused packages in pkgs (which moves them to pkgs/.trash, from which you can then safely delete them). While this does not check for packages installed using symlinks back to the package cache, this is not a concern if you don't use such environments or if you work under Windows. I guess that's why conda clean --packages is included in conda clean --all.
To save space more aggressively, you can use conda clean --force-pkgs-dirs to remove all writable package caches (with the same caveat that there could be environments linked to these dirs). If you don't use such environments, or you use Anaconda under Windows, you're probably safe. Personally, I use this option without issues.
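Putting those options side by side (the base path here comes from conda info --base; adjust if your layout differs):
conda clean --packages
rm -rf "$(conda info --base)/pkgs/.trash"
conda clean --force-pkgs-dirs
The first pair moves unused packages to pkgs/.trash and then drops that folder; the last command is the aggressive variant that removes the writable caches entirely.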
Edit Commentary
After reviewing the documentation pointed out in @Robert's answer, I must admit my initial response was overly alarmist and, in parts, blatantly incorrect. My apologies for the misleading response.
Nevertheless, I believe some of what I raised still has merit for this thread, so I am retaining the answer with amendments. In particular, I think it is worth emphasizing that deleting the pkgs directory may not actually achieve what the OP was hoping for (saving space), and that removing the package cache undermines Conda's redundancy-minimization strategy going forward by making it impossible to share already-installed packages.
Instead, my final recommendation concurs with what @Robert suggested, namely: use conda clean -p to delete unused packages, but keep the cache (the pkgs dir) so that future environments can still leverage hardlinks. One last point to note is that some tools, such as conda-pack, rely on the integrity of the package cache in order to work, so deleting pkgs will prevent their use.
Amended Original Response
No, it is definitely not safe, and in fact the only way you would actually free disk space is if you broke your base env. The issue is that all envs use hardlinks to the pkgs directory, so even if you delete the link located in the pkgs directory, the ones in the envs will still be there, and so you won't delete any physical files on the disk. The only real deletion you might achieve is of something that is referenced only by base, i.e., where the only copy is in pkgs, hence the potential for breaking base.
Correction: The base env still links packages to other locations, so deleting pkgs will not impact base as I originally concluded.
I'd highly recommend looking at this other post on estimating the real disk usage of Conda. You may be overestimating how much space is really being used. For most files in pkgs, there is only one physical copy, so there isn't any additional manual optimization to be done.
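A quick way to check this yourself (assuming a default install at ~/miniconda3): du counts each hardlinked file only once within a single invocation, so measuring the whole installation at once reports the true footprint, while separate per-directory runs each count the shared files in full, and their sum therefore overstates the total (<env-name> is a placeholder):
du -sh ~/miniconda3
du -sh ~/miniconda3/pkgs
du -sh ~/miniconda3/envs/<env-name>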

Yarn Offline Mirroring Doesn't Work As Expected

Trying to utilise yarn offline mirroring, and it's not working as I expect.
How I expect it to work:
Tarballs get saved in a cache folder, as specified in the .yarnrc, and yarn install --offline extracts those into node_modules in up to 5 seconds and everyone is happy.
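For reference, the .yarnrc side of that setup looks something like this in Yarn 1 (the mirror path is just an example):
yarn-offline-mirror "./npm-packages-offline-cache"
yarn-offline-mirror-pruning true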
How it seems to (not) work:
After I did everything described in the doc above, I:
Delete node_modules
Try to yarn install --offline again with my wifi turned off.
This results in a failure in the Linking dependencies... step (the third one, after Resolving and Fetching). The error is a package (chromedriver) trying to use the internet connection, and it seems to be a symptom rather than the actual problem.
The Fetching step is very quick, so it does seem to find the local tarballs, but why does the linking take so long? I'm talking about roughly 4-5 minutes per yarn install, so in the end it takes pretty much the same amount of time as before, gaining me nothing overall except lots of binaries in my repo.
Is the process itself faulty? Am I doing anything wrong, or not running the correct commands? The docs are not clear, to say the least.
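One sanity check worth trying here (an assumption, since chromedriver downloads its binary in a postinstall script that runs during the linking step and bypasses the mirror): skip lifecycle scripts and see whether the install becomes fast and fully offline:
yarn install --offline --ignore-scripts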

IntelliJ IDEA takes forever to update indices

Is it normal for IntelliJ to take a lot of time (almost 12 hours) to update indices for a project? I just installed IntelliJ on my machine and imported a rather large Maven project (13k+ files).
I understand that the project is large, but I left my computer on all night, and when I woke up in the morning IntelliJ still hadn't finished updating the indices, which makes it impossible to do anything since the popup titled 'Updating Indices' keeps hanging in the middle of the screen.
There are several answers in the forums for different IntelliJ versions; here is what I've tried (IntelliJ 13).
Give more memory. Does not help at all with 'Updating indices'.
Delete .idea and .iml files in the project. Does not help.
In the end, what resolved my problem with 'Updating indices' was:
delete the 'caches' folder in ~/.IntelliJIdea13/system/
I tried deleting the cache and it worked perfectly. Thanks for the solution, friends.
Just:
Open IntelliJ IDEA
Select the File menu
Select the Invalidate Caches / Restart... menu.
Once selected you get a pop-up with a bunch of options.
Select Invalidate and Restart
Before doing that, make sure you have saved all your changes, or you might lose some unsaved changes.
Once you hit that, IntelliJ will restart and then you can see that all the indexing is done really fast.
Delete caches in library folder
rm -rv ~/Library/Caches/IdeaIC15/caches/
On Mac OS X the cache is located at ~/Library/Caches
In IntelliJ 2020.2 I faced this problem too. Restarting/invalidating the cache didn't work for me. What I did was simply delete the cache folder at the following path:
C:\Users\davoud\AppData\Local\JetBrains\IntelliJIdea2020.2
I was able to solve this by going to File -> Invalidate Caches / Restart
Then on the dialog that opens select "Invalidate and Restart"
I had the same problem with IntelliJ 2017.2.3 - i.e. my project would keep updating indexes over and over again.
I discovered that I had gone over my disk quota in my home directory. By default IntelliJ stores the indexes in the home directory like this:
~/.IdeaIC2017.2/system/index/
The solution for me was to:
Quit IntelliJ
Move the whole .IdeaIC2017.2 directory to another mount which has more space:
mkdir /space/ideaConfig
mv ~/.IdeaIC2017.2 /space/ideaConfig/IdeaIC
Update bin/idea.properties to point at the new index/config location:
idea.config.path=/space/ideaConfig/IdeaIC/config
idea.system.path=/space/ideaConfig/IdeaIC/system
It is possible that some of the other answers to this question were due to the same problem and were inadvertently fixed by "deleting the caches folder", "invalidating caches", etc., which would have potentially freed up enough disk space to build the indexes.
Though the accepted and other answers may fix a particular problem, I have found that very long indexing times often come from the fact that a repository contains, or links to, a directory with a very large number of files. Often this is done for testing, and the directory in question is not actually part of the project, e.g. it is ignored by the VCS.
The IDE does not automatically ignore those directories when indexing, but it is possible to "exclude" the directory from the project. This will prevent indexing as well.
An easier way is as follows:
File --> Settings --> (uncheck) Synchronize files on frame or editor tab activation.
I suspect IDEA is spending more time collecting garbage than doing useful work. Use G1 GC instead of the default.
Help -- Edit Custom VM Options
-XX:+UseG1GC
instead of
-XX:+UseConcMarkSweepGC
and of course restart IDEA.
Downside: G1 tries to collect garbage before stopping the process. This is insane, but it is what it does. For a program with a 16 GB heap, cleanup took 27 minutes, so do not configure your IDEA to use a 16 GB heap.
You can manually exclude a file or directory from IDEA indexing, which can improve file-loading speed.
Some files are generated dynamically; we don't need to index those files every time we start IDEA.
"Invalidate Caches.." always works for me.
So, for all of you who are still in pain because of this on Windows (with or without WSL), you can try this:
https://youtrack.jetbrains.com/issue/IDEA-293604/IntelliJ-is-hanging-during-build-process-and-indexing-process-when-working-on-the-WSL-projects.#focus=Comments-27-6427409.0-0
Windows Defender can be very demanding with these things. Excluding the idea64.exe process just made everything fast again for me.
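If you prefer to script that exclusion, something like this in an elevated PowerShell should do it (Add-MpPreference is the built-in Defender cmdlet; check that your environment permits it):
Add-MpPreference -ExclusionProcess "idea64.exe"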
Probably an old bug in the caching system. It happens in ALL versions, especially if you upgrade your version of IntelliJ or the JDK.
To fix it:
1) Close the GUI.
2) Go to %HOME_DIR%\.IntelliJIdeaXXXX\system\caches and delete it.
3) Start the GUI again.
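That deletion can also be done from a command prompt (XXXX stands for your IDE version, as above; double-check the path before running this):
rd /s /q "%USERPROFILE%\.IntelliJIdeaXXXX\system\caches"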
On macOS the equivalent cache path is ~/Library/Caches/JetBrains/IntelliJIdea2021.1
It works for me.
IDEA 2020.2:
Right-click a folder under the Project view, then choose Mark Directory as > Excluded to exclude folders that you don't want indexed.
