Store on Windows - pnpm

I don't know if this is a bug or a lack of understanding. I don't get how the pnpm store works on Windows.
Say I use a folder "test" on C: - the store (.pnpm-store) is created in %USERPROFILE%. When I have this folder "test" on another drive, the store is created inside "test". Furthermore, if there were another folder "test2" on that same drive, yet another new store would be created in "test2". IMO, "test" and "test2" on this (other - not C:) drive should use a store in the root of that drive (yes, my user can create a folder there), shouldn't they?
Then, let's suppose I have a folder "test" with a package folder inside called "package1", and I create a package.json with "pnpm init -y". Now I add a package, e.g. "pnpm add debug". IMO the store on this drive should be referenced. But it isn't:
C:\test\package1>dir /s | findstr JUNCTION
10.10.2019 13:25 <JUNCTION> debug [C:\test\package1\node_modules\.pnpm\registry.npmjs.org\debug\4.1.1\node_modules\debug]
10.10.2019 13:25 <JUNCTION> ms [C:\test\package1\node_modules\.pnpm\registry.npmjs.org\ms\2.1.2\node_modules\ms]
10.10.2019 13:25 <JUNCTION> ms [C:\test\package1\node_modules\.pnpm\registry.npmjs.org\ms\2.1.2\node_modules\ms]
So what's wrong?

IMO, "test" and "test2" on this (other - not C:) drive should use a store in the root of that drive (yes, my user can create a folder there), shouldn't they?
When you install on the same drive where the user's home directory is, the store is created in the user's directory. When you install on other drives, the store is created at the root of the drive (for instance, at D:\.pnpm-store).
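If you want to double-check which store a given project will use, recent pnpm versions can print the active store directory (the drive and path below are only an example of what the output would look like for a project on a secondary drive):
D:\test\package1>pnpm store path
D:\.pnpm-store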
the store on this drive should be referenced. But it isn't
The package that you see in your project is physically the same package as the one in the store - it is a hard link. More details from the pnpm FAQ page:
pnpm creates hard links from the global store to project's node_modules folders. Hard links point to the same place on the disk where the original files are. So, for example, if you have foo in your project as a dependency and it occupies 1MB of space, then it will look like it occupies 1MB of space in the project's node_modules folder and the same amount of space in the global store. However, that 1MB is the same space on the disk addressed from two different locations. So in total foo occupies 1MB, not 2MB.
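If you want to verify this on Windows, fsutil can list every hard link that points at a file's data. The exact file path below is only an illustration based on the layout shown in the question (it assumes debug 4.1.1 ships a src/index.js):
C:\test\package1>fsutil hardlink list node_modules\.pnpm\registry.npmjs.org\debug\4.1.1\node_modules\debug\src\index.js
The output should list both the path inside the project's node_modules\.pnpm folder and the corresponding path inside the .pnpm-store, i.e. the same data addressed from two locations.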

Related

WS2012R2: Symlink from a network share to another network share?

I have a question regarding creating symlinks on a network share that link to another network share.
The Windows clients in our company have a network drive mapped on J:\; the UNC path is \\DataServer01\network.
Previously, there was some kind of a symlink in the network directory called "import" (so the UNC path was \\DataServer01\network\import), which was linking to \\ERPServer01\share\import.
So the users could go to their mapped network drive J: and put an Excel file into J:\import - and the Excel file actually ended up in \\ERPServer01\share\import.
Accidentally, the symlink was deleted by another admin. Now I am trying to recreate the symlink using
mklink /d import \\ERPServer01\share\import
So far, the symlink was created, and you can access it from DataServer01 itself. But you can't access that symlink from the mapped network drive J:\; if you try, you receive an error that the symbolic link cannot be accessed. I googled a lot, and the explanation of why this concept can't work (symlinks are resolved on the client side) was quite plausible.
The thing is, my predecessor got it to work somehow; he somehow managed to create a proper "symlink" or hard link or something similar. How the hell did he manage to get it to work? Unfortunately I can't ask him.
There is also no DFS in use. It must have been some other method.
I have to recreate it exactly as it was, because I don't want to explain to 300 users why they have to put their Excel sheets in another directory now. And I don't want to map another network drive.
Any ideas?
Possibly it wasn't a symlink before (have you checked your backup?). Alternatively, you can create a "magic" Explorer folder; a rough command-line sketch follows the steps below:
create an empty source folder
inside the source folder, create an Explorer link to the target folder named target
inside the source folder, create a desktop.ini text file with the contents
[.ShellClassInfo]
CLSID2={0AFACED1-E828-11D1-9187-B532F1E9575D}
flag desktop.ini as System and Hidden
flag source folder as System
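A rough command-line sketch of those steps, using the share paths from the question (the Explorer link named "target" still has to be created by hand, e.g. via right-click > New > Shortcut, since cmd has no built-in shortcut creator):
mkdir \\DataServer01\network\import
rem create the Explorer shortcut named "target" inside the import folder, pointing to \\ERPServer01\share\import
echo [.ShellClassInfo]> \\DataServer01\network\import\desktop.ini
echo CLSID2={0AFACED1-E828-11D1-9187-B532F1E9575D}>> \\DataServer01\network\import\desktop.ini
attrib +s +h \\DataServer01\network\import\desktop.ini
attrib +s \\DataServer01\network\import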
An Explorer magic link folder looks similar to a symlink, but it only works in Windows Explorer, whereas a symlink works with (nearly) everything once remote symlink evaluation has been activated, e.g. through GPO.
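For reference, that symlink evaluation setting can also be inspected and changed per machine with fsutil; this mirrors the GPO policy, must be run from an elevated prompt on each client, and enabling remote-to-remote evaluation has security implications, so treat it as a sketch rather than a recommendation:
fsutil behavior query SymlinkEvaluation
fsutil behavior set SymlinkEvaluation R2R:1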

Why does %AppData% in Windows 7 seemingly point to the wrong folder?

The best way to find and get to the AppData folder (specifically on Windows 7) is to type %AppData% into the Start menu search (or Run) box. But it takes me to C:\Users\user.name\AppData\Roaming.
Why does it take me to the Roaming folder when the actual AppData folder is one level up? It happens on both computers I have.
There is no environment variable for the root "AppData" folder because no one should ever put data there. Instead, applications put data into subfolders of the "AppData" folder depending on the nature of that data.
There are three subfolders:
Roaming
Local
LocalLow
The regular old %AppData% environment variable points to the path for the "Roaming" subfolder, which is where most applications should store their data, unless they have a specific reason that that data should not roam with the user's profile.
If you want the "Local" (non-roaming) sub-folder, use the %LocalAppData% environment variable instead.
As for typing this into the Start → Run dialog, that doesn't make a lot of sense to me. Data that is stored here is not intended for user consumption. It is private data stored by applications for their own use—things like configuration files, databases, etc. User-facing data should be in the Documents folder or at a path specified by the user. If you're a software developer accessing this folder for testing purposes, just go up one level.
Typing appdata (not %appdata%) in the Run dialog (Win+R) takes you to %USERPROFILE%\AppData (i.e. %AppData%\..).
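To see where these variables point on a given machine, you can simply echo them from a command prompt (the user name below is just the example from the question):
C:\>echo %AppData%
C:\Users\user.name\AppData\Roaming
C:\>echo %LocalAppData%
C:\Users\user.name\AppData\Local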

Sync a directory to Google Drive from outside the Drive folder

Context
I am trying to have the resources directory of a project sync itself to Google Drive as a means of backup, without having to physically move it into the Drive folder, and while maintaining full tab completion and access via the terminal from within the project. I'm on OS X.
The project is stored in a separate portion of my filesystem dedicated to GitHub repos, but I don't want to store all of the media (it's a vision project) on GitHub.
Now, the simplest solution is to alias the resources directory, move the original to the Drive Folder and access it via the alias within the project. However, accessing the resources via an alias in the terminal appears to break not just tab completion, but actual access to the media...
Eg: ./program input=resources/video.mp4
...where resources is an alias of Drive/resources, fails. And I'm afraid my understanding of aliases, symlinks and hardlinks isn't deep enough to see why.
Questions
Is there a way to sync the directory to Drive without physically moving the original directory into Drive/, such that I can leave my project filesystem untouched?
OR
Is there a way to maintain normal path behaviour and tab completion with an aliased directory? E.g.: ./program input=Project/res_alias/video.mp4. If so, the Drive issue is null and void.

Directory location for writing cache file

Hi, I am trying to find out the best location to save a cache file.
I have a Windows Forms application that updates the user's data from the server using a custom tool.
I want to write the timestamp of the latest updates done on user's machine in the cache file.
Where is the best location for keeping this file:
1. in the application directory (C:\Program Files\..)
2. in a temp location, e.g. the user's profile folder or C:\Windows\Temp
3. in any location (e.g. C:\dataupdates) where the user has full read/write access
Not in the application directory. That much is clear. :) The application directory shouldn't even be writable by the program (or rather, by the user account that runs the program). Although some applications still use this location, it has actually been deprecated since Windows 95, I believe, and it has become a real pain since the more rigid UAC introduced in Windows Vista and 7.
So the most obvious options are:
The temp folder, which is for temporary files. Note, however, that you will need to clean those files up: the temp folder is not cleared automatically by default, so adding new files all the time will consume more and more space on the hard drive. On the other hand, some users do clear their temp folders, or may have scripts installed that do that for them, so you cannot trust such files to remain. Also, this is not always C:\Temp or whatever; you'll have to ask Windows what the location is.
You can pick 'any' location. Note that you cannot write just anywhere; you cannot even expect the C: drive to exist. If you choose this, then you have to make it a configurable setting.
The %AppData% directory, my personal favorite, which is a special directory for applications to store their data in. The advantage is that you can ask Windows for this location and build a relative path based on it, so you don't really have to make it an application setting. For more info on how to get it, see also this question: C# getting the path of %AppData%
I would definitely choose the App Data path for this purpose.
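For illustration, the locations discussed above can be inspected from a command prompt; a C# application would resolve the same paths through the API mentioned in the linked question. "MyUpdateTool" and the timestamp are made-up placeholder values:
echo %TEMP%
echo %AppData%
mkdir "%AppData%\MyUpdateTool"
echo 2019-01-01T00:00:00Z> "%AppData%\MyUpdateTool\last-update.txt"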

DLL loading with hardlink

I am trying to devise a method that lets various products load DLLs from a common location. This helps the following directory structure avoid file replication:
INSTALLDIR/Product1/bin
INSTALLDIR/Product2/bin
..
INSTALLDIR/ProductN/bin
Instead of replicating DLLs in each product's bin directory above, I can create a DLL repository/directory - 'DLLrepo' - in INSTALLDIR and make all product executables load from it. I am thinking of doing this by creating, in each product's bin directory, a hard link to each DLL in 'DLLrepo'. This will help address platforms starting from Windows XP, whereas using the 'probing' method can only address Windows Server 2008 and above.
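For illustration, the hard links could be created with fsutil, which is available back to Windows XP (common.dll is a made-up file name, and hard links only work when DLLrepo and the product bin directories sit on the same NTFS volume):
fsutil hardlink create INSTALLDIR\Product1\bin\common.dll INSTALLDIR\DLLrepo\common.dll
fsutil hardlink create INSTALLDIR\Product2\bin\common.dll INSTALLDIR\DLLrepo\common.dll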
I would like to get your opinion on whether this approach looks like a reasonable solution.
When we create a hard link to a file, Explorer or the DIR command doesn't report a valid size for the folder containing the link; it counts the full data size of the linked file in the directory's total size. This is a known issue in Windows, if I am not wrong. Is there any utility I can use to verify the actual folder size? Is it possible to use 'chkdsk' on a directory path? Another thing I would like to know is how to get the list of hard links created on a file's data.
When we create a hard link to a file, Explorer or the DIR command doesn't report a valid size for the folder containing the link; it counts the full data size of the linked file in the directory's total size. This is a known issue in Windows, if I am not wrong. Is there any utility I can use to verify the actual folder size?
I can provide an answer, of sorts, for this part of the question. When you create file hard links, there's not really any concept of which "file" is the original. Each of them points to the space on disk that the data occupies, and modifying the file via any of these references affects the data that's seen when accessing it via any other hard link. As such, it's less a known "issue" and more a case of "this is how it works".
As such, there's no way to verify "actual folder size" unless you're looking at the size of the highest common parent folder of the folders that contain the links. At that point you can start single-counting each hard-link to get an accurate idea of space used on disk.
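For the repository scenario in the question, that single-counting could be done by enumerating the hard links of each file and counting each set only once; the same command also answers the side question about listing the links that exist for a file's data (common.dll is again a placeholder name):
fsutil hardlink list INSTALLDIR\DLLrepo\common.dll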
