How do I stop OneDrive from downloading git.exe on Windows?

I have used Git on Windows for a while, but recently changed a setting and started getting this.
On almost every Git command (in Git Bash, but also in PowerShell and GitHub Desktop) I get:
git.exe is being downloaded on OneDrive
(the translation may not be exact)
The setting that changed recently is that I moved my repos into a OneDrive folder in order to have them synced between two sessions: my work desktop and a remote virtual machine.
I can see that this may not be ideal, but it really works for me, since I have the same setup in both sessions and I'm not really used to doing many commit-push-pull round trips. Not the main topic here, but feel free to comment.
(Edit: after reading the solution, it's clear there are other ways to set up this syncing that don't mess with the internals of Git. Look for those instead. Thanks.)
In any case, the strange thing is that the notifications happen only on the remote virtual machine, never on the desktop.
I have seen notifications about some files in the repos, which I attribute to OneDrive being nosy about every file I move. But I've also seen files I don't recognize, and there's always git.exe attached to the notification.
For the notification noise, I have tried turning down the notifications for OneDrive. Some might say Microsoft has a track record of not letting users configure their notifications, so I'm still looking.
Thanks.

Most file syncing tools like OneDrive and Dropbox operate by syncing data file by file. This is a great approach if you're working on a single word-processing document or spreadsheet. However, it's not as great when you're working with a Git repository.
When switching branches or making a commit, Git changes and creates a lot of files all at once. To be synced correctly, all of the created files must be written in a particular order: all the blobs must be written, then the trees, then the commits, and only then can the refs be updated. If this happens out of order, your repository can be corrupted, since you can have branches that refer to objects that don't exist (or objects that refer to other objects that don't exist).
In addition, these tools can end up deleting files you wanted to have in your working tree, or recreating files you didn't want. So overall, you don't want to sync any Git repository using one of these tools.
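If you suspect a synced repository has already been damaged this way, git fsck reports exactly the symptoms described above; a quick check might look like this:

    # Scan the object database and ref connectivity; "broken link" or
    # "missing blob/tree/commit" output means refs or objects point at
    # objects that never made it across the sync.
    git fsck --full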
You can write a bundle file with git bundle and sync that, or you can use rsync to sync a repository provided it's idle (not being modified) when you do. Note that if you sync a working tree, Git will need to refresh all files when you sync it across to the new machine, and also Git doesn't try to defend against untrusted users who have access to the working tree.
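For example, a minimal sketch of both approaches (the repository paths and the host name here are placeholders):

    # Option 1: pack the whole repository, refs and all, into a single
    # file that any syncing tool can move around safely.
    git bundle create /tmp/myrepo.bundle --all
    git clone /tmp/myrepo.bundle myrepo-copy

    # Option 2: rsync the repository directly, but only while it is
    # idle (no commits, checkouts, or gc running).
    rsync -a --delete ~/myrepo/ otherhost:~/myrepo/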
It's also not a good idea to sync your Git installation itself via OneDrive, which is what it sounds like might be happening. Instead, install Git for Windows on each machine independently and don't try to sync it across. OneDrive should have configuration options that let you control what's synced.

Related

Git/Windows: Possible for Two Users to Share the Same Folder?

In my scenario, I have two people who work on the same code base. Their only available workspace is a shared dev environment (where the built files are used to host the dev version of the site, to boot). As such, they perform their work directly in that location. I've recently introduced source control to the project, and turned that location into a Git repository.
Let me preface by saying: yes, I would love it if the dev host spot were a deploy-to spot and these people had their own local copies of the source code. But that isn't feasible right now.
My question: is it possible for two different Windows users/Git users (they have separate accounts that they can use to interact with GitHub etc.) to share the same folder? My hope would be that SourceTree (our weapon of choice) or Git, at the least, wouldn't have a problem with this: just show diffs of what's changed, and use the currently-logged-in user's information when making commits and other actions.
It looks like, while SourceTree has separate installation directories, it still embeds some account information in the .git folder itself. When I try to interact with Git (via a pull, for example), it first prompts for new credentials, but shortly thereafter it says "please enter password for {other-user}" without an option to switch usernames.
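For what it's worth, commit identity itself is per-Windows-account: Git reads user.name and user.email from each account's own ~/.gitconfig, not from the shared repository. A minimal per-user setup (the names here are examples) would be:

    # Run once under each Windows account; this lands in that user's
    # own ~/.gitconfig, so commits made from the shared folder are
    # attributed to whoever is logged in.
    git config --global user.name "First User"
    git config --global user.email "first.user@example.com"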
It looks like we'll just have to do things the right way after all. Painful (for them) but no choice.

Safely using junction or mklink /j with a git repository on Windows

Using Git on Windows, I'm trying to deal with content that's external to my Git repo. We have artwork and content files, for instance, that are being updated by non-Git users in Google Drive, so to capture these changes I've set up something similar to the following:
d:\MyRepo
  \.git
  \code1
  \images1
  \fonts (junction) => c:\users\%username%\google drive\designerLtd\fonts
  \etc
Here 'fonts' is a folder that has been linked using either junction.exe or mklink /j (same thing). This generally works out great, because git status immediately highlights new changes (made on purpose or by accident) and prepares them for check-in or undo.
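For reference, the junction in that layout would have been created with something like this (cmd.exe, using the paths from the question):

    rem Run from d:\MyRepo; creates the 'fonts' junction pointing at
    rem the designers' Google Drive folder.
    mklink /J fonts "c:\users\%username%\google drive\designerLtd\fonts"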
ISSUE: sometimes when switching branches, Git prunes the linked directories and re-creates them if the content in those folders differs between branches. In effect it breaks the link. Now, Git is always correct and the build is consistent, but it's not always obvious that it is no longer keeping track of those external resources.
Worse still, it can delete files in the external location. They can be recovered from git of course, but it's very unwieldy.
Swapping the content in the external locations when branches are switched isn't a problem, because there's only one PC hooked up this way and the changes are easily merged; I just wish it didn't break the links.
QUESTION: Is there a better way to allow external junction points within a Git repo on Windows?
To be clear, there are no symlinks in the Git repository (yet), as far as I know, and this isn't a question about interoperability between Unix and Windows Git clients (which most of the other questions on SO seem to relate to).
You can modify permissions of the junction point so that git can no longer delete it. Git usually doesn't care if removing a directory fails (except if it needs to replace the directory with a file).
See "Usage Recommendations" in https://support.microsoft.com/en-us/kb/205524

DropBox as Version Control and Offsite Backup

After reading Michael Lopp's book "Being Geek," I started using Dropbox as a means of synchronizing files between my home computer and my work computer. It's been fantastic; it really makes it painless to keep track of the latest version of the files you're working on.
My question has to do with people's experience with this tool, especially programmers who may have used it to develop larger projects.
Right now, I see 3 main uses of Dropbox:
1. synchronize files between home and work computers
2. version control (you have to log into the dropbox site to access previous versions)
3. off-site backup
Right now I'm using it as my main backup tool, which I'm not sure is a good idea. I have a local (working) copy of my entire project "checked out" on each computer (my home laptop and my work computer), and additionally my entire project is kept on the Dropbox site. So I'm thinking: if anything happens to one of my computers, or both, I'll still have that off-site backup available, and I'll simply have to reinstall Dropbox to access all my files.
Does anyone have experience with doing this? Has anyone done a major file recovery using dropbox? Or is this even widely used? Thanks for your feedback in advance.
Using Dropbox to maintain several files and their associated metadata, when those files are historized in a VCS, is always a bit tricky because of potential corruption issues (if one of the metadata files that is part of the repository isn't correctly synchronized, you can end up with a non-working repo).
That is why, with Dropbox, I always use:
- a DVCS (like Git): I can work directly in a working tree within a Dropbox folder, or I can clone said repo anywhere else outside Dropbox if I need to;
- a single bundle file, to which I can push the changes from my local repo at any time, wherever that repo might be.
That way, the only file that really needs to be in sync in Dropbox is that unique bundle file (representing a bare repo as one file).
See "Git with DropBox" for more.

Check in - Check out process/version control for PSDs and Image files

The title may not be so clear, but the issue I am facing is this:
Our designers are working on large Photoshop files across the network. This causes a number of network-traffic and file-corruption issues which I am trying to overcome.
The way I want to do this is to have the designers copy the files to their machines (Mac OS X) and work on them locally. But the problem then is that they may forget to copy them back up, or that another designer may start work on the version stored on the network.
What I need is a system where a designer checks out files or folders from the server, which locks those files so no other user can copy them until they are checked back in. We do not need to store revisions of the files.
My initial idea was to use SVN or, preferably, Git and somehow force a lock on checkout. Does this sound feasible, or is there a better system?
How big are the files on average? I'm not sure about Git, as I haven't used it, but SVN should be OK. If you did go with SVN, I would trial checking out over HTTP/HTTPS vs. a network path to the repo, as you may get a speed advantage out of one or the other. When we VPN to our repo at work, it is literally 100 times faster over HTTP than checking out using a network \\path to the repo.
SVN is a good option, but you will have revisions (that is the whole point of SVN). SVN doesn't lock files by default, but you can configure it so that it does. See http://svnbook.red-bean.com/nightly/en/svn-book.html#svn.advanced.locking
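For completeness, the locking workflow looks roughly like this (file names are examples; the svn:needs-lock property makes files read-only until locked, which prevents accidental concurrent edits):

    # Make the PSD read-only for everyone until they take the lock.
    svn propset svn:needs-lock '*' design.psd
    svn commit -m "require a lock before editing design.psd"

    # A designer locks the file before editing; committing releases
    # the lock again by default.
    svn lock design.psd -m "editing the header artwork"
    svn commit -m "new header artwork"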
I don't know Git very well, but since it's not a centralized VCS, I'm pretty sure it isn't the right tool for your situation.

Concurrency in a GIT repo on a network shared folder

I want to have a bare Git repository stored on a (Windows) network share. I use Linux and have said network share mounted with CIFS. My colleague uses Windows XP and has the network share automounted (from Active Directory, somehow) as a network drive.
I wonder if I can use the repo from both computers, without concurrency problems.
I've already tested, and on my end I can clone ok, but I'm afraid of what might happen if we both access the same repo (push/pull), at the same time.
In the Git FAQ there is a reference about using network file systems (and some problems with SMBFS), but I am not sure if there is any file locking done by the network/server/Windows/Linux; I'm quite sure there isn't.
So, has anyone used a git repo on a network share, without a server, and without problems?
Thank you,
Alex
PS: I want to avoid using an HTTP server (or git-daemon), because I do not have access to the server hosting the shares. Also, I know we can just push/pull from one to another, but we are required to have the code/repo on the share for backup reasons.
Update:
My worries are not about the possibility of a network failure. Even then, we would have the required branches locally, and we'd still be able to compile our sources.
But we usually commit quite often and need to rebase/merge often. From my point of view, the best option would be to have a central repo on the share (so the backups are assured), and we would both clone from that one and use it to rebase.
But because we do this often, I am worried about file/repo corruption if it happens that we both push/pull at the same time. Normally we could yell at each other every time we access the remote repo :), but it would be better to have it secured by the computers/network.
And it is possible that Git has an internal mechanism for this (since someone can push to one of your repos while you work on it), but I haven't found anything conclusive yet.
Update 2:
The repo on the share drive would be a bare repo, not containing a working copy.
Git requires minimal file locking, which I believe is the main cause of problems when using this kind of shared resource over a network file system. The reason it can get away with this is that most of the files in a Git repo (all the ones that form the object database) are named as a digest of their content, and are immutable once created. So the problem of two clients trying to use the same file for different content doesn't come up.
The other part of the repository is trickier: the refs are stored in files under the "refs" directory (or in "packed-refs"), and these do change, although the refs/* files are small and are always rewritten rather than edited in place. In this case, Git writes the new ref to a temporary ".lock" file and then renames it over the target file. If the filesystem respects O_EXCL semantics, that's safe. Even if not, the worst that could happen would be a race overwriting a ref file. Although this would be annoying to encounter, it should not cause corruption as such: it might just be the case that you push to the shared repo and the push looks like it succeeded, whereas in fact someone else's did. But this could be sorted out simply by pulling (merging in the other guy's commits) and pushing again.
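Conceptually, the update looks like this little shell sketch (illustrative only: NEW_SHA is a placeholder, and Git's real implementation does this in C inside the .git directory):

    # noclobber makes '>' refuse to overwrite an existing file,
    # approximating O_EXCL create-if-absent semantics.
    set -C
    if echo "$NEW_SHA" > refs/heads/master.lock 2>/dev/null; then
        # rename over the real ref in a single step (atomic on POSIX)
        mv refs/heads/master.lock refs/heads/master
    else
        echo "ref is locked by someone else; try again later" >&2
    fi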
In summary, I don't think that repo corruption is too much of a problem here. It's true that things can go a bit wrong due to locking problems, but the design of the Git repo will minimise the damage.
(Disclaimer: this all sounds good in theory, but I've not done any concurrent hammering of a repo to test it out, and I've only shared repos over NFS, not CIFS.)
Why bother? Git is designed to be distributed. Just have a repository on each machine and use the push and pull mechanisms to propagate your changes between them.
For backup purposes, run a nightly task to copy your repository to the share.
Or, create one repository each on the share and do your work from those, but use them as distributed repositories from which you can pull changesets from each other. If you use this method, the performance of builds and so on will suffer, since you will be constantly accessing the network.
Or, have distributed repositories on your own computers, and run a periodic task to push your commits to the repositories on the share, as sketched below.
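The periodic-push variant might look like this (the share path and the schedule are examples):

    # One-time setup: put a bare mirror of the repo on the share.
    git clone --mirror . /mnt/share/project.git
    git remote add backup /mnt/share/project.git

    # Then from cron / Task Scheduler, e.g. nightly: push all refs.
    git push --mirror backup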
It sounds as if you'd rather use a centralized version control system, which would also satisfy the backup requirement.
Perhaps with xxx2git in between, so you can work locally.
