Git push throwing error: GH001: Large files detected - Xcode

I am pushing my Xcode project, which is written in Swift, to GitHub. The project uses the Google Maps SDK for iOS, and the GoogleMaps framework together with a few other frameworks makes the project heavy, as I expected. In particular, there is one file called GoogleMaps that is over 100 MB, which violates GitHub's policy, so I am getting the error below.
C:\Users\Shyam Bhimani\Desktop\FindMyBuddy>git push
Git LFS: (0 of 0 files, 1 skipped) 0 B / 0 B, 34 B skipped
Counting objects: 691, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (597/597), done.
Writing objects: 100% (691/691), 60.52 MiB | 384.00 KiB/s, done.
Total 691 (delta 161), reused 0 (delta 0)
remote: error: GH001: Large files detected. You may want to try Git Large File Storage - https://git-lfs.github.com.
remote: error: Trace: a2d7f29c8861bcb6bd13498cfcd44ac3
remote: error: See http://git.io/iEPt8g for more information.
remote: error: File Pods/GoogleMaps/Frameworks/GoogleMaps.framework/Versions/A/GoogleMaps is 123.08 MB; this exceeds GitHub's file size limit of 100.00 MB
To https://github.com/shyambhimani/FindMyBuddy.git
 ! [remote rejected] master -> master (pre-receive hook declined)
error: failed to push some refs to 'https://github.com/shyambhimani/FindMyBuddy.git'
So far I have tried:
git lfs track 'Pods/GoogleMaps/Frameworks/GoogleMaps.framework/Versions/A/GoogleMaps'
git lfs track '*.*'
git lfs track '*.File'
However, no luck; it still gives me the same error every time I push. I do not know the extension of that file, so I tried *.* just in case, but it did not work.
I would be grateful if anyone could help me solve this issue. TIA

GitHub has a file size limit of 100 MB unless you pay for Git LFS. Sadly, there is no way to get around this other than ignoring the files using .gitignore. But those files will then no longer be tracked by Git.
EDIT
Forgot to mention: for your specific case, especially with a common framework such as Google Maps, it is very easy to re-download it at a later date using CocoaPods. Personally, I would ignore that whole folder in .gitignore, because after you pull down your repo you can always re-download it easily.
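That advice could be sketched as follows (a minimal example, assuming the repo root contains the Pods/ folder from the error above; note that if the oversized file was already committed earlier in history, the push will keep failing until that commit is rewritten or reset):

```shell
# Stop tracking the CocoaPods output; the files stay on disk and can be
# recreated later with `pod install`.
echo "Pods/" >> .gitignore
git rm -r --cached Pods        # removes Pods from the index only, not from disk
git add .gitignore
git commit -m "Stop tracking Pods; restore with 'pod install'"
```

After cloning the repo on another machine, running `pod install` re-downloads GoogleMaps and the other pods.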

Related

files don't exist yet "this exceeds GitHub's file size limit of 100.00 MB" [duplicate]

This question already has answers here:
How to remove file from Git history?
(8 answers)
Closed 1 year ago.
I am trying to git add, commit, and push an update to some Python code, where I changed the naming convention of the files.
NB: I want my local branch to replace the remote version.
I have also deleted these files from the data/ folder. However, git push and git push --force yield the same error:
remote: error: File workers/compositekey_worker/compositekey/data/20210617-031807_dataset_.csv is 203.87 MB; this exceeds GitHub's file size limit of 100.00 MB
remote: error: File workers/compositekey_worker/compositekey/data/20210617-032600_dataset_.csv is 180.20 MB; this exceeds GitHub's file size limit of 100.00 MB
But data/ only contains example datasets from online:
$ ls
MFG10YearTerminationData.csv OPIC-scraped-portfolio-public.csv
Is the problem to do with caching? I have limited understanding of this.
git status:
On branch simulate-data-tests
Your branch is ahead of 'origin/simulate-data-tests' by 6 commits.
(use "git push" to publish your local commits)
nothing to commit, working tree clean
git rm --cached 20210617-031807_dataset_.csv:
fatal: pathspec '20210617-031807_dataset_.csv' did not match any files
git log -- <filename> in data/:
$ git log -- 20210617-031807_dataset_.csv
commit 309e1c192387abc43d8e23f378fbb7ade45d9d3d
Author: ***
Date: Thu Jun 17 03:28:26 2021 +0100
Exception Handling of Faker methods that do not append to Dataframes. Less code, unqiueness enforced by 'faker.unique.<method>()'
commit 959aa02cdc5ea562e7d9af0c52db1ee81a5912a2
Author: ***
Date: Thu Jun 17 03:21:23 2021 +0100
Exception Handling of Faker methods that do not append to Dataframes. Less code, unqiueness enforced by 'faker.unique.<method>()'
A bit of a roundabout way, but it works effectively for this situation.
If you are sure that you want your local branch files to be in your remote branch, and you have been experiencing these issues with files that were once committed but since deleted:
On GitHub online, go to your folder and select your branch.
Then use "Add file" > "Upload files" to manually upload the file that you initially wanted pushed.
Then on your machine:
git checkout master
git branch -d local_branch_name
git fetch --all
I was able to make a git push successfully thereafter.
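The duplicate target above ("How to remove file from Git history?") points at the more direct fix: the oversized CSVs are gone from the working tree, but they still exist in the six unpushed commits, so every push replays them. One way to strip them from local history is Git's built-in filter-branch (now deprecated in favour of git filter-repo, but available everywhere); a sketch, with the paths taken from the error message:

```shell
# Rewrite all local refs, dropping the oversized CSVs from every commit.
git filter-branch --force --index-filter \
  'git rm --cached --ignore-unmatch \
     workers/compositekey_worker/compositekey/data/20210617-031807_dataset_.csv \
     workers/compositekey_worker/compositekey/data/20210617-032600_dataset_.csv' \
  --prune-empty -- --all

# History was rewritten, so the remote branch must be overwritten.
git push --force
```

git filter-repo or the BFG Repo-Cleaner do the same job faster and with fewer sharp edges, if you can install them.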

Can't push a single CSS file with git

I sometimes encounter this horrible error message when trying to push.
> git push
Enumerating objects: 14, done.
Counting objects: 100% (14/14), done.
Delta compression using up to 12 threads
Compressing objects: 100% (7/7), done.
Writing objects: 100% (8/8), 215.75 KiB | 11.35 MiB/s, done.
Total 8 (delta 2), reused 5 (delta 0), pack-reused 0
send-pack: unexpected disconnect while reading sideband packet
Connection to github.com closed by remote host.
fatal: the remote end hung up unexpectedly
I had 5 or 10 files in my latest commit that I wanted to push, and I got this error. Through a lot of trial and error, I managed to push everything in separate commits, except for a single Bootstrap CSS file. The file is only 216 KB, so I doubt it's a file size issue. I also doubt it's a network issue, because I have had no other internet connection problems in weeks, and have even been on Zoom calls sharing my screen while this was happening.
Here's what my terminal looks like when I try to commit and push the file
❯ git status
On branch main
Your branch is up to date with 'origin/main'.
Untracked files:
(use "git add <file>..." to include in what will be committed)
java/servers/coins-website/coins-react/src/bootstrap-yeti.css
nothing added to commit but untracked files present (use "git add" to track)
❯ git add .
❯ git commit -m "add bootstrap-yeti.css"
[main b618bc6] add bootstrap-yeti.css
1 file changed, 11245 insertions(+)
create mode 100644 java/servers/coins-website/coins-react/src/bootstrap-yeti.css
❯ git push
Enumerating objects: 14, done.
Counting objects: 100% (14/14), done.
Delta compression using up to 12 threads
Compressing objects: 100% (7/7), done.
Writing objects: 100% (8/8), 215.76 KiB | 11.36 MiB/s, done.
Total 8 (delta 2), reused 5 (delta 0), pack-reused 0
send-pack: unexpected disconnect while reading sideband packet
client_loop: send disconnect: Broken pipe
fatal: the remote end hung up unexpectedly
The only thing that seems to work is cloning the repo, moving the file into the new local repo, committing, and pushing. But with that, I have to abandon my branches and stashes in the old repo. I've also had this same issue at work with a much much larger repository, so recloning and breaking up my commits every few days is making me lose my mind.
How can I fix this? Can I reset the .git folder without deleting my branches and stashes?
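For reference, the fresh-clone workaround described in the question, spelled out as commands (the URL and the repo-old/repo-fresh directory names are placeholders for this sketch):

```shell
# Clone a fresh copy next to the old one, carry the stuck file over,
# then commit and push from the fresh clone.
git clone git@github.com:user/repo.git repo-fresh       # placeholder URL
cp repo-old/java/servers/coins-website/coins-react/src/bootstrap-yeti.css \
   repo-fresh/java/servers/coins-website/coins-react/src/
cd repo-fresh
git add java/servers/coins-website/coins-react/src/bootstrap-yeti.css
git commit -m "add bootstrap-yeti.css"
git push
```

The branches and stashes of the old clone stay behind in repo-old's .git directory, which is exactly the drawback described above.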

git clone hangs at "checking connectivity"

OS - Windows 7 Professional 64-bit
Git for Windows - Git-1.9.0 - using Git Bash
I started having problems with "git fetch" suddenly, out of nowhere. Sometimes git.exe would error out, and sometimes "git fetch" would just hang.
So I decided to start everything from scratch.
I uninstalled Git for Windows and reinstalled it (accepting all defaults), restarted the machine, created a brand new folder and did the following:
$ git clone git@github.com:myid@example.com/myproject.git
Cloning into 'myproject'...
Enter passphrase for key '/c/Users/myid/.ssh/id_rsa':
remote: Counting objects: 287209, done.
remote: Compressing objects: 100% (86467/86467), done.
remote: Total 287209 (delta 188451), reused 287209 (delta 188451)
Receiving objects: 100% (287209/287209), 168.89 MiB | 328.00 KiB/s, done.
Resolving deltas: 100% (188451/188451), done.
Checking connectivity...
It consistently just hangs at "Checking connectivity".
I have scanned the machine for viruses/trojans and whatnot, and no threats were found.
This is happening both at the work location and from home, so it's probably not the internet.
I'm not sure how to proceed or what to try next.
I removed the known_hosts file from my ~/.ssh folder, which did the trick. Everything works now.
This message is not related to network connectivity. It is about checking whether every object is connected to an existing reference.
A detailed answer can be found on Super User.
Try running git with the environment variable GIT_CURL_VERBOSE=1 set, to see what is going on.
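For example (a one-off run; GIT_CURL_VERBOSE only covers HTTP(S) transports, while GIT_TRACE=1 is the more general switch and also traces local and SSH operations):

```shell
# Nothing is configured permanently; the variables apply to this run only.
GIT_CURL_VERBOSE=1 git fetch origin    # verbose HTTP(S) transport output
GIT_TRACE=1 git fetch origin           # general command tracing to stderr
```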
This should improve with Git 2.34 (Q4 2021), where the code that handles a large number of refs in the git fetch code path has been optimized.
See commit caff8b7, commit 1c7d1ab, commit 284b2ce, commit 62b5a35, commit 9fec7b2, commit 47c6100, commit fe7df03 (01 Sep 2021) by Patrick Steinhardt (pks-t).
(Merged by Junio C Hamano -- gitster -- in commit deec8aa, 20 Sep 2021)
fetch: avoid second connectivity check if we already have all objects
Signed-off-by: Patrick Steinhardt
When fetching refs, we are doing two connectivity checks:
The first one is done such that we can skip fetching refs in the case where we already have all objects referenced by the updated set of refs.
The second one verifies that we have all objects after we have fetched objects.
We always execute both connectivity checks, but this is wasteful in case the first connectivity check already notices that we have all objects locally available.
Skip the second connectivity check in case we already had all objects available.
This gives us a nice speedup when doing a mirror-fetch in a repository with about 2.3M refs where the fetching repo already has all objects:
Benchmark #1: HEAD~: git-fetch
Time (mean ± σ): 30.025 s ± 0.081 s [User: 27.070 s, System: 4.933 s]
Range (min … max): 29.900 s … 30.111 s 5 runs
Benchmark #2: HEAD: git-fetch
Time (mean ± σ): 25.574 s ± 0.177 s [User: 22.855 s, System: 4.683 s]
Range (min … max): 25.399 s … 25.765 s 5 runs
Summary
'HEAD: git-fetch' ran
1.17 ± 0.01 times faster than 'HEAD~: git-fetch'
You should execute "git prune".

How do I download a specific version of source code from android.googlesource.com using Git?

I want to download the source code shown in the tags on android.googlesource.com.
For example, I want to download the Calendar code:
https://android.googlesource.com/platform/packages/apps/Calendar
I also want to download the code for a specific Android version tag, for example android-4.0.3_r1.1, from the following link:
https://android.googlesource.com/platform/packages/apps/Calendar/+/android-4.0.3_r1.1
When I browse the code in the Git repository it doesn't show the tag's version of the code. How do I do this?
All the tags are listed in the Calendar page under All Tags. That page contains android-4.0.3_r1.1.
If you clone the repo and look for that tag, you will find it as well. For example:
C:\prog\git>git clone https://android.googlesource.com/platform/packages/apps/Calendar
Cloning into 'Calendar'...
remote: Counting objects: 80, done
remote: Finding sources: 100% (80/80)
remote: Total 30504 (delta 15730), reused 30504 (delta 15730)
Receiving objects: 100% (30504/30504), 11.16 MiB | 4.61 MiB/s, done.
Resolving deltas: 100% (15730/15730), done.
C:\prog\git>cd Calendar
C:\prog\git\Calendar>git tag|grep 4.0.3
android-4.0.3_r1
android-4.0.3_r1.1
...
From there, a simple git checkout android-4.0.3_r1.1 will allow you to browse the sources for that specific tag.
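If you only need that one tag, a clone can also fetch it directly and shallowly; this is standard git clone behaviour (--branch accepts tag names), not something specific to googlesource.com:

```shell
# Clone only the android-4.0.3_r1.1 tag, with history truncated to one commit.
git clone --branch android-4.0.3_r1.1 --depth 1 \
    https://android.googlesource.com/platform/packages/apps/Calendar
```

Cloning a tag leaves the new repository on a detached HEAD, which is fine for browsing or building that exact release.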

Git on Windows, "Out of memory - malloc failed"

I have run into a problem with a repository and have tried almost every possible config setting found out there, e.g. pack.windowMemory, etc.
I believe someone has checked a large file into the remote repository, and now each time I try to pull from or push to it, Git tries to pack it and runs out of memory:
Auto packing the repository for optimum performance. You may also
run "git gc" manually. See "git help gc" for more information.
Counting objects: 6279, done.
Compressing objects: 100% (6147/6147), done.
fatal: Out of memory, malloc failed (tried to allocate 1549040327 bytes)
error: failed to run repack
I have tried git gc and git repack with various options, but they keep returning the same error.
I have almost given up and am about to just create a new repo, but thought I'd ask around first :)
I found a solution here that worked for me.
In the .git/config file (client and/or server) I added the following:
[core]
    packedGitLimit = 128m
    packedGitWindowSize = 128m
[pack]
    deltaCacheSize = 128m
    packSizeLimit = 128m
    windowMemory = 128m
For reference (you might have already seen it), the msysgit case dealing with that issue is ticket 292.
It suggests several workarounds:
Disable delta compression globally. For this you have to set pack.window to 0. Of course, this will make the repository much larger on disk.
Disable delta compression for some files. Check the delta flag in the gitattributes man page.
git config --global pack.threads 1
git config --global pack.windowMemory 256m (you already tried that one, but it is also illustrated in "Error when pulling warning: suboptimal pack - out of memory")
Other settings are mentioned in "git push fatal: unable to create thread: Resource temporarily unavailable" and "Git pull fails with bad pack header error", in case this is pack-related.
sm4 adds in the comments:
To disable delta compression for certain files, add the following to .git/info/attributes:
*.zip binary -delta
From the gitattributes man page:
Delta compression will not be attempted for blobs for paths with the attribute delta set to false.
Maybe a simpler workaround would be to somehow reset the history to before the commit that introduced the large file, and redo the other commits from there.
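The first workarounds in the list above, expressed as commands (the values are illustrative; the linked questions discuss how to tune them):

```shell
# Drastic: disable delta compression globally (repo gets much larger on disk).
git config --global pack.window 0

# Less drastic: single-threaded packing with a bounded pack window.
git config --global pack.threads 1
git config --global pack.windowMemory 256m
```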
EDIT: Since Git v2.5.0 (Aug 2015), Git for Windows (formerly MSysGit) provides 64-bit versions, as noticed by Pan.student. In this answer I was advising to install 64-bit Cygwin (which provides a 64-bit Git).
I got a similar Out of memory, malloc failed issue using MSysGit when reaching the 4GB barrier:
> git --version
git version 1.8.3.msysgit.0
> file path/Git/cmd/git
path/Git/cmd/git: PE32 executable for MS Windows (console) Intel 80386 32-bit
> time git clone --bare -v ssh://linuxhost/path/repo.git
Cloning into bare repository 'repo.git'...
remote: Counting objects: 1664490, done.
remote: Compressing objects: 100% (384843/384843), done.
remote: Total 1664490 (delta 1029586), reused 1664490 (delta 1029586)
Receiving objects: 100% (1664490/1664490), 550.96 MiB | 1.55 MiB/s, done.
Resolving deltas: 100% (1029586/1029586), done.
fatal: Out of memory, malloc failed (tried to allocate 4691583 bytes)
fatal: remote did not send all necessary objects
real 13m8.901s
user 0m0.000s
sys 0m0.015s
Finally, 64-bit Git from Cygwin fixed it:
> git --version
git version 1.7.9
> file /usr/bin/git
/usr/bin/git: PE32+ executable (console) x86-64 (stripped to external PDB), for MS Windows
> time git clone --bare -v ssh://linuxhost/path/repo.git
Cloning into bare repository 'repo.git'...
remote: Counting objects: 1664490, done.
remote: Compressing objects: 100% (384843/384843), done.
remote: Total 1664490 (delta 1029586), reused 1664490 (delta 1029586)
Receiving objects: 100% (1664490/1664490), 550.96 MiB | 9.19 MiB/s, done.
Resolving deltas: 100% (1029586/1029586), done.
real 13m9.451s
user 3m2.488s
sys 3m53.234s
FYI, on the 64-bit linuxhost:
repo.git> git config -l
user.email=name@company.com
core.repositoryformatversion=0
core.filemode=true
core.bare=true
repo.git> git --version
git version 1.8.3.4
repo.git> uname -a
Linux linuxhost 2.6.32-279.19.1.el6.x86_64 #1 SMP Sat Nov 24 14:35:28 EST 2012 x86_64 x86_64 x86_64 GNU/Linux
If my answer does not fix your issue, you may also check these pages:
git clone out of memory even with 5.6GB RAM free and 50 GB hard disk
Git clone fails with out of memory error - “fatal: out of memory, malloc failed (tried to allocate 905574791 bytes) / fatal: index-pack failed”
git-clone memory allocation error
MSysGit issues tracker
Some of the options suggested in the selected answer seem to be only partially relevant to the issue, or not necessary at all.
From looking at https://git-scm.com/docs/git-config, it appears that just setting the following option is sufficient (set only for the project here):
git config pack.windowMemory 512m
From the manual:
pack.windowMemory
The maximum size of memory that is consumed by each thread in git-pack-objects[1] for pack window memory when no limit is given on the command line. The value can be suffixed with "k", "m", or "g". When left unconfigured (or set explicitly to 0), there will be no limit.
With this, I never went over the specified 512m per thread, actually used RAM was about half of that most of the time. Of course, the amount chosen here is user-specific depending on the available RAM and number of threads.
This worked for me, but I had to set the option via the command line:
git config --global pack.windowMemory 512m
