Sunday, 15 July 2012

github - Can't push to Git, due to binary files -

So I've been having issues pushing the master branch I've been working on, due to binary video files. The files were too large when I tried to push them the first time. I have since removed them from the directory of the project I'm working on, but every push since that first initial push returns this error message:

Compressing objects: 100% (38/38), done.
Writing objects: 100% (39/39), 326.34 MiB | 639.00 KiB/s, done.
Total 39 (delta 16), reused 0 (delta 0)
remote: error: GH001: Large files detected.
remote: error: Trace: b7371dc6457272213ca1f568d9484c49
remote: error: See http://git.io/iept8g for more information.
remote: error: File themes/somefile/uploads/me_582610_mountain-river.mov is 315.08 MB; this exceeds GitHub's file size limit of 100 MB
To git@github.com:username/project.git

The file it says is too large still seems to be there somehow, but it isn't there at all in the directory or anywhere on my computer; I deleted it completely. What is the problem here? This is the first time I've ever had issues with this. I went to the support page it referenced, https://help.github.com/articles/working-with-large-files/, and ran git rm --cached me_582610_mountain-river.mov, but it returns the message fatal: pathspec 'me_582610_mountain-river.mov' did not match any files.

Any help is appreciated!

Remember that, by default, everything you commit to Git remains in the repo, even if you "delete" it in a later commit.

One of Git's weaknesses (along with other DVCSs') is that it doesn't handle large binary files well. Many teams/people who want to version lots of large binary files prefer centralized VCSs like Perforce, Subversion, etc., where one has greater control over which parts of the repo one downloads and over how many versions of prior commits one keeps in the repo.

On to your problem: you have a repo into which you've committed a large binary file. Even though you subsequently "removed" it, the file remains in the repo's history. To truly delete it from the repo, you'll have to perform surgery, physically destroying the original commit in which the file was added and rewriting every subsequent commit in the repo!
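You can see this for yourself: even though the working tree no longer contains the file, the history still does, which is also why your git rm --cached attempt failed (it only operates on the current index, where the file no longer exists). As a quick sketch, using the path from your error message:

git log --all --oneline -- themes/somefile/uploads/me_582610_mountain-river.mov
git rev-list --objects --all | grep mountain-river

If either of those prints anything, the file is still baked into one or more commits.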

As per the Git documentation on removing objects (emphasis mine):

There are a lot of great things about Git, but one feature that can cause issues is the fact that a git clone downloads the entire history of the project, including every version of every file. This is fine if the whole thing is source code, because Git is highly optimized to compress that data efficiently. However, if someone at any point in the history of your project added a single huge file, every clone for all time will be forced to download that large file, even if it was removed from the project in the very next commit.
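You can observe that effect in your own clone: the working tree is small, but the object database still carries the ~300 MB blob. For example (sizes are reported in KiB):

git count-objects -v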

The solution to your problem is not a simple process, it is destructive (in that it rewrites every commit after the first commit that included the offending file(s)), and it is pretty well documented at the link above, which I encourage you to read several times and to practice on a local copy of the tree before updating the official tree.

Proceed with care!
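As a rough sketch of what that surgery looks like, assuming the path from your error message and the git filter-branch approach the linked documentation walks through (do not run this blindly; read the article first and work on a throwaway clone):

# Rewrite every commit on every branch, dropping the file from each one
git filter-branch --index-filter \
  'git rm --cached --ignore-unmatch themes/somefile/uploads/me_582610_mountain-river.mov' \
  --prune-empty --tag-name-filter cat -- --all

# Then remove the backup refs and let Git actually discard the old objects
rm -rf .git/refs/original
git reflog expire --expire=now --all
git gc --prune=now --aggressive

Only after the local history is clean will GitHub accept the push, since the server checks every object you send, not just the current tip of the branch.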

If this is right after an import, before anyone has started to base work on the commit, you're fine; otherwise, you have to notify all contributors that they must rebase their work onto your new commits.
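In the shared case that typically means force-pushing the rewritten branches and tags (assuming your remote is named origin), after which existing clones must rebase onto the new history rather than merge:

git push origin --force --all
git push origin --force --tags

# On each contributor's existing clone:
git fetch origin
git rebase origin/master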

Frankly, when I did this a year or so ago on a repo of my own (i.e., not shared with anyone else), I chose to copy the current codebase to a new folder and create a new Git repo rather than try to rewrite history, packfiles, etc. Losing the history wasn't a major issue for me at the time.
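If losing history is an acceptable trade-off for you as well, that start-fresh route is roughly the following (the remote URL is just the one from your error output; keep the large .mov files out of the new repo, or ignore them):

# In a clean copy of the current codebase, with the large binaries removed:
rm -rf .git
git init
git add .
git commit -m "Fresh start without large binaries"
git remote add origin git@github.com:username/project.git
git push --force origin master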

Good luck!

git github binaryfiles command-line-interface
