> **PS**. Fair warning:
>
> `git` is generally considered blazingly fast. You should try cloning a full repo from darcs, bazaar, hg (god forbid: TFS or subversion...). Also, if you routinely clone full repos from scratch, you'd be doing something wrong anyway. You can always just `git remote update` and get incremental changes.
>
> For various other ways to keep *full* repos in sync see, e.g.
>
> - ["fetch --all" in a git bare repository doesn't synchronize local branches to the remote ones](https://stackoverflow.com/questions/5559321/fetch-all-in-a-git-bare-repository-doesnt-synchronize-local-branches-to-the/5559586#5559586)
> - [How to update a git clone --mirror?](https://stackoverflow.com/questions/6150188/how-to-update-a-git-clone-mirror/6151901#6151901)
>
> (These contain links to other relevant SO posts.)

### Dumb copy

As mentioned, you could just copy a repository with 'dumb' file transfer (a minimal sketch follows below).

This will certainly not waste time compressing, repacking, deltifying and/or filtering.

Plus, you will get

- hooks
- config (remotes, push branches, settings (whitespace, merge, aliases, user details etc.))
- stashes <sub><sup>(see [Can I fetch a stash from a remote repo into a local branch?](https://stackoverflow.com/questions/2248680/can-i-fetch-a-stash-from-a-remote-repo-into-a-local-branch/5257371#5257371) also)</sup></sub>
- rerere cache
- reflogs
- backups (from filter-branch, e.g.) and various other things (intermediate state from rebase, bisect etc.)

This may or may *not* be what you require, but it is nice to be aware of it.
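For illustration, here is one way such a dumb copy could look. The host name `server` and the path `/srv/git/project.git` are placeholders, not anything from the question:

```
# Plain byte-for-byte copy of the repository directory: no packing,
# no delta computation, no git protocol involved. Hooks, config,
# stashes, reflogs etc. all come along verbatim.
rsync -a server:/srv/git/project.git/ project.git/
```

(`scp -r`, or a plain `cp -a` on a mounted filesystem, would do the same job.)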
---

### Bundle

Git clone by default optimizes for bandwidth. Since git clone, by default, does not *mirror* all branches (see `--mirror`), it would not make sense to just dump the pack files as-is (because that would possibly send way more than required).

When distributing to a *truly big* number of clients, **consider using *bundles*.**

If you want a fast clone without the server-side cost, the *git way* is `bundle create`. You can then distribute the bundle without the server even being involved. If you are worried that `bundle ... --all` includes more than a simple `git clone` would, consider e.g. `bundle ... master` to reduce the volume.

```
git bundle create snapshot.bundle --all
# (or mention specific ref names instead of --all)
```

and distribute the snapshot bundle instead. That's the best of both worlds, although of course you won't get the items from the bullet list above. On the receiving end, just

```
git clone snapshot.bundle myclonedir/
```

### Compression configs

You can look at lowering server load by reducing/removing compression. Have a look at these config settings (I assume `pack.compression` is the one that will help you lower the server load):

> ### core.compression
>
> An integer -1..9, indicating a default compression level. -1 is the zlib default. 0 means no compression, and 1..9 are various speed/size tradeoffs, 9 being slowest. If set, this provides a default to other compression variables, such as core.loosecompression and pack.compression.
>
> ### core.loosecompression
>
> An integer -1..9, indicating the compression level for objects that are not in a pack file. -1 is the zlib default. 0 means no compression, and 1..9 are various speed/size tradeoffs, 9 being slowest. If not set, defaults to core.compression. If that is not set, defaults to 1 (best speed).
>
> ### pack.compression
>
> An integer -1..9, indicating the compression level for objects in a pack file. -1 is the zlib default. 0 means no compression, and 1..9 are various speed/size tradeoffs, 9 being slowest. If not set, defaults to core.compression. If that is not set, defaults to -1, the zlib default, which is "a default compromise between speed and compression (currently equivalent to level 6)."
>
> Note that changing the compression level will not automatically recompress all existing objects. You can force recompression by passing the -F option to git-repack(1).

Given ample network bandwidth, this *will* in fact result in faster clones. **Don't forget about `git-repack -F` when you decide to benchmark that!**
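For concreteness, a minimal sketch of how those settings could be applied in the server repository; the level `0` here is just an illustrative value (no compression), not a recommendation beyond what is described above:

```
# Turn off zlib compression for packed and loose objects,
# trading bandwidth for lower CPU load on the server.
git config pack.compression 0
git config core.loosecompression 0

# Existing objects are not recompressed automatically;
# repack with -F so the new level actually takes effect.
git repack -a -d -F
```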
 
