Choosing Version Control System for MOG development

Author: "No Bugs" Hare
Job Title: Sarcastic Architect
Hobbies: Thinking Aloud, Arguing with Managers, Annoying HRs,
Calling a Spade a Spade, Keeping Tongue in Cheek
SVN and Perforce win over Git and Mercurial for gamedev

Chapter 11(b) from “beta” Volume III

After we finished our (admittedly very limited) discussion of development flows – we can proceed to discussing specific version control systems. As of 2017, the following four version control systems are widely used for game development (listed in historical order of their respective first releases): Perforce, SVN, Git, and Mercurial. While, as noted above, most of the gamedev industry is still leaning towards Perforce and SVN – there have been successful games using Git and Mercurial too.


I have to admit that I never used Perforce myself; still, I’ll try to summarize arguments which gamedevs routinely provide for using Perforce:

  • Unlike most version control systems – Perforce is oriented not only towards coders, but also towards non-coders such as designers and artists. And I have to agree that providing designers/artists with a friendly environment is indeed extremely important.
  • Huge projects (those with lots of binary asset files, totaling terabytes) are handled without issues.
  • Locking files is possible.
    • As asset files (whether they’re binary or text – more on it below) are usually not really mergeable – having two artists work on the same file is a Bad Idea™. This is where the universally-frowned-upon-in-the-programming-world “lock file” feature comes in handy.

On the minus side:

  • Perforce branching is reported to be rather ugly (up to the point of being outright unusable); even worse – data loss has been reported to happen during Perforce merges <double-ouch! />. On the other hand, to be fair – most of the Perforce users have reported working with Perforce for many years without problems, though it is unclear how much branching they were using.
    • [[TODO: Using Perforce streams instead of branches]]
  • Perforce keeps track of your working copy on the server; while not a problem for LAN – it is a problem when you have to work remotely (which is more and more often these days)
    • While working offline is possible with Perforce, it is subject to “reconciliation” process when you’re back online, which is well, ugly.
  • Perforce has been reported to require to resort to out-of-the-source-control file copying and/or sharing (which is an inherently Bad Thing™) on quite a few occasions.
  • Continuous Integration tools are relatively reluctant to support Perforce; on the other hand, with Jenkins, TeamCity and Bamboo supporting Perforce – it is not that bad either.
  • “Locking files” feature can be abused (in particular, you should have a policy of “not using exclusive checkout” for the code).
  • You cannot just delete a file in your working copy – you should do it ONLY via the Perforce client; otherwise – you’re in quite a bit of trouble. It is not that big a deal – but certainly it is an inconvenience.
  • At hundreds-of-dollars-per-user – pricing can get not-so-insignificant (especially if you have part-time users).
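As an illustration of how the “lock assets, never lock code” policy can be enforced centrally in Perforce – the typemap spec (edited via `p4 typemap`) can assign the exclusive-lock `+l` modifier per path. A hypothetical sketch (depot paths and extensions below are examples only, adjust to your own layout):

```
TypeMap:
	binary+l //depot/assets/....psd
	binary+l //depot/assets/....fbx
	binary+l //depot/assets/....max
	text //depot/src/...
```

With this in place, the unmergeable asset formats are stored as exclusively-lockable binaries, while source files stay plain text (i.e. no exclusive checkouts for code).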



I have to admit that for a long while, I have been a fan of SVN – and I still admire it. From a technical standpoint, SVN is your typical centralized version control system (based on a single centralized server), and it is great because:

  • The model is simple
  • It is easily usable by non-developers.
    • For non-developers on Windows (and a large chunk of your designers will be on Windows) Tortoise SVN rulezzz!
  • Handles large multi-terabyte projects well.
  • File locking is available.
  • Offline work is possible and easy (though offline commits aren’t possible, and neither is offline access to history beyond the last version)
  • IMO, SVN sync merge is more intuitive than Git’s rebase (though I admit that this point is debatable and flame-war ridden).
  • SVN is built around a strong assumption that history is immutable. While it is possible to mess with SVN history, it is difficult (or even impossible?) to do without messing with SVN files directly (i.e. without having admin-level access to the svn server box).
  • [[TODO: partial checkouts]]
  • Path-based access control is possible, including restricting reads on a per-directory/per-file basis. [[TODO: refer to explanation why it is necessary]]
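To illustrate the last point – path-based access control in SVN boils down to a short authz file (used by both svnserve and mod_authz_svn). A hypothetical sketch (group names and paths below are made up for illustration):

```ini
[groups]
devs = alice, bob
leads = carol

[/]
@devs = rw
* = r

# sensitive server-side logic: nobody but leads can even read it
[/trunk/src/server/anticheat]
* =
@leads = rw
```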

On the minus side:

  • While merges reportedly improved on the way towards SVN 1.8 or so – they’re still not as fluid as in Git. At the very least – as far as I know, you still SHOULD avoid renaming files in your branches (otherwise – chances are you’ll get an infamous “tree conflict” <sad-face />).1 It is not that big a deal – but a significant inconvenience if doing feature branching.
  • Commits while you’re offline are not possible. This is not as bad as with Perforce (actually, if you’re offline just while you’re on a train back home – it is not noticeable at all), but if you’re going to be offline for a while as you are developing2 – it can become a problem.
  • As with Perforce, locking can be abused. To mitigate it, it is possible to:
    • Outright prohibit locking of source files (IIRC, a pre-commit script can do it).
    • For non-mergeable files – make sure to write a script sending reminders (CC: the PM) such as “you’ve been holding this file since yesterday – are you sure you really need it for this long?”
  • For an open-source project – SVN’s model doesn’t lend itself well to “pull requests”
    • OTOH, I haven’t seen many “pull requests” for intra-company development, even less so for gamedev.

1 While SVN 1.10 is expected to address this problem by a significantly improved conflict resolver – SVN 1.10 is not out yet, so it is unclear whether this feature will make it to 1.10, and how exactly it will work in practice if it does make it.
2 Say, if your company sends you on a cruise while you’re developing <wink />
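The lock-reminder script suggested in the list above can be sketched in a few lines of Python, meant to run from cron on the SVN server. This is a sketch only: it assumes `svnadmin lslocks` output uses the usual `Path:` / `Owner:` / `Created:` field layout, the repository path is an example, and `send_reminder()` is a stub standing in for your real e-mail/IM integration.

```python
# Sketch of a lock-reminder script for an SVN server (run it from cron).
import re
import subprocess
from datetime import datetime, timedelta, timezone

STALE_AFTER = timedelta(hours=24)  # nag after a day of holding a lock

def parse_lslocks(text):
    """Turn `svnadmin lslocks` output into (path, owner, created) tuples."""
    locks = []
    for block in text.strip().split("\n\n"):
        fields = dict(re.findall(r"^(Path|Owner|Created): (.*)$", block, re.M))
        if {"Path", "Owner", "Created"} <= fields.keys():
            # "Created" looks like "2017-05-01 10:12:34 +0000 (Mon, 01 May 2017)"
            created = datetime.strptime(fields["Created"][:25],
                                        "%Y-%m-%d %H:%M:%S %z")
            locks.append((fields["Path"], fields["Owner"], created))
    return locks

def stale_locks(locks, now):
    """Locks held longer than STALE_AFTER as of `now` (tz-aware)."""
    return [(path, owner) for (path, owner, created) in locks
            if now - created > STALE_AFTER]

def send_reminder(path, owner):  # stub: wire this to e-mail/IM, CC the PM
    print(f"{owner}: you're holding the lock on {path} since yesterday -- "
          f"are you sure you really need it for this long?")

def main():  # call this from cron; the repo path below is an example
    out = subprocess.run(["svnadmin", "lslocks", "/var/svn/ourgame"],
                         capture_output=True, text=True, check=True).stdout
    now = datetime.now(timezone.utc)
    for path, owner in stale_locks(parse_lslocks(out), now):
        send_reminder(path, owner)
```

The parsing and staleness logic are plain functions, so the same sketch is easy to adapt if your lslocks output (or reminder channel) looks different.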



Definition of git:

a foolish or worthless person

After working with SVN for a while, I had to switch to Git – and found it clearly better suited for the Feature-Branch development model; moreover – Git is also perfectly usable for Trunk-Based Development – as long as only developers work on the repository. On the other hand, for gamedev-with-assets-and-artists-involved – Git, while usable, is clearly not the best option.

Pros of Git include:

  • Being branch-centered from the very beginning, branch handling in Git is good.
    • Still, reverting branch merge is ugly even in Git <sad-face />, more on it in [[TODO]] section below.
  • Offline work is very straightforward; you have full capabilities of committing and accessing your history.
    • Of course, it comes at the cost of an extra push operation, so if you don’t work offline often – it is not that big a deal (especially in 2017, when you have Internet pretty much all the time)

The list of Git negatives, at least when Git is used for game development, is longer:

  • Git is not really friendly to non-developers such as artists (and that’s to put it very mildly). The whole model is significantly more convoluted than that of centralized version control systems such as Perforce or SVN, and without the ability to merge those-files-artists-are-working-on – it becomes convoluted-for-no-apparent-reason for their purposes <sad-face />.
  • Whatever-you’re-doing, you have to keep the whole repository on your local box (except for Git-LFS files, more on them below); if the whole-project-including-history is large (as in “1T large”) – it can take a looong while to download it.
  • As a result, some developers started to support multiple Git repos – one for “lean and mean” code, and another one for docs etc. TBH, I do not like the very idea of having several repos (it starts a slippery road towards “let’s keep all the code in Git, and all the assets in SVN” – which usually qualifies as a Really Bad Idea™ because of lack of sync between two repositories, as discussed above).
  • Git-LFS is a kind of crutch (and is not really following the “distributed” nature of the rest of Git).
  • Handling of huge-projects-with-lots-of-binary-files is rather ugly with Git. While it did improve with Git Large File Storage (Git-LFS) – it is still not clear how Git-LFS behaves for multi-terabyte real-world projects.
  • File locking for non-mergeable files is not available (see below on advisory locks – but they don’t really work well for non-developers).
  • Per-file access control is not supported (at least not out-of-the-box). While this is not that much of a problem for open-source projects – it is quite an issue for gamedev, where we have to resort to security-by-obscurity more often than we’d like <sad-face />.
  • I positively hate the ability to mess with (“amend”, in Git-speak) already-committed data. IMNSHO, having immutable history is a Really Good Thing™ for any version control system.
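If, despite the cons above, you do end up using Git for an asset-heavy project, the usual mitigation is to route binary asset types through Git-LFS via a .gitattributes file. These are the lines that `git lfs track "*.psd"` etc. would generate (the extensions below are examples only – use whatever your asset pipeline actually produces):

```
*.psd filter=lfs diff=lfs merge=lfs -text
*.fbx filter=lfs diff=lfs merge=lfs -text
*.wav filter=lfs diff=lfs merge=lfs -text
```

Note that this has to be in place before the first asset commit; migrating already-committed large files into LFS after the fact is a history rewrite.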

For a more detailed discussion on problems-of-Git-for-gamedev-purposes – see, for example, [chris@enemyhideout].

Git and unmergeable files

For game development (and unlike most other software development projects), you’re likely to have binary files which need to be edited (representing so-called “assets”; more on assets and the asset pipeline in Vol. V’s chapter on Graphics 101). More precisely, it is not only about binary files: it includes any file which cannot be effectively merged by Git (even a simple text-based Wavefront .obj file is not really mergeable, in the sense that tracking differences in these files is pretty much useless).

The question of “what to do with such files” is not really addressed by Git philosophy. The best way would be to learn how to merge these unmergeable files, but this is so much work that doing it for all the files your artists and game designers need is hardly realistic <sad-face />; still – make sure that if you’re using Unity, you’re using their SmartMerge (that’s regardless of whether you’re using Git or not).
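For reference, hooking Unity’s SmartMerge tool (UnityYAMLMerge) into Git is a matter of a few git config settings along these lines (the exact executable path depends on your Unity install – e.g. under Editor/Data/Tools/ on Windows):

```ini
[merge]
	tool = unityyamlmerge

[mergetool "unityyamlmerge"]
	trustExitCode = false
	cmd = '<path to UnityYAMLMerge>' merge -p "$BASE" "$REMOTE" "$LOCAL" "$MERGED"
```

Equivalent hooks exist for SVN, Perforce, and Mercurial too, so this choice is orthogonal to your choice of version control system.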

The second best option would be to have a ‘lock’ so that only one person really works with the asset file at any given time. However, while locks are supported by Perforce and SVN (and there is a Lock Extension for Mercurial too) – Git’s position with regard to locks is (a) that there won’t be mandatory locks, ever, and (b) that advisory locks are just a way of communication, so they should be done out-of-Git <sic! />. The latter statement leads to having chat channels or mailing lists just for the purposes of locking <ouch! />. I strongly disagree with such approaches, because IMNSHO:

all the stuff which is related to source-controlled code, SHOULD be within version control system, locks (advisory or not) included

To use advisory (non-enforced) locks in Git, I suggest avoiding stuff such as chat channels, and using manually-editable lock files (kept within Git, right next to the real files) instead. Such a lock file MUST contain the name (id) of the person who locked it as part of the file contents (i.e. a lock file with just “locked” in it is not sufficient, for several reasons). Such an approach does allow a strict way of dealing with unmergeable files (that is, if the people working with them are careful enough to update – and push(!) – the lock file before starting to work on the unmergeable file), and it doesn’t require any 3rd-party tools (such as an IM or Skype) to work.

For artists/game designers, at the very least this logic must be wrapped into a “I want to work with this file – lock it for me” script (with the script doing all the legwork and saying “Done” or “Sorry, it’s already locked by such-and-such user”).3 And if you like your artists better than that, you can make a Windows shell extension which calls this script for them and displays a nice “Locked” icon.

The approach of lock-files described above is known to work (though having a drawback of creating commits just for locking purposes), but still remains quite a substantial headache. Actually, the headache can be so significant that it might be better to use Perforce, SVN, or Mercurial-with-Lock-Extension (all of them support mandatory locking) just for this reason.

Let’s also note that there is an issue often mentioned in this context – storing large files in Git – but IMO this is a much more minor problem, which can be mostly resolved by using the Git LFS plugin.

3 and of course, another script “I’m done with file”, which will be doing remove-lock-file-commit-and-push
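The “lock it for me” / “I’m done with it” helpers described above can be sketched roughly as follows. This is a sketch under stated assumptions: lock files are named `<asset>.lock` next to the asset, plain `git` is on the PATH, and everybody touches unmergeable files only through these helpers (the asset path in the usage comment is hypothetical).

```python
# Sketch of advisory locking via lock files committed to Git.
import subprocess
from pathlib import Path

def git(*args):
    """Run a git command, raising on failure."""
    subprocess.run(["git", *args], check=True)

def read_lock_owner(lock_path: Path):
    """User id recorded in a lock file, or None if it isn't locked."""
    if not lock_path.exists():
        return None
    return lock_path.read_text().strip() or None

def try_lock(asset: str, user: str) -> str:
    lock_path = Path(asset + ".lock")
    git("pull", "--ff-only")           # see the latest locks before deciding
    owner = read_lock_owner(lock_path)
    if owner and owner != user:
        return f"Sorry, it's already locked by {owner}"
    lock_path.write_text(user + "\n")  # MUST identify the locker, not just say "locked"
    git("add", str(lock_path))
    git("commit", "-m", f"lock {asset} for {user}")
    git("push")                        # push(!) so everybody else sees the lock
    return "Done"

def release_lock(asset: str, user: str) -> str:
    lock_path = Path(asset + ".lock")
    if read_lock_owner(lock_path) != user:
        return "You don't hold this lock"
    lock_path.write_text("")           # empty contents == unlocked
    git("add", str(lock_path))
    git("commit", "-m", f"unlock {asset}")
    git("push")
    return "Done"

# usage (what the artist-facing wrapper script would do):
#   import getpass
#   print(try_lock("assets/hero.fbx", getpass.getuser()))
```

The `--ff-only` pull is deliberate: if the pull cannot fast-forward, something unexpected happened in the repository and the helper refuses to proceed rather than risk clobbering somebody else’s lock.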


Issues with reverting Git branch merge commit

One of Git’s peculiarities is related to reverting a commit that merges branches. While reverting a committed branch merge is no picnic in any version control system, in Git it is IMO particularly nasty and counter-intuitive.

For a proper discussion of it – take a look at [TorvaldsHamano]; here I’ll provide only a very brief overview. In short – after reverting a committed branch merge, your system is left in not exactly the same state as it was before the merge(!) – so you need to remember about this reverted merge when you’re doing a re-merge; otherwise you’ll get very unexpected results <sad-face />. In practice, this means that with Git you should generally avoid reverting branch merges at all costs. To make things worse – in the Git world this behavior is not considered a bug-which-has-to-be-eventually-fixed (but rather a feature-which-makes-those-who-know-about-it-gurus), so the chances of it being fixed are estimated to be about as high as the chances of a cold day in hell <sad-face />.


Last but not least on the list of our contenders-for-version-control, is Mercurial. While Mercurial is ideologically very similar to Git, it certainly has a very different look and feel. In particular, the following pros can be observed about Mercurial in the context of gamedev:

  • Mercurial is (almost-)usable by non-developers
    • TortoiseHG helps a lot for those poor Windows-based souls.
  • Branching is good (though reverting branch merge is still a mess <sad-face />)
  • Offline work is very straightforward too
  • Commits are not mutable4
  • Lock Extension is available (though not distributed with Mercurial by default)
  • Per-file access control is supported (though not for reading)

List of Mercurial cons is also impressive:

  • Large files (beyond 100M or so) are handled very inefficiently; Mercurial needs about file-size-multiplied-by-5x-to-10x RAM to operate, so having a 1G asset file is likely to bring quite a few systems to their knees <sad-face />.
  • Same as with Git, pretty-much-whatever-you’re-doing (except for large files), you need to have the whole repository on your local box.
  • Same as with Git, unmergeable files are a Big Headache(tm).
  • Just as with Git-LFS, Mercurial Large Files Extension is a crutch, going against its overall distributed nature.
  • Access control restricting reading within repository is not possible <sad-face />.5

From what I’ve heard, one big reason why gamedevs are not using Mercurial is exactly these issues with large asset files (see, for example, [SirGru]).

4 except for “hg rollback” which doesn’t go beyond one last commit
5 And this is a fundamental restriction of all those distributed version control systems which have a copy of the whole repository on each box <sad-face />
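On the plus side, enabling Mercurial’s largefiles extension (which ships with Mercurial but is off by default) is just an hgrc edit; the size threshold and patterns below are examples, adjust to your asset pipeline:

```ini
[extensions]
largefiles =

[largefiles]
# treat anything over 10 MB as a largefile, plus typical asset extensions
minsize = 10
patterns = **.psd **.fbx
```

With this in place, `hg add` routes matching files through largefiles storage automatically, sidestepping the RAM-hungry handling described above (at the cost of the “crutch” drawbacks also mentioned above).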


On Open-Source Gamedev

If by any (admittedly rather slim) chance you’re planning to release an open-source game – another extremely-important factor is added into play: namely, “how many people you’ll be able to attract to work on your open-source project?” And in this regard, GitHub is a very clear leader by far, with BitBucket and GitLab fighting for distant second place.

Now, we should observe that all three of these services are running Git (with only BitBucket providing a Mercurial option on the side). Moreover, all the competition-running-SVN, such as OSDN and Assembla, is lagging far far behind these three (as for Sourceforge – it is no longer recommended due to certain really ugly decisions they made a few years ago).

This means that

for open-source games, Git does have a Very Significant Advantage™

Comparison of Four Major Version Control Systems for Gamedev Purposes

Now, we can summarize our discussion about different version control systems in the context of game development, in the following Table 11.2:

| | Perforce | SVN | Git | Mercurial |
| --- | --- | --- | --- | --- |
| Non-dev friendly | Excellent | Good | Poor | Kinda Acceptable |
| Trunk-based Development | Excellent | Excellent | Overcomplicated | Good |
| Feature Branches | Poor | Good | Excellent | Excellent |
| Non-mergeable files | Excellent | Excellent | Afterthought, no locking | Afterthought, issues with large files |
| Terabyte-size projects | Excellent | Good | With Git-LFS only | With LargeFiles extension only |
| Offline Work | Acceptable | Good | Excellent | Excellent |
| Access Control | Good | Good | Restricting read access is not feasible | Restricting read access is not feasible |
| CI Support | Good (Jenkins, TeamCity, Bamboo) | Good (Jenkins, TeamCity, Bamboo) | Excellent (Jenkins, TeamCity, Bamboo, Travis) | Good (Jenkins, TeamCity, Bamboo) |
| Open-Source Repositories | None I know about | OSDN, Assembla, CloudForge6 | GitHub, Bitbucket, GitLab | Bitbucket, OSDN, Assembla |

As we can see – unlike for generic software development (where Git still arguably rulezz), for gamedev we have to say that non-distributed version control systems (such as Perforce and SVN) tend to be a more logical choice than DVCS such as Git and Mercurial. When choosing between Perforce and SVN – I’d prefer SVN, but I have to admit that if you’re heading for trunk-based development – Perforce becomes perfectly competitive too.

On the other hand, if your game is going to be open-source – Git (or at least Mercurial) gets an all-important-for-open-source advantage of additional exposure.

6 I don’t mean OSDN, Assembla, or CloudForge are bad technically; it is just that their popularity (and this is what matters for crowd-source development) is lacking at this point


Version Control: 3rd-party Hosting vs In-House

In the 21st century, the overall trend is to have more and more services outsourced; however – while sometimes outsourcing is indeed a good idea, some other times it doesn’t really work. When it comes to outsourcing version control for a game, keep in mind the following pros and cons of such outsourcing (also known as “cloud-based version control”, “SaaS”, etc.):

  • Pro: fewer headaches, plain and simple. With in-house version control, you need to spend time on configuring it, backing it up, and storing backups safely. It is not that much work – but somebody has to do it if you keep your system in-house.
  • Pro: upgrades happen automagically, so you don’t need to spend time on them
  • Pro: unless you have a serious admin handling it, an outsourced system is harder to mess up than an in-house one.
    • BTW, I’d say that for Git the requirements for your admin are higher than for other systems (in other words, Git is substantially easier to mismanage – in particular, due to mutable histories <ouch! />).
  • Con: for a game with lots of assets, and with 3rd-party hosting – you can be in for a looooong wait for each checkout. Even more so if you’re using Git or Mercurial (and if you’re not careful enough to keep all your assets within Git-LFS or Mercurial LargeFiles – it can easily become catastrophic).
  • Con: upgrades happen automagically, so you can’t schedule them (so if a problem occurs affecting your system – it will happen at the worst possible time, like “on the day of the release”). Granted, it is rarely a problem for hosted version control – but is quite a problem for other 3rd-party systems such as Issue Tracking.
  • Con: While 3rd-party hosts such as GitLab or BitBucket don’t have a reason to steal your code – by their nature they are extremely juicy attack targets. And as we as gamedevs have to resort to “security by obscurity” much more often than we’d like to (more on it in Vol. VIII) – the damage from some-hacker-cracking-into-GitLab-or-BitBucket-and-publishing-all-the-source-code-found can be enormous.

Overall, when it comes to version control, there is not that much difference between 3rd-party hosting and an in-house system (~=”you won’t go too wrong choosing either of these routes”). BTW, if you happen to like the UI of GitLab or BitBucket but are not fond of 3rd-party hosting for any of the reasons mentioned above – keep in mind that you can have them installed in-house (a.k.a. “self-hosted”) too.

[[TODO: cherry picking (in particular – for replay-based testing as discussed in Vol. II’s Chapter on (Re)Actors)]]

[[To Be Continued…

This concludes beta Chapter 11(b) from the upcoming book “Development and Deployment of Multiplayer Online Games (from social games to MMOFPS, with stock exchanges in between)”.

Stay tuned for further parts of Chapter 27, where we’ll continue our discussion on DB optimizations (going into the strange field of app-level caches and app-level replicas)]]





Cartoons by Sergey Gordeev IRL from Gordeev Animation Graphics, Prague.



  1. Sebastian says

    What mean „acceptable” and „good” in Comparision Table – Offline Work row? IMO SVN have basic/poor offline support – it offers just diff, revert and blame/log (if log is cached), user can’t commit, merge or branch without access to server.

    • "No Bugs" Hare says

      Of course, all “acceptable” and “good” things are inherently subjective. Still, if speaking about usual offline scenarios (such as “in the train on the way home”) – IMNSHO SVN is good enough for these scenarios, i.e. practical differences, while existing, are pretty much negligible. After all, 95+% of the time we as developers are writing code, and working with source control only occasionally; and when we need source control – 90+% of those remaining 5% all we need is diff-with-last-version and revert, which is also covered by SVN. For remaining 0.5% or so – yes, they’re going to cause a productivity loss, but in practice the loss will be extremely limited.

      If we’re speaking about “work on a yacht without Internet access for many days” – situation would be different, but – TBH, such scenarios are extremely rare. Moreover, the whole importance of the offline situations is diminishing with time, just because there are fewer and fewer places which are not covered by the Internet. In other words – while in 2005 (when Git was conceived) offline work was a serious argument, now it is more of a “well, offline does occasionally happen” (in particular, even on a train I’m online – and while the connection quality is poor, but is perfectly enough to get SVN history or to commit; on planes it is not _always_ available yet – but it improves with time), and in 5 years from now it will be like “offline? really? where did you manage to find such a place?”

      • Sebastian says

        Good point.

        But sometimes I love GIT way to prepare all localy without breaking project for others. Even if I have internet access I work locally with small features without pushing it or creating branch (and merging again and again which consume time). I have full history of my current work. I can test and prepare patch without lossing any piece of my work and when feature is ready I can just merge with latest remote version and push it.

        But yeah, this same result can be achived with SVN (or any other VCS) and branches as long as server and internet connection is fast enough to not slow down my work.

        In last year I have whole few days without electric power in my office and another few without or poor internet access. Shit happens. In case like this I can just get copy of my local repo, finish work at home and merge as soon as remote becomes accessible. In this case I can also prepare fast backup just by adding another remote located on my external drive.

        But yeah. In many things I still prefer SVN. It’s easier in many ways, it has mature and fast software (like Cornerstone for Mac which I love). Working with branches is easier with GIT but SVN is better than many developer think it is… Selecting one of them is not easy and even in our company we have strong followers and opponents for both.

        BTW. I’m sorry for me poor english. 😉

        • "No Bugs" Hare says

          > But yeah, this same result can be achived with SVN (or any other VCS) and branches as long as server and internet connection is fast enough to not slow down my work.

          Actually, the result with branches will be even better 🙂 – as your data will persist on server, so you won’t risk to lose your work if your local HDD is lost…

          > SVN is better than many developer think it is…

          Yeah, especially SVN 1.8+. And in SVN 1.10 they’re promising to fix merges over renamed files…

          > BTW. I’m sorry for me poor english. 😉

          Not me mother tongue either 😉 .

  2. Damien says

    Mercurial branching model is just awful. You may lose your tags in merge, and you may lose your bookmarks anytime. For me it goes in the shitty bag with Perforce.

  3. Stuart says

    You should also check out Plastic SCM. It has all of the advantages you’ve listed under Subversion, while still being a DVCS which operates more like Git. Branching and merging are easy under Plastic, but large files are also not a problem. The merge engine is also designed to help show differences between image files, and allow custom diff tools for other files. Not associated with Plastic SCM, just really like their product.

    • "No Bugs" Hare says

      TBH, using marginal-at-the-time control systems is rarely a good idea – not because they’re bad – but because you don’t really know where their product will be in 5 years from now. Sure, it is unfair to the less-popular products – but TBH, I need my source control to “just work”, and with millions-of-projects-on-{Git|SVN|Perforce|Hg} I can be reasonably sure that I already know _all_ their quirks in advance (and as discussed above – there are quite a few for _each_ of SCMs).

      Also I have to say that I am _really_ frustrated by Plastic’s “comparison” with other source control systems (which raises a question: did they really do their homework on researching competing solutions?).

      Oh, and last but not least: SQL DB for files? Seriously? Keeping a multi-terabyte DB is as not a picnic as it goes, this is for sure…

      • Stuart says

        That’s fair. I’ve used Plastic for over 5 years, so I feel comfortable with their stability, but I recognize that they’re not mainstream.

        They have switched to JetDB as their primary backend, which has worked reliably and very high performance for me. But my repos have only measured in the GB, not in the TB, so I can’t testify to behavior at that level of size.

  4. Warren Postma says

    Subversion downsides you missed:

    1. data corruption on the server side. (unforgiveable in a CVCS).

    2. data corruption on working copies (inside .svn folder)

    3. tree conflicts. WTAF.

    4. centralized model is worse for distributed software teams (increasingly common)..


    • "No Bugs" Hare says

      1 and 2. Didn’t see in a 100+-people team working for several years using SVN (1 – 100% sure, 2 – well, if you mess with it locally – you can get it, but this stands for any source control). BTW, for Git (which does allow to mess with history remotely) it is MUCH easier to mess up the server-side…

      3. Is already much better in 1.8, promised to be better-than-Git in 1.10 (we’ll see about that). BTW, “tree conflicts” are already mentioned in one of the cons.

      4. At least when speaking about gamedev – it is _exactly_ the opposite. First, with the world becoming more-and-more connected, importance of being able to work offline diminishes every day, so a requirement to have _some_ Internet (like “100kBit/s”) becomes a moot issue (so even without gamedev in mind it is a non-issue in practice). Moreover, with gamedev and assets in mind, however counter-intuitively it may sound, DVCS such as Git are WORSE for distributed teams: finding 100Kbit/s-Internet (which is enough to work with SVN) is a non-issue even if you’re outsourcing to India or something (hey, do you really want to have your distributed team members without _any_ Internet??), while finding connection-sufficient-to-pull-terabyte-size-repo in a developer’s lifetime, is MUCH more challenging (pulling 1T repo over 100kBit/s channel will take like 3 years – not really good for practical purposes if you ask me).

    • Sebastian says

      1. I get few times with older SVN version. Looks like some kind of problem with checksums collision, solved after removing file from repo and pushing new version after that. Few times we get bug which solves our admin, I don’t known any details about root of this problems. 🙁

      2. Confirmed. Happily easy to fix with new SVN version where .svn directory is only in one per whole checkout. Anyone can fix repo just by new checkout and copy .svn from new to old repo.

      3. Confirmed but all my problems was related with two way merging – for example when I cherry pick some changes from trunk to feature branch and after that I try merge whole feature to trunk…

      4. This one is different, previous one are kinds of bugs and should never happend. This one is feature (by design) related with few positives.

  5. Arthur says

    Could you please explain why you need per-file/directory access control in your game repository? I have only been in the industry for a few years, but never seen the need for it, even in ~100-man team.

    The memory requirement for git/mercurial in regards to large binary files is (or at least was a few years ago) similar. We tried to create a Git repository of an Unreal Engine game, but it would fail due to running OOM on the machine. Git gc also frequently eats up all memory and crashes.

    The issues of handling unmergable files and reverting branch merges are equally bad in Git and Mercurial, but your article makes it sound like they are primarily Git issues. In my experience they are equally bad in both.

    I would advise against using Git-LFS, one of our teams tried it out, and they reported it being unreliable and had a few cases of data loss. Definitely not recommended. However it is still not required to have the whole repository locally, you can do shallow clones on Git containing only recent history.

    If you’re mentioning self-hosting of the services, you could also mention that you can have an in-house installation of GitHub as well, via GitHub Enterprise, although it doesn’t handle big game repositories too well…

    I disagree with your opinion on the Internet connection requirements. Working with SVN on a slow connection is extremely painful, simple operations can take minutes to complete. It is not about being able to work completely offline (it’s a big bonus if you’re on a plane for a couple of hours) but more about comfort of working with the repository. In Git and Mercurial, you only need the remote repository if you’re syncing, all other operations are local and near-instant. As for checking out the repository, sure, Git and Mercurial require you to fetch the whole thing (ignoring shallow clones, extensions like Microsoft’s GVFS, etc.), but checking out an SVN working copy (of the whole thing) is not that much better when talking about big repositories. We’ve had one company just mail us an encrypted hard drive with a working copy, because the week it took was still a lot faster than us fetching it over the Internet.
    While on the subject, I do feel you missed an important advantage of SVN – support for partial checkouts. I don’t know about Perforce, but SVN allows you to fetch only certain subdirectories. Git and Mercurial do not.
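
    For the record, a sketch of an SVN sparse checkout (the URL and paths are hypothetical):

    ```shell
    # Start with a completely empty working copy of the repository root
    svn checkout --depth empty https://svn.example.com/game/trunk wc

    # Pull in just the subtree we care about, leaving the huge 'art' tree untouched
    svn update --set-depth infinity wc/src
    ```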


    • "No Bugs" Hare says

      > why you need per-file/directory access control in your game repository? I have only been in the industry for a few years, but never seen the need for it, even in ~100-man team.

      It happens to be far more important for MOGs than for single-player games (though unfortunately, most companies are unaware of it – and of bot risks in general – until they already have their first million players). The point is that MOG repositories are very juicy targets for cheaters (with the source in hand, writing bots becomes much easier), and with this in mind – access control becomes a very-nice-to-have feature (not because you don’t trust your developers, but because of the risks of spearphishing 🙁 ).

      > I would advise against using Git-LFS, one of our teams tried it out, and they reported it being unreliable and had a few cases of data loss.

      On the other hand, quite a few other people report successfully using it for laaaarge projects, so the jury is really still out on this one. Also – the more mature it gets, the fewer chances for ugly things such as corruption to happen, so in the long run I’m pretty sure it will be viable (whether it is viable right now is indeed an open question).

      > Working with SVN on a slow connection is extremely painful, simple operations can take minutes to complete.

      I’ve worked with SVN over dial-up back in 2000, and it was still workable (yes, back then it indeed could take minutes, but it is not _that_ often that we need to interface with the source control system to start with; most of the time we’re still writing code). And I haven’t seen “minutes to complete” for SVN repos for years now (“2 seconds” is much more like it, and that is certainly not a problem for my teams). And let’s keep in mind that on the _very_same_connection_, trying to pull a 1T Git repository will take years, so the choice is still very clear.

      > but checking out an SVN working copy (of the whole thing) is not that much better when talking about big repositories.

      Well, I don’t agree that a 3-orders-of-magnitude difference qualifies as “not that much better”. Essentially, we’re speaking about the difference between checking-out-only-the-last-snapshot and checking-out-the-whole-thing-with-thousands-of-versions-for-each-unmergeable-file, so 3 orders of magnitude is not unheard-of (and 2 orders of magnitude is actually pretty common). Sure, it is possible to play games such as `--depth` or Git-LFS, but while staying within “pure” DVCS Git- or Mercurial-style, the difference IS pretty bad.

      > We’ve had one company just mail us an encrypted hard drive with a working copy, because the week it took was still a lot faster than us fetching it over the Internet.

      Wait, what was the size of your working copy? With a rather-trivially-achievable 10Mbit/s, we’re speaking about 100G/day, or around 700G/week. I have to say that 700G for just-the-working-copy-without-history is quite a bit of data even for a 3D-heavy AAA game… Of course, if the admins have no idea what they’re doing, even on a 10Mbit/s channel they’ll only get a 100kBit/s download speed, but this is certainly not a problem of SVN (I remember when, back in 2004 or so, the need to transfer 100G-within-a-day across the Atlantic was indeed a significant problem which took admins several weeks to configure properly, but well – channels are MUCH better now than 13 years ago, and BTW TCP has greatly improved in this regard too – in particular, due to RFC1323).
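
      BTW, the back-of-the-envelope arithmetic above is easy to sanity-check (plain integer shell arithmetic):

      ```shell
      # GB transferred per day at a sustained 10 Mbit/s
      echo $((10 * 1000000 / 8 * 86400 / 1000000000))      # 108 GB/day
      # ...and per week
      echo $((7 * 10 * 1000000 / 8 * 86400 / 1000000000))  # 756 GB/week
      ```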

      > I do feel you missed an important advantage of SVN – support for partial checkouts.

      Yes, I did, will add it, THANKS!

    • "No Bugs" Hare says

      > The issues of handling unmergable files and reverting branch merges are equally bad in Git and Mercurial, but your article makes it sound like they are primarily Git issues. In my experience they are equally bad in both.

      Not sure about reverting branch merges in Mercurial (thanks Knuth, I never needed to do it) – but as for unmergeable files, you’re right – I added a note to this effect to the Mercurial section, THANKS!

  6. says

    Minor nitpick about Mercurial: history is just as mutable as in Git, and you can do arbitrarily intricate surgery on your history visually, directly in TortoiseHg. No need to remember in which order to put the arguments to `git rebase` on the command line. Whether you like history surgery or not is a matter of taste, of course – I like it pre-sharing-with-other-devs but definitely not post-sharing.

    The lack of history rewriting seems to be a well-known “fact” about Hg. It’s not true.

    • "No Bugs" Hare says

      I am certainly not an expert on Mercurial, but it _seems_ that it is not _that_ bad. First – let’s clarify my primary concern: it is about messing with remote commits (while changing a commit message is ok, the ability to drop/modify an already-existing commit without a trace is what gives me the creeps).

      With this in mind, there seem to be two relevant techniques in Hg. If we’re speaking about “hg backout” – it seems to “revert” by creating a new commit (i.e. without changing already-committed stuff, which is good – at least we can always get that “reverted” version and re-commit it again, phew). And as for strip – I am not sure whether it really strips existing commits, but at least it is an “extension”, so if we don’t have it enabled – I expect it to be ok.
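
      To illustrate what I mean about “hg backout” (a throwaway local repo; the file name is made up):

      ```shell
      # 'hg backout' undoes a commit by adding a NEW commit on top,
      # leaving the original commit untouched in history
      hg init repo && cd repo
      echo v1 > weapon.cfg && hg add weapon.cfg && hg commit -m "v1"
      echo v2 > weapon.cfg && hg commit -m "v2"
      hg backout -r . -m "back out v2"   # weapon.cfg is back to v1, but "v2" stays in 'hg log'
      ```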

      Where am I wrong?

  7. Hyunjik Bae says

    Nice article!

    There is the Perforce Streams feature as an alternative to the plain Perforce branch feature. Internally, a Perforce Stream is based on the branch mechanism, but it comes with GUI and command-line tools that enforce a stricter branch-merge workflow, plus an intuitive GUI that shows branch status. I have used Perforce Streams for several years with tens of streams (each stream has >50K files) and have had no problems so far.

    I also use SVN branches. If I were to score them:

    SVN branch and merge: 30
    Perforce branch and merge: 30
    Perforce Stream: 80

    A comparison between SVN branches and Perforce Streams (not plain Perforce branches) would be a good supplement to this article.
