I hate to see a developer get burned for speaking frankly to the public about the issues they experienced during development; but the sheer idiotic rage of PC fanbois when John Carmack stands up and speaks his mind has to be seen to be believed.
As far as I can tell, there are two (actually three) problems with RAGE:
1. Six years ago, when they began developing the engine, Mr Carmack made the (incorrect) call that PCs would not be significantly more powerful than consoles by the time RAGE was released - a decision he has since spoken about at length.
2. id was anticipating that AMD would release a graphics driver with the OpenGL extensions and performance needed to handle RAGE's code before the game shipped. AMD did not - in fact, they made things significantly worse by releasing Battlefield 3 beta drivers which bundled an 'ancient' version of their OpenGL stack.
Sure, John isn't helping things by making public statements prone to misinterpretation, but he has stayed consistently on message about this.
The third problem is the technology itself. The megatexture technology uses virtual texturing, in much the same way that processes on your computer use virtual memory, so that the actual texture can be far larger than what the graphics card can fit in its memory - with megatexturing it runs to tens of gigabytes on disk. Judging by the patch notes and graphics tweak guides, the texture is also compressed, with both the GPU and CPU tasked with decompressing it on the fly, and preference given to maintaining 60 FPS. What this means in practice is that you cannot compare your computer's performance running RAGE to any other game you've played (even ignoring the additional complication that id is pretty much the only company using OpenGL instead of DirectX).
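To make that concrete, here is a minimal sketch of the lookup side of a virtual texture (in Python, with invented names and sizes - this is not id's actual code): if the sharp page isn't resident, the renderer queues it for loading and samples a blurrier mip level in the meantime, which is exactly the 'pop-in' players see.

    # Minimal sketch of a virtual-texture page lookup (hypothetical; names
    # and sizes are invented, not idTech5's). The full texture is split
    # into fixed-size pages per mip level, and only a small cache of pages
    # is resident in GPU memory at any one time.

    PAGE_SIZE = 128           # texels per page side (assumption)
    COARSEST_MIP = 8          # tiny top-level mip, assumed always resident

    pending = set()           # pages queued for disk read + decompression
    resident = {(COARSEST_MIP, 0, 0): 0}  # (mip, page_x, page_y) -> slot

    def lookup(u, v, mip=0):
        """Find a resident page for texel (u, v), falling back to coarser
        (blurrier) mip levels until one is found - the 'pop-in' effect."""
        while mip < COARSEST_MIP:
            key = (mip, (u >> mip) // PAGE_SIZE, (v >> mip) // PAGE_SIZE)
            if key in resident:
                return resident[key]  # page is in GPU memory: sample it
            pending.add(key)          # queue it for loading from disk...
            mip += 1                  # ...and try the next-lower resolution
        return resident[(COARSEST_MIP, 0, 0)]  # always-resident fallback

    slot = lookup(70_000, 22_000)  # not yet resident -> coarse fallback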
The reason boils down to which parts of the computer get stressed enough to become a bottleneck. In a traditional game, levels are hand-designed so that textures can be loaded (relatively) far in advance, and disk performance is almost never an issue. Occasionally it is - you get pop-in, typically at a scene start - but that may be as much an issue with the I/O bandwidth between main memory and the graphics card as between disk and main memory. With RAGE, every part of the computer becomes a potential performance problem: disk, disk cache latency, I/O between disk and main memory, CPU, GPU, I/O between main memory and CPU cache, CPU cache misses, I/O between GPU and main memory and/or CPU cache, GPU memory, other processes running on the system, etc. It is highly likely that something you're not aware of is slowing down one of the areas I've listed, and a system which stresses all of them will only run as well as the slowest part. To take one example: if your non-SSD drive was relatively full before you installed RAGE, not only will you have disk fragmentation, but the files will be written to the inner tracks of the platter, where the data passes under the drive head more slowly, which means the data will be read from the drive more slowly.
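To see why the slowest link dominates, here's a back-of-the-envelope model (every number below is invented for illustration, not a measurement of RAGE):

    # Toy model: streaming texture pages each frame is gated by the
    # slowest stage of the pipeline. All throughput numbers are invented.

    stage_mb_per_s = {
        "disk read (fragmented HDD)": 80,
        "decompress on CPU":          400,
        "upload over PCIe":           3000,
        "transcode on GPU":           2000,
    }

    needed_mb = 2.0              # texture data needed this frame (assumption)
    frame_budget_ms = 1000 / 60  # 16.7 ms per frame at 60 FPS

    slowest = min(stage_mb_per_s, key=stage_mb_per_s.get)
    stream_ms = needed_mb / stage_mb_per_s[slowest] * 1000

    print(f"slowest stage: {slowest}")
    print(f"{stream_ms:.0f} ms to stream {needed_mb} MB vs a "
          f"{frame_budget_ms:.1f} ms frame budget -> blurry fallbacks")

On these made-up numbers the disk alone needs 25 ms of a 16.7 ms frame, so the engine has no choice but to show low-res pages while it catches up; speed up the disk and some other stage becomes the limit.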
(The consoles have a significant advantage here, because everything except the game can get out of the way - for instance, you are not running anti-virus software on your Xbox 360 or PS3; and the developers can write to 'bare metal' bypassing the abstraction layer required to support multiple graphics cards.)
So what possible reason could id have for going with such a high-risk design? Well, I'm guessing for much the same reason that every operating system now uses virtual memory - it enormously simplifies development. Under older memory management schemes, such as classic Mac OS 9's unprotected, handle-based memory model, and (heaven forbid) DOS's segmented memory model, you had to worry about a whole lot more variables: the maximum amount of memory you could allocate at one time, whether a pointer was 'near' or 'far', whether another program was accessing the same memory, and so on. With virtual memory, the memory is always yours, the address space is flat, and you can effectively have as much as you want (within the limits of the address space) provided you don't try to access all of it at once - the operating system handles the underlying details (relatively) transparently.
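The analogy is close enough to sketch. Under demand paging the program just uses flat addresses, and pages are only backed by physical storage the first time they are touched (a toy model, not any real kernel's bookkeeping):

    # Toy demand paging: a flat virtual address space where pages are
    # only backed by 'physical' storage on first touch. Sizes invented.

    PAGE_SIZE = 4096
    physical = {}    # virtual page number -> page bytes
    faults = 0

    def read_byte(vaddr):
        """The program just uses flat addresses; paging is invisible."""
        global faults
        vpn, offset = divmod(vaddr, PAGE_SIZE)
        if vpn not in physical:     # page fault: back the page on demand
            faults += 1
            physical[vpn] = bytearray(PAGE_SIZE)
        return physical[vpn][offset]

    # Touch a few addresses scattered across a 'huge' address space:
    for addr in [0, 10_000, 5_000_000_000, 1]:
        read_byte(addr)
    print(f"{faults} page faults for 4 reads")  # 3: page 0 was reused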
With megatexturing (and other virtual texturing methods) you don't have to worry about a texture budget for each discrete level: you just go ahead and design whatever you want, and the underlying engine handles the loading and unloading of the parts of the texture that are within (or close to) the actual field of view.
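A hypothetical outline of that per-frame loading and unloading (the 'feedback pass' and every name here are my invention, not idTech5's): only pages near the view are ever fetched, and everything else simply ages out of a least-recently-used cache, so no per-level budget is ever authored.

    # Hypothetical per-frame streaming loop for a virtual-texture engine.
    from collections import OrderedDict

    class PageCache:
        def __init__(self, capacity):
            self.capacity = capacity
            self.pages = OrderedDict()        # page id -> decoded data

        def ensure(self, page):
            if page in self.pages:
                self.pages.move_to_end(page)  # keep recently seen pages hot
                return
            if len(self.pages) >= self.capacity:
                self.pages.popitem(last=False)    # evict least-recently-seen
            self.pages[page] = f"decoded:{page}"  # stand-in for disk+decode

    def visible_pages(camera, radius=1):
        # Stand-in for a real feedback render: pages within 'radius'
        # of the camera's position on the virtual texture.
        cx, cy = camera
        return {(cx + dx, cy + dy)
                for dx in range(-radius, radius + 1)
                for dy in range(-radius, radius + 1)}

    cache = PageCache(capacity=16)
    for camera in [(0, 0), (1, 0), (2, 0)]:  # the camera moves freely;
        for page in visible_pages(camera):   # only what's in or near view
            cache.ensure(page)               # is ever loaded or retained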
If you've been following John Carmack's public statements in the lead up to RAGE's launch, and have enough understanding of how computers work, none of the above should be a surprise to you.
Equally, if you have enough of an understanding of how people work, none of the subsequent overreaction should be much of a surprise either. But the subsequent un-PR-filtered statements by Mr Carmack are doing nothing but adding to that overreaction. Which is unfortunate in the extreme, because from all appearances the technology is the right approach to solving the problem*. Almost as unfortunate as ZeniMax's decision not to license it, and their likely prohibition on releasing it as open source at some point in the distant future.
*Except that apparently ray tracing is.
I think six years ago megatexturing was the logical way to go. Unfortunately, technology moved on and modern hardware is now better suited to other techniques. As John said in his keynote earlier this year, RAGE took way longer to develop than they'd intended. If RAGE had come out 2-3 years ago, would it have been judged as harshly?
The real shame is that iD are being blamed for issues with the OpenGL drivers, which are beyond their control.
I recommend watching Carmack's Quakecon talk for the technical details. http://www.youtube.com/watch?v=4zgYG-_ha28&feature=relmfu
Minor nitpick: "id Software" is not written "iD" and is not pronounced "Idea".
http://en.wikipedia.org/wiki/Id_Software
Thanks. Fixed. I'm not sure where you get the pronunciation from...
(Must have been mixing up my Apples with oranges).
Unfortunately, there are other engines that do the job better - look at UE/UDK, which does a good job of virtual texturing and caching (the initial load is fuzzy in the same way, but it doesn't forget textures outside the field of view the way idTech5 does). Note that idTech5 (in Rage, at least) seems not to support dynamic lighting; everything is pre-baked in Rage.
Also, MegaTexturing is not new; it was in Splash Damage's ET: Quake Wars (idTech4), and the issues Rage has now (on current hardware) were not an issue in ET:QW at all. What's the big difference then?
I remember reading (but cannot find right now) an interview in which Carmack said (paraphrased) that the texture pop would not be fixed. In other words, it's effectively a limitation of the engine. Now, I'm not seeing major issues with texture pop, but I've used a .cfg tweak for "higher level textures", and am using GPU transcoding (not available on ATi cards - why not? Because it's done using CUDA). There is a fraction of a second where the texture pop appears (most noticeable in doorways), but not enough that most people would notice.
I don't think idTech5 is particularly advanced, not when compared to the other major engine (UE/UDK, which is not only used for FPS games, but has also been used for pinball, racing, MMOs, and more). I think this is partly down to Carmack's arrogance (I think he has an element of "we are the best, and we will not be beaten" - not as much as some, however), but mostly down to fans placing him (and id) on a pedestal. From there, the only way is down, and with Rage, the mighty have fallen.
[Full disclosure: some of the comments below are taken from discussions I've had on other forums]
IMO MegaTexture's strength is also its weakness. Artists feel liberated, of course, since they can paint unique textures over every square inch without worrying about memory constraints. But the technology doesn't magically increase available VRAM, so when the volume of texture data grows large they must rely on the technology falling back on low-res textures as needed.
When artists are forced to deal with memory constraints up front, they can make artistic decisions about how to reduce memory usage. They could intelligently decide, "Yeah, we probably need to reuse or repeat some textures here," or perhaps, "This element of scenery looks nice but we don't have the memory budget for it, so let's cut it." But when these decisions are handed to the cold, unfeeling algorithms of the MegaTexture engine, the results aren't always up to modern standards of visual quality.
I totally agree with Andrew that virtual texturing is a reasonable approach. But just as operating systems have evolved over the years, I think this virtual texture tech could use a few more iterations.
To comment on id's recent history, it seems like they're just jumping between gimmicks without ever settling on a good general-purpose engine. I wonder if they ever considered that their engines might be over-optimized... they're extremely good at a few things, but inflexible and difficult to extend (compared to the competition, I mean).
The word id is Latin for "it": the term used in Freud's psychology for the instinctive part of the psyche that underlies the ego.
Hence the pronunciation.
http://oxforddictionaries.com/definition/id?rskey=fe55Uk&result=3