Titanfall’s Texture-Gate Rumors: Absurdity or Truth?
Rumors, rumors, rumors. When you hear something from one source, take it with a grain of salt. Two separate sources with the same message – there might be something to it. But when three somewhat dependable sources lay out the details in unison, the best thing to do is usually to listen.
With the recent rumors, started and mutually echoed by thuway, famousmortimer, and cboat, claiming that the textures in the final version of Titanfall will remain unchanged from the alpha version, a lot of controversy has erupted about the accuracy of the information given to our three insiders, and even about whether the dissemination of said information was intentional, meant only to mar Respawn’s upcoming release.
Released alongside a fairly plausible rumor about Titanfall rolling out with 16 multiplayer maps, the Titanfall texture-gate (I guess it’s a thing now) controversy was immediately quashed by the developers, who said that it was most certainly not true:
@Gaming4Fun1 No, that’s not true.
— Respawn (@Respawn) January 30, 2014
HiredGunman10 from the Steam forums recently made the following statement:
“As most of you know, Titanfall runs on the Source Engine.
However, what bothers me is that Source can’t handle large, detailed maps – which Titanfall obviously has.
So what has Valve done to the engine that allows it? Was texture streaming implanted into the engine? Cause I have yet to see any texture pop-in in any of the gameplay videos.”
He also made another interesting statement:
“For the past 10 years since Source’s debut, Valve has never bothered to allow Open maps to run well in the engine. Many of their recent games have been limited by this flaw. Therefore, I was surprised that its finally implanted into the game!”
Yup, sounds like the ravings of a lunatic, right? I mean, the Source engine itself is about 10 years old, and there have been a ton of changes to it in subsequent games that could have fixed the open-world rendering problems the engine has, or had. Respawn’s Drew McCoy even said so himself in an interview with Digital Foundry:
“The thing about the Source Engine when we got it is that we actually branched from Portal 2. It was DX9, very single-threaded and they used the way that engine worked to its best possible potential for Portal. It can’t render that much on-screen. The main thread just can’t push out enough jobs, so we’ve done a huge amount of work. We didn’t choose this engine because it was going to be 60, we chose this engine knowing that we’d be spending the next two years making it fast.
It’s actually a pretty slow engine for showing stuff on-screen. What we have in a level now would run in single digits on what it was before – if you could even get it to load at all. It’s been a huge engineering task, so what we did was put all the engineering [team] on the back-end so design [team] could be up and running at the task, otherwise engineering would have to be creating tools and design would be sitting around twiddling their thumbs. We only have a dozen or so engineers – it’s pretty small for the amount of work they’ve done.”
So it does seem that the developers at Respawn Entertainment worked hard to fix the caching and rendering problems so inherent to the engine: a tool that might not have been the best choice for their vision, but still a better option than creating a brand-new engine from the ground up.
But what about HiredGunman10’s statements? Was there any truth to his diatribe? We did some checking around and found that most of the issues he was talking about are actual, valid issues tied to the Source engine. Check out the following statements about some existing problems with the Source Engine from the Half-Life Wiki page, under “Common Issues”:
“The Source Engine uses a caching system, whereby the loading of certain resources is handled and managed on the fly, rather than in a single operation behind a traditional loading screen. Texture and sound data are the primary areas in which this occurs. Textures are loaded to memory but only moved to the system’s video card when needed and audio files are loaded with an unusual “soundcache” system: only the first 0.125 seconds of each file are pre-cached, and the clip is used to cover the asynchronous buffering of the full sound file in the background when it is first requested.
Both systems keep data in the heap until there is no more room and older resources are flushed out, and when either is held up or otherwise slowed down the engine will either freeze or go into a temporary loop until the data arrives. ‘Stuttering’, or ‘hitching’ as it is sometimes known, is the result of these pauses.”
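The eviction behavior the wiki describes, where resources sit in the heap until space runs out and the oldest ones get flushed, is essentially a least-recently-used cache. Here is a minimal sketch of that idea; every name in it is hypothetical, written for illustration, and not taken from the actual Source engine code:

```cpp
#include <cassert>
#include <cstddef>
#include <list>
#include <string>
#include <unordered_map>

// Hypothetical sketch of an on-the-fly resource cache: resources stay
// in a fixed-size heap budget until it is full, then the least recently
// used entries are flushed to make room for new requests.
class ResourceCache {
public:
    explicit ResourceCache(std::size_t capacityBytes)
        : capacity_(capacityBytes), used_(0) {}

    // Request a resource. Returns true on a cache hit; on a miss the
    // caller would kick off an asynchronous load for the new entry.
    bool request(const std::string& name, std::size_t sizeBytes) {
        auto it = index_.find(name);
        if (it != index_.end()) {
            // Hit: mark as most recently used.
            lru_.splice(lru_.begin(), lru_, it->second.order);
            return true;
        }
        // Miss: flush the oldest resources until the new one fits.
        while (used_ + sizeBytes > capacity_ && !lru_.empty()) {
            const std::string& victim = lru_.back();
            used_ -= index_[victim].size;
            index_.erase(victim);
            lru_.pop_back();
        }
        lru_.push_front(name);
        index_[name] = Entry{sizeBytes, lru_.begin()};
        used_ += sizeBytes;
        return false;
    }

    bool resident(const std::string& name) const {
        return index_.count(name) != 0;
    }

private:
    struct Entry {
        std::size_t size;
        std::list<std::string>::iterator order;
    };
    std::size_t capacity_;
    std::size_t used_;
    std::list<std::string> lru_;                   // front = most recent
    std::unordered_map<std::string, Entry> index_;
};
```

In a real engine the miss path would start an asynchronous load; the “stuttering” and “hitching” the wiki mentions is what happens when the frame has to wait for that load to finish.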
From the Wiki page, it seems that all the trouble the Source engine has rendering large environments stems from the caching system, which is pretty debilitating in its own right. Drew McCoy’s own statements above pointed to this when he said, “It can’t render that much on-screen… It’s actually a pretty slow engine for showing stuff on-screen. What we have in a level now would run in single digits on what it was before – if you could even get it to load at all.”
So how did they compensate for the caching issues, while also rendering bigger environments on top of addressing that problem? Well, they certainly fixed almost all of the issues (as far as we could tell from the Titanfall Alpha), but what about the texture resolution? Was that one of the necessary developmental sacrifices to ensure that the engine runs as smoothly as possible and renders large environments with ease, especially considering they had to standardize the engine’s graphical settings on the (relatively) restricted tech of the Xbox One?
Cboat, thuway, and the others just might be right about the final product’s textures being the same as the Alpha version’s. When Titanfall hits all the yearning fans (including yours truly) on March 11th, we’ll all know for sure.