Valve Hardware Day 2006 - Multithreaded Edition
by Jarred Walton on November 7, 2006 6:00 AM EST - Posted in Trade Shows
The Future of Gaming?
The first thing that Valve had to do was look at the available threading models: how can work be distributed across the cores? Having decided on hybrid threading, the next step was to determine what sort of software tools were needed and to create those tools as necessary. Most of that work should now be done or nearing completion, so the big question is this: with the threading in place, how is Valve going to apply the power available? Up to this point we have primarily been concerned with what multithreading is and how Valve has attempted to implement the technology. All this talk about multi-core support sounds good, but without a compelling need for the technology it really isn't going to mean much to most people.
We've already seen quite a few games that are almost entirely GPU bound, so some might argue that the GPU needs to get faster before we even worry about the CPU. As Valve sees things, however, the era where pretty visuals are the main selling point is coming to an end. We have reached the point where, in terms of graphics, most people are more than satisfied with what they see. Games like Oblivion look great, but it's still very easy to tell that you're not in a real world. This does not mean that better graphics are unimportant, but Valve is now interested in the rest of the story and what can be done beyond graphics. Valve also feels that their Source engine has traditionally been more CPU limited anyway, so they were very interested in techniques that would allow them to improve CPU performance.
Before we get to the stuff beyond graphics, though, let's take a quick look at what's being done to improve graphics. The hybrid threading approach to the rendering process can be thought of as follows:
At present, everything up to the drawing stage of the rendering pipeline is essentially CPU bound, and even the drawing itself can be at least partially CPU bound with tasks such as generating dynamic vertex buffers. The second image above shows the revised pipeline and how you might approach it. For example, spending CPU time building the vertex buffer makes the GPU more efficient, so even in cases where you are mostly GPU limited you can still improve performance by applying additional CPU power. Another graphics-related item is the animation and bone transformation work that needs to be done before a scene is rendered. Bone transforms can be very time consuming, and as you add more creatures the CPU limitation becomes more and more prevalent. One solution is simply to cheat, whether by reducing the complexity of the animations, cloning one model repeatedly, or using other tricks; with more processor power, however, it becomes possible to do the full calculations for more units.
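To make the vertex buffer point a bit more concrete, here is a minimal sketch of the idea in generic modern C++. This is purely illustrative and not Source engine code; the names Vertex and fillVertexData are hypothetical, and the engine itself predates these standard library threading features. The expensive buffer-building work is handed to another core while the submitting thread carries on, and it only blocks when the data is actually needed:

```cpp
#include <future>
#include <vector>

// Hypothetical vertex layout; a real engine would match its shader inputs.
struct Vertex { float x, y, z, u, v; };

// Hypothetical stand-in for the expensive CPU work: expanding the visible
// meshes into a flat array ready to be copied into a dynamic vertex buffer.
std::vector<Vertex> fillVertexData(const std::vector<int>& visibleMeshIds)
{
    std::vector<Vertex> out;
    out.reserve(visibleMeshIds.size());
    for (int id : visibleMeshIds)
        out.push_back(Vertex{static_cast<float>(id), 0.0f, 0.0f, 0.0f, 0.0f});
    return out;
}

int main()
{
    std::vector<int> visibleMeshIds{1, 2, 3, 4};

    // Hand the buffer build to another core...
    auto pending = std::async(std::launch::async, fillVertexData, visibleMeshIds);

    // ...while this thread carries on with other per-frame work
    // (visibility culling, render state setup, and so on).

    // Block only when the data is actually needed for the draw call.
    std::vector<Vertex> vertices = pending.get();
    return vertices.empty() ? 1 : 0;
}
```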
A specific Half-Life example that was given is a scene with 200 Combine soldiers standing in rank; animating that many units requires a huge chunk of CPU time. All of the bone transformations can be done in parallel, however, so more available CPU power can translate directly into more entities on screen at the same time.
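As a rough illustration of why this scales so well, each character's skeleton can be transformed independently of every other's, so the set of characters can simply be divided across however many cores are available. The sketch below is hypothetical generic C++ rather than Valve's code, with the real animation math replaced by a trivial placeholder:

```cpp
#include <algorithm>
#include <thread>
#include <vector>

// Placeholder for a character's skeleton: a flat array of bone matrix values.
struct Character { std::vector<float> boneMatrices; };

// Hypothetical per-character work; stands in for the real animation math.
void transformBones(Character& c)
{
    for (float& m : c.boneMatrices)
        m = m * 0.5f + 0.5f;
}

// Split the characters evenly across worker threads. Each character's skeleton
// is independent of every other's, so no locking is required.
void transformAll(std::vector<Character>& characters, unsigned numThreads)
{
    numThreads = std::max(1u, numThreads);
    const size_t chunk = (characters.size() + numThreads - 1) / numThreads;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < numThreads; ++t) {
        const size_t begin = t * chunk;
        const size_t end = std::min(characters.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back([&characters, begin, end] {
            for (size_t i = begin; i < end; ++i)
                transformBones(characters[i]);
        });
    }
    for (std::thread& w : workers) w.join();
}

int main()
{
    // 200 "soldiers", each with 30 bones stored as 4x4 matrices.
    std::vector<Character> soldiers(200, Character{std::vector<float>(30 * 16, 1.0f)});
    transformAll(soldiers, std::thread::hardware_concurrency());
    return 0;
}
```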
With more computational power available to do animations, the immersiveness of the game world can also be improved. Right now, the amount of interaction that most creatures have with the environment is relatively limited. If you think about the way people move through the real world, they are constantly bumping into other objects or touching other objects and people. While this would be largely a visual effect, the animations could be improved to show people actually interacting with the environment, so to an onlooker someone running around the side of a house might actually reach out their hand and grab the wall as they turn the corner. Similarly, a sniper lying prone on the top of a hill could actually show their body adjusting to the curvature of the ground rather than simply being a standard "flat" prone position. Two characters running past each other could even bump and react realistically, with arms and bodies being nudged to the side instead of mysteriously gliding past each other.
One final visual aspect that CPU power can influence is the rendering of particle systems. Valve has given us a benchmark that runs through several particle system environments, and we will provide results for it later. How representative of real-world gameplay the benchmark is remains to be seen, as it is more of a technology demonstration at present, but the performance increases are dramatic. Not being expert programmers, we also can't say for sure how much of the work being done on the CPU could be handled more efficiently elsewhere.
Benchmark aside, particle systems can be used to simulate things like flames and water more realistically. Imagine a scene where a campfire is extinguished by dynamically generated rain rather than a canned animation: you could actually see small puffs of smoke and water as individual drops hit the fire, and you might even be able to kick the smoldering embers and watch individual sparks scatter across the ground. The goal is to create a more immersive world, and the more realistic things look and behave, the more believable the environment. Is all of this necessary? Perhaps not for the conventional games we're used to, but it should certainly open up new gameplay mechanics, and that's rarely a bad thing.
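For readers unfamiliar with the underlying mechanics, a particle system is essentially a large array of independent particles that gets integrated forward every frame. The sketch below is a hypothetical, simplified update step in generic C++ and not anything from Source; because each particle depends only on its own state, the loop can be split across cores in exactly the same way as the bone transforms above, which is where the extra CPU power pays off:

```cpp
#include <vector>

struct Particle {
    float x, y, z;     // position
    float vx, vy, vz;  // velocity
    float life;        // seconds of life remaining
};

// One simulation step: apply gravity, integrate motion, and age each particle.
void updateParticles(std::vector<Particle>& particles, float dt)
{
    const float gravity = -9.8f;
    for (Particle& p : particles) {
        p.vy += gravity * dt;
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
        p.life -= dt;
    }
    // Expired particles (life <= 0) would normally be recycled here.
}

int main()
{
    // 10,000 sparks launched with a slight upward velocity.
    std::vector<Particle> sparks(10000, Particle{0, 0, 0, 1.0f, 5.0f, 0, 2.0f});
    for (int frame = 0; frame < 60; ++frame)
        updateParticles(sparks, 1.0f / 60.0f);
    return 0;
}
```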
55 Comments
yacoub - Wednesday, November 8, 2006 - link
sorry about that. got a little too excited.

Ruark - Tuesday, November 7, 2006 - link
Page 6: ". . . everything crawls to a slow."

duploxxx - Tuesday, November 7, 2006 - link
page 8 test setup, clear a cut/paste job... all of the cpu's are the same Athlon. put an allendale in the benchmark, lot's of people want to see how cache related this multithread sw is.
Valve talks about 64-bit... tests are 32bit? Since some competitors are talking about 64-bit code in there gaming also, should be interesting to see what the difference is vs 32bit.

JarredWalton - Tuesday, November 7, 2006 - link
Sorry 'bout that - I've fixed the CPU line. All systems were tested at stock and 20% OC. The problem with Allendale is that it has different stock clock speeds, so you're not just comparing different CPU speeds (unless you use a lower multiplier on Conroe). Anyway, this is a first look, and the next CPU article should have additional benches.
The tests are all 32-bit. This is not full Episode 2 code, and most people are waiting for Vista to switch to running 64-bit anyway. All we know is that Episode 2 will support 64-bits natively, but we weren't given any information on how it will perform in such an environment yet.

brshoemak - Tuesday, November 7, 2006 - link
Nice to know where things are headed. Great article.
Jarred, 2nd page, 2nd paragraph should be 'heating the oven' - although quite funny as is, you may want to keep it ;)

JarredWalton - Tuesday, November 7, 2006 - link
Darn speech recognition. And "b" and "h" looked close enough that I missed it. Heh. No ovens were harmed in the making of this article.

MrJim - Tuesday, November 7, 2006 - link
This was a very interesting article to read, the future looks bright for us with multi-core systems and Valve games. Excellent work Mr Walton!

timmiser - Tuesday, November 7, 2006 - link
Shoot, I didn't make it past the dinner description. Got too hungry!

George Powell - Tuesday, November 7, 2006 - link
I quite agree. Top notch article there. It is great to see how Valve are committed to giving us the best gaming experience.

Regs - Tuesday, November 7, 2006 - link
The future looks bright for people willing to buy valve and multi-core CPUs!!