Fall IDF 2006 - Day 1: Laser FSBs, more Alan Wake, Flash in Vista & DDR3
by Anand Shimpi & Cara Hamm on September 27, 2006 3:16 AM EST - Posted in Trade Shows
Intel Demonstrates DDR3 on Desktops
In a peculiar move, Intel is demonstrating a couple of Core 2 systems using the upcoming Bearlake chipset, due out sometime in the middle of 2007. The biggest differentiating features between the upcoming chipset and current solutions are official 1333MHz FSB support and DDR3 memory support:
The DDR3 being used was standard 240-pin DDR3 memory running at 1066MHz (CL7). We'd expect to see DDR3-1333 by the time the Bearlake chipsets launch next year, offering a 1:1 ratio with the 1333MHz FSB.
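As a rough sanity check on those numbers (our own back-of-the-envelope arithmetic, not Intel's figures; the dual-channel assumption is ours as well), the peak theoretical bandwidths work out as follows:

# Back-of-the-envelope peak bandwidth figures; assumptions noted above.
def bandwidth_gb_s(mega_transfers: float, bytes_per_transfer: int) -> float:
    """Peak theoretical bandwidth in GB/s for a given transfer rate and bus width."""
    return mega_transfers * 1e6 * bytes_per_transfer / 1e9

fsb_1333       = bandwidth_gb_s(1333, 8)      # 64-bit quad-pumped FSB at 1333MT/s: ~10.7 GB/s
ddr3_1066_dual = 2 * bandwidth_gb_s(1066, 8)  # two 64-bit channels of DDR3-1066:   ~17.1 GB/s
ddr3_1333_dual = 2 * bandwidth_gb_s(1333, 8)  # two 64-bit channels of DDR3-1333:   ~21.3 GB/s
print(fsb_1333, ddr3_1066_dual, ddr3_1333_dual)

Even at a 1:1 clock ratio, a single channel of DDR3-1333 already matches the 1333MHz FSB's roughly 10.7GB/s, which is part of why extra memory bandwidth alone is unlikely to deliver large performance gains on this platform.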
The systems were running a combination of Quake 4 (or Doom 3, depending on the system) and a modified version of CPU-Z to verify DDR3 support. Although Intel wouldn't let us peek inside the cases, there was a live webcam pointed at the DDR3 modules to remove any doubt of demo system shenanigans.
Unfortunately, lower power consumption may be the only tangible benefit users get from DDR3; we may not see much of a performance boost, since current-generation DDR2 platforms aren't exactly memory bandwidth limited as it is. Hopefully Intel has learned from its mistakes with the first DDR2 chipsets and will do what it takes to make the Bearlake family of chipsets a more attractive option upon its introduction.
Final Words
Day two of the Fall Intel Developer Forum is set to begin, and there's much more to talk about; we'll report it as soon as we hear it...
16 Comments
drwho9437 - Wednesday, September 27, 2006 - link
"Photonic signaling is in many ways superior to electrical signaling as you can get much higher bandwidths out of an optical bus than you can out of an electrical bus, thanks to photons traveling much faster than electrical fields."
Electric signals are photons too. All EM is photons. You are just talking Giga vs Tera freq. This lets you build more ideal wave guides. But that's all. They are both apples.
stepz - Wednesday, September 27, 2006 - link
Electrical bus doesn't signal with electrons, it signals with an electrical field. Individual electrons can take several minutes to get from one pin to another. Optical bus has a bit less latency thanks to photons traveling faster than electric field (around 0.5ns less for a distance of 30cm/1 foot). It has a lot more bandwidth thanks to lack of interference and immense frequency of the carrier signal.
I assume you mean signaling at the board level. It's not practical to use optical signaling at the chip level; the wavelength of the light used is too long for widespread use in chip-level interconnects.
Caligynemaniac2 - Thursday, September 28, 2006 - link
Actually, optical buses will have more latency than electrical buses. E&M signals propagate in copper wires at ~.9c, while fibre optic cable propagates at ~.6c. Furthermore, there will be inherent latency in converting from an optical signal back to an electrical one. What you get with optical interconnects is both increased bandwidth and increased latency.
This may be offset by the ability of serial interfaces to easily revert back to parallel interfaces when coupled with optical rather than electrical signaling (i.e. it is easy to duplex optical signals and essentially impossible for electrical ones, which meant routing was a nightmare for parallel electrical signals as bandwidth needs increased, and will make routing simple for optical signals).
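As a rough illustration of the propagation figures being discussed (the ~0.9c and ~0.6c numbers come from the comment above; the 30cm trace length is our own assumption), the one-way delay difference over a board-scale distance is on the order of half a nanosecond:

# Rough one-way propagation delay over a 30cm link, using the speeds quoted above.
C = 3.0e8  # speed of light in vacuum, m/s

def delay_ns(length_m: float, fraction_of_c: float) -> float:
    """One-way propagation delay in nanoseconds."""
    return length_m / (fraction_of_c * C) * 1e9

length = 0.30  # ~30cm, a motherboard-scale trace; our assumption
print(delay_ns(length, 0.9))  # copper at ~0.9c: ~1.11ns
print(delay_ns(length, 0.6))  # fiber at ~0.6c:  ~1.67ns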
drwho9437 - Wednesday, September 27, 2006 - link
It's all the same; all that matters for the "speed" of the particle traveling is the characteristic impedance of the waveguide. Signaling bandwidth has to do with how many wiggles you can make in the EM fields, and that is just related to the freq. Now the reason we jump to optical freq after GHz is just because there is no easy way to make the intermediate freq.
JarredWalton - Wednesday, September 27, 2006 - link
Electrical fields vs. electrons... what are we, electromagnetic physicists!? ;)
The point of silicon photonics is to get light down to the chip-level interconnects. It's obviously not there yet. Board-level signaling should already be possible, though not generally practical, and of course system-level signaling has been optical for a while (server-to-server fibre optic connections).
VooDooAddict - Wednesday, September 27, 2006 - link
I'm already looking forward to Core 2 Quad. The news that DDR3 support is also right around the corner makes me want to wait a little longer for DDR3 and the added bandwidth and 1:1 FSB to memory clock.
Sunrise089 - Wednesday, September 27, 2006 - link
While actually having a highly multi-threaded game will be nice, until much more specific performance info is available I see little reason why "gamers [will] start thinking about the move to dual/quad core if they haven't already."
Several problems stand in the way of quad core being ideal for gaming:
1) AnandTech itself reported how difficult it was for a game developer to make their games truly multithreaded, so it remains to be seen how many games actually have the degree of threading present in Alan Wake.
2) The publisher only quoted actual performance use of the physics thread (80% of a standard-clocked C2 Duo), so it's VERY possible that if the other threads use substantially less processing power, the four cores may not be truly needed.
3) Even in this game, the publisher itself admits that simply overclocking the dual-core machine allows for equal performance compared to the stock quad-core, so raw power seems to be what's truly important, not a specific number of cores.
4) The most important performance increase will always be from one core to two, since the overhead of the OS, antivirus program, etc. can be removed from the primary game thread. After that, each additional core will suffer from diminishing returns (a rough Amdahl's law sketch follows this comment).
5) Most importantly, going from dual to quad core costs money, and if the past is any indication, that money would be better spent on a GPU upgrade. In fact, with the near-theoretical doubling of performance with dual core over single, the GPU upgrade is probably a relatively better idea than it's ever been.
Until these issues are resolved, I think that while the game may be impressive, the idea that gamers will want to jump to quad-core is mostly marketing-derived.
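As a rough illustration of the diminishing-returns point in item 4 above, here is a quick Amdahl's law sketch; the 75% parallel fraction is a made-up figure for illustration, not anything quoted by Remedy or Intel:

# Amdahl's law: speedup from n cores when only a fraction p of the work parallelizes.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

p = 0.75  # hypothetical: 75% of a frame's CPU work can be spread across cores
for n in (1, 2, 4):
    print(n, round(speedup(p, n), 2))
# 1 core: 1.0x, 2 cores: 1.6x, 4 cores: 2.29x

Under that assumption, the jump from one core to two buys most of the benefit, and going from two to four adds noticeably less, which matches the commenter's expectation.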
Anemone - Wednesday, September 27, 2006 - link
I find it disturbing that there isn't much comparison of that same Alan Wake game on a Core 2 Duo. I do realize that there is a push to highlight the benefits of a quad processor, but are they now narrowing the window for a dual core being the sweet spot to, say, just 2007?
See, inadvertently, they are likely smashing the sales of Core 2 Duos. A lot of folks are wondering if a Core 2 Duo is going to be "enough" and, if so, for how long it will be enough. Intel has built a lot of "transition" chips over the years and they often have had relatively short useful lives. Is that the reason the Core 2 Duo came out so reasonably priced? Is it going to be a small one-year chip that is outdated, and potentially badly so, by the end of 2007? If so, why buy one?
Some assurance would have helped a lot if we'd seen a comparison of how Alan Wake and the benchmarked quad-processing programs ran on Core 2 Duos. As it is, a lot of folks are getting nervous that they have bought, or might be buying, into a dead technology of "just" a dual core.
I sure hope Intel addresses this market concern before the show concludes. If not, the holidays could be a very rough season for Core 2 Duo sales.
JarredWalton - Wednesday, September 27, 2006 - link
It's IDF and Intel is pimping new technology. I would say it's pretty reasonable to assume that quad cores are not at all required. Stating that HyperThreading can run the game with lower detail is also sort of funny, as HT only gives about 10% more performance. Let's see... Athlon 64 single core 4000+ is about 20-30% faster than the best HyperThreading Pentium 4 chips, but HT is enough while a fast single core is not? I don't really buy it, although "enough" seems to be a stretch at best. I will wager that Remedy will work hard to make sure the game at least runs on single core setups, as that is still a very large market segment.
Another thought: audio often uses maybe 10-20% of the CPU time in a game. So physics + audio is one core. The streaming and terrain tessellation sounds like maybe half a core at best, and the rendering would probably use the rest of the available power and then some. Remember that Xenon only has 3 cores available, all without OoO execution (Out of Order), so it's reasonable to assume 3 OoO cores will be more than enough, and in fact 2 cores is probably going to be fine.
cmdrdredd - Friday, September 29, 2006 - link
"It's IDF and Intel is pimping new technology. I would say it's pretty reasonable to assume that quad cores are not at all required. Stating that HyperThreading can run the game with lower detail is also sort of funny, as HT only gives about 10% more performance. Let's see... Athlon 64 single core 4000+ is about 20-30% faster than the best HyperThreading Pentium 4 chips, but HT is enough while a fast single core is not? I don't really buy it, although "enough" seems to be a stretch at best. I will wager that Remedy will work hard to make sure the game at least runs on single core setups, as that is still a very large market segment."
This is what I think too; they would dig their own grave if they released a game that performed pitifully on a 2.6GHz single-core system. A lot of people still use those HP and Dell systems their parents bought them to game on (sometimes with an upgraded GPU).
There's no way they can convince me that all 4 cores are needed for any game. I can see 2 because that's becoming mainstream, but to commit suicide by relegating users to abysmal performance is bad news.