Came across this nifty little PDF file today, talking about the graphics engine, some of the problems they ran into, and how they got around them.
Heh. This article made me glad I didn't attempt to get an sc2 beta key, because there's no way my computer would be able to provide the raw computational power needed to run these algorithms. The player probably wants to play this game at around 40 to 60 FPS, which means that everything described in this article, and plenty more, has to fit in roughly 20 milliseconds per frame. I'll just take the example they gave about rendering the depth of field effect used in the campaign's passive scenes. To achieve this effect, sc2 must effectively:

- Make a full-screen pass (1024x768 or more pixels read, which is around 2.2 MB of data)
- Run a blur algorithm over each of those pixels to generate a medium-blurred image (another 2.2 MB)
- Run the same algorithm with different parameters to create a maximum-blurred image (0.5 MB)
- Blur the depth map
- Create a downscaled blur image

So every 20 milliseconds, sc2 creates about 8 megabytes of image data, runs a Gaussian blur on three fourths of it, then does the alpha blending... just for the depth of field effect. Anyone playing this better have a good graphics card!
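For a rough idea of what that final blend step might look like, here's a minimal sketch in Python/NumPy. It assumes the engine produces a sharp frame plus the medium- and maximum-blurred versions, then mixes them per pixel using the blurred depth map as a blend weight. All the names and the 0-to-1 blur-weight convention are my own assumptions; this is just an illustration of the technique the article describes, not Blizzard's actual code.

import numpy as np

def depth_of_field(sharp, medium_blur, max_blur, blur_weight):
    # sharp, medium_blur, max_blur: (H, W, 3) float images in [0, 1]
    # blur_weight: (H, W) floats in [0, 1] from the blurred depth map,
    #              0 = in focus, 1 = maximally blurred (assumed convention)
    w = blur_weight[..., np.newaxis]          # broadcast over the RGB channels
    lo = np.clip(w * 2.0, 0.0, 1.0)           # 0..0.5 blends sharp -> medium
    hi = np.clip(w * 2.0 - 1.0, 0.0, 1.0)     # 0.5..1 blends medium -> max
    mid = sharp * (1.0 - lo) + medium_blur * lo
    return mid * (1.0 - hi) + max_blur * hi

# Back-of-the-envelope check of the bandwidth numbers above (3 bytes/pixel):
print(1024 * 768 * 3 / 2**20)  # ~2.25 MB per full-screen pass

Summing the passes in the list at that rate lands roughly in the ballpark of the 8 MB per frame estimated above.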
I also found this interesting little tidbit about the system requirements. http://www.starcraft2forum.org/index.php?categoryid=8&p2_articleid=4
Actually, it has, and I'm sure it has been posted numerous other times as well; a simple three-second search turned up this: http://www.starcraft2forum.org/forums/showthread.php?t=5835&highlight=siggraph Either way, it's old news no matter how you look at it, and in my opinion that makes it irrelevant. That's fine if you or others disagree; I'm just stating my opinion.