Wednesday, June 30, 2010

Crytek aims for the cloud, sparse voxel octrees and sparse surfel octrees

Crytek's HPG 2010 presentation is available at http://www.crytek.com/fileadmin/user_upload/inside/presentations/2010_HPG_Keynotes/Notes.ppt According to the slides, Crytek is already using sparse voxel octrees in its game asset pipeline. Because they are Crytek, they are also researching a variation on the SVO called the sparse surfel octree (I vaguely remember surfels from Michael Bunnell's point-based ambient occlusion and indirect lighting technique in GPU Gems 2). I wonder what that is going to look like.
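
To picture what an SVO actually stores: only occupied octants get child nodes, which is what makes the structure "sparse" and lets it encode insane geometric detail compactly. A minimal node layout sketch (the field names and layout are my own illustration, nothing from Crytek's slides):

```cpp
#include <cstdint>

// Minimal sparse voxel octree node (illustrative layout only).
// Only occupied children are stored, contiguously in a flat array.
struct SVONode {
    uint32_t childPtr;   // index of this node's first child in the node array
    uint8_t  childMask;  // bit i set = octant i has a child (8 octants)
    uint8_t  leafMask;   // bit i set = octant i is a leaf voxel
    uint8_t  normal[3];  // quantized surface normal (compressed in practice)
    uint8_t  color[3];   // quantized albedo
};

// Index of octant 'i' within the contiguous child block: count how many
// occupied octants precede it (GCC/Clang popcount builtin).
inline int childOffset(uint8_t childMask, int i) {
    return __builtin_popcount(childMask & ((1u << i) - 1u));
}
```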

There are a few slides on server side rendering, which could "drastically change the direction" of the real-time rendering pipeline.

Other interesting tidbits:
- a configurable hardware scheduler which could make raytracing much faster
- Larrabee 2 still in the race for next-gen consoles?
- DX11 Compute Shaders suck
- Perception-driven graphics are the key to more efficient use of rendering resources (this is a very interesting future direction)

DysKinect

This one made me LOL:

Tuesday, June 29, 2010

Tom's Hardware review of OnLive

Tom's Hardware review: OnLive Cloud-Based Gaming: Is This the End of High-End PCs?
Interesting review, but the assumption in the article's title is already fundamentally flawed: high-end PC gaming is already nearing its end, and that has nothing to do with cloud gaming and everything to do with consoles. The article's conclusion is that OnLive doesn't come close to displacing a high-end PC for gaming, but what the author seems to forget is that game publishers - not PC gamers - will ultimately decide whether cloud gaming succeeds, and they also have the power to kill off high-end PC gaming in no time (as they have been doing for the last 5 years).

Gone are the days when I was eagerly anticipating cutting-edge PC games like Doom III (before the name change to Doom 3), Half-Life 2 and Unreal 2. That was about 8 years ago. The last PC game worthy of being called cutting-edge was Crysis, and that came out 3 years ago. Consoles have become the main focus of id Software, Epic Games and now also Crytek. For a graphics enthusiast like myself, this is a very unfortunate trend, dictated entirely by economics.

So in essence, high-end PC gaming is already dead. Crysis still beats every other PC game (ported from consoles) in advanced effects. Cranking up resolution, AA and AF settings in Modern Warfare 2 will not change its poor lighting and shadowing. I can't wait until the consoles, aka the high-end PC gaming killers, are made obsolete themselves by cloud gaming. Considering the enthusiasm of game publishers and developers for the cloud (e.g. Crytek is a big proponent of server-side rendering, cf. Yerli's presentation at HPG 2010), this might happen much sooner than expected, just like stereoscopic 3D is taking console gaming by storm.

UPDATE: CNET also has an excellent review of OnLive up, with an analysis of the benefits of cloud gaming for all actors in the playing field http://news.cnet.com/8301-17938_105-20009033-1.html

Monday, June 28, 2010

iray officially in the open with Bunkspeed Shot

Bunkspeed Shot (final version) was officially released last week. More importantly, it also means that the long-awaited iray, which powers Shot, is finally available to everyone (it should have been here much earlier with 3ds Max 2011, but was not integrated for some obscure reason).

Bunkspeed SHOT press release

You can download a 30-day trial demo at http://bunkspeed.com/shot/demo/index.html (don't bother if you don't have a CUDA-enabled card with at least 1GB of VRAM, because otherwise the software defaults to CPU-only rendering - a baffling limitation if you ask me).

Funnily enough, the press release makes it look like you'll need a Quadro or Tesla to run the software, but a GeForce should do fine and is probably faster. The amount of GPU memory will be the real deciding factor, and Octane has shown that you can do a lot within a 1GB VRAM budget (it even has procedural textures now, which take up almost no memory, as the sketch below illustrates).
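
The reason procedural textures are nearly free memory-wise is that the "texture" is just a small function evaluated on the fly at each shading point, instead of megabytes of stored bitmap. A minimal checker example (my own illustration, not Octane's implementation):

```cpp
#include <cmath>

struct Color { float r, g, b; };

// Procedural checker texture: the entire texture is this function plus
// two colors and a scale - a few bytes instead of a stored bitmap.
// (Illustrative sketch, not Octane's actual code.)
Color checker(float u, float v, float scale, Color a, Color b) {
    int cu = static_cast<int>(std::floor(u * scale));
    int cv = static_cast<int>(std::floor(v * scale));
    return ((cu + cv) & 1) ? a : b;  // alternate colors per grid cell
}
```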

Waiting for V-Ray GPU... hopefully another "Siggraph surprise" :-D

Saturday, June 26, 2010

Demo of Brigade real-time path tracer out!

Available at http://igad.nhtv.nl/~bikker/

Anyone (even remotely) interested in real-time raytracing who owns a CUDA-enabled GPU or a powerful CPU must definitely try this excellent demo! It works with GeForce 8000 series cards and upwards, but can also run CPU-only if you don't have a CUDA card. (UPDATE: some people on XP machines cannot run the program because of a msvcrt.dll error. Removing opengl32.dll from the Brigade folder seems to solve the problem.) There are 4 different scenes to choose from, some animated with "planes" flying around. You can simply edit "scene.txt" to change the scene, resolution, samples per pixel and so on. This is a video of one of the scenes: http://www.youtube.com/watch?v=qC2zKIqttzk

It's an amazing piece of software with huge potential. Afaik, the CPU does bidirectional path tracing while divergent rays are path traced on the GPU. Scenes and materials are very simple, but I am nevertheless stunned that real-time path tracing with animated objects is possible today. (Bidirectional) path tracing solves most of the limitations of other real-time GI methods, which rely on image-space techniques or can only be used in diffuse and semi-glossy scenes. Speed is the only limit. I can't wait to see Brigade run on a cluster of PCs each containing multiple Fermis (as is suggested in the readme file). Looking forward to the games that the Dutchies will produce with this technology ;-)
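
For the curious: the core of a path tracer is a surprisingly small loop - trace a ray, accumulate emission, pick a random bounce, repeat. A minimal diffuse-only sketch (Vec3, Ray, Hit, Rng, intersect() and sampleHemisphere() are assumed helpers here, nothing Brigade-specific):

```cpp
#include <algorithm>

// Minimal diffuse path tracing loop (illustrative sketch).
Vec3 trace(Ray ray, Rng& rng) {
    Vec3 radiance(0, 0, 0);
    Vec3 throughput(1, 1, 1);  // product of surface reflectances so far
    for (int depth = 0; depth < 8; ++depth) {
        Hit hit;
        if (!intersect(ray, hit))                // ray escaped the scene
            return radiance + throughput * skyColor(ray.dir);
        radiance += throughput * hit.emission;   // path hit a light source
        throughput *= hit.albedo;                // Lambertian reflectance;
        // (the cosine-weighted pdf below cancels the BRDF's cos/pi terms)
        // Russian roulette: kill dim paths probabilistically, stays unbiased
        float p = std::max({throughput.x, throughput.y, throughput.z});
        if (rng.uniform() > p) break;
        throughput /= p;
        // cosine-weighted bounce direction around the surface normal
        ray = Ray(hit.pos, sampleHemisphere(hit.normal, rng));
    }
    return radiance;
}
```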

Thursday, June 24, 2010

Cloud gaming just works

The last couple of days I have been reading a lot of impressions from gamers who have tried the OnLive service, and I must say I'm surprised at the amount of positive feedback. Many people don't seem to perceive any lag, and the ones who do don't mind it too much; it never makes the games unplayable. I am stunned to read that OnLive works so well. I've been hoping that cloud gaming would work great, but this is even better than I had expected. I thought the service would initially be plagued by major lag fluctuations, stuttering and dropped connections, but everything seems fine so far.

When cloud gaming really catches on and other services like OTOY and Gaikai join the battlefield, these game clouds will have the capability to go beyond what consoles and even high-end PCs can offer in terms of graphics processing:

- insane geometrical detail with e.g. sparse voxel octree raycasting/-tracing for environments and characters
- advanced lighting and global illumination through GPU accelerated raytracing
- physics on every dynamic object
- procedural sound
- more human-like A.I. (just hook up the server to Blue Gene ;-))

Imo this will be the ultimate argument to drop restrictive console architectures in favor of cloud gaming.

Cloud gaming is also a great way to offer time-limited game demos, which is Gaikai's main focus: play a demo of a soon-to-be-released game right in your browser. How easy and customer-friendly can it get? There are many other options beyond gaming, such as a Facebook-ish virtual reality world (LivePlace powered by OTOY), CAD programs, Photoshop, Matlab, anything compute-intensive...

UPDATE: Two interesting short articles on Dave Perry's Gaikai:

http://www.nowgamer.com/news/3548/perry-big-3-will-embrace-cloud-gaming
http://www.nowgamer.com/news/3547/perry-gaikai-demo-surprised-publishers

Friday, June 18, 2010

Video review of OnLive

Cloud gaming, it's finally here: http://www.youtube.com/watch?v=Ir4B0rgta0Y

Two written reviews:

http://gizmodo.com/5567770/onlive-streaming-game-service-tested-at-home-finally

http://blog.wolfire.com/2010/06/Thoughts-on-OnLive

The reviews are surprisingly positive about the lag: some say it's noticeable, others say it isn't, but in either case it's not really an issue during gameplay, not even for fast-twitch games like Unreal Tournament 3 and Batman: Arkham Asylum. (UPDATE: I have been watching 5 video reviews from casual gamers on YouTube, and every single one says that lag is unnoticeable, which is awesome - I guess this settles anyone's doubts about latency!) Image quality is a mixed bag: one reviewer says it looks almost identical to 720p local play, another says it's significantly worse due to compression. I guess it depends on the bandwidth of the connection. Image quality is imo definitely not the biggest hurdle for cloud gaming; latency is much more important, and apparently it's all very playable. Games are the most challenging software to make work through cloud computing, and OnLive has apparently succeeded at this daunting task. OTOY and Gaikai will surely follow. When these services mature and gain popularity, consoles will face a difficult time. Ultimately, when every thinkable piece of software can be run in the cloud, who will still need Windows?

There's no doubt that cloud gaming is the future of games and the killer app for iPad and iPhone, probably sooner than most people think, and a serious problem for next-gen consoles. I think Larrabee/MIC/Knights Corner might resurface in a cloud game server environment instead of being sold as a stand-alone PC card. I also think that at some point in the future, Nvidia and AMD will make hardware that is specifically aimed at game cloud servers and is more efficient at memory use and power management, in order to serve multiple users with the same hardware resources (like OTOY). A super beefed-up version of AMD Fusion, for example.

Using the GPU for precomputing GI in game development

I just read on the Real-Time Rendering blog, that Ubi Montreal used GPUs to precompute ambient occlusion for Splinter Cell Conviction. The technique used was invented by Toshiya Hachisuka and described in GPU Gems 2 in the chapter "High-Quality Global Illumination Rendering Using Rasterization".
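
As a reminder of what is being precomputed there: ambient occlusion at a point is just the cosine-weighted fraction of the hemisphere that is unblocked. A Monte Carlo estimator is a handful of lines (a sketch with assumed helpers; Hachisuka's technique computes the same visibility integral with rasterization instead of ray casts):

```cpp
// Monte Carlo ambient occlusion estimator (illustrative sketch; Vec3,
// Ray, Rng, sampleHemisphere() and occluded() are assumed helpers).
float ambientOcclusion(const Vec3& p, const Vec3& n,
                       int samples, float maxDist, Rng& rng) {
    int unoccluded = 0;
    for (int i = 0; i < samples; ++i) {
        // cosine-weighted direction importance-samples the cos term
        Vec3 dir = sampleHemisphere(n, rng);
        // offset the origin slightly along the normal to avoid self-hits
        if (!occluded(Ray(p + n * 1e-4f, dir), maxDist))
            ++unoccluded;
    }
    return float(unoccluded) / float(samples);  // 1 = fully open, 0 = buried
}
```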

I also read that Bungie uses a GPU-accelerated photon mapping technique from the Siggraph 2009 paper "An Efficient GPU-based Approach for Interactive Global Illumination" by Rui Wang et al. to precompute GI in some of the Halo games (ODST?, Reach?).

It's nice to see that GPUs are actually used for precomputing lighting in games and movies (e.g. PantaRay in Avatar) and I believe this is a very interesting trend. On a PC stuffed with multiple Fermis, some of these techniques might be close to real-time and achieve very high quality. With the latest breakthroughs in GPU-accelerated GI algorithms (path tracing, bidirectional path tracing (Brigade), soon real-time MLT?, (image-space) GPU photon mapping, SPPM) it should be possible to have movie-quality real-time GI on the next generation of consoles coming in 2012 (at the earliest). Or maybe not on consoles, but definitely on GPU clouds. :-)

Friday, June 11, 2010

Stochastic progressive photon mapping in Luxrender GPU

The guys behind Luxrender have released smallppmgpu and smallsppmgpu, two demos that implement PPM and SPPM, two very interesting (biased but consistent) algorithms that render caustics much more efficiently than unbiased methods (plain path tracing in particular) and that also offer nice DOF and motion blur. Link: http://www.luxrender.net/forum/viewtopic.php?f=34&t=4024 (registration is needed)
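
The "progressive" part works by shrinking the photon gathering radius at each hit point after every photon pass, so the estimate converges in the limit. The radius/flux update from Hachisuka et al.'s 2008 PPM paper, as a small sketch (the struct layout is my own illustration):

```cpp
#include <cmath>

// Per-hit-point state for progressive photon mapping (own sketch).
struct HitPoint {
    float radius;       // current gather radius R
    float photonCount;  // accumulated photon count N
    float flux[3];      // accumulated (unnormalized) flux tau
};

// Update after one photon pass that deposited M photons with total flux
// newFlux inside the current radius (Hachisuka et al. 2008).
void ppmUpdate(HitPoint& hp, float M, const float newFlux[3],
               float alpha = 0.7f) {  // alpha in (0,1) sets the shrink rate
    if (M <= 0.0f) return;
    float N = hp.photonCount;
    float ratio = (N + alpha * M) / (N + M);
    hp.radius *= std::sqrt(ratio);       // shrink the gather radius
    for (int i = 0; i < 3; ++i)          // rescale flux to stay consistent
        hp.flux[i] = (hp.flux[i] + newFlux[i]) * ratio;
    hp.photonCount = N + alpha * M;      // keep only a fraction of M
}
```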

The next step for the GPU renderers that rely on brute-force path tracing is to investigate more efficient algorithms such as bidirectional path tracing and Metropolis light transport running entirely on the GPU. There is already research going on in this area, e.g. "Path Regeneration for Interactive Path Tracing" by Novak, Havran and Dachsbacher describes how to keep a GPU path tracer efficient by immediately restarting terminated paths (http://www.vis.uni-stuttgart.de/~novakjn/paper/eg2010_pt.pdf).
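
The idea behind path regeneration is to keep GPU threads busy: paths terminate at wildly different bounce counts, so instead of letting a lane idle once its path dies, the worker immediately pulls a fresh path from a global counter. A rough sketch of the pattern in plain C++ (illustrative only, not the paper's actual code; PathState and the helpers are assumed):

```cpp
#include <atomic>

std::atomic<int> nextPath{0};  // global work counter shared by all workers

// One worker in the path regeneration scheme: whenever its path
// terminates, it grabs a new one instead of idling until all lanes finish.
void pathWorker(int totalPaths, Rng& rng) {
    int myPath = nextPath.fetch_add(1);
    PathState state;
    if (myPath < totalPaths) initPath(state, myPath, rng);  // assumed helper
    while (myPath < totalPaths) {
        bool alive = extendPath(state, rng);  // trace one bounce (assumed)
        if (!alive) {
            accumulate(state);                // write the finished result
            myPath = nextPath.fetch_add(1);   // regenerate: take a new path
            if (myPath < totalPaths) initPath(state, myPath, rng);
        }
    }
}
```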

Another logical evolution is getting biased algorithms (photon mapping, irradiance caching) to work efficiently on the GPU. This seems to be a much more difficult (but not impossible) task than unbiased rendering on the GPU, because these biased algorithms are much harder to parallelize. Some recent papers in this area:

Morgan McGuire and David Luebke: Hardware-Accelerated Global Illumination by Image Space Photon Mapping

Bartosz Fabianowski and John Dingliana: Compact BVH Storage for Ray Tracing and Photon Mapping

Rui Wang et al.: An Efficient GPU-based Approach for Interactive Global Illumination

Maybe Chaos Group will stun us again at Siggraph 2010 with a biased GPU renderer, which renders 10 times faster than V-Ray GPU :-). Lots of interesting approaches to be explored and more exciting times ahead!