checkthegate opened this issue on Jun 24, 2008 · 13 posts
Dale B posted Tue, 24 June 2008 at 4:28 PM
Considering that they still haven't figured out actual multiple-core multithreading across the board, I'm not holding my breath. The Cry engine takes advantage of precanned shader tricks on the latest GPU cards; it renders -game- environments very well indeed. But a game engine is optimized for taking advantage of GPUs and the human eye's ability to miss the forest for all the pretty trees.

Take a single frame (not a capture, an honest single animation frame), and you can see the tiling, the texture conditioning, and areas where the 'light' is just ambient trickery, because actual raytracing would choke even an ubergig GPU with oodles and kaboodles of DDR43 GodRAM on card; the number of ray bounces exceeds even 128-bit on-card addressing. Get a DX-8 video card, one that doesn't know a shader from a shitcan, and assuming the demo even works, you will get an idea of what the video card is actually manipulating. Once you have that, you have an idea of exactly what kind of shader programming you would have to engineer yourself to use that game engine as a rendering application. And you would have to redo some of that programming -every time- you change the setting, the figures, etc.

Hopefully they will find the issues that are bedeviling Vue atm; I find that a lot more likely than some game engine being good enough for commercial production rendering....
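To put rough numbers on that ray-bounce point, here is a minimal C++ sketch. It is purely illustrative: the function names and the assumed branching factor of two secondary rays per hit are made up for the example, not taken from CryEngine, Vue, or any real renderer. It contrasts a game-style flat ambient term, which costs one shader evaluation per pixel no matter what, with a recursive trace whose ray count grows roughly exponentially with bounce depth:

```cpp
#include <cstdint>
#include <cstdio>

// Game-style shortcut: a flat ambient term baked into the shader.
// Cost is one evaluation per pixel regardless of scene complexity.
double shadeAmbientTrick() {
    return 0.2;
}

// Toy ray-count model (assumed, not any engine's real behavior):
// each hit spawns `branching` secondary rays, so the total ray count
// grows roughly as branching^depth.
std::uint64_t raysTraced(int depth, int branching = 2) {
    if (depth <= 0) return 1;  // just the primary ray
    return 1 + static_cast<std::uint64_t>(branching) * raysTraced(depth - 1, branching);
}

int main() {
    for (int depth = 0; depth <= 16; depth += 4) {
        std::printf("bounce depth %2d -> %llu rays per pixel\n",
                    depth,
                    static_cast<unsigned long long>(raysTraced(depth)));
    }
    std::printf("ambient trick   ->  1 shader evaluation per pixel (value %.1f)\n",
                shadeAmbientTrick());
    return 0;
}
```

Multiply that per-pixel ray count by a full-resolution animation frame and you can see why a game engine fakes its lighting with cheap tricks instead of tracing it.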