RorrKonn opened this issue on Apr 18, 2007 · 266 posts
Zarat posted Sun, 22 April 2007 at 2:33 AM
I remember first seeing it while programming GPUs.
GPUs are good at vector math, and they are cheaper than true vector processors (see the little sketch further down).
That was back in about 1999 or so. lolz...
Later, around 2001, doing complex scientific sims or graphics on the GPU was nothing fancy anymore. AFAIK there was never software like that for home computers, because they only have one (or today two) graphics subsystems, and 32-bit addressing isn't a big help here either.
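Just to show what I mean by vector math on the GPU, here's a rough sketch I typed up in nVidia's brand-new CUDA (my own illustration, nothing to do with those old projects, and array names etc. are made up). It adds two big arrays element by element, which is exactly the kind of work a GPU chews through in parallel:

#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// Each GPU thread adds one element of the two input vectors.
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main(void)
{
    const int n = 1 << 20;                 // one million floats
    size_t bytes = n * sizeof(float);

    float *ha = (float*)malloc(bytes);
    float *hb = (float*)malloc(bytes);
    float *hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = (float)i; hb[i] = 2.0f * i; }

    float *da, *db, *dc;
    cudaMalloc((void**)&da, bytes);
    cudaMalloc((void**)&db, bytes);
    cudaMalloc((void**)&dc, bytes);

    // Push the data over the bus to the card...
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // ...run the kernel with enough threads to cover all n elements...
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    // ...and pull the result back to normal RAM.
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[42] = %f\n", hc[42]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

Nothing clever in there, but you can see that most of the lines are just about moving data back and forth, which leads to the next point.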
The first problem would have been getting the data to the graphics subsystem fast enough.
The next problem would have been the memory limit of 2 GB or 4 GB. Somewhat later, render farms came along to get around the memory limit and the limit on the number of GPUs.
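To show what I mean about feeding the card: a quick back-of-envelope test like the one below (again just my own sketch, buffer size picked arbitrarily) measures how fast you can actually push data over the bus. On AGP or early PCIe you typically only get a couple of GB/s, so for big scenes the copy alone eats a lot of the time:

#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const size_t bytes = 256u * 1024 * 1024;   // 256 MB test buffer
    void *host = malloc(bytes);
    void *dev  = NULL;
    cudaMalloc(&dev, bytes);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Time a single host-to-device copy over the bus.
    cudaEventRecord(start, 0);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);
    cudaEventRecord(stop, 0);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    printf("copied %zu MB in %.1f ms -> %.2f GB/s\n",
           bytes / (1024 * 1024), ms, (bytes / 1e9) / (ms / 1e3));

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(dev);
    free(host);
    return 0;
}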
For rendering some fancy picture, a single GPU and a normal PC are enough. Same goes for 2D vector graphics.
The math makes it a bit more difficult to change shapes, but there are scripting languages for that, and new ones are even in development right now. Just not for the mass market, as usual...
It's also a question of marketing. Right now it's just a lot of whitepapers, formulas and a few nice pictures. That won't sell...
There's a project somewhere with a short (technical-only) description of an approach that uses math-based graphics for fast rendering. I'll have to look for the link later...