
Renderosity Forums / Vue



Welcome to the Vue Forum

Forum Moderators: wheatpenny, TheBryster

Vue F.A.Q (Last Updated: 2024 Nov 29 1:34 pm)



Subject: New Nvidia Fermi, expected to help rendering?


andrewbell ( ) posted Wed, 21 October 2009 at 10:11 AM · edited Tue, 26 November 2024 at 2:03 PM

Hello, there is a lot of speculation about what this card is going to be used for. I have read that it could speed up render times by 100x by using the cores on the graphics card to do all that render work. All I have found on the internet is people moaning about how fast Crysis is going to run on it... I don't care, it's a naff game imo anyway. If the card had that effect on rendering though, this would be awesome, and I am sure it would be at the top of everyone's Xmas list.

If anyone knows, or has any idea, how or whether it will benefit rendering, please advise.

And if it came out next month, would Vue actually be able to benefit from this change in the way things render, or will we have to wait for later versions?

Any information relating to this would be great ...


Jonj1611 ( ) posted Wed, 21 October 2009 at 10:48 AM

Man, you just have money to burn eh Andrew lol :) 

I don't know enough about that chip to make an informed comment.

Jon

DA Portfolio - http://jonj1611.daportfolio.com/


andrewbell ( ) posted Wed, 21 October 2009 at 11:20 AM · edited Wed, 21 October 2009 at 11:21 AM

I thought you might spot this post! I am just trying to get next year's Xmas wish list in order lol


Jonj1611 ( ) posted Wed, 21 October 2009 at 11:50 AM

lol, well along with the SSD it seems to be coming on well hehe :) 

Jon

DA Portfolio - http://jonj1611.daportfolio.com/


Paula Sanders ( ) posted Wed, 21 October 2009 at 8:55 PM

Is it named for the physicist Enrico Fermi? Does anyone know?


Rutra ( ) posted Thu, 22 October 2009 at 12:24 AM

Vue doesn't use graphics cards to render so, good as it may be, that card will have no effect on render times. You will probably get a better OpenGL preview though, which will help you to build your scenes more accurately and quickly.


andrewbell ( ) posted Thu, 22 October 2009 at 3:36 AM

Quote - Is it named for the physicist Enrico Fermi? Does anyone know?

I believe this is correct.


andrewbell ( ) posted Thu, 22 October 2009 at 3:42 AM

Quote - Vue doesn't use graphic cards to render so, good as it may be, that card will have no effect in render times. You will probably experience better OpenGL preview though, which will help you to build your scenes more accurately and quickly.

Cool, that is what I wanted to find out. I expect programs such as 3ds Max will benefit from this, and maybe future versions of Vue. I have a 4870, which seems to do a pretty good job at OpenGL (not as good as my old 8800GT in some respects).


Paula Sanders ( ) posted Thu, 22 October 2009 at 9:58 AM · edited Thu, 22 October 2009 at 10:02 AM

Attached Link: http://www.perpetualvisions.com/new-articles/scientists/physicists.html

file_441600.jpg

I just thought it would be of interest to post a photo that my mother, who was a physicist, took of Enrico Fermi at an American Physical Society meeting in 1938 or 1939. She knew Fermi, as well as the others identified in her photos. I know this is a little off topic, but I was excited when I saw mention of the Fermi graphics card, since I grew up knowing about these famous scientists.


andrewbell ( ) posted Fri, 23 October 2009 at 4:50 AM

Wow that is so cool ! Great photo too


Paula Sanders ( ) posted Fri, 23 October 2009 at 8:18 AM

Thanks,  Andrew. I thought it was neat having the photo.


andrewbell ( ) posted Fri, 23 October 2009 at 8:40 AM

Great story to tell too! 


Arraxxon ( ) posted Sun, 01 November 2009 at 9:48 PM · edited Sun, 01 November 2009 at 10:02 PM

Like Rutra mentioned, Vue (and many other applications) doesn't use the GPU, instead relying only on the CPU(s) while rendering.

One of the first programs to start using the GPU for special calculations is Photoshop CS4, for example, but mainly for tasks like realtime rotation and zoom and a few other things; overall it still mostly uses the CPU.

For instance, if I read through the product description for Cinema 4D R11.5 on Maxon's website, they clearly explain that it gets its render speed from support for multiple CPUs or CPU cores and hyperthreading. For raising render speed even further, they mention network rendering support and special render techniques like sub-polygon displacement, bucket rendering and render instances, all still calculated on CPUs... So nothing about GPU support to be found there.
I don't know about the other major 3D graphics packages, but I believe it will be a similar story; maybe in the near future.

GPU support, Nvidia CUDA and so on... it all still seems to be in the early stages of development, and maybe it is being prepared now for major use in the future.

But if you really think about it, the GPUs nowadays are really powerful, and all that power is sleeping while the CPUs are hard at work rendering... what a waste...


ShawnDriscoll ( ) posted Sun, 01 November 2009 at 10:25 PM

Only when operating systems begin using GPUs as CPUs will we see applications running on them. Until then, GPUs are still great at real-time triangle drawing for previewing.

www.youtube.com/user/ShawnDriscollCG


andrewbell ( ) posted Mon, 02 November 2009 at 5:00 AM

Quote - Only when operating systems begin using GPUs as CPUs will we see applications running on them. Until then, GPUs are still great at real-time triangle drawing for previewing.

I am glad, because I went out at the weekend and bought an ATI 5850 ;-) , and I cannot recommend it enough: lightning fast zipping around my scene. The preview is about 3x as fast on full quality rather than preview mode (I am really getting a good insight into what my scene looks like without having to keep clicking render). The viewport is a lot more detailed now, showing almost every bump. Now to sell my 4870...


Paula Sanders ( ) posted Mon, 02 November 2009 at 6:41 AM

Can anyone compare the Nvidia GeForce GTX 280 (which I have) and the new ATI 5870 (which I have read about) in the context of use with Vue 7 or Vue 8?

Thank you.


Khai-J-Bach ( ) posted Mon, 02 November 2009 at 6:45 AM · edited Mon, 02 November 2009 at 6:46 AM

Quote - Only when operating systems begin using GPUs as CPUs will we see applications running on them. Until then, GPUs are still great at real-time triangle drawing for previewing.

the DirectX 11 specs (in Windows 7, and as a download for Vista) actually allow calculations to be moved from the CPU to the GPU...

http://en.wikipedia.org/wiki/Direct3D#Direct3D_11
&
http://en.wikipedia.org/wiki/GPGPU

implementation is still to come though... but it's beginning...
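For anyone curious what that GPGPU model looks like in practice: a compute dispatch launches one lightweight GPU thread per data element instead of looping on the CPU. Here is a toy emulation of the idea in plain Python (the names and the per-pixel maths are made up for illustration; a real GPU would execute all the work items in parallel rather than one after another):

```python
WIDTH, HEIGHT = 8, 8

def kernel(thread_id):
    # what each GPU work item would compute, indexed by its dispatch id
    x, y = thread_id % WIDTH, thread_id // WIDTH
    return x * x + y * y  # stand-in for a per-pixel lighting calculation

# a GPU dispatch would run all of these simultaneously;
# on the CPU we can only map over them sequentially
framebuffer = [kernel(i) for i in range(WIDTH * HEIGHT)]
print(framebuffer[9])  # thread 9 -> pixel (1, 1) -> 1*1 + 1*1 = 2
```

The point of the DirectX 11 / CUDA work is that calculations shaped like this, thousands of independent little jobs, are exactly what a GPU's hundreds of cores are built for.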



andrewbell ( ) posted Mon, 02 November 2009 at 7:23 AM · edited Mon, 02 November 2009 at 7:24 AM

Quote - Can anyone compare the Nvidia GeForce  GTX 280  (which I have) and the new ATI 5870 (which I have read about)  in the context of use with Vue 7 or Vue 8?

Thank you.

Hi Paula, what is your performance like with the 280? ... I understand that Nvidia is supposedly better for 3D applications than ATI. My old 8800GT was not that far behind the 4870 when it came to performance, but this new card totally blows the 4870 out of the water... I seem to have many more levels of detail in the main window than I ever had before (especially on objects).


ShawnDriscoll ( ) posted Mon, 02 November 2009 at 1:06 PM · edited Mon, 02 November 2009 at 1:07 PM

Quote - the DirectX 11 specs (in Windows 7, and as a download for Vista) actually allow calculations to be moved from the CPU to the GPU...

This has what to do with final rendering?

www.youtube.com/user/ShawnDriscollCG


Paula Sanders ( ) posted Mon, 02 November 2009 at 2:11 PM

Hi Andrew -

I get good results with my Nvidia GeForce GTX 280. I just wondered about this card because I have heard that it is so good.


Khai-J-Bach ( ) posted Mon, 02 November 2009 at 4:04 PM

Quote - > Quote - the DirectX 11 specs (in Windows 7, and as a download for Vista) actually allow calculations to be moved from the CPU to the GPU...

This has what to do with final rendering?

sigh

Moving the calculations from the CPU to the faster GPU for final rendering. Please READ the articles. It's something that's been talked about for the last couple of years.
I know you have problems with anything I say, but I DO know what I am talking about.


