
Renderosity Forums / Vue



Welcome to the Vue Forum

Forum Moderators: wheatpenny, TheBryster

Vue F.A.Q (Last Updated: 2024 Nov 21 4:12 am)



Subject: Upgrade video card for Vue Complete 11


kenmo ( ) posted Thu, 07 March 2013 at 10:26 AM · edited Fri, 22 November 2024 at 12:46 PM

Presently have a Nvidia GTS250 with 1GB memory...
http://www.newegg.ca/Product/Product.aspx?Item=N82E16814130892&Tpk=nvidia%20gts%20250

The computer is an I5-2500K with 16 gb of DDR3 memory, running Windows 7-64. System drive is an Intel 330 series 240 gb SSD...

I'm considering upgrading the video card to a Nvidia GTX 650 ti with 2 gb of DDR5 memory...

The GTS250 has 128 CUDA cores and the GTX 650 Ti has 768... However, the GTS250 has a 256-bit memory interface while the GTX 650 Ti's is 128-bit.

The GTX650 ti also meets my budget concerns...

I've also looked at ATI/AMD cards, but it seems most 3D apps prefer Nvidia... and I'm not sure how CUDA cores compare with stream processors (is 1 CUDA core = 1 stream processor?)
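For what it's worth, bus width alone doesn't decide memory bandwidth; the memory's effective data rate matters just as much. Here's a rough back-of-envelope comparison in Python. The effective rates used below are assumed reference clocks for the two cards (GTS 250 GDDR3 ~2200 MT/s, GTX 650 Ti GDDR5 ~5400 MT/s), not measured values:

```python
# Theoretical peak memory bandwidth: (bus width in bits / 8) bytes per
# transfer, times the effective transfer rate. Assumed reference rates:
# GTS 250 (GDDR3): ~2200 MT/s; GTX 650 Ti (GDDR5): ~5400 MT/s.

def bandwidth_gb_s(bus_bits: int, effective_mt_s: float) -> float:
    """Theoretical peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return bus_bits / 8 * effective_mt_s * 1e6 / 1e9

gts250 = bandwidth_gb_s(256, 2200)    # 256-bit bus, slower GDDR3
gtx650ti = bandwidth_gb_s(128, 5400)  # 128-bit bus, faster GDDR5

print(f"GTS 250:    {gts250:.1f} GB/s")   # 70.4 GB/s
print(f"GTX 650 Ti: {gtx650ti:.1f} GB/s") # 86.4 GB/s
```

By this estimate the narrower 128-bit bus is more than offset by the higher GDDR5 clock, so the 650 Ti should still come out ahead on raw bandwidth.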

 

Software I use is: Vue Complete ver 11, Hexagon 3D, 3DCoat, Blender3D, Wings3D, Curvy3D...

Also Photoshop & ArtRage...

Any thoughts?

 


aeilkema ( ) posted Thu, 07 March 2013 at 11:18 AM

I don't think I would go from a 256-bit to a 128-bit memory controller at all; I would wait a little longer and save up some more to get the best. Vue is very picky about graphics cards, so I wouldn't go cheap. 3DCoat likes power as well; the rest I don't know.

 

I have ATI, but I'm not really happy with it. It's OK, but if I ever buy a new computer or graphics card, I'm going to switch to nVidia for sure. I'm not sure if it really matters a lot, but to me ATI seems to be standing still. I went from a 4000 series to a 7000 series and it's not a whole lot better at all. I can hardly see the improvement, and at times the low-range 4000 series even seems better than the mid-range 7000 series I have now.

Artwork and 3DToons items, create the perfect place for you toon and other figures!

http://www.renderosity.com/mod/bcs/index.php?vendor=23722

Due to the childish TOS changes, I'm not allowed to link to my other products outside of Rendo anymore :(

Food for thought.....
https://www.youtube.com/watch?v=pYZw0dfLmLk


cyberknight1133 ( ) posted Thu, 07 March 2013 at 11:26 AM

Really? I noticed quite a difference going from a 4870 to a 5870. Were they comparable cards in terms of quality?


forester ( ) posted Thu, 07 March 2013 at 1:25 PM · edited Thu, 07 March 2013 at 1:32 PM

The reason that ATI cards are slower in Vue is that they use a software emulator for some functions that nVidia has as hardware. These functions have to do with geometry processing, so nVidia cards are almost always more efficient and often faster than ATI cards for applications involving 3D. Of course, a more powerful ATI card will offer greater performance than a cheaper and older nVidia card, as a general rule.

CUDA cores do not equate with "threads" or anything else, really. CUDA refers to a way in which certain physical phenomena, such as particles or object dynamics (fracturing stones, for example), are processed by the GPU on the video card (that is, off-loaded from the motherboard CPU). It is a proprietary technology exclusive to nVidia cards. There is no equivalent in the ATI chipset or from any other video card maker. (Hope this attempt at "plain language", non-tech talk is understandable.) I do not know whether the particle processes or the new ecosystem/particle processes in Vue 11 make use of CUDA on nVidia cards. Perhaps someone else on this Forum does know.

For this reason, CUDA being a set of GPU processes that Vue may or may not use in any meaningful way, I agree with aeilkema's comment that it might not be good to go from a 256-bit memory interface to a 128-bit one. However, given that the CPU described above is a mid-range i5, any upgrade to an nVidia card with a good GPU will pull some of the load from the cpu and pass it to the gpu, and that is all to the good. Having more video memory, then, is a good thing. So there should be some real, measurable and significant increase in processing performance from the proposed upgrade. Not as much as we would expect to see on an Intel i7 cpu, but nevertheless a serious increase in performance.

Given that what is basically being proposed, in technical terms, is transferring a significant amount of processing from the cpu to the gpu, kenmo also has to take the motherboard into account. He has not told us what that is, but it very much matters here. A run-of-the-mill motherboard could be the bottleneck that makes this upgrade less significant than it might be. Or kenmo could have a very good motherboard, which will make the proposed upgrade everything it should be.

Post-query edit... I did some tech checking. Vue does not make use of CUDA. So all the proposed upgrade accomplishes is to offload some processing from the motherboard cpu to the gpu on the nVidia card. Still, this is an upgrade that makes sense to me.

Kenmo, you also need to make sure that you have sufficient power for the new nVidia 650 card. The stated requirement is a 500-watt power supply for a system with that card, but one of my computers has an nVidia 650, and I found that I needed at least 650 watts. In my case, I had to upgrade the power supply on that box to a 1000-watt psu, a first for me!
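A quick way to sanity-check PSU sizing is to total the nominal component draws and apply a safety margin. This is only a sketch: the TDP figures below are assumed nominal values (i5-2500K ~95 W, GTX 650 Ti ~110 W, drives/fans/board lumped at ~75 W), not measurements, and the 1.5x headroom factor is a rule of thumb:

```python
# Back-of-envelope PSU sizing. Component wattages are assumed nominal
# TDPs, not measured draws; headroom=1.5 is a conservative rule of thumb.

def recommended_psu_watts(component_watts: dict, headroom: float = 1.5) -> int:
    """Sum component draws, apply a safety margin, round up to the nearest 50 W."""
    total = sum(component_watts.values())
    needed = total * headroom
    return int(-(-needed // 50) * 50)  # ceiling to a 50 W step

parts = {"cpu": 95, "gpu": 110, "other": 75}  # ~280 W total draw
print(recommended_psu_watts(parts))  # 280 * 1.5 = 420 -> 450
```

By this estimate, the 750-watt Corsair mentioned later in the thread leaves comfortable headroom for a GTX 650 Ti class card.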



kenmo ( ) posted Thu, 07 March 2013 at 3:39 PM

My psu is a Corsair 750 watt, the mobo is a Gigabyte GA-Z68X-UD3-B3....

http://ca.gigabyte.com/products/product-page.aspx?pid=3852#ov

 


zigoenga ( ) posted Wed, 17 April 2013 at 4:21 AM

Vue only uses the CPU for renders, so you don't need to upgrade for the moment, in my opinion.


ShawnDriscoll ( ) posted Wed, 17 April 2013 at 10:03 AM · edited Wed, 17 April 2013 at 10:03 AM

I got the ASUS nVidia GTX 670 last week for the computer I put together. I picked that card because I wanted my new computer to play video games of all types, as well as be able to do decent OpenGL in Vue and in other 3D modeling software I use, like modo.

www.youtube.com/user/ShawnDriscollCG


kenmo ( ) posted Wed, 17 April 2013 at 4:51 PM

Attached Link: Gigabyte GA-Z68X-UD3-B3....

> Quote - He has not told us what that is, but it very much matters here. A run-of-the-mill motherboard could be the bottleneck that makes this upgrade less significant than it might be. Or, kenmo could have a very good motherboard, which will make the proposed upgrade everything that should happen with the change.

 

My motherboard is a socket 1155 Gigabyte. Model GA-Z68X-UD3-B3....


ddaydreams ( ) posted Wed, 17 April 2013 at 9:01 PM

If your Vue scene setup seems laggy or sluggish, then you might want to upgrade your card.

Otherwise Vue does not seem to require much of a card. Rendering is done on the CPU, with the exception of Vue Infinite, which can do GPU anti-aliasing if it's needed and chosen.

Frank Hawkins/Owner/DigitalDaydreams

Frank_Hawkins_Design

Frank Lee Hawkins Eastern Sierra Gallery Store

 

My U.S.A eBay Graphics Software Store~~ My International eBay Graphics Software Store

 


drifterlee ( ) posted Thu, 18 April 2013 at 12:53 PM

I have the GeForce GTX 680 that has 4 gigs of RAM on it, and I also have 64 gigs of system RAM. I love the graphics card. It runs Vue 11 great.


ddaydreams ( ) posted Thu, 18 April 2013 at 1:01 PM

I have the GeForce GTX 660 that has 2 gigs of RAM on it, and 16 gigs of system RAM. It runs Vue 10C great. This card was an upgrade from a GTX 260 with, I think, 768MB on it.

As my scenes got bigger, the 260 was getting sluggish during scene setup. With the 660, setup is smooth. And the 660 uses fewer watts and only needs one 6-pin power cable.


 

