
Renderosity Forums / Vue



Welcome to the Vue Forum

Forum Moderators: wheatpenny, TheBryster

Vue F.A.Q (Last Updated: 2024 Dec 13 6:58 am)



Subject: Question for the new i5-series quad processor + Vue owners


Atron ( ) posted Mon, 18 January 2010 at 1:02 PM · edited Mon, 23 December 2024 at 8:54 PM

Hi guys!

I would like to hear from someone who renders on that new quad processor...

I read a review in which they said that these brand new processors can have 8 threads... does that mean that Vue sees it as 8 processor cores, each with, say, 2 GHz render speed?


ddaydreams ( ) posted Mon, 18 January 2010 at 3:50 PM · edited Mon, 18 January 2010 at 3:53 PM

Attached Link: http://www.brighthub.com/computing/hardware/articles/48391.aspx

I don't own an i5, but it's my understanding the OS will recognize a Core i5 as having four cores, not 8 as the i7s do, which would greatly diminish render performance compared to an i7, IMHO.

See the article at this link for more info, especially the part about multithreading.

 

Frank Hawkins/Owner/DigitalDaydreams

Frank_Hawkins_Design

Frank Lee Hawkins Eastern Sierra Gallery Store

 

My U.S.A eBay Graphics Software Store~~ My International eBay Graphics Software Store

 


Atron ( ) posted Mon, 18 January 2010 at 4:10 PM · edited Mon, 18 January 2010 at 4:11 PM

Quote - I don't own an i5, but it's my understanding the OS will recognize a Core i5 as having four cores, not 8 as the i7s do, which would greatly diminish render performance compared to an i7, IMHO.

See the article at this link for more info, especially the part about multithreading.

 

The article says: "Hyper-threading is a technology used by Intel to simulate more cores than actually exist on the processor. While Core i7 products have all been quad-cores, they appear in Windows as having eight cores. This further improves performance when using programs that make good use of multi-threading."

I wish someone had tested that processor with Vue; this is what I would like to know: does Vue see 8 cores at 2 GHz each (8 × 2 GHz?)

I would need an answer from a person who owns such a new quad core system.
 


ddaydreams ( ) posted Mon, 18 January 2010 at 4:38 PM

You had asked about the i5

Here is a quote from the same article, one line down from the part you mentioned:

"Core i5 products, however, will not have this feature, which means operating systems will recognize the processors as having four core and no more. This will have no affect on the performance of most applications, like web browsers and even games, but it will be a blow to those who use 3D rendering software and other such programs that excel with multi-threading."

 



Atron ( ) posted Mon, 18 January 2010 at 5:46 PM

Ddreams,
Thanks for your input, but I would like to hear real-life reports on how the hyper-threading feature handles Vue and how many cores will render an image, 4 or 8? I need to know before I buy one, especially for Vue rendering... I hope some Vue users have this processor and can tell us more.


FrankT ( ) posted Mon, 18 January 2010 at 6:10 PM

Tried asking E-On? They'd know for sure (or ask over at the E-On forums).

My Freebies
Buy stuff on RedBubble


Atron ( ) posted Tue, 19 January 2010 at 4:43 AM

Quote - Tried asking E-On? They'd know for sure (or ask over at the E-On forums).

No, I thought I could get a fast answer here from people who own both Vue and a computer with that new quad processor... 


Rich_Potter ( ) posted Wed, 20 January 2010 at 1:14 AM

http://www.intel.com/en_UK/consumer/products/processors/corei5.htm?cid=emea:ggl|ci5dtop_uk_brand|u4C7A77|s

Please click on "see more details".

i5 processors only have 4 threads MAX, i.e. there will only be 4 cores

so if you want 8 threads you need an i7, i.e. for 8 cores.

Rich

http://blog.richard-potter.co.uk


Atron ( ) posted Wed, 20 January 2010 at 6:11 AM

Quote - http://www.intel.com/en_UK/consumer/products/processors/corei5.htm?cid=emea:ggl|ci5dtop_uk_brand|u4C7A77|s

Please click on "see more details".

i5 processors only have 4 threads MAX, i.e. there will only be 4 cores

so if you want 8 threads you need an i7, i.e. for 8 cores.

Thanks, but I'm only interested in first-hand reports, since these links can't really answer my question... I just hoped that someone from the power-hungry Vue community had bought an i7 computer and could share his/her experience... thanks anyway!


Rich_Potter ( ) posted Wed, 20 January 2010 at 6:31 AM

Do you want an i7 or an i5? You need to make up your mind!

Rich

http://blog.richard-potter.co.uk


FalseBogus ( ) posted Wed, 20 January 2010 at 7:28 AM

The i5 and i7 both have 4 physical cores.

The i7 has hyper-threading, allowing it to run 2 execution threads per core, so Windows sees it as 8 cores. It still has only 4 physical cores.

Hyper-threading gives you extra performance, depending on the type of operations, by letting a core use its idle cycles for a second execution thread.

However, this does not give you twice the computing power of a non-hyper-threading processor; it just tries to optimize the usage of those 4 cores with extra threads. Normally you waste clock cycles on non-CPU operations, and the extra threads are supposed to utilize those cycles.

How much extra it gives is hard to say; it depends heavily on the operations being run.
My wild guess is an average of 20-25% more than a non-hyper-threading processor of similar architecture and clock speed. You'll get some figures by searching for i7 reviews.
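To make the distinction concrete, here is a small Python sketch of the difference between what the OS reports and what hyper-threading actually buys you. The `estimated_scaling` function and the `ht_gain=0.25` figure are illustrative assumptions based on the ballpark above, not measured Vue numbers:

```python
import os

# Logical processors the OS reports -- this is the count a renderer
# like Vue sees. On a hyper-threaded i7 it is 8, despite 4 physical cores.
print(os.cpu_count())

def estimated_scaling(physical_cores, hyperthreaded, ht_gain=0.25):
    """Rough render-scaling model: each physical core contributes 1.0x,
    and hyper-threading adds roughly 20-25% on top (ht_gain), NOT a
    doubling. The 0.25 figure is only a ballpark from i7 reviews."""
    return physical_cores * (1.0 + ht_gain if hyperthreaded else 1.0)

# i7-920: 4 cores + HT -> ~5x a single core, even though Windows shows 8
# i5-750: 4 cores, no HT -> ~4x
print(estimated_scaling(4, True))   # 5.0
print(estimated_scaling(4, False))  # 4.0
```

So "8 cores at 2 GHz" is not 8 × 2 GHz of real throughput; it is 4 real cores plus a modest bonus.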


FalseBogus ( ) posted Wed, 20 January 2010 at 1:43 PM

I have to clarify the thread title a bit.

"i5 series quad processor"

i5 processors have 2 cores; except for the 2 lowest-end models they all support hyper-threading and can thus run 4 threads, but they are NOT quad-core processors as the i7s are.

Hyper-threading being seen by the operating system as 4 (i5) or 8 (i7) virtual cores is not the same as having 4 or 8 real physical cores.


Jonj1611 ( ) posted Wed, 20 January 2010 at 2:24 PM

The i5-750 is a quad-core processor.

Jon

DA Portfolio - http://jonj1611.daportfolio.com/


FalseBogus ( ) posted Wed, 20 January 2010 at 3:10 PM

True. I missed that. Thanks for correcting


Jonj1611 ( ) posted Wed, 20 January 2010 at 3:13 PM

No Probs :)



Atron ( ) posted Wed, 20 January 2010 at 4:54 PM

OK, thank you guys!


FalseBogus ( ) posted Thu, 21 January 2010 at 4:05 AM

Might as well ask myself: does anyone have knowledge of how well the new Radeons work with Vue 8?

I was considering getting the HD5850, as it's faster, cheaper, less power-hungry, and has OpenGL 3.2 and DX11 support compared to the Nvidia GTX 285.


Atron ( ) posted Thu, 21 January 2010 at 6:07 AM

What I don't understand is that they can make graphics cards with hundreds of processing cores on a single chip, while Intel/AMD can't give us more than 4 cores in a single CPU... am I missing something?  :-) 


FrankT ( ) posted Thu, 21 January 2010 at 1:04 PM

because GPUs do a totally different job to CPUs



Atron ( ) posted Thu, 21 January 2010 at 3:44 PM

Quote - because GPUs do a totally different job to CPUs

OK, I'm not a CPU expert; I thought the contrary - that they do a similar job (calculating ones and zeros?) lol
Our problem is that we can't use these GPUs for rendering Vue pics... I hope this problem will be solved in the near future.


Arraxxon ( ) posted Sun, 24 January 2010 at 9:27 AM · edited Sun, 24 January 2010 at 9:34 AM

It would be nice if we didn't need to care about the hardware reality and could simply double it in software - turn 4 cores into 8 cores' worth of calculating power.
If the hardware isn't there, it won't work - it's like having a 100-horsepower engine in your car and expecting a special control program to suddenly give you 200 hp. To stay with the car-engine comparison: such programs can have some effect on the efficiency of the engine, but mostly they help reduce fuel consumption or other aspects of how the engine runs.
It doesn't work that way - as jismi explained perfectly, software can only help optimize the calculation and data-handling process, which can give you some extra speed, but it won't be noticeable in all cases of your work.

In a CPU with 4 cores, you can say that each core is a real single CPU that can work separately from the others, not depending on them; with multi-core programming their power can be combined - giving each core a piece of the cake while rendering an image, for instance, just like sending parts of the job to other computers in a network/render farm.
Meaning: if 3 cores were damaged and only one core were left, that one core could still run the OS and applications, just slower (okay, that's just hypothetically speaking - the CPU would probably be totally damaged then ...).
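The "piece of the cake" scheme described here - each core rendering its own horizontal slice of the image - can be sketched with a Python process pool. All names (`render_tile`, `render_image`) are illustrative; this is not Vue's actual renderer API:

```python
from multiprocessing import Pool
import os

def render_tile(tile):
    """Stand-in for one render bucket: 'shade' the rows y_start..y_end.
    (Illustrative only -- a real renderer does far more per pixel.)"""
    y_start, y_end, width = tile
    return [[(x * y) % 256 for x in range(width)] for y in range(y_start, y_end)]

def render_image(width, height, workers=None):
    """Split the image into horizontal tiles, one batch per worker process."""
    workers = workers or os.cpu_count()          # one worker per logical core
    step = max(1, height // workers)
    tiles = [(y, min(y + step, height), width) for y in range(0, height, step)]
    with Pool(workers) as pool:
        parts = pool.map(render_tile, tiles)     # tiles render in parallel
    return [row for part in parts for row in part]

if __name__ == "__main__":
    image = render_image(64, 64)
    print(len(image), len(image[0]))  # 64 64
```

On a quad-core machine the tiles genuinely run in separate processes, which is essentially what a render farm does across separate machines.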

With the cores on a graphics card it's a different story - they can't do any work independently of each other. It's a special architecture designed to work together: a single core can't work alone or do everything the others can, because the cores are grouped, with each group given certain tasks to perform and only those tasks.
To do their work, they are combined in groups, now called SMs - streaming multiprocessors. In the recent Nvidia GT200 series, 8 of those cores are combined into one SM, and there are 30 of those SMs altogether - each SM doing something specific (simply put - I'm not really exact on this, just making it easier to understand - say one SM calculates only shadows, another handles reflections, and so on ...)

Looking at the soon-to-be-released new Nvidia graphics cards like the G300 with the new Fermi architecture: it has 512 CUDA cores - more than double the predecessor GT200 series (GT240, GTX260, GTX280, GTX295 ...) in core and transistor count.
Here, 32 cores form an SM - meaning the new Fermi technology is supposed to have 8x the combined power of the GT200 series ...
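A quick back-of-the-envelope check on those counts (my own arithmetic, using only the figures quoted above): the raw core count grows about 2.1x from GT200 to Fermi, so an 8x overall gain would have to come from per-core and architectural improvements on top of the extra cores:

```python
# SM grouping arithmetic for the two generations mentioned above
gt200_sms, gt200_cores_per_sm = 30, 8
fermi_cores, fermi_cores_per_sm = 512, 32

gt200_cores = gt200_sms * gt200_cores_per_sm      # 240 cores on GT200
fermi_sms = fermi_cores // fermi_cores_per_sm     # 16 SMs on Fermi

print(gt200_cores, fermi_sms)
print(round(fermi_cores / gt200_cores, 2))        # ~2.13x the raw core count
```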

That's what I'm looking forward to. They have DirectX 11, too, fitting perfectly with the new Windows 7 OS ...

Surely it would be nice if we could add the GPU on a graphics card to the rendering process, since it's doing barely anything at all while the CPU is rendering at full power. But as described above, it won't be that easy to steer and handle the data stream to and from those SMs and cores, because they are usually used for different, specialized tasks relating to realtime 3D.

Photoshop CS4 started to use GPU acceleration for a few effects/features, such as realtime zoom and rotation, and a few other things ...

But since they are different and use a different technical architecture while processing data, as FrankT mentioned, it must be more complicated to add them to a rendering process, otherwise it would have been done long ago ... whether we are talking about Vue or other well-known, expensive high-end graphics packages like C4D, 3dsMax, Houdini, and whatever else they are called ...


Atron ( ) posted Sun, 24 January 2010 at 4:08 PM

Thanks Arraxxon for the explanation!


Khai-J-Bach ( ) posted Sun, 24 January 2010 at 4:39 PM · edited Sun, 24 January 2010 at 4:40 PM

*"Surely it would be nice if we could add the GPU on a graphics card to the rendering process, since it's doing barely anything at all while the CPU is rendering at full power. But as described above, it won't be that easy to steer and handle the data stream to and from those SMs and cores, because they are usually used for different, specialized tasks relating to realtime 3D.

Photoshop CS4 started to use GPU acceleration for a few effects/features, such as realtime zoom and rotation, and a few other things ...

But since they are different and use a different technical architecture while processing data, as FrankT mentioned, it must be more complicated to add them to a rendering process, otherwise it would have been done long ago ... whether we are talking about Vue or other well-known, expensive high-end graphics packages like C4D, 3dsMax, Houdini, and whatever else they are called ..."*

GPU render engines are coming. See Octane Render (demo at http://www.refractivesoftware.com/, which works excellently already), an unbiased render engine running on Nvidia 8000-series GPUs and above. There's a GPU-based render engine coming from the Vray boys in the next few months as well, with several others in the pipeline.

The best part is, you can run these alongside a CPU render engine ...

Interesting times ahead ... oh yes.



Arraxxon ( ) posted Sun, 24 January 2010 at 6:13 PM · edited Sun, 24 January 2010 at 6:14 PM

Khai - thanks for the interesting info!
I know everyone is working on GPU-based rendering; I just didn't have any exact info on who is working on it and when something like it will first be released.

But looking at the Octane Render website, with their results and estimates of a 10x to 15x speed increase using just a mid-range GPU - not even high-end - we are talking about very impressive gains.

Like you said - interesting and exciting times are coming for sure ... !!


Khai-J-Bach ( ) posted Sun, 24 January 2010 at 6:29 PM

A huge thing they didn't even think of was found by accident in the chat room the other day ...

I told a guy about Octane, and he installed it and set one of the demo scenes rendering ... then remembered he had Poser 8 rendering at the same time. He was getting minimal slowdown in Poser and 6-8 frames a second on Octane's render - 2 separate projects at the same time on the same machine.

oh the possibilities...



Arraxxon ( ) posted Mon, 25 January 2010 at 5:18 PM · edited Mon, 25 January 2010 at 5:28 PM

Just read this on the Nvidia website - CUDA Zone - a Maya GPU renderer called "Furry Ball", supposed to be 100x to 300x faster than CPU rendering on a regular GeForce card ...

You can work in realtime in the render, for instance ... here is the website:

furryball.aaa-studio.cz/


Dave-So ( ) posted Thu, 04 February 2010 at 5:55 PM

Check here for lots of info on CPUs, video cards, etc. - updated daily:
http://www.cpubenchmark.net/high_end_cpus.html

Humankind has not woven the web of life. We are but one thread within it.
Whatever we do to the web, we do to ourselves. All things are bound together.
All things connect......Chief Seattle, 1854



silverblade33 ( ) posted Thu, 04 February 2010 at 6:50 PM

The problem is, guys, GPUs aren't standardized - i.e. ATI vs Nvidia architecture - and app makers have to consider that one of them may go bust one day, or a newcomer may appear.

And DirectX 11 is NOT the same as the Vue, Maya, etc. renderers. Game renderers don't have near the quality yet - good for animation, but not for big stills.
What E-On, Autodesk, etc. could do is make the app use "a GPU", that is, any GPU - but therein lies the rub: are ATI and Nvidia compatible, or do they need another layer of programming to interface?

You now add ANOTHER level of complexity and issues to Vue, Poser, or whatever app you use.
Maya etc. are backed by Autodesk, with huge resources and integration with the card makers, because here's ANOTHER rub:
pros use the special "pro" cards, not the gaming cards most of us use for graphics.
Very bloody expensive things indeed.
So: more complexity and expense ...

I do long for such to be used though - sigh - fekking 10-hour renders with the stuff I'm making at the moment, due to it being rendered IN the clouds' high atmosphere :(
We need realtime renders, or at least much faster ones, ASAP! And also UNBIASED renderers.

"I'd rather be a Fool who believes in Dragons, Than a King who believes in Nothing!" www.silverblades-suitcase.com
Free tutorials, Vue & Bryce materials, Bryce Skies, models, D&D items, stories.
Tutorials on Poser imports to Vue/Bryce, Postwork, Vue rendering/lighting, etc etc!


Khai-J-Bach ( ) posted Thu, 04 February 2010 at 6:59 PM · edited Thu, 04 February 2010 at 7:04 PM

Erm, just to note: the Octane Render I mentioned is NOT a game render engine but a full-fledged unbiased render engine using CUDA.

There is also a joint programming standard for using both ATI and Nvidia chips in this fashion - edit: remembered the name, OpenCL (http://en.wikipedia.org/wiki/OpenCL) - being used to bridge the 'chip gap' so either can be used in a render engine.

Read the link I posted about Octane - http://www.refractivesoftware.com/ - and see what's coming. (Also coming for LuxRender :) )



Khai-J-Bach ( ) posted Sat, 06 February 2010 at 11:05 PM

Here's information on LuxRender's GPU rendering: http://www.luxrender.net/forum/viewtopic.php?f=13&t=3439



BernieS ( ) posted Sun, 21 February 2010 at 8:03 AM

Anybody using the Intel Xeon W3520 CPU?

I just read an article on building a "graphics workstation", and it recommended the Xeon over the i7-920 because it supports ECC memory, which the i7 doesn't.

I'd be interested in any "real-life" experiences from Xeon users.

Thanks,

Bernie


EricNV ( ) posted Tue, 23 February 2010 at 12:54 PM

A question I have for anybody looking at GPU rendering: how do you plan to utilize OctaneRender's multi-GPU features once they're activated during beta or version 1? It seems to me you have to add a graphics card to your existing hardware setup just to be able to view and use other applications while OctaneRender is doing its job (OctaneRender gobbles up GPU memory).

In other words, if I want to scale up GPU render performance in a multi-GPU config with a 200-euro software product, I've got to spend $1500+ on a PC with Core i5/i7+ CPU(s), adequate memory and disk capacity, multiple full-length PCIe x16 slots, an adequate power supply, plus the graphics cards themselves.

Am I off-base here?

Eric

