Subject: AMD RX580 and Cycles


Retrowave posted Fri, 31 January 2020 at 5:38 AM · edited Thu, 28 November 2024 at 2:40 PM

Calling all AMD-Using Blender Heads!

I recently built a new PC, completely AMD-based: chipset, CPU, and GPU. It continues to blow my mind, but I noticed something the other day. The GPU, for some reason, does not appear to be making use of its on-board processing cores when I put Blender into Cycles mode. But here's the thing, I know the card is installed and working correctly, because in EEVEE mode it runs like greased lightning!

When I first ran Blender on the new system, I switched to Cycles mode and initially took no notice of whether it was using the CPU or GPU. The performance was so good that I just assumed I was using the GPU. Turns out my AMD CPU is faster than my old nVidia GPU. That's a good thing, and something I never would have expected, but the problem is that the new CPU is also outdoing the new GPU by a massive margin, and that's not expected at all.

The important thing to note here is that Blender is obviously making proper use of the GPU in EEVEE mode, but in Cycles mode it performs worse than the old nVidia GPU did. The old nVidia GPU was an ancient GTX460 with only 1GB RAM and 336 processing cores, whereas the new AMD GPU is an RX580 with 8GB RAM and 2304 processing cores. The AMD is an absolute beast, and like I said, it's certainly doing its thing in EEVEE, but in Cycles mode it works, yet the performance is notably worse than the CPU's.

The CPU is an AMD RYZEN 7 2700, the energy-efficient version with an 8-core, 16-thread architecture. It performs flawlessly, so flawlessly I actually thought I was using the GPU, so there is something amiss somewhere with the GPU setting. I've checked everything, and everything is working fine except that, for some reason, Cycles doesn't seem to want to know about the 2304 processing cores available to it in the RX580, and it is being outperformed by the CPU. But remember, this is not the case with EEVEE; with EEVEE it's very obvious it must be using those cores, because everything is just instant, so it must literally be rendering the scene in 1/60th of a second.

I'm expecting workmen to turn up and upgrade the heating in this place some time during the next few weeks, so I've boxed the PC back up until they've been. But I thought I'd ask about this in the Blender forum so that, hopefully, I have something to go on once the workmen have left and I can pull it back out of the box. Any help on this is appreciated; I have no idea why it's not using the GPU's cores in Cycles mode.


Lobo3433 posted Fri, 31 January 2020 at 8:03 AM
Forum Moderator

Cycles does not support the RX580 under CUDA. You will need to select OpenCL instead of CUDA for rendering with AMD cards in Preferences; that would be the best setup for your current system.

[Screenshot: Blender Preferences > System, showing the CUDA device tab with a GTX 980 Ti selected]
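(For anyone who prefers doing this from the Python console rather than the Preferences window, here is a minimal sketch against the Blender 2.8x Python API; device names will differ per machine:)

    import bpy

    # Point Cycles at the OpenCL backend instead of CUDA (for AMD cards)
    prefs = bpy.context.preferences.addons['cycles'].preferences
    prefs.compute_device_type = 'OPENCL'
    prefs.get_devices()              # refresh the detected device list
    for device in prefs.devices:
        device.use = True            # tick every detected device

    # Cycles also needs the per-scene render device switched from CPU to GPU,
    # otherwise it will keep rendering on the CPU even with the card enabled
    bpy.context.scene.cycles.device = 'GPU'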




Retrowave posted Fri, 31 January 2020 at 12:21 PM

Thanks Lobo, but are you absolutely sure it's not a supported card?

If I go to the OpenCL tab, I see pretty much the same as you see on the CUDA tab: the RX580 is the listed GPU and is selected, just as you have your 980Ti selected. I can't imagine why AMD, who actively support Blender, wouldn't have their new cards supported when cards from nVidia, who do not support Blender, work fine. I know that CUDA cores are different to the cores on an AMD card, but the point is they are cores, and even EEVEE uses them.
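(A quick way to double-check what Cycles actually sees is to print the device list from Blender's Python console; a rough sketch, again assuming a 2.8x build:)

    import bpy

    prefs = bpy.context.preferences.addons['cycles'].preferences
    prefs.get_devices()                      # refresh the device list
    print("Backend:", prefs.compute_device_type)
    for d in prefs.devices:
        # Each entry shows the device name, its type (CPU/CUDA/OPENCL),
        # and whether it is actually ticked for rendering
        print(d.name, d.type, "enabled" if d.use else "disabled")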

From what you say, you sound very confident, and you are probably right, but I hope not, because the main reason I bought this card was for Blender, and it was a top recommended card for Blender on some website out there. I know AMD have released their own renderer for Blender (called ProRender) that is supposed to outperform Cycles on an AMD card, so maybe that's why it was recommended. Nevertheless, I am shocked that AMD, who support Blender, have not yet got the Cycles development team to support AMD's cores as well as nVidia's.

I'm surprised Ton even stands for Blender supporting a proprietary thing like CUDA while something open like AMD's approach is not supported yet. It seems absolutely backward to me, even more so considering AMD are kind of friendly with Blender, and nVidia are not.

I'll try it in AMD's "ProRender for Blender" mode when I get the machine back out, and if the recommendations were true, it should be fully supported and blisteringly fast. If not, I suppose I could change the card for an nVidia one, but I don't like nVidia and their arrogant, proprietary way of doing things, and I intend to be using Linux, where again AMD are open about drivers and nVidia are not.

That said, it does use the AMD cores in EEVEE, and the CPU is so fast I thought I was using the GPU anyway, so I don't suppose it matters. I'll just have to check out "ProRender for Blender" on YouTube, give it a try later, and bloody well hope it is as good as they say it is.


Retrowave posted Fri, 31 January 2020 at 4:12 PM

I see where my confusion lies now. It was actually related to "ProRender for Blender", which is AMD's non-proprietary answer to CUDA, and something AMD have been working with the Blender team to develop.

Basically, AMD users need to install "ProRender for Blender" to enable GPU-powered rendering that uses the processing cores on the AMD GPU. It also gives you "Full Spectrum Rendering", whatever that is, and judging by the comments on various YouTube videos, this is AMD's way of crushing CUDA, since it is open and works on both brands of cards.

I get why the Blender team worked with them on it now, lol, smart move Blender (and AMD).

Anyway, I haven't tried it yet, but I will as soon as the heating work in this place has been done. I'm off to YouTube now to find out what Full Spectrum Rendering is, but for those who are not aware of the GPU thing, here's what you need to install to enable GPU Blender rendering on your AMD card - PR for Blender
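(Once the add-on is installed, it can also be switched on from the Python console. A sketch only; I'm assuming the add-on module is registered as rprblender and the engine ID is 'RPR', so check against your own install:)

    import bpy

    # Enable the Radeon ProRender add-on (module name assumed; verify it
    # under Edit > Preferences > Add-ons if this raises an error)
    bpy.ops.preferences.addon_enable(module="rprblender")

    # Switch the scene over to ProRender instead of Cycles/EEVEE
    bpy.context.scene.render.engine = 'RPR'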


Lobo3433 posted Fri, 31 January 2020 at 4:43 PM
Forum Moderator

For as long as I have been using Blender, it has always supported NVidia cards, and they have always been recommended as the best cards to use with it. Now, it is my understanding (and on this point I might be off) that Blender wants to get to a point where it can support CPU and GPU rendering simultaneously, or at the very least offer comparable rendering speeds for either. EEVEE was a big step forward for Blender and made Blender's original render engine pretty much obsolete, which is the main reason Cycles was adopted. I know that there is also a free Octane version of Blender that uses Octane as its default render engine, but it seems to require some hoop jumping to get working right. I have not heard of the ProRender you mention; I will look into it so I am informed. And IMHO I do not see Blender ever dropping CUDA, since that has been one of the foundations Blender was built on, going all the way back to the 2.49 days. I look forward to hearing what your results are.




Retrowave posted Fri, 31 January 2020 at 5:58 PM

Sorry Lobo, I could have worded my post a little better. Blender will always support nVidia, regardless of CUDA being proprietary, but what I meant was that AMD have designed their cores to be better utilised through open software implementations like OpenCL and Vulkan, and they have done so to break the monopoly nVidia currently have with their proprietary, GPU-based CUDA. Bear in mind I'm talking about AMD's tactics, not Blender's. AMD will do everything they can to hurt nVidia, and by supporting open source they are using the same successful formula for bringing the competition down as Blender is.

The reason AMD made a smart move is that their API is open, and any program developer can incorporate AMD's ProRender GPU rendering into their product; it's not just for Blender. And here's the thing: the AMD tech will be more desirable to a developer than implementing CUDA, because unlike CUDA it is not restricted to customers with nVidia cards. Basically, AMD have turned the tables so that nVidia's proprietary model shoots itself in the foot. And just to make it hurt a little harder, AMD hardware is designed from the ground up to perform better using these open standards, whereas nVidia hardware is not; nVidia is optimised for closed-source CUDA. That's one of the major differences in the design of AMD hardware and its GPU cores compared to CUDA cores.

This is why, for some years now, AMD have been an active sponsor and supporter of Blender, and the Blender development team have worked closely with AMD to get ProRender working. This is why, when you said that the RX580 isn't supported, I thought: what the hell, how on earth can that be when AMD are pretty much in bed with Blender these days, and it was a top recommended card for Blender?

My apologies though, I completely forgot that it was ProRender that made it a suitable card. I just assumed it rendered on the GPU using Cycles, so I didn't make the connection when I read about it. I'm well pleased I'll be keeping this card though, especially as I will be using Linux as my OS, because that's another reason I basically went with AMD everything (safe and secure open drivers, etc.).

Anyway, I'll let you know how it goes, but it might not be for a few weeks yet. I'm not sure how developed it is, and I'm reading different opinions on how it compares to Cycles at this stage, but it's open source, developed by AMD, and has Blender behind it, so it's here to stay, and that's the main thing. It ought to develop pretty damn quickly with those two behind it.


LuxXeon posted Fri, 31 January 2020 at 7:20 PM · edited Fri, 31 January 2020 at 7:22 PM

As of Blender 2.8, Nvidia has been working very closely with the Blender Foundation on enhancing GPU raytracing with Nvidia cards. This is why we now see accelerated support for OptiX (RTX) as well as CUDA in the latest releases. The latest RTX drivers and cards can now speed up GPU renders by as much as 4 times compared to just using the CUDA path with Cycles. Nvidia began funding the Blender Development Fund in a major way back in October, and it's likely that Cycles will be developed around Nvidia GPU drivers and hardware going forward, although AMD is still supported and being developed as well. I feel the perfect cost-effective pairing is an AMD CPU with an Nvidia GPU at this time, since almost all well-developed GPU render engines take heavy advantage of the Nvidia architecture these days. CUDA support is also found in many other types of GPU-enhanced workflows, such as Adobe products, which are also moving toward RTX for raytracing support, so with this in mind I do recommend the Nvidia GPU solution for the time being.
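(For what it's worth, OptiX is picked in the same place as CUDA and OpenCL; the only change from the OpenCL sketch earlier in the thread is the backend string:)

    import bpy

    # 'OPTIX' requires an RTX-class card and recent drivers;
    # older Nvidia GPUs should stay on 'CUDA'
    bpy.context.preferences.addons['cycles'].preferences.compute_device_type = 'OPTIX'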

Although I tend to go with Intel for the CPU, since Embree support is something I need in other packages, and the Intel denoiser in Blender is amazing. I think that's open source though, and should also work on AMD?



Retrowave posted Fri, 31 January 2020 at 7:37 PM

Aye, but nVidia are only sniffing around Blender because AMD are working with them. It'll end in proverbial tears for them unless nVidia make CUDA open hardware and open-source their drivers. Personally I hope they do, because if they did, such heated competition would drive the cost of both brands' cards down a fair bit, but I doubt we could be that lucky.

To be clear, I have nothing against nVidia hardware as such, and I was always very impressed with my GTX460. But the longer I live in the world of software and hardware, the more I learn that open source is the safest bet, and that means nVidia would be a non-starter for me, as I intend my Linux installation to run completely on open-source drivers.


Lobo3433 posted Fri, 31 January 2020 at 7:46 PM
Forum Moderator

It always brings a smile to my face when I see threads like this in our Blender community, where we can share information and present opinions on why we choose this or that. It's just a great sharing of information, and I usually learn something new. I was hoping you would drop in, John, since we had some discussions about GPUs in the past, and I knew you would bring an additional wealth of info.




Retrowave posted Fri, 31 January 2020 at 7:54 PM

Sorry LuxXeon, forgot to reply about the denoiser. I'm not sure, to be honest, but that video I posted earlier mentioned something about a denoiser for AMD's ProRender thing, and that's definitely open source.


Retrowave posted Fri, 31 January 2020 at 8:00 PM

Yes, Lobo, very useful threads, even if you did scare the crap out of me earlier when I thought I'd bought a dud 😜


Lobo3433 posted Fri, 31 January 2020 at 8:13 PM
Forum Moderator

Sorry if I did! I have several places I look to confirm my info before I jump in with both feet, lol, and I am a fan of AMD in general; my last 3 builds were all AMD. For my current build, though, I went with an Intel chip and an Nvidia GTX 980Ti (built this system in 2016), and it has pretty much been flawless with Blender. I'm planning on updating to a new Nvidia RTX card; I might not be able to swing a Ti, but from looking at most of the specs of the RTX Super, I should be as happy as I was with the 980Ti.




Retrowave posted Fri, 31 January 2020 at 8:43 PM

lol, don't be sorry, I just might have bought a dud for all I know, but hopefully not. I'll find out for sure when I get it out of the box again once the heating in here has been done. My previous system was an AMD CPU and nVidia GPU, and it still works just fine; it served me well for well over a decade. It was a noisy bugger though, both the CPU and GPU. In contrast, my new build is so silent anyone would think it's passively cooled, no kidding. Makes me wonder what they're doing to modern fans to make them so quiet.


Lobo3433 posted Fri, 31 January 2020 at 9:12 PM
Forum Moderator

When I built my system I incorporated water cooling for the CPU, because I run my system almost 24/7; I always have it doing something, and if I am not on it, it is re-encoding video or rendering or something. At the time I built this one I had come into some money, so I really went for the best spec possible to make sure it lasts me at least 10 years, but I do need to invest in a new GPU. When you get your system back up, if you want to do some benchmarking, you might want to try this:

Blender Benchmark 2020

I benchmarked my system, and even with it being 4 years old, I was impressed that it did really well. You do not have to test all the scenes; I did the bmw27, classroom, and fishy_cat files, and you can then compare your results with others with similar setups.
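(If you'd rather time a single scene by hand, the demo .blend files can also be rendered headlessly with a tiny script. This is a rough sketch, assuming Blender 2.8x and a hypothetical script name; run it as blender -b bmw27.blend --python time_render.py:)

    # time_render.py: crude single-frame timing from inside Blender
    import time
    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'
    scene.cycles.device = 'GPU'      # set to 'CPU' to compare against the Ryzen

    start = time.time()
    bpy.ops.render.render()          # render the current frame in memory
    print(f"Cycles render took {time.time() - start:.1f} s")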



