
Renderosity Forums / 3D Modeling



Welcome to the 3D Modeling Forum

Forum Moderators: Lobo3433

3D Modeling F.A.Q (Last Updated: 2024 Nov 24 8:50 pm)

Freeware 3D Modeling Software Links:
Blender | Trimble Sketchup | Wings 3D | Anim8or | Metasequoia | Clara IO (Browser-based 3d modeler)

Check out the MarketPlace Wishing Well, a content creator's resource for your next project.

"What 3D Program Should I buy?" Not one person here can really tell you what's best for you, as everyone has their own taste in workflow. Try the demo or learning edition of the program you're interested in, this is the only way to find out which programs you like.






Subject: Can you recommend a good $200-250 graphics card?


marciz ( ) posted Thu, 08 June 2017 at 6:56 PM · edited Fri, 22 November 2024 at 2:07 PM

I'm thinking of upgrading my graphics card--on the MOBO--to a desktop graphics card that should work well with my Win8 Dell PC, using mostly Poser 9 or Vue 11. I'm hoping to pick up some speed in rendering, hopefully for around $200-250. Any recommendations?


LuxXeon ( ) posted Thu, 08 June 2017 at 8:08 PM

I assume you're talking about GPU rendering specifically? Anything that isn't GPU rendering will obviously rely on the CPU for speed. I don't know what kind of GPU rendering architecture Poser uses, but I've heard its render engine is based on the Blender Cycles engine. Cycles has two GPU rendering modes: CUDA, which is the preferred method for Nvidia graphics cards, and OpenCL, which supports rendering on AMD graphics cards. In my experience rendering with Cycles, Nvidia CUDA cards seem to provide the best render results, depending on how many cores the card has. The Nvidia GeForce GTX 1060 is a very good choice for around $250. The specs boast between 1152 and 1280 CUDA cores and 3-6GB of VRAM, depending on the version. There may be other cards in or around this price range, but make sure you check the core count if you're looking for render speed improvements. Onboard VRAM will also help support more geometry and textures in the scene.

https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1060/

______________________________________

My Store
My Free Models
My Video Tutorials
My CG Animations
Instagram: @luxxeon3d
Facebook: https://www.facebook.com/luxxeon


LuxXeon ( ) posted Thu, 08 June 2017 at 8:14 PM

You can get a 1060 with 3GB and 1152 CUDA cores on NewEgg for $229.

I don't know whether linking directly to a store like NewEgg is against TOS, so I'll simply post the text. You can copy and paste it into your browser to check it out. You might find other cards for around the same money, but again, be sure to compare CUDA cores and VRAM to get the best bang for the buck.

https://www.newegg.com/Product/Product.aspx?Item=N82E16814487284&cm_re=graphics_card--14-487-284--Product



SinnerSaint ( ) posted Thu, 08 June 2017 at 9:22 PM

If you wanna save some money and not do a deal with the devil (Nvidia), then go with AMD. You can get an AMD RX 480 for less $$ than a GTX 1060, but you'll get a full 4 gigabytes of onboard RAM and faster performance when rendering with Cycles. Not to contradict Luxx again (sorry mate), but Nvidia is not the only game in town for GPU rendering. You'll find that if a renderer gives you the option to use OpenCL instead of CUDA, then you're wasting money if you don't go with AMD. Shop around, but you can get a really great AMD card for a lot less than an Nvidia card and get more speed out of your purchase. 😁


marciz ( ) posted Fri, 09 June 2017 at 8:52 PM · edited Fri, 09 June 2017 at 8:53 PM

All very helpful, folks! I'm not tech savvy when it comes to video cards; the specs seem daunting to understand. The machine has an Nvidia GeForce GT 620 as a video card--and I'm not actually sure it IS on the motherboard (but I think it likely). So I'm pretty much a newbie when it comes to video cards. But I'd done a little research that said a desktop card rather than a gaming card would be better for 3D rendering--though it did say the desktop cards are pricey.
Looks like what you've given me so far is a very good start!


LuxXeon ( ) posted Fri, 09 June 2017 at 11:44 PM · edited Fri, 09 June 2017 at 11:52 PM

marciz posted at 11:33PM Fri, 09 June 2017 - #4307174

All very helpful, folks! I'm not tech savvy when it comes to video cards; the specs seem daunting to understand. The machine has an Nvidia GeForce GT 620 as a video card--and I'm not actually sure it IS on the motherboard (but I think it likely). So I'm pretty much a newbie when it comes to video cards. But I'd done a little research that said a desktop card rather than a gaming card would be better for 3D rendering--though it did say the desktop cards are pricey.
Looks like what you've given me so far is a very good start!

I certainly understand the tech talk and numbers can be confusing and misleading, no worries. However, if by desktop card you mean a workstation-level graphics card like the Nvidia Quadro, then the information you read was probably referring to on-screen display rendering, such as would be necessary for a professional CAD artist or engineer. In the case of high-end workstations that require screen redraws of very complex geometric shapes and scenes, the Quadro cards are usually recommended.

However, in the case of GPU-enhanced rendering like that found in the Blender Cycles or Octane render engines, a gaming-level card is just as effective, or in some cases even faster, than a workstation Quadro card that costs thousands of dollars more. One would wonder how that is possible. The answer, once again, is in the number of cores the card has. Without getting too involved in comparisons, suffice it to say that most gaming cards, like the GTX line, contain more cores for polygon throughput than the much more expensive workstation cards do. Most GPU render engines like the ones we use to render polygonal 3D scenes use the graphics card's CUDA or OpenCL cores to push the speed of the render.

Workstation cards have features like more onboard VRAM (memory), which allow them to perform different tasks and hold much more information at one time than a gaming card, but they don't carry a large number of cores. So rendering in Cycles with a $4000 Quadro card would be slower than rendering the same scene with a $1000 Titan X gaming card, because the Titan X has more active rendering cores at higher clock speeds than the higher-priced, powerful Quadro.

Also, you wouldn't really find a workstation level Quadro which had a high core count in the price range you specified. You can, however, find a gaming level GTX card with a large number of rendering cores for that price. I hope that clears things up a little.



LuxXeon ( ) posted Sat, 10 June 2017 at 12:04 AM

By the way, before anyone suggests otherwise, I don't mean to make it sound like a Quadro workstation card isn't as good as a much cheaper gaming card, so please do not misinterpret that. I'm only talking about render engines which use CUDA cores for rendering speed, such as Cycles and Octane. Of course, a $4000 Quadro workstation card has greater overall benefits than a GTX card if you're running seriously computationally intensive tasks like GPU-enhanced video processing or fluid dynamics, etc. For high-end GPU computation, a professional workstation-level card is always best if you can afford it. However, it won't give you the best bang for your buck in render engines like the ones we're talking about here, when you compare cost vs. core count specifically. The only exception would be if you plan on rendering very large scenes with a great number of very large textures, and needed more VRAM on your graphics card to handle that. Then again, in the $250 range, you won't see many cards with more than 4GB of onboard VRAM either.



marciz ( ) posted Mon, 12 June 2017 at 5:31 PM · edited Mon, 12 June 2017 at 5:40 PM

Yes, LuxXeon, your info is beginning to clear things up for me. After all, I'm likely not going to encounter situations where I'm rendering very high poly counts, etc. My average still picture file in Poser is, like, 3MB at 300dpi. Plus, I'm out of the loop completely on animating.
Vue is a different story; one can get high counts even with their lower-end programs if you populate a scene with forests, complex cloud forms, etc. So basically, I'm just trying to get a bit more speed for whatever I'm rendering.

So do you think I'd gain any render speed by upgrading the video card, even to a card in the price range I'm in? (Despite whether or not I understand the specs.)
And secondly, how do I find out which I already have: CUDA or OpenCL? I think I queried Dell--the manufacturer--a number of years ago and got a vague answer. I'd asked because I bought a copy of Pret-a-3D's Reality renderer some years ago, and they offered a choice of download files between OpenCL and probably CUDA, I can't recall; I just wanted to know the correct version of their plug-in to use. (Though it didn't work well in either version, and I couldn't easily find Reality's info specifying which version of Poser it needed, which I didn't have.)

Thanks once more!


LuxXeon ( ) posted Mon, 12 June 2017 at 6:08 PM · edited Mon, 12 June 2017 at 6:09 PM

Ok, this is where things can get very murky if you aren't somewhat educated about the hardware specs your software requires, and about your current hardware situation. Unfortunately, my recommendations are limited by the information you can provide. Understandably, this can be a confusing topic even for some intermediate graphics enthusiasts, so let's tread lightly to start with. First, let's find out exactly what hardware you're running on your machine. I know you're looking to upgrade your graphics card, but in order to know whether that will be the biggest benefit to you right now, you need to make sure of two things: first, that the software you intend to render with can fully utilize GPU acceleration; and second, which GPU architecture the software uses for GPU-accelerated rendering. This should be the very first step. You may be thinking you'll benefit most from a graphics card, but if your software is merely GPU-enhanced, and not fully utilizing the GPU over the CPU, then your CPU will still be the biggest bottleneck in terms of render speed.

For example, there are some render engines that use the Central Processing Unit (Intel i5, i7, Xeon, etc.) to do most of the work during a render, but will allow you to include the GPU cores, if you have any, to assist the CPU in the rendering process. These render engines will still use the CPU, but incorporate any GPU cores in the rendering as well, like little helpers. I don't use Poser, so I don't know for sure whether its render engine allows you to use ONLY the GPU for rendering (which puts the entire load on your graphics card), or whether it's GPU-assisted rendering, which means the CPU is still doing most of the work. I don't use Vue either, but I can do a quick search on that one for you if you like. Pret-a-3D's Reality engine, I believe, uses GPU-assisted rendering if you choose it. This means it's still rendering with the CPU, but involving the GPU cores if it can. Daz Studio's Iray engine, on the other hand, can do CPU, GPU, or both. That particular engine is very versatile, and it would benefit you greatly to have at least a GTX card with 1000 or more CUDA cores, like the 1060 I mentioned earlier.

First find out for sure whether the software you use supports CUDA or OpenCL rendering, or whether it's simply GPU-assisted and still using the CPU for most of the work. Then find out exactly what CPU your system currently has, if you can. For example, is your system an i5, an i7, or an AMD processor? It's important to know how modern your CPU is, because most CPUs produced in the last few years can take advantage of a collection of high-performance ray tracing kernels, developed at Intel, called Embree. Almost all of today's CPU-powered render engines use that to speed up renders. Once you have this information, we can recommend the best choice for you, or at least what we would do in your situation. I don't want to tell you to go buy a $250 GTX 1060 if your software cannot use CUDA.
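If it helps while you're gathering that information, here's a quick way to pull the CPU model and core count with a few lines of Python (assuming a Python 3 install; this is just a convenience sketch I'm adding, not part of any of the software discussed above, and the exact output strings vary by OS):

```python
# Quick system check: CPU name, logical core count, and OS version.
# Runs with any Python 3 interpreter; no third-party packages needed.
import os
import platform

print("Processor :", platform.processor())    # CPU identifier string (varies by OS)
print("Cores     :", os.cpu_count())          # number of logical cores
print("OS        :", platform.system(), platform.release())
```

On Windows you can get the same details from Control Panel > System, but this is handy when you want the info as text to paste into a forum post.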

If you know exactly the make and model of your computer, I can just look it up for you and see what it's running. But I'd need to know the exact make and model number of the computer, if possible.



kenmo ( ) posted Tue, 13 June 2017 at 11:59 AM

I would prefer a 256bit card over 128bit.


LuxXeon ( ) posted Tue, 13 June 2017 at 2:29 PM

kenmo posted at 2:26PM Tue, 13 June 2017 - #4307396

I would prefer a 256bit card over 128bit.

For video editing and other memory intensive tasks, I would agree. For increasing render speed in GPU render engines like we may be referring to here, memory interface bit depth really doesn't matter. BTW, the GTX 1060 is 192-bit.

To be honest, I'm not sure the OP really needs a graphics card to help with his render speed. Unless his software is using 100% GPU rendering, his CPU may be the bottleneck anyway. I'm hoping to establish exactly what his system specs are, and if the software he runs can use GPU or simply GPU enhanced rendering.



davidstoolie ( ) posted Tue, 13 June 2017 at 2:59 PM

Lux, Vue is using a GPU/CPU interactive path tracer. http://www.e-onsoftware.com/products/vue/vue_2016_complete/?page=14

If you look at the recommended system specs to run Vue, they say they are using OpenGL, and require an OpenGL-accelerated video board for optimal compatibility. They also warn at the bottom of the page that Nvidia's Optimus or ATI's BinaryGFX technologies will present severe OpenGL problems. So does that mean they're using OpenGL instead of OpenCL or CUDA?

It looks like a lot of the advanced rendering features in Vue are crippled in the path tracer too. It seems very primitive and only good for quick preview or viewport rendering. I don't think clouds are even supported in Vue's path tracer yet.


marciz ( ) posted Thu, 15 June 2017 at 12:40 PM · edited Thu, 15 June 2017 at 12:44 PM

Sorry I took a while to get back here, been busy.

I'm taking notes here, for sure. Well, here's some info; again, my card is the Nvidia GeForce GT 620. I dug up the specs via Dell, and I'm pretty sure it's an XPS 8500, since the manufacture date in this PDF is consistent with the one atop my PC's cover--nowhere on the box or on the unit is the model number, which I'm not sure of since I got it new in late 2012.

PDF here: http://downloads.dell.com/Manuals/all-products/esuprt_desktop/esuprt_xps_desktop/xps-8500_reference%20guide_en-us.pdf

It's the i5 processor, btw, as that is stamped on the front. It has 4 cores.


And also the graphics card:
Total available graphics memory: 4096 MB
Dedicated graphics memory: 1024 MB
Dedicated system memory: 0 MB
Shared system memory: 3072 MB
DirectX version: 11 or better

There's a note at the bottom of the page: "The gaming graphics score is based on the primary graphics adapter. If this system has linked or multiple graphics adapters, some software applications may see additional performance benefits."


Some further info via Control Panel:
Intel(R) Core(TM) i5-3350P CPU @ 3.10 GHz
RAM: 8 GB
64-bit OS, Windows 8 (my note: not updated to 8.1)
And under "Performance":
Processor (calculations per second): 7.6
Memory (RAM) (memory operations per second): 7.8
Desktop graphics performance: 5.2 (base score, determined by lowest subscore)
Gaming graphics (3D business and gaming graphics performance): 6.4
Primary hard disk (disk data transfer rate): 5.9


As to whether Poser 9 or Vue Esprit 11 use OpenCL or CUDA, I'll have to contact them (an article on Vue said they went with GPU rendering in '16, but I think I got my version before then; I just need more info for you). I tried to be thorough and give you as much info as I can for now, and your help is very much appreciated.

Marc


LuxXeon ( ) posted Thu, 15 June 2017 at 2:47 PM · edited Thu, 15 June 2017 at 2:49 PM

Thank you for providing the information there, Marc. The first thing that jumps out at me is that you're running a 3rd-gen i5 with only 8GB of system RAM. Intel i5 processors, with the exception of the i5-4570T, are all quad-core, so when you render you're using only 4 cores to create the image. Granted, you have a good clock speed @ 3.10 GHz, but you'd still render much slower than someone with a Core i7 @ 2.8 GHz, because they'd be using double the number of threads to render. You also have a very early generation processor, which is far less efficient than more recent generations of Intel processors, even at slower clock speeds. Your 8GB of DDR3 system RAM may not impact rendering directly, but it certainly creates some bottleneck in data transfer on the system compared to newer, faster DDR4.

The Nvidia GeForce GT 620 has only 96 CUDA cores running at 700MHz. This card, of course, is very slow compared to even a low-end GTX card, which could have over 1000 more cores than what you currently have. In fact, it's not doing much to help your i5 render a scene, if it's being used at all. When you compare the number of CUDA cores in your GT 620 to a GTX 1060, you can see there's an enormous performance difference. Here's a benchmark comparison:

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-1060-6GB-vs-Nvidia-GeForce-GT-620/3639vsm8899
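For a back-of-the-envelope sense of that gap, you can multiply CUDA cores by base clock. This ignores architectural differences between GPU generations, so treat it as a rough paper comparison (core counts and clocks below are the published spec-sheet figures for the two cards), not a measured benchmark:

```python
# Very rough GPU compute comparison: CUDA cores x base clock (MHz).
# Ignores per-generation architecture improvements, so ballpark only.
def raw_throughput(cores, clock_mhz):
    return cores * clock_mhz

gt620 = raw_throughput(96, 700)        # GeForce GT 620: 96 cores @ 700 MHz
gtx1060 = raw_throughput(1152, 1506)   # GTX 1060 3GB: 1152 cores @ 1506 MHz

print(f"GT 620  : {gt620:,}")
print(f"GTX 1060: {gtx1060:,}")
print(f"Ratio   : {gtx1060 / gt620:.1f}x")   # roughly 26x on paper
```

Even as a crude estimate, it shows why the GT 620 barely registers on the benchmark charts next to a 10-series card.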

Your current card does not even register on most of the tests, so an upgrade would absolutely help you with what you want. The only thing is, you've got to make sure you have the available slot to put the new card in, because it is much larger than your current card and requires a lot more power. Your current machine is not quite up to the standards of the average machine I see using these new-generation GTX cards, and I'm not sure your board or your power supply will hold up.

In all honesty, I would recommend waiting and saving some money to purchase a new machine down the road. AMD are coming out with their new line of processors now (Threadripper and Ryzen), and Intel are introducing the i9 soon, so the cost of later generation i7 processors will come WAY down very quickly when that happens. A processor upgrade will help your render times immensely. I also recommend no less than 16gb of ram for doing almost any kind of graphics intensive work, but that's another story.



marciz ( ) posted Thu, 15 June 2017 at 7:04 PM · edited Thu, 15 June 2017 at 7:09 PM

So, you're saying 16GB of RAM in the current computer, or in a newer model?

I was thinking today that yeah, even a low end card with more cores might speed things up and get me what I need without going nuts with more new hardware. Hadn't thought about the power requirement--though I've seen similar info online.

Pretty sure this machine has PCIe, though. Would an upgrade of the power supply, and another 8 gigs of RAM, help in this same machine? Those two upgrades are not usually terribly expensive, because I'm usually satisfied with the low price end of brands I know and trust.

Think I'll also contact the software vendors this week and ask about that OpenCL and/or CUDA for renders.

Thanks a lot so far! M


LuxXeon ( ) posted Thu, 15 June 2017 at 10:04 PM

Regarding the RAM, I don't know offhand if your current mobo can support the newer generation of RAM modules, but if you have slots available for additional DDR3 modules, an upgrade there would be very inexpensive considering the reduced price of DDR3 these days. So at the very least, I'd say you would definitely benefit from more RAM, just to be able to handle larger datasets in your system overall and speed up data transfers through your CPU. Keep in mind, though, that it will not make a noticeable impact on your render speeds. Increasing RAM will only keep your system running more smoothly; you won't see a speed improvement in your renders. Still, 16GB (or more) will make for a much more stable working environment for memory-intensive applications like Vue. So, all in all, upgrading the RAM is a recommendation to help you in the long run, not for the immediate speed boost you're seeking.

If you're sure your mobo can accept a larger GTX desktop card, and your power supply puts out around 480W, then I'd recommend the GTX 1060. Factory-overclocked 1060 variants might fall in your price range as well. First, though, you'll need to find out what kind of GPU rendering the software uses--whether it takes advantage of CUDA cores, OpenCL, etc. It's not clear from their literature that the software uses the GPU for rendering outside of interactive preview rendering with path tracing.



marciz ( ) posted Sat, 17 June 2017 at 9:00 AM

Well, thanks for all the help, LuxXeon.

I don't want to be a pest, because you've helped immensely so far; but it occurred to me today--there's really no big emphasis on buying new, recent equipment here. So, what if I were to find an older device that would fit the specs/architecture of my current PC--keeping in mind I still should find out about OpenCL/CUDA? This machine is, after all, only 5 years old, and I'm sure there's plenty of used equipment floating around on eBay. (Yeah, I know, ya pays your money & ya takes your chances.) Anyway, you've given me a load of knowledge I didn't have 2 weeks ago.


LuxXeon ( ) posted Sat, 17 June 2017 at 2:56 PM · edited Sat, 17 June 2017 at 3:01 PM

Are you talking about purchasing some used equipment that is newer than that which you currently have? Not sure what you mean there specifically, but I always hesitate to purchase any used or refurbished hardware because you just never know what the true condition of those parts is or how much life they have left. Also, there's no warranty on used parts of course.

Anyway, when you say your system is "only" 5 years old, I should point out that 5 years is an eternity in the technology market. New and improved computer tech is coming out almost daily, so if your machine is over 3 years old, it's pretty much past its warranty and considered obsolete in terms of modern technology specifications. For example, since around the time you purchased your system, SSDs have been on a meteoric rise as much faster and more reliable solid state technology has taken over the industry at a blistering pace. Having a modern SSD to run your OS can make a world of difference in the data transfer speed of your computer, improving every aspect of your experience, from booting the OS to launching and running software applications. It won't necessarily help your render times, but it will make a difference in just about everything else you do on your system.

So again, if all you really want is a noticeable increase in render speed in your current applications, then a new discrete GPU with more cores may help (provided your software supports offline GPU rendering with CUDA or OpenCL to begin with). Your current system does have some other bottlenecks that will eventually need attention though: the CPU, the hard drive, the RAM. All of these play a crucial role in optimizing your experience when doing graphics processing of any kind. If you're considering replacing any of these with used parts, I would highly recommend against it, based on personal experience with used computer parts. If it were me, I would hold off until I could comfortably afford an entirely new system. Of course, that decision is ultimately up to you and how you feel about buying used parts.



SinnerSaint ( ) posted Sat, 17 June 2017 at 7:25 PM

I disagree with lux about used parts. You can get some killer deals for new and used stuff on Newegg.com! I saw a GTX 970 for sale there for only $130.00.


LuxXeon ( ) posted Sat, 17 June 2017 at 9:39 PM

SinnerSaint posted at 9:28PM Sat, 17 June 2017 - #4307777

I disagree with lux about used parts. You can get some killer deals for new and used stuff on Newegg.com! I saw a GTX 970 for sale there for only $130.00.

Well, the GTX 970 is a really great card with excellent rendering power: 256-bit, 4GB VRAM, with 1664 CUDA cores. However, if it's refurbished, you have to wonder if it's performing at optimum potential.



marciz ( ) posted Sat, 17 June 2017 at 10:46 PM · edited Sat, 17 June 2017 at 10:53 PM

Well, the computer I write this message to you on is probably around ten years old. I also have a 12-year-old machine running XP that is using a 17-inch scanner that is SCSI connected! I just installed a used IDE DVD burner in it as well, runs fine. I have an even older system with XP on it that I use as an mp3 jukebox into my stereo system. And my first computer still works, running Win98! (Though I've no use for it, ha ha.)

But of course, the graphics cards needed for the more sophisticated tasks we've been discussing are likely best served by new equipment to install it in. It's just that if something I own works and still does what I need it to do, I'm gonna hang onto that sucker, and keep maintaining it. Anyway, thanks to all who replied. I'm going to use my day off to do some research, lots of info out there online. I need to get myself edjamacated about all this newfangled gear...and yes I am aware of SSD. There are drawbacks there as well, however. I just haven't been keeping up, because I've been satisfied with what I've got....:) Marc


LuxXeon ( ) posted Sun, 18 June 2017 at 10:40 AM

Marc, I not only respect that outlook on old technology, but I typically follow the same principles in my own life. I still have computers that I used 15 years ago and try to keep everything in good working order. However, when it comes to production, I also maintain a modern workstation for my rendering and model creation based on the latest tech, because the software I typically use demands the most horsepower. I still use a 10-year-old HP Pavilion Media Center TV m7750n Desktop PC as a Network Attached Storage device (although I did upgrade the storage drive to a 4TB Seagate SATA).

However, when it comes to rendering, my workstation uses an Intel i7-7700K CPU @ 4.20 GHz, a GTX 1070 8GB, and a Samsung 850 EVO 250GB SSD. My livelihood depends on modeling and rendering, so I need as much speed as I can afford. I usually try to upgrade every 3 years, or just before the warranty expires, so I can resell the current parts to help finance newer ones. If you're working as an enthusiast or hobbyist, then all that matters is that you're satisfied with your current productivity and the experience you get in the software you choose to work with.



SinnerSaint ( ) posted Mon, 19 June 2017 at 11:38 AM

That's funny, because the Nvidia Gtx 1070 I have in my gaming system is showing specs of GDDR5 @ 8.0 GB (256-bit). So looks like even the Gtx cards are 256 bits now.


LuxXeon ( ) posted Mon, 19 June 2017 at 12:41 PM

SinnerSaint posted at 12:34PM Mon, 19 June 2017 - #4307962

That's funny, because the Nvidia Gtx 1070 I have in my gaming system is showing specs of GDDR5 @ 8.0 GB (256-bit). So looks like even the Gtx cards are 256 bits now.

Yep, that's correct. There are certain lines of GeForce GTX 10-series cards which also have a 256-bit memory interface. The GTX 1070 and 1080 are both 256-bit cards. The 1060, however, is 192-bit.

Currently, the most powerful card Nvidia has is the Titan XP. It has a whopping 3,840 CUDA cores, 12GB of GDDR5X, and a massive 384-bit memory interface. The Titan XP is quite a powerful beast and is probably the fastest card on the market for rendering polygons. Sticker price on the Titan XP is currently around $1200, so still cheaper than most Quadro cards, and much more powerful in terms of rendering capability.



ShawnDriscoll ( ) posted Mon, 19 June 2017 at 10:54 PM

Try to get X70 or better, whatever the series of GTX. They have the 256-bit. I have a GTX 670 that (I think) may need to be replaced. Two days ago, my SKYRIM stopped playing very well after 4 years of use. 6000 hours almost.

www.youtube.com/user/ShawnDriscollCG


LuxXeon ( ) posted Tue, 20 June 2017 at 11:26 AM · edited Tue, 20 June 2017 at 11:30 AM

Shawn, keep in mind not all of the GTX 7x line cards are 256-bit. For example, the GTX 760 is 256-bit, but the GTX 750 is only 128-bit. You'd have to check the specs for each card in that generation. However, a 256-bit memory interface isn't necessarily going to benefit render speed, because the clock speed of each core is so much lower than the 1060's, which is also in his price range. For example, the GTX 1060 has between 1152 and 1280 CUDA cores, depending on which version you get. Each core has a base clock of 1506 MHz with a boost clock of 1708 MHz. The GTX 760 Ti has more cores, at 1344, but the clock speed of each core is only 915 MHz with a boost of 980 MHz. Even though the interface of the 760 is 256-bit, its memory transfer rate is slower than the 1060's too. Another disadvantage is that the GTX 7x generation cards use the older Kepler architecture, which is far less power efficient than the new Pascal. So I'm not sure the GTX 7x generation would be the best investment for render speed, but pretty much any GTX card will show dramatic improvement over his current configuration. So it all depends on how much he's willing to spend, and whether the software truly supports GPU rendering.



ShawnDriscoll ( ) posted Tue, 20 June 2017 at 3:14 PM · edited Tue, 20 June 2017 at 3:14 PM
LuxXeon ( ) posted Tue, 20 June 2017 at 5:40 PM · edited Tue, 20 June 2017 at 5:41 PM

ShawnDriscoll posted at 5:31PM Tue, 20 June 2017 - #4308081

X70. Not 7X.

OK, I think I see what you mean now. Any GTX generation with 70 or higher in the last two digits should be at least 256-bit?

______________________________________

My Store
My Free Models
My Video Tutorials
My CG Animations
Instagram: @luxxeon3d
Facebook: https://www.facebook.com/luxxeon


SinnerSaint ( ) posted Tue, 20 June 2017 at 5:58 PM

Lux is right about one thing: it doesn't matter if a card has a 256-bit bus if its RAM is slower than a card with a 192-bit bus. The "bits" figure is just the bus width. You should really be looking at memory bandwidth, not just bits.
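To put numbers on that, here's a hedged sketch. The bus widths and data rates below are illustrative figures, not tied to any specific card:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gt_s):
    # Peak memory bandwidth = bus width in bytes * effective data rate.
    return bus_width_bits / 8 * data_rate_gt_s

# A wide bus with slower memory can tie (or lose to) a narrower bus
# with faster memory -- the bit count alone tells you very little.
wide_slow = bandwidth_gb_s(256, 6.0)    # 192.0 GB/s
narrow_fast = bandwidth_gb_s(192, 8.0)  # 192.0 GB/s
print(wide_slow, narrow_fast)
```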


LuxXeon ( ) posted Tue, 20 June 2017 at 7:58 PM · edited Tue, 20 June 2017 at 7:59 PM

SinnerSaint posted at 7:54PM Tue, 20 June 2017 - #4308101

Lux is right about one thing: it doesn't matter if a card has a 256-bit bus if its RAM is slower than a card with a 192-bit bus. The "bits" figure is just the bus width. You should really be looking at memory bandwidth, not just bits.

That's the point I was trying to make earlier in the thread. Besides, I think we're straying off the OP's topic anyway, since all he really cares about is increasing render speed with the GPU. The most important factors for GPU rendering are the number of cores and their clock speed. VRAM matters too, but secondarily: it determines how large a scene and how many textures the card can handle, while almost all GPU render engines depend on the CUDA cores for speed.

______________________________________

My Store
My Free Models
My Video Tutorials
My CG Animations
Instagram: @luxxeon3d
Facebook: https://www.facebook.com/luxxeon


marciz ( ) posted Thu, 22 June 2017 at 9:27 AM

LuxXeon posted at 9:21AM Thu, 22 June 2017 - #4308108

SinnerSaint posted at 7:54PM Tue, 20 June 2017 - #4308101

Lux is right about one thing: it doesn't matter if a card has a 256-bit bus if its RAM is slower than a card with a 192-bit bus. The "bits" figure is just the bus width. You should really be looking at memory bandwidth, not just bits.

That's the point I was trying to make earlier in the thread. Besides, I think we're straying off the OP's topic anyway, since all he really cares about is increasing render speed with the GPU. The most important factors for GPU rendering are the number of cores and their clock speed. VRAM matters too, but secondarily: it determines how large a scene and how many textures the card can handle, while almost all GPU render engines depend on the CUDA cores for speed.

I'm back after some delays.

Lux, I see your point about keeping things up to date; that's your livelihood!

At the moment, I'm in a hurry, so I'll have to zip back here later to try to decipher what you're saying. That way maybe I can look at specs of various cards that have been suggested--Shawn, SinnerSaint--and perhaps make a buying choice (?).

right, gotta run!


maxxxmodelz ( ) posted Sat, 22 July 2017 at 7:28 PM

At this point in time, I can tell you it's a terrible time to buy a new GPU. Terrible time. Prices are badly inflated right now because the cryptocurrency-mining craze is saturating the market. Miners are buying new cards by the tens of thousands and driving up costs. If I were you, I'd go with a used or refurbished card right now and wait for prices to drop. Just be careful if you buy a used card: find out what it was used for, because if it was used for mining, chances are it's been run at 100 percent its entire life and probably won't live much longer.


Tools :  3dsmax 2015, Daz Studio 4.6, PoserPro 2012, Blender v2.74

System: Pentium QuadCore i7, under Win 8, GeForce GTX 780 / 2GB GPU.


marciz ( ) posted Sun, 23 July 2017 at 8:00 AM

Thanxxx, MaxxxModelz.

I'm always careful about anything I bid on on eBay, but would anyone admit to having mined with a card they're trying to sell?

M


LuxXeon ( ) posted Sun, 23 July 2017 at 2:41 PM

MaxxxModelz raised a good point, but there's no sure way to know whether a card was used for cryptocurrency mining. It would be extremely rare to find a seller willing to admit that while trying to get rid of the card. A good rule of thumb is to treat any current-generation card you see for sale on eBay as suspect and probably used for mining, especially now that AMD and Nvidia may release cards built specifically for miners. You'll likely see a ton of recent-gen cards flooding the market, and many of them will have been run hard their whole lives.

https://www.engadget.com/2017/06/27/nvidia-and-amd-currency-mining-cards/

______________________________________

My Store
My Free Models
My Video Tutorials
My CG Animations
Instagram: @luxxeon3d
Facebook: https://www.facebook.com/luxxeon


marciz ( ) posted Mon, 24 July 2017 at 5:00 PM

Say, LuxXeon, to change the subject just a little: when I use Poser 11's SuperFly renderer, it uses 90-95% of my CPU, and that's for a LOW-resolution image! As you may recall, I posted my computer's specs here. I'm pretty sure it's using the CPU, not the GPU. I'm not sure if it's using all 4 cores, or whether there's a setting to tell it to use more if it's only using, say, one. I suppose I should get around to studying the manual...

As to another card, sounds like I'd be better off with new gear...


LuxXeon ( ) posted Mon, 24 July 2017 at 7:32 PM · edited Mon, 24 July 2017 at 7:33 PM

marciz posted at 7:31PM Mon, 24 July 2017 - #4310644

Say, LuxXeon, to change the subject just a little: when I use Poser 11's SuperFly renderer, it uses 90-95% of my CPU, and that's for a LOW-resolution image! As you may recall, I posted my computer's specs here. I'm pretty sure it's using the CPU, not the GPU. I'm not sure if it's using all 4 cores, or whether there's a setting to tell it to use more if it's only using, say, one. I suppose I should get around to studying the manual...

As to another card, sounds like I'd be better off with new gear...

Well, that's why I'd suggest buying new, or, if you do buy used, going with an older-generation card that's less likely to have been used for mining. Anything below the Nvidia GTX 9xx generation should be OK; the miners are mostly using newer-gen cards. I don't know anything about the SuperFly engine in Poser specifically, but I have heard it's based on Blender's Cycles engine. If so, I'm sure it uses all available cores for rendering unless told otherwise. Cycles has a setting to specify how many threads the render engine should use, but by default it tries to use every core.
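For illustration, here's a sketch of that auto-versus-fixed thread behavior. `render_thread_count` is a hypothetical helper for explanation, not an actual Poser or Cycles API:

```python
import os

def render_thread_count(mode="AUTO", fixed_threads=1):
    # AUTO: use every logical core the OS reports, which is what
    # Cycles does by default. FIXED: honor a user-specified count.
    if mode == "AUTO":
        return os.cpu_count() or 1
    return max(1, fixed_threads)

print(render_thread_count())            # all logical cores
print(render_thread_count("FIXED", 2))  # 2
```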

You can get an overview of whether the renderer is using all your cores by watching the Resource Monitor in Windows 10 (assuming you have Win 10). It lists every process running on your system, along with the number of threads and the CPU usage of each, in real time. Open Task Manager, go to Performance, then click "Open Resource Monitor" at the bottom of the window, then go to the CPU tab. You'll see columns for Threads and CPU.

______________________________________

My Store
My Free Models
My Video Tutorials
My CG Animations
Instagram: @luxxeon3d
Facebook: https://www.facebook.com/luxxeon


maxxxmodelz ( ) posted Thu, 27 July 2017 at 12:33 PM · edited Thu, 27 July 2017 at 12:42 PM

The Resource Monitor is available in earlier versions of Windows as well. I have Win 7 on one of my machines and it's there. They added a couple of features in 10, but you can still monitor how much CPU each app is using in earlier versions.

I prefer using CPUID HWMonitor. It monitors just about everything about the hardware on your system, right down to fan speeds and temperatures. Get the portable version though, not the setup version; you can run it right from a USB stick.


Tools :  3dsmax 2015, Daz Studio 4.6, PoserPro 2012, Blender v2.74

System: Pentium QuadCore i7, under Win 8, GeForce GTX 780 / 2GB GPU.

