
Renderosity Forums / Poser - OFFICIAL






Subject: Video Card Advice Needed


DunjeonProductions ( ) posted Thu, 24 March 2005 at 11:04 AM · edited Sat, 11 January 2025 at 1:46 PM

Alright, so I have a GeForce FX 5200 with a VGA splitter on the DVI port. This connects to my two monitors, a NEC Multisync 1100+ (21"), and a NEC Multisync XV29+ (29"). Just looking for advice on what video card might work better with these two monitors. The 5200 does a good job, but does get slowed down when I am doing more than one thing on each monitor. Thanks in advance for any advice given.


thixen ( ) posted Thu, 24 March 2005 at 11:28 AM

Don't know about specific cards, but I do know there is overhead with splitting out of a DVI port; you may want to look into something that actually has 2 VGA ports to cut this down. Matrox made a good one a few years back, and I think both of the major companies (ATI, Nvidia) do too. Of course there is also the old trick of using an AGP/PCI Express card for your main monitor and a separate PCI card for your secondary one. Since there are 2 GPUs, there is no speed hit at all.


DunjeonProductions ( ) posted Thu, 24 March 2005 at 11:31 AM

Yep, tried the separate PCI card tactic... but it seemed to slow down my 29" for some reason... I think it was only a 16 MB card, so maybe that affected it somehow. So the splitter does add overhead, eh? Well, that is something for me to look into now...


kuroyume0161 ( ) posted Thu, 24 March 2005 at 11:37 AM

Doesn't the GeForce FX 5200 have both VGA and DVI ports out? NVidia has been doing this type of card since before the TI4600.

C makes it easy to shoot yourself in the foot. C++ makes it harder, but when you do, you blow your whole leg off.

 -- Bjarne Stroustrup

Contact Me | Kuroyume's DevelopmentZone


thixen ( ) posted Thu, 24 March 2005 at 11:39 AM

" So the splitter does add overhead," I've always been told it does, and that from people who make the cards.


DunjeonProductions ( ) posted Thu, 24 March 2005 at 11:42 AM

kuroyume0161: Not the one I have... it does seem a little strange, though, that it has only DVI output on it... I would have thought that it would have at least a VGA output, but alas... only DVI.


thixen ( ) posted Thu, 24 March 2005 at 11:58 AM

My new ATI card is like that also. I probably should have explained the overhead with the splitters better, though. It really comes down to the fact that you are using one GPU to run 2 monitors. The high-end cards with 2 ports usually have dual GPUs or GPUs specifically designed for running 2 monitors. Of course, if you go too high then you're looking at very specific equipment that may need upgrades to your current system (i.e. bigger power supplies, water cooling systems, etc.)
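To put rough numbers on that, the scan-out load alone scales with resolution, color depth, and refresh rate for each attached screen. A minimal sketch (Python; the resolutions and refresh rates below are assumptions for illustration, not the poster's actual settings):

```python
# Rough scan-out bandwidth a single GPU needs just to refresh its screens.
# The resolutions and refresh rates below are assumed example values.
def scanout_mb_per_sec(width, height, bits_per_pixel, refresh_hz):
    """MB/s the GPU must read from the framebuffer to refresh one screen."""
    return width * height * (bits_per_pixel // 8) * refresh_hz / 1e6

# e.g. the 21" CRT at 1600x1200 @ 85 Hz, the 29" at 1280x1024 @ 75 Hz,
# both in 32-bit color
mon1 = scanout_mb_per_sec(1600, 1200, 32, 85)
mon2 = scanout_mb_per_sec(1280, 1024, 32, 75)
print(f"monitor 1: {mon1:.0f} MB/s")
print(f"monitor 2: {mon2:.0f} MB/s")
print(f"total scan-out load: {mon1 + mon2:.0f} MB/s")
```

Under those assumed settings that's roughly a gigabyte per second of framebuffer reads just for refresh, before any actual 2D/3D work, which is a meaningful slice of an early-2000s card's memory bandwidth.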


svdl ( ) posted Thu, 24 March 2005 at 12:58 PM

Hmm. The 5200 is a pretty decent card. What CPU/RAM/disk does your system have? It may not be the graphics card that's the bottleneck.

The pen is mightier than the sword. But if you literally want to have some impact, use a typewriter

My gallery   My freestuff


kuroyume0161 ( ) posted Thu, 24 March 2005 at 2:11 PM · edited Thu, 24 March 2005 at 2:12 PM

That's why I'm surprised to see a 5200 without dual ports, but then there are different manufacturers of Nvidia cards (support and specs differ between them) and there are different 'versions' of the same major models ("Ultra", "Pro", "Gamer", whatever).

That said, I've never used a splitter on a single port for dual-monitors. At first two cards (AGP and PCI) and now strictly dual-port AGP cards. The bottleneck could indeed be caused by a slow system or lack of resources. Or it could just be the reality of using dual-monitors off of one GPU (whether this is hardware or software/driver based). A single port was never really designed to act as a dual-monitor interface.




DunjeonProductions ( ) posted Thu, 24 March 2005 at 2:25 PM

Well, here are the specs for the machine:

Dual AMD processor motherboard with 800 MHz FSB
2 x AMD 1800 chips
1 x 80 GB 7200 RPM drive (20 GB free)
1 x 200 GB 7200 RPM drive (100 GB free)
1 x 1 GB DDR RAM
GeForce FX 5200 128 MB video card (DVI out)
Monitors: NEC Multisync 1100+ (21"), NEC Multisync XV29+ (29")

Well, that's her there... let me know if you see anything that might be slowing it down other than the video card.


Khai ( ) posted Thu, 24 March 2005 at 2:25 PM

That's interesting... my FX 5200 has dual ports. I run dual 17" CRTs. I do get slowdown, but that's not the GFX card, it's a bottleneck elsewhere (the rest of the system! lol). The CRTs hook up one to the VGA port and the other to the DVI port (via an adaptor).


svdl ( ) posted Thu, 24 March 2005 at 2:46 PM

DunjeonProductions: Nice system! By the way, do you have just one single 1 GB RAM module, or do you have two 512 MB modules? You can't use dual channel without a pair of identical RAM modules.

Looks like the FX5200 is the bottleneck indeed. A good card, affordable and fast, is the GeForce 6800LE. It's an AGP card; its successor, the 6600, is a PCI Express card. The fun part about the 6800LE is that it's actually a 6800GT with half of its 16 pixel pipelines disabled. The same trick (and the same reason - varying chip quality) that Intel used with its 486DX/486SX processors. But you can enable those pipelines with a software tool. Maybe not all of them, but often you can increase the number of pipelines from 8 to 12. And if you're lucky, you have a card performing like a 6800GT for the price of a 6800LE.



thixen ( ) posted Thu, 24 March 2005 at 2:54 PM

Well, if you have a high budget you could go for the BFG Nvidia 6800 Ultra OC. I heard it's like the mother of all graphics cards.


DunjeonProductions ( ) posted Thu, 24 March 2005 at 2:54 PM

svdl: Yep, it is just one 1 GB stick of RAM. Hmmmm, that's not a bad little piece of advice about the 6800LE, eh? Might just have to look into that... Thanks for your help!


DunjeonProductions ( ) posted Thu, 24 March 2005 at 3:07 PM

thixen: Yeah that card looks suh-weet, but pricing would be a bit of a factor... it would be like 800 bucks Canadian... ouch!


kuroyume0161 ( ) posted Thu, 24 March 2005 at 3:15 PM

svdl has a good point. With dual processors, it is always best (and recommended) to populate the slots two at a time with identical modules per 'grouping' (usually A and B or similar). Each processor is then allocated its own memory, increasing performance.
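One crude way to check whether a memory change actually helped is to time a large block copy before and after the upgrade. This is only a sketch (Python; a synthetic copy is a rough proxy for sustained memory bandwidth, and dedicated benchmark tools are far more accurate):

```python
# Crude single-core memory-bandwidth probe: time copying a large buffer.
# Run it before and after installing the second matched module to compare.
import time

def copy_bandwidth_mb_s(size_mb=64, repeats=5):
    """Approximate MB/s of a big memory copy (best of several runs)."""
    buf = bytearray(size_mb * 1024 * 1024)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        dst = bytes(buf)          # forces a full read and write of the buffer
        best = min(best, time.perf_counter() - t0)
        del dst
    # each pass reads size_mb MB and writes size_mb MB
    return 2 * size_mb / best

print(f"~{copy_bandwidth_mb_s():.0f} MB/s effective copy bandwidth")
```

The absolute number matters less than the relative change between runs on the same machine; interpreter overhead and caching make it a comparison tool, not a measurement.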



DunjeonProductions ( ) posted Thu, 24 March 2005 at 3:18 PM

Hmmm.. I will have to try that out... I can get an extra gig of RAM fairly cheap, will have to test it out...


kuroyume0161 ( ) posted Thu, 24 March 2005 at 4:01 PM

Make sure that it is the same make and model from the same manufacturer. Mixing and matching may destabilize your system (or prevent it from booting).



DunjeonProductions ( ) posted Thu, 24 March 2005 at 6:30 PM

Awesome, thanks for the advice!

