Forum Coordinators: RedPhantom
Poser - OFFICIAL F.A.Q (Last Updated: 2025 Jan 11 12:18 am)
Dunno about specific cards, but I do know there's overhead with splitting out of a DVI port; you may want to look into something that actually has two VGA ports to cut that down. Matrox made a good one a few years back, and I think both of the major companies (ATI and Nvidia) do as well. Of course, there's also the old trick of using an AGP/PCI Express card for your main monitor and a separate PCI card for your secondary one. Since there are two GPUs, there's no speed hit at all.
Doesn't the GeForce FX 5200 have both VGA and DVI ports out? Nvidia has been making this type of card since before the Ti4600.
C makes it easy to shoot yourself in the foot. C++ makes it harder, but when you do, you blow your whole leg off.
-- Bjarne Stroustrup
Contact Me | Kuroyume's DevelopmentZone
My new ATI card is like that too. I probably should have explained the overhead with the splitters better, though. It really comes down to the fact that you're using one GPU to run two monitors. The high-end cards with two ports usually have dual GPUs, or GPUs specifically designed for running two monitors. Of course, if you go too high-end, you're looking at very specific equipment that may require upgrades to your current system (i.e., a bigger power supply, water cooling, etc.).
Hmm. The 5200 is a pretty decent card. What CPU/RAM/disk does your system have? It may not be the graphics card that's the bottleneck.
The pen is mightier than the sword. But if you literally want to have some impact, use a typewriter.
That's why I'm surprised to see a 5200 without dual ports, but then there are different manufacturers of Nvidia cards (support and specs differ between them), and there are different 'versions' of the same major models ("Ultra", "Pro", "Gamer", and so on).
That said, I've never used a splitter on a single port for dual monitors: at first I used two cards (AGP and PCI), and now strictly dual-port AGP cards. The bottleneck could indeed be a slow system or a lack of resources, or it could just be the reality of driving two monitors off one GPU (whether the limit is in hardware or in software/drivers). A single port was never really designed to act as a dual-monitor interface.
Message edited on: 03/24/2005 14:12
Well, here are the specs for the machine:

- Dual AMD processor motherboard with 800 MHz FSB
- 2 x AMD 1800 chips
- 1 x 80 GB 7200 RPM drive (20 GB free)
- 1 x 200 GB 7200 RPM drive (100 GB free)
- 1 x 1 GB DDR RAM
- GeForce FX 5200 128 MB video card (DVI out)
- Monitors: NEC Multisync 1100+ (21") and NEC Multisync XV29+ (29")

That's her. Let me know if you see anything that might be slowing it down other than the video card.
DunjeonProductions: Nice system! By the way, do you have a single 1 GB RAM module, or two 512 MB modules? You can't use dual channel without a pair of identical RAM modules.

It looks like the FX 5200 is indeed the bottleneck. A good card, affordable and fast, is the GeForce 6800LE. It's an AGP card; its successor, the 6600, is a PCI Express card.

The fun part about the 6800LE is that it's actually a 6800GT with half of its 16 pixel pipelines disabled, the same trick (and for the same reason, varying chip quality) that Intel used with its 486DX/486SX processors. But you can enable those pipelines with a software tool. Maybe not all of them, but you can often increase the number of pipelines from 8 to 12, and if you're lucky, you end up with a card performing like a 6800GT for the price of a 6800LE.
svdl has a good point. With dual processors, it is always best (and recommended) to populate the slots two at a time with identical modules per 'grouping' (usually A and B or similar). Each processor is then allocated its own memory, increasing performance.
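If you're on Linux, one way to see how the memory slots are actually populated (a hedged sketch: `dmidecode` needs root, and the exact slot labels vary by motherboard) is to read the SMBIOS memory tables:

```shell
# List each memory slot with its size and bank/slot label, to check
# whether modules are installed in matched pairs per grouping.
# Requires root; dmidecode reads the firmware's SMBIOS/DMI tables.
sudo dmidecode --type memory | grep -E 'Locator|Size'
```

A slot that reports "No Module Installed" for Size is empty; matched pairs should show identical sizes in the paired locators (e.g. A1/B1).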
Make sure that it is the same make and model from the same manufacturer. Mixing and matching may destabilize your system (or prevent it from booting).
Alright, so I have a GeForce FX 5200 with a VGA splitter on the DVI port. This connects to my two monitors, an NEC Multisync 1100+ (21") and an NEC Multisync XV29+ (29"). I'm just looking for advice on what video card might work better with these two monitors. The 5200 does a good job, but it does get slowed down when I'm doing more than one thing on each monitor. Thanks in advance for any advice.