New Alienware ALX workstation to have dual video cards
Posted by: get quad on: 05/14/2004 06:25 PM
In keeping with the "2 is better than 1" theme, the fan site famman.net reports that Alienware is soon to release its new ALX workstations featuring dual processors in addition to dual video cards working in parallel. Remember the old Voodoo2 SLI setup, anyone? Apparently, the setup will use a new Alienware-designed mobo featuring dual PCI-Express dedicated graphics slots supported by a next-gen chipset from Intel:
Video Array is an accelerated graphics processing subsystem that will allow users to add multiple, off-the-shelf video cards to their Alienware computer systems and have both cards process graphics commands in parallel. Understanding the wide-ranging wants and needs of its customers, Alienware designed its solution so that it is not tied to any one specific video card. This design will allow users to take full advantage of the fastest video card on the market for a significant performance increase. Alienware's exclusive Video Array, combined with X2, an Alienware-designed motherboard currently based on Intel Corporation's next-generation chipset and featuring dual PCI-Express high-performance graphics card slots, will deliver significant performance gains over current graphics solutions. The new Video Array technology and X2 motherboard will enable users to run graphics-intensive applications flawlessly at maximized settings, render 3D visuals in record time, and much more.
To get your drool on and see all the pictures from E3, be sure to read the entire article.
Comment
LRSeriesIII Aspiring Rocket Scientist Posts: 1120 Joined: 2002-08-29
Alright, I am a little confused. It looks like they are saying that it will be able to use multiple graphics cards to provide a performance improvement, even to single-monitor setups. This seems a bit odd, and I am really curious about the details of the entire setup. ->Computers ->Folding for team 3074
Comment
NerdZero Registered User Posts: 197 Joined: 2003-03-11
Seems like one video card will draw half the screen while the other video card will draw the other half. Craziness, but cool if it works. I wonder what will be more cost/performance effective? One smokin' video card or two mid-range?
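NerdZero's guess describes what later became known as split-frame rendering. As a toy model (a hypothetical sketch, not Alienware's actual algorithm), the frame can be divided into two horizontal bands sized in proportion to each card's fill rate, so both cards finish their band at roughly the same time:

```python
# Hypothetical split-frame rendering (SFR) sketch: two cards each
# rasterize one horizontal band of the frame. The split point is chosen
# in proportion to the cards' relative fill rates so neither card idles.

def split_scanlines(height, fill_a, fill_b):
    """Return (rows_a, rows_b): scanline counts assigned to cards A and B."""
    rows_a = round(height * fill_a / (fill_a + fill_b))
    return rows_a, height - rows_a

# Two equal mid-range cards split the frame 50/50.
print(split_scanlines(768, 1.0, 1.0))  # -> (384, 384)
# A card twice as fast gets two thirds of the scanlines.
print(split_scanlines(768, 2.0, 1.0))  # -> (512, 256)
```

This also hints at the cost/performance question: two mid-range cards only help if the workload really is dominated by per-pixel raster work that can be divided this way.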
Comment
opus13 misanthrope. Posts: 1574 Joined: 2002-04-05
hmm, i seriously doubt this is going to be an 'alienware only' deal, as they are a system builder, and have zero history of actually creating anything on their own. they are not inventors, just simple resellers, much like gateway.
Comment
TheFeds "It's the Feds" Posts: 128 Joined: 2002-07-29
And it's a dual Xeon...so that means we'll be seeing one around here, sooner rather than later. Right? [Quote]Alienware
Comment
Forge Registered User Posts: 720 Joined: 2001-05-12
I really don't see how Alienware is going to get two cards to draw one screen. It WILL require special hardware/software to do that, and that doesn't jibe with their 'off-the-shelf' line. I wouldn't be surprised to find out that the 'big performance improvements' are actually being claimed for PCIe over AGP, and that the two cards will do their own thing, 2+ monitors involved. Registered Linux user 82133 (li.org has a short memory)
Comment
matroxgaming "I'm The King!!!!!!!!!!" Posts: 63 Joined: 2004-04-17
Two high-end Xeons, two 6800U and a rack of Raptors... 700 PS??
Comment
gribo Registered User Posts: 241 Joined: 2002-03-13
judging from the cards in those pictures, they are not top-of-the-line cards (no power connector, no BGA memory chips). it is quite possible that they have modified drivers to get the frame buffer output through the PCIe to a third RAMDAC-only card. a very interesting solution. i wonder what the benchmark results are.. Don't pat a burning dog
Comment
Rellik SMP project 2501 Posts: 224 Joined: 2001-11-21
The most interesting thing about this rig is the fact that two DIFFERENT cards can be used. That is rather strange. I am thinking ultimate reviews between Nvidia and ATI cards. Odd thing is, since performance is different, how will it keep the cards in sync? Maybe the tertiary card works as a buffer. Another thing about the power cords. Is it not feasible that the PCI-E standard provides more juice to the cards? And maybe they don SMP+SCSI=Multitasking Heaven. Current rig
Comment
Joey Jo Jo Jr. Registered User Posts: 80 Joined: 2002-01-16
I'm glad to see this in a way. I had a dual Voodoo2 SLI setup when they were brand new and stupid expensive. It allowed me to run Unreal at 1024x768x16 at 40fps! (P2 400@450). That was pretty fast for those days and that game, as some of you may remember. I always wondered why graphics cards didn't continue along the lines of dual. Back in the days just before the first GeForce card was released, I cared enough to call Alienware because their site said they were working on a dual Voodoo3 setup. They said the project was cancelled because the upcoming card named GeForce had specs so good that even a dual V3 couldn't compete. The rest is history, of course. I still say that graphics card manufacturers should offer their own dual-card solution. It would extend the life of vid cards for users as well as allow for a longer chipset development cycle. Unless someone is going to tell me that dual cards (that didn't require a stupid proprietary mobo) were possible with the awesome technology of 1997, but not today?
Comment
Forge Registered User Posts: 720 Joined: 2001-05-12
Good try, but no cigar. Nvidia has bridges for all their NV3* chips. The PCIe FX 5200 (GeForce PCX 5200) is popular at the trade shows for demonstrating that your PCIe is working. If you look at the pictures, though, you can see where the two PCX 5200s in the dual Xeon system are both outputting into the small card at the bottom. THAT card is hooked to the monitor. So, yeah, it's just ATI's AFR tech, being implemented with two cards instead of two chips. I doubt it'll be the godsend everyone's thinking, though. I'm fairly certain both cards will need to run the shaders independently, so you're simply doubling fill rate, best case. By mixing an ATI and an Nvidia card, as everyone seems so interested in doing, you'll only end up with the WORST of both worlds! The faster shading card will need to wait on the slower one, or lose sync. Also, while doubling fillrate sounds good, the last generation or two haven't been thirsting for fillrate much; they come up wanting more shader power. I'll take my graphics single. We'll see who is still laughing in 6 months. Registered Linux user 82133 (li.org has a short memory)
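Forge's sync argument can be made concrete with a toy model (a hedged sketch, not ATI's actual AFR implementation): in alternate-frame rendering, card A renders even frames and card B odd frames, but frames must be presented in order, so the pair's sustained rate is set by the slower card.

```python
# Hypothetical alternate-frame rendering (AFR) model: two cards render
# alternating frames in parallel, but in-order presentation means each
# pair of frames takes as long as the SLOWER card's frame time.

def afr_fps(frame_time_a_ms, frame_time_b_ms):
    """Steady-state frames per second for two pipelined AFR cards."""
    return 2000.0 / max(frame_time_a_ms, frame_time_b_ms)

# Matched cards: throughput doubles (each alone would do 100 fps).
print(afr_fps(10.0, 10.0))  # -> 200.0
# Mismatched ATI + Nvidia pairing: the fast card waits on the slow one,
# so you get no more than two slow cards would deliver.
print(afr_fps(10.0, 20.0))  # -> 100.0
```

The mismatched case is exactly the "worst of both worlds" outcome: pairing a fast card with a slow one yields the same rate as two slow cards.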
Comment
Big B Psychic or Psycho? Posts: 3631 Joined: 2001-07-03
Perhaps that's the reason we haven't seen any real effort for dual AGP slots since the Micron Samurai... MSI Z97S SLI Plus.Pentium G3258 @ 4.3GHz. 8GB GSkill DDR3-2133. Seagate 320GB. WD 1TB+160GB+160GB. LG DVDRW. XFX Radeon 7850. XFX 650W PSU. CoolerMaster 212 EVO. Win 7
Comment
AMDScooter Registered User Posts: 172 Joined: 2002-10-30
I thought it was interesting also to see the unmarked Koolance water blocks on the CPUs and GPUs. If this dual GPU/PCI-E setup works, it would be nice to see it marketed for the masses, which I'm sure someone will do eventually if it actually works well. On the downside, this setup takes 3 PCI slots on its own. Good thing more mobo manufacturers are tossing everything you need except good audio onto the newer motherboards. I'll be keeping an eye on this one... --==My Heatware amdscooter==--
Comment
Erik Olofsson Registered User Posts: 86 Joined: 2001-10-21
Not quite true. Pixel shader performance = fillrate. With this setup you get double the pixel shader performance, the same vertex shader performance, and end up losing CPU performance. So if you are fillrate limited this should help. It all depends on the application though. Server Cube black case + 2xAthlon MP 1800+ + ASUS A7M266-D rev 103 + AX7 coolers + 3 GB Reg ECC Kingston
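Erik's point is essentially an Amdahl's-law argument, and a small model makes it explicit (a hypothetical sketch with made-up timings, not measured numbers): doubling the cards only halves the pixel/fill portion of the frame time, while vertex and CPU work are unchanged.

```python
# Hedged sketch of a fillrate-limited workload: a frame's time is split
# into CPU, vertex, and pixel (fill) portions. Adding a second card only
# divides the pixel portion; the rest is untouched, so overall speedup
# depends on how fillrate-bound the application is.

def frame_time_ms(cpu_ms, vertex_ms, pixel_ms, num_cards=1):
    """Total frame time when only pixel work scales across cards."""
    return cpu_ms + vertex_ms + pixel_ms / num_cards

single = frame_time_ms(4.0, 2.0, 10.0)      # 16 ms: fillrate-heavy frame
dual = frame_time_ms(4.0, 2.0, 10.0, 2)     # 11 ms: only pixels halve
print(single, dual, round(single / dual, 2))  # speedup ~1.45x, not 2x
```

So even a perfectly fillrate-bound frame gains well under 2x once fixed CPU and vertex costs are counted, which is why "it all depends on the application."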
Comment
proffesso Winner of the Internet Posts: 1485 Joined: 2001-06-25
time to sell the house...dual Quadro FX 4000s...dual monitors. sweet deal. Quadros lose a fair chunk of speed over two screens, so a card for each...that's some serious firepower. chasing tokyo girls
Comment
dogbait Registered User Posts: 195 Joined: 2003-08-23
Those who remember 3dfx and their pass-through cables might recall the deterioration in 2D image quality that came with them, especially at higher resolutions. With resolutions even higher today, I'd say this solution is only for the die-hard gamer, unless something can be done with those DVI outputs on each card. New lead free motherboard* *Supply your own solder.