
View Full Version : SLI? Gimmick or performance?



VirtualAce
03-22-2005, 05:08 PM
I'd like to hear your opinions on whether you think SLI is a gimmick or a real performance gain.

I think it's a gimmick. Here's why.

OK, so one video card does half the pixels and the other does the other half. Big deal. How does the other video card know which pixels to render? We are talking about 3D here. You CANNOT know where the pixels will be until you transform the vertices into screen space. So you still have to send your vertices to one card for processing. That card transforms them and then sends the ones that are not in its range of pixels to the other card. Big hairy deal. We already know that one video card can blit more pixels and render more triangles in one frame than there are pixels available in any screen mode. So fill rate is no longer the problem.

The real bottleneck is getting the vertices to the card, not with the card drawing the pixels to the screen.

So how does card A know what to send card B without first transforming all the vertices into screen space?? Now, if they found a way for the cards to know which vertices will fall where in screen space, then of course there would be a gain, because both cards would be transforming vertices.

But how would you do this? You cannot know which vertices will resolve to which pixels until you do some type of transformation on them.
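Just to illustrate what I mean, here's a throwaway C sketch (made-up camera-space vertex, made-up focal length and screen size, no real API): you only find out which pixel a vertex lands on after you do the perspective divide.

#include <stdio.h>

/* Toy example: you only learn which pixel a vertex covers AFTER the
   perspective divide, so you can't pre-sort vertices per card. */
int main(void)
{
    /* a vertex already in camera space (world transform done) */
    float x = 1.2f, y = -0.4f, z = 5.0f;

    const float d = 2.0f;          /* made-up focal length */
    const int   w = 640, h = 480;  /* made-up screen size  */

    /* perspective divide, then map to pixel coordinates */
    float sx = ( x * d / z) * (w / 2.0f) + w / 2.0f;
    float sy = (-y * d / z) * (h / 2.0f) + h / 2.0f;

    printf("vertex lands on pixel (%.0f, %.0f)\n", sx, sy);
    return 0;
}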

I say gimmick.

Sang-drax
03-22-2005, 05:45 PM
What you say sounds reasonable...

ober
03-23-2005, 07:53 AM
I hear what you're saying, Bubba, and it makes sense, but I still don't think it is a gimmick. I don't personally know how it all works or claim to understand why it is better, but I really don't think all these companies would put research and time into building hardware to support it if it wasn't worth the effort.

However, maybe what you're saying is true, and the reason it failed to reach production status several years ago is because they thought it wasn't that much of an advantage. Or maybe they didn't think the current hardware at that time could support it.

Either way, I'm stuck thinking there has to be something to the talk, otherwise manufacturers wouldn't be "walking the walk".

Damn I'm cheesy.

ober
03-23-2005, 08:00 AM
http://www.anandtech.com/video/showdoc.aspx?i=2284

Maybe this will shed some light?

Darkness
03-23-2005, 10:39 AM
The real bottleneck is getting the vertices to the card, not with the card drawing the pixels to the screen.

Well, I thought both could be problematic if either is substantially slow, but yeah, having to transfer stuff over buses and access memory tends to be teh suk

So how does card A know what to send card B without first transforming all the vertices into screen space?? Now, if they found a way for the cards to know which vertices will fall where in screen space, then of course there would be a gain, because both cards would be transforming vertices.


I don't think it has to. Granted, that could only even theoretically be a problem in one of the two SLI modes, because in the other mode the cards just take turns rendering entire frames (sort of like normal double buffering on a single card, except with two cards). But basically you've got a polygon soup (all your vertices and related data); the SLI driver keeps track of which GPU did the most work over the past few frames and predicts how best to split up the work between the two GPUs. Where a pixel ends up in the frame buffer doesn't ultimately matter, as far as I can see.
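Something like this is probably all the balancing boils down to. This is just a toy C sketch with made-up frame times and a made-up nudge size, not anything out of a real driver:

#include <stdio.h>

/* Toy split-frame load balancing: nudge the horizontal split line
   based on which GPU took longer on the previous frame.
   The timings are invented; a real driver would read them back. */
int main(void)
{
    const int height = 480;
    int split = height / 2;                        /* GPU 0 owns rows [0, split) */

    float gpu0_ms[] = { 10.0f, 12.0f, 15.0f, 9.0f };
    float gpu1_ms[] = {  8.0f,  7.0f,  6.0f, 9.5f };

    for (int frame = 0; frame < 4; ++frame) {
        if (gpu0_ms[frame] > gpu1_ms[frame])
            split -= 16;                           /* GPU 0 was slower: shrink its share */
        else if (gpu1_ms[frame] > gpu0_ms[frame])
            split += 16;                           /* GPU 1 was slower: shrink its share */

        printf("frame %d: GPU 0 gets rows 0-%d, GPU 1 gets rows %d-%d\n",
               frame, split - 1, split, height - 1);
    }
    return 0;
}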

EDIT:
Really good article, by the way, Ober. Very informative.

VirtualAce
03-23-2005, 12:20 PM
Very good article. Now that makes sense. One card renders one frame and the other renders the next. While one renders, the other processes. Sort of like a circular sound buffer: while one section is playing, the previous/next one is loading. Now that would be a performance gain.
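Conceptually it's just a ping-pong, something like this (submit_frame_to_gpu is a made-up stand-in for whatever the driver actually does):

#include <stdio.h>

/* made-up stand-in: in reality the driver queues the frame's commands
   on one card while the other card is still drawing the previous frame */
static void submit_frame_to_gpu(int gpu, int frame)
{
    printf("frame %d -> GPU %d\n", frame, gpu);
}

int main(void)
{
    for (int frame = 0; frame < 6; ++frame)
        submit_frame_to_gpu(frame & 1, frame);   /* 0, 1, 0, 1, ... */
    return 0;
}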

As for the actual split-frame SLI mode, again, I'm very suspicious of all this prediction stuff. So the only way a game can take advantage of SLI is simply to throw all the vertices at the cards during loading and let them choose from that pool. What that article fails to address is that we are still sending vertices over the bus, be it PCI Express or AGP. So if you need to send vertices to the card during the game, it's possible that SLI will actually be slower than non-SLI. Since video card memory is nowhere near enough to hold all your vertices, plus textures, etc., I think the other mode, where the cards take turns rendering frames, is the only mode where a performance gain could be attained.

Well, at least I wasn't too far off. I knew that both cards had to have access to all the vertices, because prior to transformation there is no way to know where in screen space a vertex will end up.

ober
03-23-2005, 12:28 PM
Just to be picky, there are no AGP SLI systems. They are all PCI Express.

RoD
03-23-2005, 01:52 PM
I, like ober, don't claim to know a lot about it. What I do know is that the two PCs I set up using it have seen performance gains, both in FPS and in the smoothness and quality of the gameplay.

InvariantLoop
03-23-2005, 05:20 PM
SLI as it is now is being used/developed only for gaming. There is an increase in performance, as shown by the various benchmarks. http://www.futuremark.com/community/halloffame/

VirtualAce
03-24-2005, 08:14 AM
Just to be picky, there are no AGP SLI systems. They are all PCI Express.

I never stated there was AGP SLI. SLI is expressly PCI Express (the old 3dfx SLI used plain PCI). I said that it doesn't matter, because the bottleneck, whether we are using PCI Express or AGP or PCI or ISA or any bus, is... the bus. That's the bottleneck. Not fill rate, not texture rate, not primitive rendering, not primitive counts, etc. It is the bus. So no matter what bus you are using, you have to get the vertices to the card before anything can happen. So even in an SLI system, the vertices must cross the bus at one time or another. So you would want to pre-load all the vertices into the card during loading and never send vertices on the fly. Sending them on the fly would totally screw up the prediction/caching scheme, because you would be constantly adding new data to the mix, which would invalidate the previous estimates of cache hits and misses. In fact, I could see that if a programmer failed to use SLI correctly, it might even be slower than non-SLI.
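Some back-of-the-envelope numbers to show what I mean. The bus figures are rough theoretical peaks from memory and the vertex size and counts are just examples, so treat it as ballpark only:

#include <stdio.h>

/* Rough arithmetic: what streaming vertices every frame costs the bus.
   Bandwidth numbers are approximate theoretical peaks, not measured. */
int main(void)
{
    const double agp8x  = 2.1e9;     /* ~2.1 GB/s peak               */
    const double pcie16 = 4.0e9;     /* ~4 GB/s peak, each direction */

    const double verts_per_frame = 1.0e6;   /* example: 1M vertices streamed */
    const double bytes_per_vert  = 32.0;    /* e.g. position + normal + UV   */
    const double fps             = 60.0;

    double bytes_per_sec = verts_per_frame * bytes_per_vert * fps;

    printf("streaming %.0f MB/s of vertex data\n", bytes_per_sec / 1e6);
    printf("that is %.0f%% of AGP 8x peak, %.0f%% of PCIe x16 peak\n",
           100.0 * bytes_per_sec / agp8x,
           100.0 * bytes_per_sec / pcie16);
    /* ...and that's before textures, index data and command traffic,
       and real buses never hit their peak anyway */
    return 0;
}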

ober
03-24-2005, 08:23 AM
Right... and I got your point, but I was trying to clarify for anyone else that might have gotten confused by the statement (I even had to read it a second time to make sure you weren't implying that).

SMurf
03-24-2005, 08:33 AM
SLI's great if you're one of those "uber-l33t" people who overclock their top-of-the-line CPU to 4GHz, using a cooling system reminiscent of a refrigerator (Trust me, I've seen one of these ;)). They just seem to have money to throw at hardware manufacturers. Hell, half of them probably even call ATI's or nVidia's offices screaming "D00D, LIKE WHERE'S YOUR NEW CARD MAN? WHY AIN'T IT SITTIN IN MY RIG DAWG? ROFL!!!111!!11!!" (Yes, they say the 1's and exclamation marks).

To summarise, if you have more money than sense/patience, you will buy two very expensive cards to run in SLI mode. These will do you fine for about 3 months, until the next model comes out that can match their performance in one card. But lo, it also has an SLI mode! Repeat ad nauseam. :rolleyes:

Game manufacturers will always be aiming for the processing power of a consumer PC around the time of their game's release. How likely do you think it is that Dell will make SLI standard in their systems?

Darkness
03-24-2005, 08:34 AM
The article admits SLI isn't even always faster than a single card. My single ATI does just fine.



Game manufacturers will always be aiming for the processing power of a consumer PC around the time of their game's release. How likely do you think it is that Dell will make SLI standard in their systems?

nada mucho senor

InvariantLoop
03-24-2005, 09:04 AM
To summarise, if you have more money than sense/patience, you will buy two very expensive cards to run in SLI mode. These will do you fine for about 3 months, until the next model comes out that can match their performance in one card. But lo, it also has an SLI mode! Repeat ad nauseam. :rolleyes:


Where did you come up with all this info? Have you even searched or read anything about SLI? Not all PCI Express cards are more expensive than AGP cards. You can buy two 6600 PCI-E cards for the price of one 6800GT AGP card.

Besides, gamers are the ones who push this technology. Do you think graphics cards would be where they are now if there were no interest in this field? I don't think so.

Darkness
03-24-2005, 10:04 AM
I kind of doubt that SLI will become that popular. Seems like too much of a hassle to have two cards. Also, having two of anything roughly doubles the probability of something going wrong, cuz if one card hits the high road the whole SLI setup doesn't work.

I'm honestly satisfied with the high-end nVidia and ATI cards anyway. I don't think I will be needing an upgrade for a while. I'd rather companies just produce good games instead of worrying about this proverbial pi$$ing contest over which game has the prettiest graphics. Doom3 and HL2 weren't *that* fun; I've played a lot of amateur games with relatively bad graphics that cost nothing to make and were more satisfying.

ober
03-24-2005, 10:06 AM
Wrong. If one hits the high road, you remove the bad one and switch out of SLI mode. It's not that hard.

And there are already people I work with who are talking about upgrading to it. And we've already had one member on the Tech Board who bought the hardware and will most likely get the second card to use it.

It is only a matter of time before more people start snatching them up. I'd also like to reiterate a point someone else made: you can buy two really nice low-end PCIe cards that will trump a high-end, more expensive single AGP card, so cost is not really the issue here.

Darkness
03-24-2005, 10:09 AM
Wrong. If one hits the high road, you remove the bad one and switch out of SLI mode. It's not that hard.

Yeah, true. It would still be a pain, and you'd still be left with just the one card, but you are right, you'd survive.

Well, hmm, maybe it'll catch on somewhat more than I initially thought, but I still don't think it will become that popular. We shall see.

ober
03-24-2005, 10:18 AM
It might be a pain... but what are the chances of it happening? You have about as much chance of that as of your single AGP card going down.

And still... it seems like you think AGP is faster than a PCIe card. I have a low-end 128MB 6600 PCIe card that benchmarks evenly with my co-worker's high-end (6800GT?... I don't know the model number) 256MB ATI AGP card. I paid about $120 for mine... he paid close to $400 for his. Both are running at stock speeds.

No offense, but your statements are uneducated at best.

Darkness
03-24-2005, 10:35 AM
No offense, but your statements are uneducated at best.

How? I happen to do a lot with graphics and I actually do know quite a lot about video cards; I've even been able to fiddle with the code from some old drivers. I didn't say the chances of a failure were HIGH, I just said it's twice as likely :)

oh well whatev

edit: and i still don't think sli will become mainstream :)

InvariantLoop
03-24-2005, 12:28 PM
SLI is not really a new concept; it's actually quite old. It was first used, if I remember correctly, by another GPU company, 3dfx if I'm not mistaken. But it did not get very popular due to the technology back then: the cards were very large, and combining two monster cards made it extremely difficult to use; no mainstream user would buy something that would not fit in their case. Later on, when nVidia bought 3dfx, they got the rights to SLI too, and they improved on it. They made smaller cards with more power, so this time SLI is here to stay, and the motherboard companies know this and support SLI with their PCI-E motherboards.

edit: I was unsure if it was 3dfx or some other company, so I did a little search and came up with this. It's a good read; it explains how the 3dfx SLI (Scan Line Interleave) and the nVidia SLI (Scalable Link Interface) work. http://www.neoseeker.com/Articles/Hardware/Features/nvsli/2.html
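The old 3dfx scheme really is as simple as it sounds; here's a toy loop just to show the idea (nothing vendor-specific in it):

#include <stdio.h>

/* Toy picture of 3dfx-style Scan Line Interleave:
   one card owns the even scanlines, the other the odd ones. */
int main(void)
{
    const int height = 8;   /* tiny "screen" so the output stays short */

    for (int line = 0; line < height; ++line)
        printf("scanline %d -> card %d\n", line, line % 2);
    return 0;
}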

RoD
03-24-2005, 02:23 PM
>> using a cooling system reminiscent of a refrigerator

say it like it's a bad thing :P

SMurf
03-24-2005, 03:37 PM
Where did you come up with all this info? Have you even searched or read anything about SLI? Not all PCI Express cards are more expensive than AGP cards. You can buy two 6600 PCI-E cards for the price of one 6800GT AGP card.
I never said anything that equated to "SLI = expensive". I said that given that you were this type of person:-


if (p4_3ghz_at_4ghz && fridge_cooling && muchos_wonga)
    printf("D00d, my dad's your dad's boss! LOLOLOLORZ!!1!");

You would buy expensive.

As you later said, the cheapest option for "SLI" is actually a couple of 3dfx Voodoo2s. I have a magazine somewhere showing a guy running Doom3 on them. :cool:

Fridge cooling might be a decent option (Particularly if you have space in your case for a six-pack), but the problem with heat pumps like that is that while it's very cool inside, it's a fair bit warmer outside, although I suppose if it's only coming out on your feet that's fine.

InvariantLoop
03-24-2005, 04:36 PM
I never said anything that equated to "SLI = expensive". I said that given that you were this type of person:-


if (p4_3ghz_at_4ghz && fridge_cooling && muchos_wonga)
    printf("D00d, my dad's your dad's boss! LOLOLOLORZ!!1!");

You would buy expensive.

As you later said, the cheapest option for "SLI" is actually a couple of 3dfx Voodoo2s. I have a magazine somewhere showing a guy running Doom3 on them. :cool:

Fridge cooling might be a decent option (Particularly if you have space in your case for a six-pack), but the problem with heat pumps like that is that while it's very cool inside, it's a fair bit warmer outside, although I suppose if it's only coming out on your feet that's fine.

No offence, but it seems that you don't know much about cooling in general, whether it's air cooling or water cooling. I will not mention LN2, because that is done on another level, for bragging rights and experimentation, not for practical reasons.

I did not say anything about 3dfx SLI being cheaper or more expensive. I said that even though the concept had potential back then, as has now been proven by nVidia's SLI, the technology at the time couldn't deliver the performance.

To conclude, I suggest you go read about water cooling, how it works, and what it can give you. A quieter system, that's for sure.

edit: btw, you are confused; people who overclock generally don't have $$$. That's the reason they buy smart and overclock even smarter. Why would I get a P4 3GHz when I can OC my 2.6GHz to 3GHz? Or why would I buy 6-8 fans when I can buy a water-cooling kit, push my OC potential even higher, have a quieter system, and in the end pay less and get the same performance as an Alienware? Anyway, read up on stuff, maybe go build your own system, OC a little and see why people do it.

SMurf
03-24-2005, 06:38 PM
My brother has a water-cooled Cube. Sure, when it's not doing anything it's silent, but do even the slightest thing to cause a CPU usage jump (Say, open a few Explorer windows very quickly) and you'll hear something akin to a vacuum cleaner suddenly come on. I prefer my system to be a bit more consistent with noise levels. :rolleyes:

And btw, when I say "fridge cooling", I mean by chemicals, not water. As in... a fridge.

Also, my point about this "hypothetical person" was that they do things like overclocking because they can, not because they make sense.

But you're right, I don't know much about SLI, or cooling in general. Plus I don't think I could find much info on overclocking a P3 these days. ;)

VirtualAce
03-25-2005, 12:08 AM
I did not say anything about 3dfx SLI being cheaper or more expensive. I said that even though the concept had potential back then, as has now been proven by nVidia's SLI, the technology at the time couldn't deliver the performance.


Wrong.

I used to run two Voodoo 3 3dfx cards in SLI mode. Their size was not an issue and the performance was great. It was better than one card at the time, but I'm not so sure that SLI is such a benefit today given the power we have on one card.

Clyde
03-25-2005, 06:40 AM
You sure you had Voodoo3s SLIed? I had dual Voodoo2 SLI, and I was under the impression that the Voodoo3 lacked SLI capability.

VirtualAce
03-25-2005, 07:18 AM
Yes, you are right, Clyde. It was two Voodoo2s I had SLIed. Man, it's been a long time since then. My Voodoo3 fried inside my ASUS rig - a known problem with ASUS and the 3dfx Voodoo3 at the time.

Thanks for the correction.

RoD
03-25-2005, 08:08 AM
Guys, why are you questioning Voodoo's SLI? Perhaps you have forgotten that the people making these cards are from 3dfx, and 3dfx started SLI and its concept....

RoD
03-25-2005, 08:08 AM
Also, if his water cooling was done right you wouldn't hear ANYTHING; time for him to redesign.

Clyde
03-25-2005, 09:54 AM
Man, it's been a long time since then


Heh, I had my dual Voodoo2s for a loooong time; I finally upgraded to a new comp with a 6800 GT. Slight change in performance :).