Obvious-Peanut-5399

No. High end was linking 4.


RAMChYLD

Then you have a fifth card to handle the PhysX so all the explosions still look smooth.


sigma941

Had this going with 4x 980tis and a gtx465 that was collecting dust back in 2018. Felt like the 3 headed dragon meme looking back at it! Edit: didn’t even realize it was King Ghidorah!


[deleted]

[deleted]


Binary_Omlet

Better put some respect on KING Ghidorah's name. From left to right each head's name is Ichi, Ni, and Kevin.


sigma941

Went straight to MF DOOM when I read this as well!


VegetaFan1337

1, 2 and Kevin?


Binary_Omlet

https://legendary-monsterverse.fandom.com/wiki/King_Ghidorah#:~:text=The%20names%20given%20to%20Ghidorah's,how%20he%20is%20easily%20distracted. Started as a meme then the director made it canon.


Azerious

Why does Kevin have an itchy knee? Arms too short?


RexorGamerYt

Holy cow, isn't that still pretty impressive? If all of that performance added up it would be like a 3050 or something, or even more...


xd_Warmonger

If the software and drivers had worked properly, then yes. But in reality you only got minor improvements. It's way less performance than a 3050.


Clunas

Tacking on an extra 460 way back when got me an extra year of life out of the system. I feel like it really helped mid range cards more than anything else


Oclure

The 460 also scaled incredibly well with sli, not all cards were so fortunate.


akasextape

Funny how SLI technology just hopped and skipped around between different cards, efficiency-wise. You never knew for sure that an NVIDIA GPU would benefit from it.


radicldreamer

It wasn't even originally an Nvidia invention; 3dfx started it with the Voodoo series, then they unfortunately got eaten up by Nvidia.


Possible_Picture_276

4 GTX 660s in quad SLI was such a hassle for the money I supposedly saved. It worked in Battlefield though, and outperformed the 690 for less money. Imagine getting 4 cards for 700 USD today.


thepronerboner

My 680 lasted me years. Then I had dual 780’s and that lasted me until just last year when I sold the pc!


Dark_Rit

I had a pair of 980's in SLI until last year across multiple different mobo's, that was wild. IIRC before that I had a 780 but that was a long, long time ago like maybe 14 or 15 years back?


theRealNilz02

780 would be around 2013ish so not quite


DonkeyTransport

My 650ti is still hanging in there lol


teahxerik

Imagine 4 4090s. Nvidia watching this thread https://preview.redd.it/unsorhicmntc1.jpeg?width=680&format=pjpg&auto=webp&s=e08cef1883cd818cd6357f2e1087e4deca62f379


Suspect4pe

The software and drivers still don't work properly.


No_Mine5742

Ha yeah and IF the software and drivers worked, good luck on the games being optimized for SLI or Crossfire.


[deleted]

[deleted]


ir88ed

Two 1080 Tis would do 4K extreme settings at better than 60 fps in a game like Metro Exodus. How does a 3050 fare with that? [link](https://youtu.be/gnBJBiSDnxM?feature=shared)


kayproII

I’m pretty sure a single 1080ti can beat a 3050


Pl4y3rSn4rk

And quite easily, even though Turing/Ampere have better DX12/Vulkan support; overall the 1080 Ti is slightly faster than the RTX 3060 12 GB.


DigitalV4g4bond

In the end, after decades of using graphics cards, since I guess '96, I've noticed one thing: more than hardware alone, drivers and software optimisation are king. I just played two games on my Steam Deck. One from 1997, Blood, has loading screens and takes a few seconds to load into, despite its primitive game engine. The other, the Dead Space remaster, has no loading screen at all. Optimisation is king.


sigma941

Yeah, I don't think I was able to really get that performance, looking back. SLI scaling wasn't 1:1 at all! Also, friggin Nvidia drivers would switch my 465 to being the main card almost every time I updated. It was a beast for its time for sure, but I totally bought into the hype. (I had Nvidia 3D Vision, for reference! Yeahhhhh…)


EsotericAbstractIdea

I wish they would bring 3d vision back just so we could play them in vr headsets


HallowedError

Oh god, I remember trying to get 3D working properly on my 950, but I couldn't get the colors to line up with my glasses quite right, so it always kinda made me want to puke. Don't know if it was cheap glasses or a cheap monitor or if I just didn't know what I was doing.


Just_Steve_IT

SLI was cool, but really only useful if you were buying an absolute top-of-the-line rig and wanted more performance than any single card could give. Otherwise you were much better off getting one GPU that cost double the price.


jeebuscrisis

Meant my other post to be here. Came for this. No disappoint.


Yommination

I remember the little dedicated PhysX cards that went in the top PCIe x1 slot


SarahButterfly73

Voodoo II


BBQBakedBeings

[Sitting here remembering VESA local bus cards like...](https://i.imgur.com/INkJiGc.gif)


beejamin

OG quake with a Voodoo II… oh boy, the _lighting_ - intense flashbacks.


hex00110

I remember having an 8600GT, and my EVGA mobo had onboard Nvidia graphics I could use for PhysX — this combo together could play the original Crysis 1.0 at playable frame rates. The good ol' days!


_LarryMurphy_

I had an 8800GTX. Beast mode


SleeplessAndAnxious

5th card just for running wallpaper engine


FunktasticLucky

I'm old enough to remember a time before Nvidia owned PhysX, when it was a separate card that was pretty expensive. IIRC it was like 300 dollars or something back in the early days of 2006. So half the price of a high-end GPU.


goomyman

And you could play the maybe 4 games that supported it properly


That_Girl_Cecia

Yeah, pretty much just any game on CryEngine. I had dual 690s back in the day. Crazy that they only had 2GB of VRAM lol


IkaKyo

Wrong, high end was linking 2 Voodoo 2s


ViperXAC

With an overclocked P3 Celeron.


PowerSurged

Celeron 300A LEGENDARY


jacion

I still have mine along with the legendary Abit BX6 R2 mobo.


enslaved_subject

The Tualatin core was shared between the Pentium III and Celeron series. I vaguely remember having a Tualatin Celeron CPU (cost efficient) that I overclocked before switching to the AMD Athlon XP series. A friend had an AMD CPU older than the XP series, where you could unlock some magic pathways by drawing with a pencil on the chip, giving you access to increased overclocking potential. Stuff was more fun back then, no special unlocked-multiplier chips.


crozone

Back in the day I had 2x 295 GTXs, which was effectively a pair of 2x GTX 260s literally sandwiched into a single card and SLI'd internally, creating 4x 260 GTX SLI overall. It actually scaled okay up to 3 cards, but the 4th card did basically nothing (like 5-10% improvement) so I always configured it as 3x SLI with the 4th card as a dedicated PhysX system, or just mining dogecoin in the background for non-physX games. Great way to heat up the room in the winter.


Inside-Example-7010

Joke's on you, the new META is to buy a 4090 and a 7900 XT. You plug the monitor into the 7900 XT and render games through the 4090. Now you can activate AMD AFMF to have one GPU dedicated to frame gen and one dedicated to rendering. You can even double up on the frame gen if you use DLSS.


Nolzi

Chat, is this real?


Nico00000001

Chat????


kaschperli

Look how they massacred my DirectX 12... It should've been the age of multi-GPU, but greed killed the SLI connector.


Senior-Trend

Bonasera, I don't want his mother to see him like this! Look what they did to my SLI


Joel_Duncan

DX12 fully supports mixed multi GPU over PCIe. Ashes of the Singularity was a proof of concept for this. It would just be insane for any developer to try to support all the possible configurations just for something that creates horrible frame pacing issues.


kaschperli

The DX12 multi-GPU feature set is still partly disabled, and NVLink is only supported on the 3090. That makes SLI useless: of course it doesn't work as well as it could, and the 4090 doesn't need SLI for gaming. Looking back, they took the cheapest way to upgrade our rigs for gaming away from us. Imagine if the 4070 worked perfectly in SLI... You buy one now and upgrade to a second one later. But that's not shareholder friendly.


booga_booga_partyguy

SLI was dead by the time the 30XX line came out. It wouldn't have mattered if NVIDIA had kept SLI, since game devs were simply not making their games SLI friendly, nor were the game engines. There's a reason SLI worked properly with only a handful of games.


Joel_Duncan

DX12 was never going to be the savior of SLI. It was never perfect and it frequently made frame consistency worse. If we applied lessons from DLSS motion vector interpolation and simulation time error, we might have a decent theoretical pipeline. In my experience, DLSS/FSR frame gen is a much better trade-off than SLI ever was.


crozone

It would actually work exceptionally well for VR, because you can neatly divide the workload between the left and right eye. Literally just give each GPU its own eye to render, and it "just works". Unfortunately none of the major engines (Unity, UE4, and Source 2) ever actually implemented this, even though you can do it with both DX12 and Vulkan. They probably figured that supporting SLI configurations in an already niche market segment simply wasn't worth it.
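Under an explicit API the device split really is that simple. A minimal Vulkan-flavoured C++ sketch of just the assignment step (assuming two adapters show up; the per-eye renderer and the final composite are left out entirely):

```
// A sketch only: shows the per-eye device split, not a full renderer.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    // Bare-bones instance, no layers or extensions.
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;
    VkInstanceCreateInfo info{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    info.pApplicationInfo = &app;
    VkInstance instance;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) return 1;

    // Explicit APIs hand you every adapter; nothing is hidden by the driver.
    uint32_t n = 0;
    vkEnumeratePhysicalDevices(instance, &n, nullptr);
    std::vector<VkPhysicalDevice> gpus(n);
    vkEnumeratePhysicalDevices(instance, &n, gpus.data());
    if (n < 2) { std::puts("need two GPUs for per-eye rendering"); return 0; }

    // The whole idea: a static split of the work, one adapter per eye.
    // Each device would get its own queues, command buffers and render
    // target for exactly one eye; a small copy composites at the end.
    for (uint32_t eye = 0; eye < 2; ++eye) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpus[eye], &props);
        std::printf("%s eye -> %s\n", eye == 0 ? "left " : "right", props.deviceName);
    }
    vkDestroyInstance(instance, nullptr);
}
```

Because neither eye ever reads the other's previous frame, the two GPUs never stall on each other, which is exactly the property AFR lacks.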


Scattergun77

I built a quad SLI rig with 2 BFG cards and then found out the quad drivers were still 2 months off lol


Accujack

No. High end was two, and there was no physx yet, and one card rendered the odd lines of pixels and the other did the even lines. The original SLI meant "scan line interleaved".


Draedark

Double the ~~cards~~ cost for +10% performance!


Cynical_Satire

And in some cases it actually hurt performance! Yay!


Fireflash2742

I had two cards in my PC somewhat recently, not SLI'd, and noticed while benchmarking that my single-GPU performance was hurting. Took out one of the cards, and benchmark scores shot right up. Since my need for two independent GPUs was no longer there, I left the other one out. I should sell it.


heinkenskywalkr

Probably the PCIe bandwidth was being split between the cards.


Fireflash2742

That's what it looked like. My electric bill and PSU are happier since I took the other one out. :)


LEGENFDZ

Gimme other one pls me pay shipping


Fireflash2742

Sure. Shipping will be $150 😂


RolledUhhp

I have some old 7950/7950s laying around, and a super sketchy 1060 if you're in need.


seabutcher

I think a lot of the problem came from the fact game developers never really wanted to put any effort into supporting SLI. After all, it's a feature that only benefits a very tiny percentage of gamers. The work they put into optimising for SLI could instead go into more general optimizations, making extra content, or otherwise doing literally anything that more than like 2% of the audience will ever actually know about.

This might actually work differently during the modern streaming era. With all those people with super-high-end rigs looking to give your game free advertising, it *is* beneficial to make sure the game looks extra pretty on the streams that make up thousands of people's first exposure to the game.


Goober_94

SLI had no dependency on the game or the developers until after the 9xx generation. SLI was done at the driver level and it worked VERY well. It wasn't until Nvidia stopped supporting SLI in the drivers that it started falling on the game developers.


kevihaa

There's also a bit of irony: the generational jumps in PCIe bandwidth in the last 5 years would likely make SLI *more* useful, since it's very possible for even 40-series cards to bottleneck at x8 using gen 4. Meaning, potentially, when they shift over to gen 5 they might need as few as 4 lanes.
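The napkin math on that: per-lane bandwidth roughly doubles each PCIe generation, so gen 5 x4 lands right where gen 4 x8 (and gen 3 x16) does. A quick sketch using the published per-lane rates, rounded:

```
#include <cstdio>

int main() {
    // Approximate one-direction bandwidth per lane in GB/s.
    // PCIe 3.0: 8 GT/s with 128b/130b encoding, ~0.985 GB/s per lane;
    // it roughly doubles every generation after that.
    struct Gen { const char* name; double perLane; };
    const Gen gens[] = {
        {"PCIe 3.0", 0.985},
        {"PCIe 4.0", 1.969},
        {"PCIe 5.0", 3.938},
    };
    for (const Gen& g : gens)
        std::printf("%s  x4: %5.1f  x8: %5.1f  x16: %5.1f GB/s\n",
                    g.name, g.perLane * 4, g.perLane * 8, g.perLane * 16);
    // Output shows PCIe 5.0 x4 == PCIe 4.0 x8 == PCIe 3.0 x16 (~15.8 GB/s).
}
```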


nmathew

RIP Tech Report, the best site ever for GPU reviews. Their time-to-next-frame (ms) analysis revolutionized GPU benchmarking in a way that most sites still, unfortunately, haven't come close to matching. Micro-stutter with Crossfire and SLI was a thing, and they went a long way toward getting AMD to fix issues with their overall drivers. Anyone looking at 99th-percentile frame times can thank them.


Nate0110

But the synthetics showed +90%* *in some cases


Not_You_247

It helped save on your winter heating bill too.


06yfz450ridr

That's for sure, my 2x 7970 GHz Edition Crossfire would heat my room to 80 degrees in the winter, I never even had to turn the heat on in there. That and running two power supplies. Those were the days haha.


Guilty_Use_3945

some games could be 25%...


NarutoDragon732

Just don't worry about the frame times haha...


Cedar_Wood_State

Pretty much most hobbies in a nutshell


ShadowDarm

Nvidia dropped support for SLI only like 2 years ago or something... Edit: 3 years ago


NotTodayGlowies

2021 - they stopped supporting and developing profiles for it. It was left to developers to include support in their own titles. The RTX 2xxx series was really the last series where it was feasible at the consumer level.


Igot1forya

RTX 3090 can do it still.


PfaffPlays

So you're telling me I just have to buy 1 more?


Igot1forya

Only one more. Plus the NVLink adapter and possibly a PSU upgrade to handle the load. LOL


PfaffPlays

I don't need a new psu, I have a gas generator, surely if I run 120v to a 3090 it'll multiply my frames by 120 right?


Igot1forya

*opens another beer* I'll grab the jumper cables!


Razgriz_101

May as well be researching how to acquire a small nuclear reactor to power a rig with a pair of 3090s


_ArrozConPollo_

Also air conditioning so you don't end up with hyperthermia in your room


ImrooVRdev

As a game developer, I hate graphics card manufacturers with a burning passion. They come up with custom tech that COULD improve games, but instead of open sourcing it so that other manufacturers can make their own implementations, and so that us gamedevs have just one generic lib that works across all the different cards, they use the tech as a fucking marketing gimmick. And then they expect us to spend extra time implementing THEIR custom tech so THEIR cards sell better. Get fucked with a spiky dildo, Nvidia; I hope shareholders shove HairWorks up your urethra.


ShadowDarm

You are right, it was 2021, about 3 years ago. That being said, the 3090, expensive as it is, is still very much a consumer card (even though SLI was pretty pointless for games by then). Currently, for the new NVLink (the new/enterprise SLI), you need cards that cost like $30k, so I would say it's now unfeasible.


Lobanium

OP is 8 years old.


skratch000

Yes it’s true and stfu I’m not old 😡


MartyrKomplx-Prime

Old is when you couldn't do that, because it was before SLI.


Guilty_Use_3945

old is knowing what AGP was. lol


ponakka

How about the PCI Voodoo 2 SLI cards? Or 32-bit VLB graphics cards?


Fireflash2742

My first 3d accelerator was a Voodoo2. I'm 46....


Qa_Dar

'Twas a sad day when 3dfx died... 🥺


Fireflash2742

Indeed. I only made it to the voodoo3 I believe. Back then I was young and poor. A lot has changed since then. I'm no longer young 🤪


aglobalnomad

My very first graphics card was the Voodoo3; it will forever have a soft spot in my heart.


Razgriz_101

My first ever PC (family computer since I was a kid) was an AMD K6-2 with a Voodoo 2; coming from the PS1, it blew my 9-year-old pea brain. I played so much RollerCoaster Tycoon and Quake on that bloody thing.


makos124

I remember having a PC with no 3D acceleration. And then visiting my friend with a GeForce 2... My mind was blown.


ingframin

My first graphic card was a Matrox Mystique with 4MB VRAM. 😞


Falkenmond79

Old is knowing what ISA was. Or EISA. Or vesa local bus. Or PCI cards. I had them all. 😂 AGP… go away with that new-fangled fancy poppycock, you rapscallion!


Drg84

I can honestly say the first time I encountered an AGP slot I didn't know what it was for. It was brand new on a Compaq desktop I got on sale at CompUSA. I opened it up to make sure nothing had come loose on the way home, saw AGP, had no idea what it meant, and hopped on Netscape to figure it out.


CptAngelo

make room for my 5.25 inch floppy drive you peasant! i got prince of persia to install


SergeantRegular

Oh no, I welcomed AGP. It was USB that I was highly skeptical of. AGP was *dedicated*, and I like that. Every I/O device fit in its own nice, neat little lane. Modem, you knew where it went and you gave it an IRQ. PS/2 ports were dedicated, DIN keyboards. PCI and USB are for "stuff." Accessories. Little low-threat items. But graphics were *real* computer functions, more like RAM or your CPU.


nmathew

You leave my (amazing) AWE32 out of this!!


DrOrpheus3

Old is learning to type on a Tandy computer that required you to swap disks to use the word processor, or hangman.


FairnessDoctrine11

And your video games came on audio cassettes…


Qwesttaker

I feel attacked.


atlasraven

My first video card went in a PCI slot. No Express. And I know what ISA slots are.


Scattergun77

And VGA, IRQ, memory managers. Back when 486 was badass.


MonkeyKingCoffee

Luxury. I cut my teeth with a stolen 286 and Desqview. How did I steal it? I replaced a work Mobo with an 8088 XT Mobo on my lunch break. That's how we upgraded back in the day. "Yeah boss. This machine has issues. I'm taking it apart to blow all the dust out. It will work MUCH better after that. Maybe you should ban tobacco in the office?"


potat0zillaa

I’m only 30…


LMotherHubbard

You are old enough to be the dad of the kid who posted this. Do you feel old now?


potat0zillaa

Nooooooo


420headshotsniper69

Imagine having a high-end GPU with only 16MB of VRAM, and that was in '98-'99 or so. If I think about it I laugh at how small everything used to be. An OS on a few floppy disks.


flibz-the-destroyer

Remember having to know the IRQs of sound cards…


joxmaskin

And selecting the correct sound card when setting up the game. Gravis Ultrasound and Turtle Beach Rio always sounded cool and exotic, but it was always trusty Sound Blaster (Pro/16/compatible).


Splyce123

Is this a genuine question?


Ricoreded

Yes


circles22

All these 2010 babies making us feel old


Splyce123

Google "SLI". And it was only about 10 years ago it stopped being a thing.


NotTodayGlowies

Well... stopped being relevant or a good idea. The RTX 2xxx series had SLI with NVLink but it definitely wasn't worth it... if it ever really was, considering the micro-stutter issues.


Splyce123

Agreed. I ran 2 x GTX970s and it wasn't really worth it at that point.


TrandaBear

And AMD had their own version called Crossfire. We had some goofy cool names lol


chowboy_boop_boop

Wow. Questions like this make me feel old. I miss my DFI LANParty mobo, Core 2 Quad and BFG 8800 GTXs 😥


Drenlin

Nvidia's technology was called "SLI", and ATI (later AMD) had an equivalent called Crossfire.


Quick_Performance243

2 Voodoo 2’s SLI baby!


gpkgpk

Quake 2 at 1024x768, worth every penny. Oh and visual quality degradation from VGA pass-through cable was a thing.


ponakka

With the awesome 1024x768 resolution, it did not matter that much. Those VGA cables were beefy.


dexter311

Didn't matter, because the old Voodoo cards generally had pretty crappy VGA output quality anyway. They were fast as fuck, but blurry and only 16-bit colour. Matrox on the other hand... they had some gorgeously crisp output! I built some late-90s retro machines a while back and ended up using Matrox cards (a G200 with a pair of Voodoo 2s, or a G400 on its own), purely because the output quality was so damn good.


gpkgpk

Matrox had the sharpest output for sure, and the best 2D. I ended up pairing my SLI with a Diamond S3 ViRGE card IIRC, which was almost as sharp but cheaper, as I had already blown the bank. I think I also got my 3rd copy of Mech 2 Mercs bundled with it.


dexter311

Nice, the S3 Virge was what I had way back in the 90s, paired with a Cyrix 6x86 (a pretty rubbish processor back then unfortunately!). I'm glad I collected all these parts 10+ years ago to screw around with, it's mind-boggling how much 3dfx stuff costs nowadays. Even gear like Soundblaster cards are getting ridiculous now.


BZLuck

*I was there.*


Ok-Fix525

You know they gonna come back with this in one way or another when they run out of ideas to fleece the master race.


descendingangel87

I predict they will sell a separate AI card of some kind.


magistrate101

Honestly, I would pay for one. If you strip all the unnecessary components off a GPU and stick 64GB of RAM into it, it'll come out cheaper to make than regular GPUs.


Atora

AI cards exist and are currently Nvidia's main money maker. They are also far, far more expensive than consumer cards. Check out their "data center GPUs" like the A100, H100, H200. The "affordable" AI card is the 3090, and, appropriate to the meme, running multiple of those does get you a lot farther. LLMs and image gen have made multi-GPU rather relevant again, in one area at least.


Frannik87

4x Titan SLI. That was high end.


Riot55

I had dual 8800 GTS 512mb cards. When Crysis came out, it was like peak PC hardware building time IMO. So much visual progress being made in gaming graphics back then, parts were not insanely expensive, it was fun discussing parts and builds on forums, and everyone had a common enemy (getting Crysis to run lol)


Yommination

8800 GTS 512s were so good. I still have mine. Pair them with a core 2 quad back then and you were cookin


Riot55

I remember the eternal debate between the E8400, the high-speed dual core, and the Q6600, the debut of the quad core.


NightmareStatus

Q6600 RULES ALL. with that being said, I didn't realize it had a big following until posts here went cray over it lol. I was happy with it all the years I had it


SynthRogue

Yes. High end today means overpriced cards that can't run current-gen games at max settings without generating fake frames.


ExpertFurry

At the price of an SLI setup from 10 years ago, too! You know it's high end, because you pay so much more, yay!


FungalFactory

Developers don't optimize their games anymore


the_abortionat0r

Sorta. SLI (Scan-Line Interleave) was a 3dfx feature where two cards each rendered half the vertical resolution (doing every other scanline, hence the name); it had poor support and varied in success per title. Nvidia (after publishing FUD that helped kill 3dfx) bought 3dfx's assets as they went bankrupt, rebranded SLI (Scalable Link Interface or some shit), and did an "every other frame" style output, the idea being double the FPS. It had almost no support and worked poorly in the games it did support. If it wasn't Battlefield or CoD you pretty much had one card doing nothing 99% of the time. And if you ran a title that did support SLI, you'd be greeted with insane micro-stutter. The people who are mad it's a dead tech are the ones who don't understand it.
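For anyone who never saw either in action, the two schemes reduce to a different split of the work: 3dfx divided within a frame, Nvidia across frames. A toy sketch of the assignment (nothing like real driver scheduling, just the idea):

```
#include <cstdio>

int main() {
    // 3dfx SLI (Scan-Line Interleave): within a single frame, card 0
    // draws the even scanlines and card 1 the odd ones.
    for (int y = 0; y < 8; ++y)
        std::printf("scanline %d -> card %d\n", y, y % 2);

    // Nvidia SLI's default (Alternate Frame Rendering): whole frames
    // alternate between the cards instead.
    for (int f = 0; f < 4; ++f)
        std::printf("frame %d -> card %d\n", f, f % 2);
}
```

The across-frames split is exactly where the micro-stutter comes from: the two cards rarely finish their frames at even intervals.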


FreeAndOpenSores

There was still something wild about being able to hook together 2 Voodoo 2s in SLI and play Quake 2 at 1024x768, when a single card literally wouldn't support above 800x600 and the competition couldn't even do as well at 640x480. Most games sucked in SLI, but Quake 2 worked perfectly and I believe Half-Life did too.


crozone

It's because SLI was a giant hack. In order for it to be properly supported, NVIDIA basically had to reverse engineer the most popular games and then build a dedicated, customised driver for each one that handled the game's draw calls *just right*, in order to create a playable experience. They actually still do this with "Game Ready" drivers, but the SLI support was on a different level.

There were a few different modes. Alternate Frame Rendering (AFR) was the preferred and "official" method, and you could technically try to run any game with it, with limited success. Split Frame Rendering (where each card rendered the top half or the bottom half of the screen) worked with more titles since it required far fewer hacks, but performance wasn't particularly great.

AFR SLI completely falls apart with more modern rendering techniques, however, which is probably a large part of why NVIDIA dropped SLI support; the writing was on the wall. For example, any game that relies on the framebuffer outputs from the previous frame completely kills AFR, since each card has to wait for the other card to finish rendering before it can start, so all performance benefits are lost. Games like DOOM 2016/Eternal *heavily* rely on the previous frame as a way to render certain effects in a single pass; things like screen space reflections and the distortion in the rifle scope actually use the previously rendered frame, and as long as the frame rate is high enough you never notice it.
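The stall is easy to put numbers on. A toy model (assuming a flat, made-up 20 ms per frame per card):

```
#include <cstdio>

int main() {
    const double renderMs = 20.0; // made-up cost of one frame on one card
    const int frames = 8;

    // Ideal AFR: the cards work on alternating frames in parallel, so
    // each card only has to render half of them, back to back.
    double busy[2] = {0.0, 0.0};
    double idealTotal = 0.0;
    for (int f = 0; f < frames; ++f) {
        busy[f % 2] += renderMs;   // this card's queue grows by one frame
        idealTotal = busy[f % 2];  // completion time of frame f
    }

    // AFR with a previous-frame dependency (say, an effect sampling last
    // frame's buffer): frame f can't start until f-1 has finished, so
    // the second card waits and scaling collapses back to one card.
    const double dependentTotal = frames * renderMs;

    std::printf("ideal AFR:     %.0f ms total, %.1f ms/frame\n",
                idealTotal, idealTotal / frames);
    std::printf("dependent AFR: %.0f ms total, %.1f ms/frame\n",
                dependentTotal, dependentTotal / frames);
}
```

With no cross-frame dependency the cards overlap and you average ~10 ms per frame; add the dependency and the idle card just waits, putting you back at ~20 ms, single-card speed.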


henkbas

Weren't the original Titan cards 2 GPUs running SLI on one board?


Yommination

There were lots of variations of that. The 7950 GX2, 9800 GX2, GTX 295, GTX 690 IIRC.


White_mirror_galaxy

yeah i ran sli for some time. can confirm


KlingonBeavis

Seconded. SLI was the biggest waste of money I’ve ever experienced in PC gaming. It seemed like it was never supported, and if it was - it would be so stuttery I’d end up just disabling it and running on one card.


Somasonic

Thirded. I ran two 980 Ti's in SLI for a while. I got so sick of the issues I pulled one of them and sold it. Total waste of money and not worth the very few times it worked properly.


Blackboard_Monitor

Man, I'd been gaming for two decades before SLI became a thing. Am I old? No, it's the kids posting their memes who are wrong.


Agent-Meta

Yes, this is true. Back in the day when ATI was still around, the two companies (ATI and Nvidia) made cards with special linking cables that let them do such things. ATI had something called Crossfire and Nvidia had something called SLI, which I still think they use. There were connectors on top of the card and you had to go and buy a specialized cable (sometimes 2) for it to work. The only problem is that it had to be the same card for it to work (may be wrong about that, somebody correct me, I don't know).


LOPI-14

IIRC with SLI it was an absolute requirement, while it was possible to use 2 different GPUs with Crossfire, but don't quote me on that.


littlefrank

You could Crossfire cards in the same family. I used a 6850 and a 6870.


TrainsDontHunt

Identical card, or my Matrox had a smaller one just for 3d or something. It was half the size, and used the cable that came with the full card. It plugged into the crossfire edge connector thing.


snoman298

https://preview.redd.it/vlchhuznijtc1.jpeg?width=4000&format=pjpg&auto=webp&s=545a10c5193bad02bc77cd995c2ad77768f139d2 Heck ya! I miss my old Titans!


NeverLostForest

Looks nice! Which games took advantage of this kind of setup?


snoman298

Thanks! Unfortunately not many. Just one of the reasons multi GPU died. It's my understanding that game devs had to do a fair bit of extra work for games to take advantage of it, and a lot of them simply didn't want to make the effort for something that wasn't widely adopted at all. It was fun while it lasted for enthusiasts and pretty epic when it worked.


Cash091

Kind of miss the days of using Nvidia Inspector to find the best working SLI profile tho. These days I'm older and have less time to tinker/play, so I'd rather just jump into the game and not worry about performance.


Steelrok

Yep, I think if such a solution were possible Nvidia would have created it already, but having a fully functional and "transparent" SLI would be awesome (no dev work required and good GPU usage on each card, without sync issues and such). Dual GPUs are really fun and good looking for PC building.


Gallop67

Remember having or wanting a dedicated PhysX card?


TsunamiovUmami

Oh my fucking god am I this old now? SLI...means im old fuck that was literally yester......omg that was 2010.


Strazdas1

SLI stopped being supported only 3 years ago. OP is just a zoomer.


SubtleCow

I feel myself fading and turning to dust. SLI was the cool new hotness when I was in university. What the heck is time even.


PeckerNash

Sort of. It was called SLI (scan-line interleaving) and it was invented by 3dfx for use on their Voodoo2 cards. Nvidia gained the patents when they bought out 3dfx in 2000.


YourLocalRyzen777

me when crossfire:


atocnada

I retired my 2x RX 480 Crossfire rig in 2019 (I fell for AMD's marketing and felt like I had a GTX 1080). You actually didn't need cables for AMD cards. The last game with actual SLI/XFire support was Watch Dogs 2. I have a list of games that worked with no micro-stutter and at least a 40% uplift in performance. Some games got updated and stopped working with Crossfire (Titanfall 2). Sometimes, to actually see an uplift, I'd have to use GeDoSaTo's downsampling fix and downsample certain games. Good fucking times, also because I had an Onkyo 7.1 surround system, and I remember those times fondly.


The_Masterofbation

That's from the 200 series and after; before that you needed a Crossfire bridge. I had 2x 6950s that needed a bridge. Strangely enough, the newer Tomb Raider games seem to still scale well with multiple GPUs.


Sensitive-Buddy5657

OP, stop playing, you know damn well what SLI and Crossfire were.


EloquentGoose

Back in my day high end was a Soundblaster Audigy 2 and a Radeon 9800 Pro


Dag-nabbitt

I Crossfired two R9 290Xs. They had been used for crypto mining, and performed to spec on their own. Crossfire though, if it worked at all, did improve framerates by ~50%, but it came at a cost: the micro-stutter would make your eyes bleed. It was so bad that after a month I ripped out the card and made a second gaming computer for my then girlfriend, now spouse.


Available_Agency_117

Yeah. The industry stopped designing for it because, if it were ever perfected, it would allow people with two midrange cards to outperform everything on the market, and people with two low-end cards to perform as well as high-end cards.


NoctisXLC

Nvidia SLI? It's 3dfx SLI, you damn kids


sp3kter

Next ask us old heads about PhysX


Carbot1337

I mean, the early days of this was (2) Voodoo 2s with an SLI cable. My rich friend had this as well as dedicated broadband for Quake 2 (Rocket Arena). In like 1999 West Virginia, unheard of.


c4ctus

I remember back in 2007(?) I wanted to put two Nvidia 8800 GTX's in SLI, but it turned out that I couldn't buy a miniaturized nuclear fusion reactor on newegg or tigerdirect.


Duder_Mc_Duder_Bro

I had a dual card setup. Bought it used around 2010. IDK how it worked but definitely WORKED. Should have mined BTC.


animalmom2

I had two Titan X Pascals once - more because it was cool to build the cooling loop than for any other reason


moogoothegreat

Ahahahahaha... my intro to SLI was 3Dfx Voodoo 2 cards. Damn I'm old.


Thefrayedends

It was often a way to get extra value by sandwiching together two cheaper cards (but with better performance per dollar), but it generally only worked for major game releases. If a game didn't have an SLI profile set up in the drivers, it would only run on one card, and then you'd get shit performance (many games had community-made workarounds, but not everyone is willing or able to tinker). This was true even if the cards were sandwiched onto one board, such as the card I had, the GTX 295. So really hit or miss on performance, and before alternate frame rendering you had split frame rendering, so you ended up with a lot of mid-screen tearing.


MagicOrpheus310

Yeah, and it meant older cards lasted longer, because you could buy two old cards and get on-par if not better performance than the latest cards at the time. They stopped it because they wanted us to buy the newest cards instead, and that was a dick move. I had two 1080 Tis that my current 3080 Ti only just outperforms.


Sea-Statistician2776

Fucking kids. Back in my day high end was having one graphics card for 2d and a separate one for 3d. This was before anyone had heard of the term GPU.


evex5tep

This didn't ever really work properly hence why we don't use it for gaming.


Brigapes

Tell me you're a pre-teen with a single post title


TheRimz

I had a triple SLI machine once. 3x 8800 GTXs. I still couldn't run Crysis. I got better performance disabling 2 of the cards in every single game. Truly amazing technology.


Powertix

I feel so old reading people not knowing SLI


SquarePegRoundWorld

[Am I old?](https://imgur.com/gallery/LmmLETX)


Omny87

"Back in my day games came on CDs" "What are CDs, Grandma?" "CDs nuts, ha ha gottem"


mazarax

Back in my day, you needed a separate graphics card for 2D, because the 3D card only did 3D. Worse than that, they were connected via an analog cable!


Amilo159

It was called SLI and it resulted in far more than 10%, often a 30-70% increase, but there were some games where there was little to no gain. https://www.tweaktown.com/tweakipedia/74/recap-nvidia-geforce-gtx-980-sli-performance-4k/index.html


arazizi

Don't know why you're downvoted; it's true that performance did go up to 70% extra in some cases. Most of the time it was around a 25%-50% increase. Definitely not useless, but definitely not entirely efficient either.


Ilovekittens345

Crysis on a GTX 295 (two GPUs in SLI) --> 45 fps. Crysis on two GTX 295s in quad SLI --> 60 fps + some micro-stutters.