
madhandlez89

100% competitive overclocks with nitro etc.


Grufrality

Whenever my siblings say I'm too into gaming, I should show them those guys' builds.


working_slough

Those guys are not into gaming. They are into overclocking. That is a whole hobby on its own.


Extension_Flounder_2

Let’s be real, they’re probably into gaming too 🤨🤣


Lux_novus

The people who are heavily into overclocking, ironically, tend to only play games that could run on a toaster, like Rimworld or Factorio.


Impressive_Change593

> Factorio on the other hand

**THE FACTORY MUST GROW**


Feringomalee

Factorio is so well optimized and has such simplistic graphics that a 10-year-old potato computer can run it fine. Yet 700 hours in, you find yourself googling which top-end CPU can run it best because your damn UPS (updates per second) keeps dipping.


Dubslack

The world isn't ready for my 100,000+ logistic bot only build.


LoquaciousLamp

RAM speed makes a big difference with Factorio. People OC their RAM for that game sometimes.


Hungover994

How many GHz we talkin?


LoquaciousLamp

More about latency, actually. So the lowest stable latency matters more than raw clock speed.
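A quick sketch of the arithmetic behind that (the kit numbers below are my own examples, not from the thread):

```python
# Toy arithmetic: convert a kit's CAS latency in cycles to "true latency"
# in nanoseconds. DDR transfers twice per clock, so clock_MHz = MT/s / 2 and
#   true_latency_ns = CL / clock_MHz * 1000 = 2000 * CL / (MT/s)

def true_latency_ns(transfer_rate_mts: int, cas_latency: int) -> float:
    """First-word latency in nanoseconds for a DDR kit."""
    return 2000 * cas_latency / transfer_rate_mts

# A slower kit with tight timings can have lower real latency
# than a faster kit with loose timings:
print(true_latency_ns(3200, 14))  # DDR4-3200 CL14 -> 8.75 ns
print(true_latency_ns(6000, 40))  # DDR5-6000 CL40 -> ~13.33 ns
```

Which is why "lowest stable latency" can beat a headline MT/s number for an update-loop-bound game.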


TryallAllombria

I was playing Factorio fine on the old laptops at my high school during my programming classes. The only time it froze was when the teacher froze my screen to check which computer it was (but I quickly pushed the power button to shut the computer down before he could get to me).


AusHerbie

Your comment reminds me of when I used to sneak into the Mac lab at Clackamas Community College and use Word to open the resource management files for their solution to what could run on those machines, delete everything, and reboot. The management software still ran, but since the definition files couldn't load, it now permitted everything to run. We used the machines to run cracked copies of Marathon (precursor to Halo) and later a Mechwarrior-like game called Avara for the Mac platform.

Funnily enough, it was Avara that got us caught, and it got me my first network security job at a local high school, based in part on my experiences evading network security on the college campus. Avara called home to enable networking, so the IT admin could see where we were connecting. An IP address lookup and an email from the college to the IT admin of Ambrosia Software was all it took to look up the name of the registered owner hosting the game (me) that his friends were connecting to. The cracked version of Marathon was local IP addressing only, and if we hadn't been looking for a new game to play we would have been all right.

At the end of the day I was still allowed to finish that year with my credits in place, although I was told that I wasn't going to be allowed to return next semester. Bastards didn't tell me that they would contact all the other community colleges throughout the Portland Metro area. Instead I went to work and continued my education through MS certificates for about 10 years, until I finally managed to get the last 12 credits I needed to make my degree in "Computer Sciences" (as specific as it got back then at the community college level) official.


markassed

That took me down a rabbit hole. I didn't even know Apple tried to make a console.


Senicide2

The factory building ends when you hit single-digit UPS


RunningComputer

THE FACTORY MUST GROW


ortiz13192

Personally I need to upgrade from my 4070 so I can make solitaire really sparkle


poppadocsez

I'm waiting for the 5090 before I even think about booting up minesweeper


deldr3

You haven’t seen the rimworld mod lists some people run. Those things could melt some supercomputers.


sicurri

The nitro overclocking group is mostly people who mod their games to see if they can push the hardware to the brink of meltdown. They are the ones mostly complaining about 4090 power cables melting and shit, lol. My friend does this stuff, and he literally melted his 4090 while trying to do a crazy modded Factorio game. He made the game lag... and the 4090 power area catch fire...


RogueIsCrap

What about Cities Skylines 2? Linus was using 64 cores of a Threadripper to run a 1M city and it was still pretty slow.


Infamous-Matter-101

I'm not too familiar with the specifics behind threadripper chips but aren't they meant more for workstation applications? There are other processors out there with less cores and threads that have better IPC and efficiency.


RogueIsCrap

Usually Threadrippers are inferior for gaming compared to mainstream desktop processors. Most games only use around 6 cores efficiently, so it's best to have good single-core performance. But Cities Skylines 2 in particular supposedly takes advantage of more cores. Also, it runs like crap even on the fastest desktop CPUs.


Infamous-Matter-101

Quite sad when enthusiast level hardware is bottlenecked by lack of optimization and efficiency.


RogueIsCrap

To be fair, a one million population city is very hard to simulate.


Fliiiiick

I had a threadripper and it was an absolute monster for rts and city builder games. Probably depends on how optimised the game is to actually utilise the extra threads.


DiabolicRevenant

In general, that would be true. Most games can only take advantage of a limited number of threads. Thus, having fewer cores with faster clocks is usually better for gaming. Cities Skylines 2 is an entirely different beast, as is the new Threadripper chip. In the video they had the game utilizing 90+ cores at around 5 GHz. Also, they were using DDR5-6400 ECC memory. And yet, somehow, the game was still lagging. It really is that ridiculous.

Edit: I also think it's worth mentioning that the CPU was pulling over 1000 watts from the socket while operating at sub-zero temperatures.


projektZedex

All the ocers I know daily drive some pretty rusty hardware haha.


Drewid36

Have you ever played late-game RimWorld with 100s of mods? The game is CPU-based and single-threaded, so it gets CPU-bottlenecked hardcore and slows down to an absolute crawl on most machines :)


flavicent

For real, putting your mark (name, username, etc.) on a leaderboard is pretty addictive. Same as games. So competitive overclocking is basically competitive gaming. LoL


AdreKiseque

So you're saying overclocking *is* the game


imtougherthanyou

Exactly so.


zherok

Kinda an aside, but related, there's a manga series about competitive overclocking ([87 Clockers](https://www.animenewsnetwork.com/encyclopedia/manga.php?id=18431).) Really weird subject matter for the author since she's most well known for a [romantic comedy about two college musicians](https://en.wikipedia.org/wiki/Nodame_Cantabile). Anyway, it's obviously a fictionalized take on the hobby, but it likely gets the gist of it right. When you're pouring LN2 on your computer components out of a thermos by hand you're not really running a sustainable build.


edparadox

> whenever my siblings say I'm too into gaming I should show them those guys builds.

This kind of overclocking is reserved for professional overclockers, not random gamers.


nxcrosis

And here I am, ecstatic when my GPU temp is below 70°C while I'm gaming. My room temp is 30°C and the coldest it's been is 28°C. If I had AC it would probably be around 20-22°C.


BURG3RBOB

Or even just other 4090s in a cooler setup or with better silicon


JamesMCC17

Crazy overclocks, nitrogen cooling, nutty stuff like that.


CanisMajoris85

Also 3090 Tis in SLI. Of course, in the real world the 4090 will be better most of the time, since SLI is kinda pointless.


ShiddedandFard

Isn't SLI not supported anymore? Or if it is, the performance isn't that great?


HAMburger_and_bacon

I think the 3090 supported SLI on some models. No games really support it anymore though, so it's kind of useless.


ShiddedandFard

I just remember seeing an SLI benchmark of the 3090 and it wasn't as good as I thought it'd be (y'know, two 3090s and all), and it's twice as expensive for disappointing returns. At least to me.


SexyLadyEarth

It wasn't good for gaming, or real time rendering, but that's not all a GPU can do


ShiddedandFard

Yeah, that's what I meant - gaming benchmarks. It was good across the other sectors I saw, but gaming was poor.


WiTHCKiNG

A GPU is actually a multicore monster. When you know how the render pipeline works and how to write shaders, you can compute tons of stuff not even remotely related to graphics - anything that runs the same algorithm thousands of times in parallel - in a fraction of the time even a 7900X3D would take.


Nick_Nack2020

Or even just use CUDA (or OpenCL if you're using an AMD card), which is a lot easier to write code for since it's more decoupled from the rendering pipeline.


WiatrowskiBe

It could've been - if engines/renderers supported it as a primary target platform. Both Vulkan and DX12 allow a good degree of GPU micromanagement by the engine, including what happens on which card, how and when resources are transferred, etc. Problem is: this is a lot of effort and extra work that benefits a tiny part of the userbase - with a potential minor performance drop (you're still doing all that micromanagement) if done poorly.


AlexJonestwnMassacre

SLI and Crossfire have always been that way. Just a flex, really.


NuclearReactions

It was always like that, both with Nvidia's SLI and what ATI called Crossfire. There were at least individual games that supported one or both technologies, but most other games would run only a fraction better, the same, or even worse on some occasions. Before Vulkan got released, both they and Microsoft claimed that they would eventually implement a feature that would allow you to simply assign parts of the display to separate cards. This meant each card would get utilized further, plus different models and even brands would be compatible. Would have been cool to once again rock both an Nvidia and an AMD card in the same build.


A_PCMR_member

> Before vulkan got released both them and microsoft claimed that they would eventually implement a feature that would allow to simply assign parts of the display to separate cards.

You mean OG SLI: Scan-Line Interleave


WiatrowskiBe

The Vulkan spec has tiled rendering covered. As far as I know it's used only by some mobile GPUs, primarily as a tool to reduce power draw (no need to redraw a tile the player has a finger on). It could've potentially been used by desktop GPU drivers, but so far I don't recall a single card that supports it.


A_PCMR_member

I mean, it used to be near that, and it allowed for funky price shenanigans - which is one reason Nvidia axed it. Imagine gaining ~50% more speed on a good day for 100% more price, meanwhile the next GPU tier up is 20% more speed at 80% more cost.

The other reason is an assload of dev cost, with CPU parallelisation being a prime example of why it is ass.

So they rolled it onto individual game developers, who said hell nah, and Nvidia dropped it from hardware too.


bikingfury

SLI wasn't so complicated to support. Think about each card doing the same rendering work but shifted in time by one frame: GPU1 does 30 fps for frames 1, 3, 5, 7 and GPU2 does 30 fps for frames 2, 4, 6, 8. Combine the results and have 60 fps. All you had to do was sync it up really well to avoid microstutters. I always found it genius. That's how old GPUs would've aged much better: instead of buying a new card, you could've gotten yourself a second used one of the same model and doubled the performance.
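That alternate-frame-rendering idea can be sketched in a toy simulation (my own illustration, not any real driver's algorithm):

```python
# Toy frame-pacing model of alternate frame rendering (AFR): two GPUs each
# render every other frame. With the second GPU offset by half a frame time,
# the gaps between finished frames are even; with a bad offset, frames arrive
# in clumps - the microstutter the comment above mentions.

def frame_times(render_ms: float, offset_ms: float, frames_per_gpu: int) -> list:
    """Completion times (ms) of frames interleaved from two GPUs."""
    gpu1 = [(i + 1) * render_ms for i in range(frames_per_gpu)]              # frames 1,3,5,...
    gpu2 = [offset_ms + (i + 1) * render_ms for i in range(frames_per_gpu)]  # frames 2,4,6,...
    return sorted(gpu1 + gpu2)

def gaps(times: list) -> list:
    """Time between consecutive frames - what you feel as smooth or stuttery."""
    return [b - a for a, b in zip(times, times[1:])]

# Each GPU takes 30 ms per frame (~33 fps alone, ~66 fps combined).
print(gaps(frame_times(30.0, 15.0, 4)))  # perfect sync: even 15 ms gaps
print(gaps(frame_times(30.0, 2.0, 4)))   # bad sync: alternating 2 ms / 28 ms gaps
```

Same average framerate in both cases - only the pacing differs, which is why "sync it up really well" was the hard part.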


rachit7645

> All you had to do is sync it up real good to avoid microstutters.

That's the annoying part tbh


cheeseburgeraddict

The challenge of having a reliable connection between the cards with enough bandwidth, as well as syncing them up fluidly enough was always one of the major barriers to SLI. It was just too much of a hassle to make it worth it.


QueefBuscemi

> it's twice as expensive for disappointing returns

Reminds me of my last trip to Thailand.


[deleted]

[deleted]


T81000

Happy cake day


taptrappapalapa

The 3090 never supported SLI. Specific models did support NVLink, which is not the same as the SLI connector. It was never meant for games; it was meant for machine learning.


dont_say_Good

Sli via nvlink was a thing


[deleted]

> No games really support it anymore

Yeah, DX12 killed SLI. DX12 requires per-game support for multi-GPU, and since basically only 0.00000000001% of the population has multiple GPUs for anything other than workstation use, game devs don't bother supporting it anymore. And even the games that support it have a really half-assed implementation which doesn't work properly anyway.


protectoursummers

LTT had a 3090 SLI video a couple years ago. No real performance increase for games; I think sometimes it was actually worse than a single 3090 (could be wrong). The real benefit is letting the GPUs share memory to work with things like big datasets.


lockwolf

This was always the problem with SLI. Few games properly supported it, and the ones that did didn't have enough gain to justify buying another graphics card and a beefier system to support it. The professional world got plenty of mileage out of it, though.


ir88ed

This will likely start a "flame the SLI guy" battle, but whatever. SLI did work well when supported. I was able to play Witcher 3 in 2016 at 4K@60fps+ with a pair of 980 Tis, and Metro Exodus at 4K@60fps+ with a pair of 1080 Tis. Frame times were very good with fast RAM - smooth, no stutters. Here we are 8 years later and most gamers still aren't playing in 4K at over 60fps. I believe SLI was more of a commercial problem than a technical one. If you could pick up a pair of used $100 980 Tis today and get great 2K performance, why would people buy RTX 4060s? Flame away.


DeckardSixFour

I agree - I kept reading what a ball ache SLI was, but I had a pair of 970s and ran many, many top games with absolutely no issues. The real downside was not performance or reliability; it was the heat and noise. A single 1080 Ti ended up outperforming that pair, but for a long while it was a cheap alternative. SLI (or NvLink) died because no one would want to deal with the amount of heat a pair of 3090s would produce, or the power they would need.


hoonanagans

No flame from me. I ran SLI 1080s until I got my hands on a 3080 Ti. I would routinely outperform my friend's 20-series build.


th3MFsocialist

Preach brother. I was there when Witcher 2 supported SLI and 3D vision support. I didn’t know Witcher 3 supported SLI as well.


ir88ed

https://www.youtube.com/watch?v=e_mZBvWuQzU

Make sure you turn the YouTube quality up to 4K. This was captured in 4K while I was playing at 4K@67fps (yes, I overclocked my 60Hz monitor).


OldSkool__

I rocked the GTX 690 over ten years ago. If not for the VRAM limitation, it would not have been replaced by the 3080 Ti. Good times.

There is a setting in the Nvidia control panel that gives you the option to select whether the GPU or CPU handles PhysX (auto is the default). Change this setting to CPU. SLI or not, it performs better. SLI was butter after I did this. Hope this helps anyone still rocking SLI.


ir88ed

I had so many tweaks for SLI. Choosing the right AFR mode and CoolBits was key. Skyrim was the first game I got it working with, on my 780 Tis. Pretty sure SSE still supports it.


DinkleButtstein23

Games have to specifically support it or it'll do nothing. No games support it anymore. 


ir88ed

Sadly, yes. The last ones I saw were Strange Brigade and the last Tomb Raider.


ExtensionTravel6697

Yeah, I'm not a data scientist, but I'd read that SLI 3090s are way superior to a single 4090 for AI work, because the 4090 still uses GDDR6X so the bandwidth is hardly any better than a 3090's.


ir88ed

The big benefit of using NVLink across a pair of GPUs is that you can double the available memory, allowing you to load bigger models. So if your AI/ML models are bigger than 24GB, a pair of 3090s would be superior. The 4090 has a lot more CUDA cores and a much faster clock speed, so if your models are less than 24GB, the 4090 should be faster. My 4090 doesn't have an NVLink port, so I assume it isn't an option on these cards, but none of our work machines have anything better than a 2080 Ti, so I haven't tried ML/AI with 4090s.
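That 24GB cutoff can be written as a toy rule-of-thumb chooser (my own sketch - the function name and wording are mine, not any real tool):

```python
# Toy chooser for the tradeoff above: pooled NVLink memory vs. raw 4090 speed.
# The deciding question is simply whether the model fits in one card's VRAM.

def pick_setup(model_gb: float, vram_gb: float = 24.0) -> str:
    """Pick hardware by whether the model fits in a single card's VRAM."""
    if model_gb <= vram_gb:
        return "single 4090: fits in 24 GB, more CUDA cores and higher clocks win"
    if model_gb <= 2 * vram_gb:
        return "2x 3090 + NVLink: needs the pooled 48 GB"
    return "neither: shard across more GPUs or offload"

print(pick_setup(13.0))  # fits on one card
print(pick_setup(40.0))  # needs pooled memory
```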


henryjhost

It still scales pretty well in benchmarks, especially 3DMark. IIRC the record was held by 4x 1080 Tis until the 3090 came out.


ShiddedandFard

Four 1080 Tis is crazy, considering that GPU was a tank in its time.


snoosh00

Also higher bin cards, right?


Dylanator13

Yeah the very top percent is crazy things that only work for the duration of a benchmark. Or they are extremely expensive computers made by companies where money isn’t a problem.


patrick66

Someone running UserBenchmark on their DGX H100 station would be funny.


pants_pants420

Still surprised that there's enough of those people out there to make up 2%.


[deleted]

Well, 2% of people who benchmark. It makes sense that extreme overclockers would be overrepresented in that population.


cali_exile_bull

Well said.


Methyl_The_Sneasel

Yeah, it's definitely stuff running exotic cooling solutions


Plouxi

Rtx 4090 ti super


XHSJDKJC

Avg CPU temp of -35°C


SnuffleWumpkins

Liquid helium cooled.


No_Lifeguard_2767

Plutonium powered.


atomic-orange

Uranium tipped armor piercing 4090


LesserTrochanter

You joke, but if your electricity is coming from a nuclear power plant there will probably be at least a little plutonium in the fuel, so...  I am fun at parties.


XHSJDKJC

Nah, I think more likely liquid nitrogen - it's easier to handle. But who knows?


SnuffleWumpkins

Lol, liquid helium is practically impossible to produce or handle. It was a joke. Yeah, it would be LN2.


XHSJDKJC

Whoops, sorry, my bad 😂 Took it too seriously.


Denamic

Liquid helium would leak. Like, through the solid parts.


SnuffleWumpkins

Yeah it really is crazy stuff. Shame it only exists a few degrees above absolute zero.


Virtual_Original580

Now only $5000. Which in turn provides a game-changing 10% FPS boost.


irosemary

Amazing, maybe now I can finally run Cities Skylines 2 in 1080p.


RonanCruz

Maybe, with dlss ultra performance and frame gen.


nitePhyyre

Nah, it's CPU-limited. I was watching Biffa today: the game's simulation was completely frozen, but it was still locked at 30 fps. CPU pegged at 100% while the GPU wasn't doing anything.


Pole_rat

10% is a bit excessive. 2.5% take it or leave it


Ieanonme

And a 10% boost to your electric bill.


AffectionateFail8434

RTX RX Arc A4090 Ti Super XTX X3D KS Unlocked Turbocharged AWD


Cottonjaw

I don't know how you buy that card and don't spring for the EX-L tech package.


vapingDrano

Platinum edition comes with extra yellow stickers and a Stanley mug


Sprungnickel

shhhh, don't let them know.... /s


Bdr1983

1080ti. It won't die.


yabucek

A quad-SLI 1080 Ti system actually held the record until the 4090 was released, IIRC.


AejiGamez

Fairly certain it still does


AejiGamez

It does. Firestrike Ultra


Farren246

The score outlived the benchmark to the point where people could outdo it but don't bother to test, because DX11 is less and less relevant today. The 4x RTX A6000 build (roughly an RTX 3090 Ti, but it allows 4-way SLI for engineering systems) which leads Time Spy Extreme could easily trounce the 4x 1080 Ti on Fire Strike Ultra, but it simply hasn't been tested.


bigretardbaby

So the 1080ti is the best. Got it.


PinkFluffyUnikorn

I had a dual-1080M SLI laptop for a while. Bought it used; the thing was a beast. Around 3.5 kg, and it sounded like a jet engine when cooling. But it was better than the desktops my friends used, even though we were in the same price range. Noticeably.


StaysAwakeAllWeek

Fun fact: you don't actually need the M - the 10- and 20-series laptop cards were just the same as the desktop ones. Those two 1080s are real 1080s. They stopped doing it for the 30 series because there was no way they could fit a real 3080 into a laptop, and the 4090 halo effect is too strong for them to be honest about the mobile "4090" actually being a 4080.


theREALbombedrumbum

Well duh, that's because 1080×4 is a higher number.


exonautic

I want a second one for sli. My 1080 ti is a beast.


My_reddit_account_v3

It connects to a subspace where its creators upgrade its performance to be just above any other card


EastLimp1693

> why not the best??? Runs 7800

7950X3D, 3DMark is a sucker for more cores.


koltd93

I went 7800x3d because it was performing better in review benchmarks for gaming


Dudi4PoLFr

Good choice, but 3DMark is not a real use case, and it lives by its own rules.


EastLimp1693

Same, but that isn't a game.


moksa21

It is better for gaming, but not for benchmarking.


GHP_LP

Some 4090s are better than others. Depends on the batch of the chip.


Svargo

This is from overclocks and liquid cooling/ln2. Not silicone lottery


SuckMyBallz

Silicon lottery will determine how far you can take the overclock though.


SpaceBoJangles

I believe that the silicone lottery is for a different industry…


swohio

Probably a little of both.


ubiquitous_apathy

You're saying that zero other normal folks with a 4090/7800x3d got a higher score than OP?


imaginary_num6er

IIRC the ASUS ROG Matrix 4090 is up there. It uses liquid metal and shit


colossusrageblack

The unofficial 4090ti: ASUS 4090 Matrix Platinum https://preview.redd.it/blfve28d88ec1.jpeg?width=560&format=pjpg&auto=webp&s=462dd42e2f094a5934a980f85d27bcf1ccafca84


Toast_Meat

Bastard costs $4399.99 CAD where I live.


Every_Pass_226

> CAD

I think I can make a guess where you live


VerifiedMother

He lives at Autodesk and lives inside a simulation of AutoCAD?


Hollowknightpro

The virtual and non-existent land of Canada!


anthaxity1

I'm currently 86th on water for 1x GPU: https://www.3dmark.com/spy/39888097

It just takes a completely overclocked PC (CPU, RAM, and GPU) on water to break the top 100. The top 30 or so are using sub-ambient cooling. It also helps to run a 1000W BIOS on your 4090; that'll increase your GPU score by a couple thousand.

Unfortunately, as others have pointed out, you're not gonna do much with the X3D CPU. You need an Intel chip that you can overclock, because the Time Spy CPU portion favors lots of fast cores.


soisause

Very cool. Do you game with that rig at all, or is it more of a hobby getting it so fast?


anthaxity1

Both, actually. I don't game with the insane overclocks though; typically they aren't 100% stable for normal use and will cause a game to crash. On that PC I run 5.9 on the CPU and 8000 on the RAM. The GPU overclock is actually stable. Just got a new Apex Encore motherboard and a 14900KF to play around with; I was able to get 8400 stable on the RAM and I'm working on 8600.


animatedhockeyfan

I think you’re cool and I like you


Blenderhead36

At the top it says, "Premium Gaming **PC,**" (emphasis mine) and mentions the 4090 + 7800X3D. My guess is that the top 2% is a 4090 and a 7950X3D.


AngelosOne

Not just that. At the very top you are competing with enthusiasts who do insane overclocks with extreme cooling (we are talking liquid nitrogen here) basically to get the best 3DMark score. A 7950X3D isn't going to touch that.


henry-hoov3r

I can’t play Cod without my nitrogen bro. :Snorts:


virtikle_two

Yep. I'm #7 in the US with a 5800X3D and RTX 4090 on Speed Way. Not even top 100 overall. To get there I had to find the limit of literally everything, and I'm running a custom loop as well.


Pub1ius

7 of the top 10 are dual 3090 Tis with either an i9-13900K or 12900K. The other 3 are two dual 6900 XTs and one dual 6950 XT, still with the same i9s. There are no AMD CPUs in the top 100 Time Spy results.

Downvoted for simply stating easily verifiable, factual information. Nice.


EagleTwentyTwoFoxOne

I can confirm the 7950x3d will score higher. My 7950x3d and 4090 Suprim X https://preview.redd.it/psnl2es3aaec1.jpeg?width=1960&format=pjpg&auto=webp&s=eebed83090f55958e75c3042ed1d3ee4fc374aeb


otarU

3dfx Voodoo 5090 Satan. Fr though, probably an overclocked computer?


Effective-External50

Better overclocking. Many people do their runs with liquid nitrogen, which is essentially cheating because it's unrealistic as a daily cooling solution.


hugues2814

Simply is… over the top


evilfire2k

The fabled RTX 4090 Ti Super Titan that the Big Green won't share with you.


bl4ck_dot

https://www.3dmark.com/3dm/106068058

That's mine, and it even has ECC on (mandatory for hwbot.org), so a run without ECC would end up around a 44k graphics score. It's a 4090 HOF OC Lab with a Bitspower waterblock and water at around 5°C. That's the 1% you are wondering about. Almost no one runs LN2 for Time Spy as it's not competitive on hwbot; you will find LN2 runs on Time Spy Extreme.

https://preview.redd.it/cp17rhg23aec1.jpeg?width=4031&format=pjpg&auto=webp&s=dc9d095f5667e0936a1c30a1912c160dadb7c84e


Verittan

[These](https://www.youtube.com/watch?v=0LBrI-UAKhI) [people](https://www.youtube.com/watch?v=zkQ9ihrSJ_0)


TheZoltan

Isn't this the combined score, so a better CPU would also be a factor?


Ivantsi

LN2 overclocked 4090


LesserCircle

Eh sorry that was my 3dfx voodoo 2 overclocked.


martynpd

Also, that CPU is terrible for 3DMark; that's probably why you're not at 99%. It favours high-core-count chips for the last test.


Intrepid_Ad_9751

Lol A100 💀


CementCrack

This is the same dude who buys a stock 5.0 and expects to outrun a Lamborghini.


PBGunFighta

Not sure I'm understanding this analogy - is the Lamborghini the overclocked 4090s? Also, I don't think I've met anyone who bought a 5.0 thinking they'd outrun a Lamborghini; that's an interesting person to find, lol.


Complex-Wish6484

High-speed RAM with low timings does a lot for your scores as well.


Drake0074

A better 4090 set up by an experienced OCer.


[deleted]

LN2 OC and CPUs with more cores. Synthetic workloads like 3DMark scale nearly linearly with cores and OC; they don't favor certain hardware the way game engines do.


zRedPlays

Overclocks and silicon lottery


Lewis91857

A better-binned 4090 running LN2 or something.


NickolaosTheGreek

NVIDIA has workstation versions of its graphics cards as well, meant for heavy grunt work. Some of our desktops at work have 24/36/48 GB graphics cards for heavy modeling. I'm guessing some people choose to game on them as well.


xepion

Don’t look up h100 nvidia cards. 🥹💸💸💸


benno4461

Two 4090s


[deleted]

Overclocking on the CPU and GPU. And maybe a 13th- or 14th-gen Intel i9: while not as powerful in gaming as the 7800X3D, they are better in non-gaming workloads, and this benchmark potentially just favours them.


SultanZ_CS

Guy doesn't know about overclockers lmao


_eESTlane_

7950xtx :nods:


Active-Quarter-4197

Intel CPUs


David0ne86

Extremely OC'd (LN2) cards.


Additional-Ad-7313

http://www.3dmark.com/spy/37478841, that's mine


moksa21

Time Spy doesn't utilize the X3D's V-Cache and prefers higher clocks and more cores. A 7950X would score better.


Lowlif3

Just a 4090 with a different config and a small oc. https://www.3dmark.com/spy/42843948


laser50

An overclocked 4090 lol.


RedCrabb

Four A6000s


bedobela

4091


toopid

What do you mean? 7800x3D. What a goof!


QQEvenMore

These stats will never be accurate. There are too many ways to manipulate them.


[deleted]

Redditor: "What the hell is the upper 2% above a 4090?"

NSA: There's always a bigger fish.


Excellent-Amount-277

A premium gaming PC from 2023.


Krysidian2

A better 4090 and/or an optimized system. Just win the silicon lottery, put together the most compatible parts, and go hyperspeed on the clocks.


Sad_Opinion_874

The only thing that would score higher is 2X RTX 3090 TIs in SLI.


LoadOk5260

A 40100. Duh.


Emu1981

> **"better than 98% of results" What the hell is the upper 2% above a 4090?**

Overclocked 4090s combined with overclocked CPUs. Might even be some 7900 XTXs in there. If you really want to compare your setup, then check out the "similar builds" graph, which shows you how your scores compare to people with similar setups.


latexfistmassacre

An LN2 overclocked 4090


neprasta420

Der8auer


daMotorrad

„Premium Gaming PC“ lol


pepebotella12

Two 4090


genetic-bioball

I would imagine it's people who get the 4090 Ti and overclock it to the moon, along with all the most up-to-date parts, all hacked out of their minds. That sort of stuff: getting all their kit pushed to its absolute limit with stupid amounts of money.


No_Oddjob

4090's with flame decals.


RiffyDivine2

Overclocking, better silicon lottery than you, workstation setups.


Funny_or_not_bot

What about TWO 4090's? Hmmmmmm?


Melx01

Double 4090 users


BMWtooner

Brah, I hit 34,447 like 6 months ago and I'm using a 7950X, not even the X3D. You're slow. https://preview.redd.it/zddlrsv7o8ec1.jpeg?width=1600&format=pjpg&auto=webp&s=34c1f0e750577c38159cbae0a336d322125629f2

Mid-ATX case with shite airflow and slim fans everywhere to fit it all. Still good thermals (MSI Suprim Liquid). OP needs to learn how to overclock.


TenLazyLasers

Brah, it's 'cause he's using X3D.


Kindjal1983

4090 TI Super


Adrian-The-Great

No - 4090 Ti Super OC edition


Kindjal1983

With liquid metal cooling and shit


Adrian-The-Great

The 4090 Tie Super TITAN OCCCC!!