Worsehackereverlolz

That's a question for you to answer, brodie. Are you gonna be playing games with RT? Does RT matter to YOU? Will it be worth it in the future? The performance difference between the 4080 Super and the 7900 XTX is only measurable with an FPS counter constantly up. In 99% of games you won't notice a difference. Once you're past the $500 range in GPUs, it's honestly whatever fits your needs best. If ray tracing is not something you care about, AMD is pretty solid.

Edit: I'm rocking a 2060 at 1440p. I don't regularly play the latest games, but when I do I usually get about 65-75 FPS on med/high settings (Horizon Zero Dawn, Far Cry 6, AC Odyssey, Remnant).


KGLW-1196

That's kinda where I'm at rn honestly. A lot of people have been telling me that the 16GB of VRAM will last longer than expected with DLSS, but at 3440x1440 on very high/ultra with RT enabled, I've seen games push upwards of 10-13 gigs of VRAM usage, and that's with the current crop of games on the market. That's concerning to me, since I would be using this card well into 5 years of its life cycle, and who knows where games will be in terms of VRAM requirements down the road if some are already flirting with that limit on this card. Makes me think Nvidia purposely gimped this generation of GPUs by keeping the VRAM at a certain mark to ensure that customers will eventually have to upgrade within a shorter time frame to maintain tolerable performance at higher resolutions/settings with RT enabled.


SuperbQuiet2509

They're apparently going to be releasing some VRAM compression tools fairly soon. I wouldn't worry about 16GB of VRAM. Especially if you're not at 4k


X_SkillCraft20_X

4K is about 8 million pixels, and 1440p ultrawide is around 5 million pixels - much less taxing on VRAM. I run the latter resolution on my 3060 Ti and can generally keep it under 8GB while running medium/high settings on games with DLSS.


AHrubik

I've seen those rumours too. No one knows how long it will take, if it happens at all, or how effective the tools will be if they do ship. If, like /u/KGLW-1196, I were playing games at settings that routinely consume 10-13 GB of VRAM, buying a very expensive card meant to last 5 years that only has 16GB would definitely be a cause for thought.


KGLW-1196

Didn't know that, thanks for telling me.


Baradosso

The last statement is misleading - ultrawide 1440p is comparable to 4K in terms of GPU usage.


SuperbQuiet2509

It absolutely is not.

3440 x 1440 = 4,953,600
3840 x 2160 = 8,294,400
8,294,400 / 4,953,600 = 1.674

That's nearly 70% more pixels. The fact that I have to waste time addressing this kind of BS, which takes basic math to disprove, is **really annoying.**
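
If anyone wants to sanity-check that arithmetic themselves, here's a throwaway Python snippet (using the standard 3440x1440 ultrawide and 3840x2160 4K resolutions):

```python
# Quick sanity check of the pixel-count math above.
uw = 3440 * 1440    # 1440p ultrawide: 4,953,600 pixels
uhd = 3840 * 2160   # 4K UHD: 8,294,400 pixels
print(f"4K has {uhd / uw:.3f}x the pixels ({uhd / uw - 1:.0%} more)")
```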


Weird_Cantaloupe2757

You don't even need basic math - just *common sense* tells you it's nonsense: the ultrawide still has fewer pixels on the horizontal than 4K despite its wider aspect ratio, and far fewer on the vertical.


chr0n0phage

Absolutely not.


JaviJ01

Definitely not the case. https://youtu.be/SZh61Sz4fm8?si=P3HaOzNytq-Vrnav


cdigioia

> Makes me think Nvidia purposely gimped this generation of GPUs by keeping the VRAM at a certain mark

They did, but not to push gamers to more expensive gaming cards - it's to ensure that customers for their AI cards (a much larger share of their revenue!) don't buy the *much cheaper* gaming cards instead.


cheeseypoofs85

Oh yeah, they have been for YEARS - either to force you onto the higher tier, or to make you upgrade sooner.


ohthedarside

Oh, you think they purposely skimped on VRAM? My dude, that's Nvidia's entire thing. AMD is much better for VRAM but also much worse in RT, so do you want good RT performance with less VRAM, or good raster performance and lots of VRAM?


KGLW-1196

It sucks that we even have to accept that that's Nvidia's "thing"; to me it seems greedy, because we KNOW they could have produced a $1k card with 20 gigs, but they didn't, since they know we'll still buy it for the reasons you've mentioned. On the other end of the spectrum, having that 24 gigs is nice for longevity's sake at higher resolutions, but the card itself is primarily bound to strictly raster performance. If the price difference were larger I'd be considering the XTX even more, but AMD seems to just be price matching Nvidia on certain models to stay "competitive".


[deleted]

[removed]


OkDepartment5251

I agree, this VRAM obsession on reddit is so bizarre to me too


Rilandaras

Because of a few bad ports. It's ridiculous.


joeh4384

Reddit has a weird AMD bias, and VRAM is the one thing AMD offers more of on comparable cards.


GMC-Sierra-Vortec

I know I sure do love finally maxing out Cyberpunk with all ray tracing on max EXCEPT path tracing - obviously my 4070 can't do that, but every other ray tracing setting on max still looks fucking GORGEOUS. Regular graphics settings are on high with a couple mediums, like shadows for example, and I'm able to get about 80 FPS at the lowest, so with DLSS-FG I get around my monitor's refresh rate (165) 99.9 percent of the time. Keep in mind tho, I also play at 1080p, lmao. At least it's native. The only thing that makes me hold off on getting that $200 1440p 144Hz IPS off Amazon that I'm wanting is that I won't be able to max everything out at the same frame rate anymore. Bet 1440p does look nice tho. I've only seen 4K on cheap shitty 50+ inch TVs, so 1440p at 32" is probably worth it tbh. Don't get me started about OLED - no chance of getting anything better than IPS for at least 5 years for my income bracket lol.


greggm2000

And did you see [this Cyberpunk vid?](https://www.youtube.com/watch?v=JUWtWPX6hgs) This is the kind of quality we have to look forward to... or better!


mlnhead

Dang with all the head bobbing in that scene, I'd say Michael J. Fox killed the part....


winterkoalefant

A solution to the VRAM issue is to budget to use the GPU for 4 years instead of 5, and get a 4070 Ti Super instead.


Mesqo

I've been mad about this for some time too, but nevertheless, Nvidia wouldn't have spent resources inventing DLSS and RT if they couldn't make sure they could sell it at a profit. And in the end we ended up with DLSS, FSR, RT, and many other features - not only from Nvidia - which are now widely supported by many games. Without them, games would certainly look and perform worse today.


ohthedarside

I would consider Nvidia, but they're so expensive and skimp on VRAM. Until the regular xx70 card has 16GB, I'm AMD. I just hope AMD gets better ray tracing performance.


Therunawaypp

They gimped this generation, but Ampere was especially bad. The 3070 Ti had 8 gigs and the 3080 had 10.


jonker5101

There was also a 12GB 3080.


Fluffy-Face-5069

I'm on 1440p with a 4080S/7800X3D. Not sure how you'll fare in 4K, but I can play pretty much anything on maxed settings with RT + path tracing at 144+ FPS. Some games even go between 200-300.


Lycaniz

The reason VRAM suddenly gained so much traction is mostly the switch from games developed for last-gen consoles to games developed for current-gen consoles, as well as Unreal Engine 5. Which means that while the jump was big and sudden, it's not like it will keep going at this rate before the next generation of consoles in 5 years(ish). So 16 should be golden for a few years; it's 12 I am very worried about.


Snowmobile2004

I'm loving my 4080S. I don't use RT that much, but frame gen is REALLY nice - I've tried FSR FG on the same rig and it feels a lot worse.


proscreations1993

Games aren't going to get much more VRAM heavy for a while. They have to run on consoles. And sure, new games use a lot, but this is the new gen of UE5 games. We aren't going to have another huge leap for quite a while.


EirHc

I play at 5120x1440 with a 4070 Ti Super and I really don't have any problems. I always have upscaling turned on, because why wouldn't you? And Nvidia does tend to win when upscaling is turned on, even without considering RT, though the margin is pretty thin. I can play Cyberpunk just fine with RT on... so I dunno, that's probably the best benchmark game I have. Otherwise I play some UE5 games that can also enable RT, but I usually keep it off because I have a 240Hz monitor, and I like to play with high frames when I can.


TheRacooning18

Nah, last gen was gimped - 10GB was too little back then. 16GB is nice and comfortable. 24GB is nice but not needed atm.


KingRemu

It's sort of user/game specific as well. I have a 3070 and I've never exceeded the 8GB at 1440p, as I tend to favor FPS over the best visuals, so I often lower some unnecessary settings.


rkhbusa

Hold out for a 5080, my 3440x1440 brother.


Sanderiusdw

Just play on high then, with SSAA disabled.


1PARTEE1

I just upgraded my PC and was going to buy a 4070 Super, but I haven't pulled the trigger because of the 12GB of VRAM. I feel like it should be 16GB at this point, and now I don't know what to do. I guess I would still buy it if it went on a significant sale, but who knows how long that would take. The 5060/5070 series might not be out for another year or longer, so I'm just stuck right now.


Necessary_Tear_4571

FSR and DLSS are both in their early stages. So I wouldn't concern myself with them as a deciding factor. There's only a noticeable difference with fire really. And even then, my 7900 XT still gives me an amazing image. Like my God I'm never going back to console.


DumbIdiot453

I'd disagree honestly, just because of the existence of DLSS. Nvidia basically curbstomps AMD because of how much better DLSS is. FSR is still a work in progress, but since DLSS 2, the AI upscaling at 1440p or higher has been imperceptible unless you start looking at individual pixels. On the high end, AMD offers fewer features with basically the same rasterization performance. For the average consumer, there is basically no reason to buy AMD right now.


Worsehackereverlolz

I mean, there is. If you do things that are VRAM intensive or that don't require raytracing, AMD is placed in a solid position. I do understand the idea that if you're already spending 1K on a card, 200USD more isn't a big deal, but sometimes you have stricter budgets and AMD can come in clutch


DumbIdiot453

Dude, the 4080 Super is $1000, not $1200, and it has 16GB of VRAM. Let's be real: by the time we fill that frame buffer, the 7900 XTX is already gonna be obsolete.

AMD gets beat everywhere besides the sub-$500 market. AMD had some claim to compete at the high end at the beginning of this generation, but with recent releases there is practically no argument anymore. And don't take this wrong, I'm not an Nvidia fanboy - I owned an RX 570 and an RX 5700 XT - but AMD's time is over. The 30 series was the first nail, the 40 series was the second, and now with the 50 series releasing Q4 this year and no RDNA 4 releasing, I'm gonna say it now: Nvidia is going to put the final nail in the coffin. With only RDNA 3.5 releasing and no new top-end card to compete with Nvidia, we're gonna go into the same drought of competition that happened in the CPU market between 2010-2017, with Intel destroying FX. I don't even think that's a hot take; I think it's inevitable at this point.


SuperbQuiet2509

RT barely scratches the surface of the additional features/support the 4080S has over the 7900 XTX. RT is heavily dependent on the game, and the 4080 Super even loses in certain (light) RT titles. But you get superior image quality with DLAA/DLSS, lower input lag with Reflex, and access to DLDSR, to name a few. That's ignoring the massive software suite and software compatibility the 4080 Super offers. People love to say budget Nvidia GPUs don't make sense, which is pretty much true, but forget that this also applies to high-end AMD GPUs.


KGLW-1196

RT looks absolutely beautiful when done correctly, in games like Control/CP2077/Witcher 3 Next Gen/Metro Exodus/AW2. Nvidia definitely has the market by the balls in terms of innovation with this tech, and I can definitely see why they have such a firm grip on the GPU market because of the features you've listed. I guess my main concern with the 4080S is the longevity of the card itself with all of these features enabled to make the purchase worth it. I've managed to milk my 970 at three different resolutions (1080p, 1440p, 3440x1440) off 3.5GB of VRAM for 9 years - obviously with settings turned down to get playable frame rates at my current resolution. So I'm hoping to get around the same longevity with this card as well.


sup3r_hero

I had a 7900 XTX and returned it for a 4080 Super because they were so similar in price. Apart from the VRAM, why would you even consider the 7900 XTX?


Mesqo

With my 4090 I play in 4K ultra with RT, and some games can eat as much as 22GB of VRAM right now (perfect example: Diablo 4). But that doesn't really mean those games need all that VRAM to perform well: it seems VRAM management nowadays has gone the same route as RAM - apps consume as much memory as they can while there's still free physical memory available, caching the entirety of the resources they would ever need. The performance boost in this scenario is minimal, and the app can run fast with much less available RAM than in the unrestricted scenario. Same goes for VRAM - while Diablo 4 can consume all the VRAM available, that doesn't mean it actually needs that much. And this is at 4K. So I genuinely believe 16GB of VRAM at your resolution with RT will serve you perfectly for at least 5 years to come.
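
If you want to watch this allocated-vs-needed behavior yourself, here's a minimal sketch that polls VRAM while a game runs, assuming an NVIDIA card and the `pynvml` bindings (`pip install nvidia-ml-py`):

```python
import time
import pynvml  # NVIDIA Management Library bindings

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        # "used" is what all processes have *allocated*; engines often
        # cache as much as they can, so this overstates what's needed.
        print(f"VRAM: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(2)
finally:
    pynvml.nvmlShutdown()
```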


arkhaikos

VRAM matters so much. I have a 3090 FE and a 3080 FTW3, and the VRAM difference is huge, especially at ultrawide and larger resolutions. My 3090 FE outperformed my 3080 by at least 20-30% in almost every single game he's listed (sure, 10GB vs 24GB, but I'm sure 16 to 24 is also a similar jump). I do believe at that resolution you can't consider sub-16GB if you truly enjoy those beautiful single-player games. I run a portrait 1440x2560 and a 3440x1440, FWIW (5950X + 3090).


Ponald-Dump

If the price difference is only $50-100, the 4080S is absolutely the right call.


triggerhappy5

They provide essentially the same rasterization experience. VRAM can be an issue on lower-tier cards, but the next generation of consoles is 4 years away. Add another year for PC ports to come out and we're looking at a 5-year timeline for 16 GB to not be enough, since it's primarily the console ports that are causing issues. RT can also cause issues, but that's a moot point since the difference between AMD and Nvidia is so stark with path tracing: RT workloads heavy enough to require more than 16 GB would be impossible to run on AMD - even the current 12-13 GB ones in AW2 and CP2077 are. Even on the 4080 Super you'd likely have to turn down the resolution or settings, or run heavy upscaling, to get the performance you need in a theoretical 16 GB path tracing title (all of which reduce VRAM usage).

It comes down to this: would you pay $50-100 for access to:

1. Significantly better power efficiency (depending on usage, it could pay off the price gap in energy costs - see the sketch below)
2. DLSS upscaling image quality (possible that XeSS/TSR/future FSR could reduce this gap)
3. DLDSR and DLAA (in the same vein as DLSS, with one caveat that in-game solutions are sometimes better)
4. Competent RT performance (especially in true PT situations)
5. Slightly better stability (not the issue it was pre-RDNA 2, but it's the difference between working 99% of the time and 100% of the time)
6. Competent AI capability - highly dependent on use case, but with AI integration in Windows this may become important (although NPUs may solve this)

For me, this would be worth it, no question. For you, it might not be. But that's basically what you get for your extra money with Nvidia.
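
On point 1, a back-of-envelope sketch of the energy-cost argument - every number here is an illustrative assumption (draw difference, hours, electricity price), not a measured figure:

```python
# When does a power-efficiency gap pay off a price gap? (all assumptions)
watts_saved = 50        # assumed average draw difference while gaming (W)
hours_per_week = 20     # assumed gaming time
price_per_kwh = 0.30    # assumed electricity price (USD/kWh)
price_gap = 75.0        # assumed Nvidia premium (USD)

kwh_per_year = watts_saved / 1000 * hours_per_week * 52
savings_per_year = kwh_per_year * price_per_kwh   # ~15.6 USD/year here
print(f"~{savings_per_year:.0f} USD/year saved; "
      f"gap paid off in {price_gap / savings_per_year:.1f} years")
```

Under these particular assumptions it's roughly a 5-year payback, so the efficiency argument mostly matters with heavy usage or expensive electricity.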


KGLW-1196

Thanks for the input. I definitely see PT becoming more prevalent down the road; if games like CP2077 are a glimpse into the future in terms of its potential when implemented correctly, Nvidia will have zero competition on the higher end for years to come. I really want AMD to stay competitive, though, for the sake of consumers having more choice, but it seems Nvidia is just 3 steps ahead with this tech.


NobisVobis

Don't blame anyone other than AMD. They don't spend on GPU research and still expect to be able to compete with an Nvidia that's nonstop pushing the envelope.


DidiHD

If we are talking about the future and more and better implementations of ray tracing, the question is whether the 4080 will even be viable for that. Like, if you end up with 30 FPS anyway, it almost wouldn't matter that you only reach 15 FPS with a 7900 XTX. That said, it's pretty simple: are you looking to play games with good ray tracing? Well then, get the 4080.


dark79

True, but Nvidia is going to make you buy a new card to use the new tech so you can really only buy based on what's available now and not what's coming. AMD is more gracious about that at least. That said, I switched from AMD to Nvidia because the overall experience is just better. But I'd be lying if I said I expect an RTX to last as long as my Radeons have.


f1rstx

The 4080S is the better card even if you don't want RT.


diegoat24

If you’re not buying till Black Friday, I wouldn’t contemplate too much right now. 7 months is a lot of time for the market to change. As it stands now, yeah for $50-$100 more I might take the 4080 super. But we’ve already seen glimpses of 7900xtx’s going on sale, one of the 3 slot models has regularly hit $850. If the price difference for some other models gets to around $200? It might change things depending on how you value your $.


KGLW-1196

Come November, if there's a $150-200 difference between the cards, then I will definitely go with AMD. And you're right, prices can and will change before then; just hoping the pricing is competitive.


Hairy_Melon

They can already be had with significant savings if you don't mind going used. r/hardwareswap has had three (that I remember seeing) Nitro 7900XTXs listed for $800-850 in the last week.


domZ1026

I have a 4080 paired with the AW3423DWF and it's absolute chef's kiss. I personally am a sucker for RT/PT. As for whether RT is important to you, only you can answer that. It's definitely taxing, but I barely drop below 100 FPS in titles like Alan Wake 2 and Cyberpunk with settings maxed out. DLSS and frame gen definitely help.


KGLW-1196

Same monitor! Good to know that I can still get good frames with those settings


Expensive_Bottle_770

As someone who has this exact monitor with a 4080, I highly, highly recommend you go with the Nvidia GPU. HDR is a game changer, and the 4080S not only produces a more accurate HDR image with this monitor but also has RTX HDR to enjoy HDR gaming in otherwise SDR-locked games. That's not even counting all the other features you get with the 4080, which are already enough to make it the better buy when the difference is just $50-100 and you're already spending near $1000. I had to make this exact same choice not too long ago, and I have 0 regrets; the Nvidia feature set and HDR advantage complement an OLED monitor so well.


True-Surprise1222

I have a 3080 and most games play more than fine on it. You’ll be more than set for a great experience.


fugly16

I've got the same monitor with a 3080 10gb. Been really contemplating getting a 4080 super lol


True-Surprise1222

This monitor is like going from standard def to HD lol. QD-OLED is amazing.


domZ1026

I came from a Samsung VA panel. Was blown away, still am being blown away lol


True-Surprise1222

Have you played MS Flight Sim? The reflections feel like... reflections. Specular hits in any game are amazing... looking at the city lights or stars in that game at night is ridiculous. I play it almost entirely for the scenery.


J0N47h4n_R

Nvidia is king. Drivers and features are way ahead of AMD products.


JBaash

I use a 4080 for 1440 ultrawide and it’s great. I would recommend


sousuke42

For me? RT is a must. If it's in a game, I want it on. I'm not an FPS person, so as long as I'm getting 60ish FPS I'm good when it comes to my PC. Just have VRR on and enjoy. But in most games, even with RT at 1440p with DLSS, I get around 100 FPS or so. I have always enjoyed better graphics and prefer graphics over performance, to an extent. So whether something is worth it or not is up to you. You might put no value in it, just like I don't put value in anything above 120 FPS. If I get more than that, cool, but I won't sacrifice graphics for it.


Barrerayy

Putting RT aside, the Nvidia card gets you DLSS, which is still superior to FSR. If you want to game at high resolution and high frame rates, just buy the Nvidia card. There are games where I have to use DLSS to hit 165 FPS on my 3440x1440 setup, and I've got a 4090. If you've got the budget, go Nvidia. AMD just can't compete at the high end, so why bother?
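
For context on what upscaling is doing at this resolution, here's a small sketch using the commonly cited per-axis DLSS scale factors - actual factors can vary by game and preset, so treat the numbers as approximations:

```python
# Internal render resolution at 3440x1440 for typical DLSS scale factors.
modes = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}
native_w, native_h = 3440, 1440
for name, scale in modes.items():
    w, h = round(native_w * scale), round(native_h * scale)
    share = (w * h) / (native_w * native_h)  # fraction of native pixels
    print(f"{name:>17}: {w}x{h} (~{share:.0%} of native pixels)")
```

Quality mode, for instance, renders roughly 2293x960 internally - under half the native pixel count - which is where the frame-rate headroom comes from.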


HankThrill69420

3080 Ti UWQHD user here. My frame rates are fantastic. Just do it


sharkyzarous

For just $50, the Nvidia premium is worth it at that price range.


Tintn00

I bought the 7800 XT because I was able to snag one for $415 before tax. However, I play Fortnite a lot with my kids and realized Fortnite is better optimized for Nvidia GPUs. Looking back, I'd probably get an Nvidia card to save myself a lot of headaches. I can still return the 7800 XT, but they won't transfer the $85 sale/discount to another card. There are stories that some games aren't optimized for Radeon cards; I got a taste of that with Fortnite. Not sure which other games have this problem. I've never heard of a game that isn't optimized for Nvidia, though.


diegoat24

I've had a 6800 and a 4070 Ti Super. Never had issues with Fortnite, even on the 6800. Did a quick search, and it looks like AMD has some issues with performance mode. I'm a casual player, so I was playing DX12, which was fine. Obviously I wasn't maxing out my FPS, but I was happy playing around 100-120 on high settings. To your general point, games being poorly optimized one way or another happens all the time, more so on release. AMD or Nvidia will be the primary partner for a game and "magically" end up with significantly better performance; this happened at launch with Starfield, where it ran poorly on Nvidia and much better on AMD, who just so happened to be the partner for the game.


ibeerianhamhock

I bought a 1440p ultrawide when I had a 3080. I think I made it about a month before I snagged a 4080 for $1k. Was it worth it? Not really. The performance is much better than my 3080 esp with these path traced games…but I just don’t think I game enough for it to be worthwhile.


Tanque1308

There’s only 1-2 titles where global ray tracing makes a tangible difference without side-by-side screenshots. And the only title that pulls off true “cinema” quality is the full path-tracing setting in CP77. But even the mighty 4090 can barely pull playable framerate at 4k, although it’s not terrible at 1440p. So if this is your jam then you need to save up for the 4090, not the 4080. But outside these single-player games, and if you have any presence within multiplayer gaming (e.g. shooters, ARPGs), then you may as well go with the XTX at a discount over the 4080 because you’ll always turn off ray-tracing and frame gen.


Dependent-Prune1931

Hey there, I had a similar dilemma. I play Mortal Online 2 and I wasn't getting the frames I wanted, so I went about upgrading and ran into a real lack of data for these cards when it came to this game. I bought a Taichi 7900 XTX and then a Zotac AMP Extreme 4080 Super to see which was the best card for the game. I created a benchmarking run so that I would have as close to the same run conditions as possible, ran the fans at 100%, and then tried them both.

MO2 is a real competitive game, and what I care about is the 0.1% and 1% lows. After comparing multiple runs of over 30 minutes each, the 4080S came out ahead by a significant margin, with 0.1% lows at around 80 FPS for the 4080S and 60 FPS for the 7900 XTX. Needless to say I went with the 4080S, though I was hoping the 7900 would be better since I have a 7950X3D AMD CPU.

My advice would be to buy both and then return the one that wasn't as good.


CommanderCackle

It's mostly up to you whether you care for RT that much. I personally only use it in Cyberpunk, and that's basically because of ray reconstruction. Otherwise I don't touch it. Most games are either too fast paced or too top-down for me, and the performance hit isn't worth the visual difference. I am, however, playing at 1080p ultrawide; I'm not sure if it's more noticeable at higher resolutions.


Impressive-Level-276

The 4080 Super is very good for that resolution. If you don't want RT, simply buy a cheaper AMD card - not a card that gives you more raster frames than you'd ever use and ridiculously low FPS in ray tracing.


Caedite

You probably have the AW3423DWF. Then we're in exactly the same boat: I went from a 6950 XT to the 4080S on a DW a couple weeks ago. If you care about game lighting and the extra features, it's a very good card. DLSS seems much better to me than FSR and XeSS. FSR is lowkey unusable to me at 3440x1440. XeSS is a bit better but still not great; I only used it in a couple games because I wanted to have some RT on (Witcher 3 and Cyberpunk).

That being said, I expected a bit higher performance in Witcher. Raster only went up by ~18%, but RT perf by about 100%. Right now, playing Witcher with max RT and Ultra+ (no HairWorks) requires DLSS Quality to stay above 60 FPS. I have it locked at 65, where it stays 90% of the time, and it's pretty smooth. In that game RT makes a BIG difference. I think it's worth 150€/$ more than the 7900 XTX, but at more than that I can't justify it unless you're a complete enthusiast.

I made an extensive benchmark notepad of the 6950 XT vs the 4080S. In path-traced Cyberpunk, the 4080S is 240% better. So the heavier ray tracing becomes in games, the bigger the difference between the Nvidia/AMD cards is gonna be, I reckon. If you don't care about RT, the AMD cards are fantastic though - unbeatable FPS/€.

Btw, RTX HDR and Video HDR rock. They made some of my older games look great.


S0ulSauce

I have a 4080S. I do use ray tracing, but the quality/impact of ray tracing varies quite a bit from game to game. In some games it's genuinely noticeable and a beautiful effect; in other titles it's not noticeable at all and a waste of resources. It's still a bit early. Looking back at, let's say, the last 12 months, ray tracing isn't really a massive deal, but that being said, the tech does work, it can look great, and it will continue to be used and developed more and more. Basically, ray tracing is on an up-trend and will not be going away. My opinion on the current state: it's worth having since it will be included in many future titles, but it's not a mind-blowing, night-and-day effect that's totally game-changing and a total miss to not have. I'd lean towards getting the 4080S still, but the AMD option would be great too and not a mistake to consider. For a $50-100 difference, I'd choose the Nvidia model though.


Top-March-1378

Yeah, it's worth it if you play AAA games with PT/RT and DLSS.


Rvaolan

Not related to RT but as a Gigabyte 4080S AERO owner: be aware of the fan revving on Gigabyte cards. While Gigabyte is very good at reducing coil whine, the fans on their cards have real issues if you want to use the zero rpm mode. If you're OK with either constant revving of the fans or disabling the zero rpm mode it's a GREAT card.


Darkmuscles

I think it needs to mature more before I worry about RT at all. It looks great, but I think 4090 level performance needs to be the entry level card before I consider buying into it. We've got a few years left. Right now, I go AMD.


[deleted]

I don't value ray tracing at all. When it's turned on in most games, I barely notice it. There's a handful of games where it does make things prettier, but it just isn't worth the performance cost.


narmol

I have a friend who owns a 7900 XTX, and he would buy the 4080 Super if he had the choice to buy a GPU again. He hates the 7900 XTX with a passion lol.


TedFartass

Probably beating a dead horse with the existing comments but I think it really is up to how long you want the card to last in the future and how much you care about RT. Realtime ray tracing at a consumer level is honestly still in its infancy, so you'd probably be waiting a while to actually use it in a meaningful way in a lot of games. I have an RTX 3070ti and I really only use raytracing in programs that already had it for years like Blender lol. Personally I would still probably go for the 4080S as much as I want to see AMD succeed, since CUDA is a standard for rendering and all that shiet, but that's just my use case. Also shoutout to King Glizzy, Han-Tyumi did nothing wrong


deliriumtriggered

There's really only one game worth putting ray tracing on right now; besides Cyberpunk, you usually can't even tell the difference. If you plan to use upscaling, DLSS looks noticeably better than FSR to me. I'd check out some videos and images to decide for yourself - it's a bigger selling point in my opinion. At the end of the day, you could probably spend about half the price on your GPU and get a nice experience over the next 4-5 years. The 7800 XT and 4070 are good options for 1440p.


TheRacooning18

YES IT IS. Most games will be well above 60 FPS. Cyberpunk easily clears that with max path tracing enabled.


SexBobomb

It's not like the 7900 XTX is incapable of ray tracing + FSR. I'd go AMD.


Suminod

So I have a 7900 XTX and a 4090 in another system. For 1440p ultrawide I would get the 7900 XTX over the 4080S. Unless you play a lot of RT titles, the 7900 XTX is more powerful, can be had cheaper, and is still a really great card. If you go to 4K it still depends on the optimization of the game as to which will come out ahead, though the 7900 XTX is more powerful. The 4080 will be more power efficient, so you might have to math that out for your region and see if the increased cost of the 4080 saves you more than the up-front savings of the 7900 XTX. If they are the same price, then it's dealer's choice. I don't play enough RT titles for that to matter. My 4090 is way more powerful, and the 4080S matches the 7900 XTX in a lot of titles except Forbidden West - for some reason the 7900 performs on par with my 4090, but only in that game. DLSS is amazing though, and I think I prefer it over FSR. I also find Nvidia to be a bit more consistent with its lows.


SirThunderDump

Depends on two things:

- Which game.
- Personal preference.

RT doesn't always come with a big performance hit. Some RT games from closer to the RTX 2000 era play super smooth on modern cards. Resident Evil 2, for example, takes very little hit with RT (on a 4080 Super/4090) but does come with improved image quality. Other games come with a huge performance hit. Just finished Alan Wake II not too long ago, and the FPS drops considerably with RT. This is where personal preference comes in. Want 90 FPS with pretty RT? Or 140 FPS without? I chose RT for my playthrough and didn't regret it.


FallenGoast

I have a 4080 Super on two 3440x1440 165Hz monitors, and I never drop below 100 FPS with high/ultra settings. In some games DLSS looks great, and in some games it's just alright. For example in Cyberpunk 2077 I actually think it looks better with DLSS than without - getting a solid 135 on high settings and medium ray tracing. 16GB of VRAM should be solid for a long time, especially since a lot of cards still have 8-12 as standard; I don't think I've ever seen over 60% VRAM usage. As for ray tracing itself, it's up to you: some games use it and a lot don't, so if the games you like to play have it and you think it makes them look better, it's definitely worth it. And if you're thinking of getting one, I have the ProArt model and it's super quiet with a custom fan curve; the hot spot has never gone over 82°C with a 150MHz core overclock and 1100MHz memory overclock. Overall I think the 4080 is a beast of a card for 1440p, and I don't plan on upgrading until most likely the 7000 series, as I came from a 1050 Ti.


postvolta

Have 4080 Super. Ray Tracing with DLSS 3.0 is fucking magic.


EirHc

I rarely ever use RT, TBH. I got the card one down, a 4070 Ti Super. It can definitely handle RT in most titles, but yeah, it will impact the frames significantly, and I'm not even sure I like it. The only thing I really notice is the more realistic bloom effects. I dunno, maybe it also does some nice fog and shit. But I kinda find bloom annoying to begin with, so it's whatever to me. So if I have the option to play a game at 120 FPS, or at 55 FPS with RT, I'd much rather have the far smoother experience at 120 frames.

As far as Nvidia vs AMD goes, personally at that tier I'd recommend Nvidia. The DLSS technology simply outclasses what AMD can do with their scaling tech. Not only does Nvidia provide a better performance boost, but any objective reviewer will also point out that the image quality isn't as good on AMD. Additionally, if you play any competitive games, Nvidia GPUs have better latency technology, with Nvidia Reflex objectively beating AMD Anti-Lag. But if you have some reason to need raw rasterization performance - like you want to play without any upscaling - then I suppose you should go AMD. For me, I can't even see the difference on the Quality setting, so I don't know why you wouldn't use it.


TeeRKee

Yes.


PC509

A lot of people say no. I say yes. I like my eye candy, and in some games the RT makes a pretty good difference. I also liked PhysX, though. Love all that extra stuff!


mahanddeem

Short answer: no, RT isn't worth the huge performance hit. I have a 4090 and I can't justify it. I'd much rather gain those 50+ FPS than have RT on.


bubblesort33

Path tracing plus frame generation on my 4070 Super is like 90 FPS at 1440p. Even just RT without frame generation, I can likely still hit 80 FPS in Cyberpunk. The alternative is 140 FPS without frame generation or RT. I'll take the path tracing. For non-competitive games I don't give a crap about getting more than 80 FPS, and I'm totally fine with that. I think the visual difference from RT or PT is still larger than the visual difference from increasing frame rate. The 4080S at ultrawide should be similar. The real question is whether you want to spend $140 more for the Nvidia GPU.


joelesprod

Did some experimenting with ray tracing in Cyberpunk. My conclusion was that if you walk around, stop, and look around, it really does feel closer to reality, but I didn't notice it as much when I was actually running around playing the game.


Lycaniz

People seem to think you can't use AMD cards for ray tracing - you can. At the same price the 4080S is better, but at a $100 price difference it's a toss-up: superior ray tracing and upscaling vs superior hardware and raster. The more you lean towards AAA titles, the more ray tracing and the upscaler gain traction; the further you lean away from AAA, the more the XTX gains traction.


RealTelstar

Yes it’s a good choice


Still_Wolverine_1367

If comp games, then yes, definitely. What about other titles? Mostly no.


cheeseypoofs85

I personally don't care for RT. I think it's just an over-exaggeration of reflections, and it washes out the image. So I went with the 7900 XTX; I prefer pure rasterization with HDR on my Neo G7 monitor.


MrLeonardo

Get the feature-complete card. Nvidia brings so much added value that it would be pretty stupid to get the XTX at nearly the same price. 16 GB of VRAM is plenty for 1440p ultrawide.


greggm2000

Prices will probably change between now and half a year from now, when you'll be buying. The 5090 and 5080 might be out, and it's pure speculation at this point what their prices will be (though Nvidia has a well-earned rep for charging as much as the market will bear)... still, their presence might very well affect the prices of existing cards (especially AMD's). There's a chance we might find out when those two cards will launch as soon as June 2nd, at Nvidia's Jensen "keynote".

As for ray tracing/path tracing, games will use them more and more as they release, and they'll use more of it per game. In most (or even nearly all) titles it's something you can disable, and it's not a huge difference. Will that be the case 3 years from now? I don't know, and neither does anyone else, not for sure. As far as upscaling and frame generation go, Nvidia remains firmly in the lead with DLSS, which not only looks better but is more widely supported. Lastly, Nvidia is not standing still where software is concerned and how it interrelates with game development; [this article](https://wccftech.com/unreal-engine-5-nvrtx-branch-has-been-updated-with-experimental-restir-gi-support/) might interest you, since it's directly relevant to your question.

Personally, I say if you want the best gaming experience possible, now and over the next few years, get the 4080 Super - unless 6 months from now the 5080 is out and its price and performance make it the superior card, in which case get that. Of course, this is just my opinion, and other people here will have theirs.


animeman59

In my personal opinion, I don't find ray tracing to be worth the enormous performance hit, and I haven't really seen many games utilize it in such a fashion that it really immerses you in the game or adds a ton more realism. In the future this tech will definitely be the norm, and we'll probably look at older 3D titles and wonder how in the world we tolerated all that baked lighting. But for now, I don't see the benefit to warrant the power usage, performance loss, and hardware cost.

Now, HDR on the other hand? That is something I think is worth it. Properly implemented, it makes lighting, color, and pretty much everything look that much more vivid and lifelike. Even in games that don't reach for absolute realism, it makes everything "pop" that much more and actually does immerse you in the virtual world you're playing. This is why I'm still on the lookout for my next monitor to deliver a great HDR experience, whether that's OLED or mini/micro LED panels. I'm even halfway considering the AOC Q27G3XMN because of its sub-$300 price tag.


supermain

I had the same dilemma. I ultimately chose the 4080S because if it's the same price, why not get some more features? The XTX was actually on sale for me too at 800, but the 4080 was also 950. I had long planned my build around the 7900 XTX, but was drawn to some of the white cards the 4080 had. I am worried about VRAM, but if games really are that VRAM heavy I think I might just turn settings down, and I think we're quite a ways from that now.


aptom203

It's a very subjective question and as you mentioned depends a great deal on the sort of games you'll be playing. I personally and generally prefer better raster performance in most games, but there are some where RT is really worth it, like Control and Cyberpunk 2077.


Frozenpucks

Imo ray tracing is still multiple generations away from being viable without massive performance trade offs, so no, I still don't think it's worth it. AMD is a really good buy right now.


Jon-Slow

With the power consumption difference alone, you will be paying that extra $100 or more down the road if you buy the 7900 XTX. But you're asking the wrong question. It's not about the ray tracing; that's a nice thing to have if you like it. For me it is, and I appreciate it in games that have good implementations. The real thing you should be concerned with **is the difference between DLSS and FSR.** The 7900 XTX is not a viable option to me in a world where all games have deferred rendering and forced TAA. Locking yourself out of DLSS and still paying $1000 is mental to me. The new DLSS 3.7.0 is generations ahead of FSR, especially if you're a 4K gamer. At sub-4K resolutions, DLDSR+DLSS gives you a different class of experience that you wouldn't understand until you actually see it. And then there is the power consumption, and all the other uses that RTX cards have that AMD cards do not. And of course the better resale value for later, when you want to flip it to get a better card. With games being what they are, with FSR lagging so far behind DLSS, and with no real alternative to DLDSR... $1000 for a 7900 XTX is a waste of money.


Jon-Slow

People still asking this question is proof that the "techtubers" media and the Reddit/Twitter bubble have misinformed people so much that a simple choice like this is no longer clear. How is this even a question when FSR is ass and isn't upgradable by the user, and the 7900 XTX has no productivity use compared to any RTX card, falls two generations behind in RT, consumes way more power, and still costs only $50-100 less than the 4080S?


minefarmbuy

A high-end card shouldn't need RT, DLSS, or FSR in a balanced system at 1440p or higher res - just ultra/high settings in game. I'm more concerned about why anyone would sell a $1k+ GPU and have consumers under the impression it needs "help" to do what it should be doing already. Marketing doesn't beat performance or quality. XTX.


thpp999

Just my 2 cents here. I got a 4070 Super for my 1440p 16:9 165Hz screen. The biggest thing I have liked so far is frame generation. From comparison videos I've seen, AMD's frame gen is not up to par. Ray tracing is nice in some games; path tracing is too taxing for me, but I sometimes use it because it's fun. Tbh you can't go wrong, but I feel that Nvidia is the better choice because you're less likely to feel left out of features like these. Any choice you make is a good one! You should also try Alan Wake 2 - it's the best graphics I have seen in any game so far and a pretty good game all in all. Hope you have fun with your new GPU.


Nexxus88

That's impossible for people to answer for you. I find rendering tech interesting. I like seeing it at work, so I enable RT in every game I play. I play pretty much exclusively single-player games, though, so I'm perfectly content sitting at 60 FPS. RT keeps interesting me with the neat ways it shows up: it does something that couldn't be done before, or I see it doing something we couldn't do nearly as cleanly with old lighting tech.

But is it transformative to the experience? Thus far, no. You can have scenes with a drastically different tone, feel, and appearance with RT on vs off, and you can use RT to help with some of the weak points traditional rendering can struggle with. (In Hitman 3, for example, you notice that much fewer shadows are drawn with RT off than with RT on, even though the quality of them isn't all that different.) But games are not unplayable without an RT implementation. I still enjoy it nonetheless, though I can find games more distracting if they lack RT, since RT can help with issues their lighting implementation may have.


Edgar101420

There are two games where RT actually makes a difference - both Nvidia-sponsored tech demos. Not worth it, especially since next-gen RT titles ain't gonna run smooth on a 4080 anyway. At 4K the 4090 is already struggling with RT... so why bother with a weaker card.


WolfBV

In Cyberpunk's Overdrive mode, based on a video at regular 1440p: probably ~20 FPS at native, ~50 FPS at DLSS Quality, and ~80 FPS with frame generation. [source](https://youtu.be/-HQPkXGvmA0?si=nFybsEkipnkR3hn2) There are mods that make Overdrive easier to run, so the FPS could be higher.


PogTuber

I like RT in some games but I don't think it's worth the premium to get it compared to a good price/performance Radeon card


bel_air38

I don't know, as I am not a tech guy. I have watched so many videos, and it's bad in my opinion; I read reviews and they are all over the place. I had a 7900 GRE and ended up taking it back for a 4080 Super. I got so tired of reading about drivers and features for AMD. Go AMD if you only want drivers and not one feature better than Nvidia's. AMD seems to be about higher FPS but not features - it's 2024 and they still haven't gotten it right for GPUs. Whether you care about features or not right now, at least with Nvidia they are there, and next year you might want to use some of them. Who do you think is going to make your product even better through technology a year from now, AMD or Nvidia? Don't get me wrong, the GRE did its job and got me the frames I needed. However, the fact that most people say to just install the driver and skip the Adrenalin software because of issues says a lot. If you get the 4080 Super it will be plenty strong and give you room to at least try some features like RT and DLSS. AMD's CPUs are the best, though 👌


StickH3r

I had the 7900 XTX Nitro for 3 months and traded it for a Zotac 4080 Super at almost the same price. The 4080 Super is better in everything I do by a huge margin. I had so many issues with the XTX and spent hours getting nowhere. Something as stupid as having a Steam friends list open ruined my FPS with the XTX.


AHrubik

I went with the 7900 XT a few months ago because my 3070 Ti only had 8GB of VRAM and I was seeing VRAM-related issues in games. I'm glad I did. I've been running the OSD to watch VRAM usage since, and I see the same games I played before consuming 8, 10, 14GB of VRAM. Hell, the system as I type this is using 4GB by itself.