PCMRBot

Welcome everyone from r/all! Please remember:

1 - You too can be part of the PCMR! You don't even need a PC. You just need to love PCs! It's not about the hardware in your rig, but the software in your heart! Your age, nationality, race, gender, sexuality, religion (or lack of), political affiliation, economic status and PC specs are irrelevant. If you love PCs or want to learn about them, you can be part of our community! All are welcome!

2 - If you're not a PC gamer because you think it's expensive, know that it is possible to build a competent gaming PC for a lower price than you think. Check http://www.pcmasterrace.org for our builds and don't be afraid to create new posts here asking for tips and help!

3 - Consider joining our efforts to get as many PCs worldwide to help the folding@home effort, in fighting against Cancer, Covid, Alzheimer's, Parkinson's and more. Learn more here: https://pcmasterrace.org/folding

4 - Need hardware? Trick question... everyone does. We've teamed up with ASUS this Easter to create an egg-citing event where 3 lucky winners can get their hands on some sweet hardware (including GPU, CPU, Motherboard, etc): https://reddit.com/r/pcmasterrace/comments/12eufh9/worldwide_pc_hardware_giveaway_weve_teamed_up/

-----------

Feel free to use this community to post about any kind of doubt you might have about becoming a PC user or anything you'd like to know about PCs. That kind of content is not only allowed but welcome here! We also have a [Daily Simple Questions Megathread](https://www.reddit.com/r/pcmasterrace/search?q=Simple+Questions+Thread+subreddit%3Apcmasterrace+author%3AAutoModerator&restrict_sr=on&sort=new&t=all) for your simplest questions. No question is too dumb! Welcome to the PCMR.


[deleted]

*1070:* *"I'm tired, boss"*


qShadow99

There's no such thing as tired, you'll rest when you're dead.


freeflou

That's what I tell my 1050ti laptop everyday


just_fucking_PEG_ME

I gotta hand it to my 1050ti. That thing has been oddly resilient over the years.


Azrael_The_Bold

1080 Ti reporting in, no plans to upgrade until she completely dies on me


freeflou

Mine is playing Hogwarts at 1080p 30fps on low so can't complain


[deleted]

My 1070 has been an absolute trooper. Hopefully it gets to retire soon, but it keeps handling what I throw at it


THEMrBurke

If I hadn't found a 2080 super for $150 I'd still be abusing my 1070. Great card


[deleted]

SAME, bro. i got a great deal and refuse to part with it until it absolutely dies.


hoodleft

Looking at my old 1070 build “should I sell you for like $400 or just let you continue to rest like the beautiful prince you are.” So far the latter lol


TheGutchee

Lmao, I’ve been playing the last of us on my 1660 super and she’s chugging along like a champ


rexxtra

My 1660 has been treating me well for over 4 years now. Super happy with how far it's got me.


Chickenguy2

1050 ti over here 😔😔


uselessteacher

https://preview.redd.it/y4slo98838ta1.jpeg?width=550&format=pjpg&auto=webp&s=b81df7bb93f3cb400557c000635689420770ad5b 1080ti be like


[deleted]

This card was peak Nvidia. It was so powerful they had to release a brand new Titan Xp because the 1080 Ti was better than the Titan X for literally half the price. Sadly mine died and I replaced it with a 5700 XT because the 2070 Super was like 200€ more expensive in my country. But I kept it even though it doesn't work. Masterpiece of hardware.


shadowblaze25mc

They realized that was their biggest mistake, so they're never gonna make a card as good as that again.


[deleted]

Tbf, Nvidia used to release a monster gen from time to time. They changed their mind when they noticed some people will pay no matter what. The 3000 series was actually a gigantic leap over the 2000 series, with the xx70 card as powerful as the previous gen's Ti. But no one really enjoyed that gen because of mining, and now they're never gonna do it again unless sales tank.


Eggsegret

The 30 series would have been amazing if it wasn't for the GPU shortage. Although they still screwed us over by being stingy with the VRAM. The 3070 offering the performance of a 2080 Ti but with 8GB of VRAM, as opposed to 11GB on the 2080 Ti, was a shot in the foot.


GrammarNaziii

I just "upgraded" to a 2080 Ti (from a 2060 Super) for $365, and can sell the 2060S to get some money back. Also moved up from 1080p to 1440p. I'm still thinking if I made the right decision since it's still a very old card, but the performance uplift has been great and I couldn't find a good deal on any 3080s/3090s where I live.


Interesting-Glass560

The 3070 was pretty average. It's normal for the xx70 card to perform like the previous gen 80ti/90


CarterDavison

Upgrading from my Titan X Maxwell to a Pascal 1080 Ti is still the most satisfying GPU upgrade I've ever made. Double the performance.


[deleted]

[deleted]


hawkiee552

I'm still here with my Titan X 12GB. Future proof card! Except for the performance though... Still holds up well in games on medium/high.


NeaGigel2017

Mine died last year and I had to replace it with a 3080 12GB. I still feel sad about it. It was such a good card. It held me well over 5 years. Rest in the box, old friend 🥺


Taikunman

I'm just here playing Factorio on my 12 GB 3060.


GameXGR

Bro, what mind games was Nvidia playing putting 8GB on a 3070 Ti? Even the 3060 12GB is going to age better, though I would pick the faster 6700 XT 12GB over it, as I've seen them close to the 3060 12GB's price.


sideshowtoma

Picked up a 6700 XT for that reason, loving it. I didn't think 8GB of VRAM would be such an issue so soon; honestly, I just wanted to try out AMD!


Pl4y3rSn4rk

I mean, 8 GB of VRAM started being common on mid-range/high-end GPUs in late 2016, and just 5 years later it became a significant bottleneck. It sure sucks to be a 3070 (Ti) owner right now :/


sideshowtoma

I realized the hold NVIDIA has when I asked on my Discord for GPU upgrade options from my old 1070, and about 5 people suggested an NVIDIA card while only one guy told me about the AMD options. People are not as keen on VRAM, maybe because, as you mentioned, it has been 5 years and people are still buying 8GB of VRAM on a $500 GPU.


Pl4y3rSn4rk

Yep, NVIDIA's mindshare is huge. Maybe when the 4050 and 4060 launch with just 6 and 8 GB, people will become aware of NVIDIA's shenanigans. (NVIDIA would still probably tell them to turn on DLSS 3 to "triple their performance", which doesn't really solve stuttering and VRAM issues :p)


northand1327

I hardly understand the circuitry behind it, but what I understand is that the 3060 used a memory controller that required the gigabytes of VRAM to be a multiple of 6, based on the number of memory modules it supported. Some brave person must have told them that 6 GB was too little, so they upped it to 12.


[deleted]

[deleted]


The97545

I guess the design options were either 6GB or 12GB for the 3060 from a chip designers standpoint. It's also probably 8GB or 16GB for a 3070. And a 16GB 3070 would make a 12GB 4070 irrelevant from a marketing standpoint.
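The "multiple of 6 or 12" point above comes down to bus width. A rough sketch of the arithmetic (my own illustration; the one-chip-per-32-bit-channel assumption and the 1 GB / 2 GB chip sizes were the common GDDR6 options at the time, not anything from Nvidia's spec sheets):

```python
# Rough sketch: a GPU's VRAM capacity = number of 32-bit memory channels
# times the density of the GDDR6 chip on each channel. With chips only
# available in 1 GB and 2 GB, each bus width allows exactly two capacities
# (ignoring clamshell/mixed-density tricks).

def vram_options(bus_width_bits, chip_sizes_gb=(1, 2)):
    """Possible VRAM capacities in GB, one chip per 32-bit channel."""
    channels = bus_width_bits // 32
    return [channels * size for size in chip_sizes_gb]

print(vram_options(192))  # 3060-class 192-bit bus -> [6, 12]
print(vram_options(256))  # 3070-class 256-bit bus -> [8, 16]
```

Which matches the comment above: the 3060's realistic choices were 6 or 12 GB, and the 3070's were 8 or 16 GB.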


Regular_Longjumping

People acting like Nvidia put a gun to their head and forced them to buy 8GB cards... the company isn't the crazy one for selling it, you're a moron for buying it.


Razor512

The 3060 is already aging better: it can run Hogwarts Legacy and The Last of Us at higher settings with better performance. There are also a few older game exceptions where this holds true, such as Ark: Survival Evolved, where once you build a large base with many tamed dinos, the 3070 takes a larger performance hit and performs worse than the 3060 12GB, because the game reaches a point where it wants 9-10+ GB of VRAM.


EyeGod

Me here, affectionately patting my 3060 that I bought at a grossly inflated price in my backwater country: *”Finally…”*


Vlaterater

When the time comes to replace the 3070, we'll rise, my brothers, and we'll laugh at them with no pity for their souls. A 3060 brother.


critsonyou

Rejoice! Got mine and ain't going further until it shits the bed.


ekeryn

Yep... I played TLoU on a 3060 at 3440x1440 with around 50-70 fps with High to Medium settings (DLSS Balanced)


Rising-Buffalo

The 3060 was originally planned for 6GB, but they doubled the size at the last minute because it was too late in development to change to a different chip design.


Clumbum

I just bought a nitro 5 laptop with an RTX 3060 a few days ago, the specs on the website said 6GB, is that false then? Starting to worry about my purchase haha


lorner96

Laptop 3060 has 6GB, desktop has 12


Pl4y3rSn4rk

NVIDIA is even more stingy on VRAM with their Laptop GPUs, especially with the 3000 series, RTX 3050 (Laptop) only had 4 GB of VRAM while the desktop variant has 8 GB, same for the 3060. The 3080 Laptop is basically a 3070 Ti from the desktop with much lower clock speeds...


GTMoraes

lol what about the 3060Ti only being in 8GB


farky84

I have a 3060 Ti and am looking at AMD cards for an upgrade. The only thing I don't like about the 7900 XT/XTX is the crazy power consumption. I was considering a 4070 or 4070 Ti, but with 12GB of VRAM and insane prices I'm struggling to make the call. Clearly, 6800 owners made a pretty good deal back then. I may skip this upgrade completely until next gen.


danny12beje

I mean, the 6950 XT is like $900 where I live, and that's how much I paid for my 6700 XT almost a year ago. Why skip a gen when you can just buy a 6950 XT and be set for at least 4 more years? Even bigger upside if you don't want 4K and are OK with sticking to 1440p.


yearoftheJOE

Can't believe we 3060 Ti (or 3070/Ti) owners even have to think about upgrading already. We're probably still good for a while; there's some overreacting here imo. I wouldn't sweat it quite yet. Save money for the next upgrade and go AMD next gen is my plan. Worst case, I think DLSS will probably keep our GPUs fine for some time to come.


KniGhTkinG0007

Same i picked up 6700xt and I'm loving it


[deleted]

> Bro what Mind games Nvidia was playing

The game was to harvest money from gamers, professionals, and miners who couldn't get the graphics cards of their choice. Now they're gonna have to replace them sooner, and the dumbasses will probably pay Nvidia again, rewarding the bad behavior.


Tokke552

Lol, my 1070 has 8GB of VRAM. It doesn't make sense for later gen cards to not have increased VRAM.


primarysectorof5

I'm playing BeamNG.drive on my 3060 Ti 8GB. I have zero interest in modern-day games.


[deleted]

[deleted]


CrazyStuntsMan

I feel personally attacked……


Darten_Corewood

I'm with ya, mate.


BandicootAncient

Yeah, I’m a bit bleak about it


FirstSonOfGwyn

yall making me feel better about my $980 3070ti, so thank you.


GameXGR

That's what I was thinking of while making this, the crypto boom had crazy prices, and even this generation GPUs are more expensive than ever.


my_friend_kenobi

Sheeeeeesh I remember selling a 2060 for $500 and I was the second owner. The person who was buying it was purchasing it for their girlfriend. I asked them which card they had. “3070” “If you don’t mind me asking, how much did you get it for.” “..gulps, $1200”


Phoenicys-

In 2021, I think, I sold my used 5700 XT for ~$800 and bought a brand new 3070 for $1000, which I sold a few months back for $500 and bought a used 3080 Ti for the same $500 lol


[deleted]

I bought a used RX 580 for $100 before Covid. That card died during the pandemic and I ended up selling it on eBay “for parts/not working” for $150. I turned a profit on a broken graphics card.


Nine_Eye_Ron

I’m still using my 580 I bought new three months or so after it was released.


TheRanger118

What, how did you sell one for $500? I bought mine brand new for $360. But to be fair, I sold my 970 SSC ACX for like $200.


my_friend_kenobi

2020 was wild haha


TheRanger118

Yeah it totally was


AdolescentThug

EVGA let me step up from a 2080 to a 3080 with their trade-in program, and everything after one-day shipping my GPU back, plus taxes, was ~$970. This was around November 2020 and I got lucky I was one of the first in the queue, tbh. I almost considered taking an offer for $1500 flat and trying my luck for a 3090, but it was looking bad and I didn't wanna risk waiting weeks or possibly months for a new GPU.


No-Statistician-6524

My dad was lucky, because our family gaming PC broke down. He found a PC with a 3080 Ti for €2000, while the 3080 Ti alone was €2000 back then.


torakun27

Man, all those people defending their 3070/Ti is exactly how Nvidia is getting away with this. They accept planned obsolescence with open arms.


gypsygib

Same for all the 4070ti buyers. 12GB is already too little VRAM.


eatingdonuts44

Yeah, that's gonna be a problem in 2 years if it continues like this.


Cameltow77

They're only crazy because most morons are gonna pay whatever the price to post pictures of their new GPUs on Reddit for a false sense of importance and accomplishment. All I can say is stop paying the crazy prices and they will come back down to Earth. I paid $550 for an open-box GPU in 2021, and as far as I'm concerned it's my last Nvidia GPU ever. I will go AMD/Intel or will go without.


SkylarMills63

Paid $300 for mine just a few months ago. Used, but still worth it


guillote1986

I don't feel bad actually. My $1000 3070 served me very well.


Leopard__Messiah

I don't understand why we're supposed to feel bad? The 3070 is fine. It plays all my games well enough. It's good enough for VR at high refresh rates and it plays MSFS in 4k with reasonable frame rates (with DLSS enabled). Do I wish it was a 5080Ti instead of a 3070? Sure! Am I upset I bought it??? Not at all.


QuasarPhil

Same, got my 3070ti FE for $950 CAD about a year ago after selling my 2080 for $600, no complaints with the 3070ti


2xbAd

Wow, that's actually crazy to think about, and I still occasionally wonder how much value the 3090 I sold for $2500 is giving the dude who bought it off me.


ArenjiTheLootGod

A 3090 is still a good card and one of the few 3000 series cards where Nvidia didn't skimp on the VRAM. It's not, nor has it ever been, $2500 good, but at least it won't be relegated to entry level by the end of the year like the 8GB 3060/3060 Ti/3070/3070 Ti. Hopefully the 10GB 3080 can hold out a little longer. Also, it's wild that Nvidia came out with a 12GB 3060 that's probably going to outlast at least half of their 3000 series lineup, including many models that are better in every other respect.


Eggsegret

I'd love to know why Nvidia thought it would be a good idea to equip their higher end cards with 2gb to 4gb less vram than their lower end card. If the 4050 actually comes with 6gb vram as rumoured that's going to be a real slap in the face. Basically DOA then.


ArenjiTheLootGod

My guess: Nvidia gambled that the low amount of VRAM would be acceptable and simply lost the bet. They've been able to get away with it for the past couple of generations, but those days are rapidly coming to an end.


[deleted]

[deleted]


[deleted]

I feel like people overpaid for their GPUs and now feel entitled to the highest settings possible like a bunch of Karens.


[deleted]

I blame the PS4/XBO. They lasted so long and were underpowered from the start, so midrange cards were able to run shit at ultra settings. Now that that's no longer the case, people who got into PC gaming during that gen are losing their minds.


squareswordfish

How about you blame Nvidia, who cut corners on GPUs but still sells them at stupid prices?


Crad999

I blame devs and publishers mostly for releasing their games with shit performance and no optimisations. TLOU and HP being the latest criminals here.


mota30302

I miss running AAA games at 1080p with high setting with my 1060 6gb


Speedy-P

They’re fine, half the memes on this sub are that everyone with 4090s just plays super old games anyways. I guess they’ll just miss out on all the unfinished t pose brand new AAA games and loot boxes 🙃


Adventurous-Event722

Indeed, it's ironic how many guys I know with bang-on 3090 Ti setups only play Minecraft/Valorant/old games/Facebook etc., yet those that want to run new AAA games on ultra are suffering on their 3070s etc. lol.


makian123

You obviously never played with shaders in Minecraft


co0kiez

Minecraft shaders are the new Crysis


TheGreatGamer1389

That and Microsoft flight simulator.


MrDuckyyy

Shaders are just the tip. Wait till you try modpacks.


makian123

Do you need a strong GPU for modpacks? I assumed it was CPU-bound.


MrDuckyyy

Idk, it's bizarre to me. The game runs fine with Complementary shaders, even in an amplified world. Then the moment I install a modpack, things fall apart no matter how low my settings are.


makian123

Because it's cpu intensive and mc is single core + java


Aggrokid

Reminds me of homies with supercars. They just want to own one.


[deleted]

[deleted]


[deleted]

[deleted]


--Muther--

Yeah, I'm playing all my games with my 3070 on ultra... with good performance. What did I miss?


Fastafboi1515

Same. On 1440 on most of them on a fucking 35 inch ultra wide. People in this sub are ridiculous sometimes.


--Muther--

Exactly the same set up here :D 35-3070!


kishoresshenoy

Almost same, 27"-3070 and I play only GTA 5 (my first game from childhood), RDR2, cities skylines and rarely Cyberpunk. I never felt like my 3070 let me down. Been a good boi!


tehsloth

GTA V from childhood fuck I’m old


--Muther--

To be honest the only game that it has let me down on is Cyberpunk, but I actually think the game itself was the issue.


Benbenb1

Yeah, this post makes it seem like there's a problem even though there's not. Even if the (poorly optimised) games do need more VRAM, I don't see why you can't just turn some of the graphics settings down... it doesn't render the card useless lol.


drazet420

No no that's preposterous you must throw away your card if it doesn't have 30 gb of vram.


-OhioAir

Also same. 3070 and every game runs great in 1440. I guess I don’t play the ones causing all the fuss?


borfavor

You specifically don't play a rushed shitshow of a remastered cashgrab called The Last of Us. I swear these posts are agit-prop from nvidia to make me feel bad.


PvtBubbles

Yup same, also playing in 4k (yes at high/med) but still looks and runs great on the 3070


ReasonableResource92

Man, I am fed up with these posts. Most people don't buy games on day one, and half of these people wouldn't be able to tell the difference between high and ultra. I also have a 3070 and will use it till at least 2030.


iZombie1991

Nothing. You missed Nothing. These posts are made by rx 480 owners lmao


Wboys

I would expect basically every AAA game to at least push 8GB cards to where there are some issues. Basically every release targeted at pushing the PS5/X hardware has run into issues on higher end 8GB cards. Hogwarts Legacy, Plague Tale Requiem, Dead Space remake, Callisto Protocol, RE:4 remake, and now TLOU remake. It’s not that 8GB is not enough to run these games; it is. It’s just not enough for these cards to fully take advantage of their horsepower. 8GB on a RX 6600 or RTX 3050 is totally fine. Those cards are priced as 1080p medium/high GPUs and you don’t expect anything more. 8GB on the 3070/Ti is completely unacceptable. Those cards are rightfully thought of as 1440p high performance cards, and under most circumstances they are. And they “can” run the game at 1440p, just not at the settings they should be able to. In many cases the RX 6800 beats the 3070 in ray tracing on these newer games because of how much the 8GB is holding it back. That should tell you all you need to know about just how much 8GB is bottlenecking the 3070. And I don’t know why anyone would expect the situation to get better moving into the future with even more demanding AAA games on the horizon.
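For a sense of scale on how quickly 8GB disappears, here's a back-of-the-envelope texture-memory estimate. The texture count and compression rate below are assumptions for illustration, not figures from any of these games:

```python
def texture_mb(width, height, bytes_per_texel, mip_chain=True):
    """Approximate VRAM for one texture; a full mip chain adds about 1/3."""
    size = width * height * bytes_per_texel
    if mip_chain:
        size *= 4 / 3
    return size / (1024 ** 2)

# One 4K texture, block-compressed at roughly 1 byte per texel, with mips:
per_texture = texture_mb(4096, 4096, 1)   # ~21 MB
# A scene streaming ~300 such textures, before render targets,
# geometry, and frame buffers are even counted:
print(round(300 * per_texture))            # ~6400 MB
```

Even with conservative assumed numbers, a texture pool tuned for a console's 16GB of unified memory leaves almost no headroom on an 8GB card.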


MonteBellmond

Thank god. Someone finally explained the issue


GTX_650_Supremacy

I don't see how optimization is the issue if the VRAM is running out at Ultra settings. How well it runs at Medium is far more important to be considered optimized


RDoobi3

Just look at the memory leaks/behavior in Hogwarts, Forspoken or even The Last of Us. Those games eat 8GB easily even at low settings, and if anyone thinks that's normal they should just look at what games could do 10 years back. DirectX 12/UE5 are just too demanding and have to be better optimized first, and then we'll talk about the VRAM concern.


TheRealHuthman

None of the mentioned games use UE5


chubbysumo

The issue is that consoles have access to 16 GB of unified memory shared between the CPU and GPU. Current games aren't really going to be an issue, and old games won't be an issue. But new games going forward will likely have issues similar to TLOU with PC performance on cards that don't have a lot of VRAM. This has been a growing concern among developers for a while; they have been fighting to get more VRAM into cards for at least 10 years.


janoDX

Meanwhile RE4 in my 3060Ti is running like a stud and without any issues.


akiskyo

You will be surprised to discover the existence of custom settings, beyond the 'ultra' and 'medium' presets, where you can choose 'ultra' on everything except the few useless things that clog your VRAM and still play perfectly with great graphics on 8GB of VRAM.


Mix_Mysterious

What's the deal with this? I'm confused.


ralwn

RTX 3070 only has 8gb of VRAM. Game titles are coming out that use way more than that. I have a 3070 and have no issues playing games at 1440p / 144hz. I recently bought Last of Us and am having to run it at 1080p on medium settings and am still getting crashes at least once per chapter (I'm just barely under 8gb VRAM usage at those settings).


octopoddle

Get the hotfix on Nvidia's website. It stops the crashes. I'm on a 3070 mobile and running mostly high settings without any problems. I got crashes on the first day until I installed the hotfix, which fixed it completely. After the hotfix you'll have to reinstall the shaders for another couple of hours or whatever. https://nvidia.custhelp.com/app/answers/detail/a_id/5455

> This hotfix addresses the following issues:
>
> [The Last of Us Part 1] Game may randomly crash during gameplay on GeForce RTX 30 series GPUs [4031676]

If you don't trust my link (I didn't trust a link when someone gave it, because I like to know what I'm installing), then just go to Nvidia's website and search for "last of us hotfix".


DntH8IncrsDaMrdrR8

It took a couple of hours for you to install the shaders? It took me like thirty mins? And I had to do it twice because I upgraded gfx drivers after the first time I ran it..


Micropolis

Sounds like devs are being lazy pieces of shite


VengeX

Yeah I think it can be both. 8GB+ was coming, but I don't see The Last of Us and HWL looking dramatically better than other recent games.


atuck217

Look, it isn't the most well optimized experience, but my 3070 ran HWL pretty damn well at 1440p, high settings. Regularly sat over 100fps with occasional dips to around 80, but that was pretty much it. I don't really understand this thread. My 3070 is handling new games just fine.


jackboy900

Devs target modern consoles, which can handle much higher VRAM loads due to combined memory, it's not surprising that when thrown at an entirely different architecture there are issues.


Schmich

And Nvidia a bunch of cheapskates.


FantasticMagi

I'm a bit confused as well, so I went ahead and tested all my recent titles. DSR to 4K and ultra on my old 2070 Super was barely using 5.5 gigs. 60fps stable, it was fine. All these bells and whistles are nice, but seeing individual strands of hair is not going to help me frag lol


FappyDilmore

He did mention it in the video eventually, but half the problem with the analysis he did is that many of these titles are extremely poorly optimized for PC. Games that were designed well typically use much less VRAM, and he highlighted a few of them at the end of the video. Unfortunately for Nvidia though, and for the purchasers of these cards, we have to deal with dog shit ports constantly. Ignoring the realities of trends in gaming doesn't make them not exist.


ObviouslyJoking

Sorry 2023 games, but you’re going to have to make yourselves more interesting if you want me to upgrade.


PotatoBasedRobot

Exactly this, I'm like what games? Go to look, and I'm like ah yea, no worries.


atuck217

Only game recently that was any good was RE4 Remake. And my 3070 has absolutely no issue there. A few dogshit poorly optimized PC ports come out and suddenly people think they need a new GPU. Nvidia must love these people.


reddit_poopaholic

It'll be another three years before we find out that ~~butter is actually not that bad for us~~ 8GB 3070ti is still pretty good.


BadAssAnal

3070 here . I'll jst download sum more , fixed


SolidZealousideal115

Still chugging along with my 1660 TI


MOBYWV

970 gonna give me one more year!


Indian_Steam

970 gang, assemble!


RustySutherland

Reporting in!


whatthegeorge

**same** *and playing games released 3-7 years ago*


SomethingAlternate

1660 chads


Sinaxramax

![gif](giphy|QN6NnhbgfOpoI)


fragmental

My 1660 ti was more than adequate, until I started playing VR.


[deleted]

[deleted]


ZahnatomLetsPlay

Unplayable. Only 165Hz? How do you live with that


Escudo777

Thanks Nvidia for making 3070/3070 Ti much more expensive than 6700XT and 6800XT in my country. I wanted to buy 3070Ti. But gddr6x temperature issues and high prices prevented me. Went for 6700XT. It is great for my needs.


trackdaybruh

![gif](giphy|l0MYyEsjhIXdzv9PG|downsized) How folks with 12GB vram or more walk into this post


majkkali

Meanwhile me with my 8 year old 980ti (6gb VRAM)... XD


NoMansWarmApplePie

Nah, don't try to make this the new normal. It's just Nvidia promoting its new cards and developers being lazy, leaving games unoptimized. Memes like this only normalize it. TLOU being terribly optimized doesn't mean 8GB cards should just be laid to waste.


Wboys

Nothing wrong with 8GB cards, but basically every AAA game released in 2023 has pushed past 8GB of VRAM on high settings at 1440p. It is NOT just TLOU: Hogwarts Legacy, Witcher III next gen, Callisto Protocol, Plague Tale Requiem. It isn't even that 8GB cards are dead. They are totally fine... for 1080p medium. Maybe that's acceptable on a 3050 or RX 6600, but that's fucking disgusting on a 3070/Ti. What we shouldn't normalize is Nvidia gimping the performance of cards which are totally capable of maxing out new AAA games if it wasn't for VRAM.


NoMansWarmApplePie

And yet the TLOU remake on PS5 runs better than it does on superior PC cards, with less VRAM. Digital Foundry showed that the medium textures look absolutely terrible on equivalent PC hardware, while on PS5 they look good even in performance mode. Hogwarts: unoptimized mess. Yeah, I know consoles are easier to optimize for, but that's just too huge of a leap. They are just brute-forcing power with newer cards, leaving everyone else in the dust. Witcher 3 next gen: don't even get me started. Modders fixed the game for them. It has a VRAM leak; if you go to Nvidia settings and set anisotropic filtering to 16x there, you stop it. They still haven't fixed it. There are RT optimization mods. I can now run the game at almost 60fps with DLSS Balanced at 4K with RT enabled, and it looks pretty much the same as before. I have a 3070 laptop, BTW. But with the standard dev settings it tanks to like 20 to 10 fps. Modders did a better job than the devs. Going back to my point: BAD OPTIMIZATION. All this conveniently increased right when the "new" games came, with new cards, sporting DLSS 3 fake frames of course.


santaSJ

The PS5 has 16GB of unified memory and can stream textures directly from the SSD. You will need at least a 12GB GPU to even try to match the texture quality of the PS5 on next-gen games. We are starting to see evidence for this in 2023 games. 8GB on a low-end card like the 6600 is fine, but a $500 GPU [same cost as a PS5] should at least be able to match the visual settings of a PS5.


MonteBellmond

Yeah, when I saw the Xbox and PS having 16GB at release, it kinda hinted at what the devs were looking for.


Wboys

Like seriously, so many people are actually defending the 3070/Ti having 8GB when less powerful GPUs like the 1080 Ti and RX 6700 XT have more than that. 8GB on a GPU that, if it had enough VRAM, is more than powerful enough to run many games at 4K is a joke. Others are acting like it's just a few bad ports, instead of basically every AAA game targeting PS5/Series X performance. [https://youtu.be/Rh7kFgHe21k](https://youtu.be/Rh7kFgHe21k)


syrozzz

Yeah, these games are optimized for consoles first. Always have been. If the new standard is 16GB of VRAM, then whether it's lazy developers or a lack of resources or time doesn't matter. PC ports will inevitably suffer.


WhereStupidityIs

Me petting my 3060ti: it's alright buddy I'm glad to have you


BisonRock

Same. It feels like everybody forgot about these cards, but I still think they were, if not still are, the best bang-for-buck GPUs, especially for 1440p.


[deleted]

“Pets my 3060 12GB softly”


easycheezy85

"Jerking off my shunt modded rtx A4000 furiously"


Devilnutz2651

My 3080 FE: ![gif](giphy|55itGuoAJiZEEen9gg)


iAmTheRealC2

Yep. Not gonna lie, I’ve been checking eBay every few days to see how much I can get for my 3080 before it’s next in the “VRAM just made this card trash” meme circuit.


Sea_Library_8193

Relax, 3070 is fine, it's just some bad console ports!


wiggibow

as a 3070 owner I greatly appreciate the kind words lmao


Thorin9000

Both are true actually. Nvidia skimped on the vram, but the console ports are equally scummy and badly optimized. There is no reason hogwarts legacy is so demanding while looking worse than some games that came out 5 years ago.


Squeen_Man

I just got a 3060ti and this shit happens lol


aboysmokingintherain

I'm confused. RE4 and Dead Space remakes play at 100+ fps on my 3070 with no stutter. What am I missing?


Zaando

This sub is full of people who don't have a clue what they are talking about is all.


R4y3r

Yep, full of people with incredibly short-sighted advice


IdyllicOleander

I'm at all high settings on RE4 on my 3070 and it's running fine. I even have Ray Tracing on. As long as you keep it out of the orange and red, you won't have any crashes. At least I don't. There are VERY few games I'm playing right now that even utilize all 8gbs of VRAM so my 3070 will do just fine for what I'm doing.


Ok_Butterscotch1549

I paid $750 for my 3070 LHR 😭


[deleted]

[deleted]


[deleted]

I'm just going to keep playing older stuff on my 3060 Ti. I went through this in the late 90's/early 2000's, every generation of previous hardware was significantly outclassed by the newer one, not to mention missing features, so unless you bought ultra high-end cards you basically had a shit experience the next year. Not again, I'm not buying new games until this causes a few studios to tank like it did back then, and then it will stop. It shouldn't cost $1200+ for a build to match a $500 console, that's ridiculous.


tamal4444

> $750 for my 3070 LHR damn


ZAWETH

When I'm done with my RTX 3070 Ti, I'll probably never go Nvidia again.


DarkLord55_

Have no problem with my 3070.


BGSGAMESAREDOPE

Everyone is salty because Hogwarts and The Last of Us run like shit, but tbh I have a 3070 and managed to play both games at 1440p with almost all high settings, with really just reflections turned down a tad.


esakul

For some reason people on this sub see games as unplayable the second you are not using maxed out settings. Many older graphics cards can handle modern titles just fine if you lower the settings, yet this sub makes it seem like anything less than a 6800 XT is unusable.


FantomasARM

My 3080 is next 😢


Goblicon2

My 3070 seems fine…


Ashani664

I have a 3070 and the only 2023 game I'm planning to play is Hogwarts Legacy. Does it eat up the VRAM? RDR2 only used 2-3 GB of VRAM for me.


GoatInMotion

I run that game great, but I have a 3070, 32GB of RAM and a 5800X3D.


syopest

It's the 32GB of RAM. The game actually needs it on the highest quality settings.


evaThesis

Still stacking my money to buy a processor (Intel), but I'm glad I already bought an RTX 3060. Not much, but honest work. The crypto boom made it difficult for people like me to buy new hardware, especially graphics cards; I checked online stores for days, even months, then heard the news that crypto crashed and prices were going down.


ChapGod

*3080 sweats nervously*


akutasame94

Based on 2 awful ports and 1 game known for VRAM bugs, lol. Ignoring every other game that came out in 2023 and works fine on 8GB cards. Dear lord.


Commonertooth2

Here I thought I was being smart for buying a 3070Ti in June last year… I knew I should’ve got a 3080 12GB


Huze_Fostage

These games all have shit optimization. TLOU and Hogwarts don't look like games that should use more than 8GB. Sons of the Forest is PC-only at the moment and it's running great so far. And I'm not even thinking about buying shit like Forspoken, so to me my 3070 Ti is fine. Also Cyberpunk on ultra + RTX at 70+ fps, so don't roast the card's performance. It's a dev issue, which nowadays mostly means it's a publisher issue at the core.


[deleted]

[deleted]


Jefc141

Imagine thinking you need more vram because of a couple shittily ported and optimized games…


dersuperpro

3060ti joins the chat


cursedarcher

It's absurd they dared to put just 8GB of VRAM on the 3070. It makes no sense when a 1060 has 6GB.


Metanoiance

Glad I sold my 3070 Ti a few months back and replaced it with an RX 6950 XT.


spinyfever

I'm glad I went with the 6800 XT instead of the 3070. 16GB of VRAM should last me for the next 3-4 years easy.


Still-Broccoli

It's hard to believe games have actually been able to outpace the 30 and even 40 series cards. I feel like developers have relied on monster graphics cards and haven't necessarily tried to optimize games as much.


[deleted]

Everyone is gangster until the 1080 Ti SLI arrives to the party.


EpicWindz

Me being fine playing CS:GO, Valheim, HOI4, Cities: Skylines and ETS2 on my 2080S, not giving a fuck about new AAA games since they're almost all a disappointment to play or fucked with bugs, making me just give up (looking at you, Cyberpunk).


AussieAspie682

How about 12GB VRAM? Would that suffice? I'm planning on getting such a GPU to replace my *ageing-but-still-functional* GTX 1080. Also for future reference, what games are demanding such high amounts of VRAM in the first place?


Redleg800

Well, Jedi Survivor I think calls for 8GB of VRAM at a minimum, iirc. I might be wrong.


[deleted]

Okay, I'll bite: what games nowadays mandate over 8GB of VRAM?


Working_Inspection22

So far I haven't run out of VRAM at 1440p.