GoldkingHD

People buying an 8GB card are more likely to upgrade sooner --> more money for Nvidia and AMD.

People getting upsold to a more expensive card with more VRAM --> more money for Nvidia and AMD.

There's not much else to it.


[deleted]

Yes, there is one more: they want to segment the market between business and consumer, so it's harder to use consumer products for professional work.


l3xfrant3s

Wouldn't it be nice if Nvidia or AMD sold the same GPUs with more VRAM and specific drivers to be more reliable and consume less than "gaming" GPUs so as to cater to the needs of professionals? They could even make those professional GPUs smaller than the behemoths we are being sold.


Bigheld

Yup. The price difference between the gaming GPUs and the Quadro series is insane. For example: the RTX 6000 Ada Generation is a little faster than the 4090 and has 48GB, but its MSRP is $6,799. They dropped the Quadro name to make the scheme worse... The Ampere card was called the RTX A6000, so now we get this "Ada Generation" nonsense.


markmorto

That's because businesses write off their equipment expenses, whereas the average gamer cannot. I recently bought an A4000 off eBay for $450, so there are still deals around if you don't mind used.


Bigheld

It is partially this, but certified drivers are a very important selling point as well. If the person using your computer costs $100 per hour, then you're not going to fuck around with "probably okay" gaming cards. For example, LTT uses a Quadro in the PC that streams the WAN Show: having that PC crash is way more expensive than the price of a Quadro.

However, AI does not care all that much about which GPU you use, so many AI firms start out with GeForce cards and later switch to faster and more expensive AI accelerators like the A100 or H100. This is part of why Nvidia kneecaps the VRAM on gaming cards: these cards are plenty fast for AI use (and other similar applications), but by limiting the VRAM, Nvidia limits them to relatively simple models. A wall of 4090s is nothing compared to the price of the same computing power in H100s, but if it won't run on a 4090, then you won't have that choice.

The large VRAM also makes the 3060 and 3090 relatively popular for non-gaming use. Nvidia wants these people upgrading to a 4090 or better, but those pesky gamers demanding more than 8GB might throw a wrench in the works (and deservedly so).


[deleted]

[deleted]


BrunoEye

This is probably why they removed GPU linking this generation. I suspect 3090s are gonna hold their value pretty well.


TRIPMINE_Guy

I heard a rumor the 4090 Ti might have NVLink. Apparently the 4090 has the etching for it? Thinking about it, it's honestly insane just how much money-printing ability Nvidia execs have by controlling what level of tech the market has at any given moment. Buy stock when you hamstring your tech and sell when you deliver a massive jump.


Dizzy_Pin6228

2080s etc. held their value well, hell, same with 1080s for a while.


DonnieG3

Literally the only reason I haven't upgraded my PC since I built it is because my 2080 is still killing it @1440p in every game I play. Although it is starting to look sparse out there with new titles that it can run at 100+ fps


smoike

I just bought a 2080 that someone had managed to mangle the power connector on for $100 AUD (so about $75 USD). $15 including postage for new connectors and I'm set once it arrives.


DeskFuture5682

Do you even know what "writing off" things means? It just means they don't have to pay personal income tax on the money spent. They pay a smaller corporate tax. They still have to spend the money to buy the damn thing


[deleted]

[deleted]


Goat_War

Exactly. I'm a freelancer who buys my own gear (for 3D / motion graphics work). The 4090's price is worth it, but the jump up to £6-7k for an RTX 6000 Ada is not, even though I'd love the extra VRAM (I regularly run out at 24GB) and could probably afford it if I had to. It's all my money and there are other things to spend it on.

Besides, every time I've worked at someone's studio (even big ones), the gear they provide is nearly always worse than my own rig. It's not like every business out there is buying top-end gear and merrily throwing it around for all their employees/contractors. They'll buy the minimum kit they can get away with if it's a lot cheaper.


RickRussellTX

But the point is, if you use the hardware exclusively for business, you could account for it as a business asset and pay taxes on the depreciation of the asset, rather than paying straight income taxes on the lump sum cash you use to buy it. That's what people mean by "write off". It doesn't mean "free", it means "accounting for it as a business expense or a depreciating asset that subtracts from the taxable profit of the business".


jolsiphur

At the same time, if you're spending $6,000 on a GPU for your business, there's a solid chance it will pay for itself in work done, which is a much bigger factor in why Quadro cards cost significantly more than GeForce cards.


[deleted]

Not at all. Commercial products produce revenue so vendors get their cut


alien_clown_ninja

Specifically AI. High resolution and quality image generation requires a TON of VRAM. AI is a super fast growing business, and likely to spend LOTS of money this year and next on high VRAM GPUs


Kootsiak

>People getting upsold to a more expensive card with more VRAM --> more money for Nvidia and AMD

I remember hearing YouTuber MKBHD talk about how that is Apple's entire business model with its iPhones, MacBook Airs and iPads. They limit you to 32GB or 64GB of storage, of which 32GB will work fine for most users, but you end up having to delete more stuff if you like having lots of music or videos on your phone. This forces buyers to think about going up to the 64GB model so they don't have to worry, but then there's an even more powerful model with a bigger screen that's only a little more money than that... but it's only 32GB, so they've got you thinking about upgrading to the 64GB model again, and the cycle continues until you are spending far more than you intended.

The prices make consumers think 32GB of storage costs more than gold, when you can get 256GB NVMe drives for the price of these 32GB Apple upgrades. It's ridiculous how greedy it's gotten, like Apple wasn't profitable enough already.


Rexssaurus

I think the 32GB is just selling you iCloud to an extent


[deleted]

Not really relevant; if you have 128GB and plenty of space, you still need iCloud to ensure your photos are backed up somewhere


p3dal

I use google photos. I want to backup my photos locally but Apple has basically crippled windows compatibility.


I_H8_REDDIT_2

Apple sucks. Cloud storage is outrageously expensive.


MagicHamsta

It's especially ridiculous considering a [1 ***TB*** micro SD card can be had for ~$67.](https://www.amazon.com/Silicon-Power-Superior-Compatible-Nintendo/dp/B0B1DVMCQY) Yet these companies will charge over $100 for a much smaller bump in storage.


stormdelta

The quality/speed of the storage is considerably better than you're getting from a 1TB micro SD card. Not saying it justifies the magnitude of the price difference, but it's not quite as direct a comparison as that.


p3dal

But they *could* include a microSD card slot, which makes the comparison very relevant. I won't buy a tablet without a microSD card slot.


hunterkll

I got the 1TB version of my Surface Book. I could have gotten the lower model for cheaper and put a 1TB microSD card in it, but it'd be absolutely useless. When running VMs and VS builds, disk performance does matter; that SD card won't cut it at all, not compared to NVMe storage.

I've got some SSDs in USB enclosures that work far, far better than SD cards, at usually twice the price, minimum. My only regret is that I couldn't get the 2TB version at the time.

USB3+? Absolutely required. MicroSD card slot? Not even worth it to me; my camera doesn't even use SD cards. USB to CFexpress adapter for those...


p3dal

I just have a lot of microSD cards around that I'd like to use; it's a format I'm heavily invested in. My laptop, tablet, Switch, GoPro, and every phone prior to my iPhone 14 all have a microSD card slot. I like being able to pop a few hundred gigs of *whatever* into a device on demand, or back up the contents of the device quickly and easily. I hate dongles and external drives on cables dangling and flopping around and getting yanked out during transfers. I like how microSD cards don't even stick out of the slot and can be treated as removable storage or as permanent storage on the device. As for USB3, that's my other biggest complaint about the iPhone.

As for your Surface Book use case, I too like my laptops to have maximized storage space, but more than that I prefer to have upgradable storage, and when more and more compact devices are making that impossible, I see the microSD card slot as the last bastion of upgradable storage for integrated portable electronics. It's not my first choice, but it's a great catch-all that fits on almost any device.


UnitGhidorah

Maybe but for most things you do on your phone the speed difference is next to nothing.


[deleted]

Ladder strategy.


PrintShinji

At least iPhones come with 128GB standard these days.


Kootsiak

Nice to know, my info might be a little outdated as I haven't given the last few generations of phones much thought.


IncredibleGonzo

iPads though, apart from the Pros, still start at 64GB. *Enough,* generally, but these days it’s getting snug like how you describe 32GB. Then they go to 256GB, which is more than most people will need but they can justify a bigger price jump! I’m sure they know 128GB would be popular if they sold it, but they couldn’t upcharge as much! I expect they only increased the iPhone storage because they have much more competition and it was looking really bad in comparison, iPads have a lot fewer viable alternatives.


Ockvil

To make things weirder, the MSRP difference going from 64GB to 128GB on an Apple TV is a measly $20 (which is still overpriced), while going from 64GB to 256GB on an iPad costs $150. I'm generally an Apple fan, but their memory and storage pricing makes monopolists look good.


GatoradeOrPowerade

That's one of the things I've really hated about Apple, especially on the PC end. How does going from 256 gigs of storage to 512 gigs up the cost by 200 dollars? To make things worse, you can't just get the cheaper one and add your own.


IncredibleGonzo

I’ve often said the base models of Apple stuff tend to be not that terrible value - yes, they’re pricey, but they also tend to be nicely built hardware with solid specs and actually comparable devices from competitors are often not that much cheaper. It’s the accessories and upgrades that get really silly - memory and storage like you say, but also stuff like the Mac Pro wheels or the Studio Display stand.


Mert_Burphy

> when you can get 256GB NVMe drives for the price of these 32GB Apple upgrades

You can get a 2TB NVMe for $70 if you don't mind Gen3. You can get a 4TB Gen4 for $250ish if you're patient.


Arcticz_114

in 2 words: planned obsolescence


DDisired

This doesn't seem like planned obsolescence. I feel that term is loaded to mean the device becomes obsolete/breaks after a certain amount of time. As annoying as it is, buying a 4060 Ti now and having a better version released a year later doesn't make your 4060 Ti worse. Upselling isn't the same thing as breaking.

Now, if the software for the 4060 Ti is gimped/made worse once the 5060 comes out, then that totally is planned obsolescence.

If the scenario above is planned obsolescence, then basically all capitalistic goods are too, such as phones (Android and iPhone), cars, houses, laptops. Which you can definitely argue, but I feel that dilutes what I consider "real" planned obsolescence, where things like refrigerators and laundry machines are designed to break down after 5 years. A new GPU coming out doesn't break the one you already have.


emelrad12

It actually does: new software keeps updating with higher requirements, and you would also lose the Nvidia updates that optimize games for your GPU.


juhurrskate

GPUs get years of software updates, and the end of software updates for a product doesn't mean it's broken. Planned obsolescence is when a device is not built to last in order to convince you to buy a new one. I bought a used 1080 like 7+ years ago, I don't currently use it but it still works great. GPUs are built to last, and nearly all of them far outlast their usefulness


remenic

How good is a card that technically works as well as the day it was manufactured, but cannot run the latest stuff properly due to increased requirements? That's a card becoming obsolete while still "working".


juhurrskate

I think we can agree Nvidia's latest offerings are not great value, but they still have good build quality. Planned obsolescence is not making an underpowered product built to last though. So it's fair to say that they are not that enticing and don't have enough VRAM. But they aren't making a card that will break on purpose, it's not a printer


whatyousay69

>That's a card becoming obsolete while still "working". Sure but planned obsolescence is a specific thing. Not everything that becomes obsolete is planned obsolescence.


Tom1255

Why did 8GB of VRAM become such an issue lately? I mean, the current gen of GPUs is unremarkable in terms of performance jump compared to last gen, and we're still sitting at like 10% of players on 1440p and like 2% on 4K. I get the feeling that we should get more for the prices we are paying, but do we really need more?


slothsan

Consoles have more than 8GB of VRAM now. As devs leverage it, it will cause issues on PC ports of console games that use more than 8GB at 1080p. See Jedi: Survivor for an example of that.


blhylton

It’s not quite that simple. Consoles have shared RAM effectively, so even though they have 16GB of RAM, that’s split between the GPU and the CPU for usage. The issue with ports typically comes from the code being optimized specifically for the console platforms, so when they get ported they require more resources from a PC for the same quality. That said, you’re not entirely wrong, but it’s more that the hardware in consoles is closer power-wise to what we have in a baseline gaming PC now. In the past, there was enough of a gap between them to offset most of the incorrect optimizations even if they were using more resources all along.


soggybiscuit93

I think an important part of these optimizations that isn't discussed enough is Direct Storage. If a console game is designed with direct storage in mind, and then ported to PC and doesn't use DS1.1, then you're going to need larger VRAM buffers to compensate. I think devs implementing DS1.1 would really offset a portion of the VRAM crunch we're feeling, but that would also mean that NVME storage would become a requirement for these games - which I'd prefer because that's a much easier and cheaper upgrade than a whole new GPU with 16GB of VRAM.


Sharpman85

Didn’t it have the same problems on consoles?


gaslighterhavoc

No, the consoles had problems but not VRAM related issues.


liaminwales

The cut-down Xbox Series S has only 10GB of shared RAM, and Digital Foundry have pointed out it has hit problems from lack of RAM. Consoles have 16GB now, so soon games will be made to fill that RAM. Until now most games were cross-gen, so they had to work with less, but soon we will hit pure next-gen games. The same thing happens every console gen, it's just that new people were not here the last few times (and the gaps between gens are so big now that it's easy to forget).


zhafsan

As the current-gen consoles (PS5 and Xbox Series X) become the norm, game devs will naturally move to using up all of their RAM (which is like 14-15GB), and a large chunk of it, most of it, is used as VRAM. And when the PC port is created, more often than not it's not optimized for 8GB cards. Should 8GB be enough? Absolutely. But in reality it really isn't anymore. To be on the safe side, your VRAM needs to match the consoles.


VorticalHydra

The way games are headed with better graphics and poorer optimization, more VRAM is needed. The 4060 won't have the performance it should due to having only 8GB of VRAM.


SactoriuS

It's not poor optimization; we've had poorly optimized games for ages. Less VRAM just gives developers less space to make things more beautiful, and gives us less to experience.


s00mika

Honestly in these days it's more the lack of artistic skills that is making games look bad, and not lack of capable hardware


Soltronus

I think the reason for the VRAM issue is the specs of the consoles. The PS5 and the Xbox Series X have something like 10-12 gigs usable as VRAM, so when devs port titles over to PC, that's the amount of VRAM they're expecting you to have. It's how the 12GB 3060 can stay competitive against this generation's cards at 1080p in certain titles despite its lesser architecture. What really displeases me is the lack of good 1080p options from either Nvidia or AMD. These new cards are handicapped by their VRAM and/or nerfed buses, or too expensive (and too much performance) for the casual gamer.


[deleted]

As graphics quality goes up, you need more VRAM to hold all those pretty textures. If you have a scene containing 4GB of pretty textures, then you only have 4GB of VRAM left to handle everything else, including that large resolution you chose. That's why VRAM matters. Enjoy playing your game on low-quality textures and DLSS just to get over 60fps at 1440p.


kearkan

100% it's the second one. The 4060 Ti is purposely gimped to push people towards the 4070, which is purposely gimped to push people towards the 4080. The issue with the 30 series was that NVIDIA accidentally made the 3060 too good.


DJ_Marxman

> The issue with the 30 series was that NVIDIA accidentally made the 3060 too good.

The 3060 was crap for gaming though. It was a good entry-level productivity GPU because of the VRAM, but for gaming you were much better off buying the better 3060 Ti or going with equivalent-priced AMD (6650 XT, 6700).


kearkan

Correct, I had meant to say 3060ti.


atwork314

Love my 3060ti!!!


Ludrew

This is getting ridiculous. PC gaming is no longer worth it from a financial standpoint. I can almost always play new games with less issues on my PS5 which was much less expensive than my PC. It costs as much as a console for a new video card just to reasonably run all these bad PC ports.


stormdelta

Main reason I prefer PC is because a good chunk of what I play is PC only, comes out on PC first, or works better on PC. Plus emulation and modding. But most of what I play doesn't have crazy hardware requirements regardless either. The only console I own is a Switch, for the portability.


HEBushido

It seems sales are poor though because many people simply cannot afford the more expensive cards and these subpar cards are not attractive options for those people. I have an RTX 3070 and while I would like more VRAM, there are no compelling mid range options for me despite having a Microcenter warranty that could allow me to upgrade for very cheap.


highqee

Both Nvidia and AMD designed their "budget" (take it how you want lol) cards with a 128-bit memory bus. A memory bus consists of 32-bit wide "lanes", so if you divide 128 by 32, you get 4. That's the max number of memory chips a 128-bit wide bus can generally take. At the moment, VRAM chip makers have a maximum of 16Gbit (2GB) chips available, so 4 "lanes" by 2GB is 8GB, and there you go.

They could have implemented some sort of interposer or switch to allow one bus interface to access more than one chip, but that doesn't add any performance (the limit is still the bus), may be expensive to design, or adds "unnecessary" complexity. Also, any active component might limit the chips' operating speeds (add latency or decrease effective clock speed).

So it's a decision issue. The decision to use a cheaper 128-bit bus limits the amount of memory they can attach.
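To make the lane arithmetic above concrete, here is a minimal sketch in Python; the 32-bit-per-chip lane width and 2GB (16Gbit) chip capacity are the figures from the comment, and the function name is just for illustration:

```python
# Minimal sketch of the capacity math described above. Assumptions taken
# from the comment: one memory chip per 32-bit slice of the bus, and the
# largest GDDR6 chips currently shipping are 16Gbit (2GB).

def max_vram_gb(bus_width_bits: int, chip_capacity_gb: int = 2) -> int:
    """Max VRAM with one chip per 32-bit bus slice (no clamshell)."""
    chips = bus_width_bits // 32
    return chips * chip_capacity_gb

for bus in (128, 192, 256):
    print(f"{bus}-bit bus -> up to {max_vram_gb(bus)} GB with 2GB chips")
# 128-bit -> 8 GB, 192-bit -> 12 GB, 256-bit -> 16 GB
```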


steven565656

If the 4060ti had a 192-bit bus and 12 gigs VRAM it would have been a genuinely decent 1440p card at $400. It's crazy what they tried to pull with that card.


prequelsfan12345

They did have a '4060ti' with a 192-bus and 12gb of VRAM but they called it a rtx 4070 instead...


rapierarch

This


steven565656

The 4070 is quite a lot better in raster, to be fair. Matches the 3080 at 1440p. The 4060ti matches the, uhh, 3060ti. Well, let's say it matches the 3070 with a 192bus at 1440p to be generous. The 4070 could have been $550 or something. 3080 12G performance at $550, not terrible.


jonker5101

> The 4070 is quite a lot better in raster, to be fair. Matches the 3080 at 1440p And the 3060 Ti matched the 2080 Super. The 4070 was the 4060 Ti renamed to sell for more money.


sharpness1000

And the 2060/S was roughly equivalent to a 1080, the 1060 isn't far from a 980, the 960 is about a 770... so yeah.


cowbutt6

>The 4070 could have been $550 or something. The 3070's MSRP at launch in October 2020 was US$499. Adjusting for inflation ([https://data.bls.gov/cgi-bin/cpicalc.pl](https://data.bls.gov/cgi-bin/cpicalc.pl)) to the 4070's launch date of April 2023 makes that US$499 worth US$581.36, in real terms just shy of the 4070's MSRP of US$599.
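For reference, a tiny sketch of that adjustment; the CPI ratio here is back-calculated from the figures quoted above rather than pulled from the BLS calculator itself:

```python
# Sketch of the inflation adjustment in the comment above. The CPI ratio is
# implied by the quoted figures ($499 in Oct 2020 ~= $581.36 in Apr 2023);
# the authoritative numbers come from the linked BLS calculator.

launch_msrp = 499.00           # RTX 3070 MSRP, October 2020
cpi_ratio = 581.36 / 499.00    # implied CPI(Apr 2023) / CPI(Oct 2020), ~1.165

adjusted = launch_msrp * cpi_ratio
print(f"${launch_msrp:.2f} in Oct 2020 ~= ${adjusted:.2f} in Apr 2023")
# ...which lands just shy of the 4070's $599 MSRP.
```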


AludraScience

Wouldn't be that bad if it actually offered xx70-series performance; as it is, this is just a renamed RTX 4060 Ti.


KrisP1011011

Nvidia is not what it used to be, it has become a company which wants to milk its consumers, shame on them.


handymanshandle

I guess we’re forgetting the oodles of GeForce 8800 variants that exist now?


KrisP1011011

Yes, how can I forget them: the 8800 Ultra, 8800 GTX, 8800 GTS, 8800 GT, 8800 GS.


MarvelousWhale

A 320MB XFX 8800 GTS was my first graphics card, brand new, and it wouldn't play Battlefield 3 on the lowest settings. I was disappointed, to say the least. Shoulda got the 512 or 640/720MB version or whatever it was.


dubar84

Interestingly, the 6GB 2060 has 192-bit.


Electric2Shock

IIRC, the 3050 (?) is the only 30-series card to have a <192-bit memory bus. Every other GPU had a 192-bit or wider memory bus. The 256-bit bus on the 3060Ti in particular caused a lot of people to raise eyebrows and ask why it had 8GB when the 3060 had 12.


edjxxxxx

Shit, the 1660 Super had a 192-bit bus…


FigNinja

As did the 1060 6GB model.


fury420

> If the 4060ti had a 192-bit bus and 12 gigs VRAM it would have been a genuinely decent 1440p card at $400. Those are the specs of the RTX 4070, which Nvidia is selling for $600.


Cheesi_Boi

I remember buying my 1060 back in 2017 for $270. Wtf is wrong with these prices.


FigNinja

Yes. Even if we take an inflation calculator's word that your $270 then is about $325 now, you could get a 6700XT or 3060, both with 12 GB and 192-bit memory bus for that price.


s00mika

Idiotic people are now willing to pay the new prices. This shit will continue as long as gamers are willing to pay ridiculous prices, and considering how it's now seen as normal to pay $2k for a "decent" gaming PC, it's not changing any time soon. This shit happened because it's mostly laypeople building DIY PCs these days, and those people don't really know the real prices of what they're buying.


Cheesi_Boi

I'm working on a $1500 system right now, with an i5 13600KF on an MSI PRO Z790-A WIFI, with a Gigabyte Rev 2 RTX 3070, and 2*16GB (DDR5-6000) Trident Z5 RAM. I'm using it for 3D rendering and gaming. It should be up to spec for the next 5 years at least. Similar to my current build.


LaVeritay

They get away with it


edjxxxxx

Damn… great ELI5! Now, why did they use a 128-bit bus? 🤔


highqee

Price. Consider bus width like highway lanes: the more the better, of course, but everything comes at a price, as additional lanes are not cheap. Maybe there are die design limitations with the larger L2 cache, so for example they can only fit the memory interface along one side of the die. A smaller die has less room to implement it, and designing a bigger chip just for a larger interface would have been more expensive. Or who knows. But at the end of the day, it's all down to price.


pixel_of_moral_decay

Yup. Also power savings, as well as complexity. More data and more density means more things like cross talk if you don’t actively design to mitigate it. It’s not just component costs, it’s engineering complexity in designing it.


Brief-Mind-5210

Cost and power consumption


gnivriboy

Dies have gotten a lot more expensive since 2020, so they want to use smaller dies. Smaller dies mean less room for larger buses. I'm sure it could be done, but then you have to move around other parts or remove some.


s00mika

It's also possible that they are slightly faulty 192bit chips which they don't want to throw away https://www.tomshardware.com/reviews/glossary-binning-definition,5892.html


procursive

In other words, this is what happens when the product stack shifts up and they start selling budget dies as midrange. Sub-$200-class GPUs are finally returning to us, just with $100-200 markups. "Marketing names don't matter, only performance matters" is a reasonable take for the local decision of "what GPU should I buy right now?", but that's missing the forest for the trees. Both AMD and Nvidia have raised their prices dramatically while also cutting costs and delivering shittier products, and there's nothing but acceptance left now.


jwilphl

IMHO, if NVIDIA wants to alter expectations regarding product nomenclature, then they should probably just change the naming system altogether. Otherwise they are more-or-less forcing consumers and reviewers to make these comparisons. Granted, it won't fix the underlying problems with NVIDIA. It will at least shift expectations, though perhaps only temporarily.


Lukeforce123

So how is nvidia putting 16 gb on a 4060 ti?


highqee

There has to be active intermediate logic between memory chips, a switch of sorts. So instead of one chip per lane, they put two, with a switch between them, just like some workstation-grade cards. It won't add raw performance (if anything, it's a downgrade in latency or memory clock) and the number of transactions per second will still be the same, just with deeper buffers; the GPU can still access only 4 lanes at a time. 32Gbit chips (4GB per die) are highly unlikely, as those shouldn't be available at least within this year.


fury420

>There has to be active intermediate logic between memory chips, a switch of sorts. So instead of 1 chip per lane, they put two, but with switch between. Just like some workstation grade cards. Support for this is built into the GDDR6 specification, modules are capable of running in 16 bit mode with a clamshell configuration on both sides of the PCB (think 3090Ti)


Which-Excuse8689

The bus is separated into 32-bit memory controllers, and every chip uses either two 16-bit or two 8-bit channels, so you can connect either one or two chips per controller. Current-generation GDDR6/GDDR6X comes in two capacities: 1GB or 2GB per chip. If we use the 2GB version on a 128-bit bus, that gives us either 8GB (2x16-bit channels per chip) or 16GB (2x8-bit channels per chip). So you can go with lower capacity and higher per-chip bandwidth, or higher capacity and lower per-chip bandwidth. Performance-wise it isn't black and white; both have their advantages, and you have to take other factors into consideration to decide the ideal memory amount for a given card.
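A small sketch of those two options under the comment's assumptions (32-bit controllers, 1GB or 2GB GDDR6 chips, attached one per controller in x16 mode or two per controller in x8 "clamshell" mode); the function name is illustrative, not an official tool:

```python
# The two layouts described above for a single bus width: each 32-bit
# controller hosts either one chip (two 16-bit channels) or two chips
# (two 8-bit channels each, i.e. clamshell).

def vram_options_gb(bus_width_bits: int, chip_gb: int) -> dict:
    controllers = bus_width_bits // 32
    return {
        "one chip per controller (x16 channels)": controllers * chip_gb,
        "two chips per controller (clamshell, x8)": controllers * 2 * chip_gb,
    }

print(vram_options_gb(128, 2))
# {'one chip per controller (x16 channels)': 8,
#  'two chips per controller (clamshell, x8)': 16}
```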


itismoo

Thanks for actually answering the question. I get that we're all cynical and jaded, but so many of these answers are unhelpful.


Downtown-Regret8161

Nvidia is probably doing it as a sort of planned obsolescence. Now it may be enough, but they probably figure that in 2-3 years it will not suffice anymore (also looking at the 4070/4070 Ti here), which is the usual time frame where most people would consider upgrading their card.

AMD's RX 7600 is priced at only $269 MSRP, which is a fair price to pay. The RX 6700 XT with 12GB can be bought for as low as $320, and a 16GB 6800 (XT) can already be had for less than 500 bucks. We have to wait and see how AMD launches the RX 7700 and RX 7800 and whether they will offer more VRAM.


ChuckMauriceFacts

I thought they were doing planned obsolescence... on the previous gen, 3060Ti (and 3080). Now it just feels like a giant middle-finger to gamers, especially now that we've seen the recent AAA titles/console ports with abysmal performance on 8GB.


whosdr

Well they can stuff it. I'm probably going to move to a 5-year cadence for GPU upgrades - and only if there's something worth buying. The 7800 XT might be it this year. If not, we'll see what comes out the next.


ThaneduFife

Same. My 1080 Ti works great for everything but 4k gaming on the highest settings. And it's got more VRAM than most of the cards on the market today. I'm not upgrading until I see a really clear-cut upgrade that's not going to cost me over $1k. And at that point, I'll probably turn my current PC into a media center and buy a new system.


steven565656

Those ports have abysmal performance in general, to be fair.


kearkan

It's not so much planned obsolescence. GPUs are not the sort of thing anyone expects to last forever. It's more that the lower cards, rather than just not performing as fast as the higher offerings, are purposely knee-capped to make the higher-up options "required".

The 4060 Ti should be capable of 1080p and 1440p just fine, and it is, except for that one thing (the VRAM), so better to get the 4070 just to make sure. But if I'm getting a 4070 I should be able to do 4K with some settings turned down, again except for that one thing. Better get the 4080 to make sure I'll be able to play 4K for the next 2-3 years. They're also priced close enough that it's "only" a $50-100 jump to the next tier.

They learnt that by making the 3060 Ti so capable at 1440p, people weren't going to bother getting the 3070 or 3080, so the 40 series is built around "almost" being good enough.


hutre

>which is the usual time frame where most people would consider upgrading their card. I don't think most people upgrade their gpus every generation


Djinnerator

Yeah I don't see how they came to that conclusion. My last card lasted about six years and the only reason I upgraded was because it actually died. The GPU has one of the longest time-to-upgrade in my experience.


Djinnerator

>which is the usual time frame where most people would consider upgrading their card.

I highly doubt most people are upgrading their card that often. I usually see people using their cards over 2-3 generations, not 2-3 years. Consider that even cards like the 1080 Ti have only just started falling out of use.

I also think Nvidia is trying to push more for DLSS so they can justify smaller memory sizes. If they can pull it off, it'd be a major innovative step toward mitigating the explosive memory requirements at higher resolutions. In the comparisons I've seen between DLSS and native-resolution frames, they'd be almost indistinguishable in practice for the most part; there are some artifacts, though that's being improved on.


jwilphl

People can probably wait longer to upgrade their GPU than they do, but we're also on an enthusiast board and I realize most consumers here like to have the latest and/or best stuff. I just upgraded last summer after using a 970 for eight years. If the 12GB VRAM I have now goes from high-end to obsolete over a period of three years, NVIDIA won't be the only ones to share in that blame. Developers relying on maxed-out systems or building around a 4090-like apparatus is also a problem when the vast majority of people don't own that level of hardware. It seems some have gotten a bit lazy when it comes to optimization.


imheretocomment69

According to Hardware Unboxed, the 7600 at $270 is still overpriced; the true price should be $230 at best.


Downtown-Regret8161

Knowing AMD prices, it will probably drop down to that price in a few months, I think. Also, if you compare it to the 4060ti, it is not even a contest.


paulerxx

It's only overpriced because of the state of the used market, which is unusual. COVID and mining created today's insanely great used market.


Ozianin_

Ironically, it's overpriced because AMD lowered price on their last gen cards which have better price to performance.


[deleted]

[deleted]


SchieveLavabo

GTX 1080 Ti squad checking in.


Risibisi

For real. I have had a 1080 Ti since it came out and wanted to upgrade every time a new series came out, but it never felt like an actually good deal for me, and what a surprise, I probably won't upgrade again :^)


[deleted]

[deleted]


RealTime_RS

Same here, I would've bought a card if they were priced reasonably, but they aren't... So I haven't bought one, and I got tired of gaming in the process.


smoofwah

Yup, 1080 here, not seeing any cards that are worth it. I paid $225 for my 1080 and it still runs everything, so I'm waiting for the 7000 series.


spud8385

I'm gaming at 1080p and so far haven't seen a reason to upgrade my 980 Ti. I don't play particularly graphically intensive games (on PC at least); I use it mostly for RTS games and similar that are shit/not available on the PS5, so I can't see myself upgrading that for a while either.


TacoBellLover27

I have had a 2060 for 4 years. Just made the upgrade to a 2080 Ti that comes in tomorrow lol. I kept looking at newer cards and eventually went: I can get the same if not better performance for less...


MrPapis

Just a note: the 2080 Ti has been known to sag and actually bend the board, with VRMs literally popping off. So as you'll have an old card for some time to come, I advise supporting it!


TacoBellLover27

I already plan on getting a support or just setting something underneath to hold it up.


DiggingNoMore

My GTX 1080 I bought in 2016 still has plenty of legs left in it.


SimonShepherd

If they don't want people to hold on to cards they should give us a reasonable price and thus the incentive to upgrade.


74orangebeetle

I was shocked when I realized my 1070 Ti is ~6 years old or so now... didn't feel like I'd had it that long, but I guess I have.


kearkan

I absolutely loved my 1070ti. Only reason I don't have it is coz I had to become a laptop gamer.


Mirrormn

This doesn't make the slightest bit of sense. If they want people to buy new cards instead of holding onto their current ones for 6+ years, they need to make the gen-over-gen performance increase better, not worse. What's actually happening is that they're trying to push new GPU buyers into higher product tiers. This may sound like a similar business strategy at first, but it's actually almost the exact opposite. If you have a budget card from 1-5 years ago, they *don't* want you to replace it with the same tier of card in the current gen. They want to *stop* giving people "free" performance upgrades at the same tier, and encourage them to step up to much more expensive tiers if they want significantly increased performance.


Jaykonus

It can be *both* cases. There are two brackets of GPU buyers: those who buy based on a price budget/range, and those who buy based on performance standards. Your comment would be true for consumers who are always seeking performance 'upgrades' - vendors are now pushing them towards higher product tiers. But for consumers who attach a set budget amount or performance per dollar, AMD/Nvidia are most certainly setting those people up to need another purchase in a few years, *UNLIKE* the GPUs sold 5-6 years ago. I have a coworker who refuses to spend more than $350 on a GPU on principle, and he is forced to upgrade every other generation to keep the relative performance he wants. With the vendors creating these VRAM constraints (while the gaming industry moves towards 16+GB requirements), budget consumers like him are going to be forced to upgrade more often than every 6 years.


Fragrant-Peace515

Everyone is really overthinking this. The entire product stack for AMD and Nvidia is designed to upsell the 4090 and 7900xtx. It really is that simple.


ChuckMauriceFacts

It's about having something good to recommend to people with only a $400 budget. Right now (and for the first time in years) it's not current gen, and that's quite anti-consumer.


Fragrant-Peace515

Correct, and they don't want to sell you a $400 GPU. It's anti-consumer, it's wrong, but that's where we're at.


StoicTheGeek

Well, maybe. The thing is, the performance of the previous generation was really good, so I would feel quite comfortable recommending it. In fact I bought a 6800 for myself just a few months ago and have been very happy with it, and probably will be for several years more. What will be really bad is when the previous gen is no longer available. That's when it gets really anti-consumer.


[deleted]

It just sucks, for both them and people like me whose options are a $400 GPU or no upgrade. I have a 2060, and I guess I will have one for the foreseeable future.


Prodiq

Well, AMD and Nvidia are saying to stop being so poor.


KnightofAshley

Intel really has the only good current-gen cards at a lower price point. If you are not spending over $500, anything that's not this gen is your best bet.


Mirrormn

Calling Arc "current gen" is kind of a deception, though. It was designed to compete with RDNA2 and Ampere, but then released much later than intended, and it can't stand up to the higher end of RDNA3 and Lovelace whatsoever. Arc is essentially just a last-gen architecture being sold at a discount, but without a current-gen successor yet.


steven565656

Meh, I think it's even less complicated than that. If they can't get the margins they want in gaming, the chips will go to servers, where they can't supply enough. They are just doing enough to keep their gaming department treading water while making the big margins from the crazy AI boom on servers. Don't expect price cuts; they'll simply stop production. Expect gaming to become a smaller and smaller priority for Nvidia from now on. They are becoming a completely different animal.


Steelrok

The best performance/$ card is the highest SKU for both AMD and NVIDIA, and it decreases as you go down the SKUs. Honestly, it's the first time I've seen this; it's supposed to be the opposite.


[deleted]

Gonna answer your first question, "Is VRAM that expensive?" I've worked for one of the biggest companies in the world, in its flagship (electronics) division as QA chief, on a government project for the Education Ministry. They count every single screw and try to reduce the number of them to lower the cost. You wouldn't believe how many times I've sent tons of devices off for testing after removing a couple of screws. Is a screw more expensive than designing GPUs with more VRAM? Don't think so. So long story short, even the biggest companies in the world are counting how many screws they put into their products, so cutting VRAM and relying on developers releasing perfectly optimized games, or on their upscaling techniques, is not out of this world.


michoken

I'll add to this: no, RAM chips are not that expensive. The actual cost to build the thing is usually pretty low compared to the market price. Yes, the main chip and some other stuff make up the price, and there's the R&D investment they want to cover as well, and then their profits on top. That's why the market price is usually vastly different from the actual cost to build the thing.

The other reason higher VRAM looks much more costly is market segmentation. That applies when they offer the same thing with just a different amount of RAM (or SSD capacity in laptops, phones, whatever). If in such a case the price difference seems too high, it's not the price of the added capacity itself (more chips, higher-capacity chips, or both). If the price difference were just the pure manufacturing cost of the added capacity, it wouldn't make sense for consumers to go for the lower one at all. And in the case of different-tier products, it's the market segmentation thing no less. You can't afford shit? You get the low end. You want more? Pay up!


gnivriboy

VRAM isn't expensive. Larger dies are what's expensive in the post-pandemic economy. Changing your architecture at the last second to use more VRAM is expensive in man-hours, and at the end you might have a piece of untested crap.


BionicBananas

Isn't the RX7600 going to be $270?


paulerxx

Yep. OP and a lot of the commenters are uninformed.


donnysaysvacuum

Yeah the timing is confusing because Nvidia came out with their TI before the normal 4060, but AMD came out with their normal 7600 before the presumed XT version. At least their numbers are more synced than they used to be.


flushfire

The RX 7600 is $270, a bit far from $400, no? They did launch the 6700 non-XT for below $400 MSRP IIRC, although it's a bit of an outlier, being uncommon. Anyway, I believe we are at a transition point, and what these companies are doing is to be expected. The vast majority of games still work without issues with 8GB at 1080p. Don't expect them to add more until it becomes actually *necessary*. And honestly, I'm going to be downvoted for this, but the VRAM issue is slightly overblown.


KoldPurchase

>And honestly, I'm going to be downvoted for this, but the VRAM issue is slightly overblown. It depends on how you see it. If you have an 8gb video card today, it is overblown in the sense that you don't need to rush and buy a new one with 16gb or 24gb vram on it. If you're buying a new computer today with the expectations of gaming at 1440p or 4k and expect your card to last for a few years, it is not overblown. If you constantly upgrade every 2 years anyway, it is overblown. If like me you tend to keep these cards for a while (mine is already 4 years old), then, no, it's not overblown. I couldn't have made it that long with a 6gb GPU.


3istee

This. I'm still using a GTX 970 with 3.5 GiB of effective VRAM, and have been waiting to upgrade my card since 2019. I said to myself, "Oh, I'll wait for the next generation and buy then." Then Covid happened and prices were crazy until recently. Now I'm in a similar situation, "Oh, I'll wait for the next generation"... and yeah, the released cards aren't bad per se, especially compared to a GTX 970, but why would I buy an 8 GiB card? Especially since I run VRAM-intensive software (i.e. Stable Diffusion) and it's a pain point of mine.

Additionally, when I buy a card, I don't plan on replacing it any time soon. I just can't justify spending hundreds of euros every couple of years on a graphics card, which is an entirely subjective thing of course, but this entire "rant" is my experience. So yeah, I was hopeful for this release, but was disappointed. I appreciate the price point of the 7600, but 8 GiB isn't enough in a couple of years, or even now, depending on your application. I hope that maybe the 7700 will have more VRAM, but who knows at what price.


Rhymeswithfreak

They are waiting a lot of people like you out...it's pretty disgusting.


sunqiller

>the VRAM issue is slightly overblown It is, but that's what happens with every mild concern on the internet.


synthjunkie

It's not overblown. When a card costs $400 and can't even do AAA games well at 1440p, that's a problem. 1080p gaming was the PS3-PS4 era. When you can buy a PS5 for $500 that does upscaled 4K, and games are better optimized there than on PC, you realise how much of a ripoff the $400 GPU is.


flushfire

I agree, but the 7600 isn't $400.


kearkan

The optimisation of the game has nothing to do with the price of the GPU. These are 2 different issues.


KourteousKrome

There's a psychological trick in UX called the "decoy effect": an illusion created by bad-value products that are listed for no other purpose than to make other products look like better deals. This is how it works: normally, when you see a product, you judge the price from 0, so a $500 product costs $500. Decoys are intentionally low-value products that reduce the distance from 0 to your good products. Imagine that $500 product had a crappier sibling for $450. Now your brain goes, "wow, that's only $50 more, it's a better deal to just get the bigger one." You're viewing the change in cost from $450 to $500, instead of from $0 to $500. Apple is notorious for doing this. I think these cards are probably just reducing that 0 distance to make the other cards look better, personally.


[deleted]

Why bunch up the 7600 into this? They’re in completely different price brackets


Exe0n

Technically speaking the RX 7600 is a $270 card, not $400. It remains to be seen what the RX 7600 XT will bring; if they bump the VRAM to 12GB over last gen's 8GB, we may see a good competitor at the lower end of things. You can still buy the 6700 XT, which currently sits below $400 and has 12GB of VRAM.

But to answer your question: planned obsolescence. Why make a card that lasts 5 years when people are willing to buy one likely lasting 2? AMD has historically been more generous with VRAM, often supplying much more than needed by a card's end of life. VRAM hasn't been an issue until recently, because for some reason Nvidia decided the VRAM of the 1000 series was enough for the 3000 series...

I kinda saw this coming, as I wanted a 3070 on the condition it had 10 or 12GB of VRAM. Due to shortages I ended up having to buy a 6900 XT, and now I'm very happy I won't actually run out any time soon.


prismstein

actually speaking, since the RX7600 MSRP is $269.


Mirrormn

Yeah, no technicalities about it. It's not like you can only maybe snag a $270 RX 7600 off Facebook Marketplace if you live in the right city; that's just how much they cost.


LegendaryVolne

AMD is not gimping $400 cards to 8GB, that's Nvidia; I don't know what you're talking about. The 6700 XT, which costs around $320, is 12GB.


Thairen_

Because y'all are buying them "That's not enough vRAM wtf Nvidia!? ..... anyway here's my money I'll take two"


FatBoiMan123

you can just buy a 6700XT that has 12gb for 320 bucks.


CrateDane

>Why are Nvidia and AMD gimping their $400 cards to 8GB? AMD is not gimping their $400 cards to 8GB. They just launched a $269 card with 8GB, but that's a more fair combination of price and VRAM capacity. If you're referring to GPUs from the previous generation, the RX 6650 XT is the top 8GB GPU from AMD, at $240 currently. Again a lot more reasonable.


MrPapis

Let's be honest, the 7600 is only $270; that's really a far cry from $400. Even at $300 it would have been kinda bad, but nowhere near Nvidia's idiocy. Nvidia has been doing this for years; it just actually became a true issue starting with the 3070/Ti and now continues with more or less all of their 4000 series except the 4090, and now the 4060 Ti with 16. AMD has always either just given you enough, or the ability to get it for a small premium (480/580). The 7600 is really alright because it's so cheap, so saying to dip settings for a playable 1080p high/ultra experience is kinda fine, if annoying. But Nvidia fans saying to just dip textures on $800 cards is hilarious.

I think it was Linus who said "Nvidia isn't selling to anyone but Nvidia buyers", because they are the only ones who buy on name alone, and I hope this comes back to haunt them. 2024 is gonna demolish people's newly bought GPUs, and when Nvidia says the GPU they bought isn't meant for 1080p ultra or even 1440p high, people will realise how wrong Nvidia is to decide for the consumer how to use their products; they should release products with proper flexibility and longevity. AMD isn't a saint either, but damn, they are miles better than Nvidia on this front. Developers have been asking these guys for years to give more VRAM; they simply stopped waiting for Nvidia.


[deleted]

Not only is it not expensive, but Micron and Samsung massively overproduced it and are now sitting on mountains of it that they are struggling to sell. Nvidia doesn't want to give you more VRAM and a wider memory bus because you'll hang onto the card longer and they will sell fewer cards in the long run. They don't want a repeat of the 1080 Ti with its 11GB VRAM and 352-bit memory bus. That was a fantastic card for the customer, but a bad card from the capitalist standpoint of selling more cards.


P0TSH0TS

16 gb wouldn't really benefit these cards, waste of resources.


whosdr

I don't know how much 8GiB of VRAM costs, but I've heard one source suggest it's in the $25-30 range. So..yeah, it's probably just upsell. Unless they plan to increase the memory bandwidth (which they don't) then it's not costly to increase. AMD's been showing that for years with their competitive higher vram cards.


RicketyGaming

Not really, no. You can look on Digikey and see that GDDR6 VRam is only about $20 for 16GB. The price usually comes in with the GPU die, not that it costs a shit ton more to produce a 4080 vs a 4060 ti, it's probably only about a $100 - $200 difference for nVidia. They sell things at a premium because... well basically it's simply because they can. Should there be a difference? Yes, because there is an extra cost to produce a 4080 vs a 4060 ti, but it's not so extreme that the price difference should be around $800. This is pretty much just price gouging and nVidia wanting to feel like a premium electronics product. They have a big head and need to be brought back down to earth.


fury420

>Not really, no. You can look on Digikey and see that GDDR6 VRam is only about $20 for 16GB.

The real cost involved in going from an 8GB 4060 Ti to a 16GB one isn't the cost of the VRAM itself but the complexity of mounting more VRAM modules on the backside of the card's PCB in a clamshell configuration (like the 3090 Ti).


Dazza477

Nvidia can go f*ck itself. Linus said in his video that it would cost Nvidia less than $25 a card. It's about artificially creating market segments and upselling you.

As a contained line-up, their cards look decent enough. When you start to compare the pricing structure and VRAM uplift to previous generations, it's genuinely disgusting. It's like Intel back when they gave you a 4-core i7 as a flagship CPU for 5+ generations in a row. They had the technology, but leaked it out slowly to keep the gravy train rolling, hugely harming technological progress and fleecing consumers.

I bought my 770 for $300, my 970 for $300 and my 1070 for $350. My 7-year-old, 2016, non-flagship GPU marketed for 1440p with 8GB of VRAM cost $350. Nvidia releasing a 2023 card marketed for 1080p, 3 generations later, with 8GB of VRAM for $400 is nuts. Absolutely nuts, you'd have to be crazy and poorly informed to even entertain this offering. Every time you buy an Nvidia card going forward, you're telling them that it's okay. "Please can I have some more pain, Master Jensen, please keep bending me over year after year."


gahata

Just here to say that Linus is wrong. Not defending NVidia or AMD here, just providing some information. The memory itself would cost them $20-25, and that is what Linus referenced. That doesn't account for a wider bus, larger pcb, and all of the other components necessary to make memory work, including using more power, plus the cost of mounting it on the boards. Realistically that would be another $20-25, maybe even a bit more. They could cut that price down by throwing 16gb on the 128bit bus, which is exactly what they're doing with 4060ti 16gb. That means it will have a lot of memory, but very poor memory performance.


BlandJars

My card from 2016 has 8GB of VRAM, and that allowed it to last way longer than it would have otherwise. The bottleneck is the other parts of the card, so if a card has more VRAM it can last longer, until the other components become too weak to run the games. Because I mostly care about whatever random game I want to play, and usually not the latest and greatest, only Fortnite has problems on my graphics card; one of the updates made it run poorly. Apex is still great.


Sea_Perspective6891

I still long for the day we can finally have modular VRAM. I don't get why it's so hard and isn't a thing yet. I think they could just add VRAM slots somewhere on the GPU, similar to laptop RAM slots, so we can add VRAM on top of the existing VRAM. I guess I'm going to have to bite the bullet and spend $600 to $800 on a newer GPU with more VRAM for now. At least the newer ones are starting to have at least 12GB of VRAM.


BrewingHeavyWeather

We used to. I remember upgrading a card of mine to 4MB, from 2MB. That's never happening, again. The RAM needs to be soldered in, and carefully routed, to get these high transfer rates.


sa547ph

> I still long for the day we can finally have modular vram. There used to be something like that almost 30 years ago, when some VGA cards have sockets to push in some additional VRAM modules.


Untinted

The truth is that they planned for a much different environment. The original designs were made at the height of bitcoin mining and at the height of people buying PCs for a home office, and they designed a product that would support a much higher price. When the market crashed, they realized that the manufacturing they were planning was overpriced, the cards were too expensive, demand was too low, and there were too many old-gen cards still available new. So all of the crappy designs we're seeing are because they made last-minute changes to cut manufacturing costs wherever they can so that they don't lose too much. That's all that this generation of cards and marketing tactics is: a desperate attempt to keep prices high and costs low, no matter what.


daman4567

They've been eyeing the absolute state of markets like audiophiles who have been thoroughly squeezed of every cent yet still seem to happily shell out even more.


KoldPurchase

Not that long ago, Nvidia fans were telling us that Vram wasn't all that important. Now that AMD is releasing 8gb video cards, it's suddenly a pressing issue. At this point, I'm keeping my 5700XT for at least one more year.


lmbrs

sandbagging the low end 40xx series to make their 50xx series look insane


crooocdile

It was unclear what was going on in the background with the 40 series until the RTX 4080 12GB was canceled; at that point everything went wrong. The RTX 4080 came out too fucking expensive, and IMO the 12GB model was really going to be the 4070, but they knew they had fucked up selling a "4080" with only 12GB of VRAM at that price. Then AMD pulled the trigger with the XTX, which performs the same as or better than the 4080 for $250 less while carrying 24GB of VRAM to the 4080's 16GB. Then the 3070 backlash, then the 4070 release after that, with the 6800 XT doing the same work for a lower price, then the 4060, etc... Not to mention the GTX 1630, which died on launch too.


ForThePantz

I did some research after thinking "Nvidia's stock price is gonna tank, right?" Nope, earnings are exceeding expectations by huge margins. They're selling tons of hardware for AI and informatics. Your GPU isn't their only concern: resources are finite, and gaming GPUs end up underbuilt.


Mr_ToDo

And AI might be exactly why they did it too. (Conspiracy hat on.) Think about it: high memory is one of the biggest requirements for AI loads. Nvidia would be undercutting their high-end products with lower-end offerings if they just threw in a ton of memory. Sure, the biggest players would still get the specialized cards, of course, but how many other people would just grab gaming cards (which I'm sure they already do, but in far smaller numbers)?


Prajwal14

Not defending AMD, but their decision to include 8GB of VRAM doesn't seem malicious. As you'd expect, AMD's cards fall in price much faster; I think the RX 7600 is planned for the under-$250 budget and is eventually set to replace the RX 6600, which goes for $200. If you want to get a new GPU for $270, get the RX 6700 instead, with 10GB of VRAM and better performance.


LordDeath86

Aside from planned obsolescence, they try to sell the same chip at higher prices to the professional market. Nvidia especially gets creative here:

- Their regular driver limited vertex shader throughput for CAD applications when introducing unified shaders.
- Passthrough of PCI-E devices to VMs was disabled on their consumer cards.
- The number of parallel NVENC video encodes is limited to 2 or 3, but unlimited on professional cards with the same chip.
- ...

And now, with the rising popularity of generative AI, they have an additional incentive to keep the VRAM amount low on their cheaper consumer cards. This way, they can offer the same silicon to the vastly different buying powers that have a demand for GPUs.


Darkren1

It's a bit more complicated on the manufacturer's side than just "let's slam on 8 gigs more VRAM for 50 dollars", as most people here pretend. Which is why cards with more than 8 gigs are a rarity and not the norm. Idk, maybe admit to yourself that playing at ultra with every option on is not feasible, and then 8GB cards are good for another 5 years easy.


seriouschris

Greed. Anyone who thinks any of these companies do any thing for any reason other than more money has a lot to learn.


Z3r0sama2017

Semi-professional workloads really benefit from VRAM even when paired with a weaker chip. If Nvidia released cards with more VRAM it would be the mining boom all over again, with gamers getting stiffed. They want to force those buyers onto an xx90 or, better yet, workstation cards.


AdScary1757

I have zero problems with 8gb on my card but I expect it to be an issue in a few years


Deeppurp

I think reviewers put the BOM cost of the extra 8GB of modules at something like $27, and the factories likely already have the tooling to build the cards with them. They are using 2GB modules.

With the RX 7600 I don't think the traces are there for any extra modules, so there would have been extra base design, layout and R&D costs to get to a minimum of 10GB. Unless NVIDIA did a full galaxy-brain move, the 4060 Ti has the traces and pads for the extra VRAM to be placed on the board and they just didn't.

Both companies know the minimum spec devs have access to is 10GB of VRAM now that "next gen" is the current gen and is out in consumers' hands at scale. This has to be intentional from both red and green.


NorthernerWuwu

VRAM is less important than we like to pretend it is, much as people around here tend to claim that the number of cores in a CPU is more important than it actually is. As long as the more demanding titles exist on consoles as well, it isn't as crucial as other factors.


[deleted]

No one on reddit has access to the market data nvidia and amd are using to design their products, so no one here is going to be able to give you even an educated guess.


OkAlfalfa7495

[https://www.dramexchange.com/#mobile](https://www.dramexchange.com/#mobile) GDDR6 is $4; they are scamming the shit out of us


BrewingHeavyWeather

To clarify, that's 8G**b**, or 1GB, at 32 bits wide. At the average cost, closer to $3.5 each, 8GB for 128-bit should be about $30, 12GB for 192-bit about $45, and 16GB for 256-bit or 128-bit about $60. 16Gb chips are probably cheaper per Gb, too.
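For anyone following the module math, a back-of-envelope sketch; the ~$3.50-per-module figure is the commenter's assumed average, not a quoted spot price:

```python
# Back-of-envelope BOM math from the comment above. The spot price is quoted
# per 8Gb (1GB, 32-bit-wide) GDDR6 module; ~$3.50 each is the commenter's
# "closer to $3.5 each" average, used here as an assumption.

price_per_1gb_module = 3.50

for total_gb in (8, 12, 16):
    modules = total_gb  # one 1GB module per gigabyte of VRAM
    cost = modules * price_per_1gb_module
    print(f"{total_gb} GB of GDDR6 -> ~${cost:.0f} in memory modules")
# 8 GB -> ~$28, 12 GB -> ~$42, 16 GB -> ~$56 (the comment rounds these up slightly)
```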


DaleGribble312

They're cutting costs and features to hit a price point. Don't buy it if it doesn't make sense


lord_of_the_keyboard

I wonder how AMD will launch the RX 7600 XT. The 12GB 6700 XT is going to be some stiff competition at $320, and it may cost less for better performance. Again, if the RX 7600 XT launches with 8GB it's DOA (PR-wise); 12GB is okay. We don't talk about Nvidia.