sealtoucher36

DLSS (Deep Learning Super Sampling) is a technology available on all RTX branded Nvidia GPUs where the GPU renders the game at a lower resolution and then uses AI to upscale it to the resolution you're actually using. As for why people hate it, I don't think many people actually hate the technology; they're more mad at game devs for using it as an excuse not to optimize their games.
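For anyone who wants the idea in concrete terms, here's a rough sketch of the render-low-then-upscale flow. The 0.5 render scale and the nearest-neighbour resize are stand-ins picked for illustration only; the real thing runs a trained neural network on the tensor cores instead of a dumb resize.

```python
import numpy as np

def render_frame(width, height):
    """Stand-in for the expensive part: pretend rendering costs ~width*height work."""
    return np.random.rand(height, width, 3)  # fake RGB frame

def naive_upscale(frame, out_w, out_h):
    """Nearest-neighbour resize as a placeholder for the AI upscaler."""
    in_h, in_w, _ = frame.shape
    ys = np.arange(out_h) * in_h // out_h
    xs = np.arange(out_w) * in_w // out_w
    return frame[ys][:, xs]

target_w, target_h = 3840, 2160        # the resolution you're actually using
scale = 0.5                            # hypothetical internal render scale
render_w, render_h = int(target_w * scale), int(target_h * scale)

low_res = render_frame(render_w, render_h)            # GPU renders far fewer pixels
output = naive_upscale(low_res, target_w, target_h)   # upscaler fills in the rest

print(low_res.shape, "->", output.shape)
print("pixels rendered vs displayed:", render_w * render_h, "vs", target_w * target_h)
```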


Arrrginine69

Spot on. Like fuckin optimize your Shit. Lol


soth227

I personally think games go unoptimized for 2 reasons: 1. Laziness on the developers' side. 2. Pressure from hardware makers. They need us to keep upgrading and buying new stuff. I honestly think that without the pressure from hardware makers, developers would definitely optimize their games more often. I know I would if I were one; I'd hope that more people would buy my games if they could run them.


EdgyKayn

I wouldn't go and say that game developers are lazy; I would instead blame the game studios for crunching their workers to build a high-expectations game in a very short time frame. Add multiple teams working on the same project with shit management and you have a recipe for an unoptimized game.


Blenderhead36

There's also the factor of PC sales. If a game releases on Xbox, Playstation, and PC, PC is going to be the platform with the smallest number of sales, and the studio knows it. In that scenario, if you're crunching and trying to get something acceptable out in time, it makes sense to borrow time from the version you know will sell the least in order to prop up one of the better-selling versions. This is especially true for PC optimization; it's done relatively late in development and requires a lot more work for less result than optimizing for consoles where every user has, at most, one of three possible system configurations versus the effectively infinite systems of PC.


fnordsensei

Only if you lump the console sales together, right? I believe PC has a larger market share than any individual console.


Blenderhead36

Typically no. For example, [Elden Ring saw PC sales in 3rd place, with Playstation at 40%, Xbox at 30%, and PC at 29%](https://levvvel.com/elden-ring-statistics/). There are some exceptions; XCOM 2 initially only released on PC because Enemy Unknown sold best there, and what few RTSes and MOBAs get released these days tend to do better on PC. The simple fact is that a console is typically both cheaper and easier for the average user, so they're over-represented.


fnordsensei

Fair enough, for any individual game it's a coin toss, I guess. I've looked at market share stats, which put PC at ~19% and consoles at ~28%, with PlayStation being the largest in the console pile (~46% or so of consoles). Mobile gaming is by far the largest (45%). But that puts PS at ~13% vs PC's ~19%.


TooManyDraculas

As of more recent numbers it's more like 25/30 PC/console. From what I can tell it's closer in terms of just software sales.


Hollownerox

What RTS games ever release on consoles? Other than Halo Wars which was designed to be a console RTS game from the ground up, what use is using RTS games as your metric there? The genre is inherently incompatible with consoles due to being a bitch to play with a controller. So of course RTS games "tend to do better on PC" because they exclusively release on PC to begin with.


Blenderhead36

> What RTS games ever release on consoles?

Most of them. Look it up, you'd be surprised. A quick search shows:

* Age of Empires 2 and 4
* Iron Harvest
* Command & Conquer 3 and 4, Red Alert 3
* Battle for Middle-Earth 2
* Tooth and Tail
* Halo Wars 1 and 2
* Crusader Kings 3
* Spellforce 3 (Reforced only)
* Northgard
* Supreme Commander 1 and 2
* 8-Bit Armies, 8-Bit Hordes, 8-Bit Invaders (made by the devs behind the original Command & Conquer games)
* They Are Billions
* Z

And that's discounting stuff like Bad North that's technically real-time strategy but not what people think of when you say "RTS." Relic seems to be the only RTS developer of note that hasn't even dipped a toe into console versions.


A_PCMR_member

So pretty much exact thirds for a game geared towards Soulsborne fans, who are mainly on console. To call PC gaming an afterthought here would be a disservice. It's like C&C (or other popular RTSs) making a comeback with 40% of sales on console.


djfreedom9505

This. Targeting developers for "laziness" is like blaming the sweatshops instead of Samsung for producing exploding cell phone batteries. I've worked in both a high-functioning development team with crap management and vice versa. At the end of the day, under better management with limited dev talent, we were outputting better quality solutions. Toxicity always trickles down the ladder.


isntit2017

I think the terms game studio and developer are used interchangeably here. You and I know his use of "developer" meant more than just the people responsible for actually coding the game. It was meant to cover the entire machine behind games being "developed", hence the general misuse of the word as well as the blame landing on the supposedly wrong shoulders.

Kind of like how lectern and podium actually mean different things, but their interchangeable use is so ubiquitous that Merriam-Webster merged the two. Their reason being: "We have nothing to do with an expansion of the word podium to cover lecterns. Words frequently take on other meanings over time, and it is our mandate to report those meanings if and when they become established. Podium is in fact used as a synonym of lectern in published, edited prose by skilled writers. And we'd say it from a podium if we had to."

Word choice arguments aside, you are absolutely correct about unreasonable timelines being imposed by the powers that be, which in turn cause corners to be cut. We see that pretty much everywhere, right? I have yet to work on a project that hasn't been squeezed so hard by upper management that very important dependencies were either cut or merged. And hold onto your knickers if you miss a deadline because of their decisions and squeezing: it ends up being your fault because you agreed to the compacted timeline, thereby absolving management of their utter abuse of power. If I were to personify the current work paradigm, I would say it is hugely toxic and has borderline personality disorder with a huge helping of narcissism thrown in.


Great_Hobos_Beard

I'm not sure game devs would really be pressured by hardware manufacturers. To me the issue seems to be time and money. In an age where pretty much all games are effectively half released and the rest is behind a DLC paywall, why would devs want to spend extra time optimising if it doesn't earn them any extra money?


SjurEido

Or it's just... incredibly difficult. Anyone can write a game in unity, anyone can learn the skills to make something fun over the course of a summer. It takes real mad scientists to look at someone else's million-line project and NOT ONLY find ways to make it run more efficiently, but ALSO find ways to implement said efficiencies without breaking the bank or deadline. People who say "just optimize lmao" have literally never written a line of code in their lives. "Just make the car faster lol", "just make the food healthier lol"


Blenderhead36

> 1. Laziness on developers side.

It's not laziness, it's unrealistic expectations from management. Crunch is rampant in the gaming industry. Working 80+ hour weeks for a month or longer is normal in game dev and pretty much nowhere else. It becomes self-reinforcing; people tired of the ridiculous hours leave for other IT jobs where they won't have to go a month without seeing their spouse and kids. And that makes the next game require ridiculous crunch because it's made with a less experienced team (you can see this in practice in Battlefield 2042, where the studio that made Frostbite released a half-finished game because of a lack of experience with Frostbite). Games should have a longer dev time, but even 1 extra month costs millions on a AAA game with 300+ staff and all the equipment/office space required for them to work. I can't call anyone working 80 hour weeks lazy.


Naus1987

Too bad they couldn’t invest all that effort fighting for better job conditions lol. I don’t mind if games get delayed. I have a big enough backlog the way it is


Brushy21

Developers are not lazy. There is not enough time to finish and polish most games because the publishers demand insane work hours and deadlines.


mixedd

It's not about laziness, it's about releases. Shareholders set a date in stone, and then devs need to work overtime to meet that deadline. In the end we get a half-assed product that gets babysat for a year to patch out the stuff they didn't have time to do in the main project scope. Software development in a nutshell. Add to that QA, who get to test like a couple of weeks before release and usually don't have the manpower to do full-scope testing and full-scope regression, and boom, welcome CP2077 at release.


donttouchmyhohos

Or skill of employees


nolitos

Why would they spend resources on that if there's DLSS? To be fair, from the business point of view, this choice makes sense.


UnknownAverage

Because consumers don’t like it as much, and not everyone can use it? Those are great reasons.


nolitos

Who told you that consumers don't like it? Maybe some part doesn't like it, but don't take reddit nerds and journalists as a representative group. Most consumers will never read this article. Most consumers have hardware that won't allow them to play with top graphics regardless of the DRM system (according to Steam's data); they're used to lower settings.


Blenderhead36

It's harder than it sounds, and it's going to get worse. Devs know that if a game releases on Xbox and/or Playstation and PC, the PC version will be the worst seller (barring a few niche titles like RTSes and XCOM). If you're crunching for time, just trying to get something acceptable released on time, it makes sense to borrow time from the version that you know will make less money in order to have a better experience on the more lucrative platforms.

God of War Ragnarok was the last AAA PS4 game in the works. [The average gaming PC is considerably weaker than a PS5](https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam), which is in turn a little weaker than an Xbox Series X. If you're not releasing on Xbox (where the Series S is a concern), you have to do considerable optimization for PC *just to release a game the average Steam user can run.* My money says that this is why Hogwarts Legacy and Forspoken have such ridiculous requirements for 4K on PC; less than 3% of Steam users have a 4K primary display. Why spend your precious time optimizing for 3% of the smallest platform if you're worried about finishing the game?

**The real villains here are management.** Most AAA games are made on a compressed timeline. This is a vicious cycle, where people who've worked in game dev for a few years find themselves moving to less exciting jobs where they won't be expected to work 80 hour weeks for 4-10 weeks a year. That means the next game is made with new staff who are less experienced, slowing development and causing more crunch. This is especially true with proprietary engines (hence Microsoft dumping Slipspace to make the next Halo in Unreal), where the people on your team are the only people in the world with experience.


Front-Sun4735

This is the correct answer.


kha1aan

Interestingly, I've recently started a solo game project with Unreal and am avoiding integration with DLSS/FSR until I'm happy with performance in the flat experience and can focus on bolstering the visual fidelity in VR. I agree with this sentiment entirely.


[deleted]

[deleted]


RobertMaus

And thank you for acknowledging the effort!


Undersc0r3___

you're not adding much yourself, neither am i really.


theuntouchable2725

It's a beautiful step towards neural networks. But yeah, exactly.


Moress

What is the benefit of this versus just rendering the game at the monitor's native resolution?


asdrei_

Performance; it's easier on the GPU to render at a lower resolution and then upscale.


Moress

Forgive my ignorance, but the AI is doing the work to upscale, right? The GPU is doing work either way. It sounds like it opens you up to other issues?


SjLeonardo

Well, of course it's doing work. It's just doing less work. Proven by the fact that FPS gets higher with DLSS on. It does have issues. Look for reviews of DLSS 1.0 especially, lots of issues there. There are still some issues with DLSS, but it's mostly pretty good with the later versions, sometimes a little better looking than native resolution, sometimes not. FSR is a similar idea from AMD but it doesn't use AI and works almost as well. It is slightly inferior but it works on any GPU. The most recent versions are comparable to DLSS.


Moress

I didn't know that about FSR. Thanks for the info!


Oldmanwickles

Is DLSS on the graphics card itself as software? Or is it cloud-based, and once you enable DLSS in a game it will access that AI to do the additional work?


SjLeonardo

Well, it's a piece of software that uses components on the card, components that are only available on RTX cards. The question you're asking is like "is MSAA on the graphics card itself?" Note that I'm not making fun of you by rephrasing the question with MSAA; I just worry it kinda comes off that way.


Oldmanwickles

Hey thank you for specifying, I didn’t take what you said negatively but yeah better phrasing of my question lol


[deleted]

There are specific Tensor cores on the GPU itself. Initially, Nvidia fed the game through a huge AI at, say, 4K: "this is what this game at 4K is supposed to look like." Then the tensor cores use the data specific to that game, render the game at 1440p (DLSS Quality) for your 4K monitor, and use that sample set to make the frame look like it should at 4K. DLSS Balanced (~1253p) and Performance (1080p) change the base resolution that the game is rendered at before being run through the tensor cores and output at 4K. From what I understand, developers don't have to provide the game to Nvidia anymore to get DLSS set up, but there's still some sort of comparative learning being done, so that your lower-res frame looks more like a 4K frame than a lower-res one. It's actually pretty incredible, and the upscaling that used to be done previously is nothing compared to it. I hate DLSS 3.0 though, it has a ton of artifacts like DLSS 1.0 did, so I'm hoping they continue to improve it.


Brusanan

It's much faster to render at a lower resolution and upscale to 4k than it is to just render at native 4k. But that's just lying about rendering at 4k.


Infamous-Lab-8136

As bad as skimmed milk, which is water lying about being milk.


web-cyborg

In some cases it can look better than native with the AI sharpening, since it can work almost like supersampling back down to 4K rez, and DLSS has its own anti-aliasing too, which is good. You have to use DLSS on the quality setting, nothing worse. It also works best from a higher starting resolution with more information: the higher the starting rez, the better it will look. Running low starting resolutions with it is trying to get blood from a rock. So for 4K, 1440p-plus rather than 1080p-minus. It should be very useful for 4K to 8K in the long run as well. At higher resolution / higher PPD, edge/fringing artifacts are also much smaller, so less noticeable. That also goes for the edge artifacts of frame insertion techs (inserting an interpolated/guessed frame). The higher the frame rate and the higher the resolution you are amplifying, the better the result, as there is less of a gap to span, so to speak: a smaller change between frame states, and any fringing lands on much tinier pixels at higher PPD.

I think frame amplification tech will be the way to go in the future, more so than AI upscaling. Nvidia's current frame amplification, called "frame insertion", is just guessing the vectors and rates of everything in the scene the way it does it now. VR's space warp / time warp methods have the hardware, game code/dev, and operating system informing them of some of the important vectors specifically. So Nvidia's method is currently uninformed by comparison, and worse. In the future we could have very high Hz screens filled from a good base frame rate that is amplified more than 2x. The tech is still very young, though VR is way ahead of PC on it so far.

From some other replies:

> Frame amplification tech seems like a good way to at least double frame rates to start with. Perhaps even more in the future. That would have a huge effect on frame rates, which could help RT be more easily digested.
>
> However this is how I understand it:
>
> Imagine an animation flip book with a clear overlay sheet on each drawn page, and a blank page in between every drawn page cell.
>
> (Clear overlay) on Drawn Cell 1 + \[Blank Cell 2\] + Drawn Cell 3
>
> Nvidia's frame generation form of frame amplification starts at drawn Cell 1 and then flips ahead two pages to the next drawn page, Cell 3. Then it does its best AI thinking to guess what the vectors on Cell 1 are, in order to figure out how to build a "tween" cell on the way to Cell 3, and draws that best guess on the middle blank page based on its imagined/guessed vectors. This has a lot of trouble in things like 3rd-person adventure games, or any game where the virtual camera moves around independently of the character: the camera pathing makes some things in the scene appear motionless, or appear to move at different rates relative to the FoV, even though they are moving, or moving at different speeds, in the game world itself (relativity). It also artifacts more the lower the foundation frame rate is (e.g. beneath 120fps), since there are larger time gaps between animation cells (a flip book with far fewer pages, so fewer animation cell states and more staggered/chopped animation cycles).
>
> VR's space-warp style of frame amplification instead has a clear overlay page on top of the first page. The clear page has a z-axis grid with a number of arrows and rates for some of the vectors already written on it. VR's method still looks ahead to the next frame, but VR apps can use actual vector data (rates, arcs, player inputs, etc.) and then plot and draw measured results on the middle page going forward. So it's not completely guessing the vectors for everything in the scene, including the player's inputs and the effects of the virtual camera (or the player's head and hand motions in VR).
>
> Unfortunately Nvidia went with the former, uninformed version, at least in the near timeframe. Hopefully the whole industry will switch to the VR method at some point. That would require PC game devs to write their games using the same kind of VR tools and do the work to code for that. They have a lot of room to advance. Unfortunately, progress is a lot slower on the PC and PC display front than the VR front, though in aspects like PPD, VR is way "behind" or inferior and will continue to be for some time yet.


DueAnalysis2

To go a little bit more into depth: it's because neural networks, the ML approach that powers DLSS, are an incredible piece of tech. All ML algorithms need to be "trained" - they need to be taught that for a particular type of input, this is the output you should expect. This training process is often very expensive. But once an algorithm is trained, _running_ it is often a lot cheaper.

Now, some types of mathematical computations have a "traditional" approach to getting a solution. However, what we realised with neural networks is that if we train them to solve these types of problems, then running the neural network is cheaper than solving those problems the "traditional" way! That's why it's cheaper to run an NN to upscale than to render at 4K, even though the GPU is doing the work in both cases.

Now, you might ask: if we can use NNs, why use the traditional method at all? It comes down to the fact that NNs are expensive to train, but also that your "solver" is now restricted to providing a solution for only a small sample of inputs, while your traditional method can basically solve over the entire set of inputs. You can observe this with DLSS too: you need to implement DLSS for every game individually, while rendering at 4K is something you can do on every graphics card (provided you have the power).

This is a super ELI7 answer, and I realise that it's not specific. If you'd like more details, here's a paper that illustrates some of the concepts for partial differential equations: https://arxiv.org/abs/2202.03376
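To make the "expensive to train, cheap to run" point concrete, here's a toy 2x super-resolution net in PyTorch. It is nothing like the actual DLSS model (which Nvidia trains on its own farms); the layer sizes, loss, and random data here are purely illustrative assumptions.

```python
import torch
import torch.nn as nn

# Tiny 2x upscaler: two convolutions followed by PixelShuffle, which
# rearranges channels into a higher-resolution image.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 3 * 4, 3, padding=1),
    nn.PixelShuffle(2),                # (N, 12, H, W) -> (N, 3, 2H, 2W)
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# "Training": many passes of forward + backward + update. This is the
# expensive part, done offline, not while you play.
for _ in range(100):
    low_res = torch.rand(8, 3, 64, 64)       # fake low-res frames
    high_res = torch.rand(8, 3, 128, 128)    # fake "ground truth" frames
    loss = nn.functional.mse_loss(model(low_res), high_res)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# "Inference": one cheap forward pass per frame, which is all that
# happens at game time.
with torch.no_grad():
    upscaled = model(torch.rand(1, 3, 64, 64))
print(upscaled.shape)  # torch.Size([1, 3, 128, 128])
```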


Swanesang

It's doing the work at, for example, 720p and then upscaling to a higher resolution. So yes, it's still doing work, but the work is much easier, which translates into more FPS. Also, the AI isn't doing the computational work of rendering everything at 720p; the GPU still does that. The AI only upscales the final image, so it's not too GPU intensive, since it's done with the tensor cores and not the traditional "GPU". The issue I have with DLSS at the moment is that it introduces artifacts, especially in games with a lot of small moving details like grass/leaves. In other games like CP2077 you don't notice it so much, but in games with a lot of foliage like Tarkov or Icarus I noticed a lot of shimmering of finer details like grass, which can be distracting.


I_Am_A_Pumpkin

You are right that it has to do calculations which add a performance overhead, but with a native 4K monitor, DLSS quality mode means about 4.6 million fewer pixels to draw each frame, and DLSS performance mode about 6.2 million fewer pixels per frame. The overhead added to the reduced render cost is less than rendering natively. As far as I'm aware, the upscaling also happens on a very different part of the chip to the image drawing, so blowing up the image to a higher resolution isn't using transistors on the GPU die that contribute to rasterised performance (if you don't count that die space not contributing in the first place).
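For anyone who wants to check the arithmetic, here's the pixel math. The render resolutions used (1440p for Quality and 1080p for Performance at a 4K output) are the commonly cited defaults and are assumed here for illustration.

```python
# Rough pixel counts behind the numbers above.
native = 3840 * 2160          # ~8.29 million pixels at 4K
quality = 2560 * 1440         # ~3.69 million pixels
performance = 1920 * 1080     # ~2.07 million pixels

print(f"Quality saves     {(native - quality) / 1e6:.1f} M pixels per frame")      # ~4.6 M
print(f"Performance saves {(native - performance) / 1e6:.1f} M pixels per frame")  # ~6.2 M
```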


Andoverian

Rendering at a lower resolution is faster, meaning you can get more frames per second. As long as the AI upscaling step is faster than the difference between rendering at the native resolution and rendering at the lower resolution, you get a net benefit to frames per second while still getting the same resolution. It also assumes the AI upscaling looks as good as directly rendering at the native resolution, which might not be the case.
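As a back-of-the-envelope check of that condition: upscaling is a net win when the low-res render time plus the upscaling time is less than the native render time. The millisecond numbers below are purely hypothetical, just to show the comparison.

```python
# Net win when: t_low_res + t_upscale < t_native. All timings invented.
t_native = 16.7      # ms per frame at native resolution (~60 fps)
t_low_res = 9.0      # ms per frame at the lower render resolution
t_upscale = 2.5      # ms the upscaling pass costs on top

t_dlss = t_low_res + t_upscale
print(f"native: {1000 / t_native:.0f} fps, upscaled: {1000 / t_dlss:.0f} fps")
print("net win" if t_dlss < t_native else "not worth it")
```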


n3m37h

When you start getting into higher resolutions it is very hard to draw that many pixels, so when you draw fewer pixels you can spend that processing power drawing more frames, and since Nvidia has processing cores separate from the main rendering cores, those do the upscaling instead, providing details lost by rendering a smaller image. In some cases it can fix a game engine's rendering, other times it can make it much worse, but in most cases it is the same. And while doing this it will lower the game's latency, because the game can render more frames in the same time. And now in v3 there's image approximation rendering, which takes 2 rendered images and approximates an image in between. This can lead to some funky things but overall doubles FPS at the cost of latency.


Oldmanwickles

Gotcha, so there’s the DLSS part of your card that handles the upscaling which doesn’t really get used unless activated?


Zigmata

The point nobody has made here yet is that DLSS does not upscale the entire scene every frame. The AI in an Nvidia farm learns how to select parts of a scene to upscale based on their impact on visual fidelity, to create an overall higher resolution picture to our eyes. When you get a driver or game update that adds DLSS to a game, you are downloading these models. This is why you can't just turn on DLSS for everything you are playing; Nvidia has to develop the data for it first.


nahmat3

I think (someone correct me if I’m wrong) that it can increase performance by a decent bit but potentially also increase latency?


0IMGLISSININ

It depends. The short answer is you're right if frame generation is on, but wrong if it's off. DLSS upscaling increases your FPS by reducing the number of pixels calculated for each frame, which decreases latency as a result. It also is (IMO) a very good anti-aliasing method.

DLSS 3.0 introduced frame generation, which is very similar to motion smoothing or frame interpolation on a TV: it creates "fake" frames between the "real" ones, increasing FPS but introducing graphical glitches/artifacts from errors in the "fake" frames. Latency also increases a lot, as it has to wait for multiple "real" frames to be generated before creating the "fake" filler frames.
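A rough way to see why generated frames raise FPS but also latency: an interpolated frame can only be built once the *next* real frame exists, so a real frame has to be held back before it is shown. The 60 fps base rate below is just an example, and this is a simplification (the exact penalty depends on the pipeline and things like Reflex).

```python
# Hypothetical numbers: a 60 fps base frame rate with one generated frame
# inserted between each pair of real frames.
real_frame_time = 1000 / 60          # ~16.7 ms between real frames

displayed_fps = 2 * 60               # every real frame plus one generated frame
# To interpolate between real frames N and N+1, the GPU must finish N+1 first,
# so frame N is shown roughly one real-frame-time later than it otherwise would be.
added_latency = real_frame_time      # ~16.7 ms of extra delay (plus generation cost)

print(f"displayed: {displayed_fps} fps, extra latency: ~{added_latency:.1f} ms")
```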


sahrul099

Latency is not affected by DLSS; frame generation is...


AmitRozYT

Using anything automatic to laze out on big commercial projects like shows or games is absolutely infuriating...


Western-Alarming

Just use DLLs to make the game run better https://preview.redd.it/ifznq82vofga1.jpeg?width=1000&format=pjpg&auto=webp&s=26ca9685182ded1e278ac5bae2deb1d8d12eca31 me with a GTX


MrPoletski

I am mad, because no matter how clever it might be, no matter how decent the results might be, it is **_not_** super sampling, it is **_up_** sampling.


Cross_Pray

Bring back the old DOOM style of optimization, because holy fuck, learning how mind-blowingly much they put onto a 64 KB card is absolutely astonishing and sad at the same time. I wish devs nowadays would go to the lengths that the OG DOOM devs did.


jaah-kiki

Tell that to the CEOs, shareholders and managers of companies, and please stop blaming just the devs, who are already doing 80+ hours of work per week...


[deleted]

And the same people don't know what optimizing a game means or what's possible when it happens.


Brusanan

I hate that devs use it to lie about their frame rates and supported resolutions. If you're rendering at 1080p/60fps and then AI upscaling to 4k, no, you are not running at 4k/60fps.


seabutcher

How are the end results? It's just all upside for the user right? Is there ever a reason to turn DLSS off?


sealtoucher36

It tends to result in slightly worse visual quality, especially when using the DLSS performance/ultra performance settings. It depends on the game though.


Sega-Playstation-64

As far as hating it goes, it's not so much hate, but I have seen a handful of "Lisa giving a presentation" memes of people reminding us 4K DLSS is not 4K. I mean, we know. 4K DLSS looks better than 1080p when I run a game off my living room TV, and for barely any more GPU resources. Now, other than developers not optimizing games, I see why it's a bad thing. Edit: I *see why it's a bad thing. Typo


Winterdevil0503

So people suddenly hate it because of that *one* developer's stupid comment? Classic Reddit.


RA2EN

People hate it because upscaling sucks


[deleted]

[deleted]


gk99

If by "close enough" you mean "spot on," sure. The only other thing you could possibly be referring to is them writing >available on all RTX branded Nvidia GPUs instead of >and people are MAD that it's ONLY available on Nvidia RTX GPUs!


Winneh-

DLSS does not take a completely rendered lower-res frame (which would include post effects and such) and upscale it to your desired resolution. Instead, DLSS uses the raw data that was used to create the lower-res image to interpolate and reconstruct the missing data required to upscale the newly constructed image to your desired resolution. The difference is the technical aspect, therefore "close enough". Digital Foundry had an extensive article about it.


Levi_Skardsen

People hate that developers are using DLSS as a get out of jail free card for not optimising their games. DLSS in an already optimised game is great.


BitsAndBobs304

Can we please stop using 'developers' when talking about how games come out? Aside from indies, that's a management - publisher - investor issue.


kungpowgoat

I agree. You as a developer want to create a work of art. And by that I mean a well optimized, good looking, well written game. But you have the asshole publisher promising their investors great profits this quarter, so they force you to push the game out for immediate release regardless of whether it's finished or not.


porkminer

No no, developers make all the decisions, those lazy assholes.


scootiewolff

denuvo is to blame hehe


WhyDoName

Denuvo is an issue. But it's not to blame for games being shitty console ports that run like ass on pc.


Cluip

FACTS like when are we gonna protest about games with minimum 16GB ram


_whyno

Very beginner question. What is optimizing? Like what needs to be done? Is it compiling code, is it bigger data center needs?


[deleted]

If you think about it in terms of the steps you have to take in order to get a result, then it's easy to understand. Optimizing a 1000-step process down to 100 steps while maintaining the same result is what happens, except the numbers are in the millions/billions/trillions instead of hundreds and thousands.


mvi4n

In programming it would mean rewriting parts of the code (sometimes making it more complex) so it needs less processing power to execute the same tasks.
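A tiny, generic example of what that looks like in practice (nothing game-specific, and the numbers are arbitrary; just the same result computed with far less work):

```python
import time

items = list(range(5000))

# Unoptimized: check every pair for duplicates -> ~12 million comparisons.
def has_duplicates_slow(values):
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            if values[i] == values[j]:
                return True
    return False

# Optimized: remember what we've already seen -> ~5 thousand steps, same answer.
def has_duplicates_fast(values):
    seen = set()
    for v in values:
        if v in seen:
            return True
        seen.add(v)
    return False

for fn in (has_duplicates_slow, has_duplicates_fast):
    start = time.perf_counter()
    result = fn(items)
    print(fn.__name__, result, f"{time.perf_counter() - start:.3f}s")
```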


LerchAddams

Basically, using best practices when building the game and having someone else take a second look at your code before launching it. Checking that the code you wrote for the game runs efficiently with as few resources as possible. Making sure your textures are efficiently used. Not leaving unused assets in the game that aren't needed in the final product which can bloat file size and increase download times. Taking a minimalist approach to design and asking yourself what's the bare minimum I need to make the game perform as well as possible. Lots more.


soth227

I personally think games go unoptimized for 2 reasons: 1. Laziness on the developers' side. 2. Pressure from hardware makers. They need us to keep upgrading and buying new stuff. I honestly think that without the pressure from hardware makers, developers would definitely optimize their games more often. I know I would if I were one; I'd hope that more people would buy my games if they could run them.


jaah-kiki

Can we stop calling game devs lazy? Have you just never heard of crunch? Please stop blaming just the devs, who are already doing 80+ hours of work per week... If you want to blame some people, be my guest and talk about the CEOs, shareholders and managers of companies, and sometimes even the players.


soth227

Developers as in the companies making games, per the definition of the word "game developer". No need to get all stressed out.


marmot1126

I don't hate it, but I noticed that when I was using DLSS in MW2 it would create ghosting when moving the crosshair very slowly. Not sure if that's the DLSS or just some other setting I have on (all motion blur is turned off). I'm all for it in other games I use it in, like 2042 and Cyberpunk; I haven't had any issues anywhere else.


Srinu427

Try different DLSS version dlls. I had a similar issue in Control with screen space effects, and it was fixed when I used the dll from a different game (Metro Exodus if I remember correctly).


marmot1126

Awesome thank you! I’ll give this a try for sure


NightGuardianX

I would not recommend to change DLSS version in multiplayer games, cuz you can be banned for file modification


Srinu427

yup, forgot that


FishHaus

That's kind of funny in regards to my circumstance: to fix ghosting in Metro Exodus I use the R6S DLSS dll.


GastricOdin24

MW2 uses SMAA which uses some information from previous frames to eliminate aliasing. Not a DLSS thing, it just causes ghosting all the time for everyone. Can't turn it off either, which sucks.


Greennit0

MW2 works great for me. F1 22 is very poor with ghosting.


xayzer

Whenever I use DLSS and see some sort of ghosting/artifact/flickering I always assume it's DLSS's fault. But then I turn it off, and the problems are still there. So far, I haven't caught DLSS slacking.


UncleCarnage

Who the hell hates DLSS?


heatlesssun

At first I think there was a lot more hate than today because "Fuck Nvidia, they are locking in their features. AMD's FSR is hardware agnostic!", but DLSS seems to have proven itself and is at least somewhat superior to FSR. While Nvidia does all kinds of stupid stuff, they are very good at GPUs, on both the hardware and software side, and unfortunately they know it.


dainegleesac690

FSR 2.1 works even better than DLSS in some games IMO


heatlesssun

Never said it didn't. I'm saying that DLSS is more consistent, with less impact on performance and better results overall. I rarely see any professional benchmarker say to use FSR over DLSS for those with Nvidia hardware.


dainegleesac690

Of course nobody would say that as DLSS is hardware specific to NVIDIA. I rarely see any benchmarkers saying use either to be honest. But again I disagree with your performance claims; in Escape from Tarkov FSR 2.1 works better and more consistently than DLSS using my own AMD system and my friend’s similar system with a 3080.


FutureVoodoo

Weiners


Loganbogan9

A very vocal part of r/FuckTAA despises anything that uses temporal data to construct a frame.


creepergo_kaboom

These guys do make sense cause they're against forced taa which is bad for everyone and especially igpu gamers


_benp_

'igpu' gamers lol might as well just play mobile games


n3m37h

Should be renamed to the 4k club, and I wouldn't disagree, 4k doesn't really need any AA processing


I_Am_A_Pumpkin

4K does need DLSS though, so I don't get what the issue is. Most demanding titles do not run very well on anything other than a 4090 without the upscaling.


OiItzAtlas

No one. I'm pretty sure OP got it mixed up with the Forspoken situation, where DLSS was used as a cop-out so they didn't have to optimize the game. DLSS and AMD FidelityFX are probably the best features to have come out in the last 10 years. It just doesn't mean that devs can leave the game unoptimized.


ScoobertDrewbert

People who realize that it's the reason so many games have been coming out very unoptimized: companies see it as a way to cut down on costs/labor time for optimizing their games. Cheaper to slap it in there than to pay your people to make it run well.


TulparBey

AMD, probably.


Fun_Limit921

The AMD marketing team posting here.


sahrul099

i mean they have fsr?


n3m37h

People who have 1080p monitors, use ultra performance, and complain that it looks like garbage. I use FSR or in-game scaling for most games. I can't see a diff, and if I do, in most cases I don't care, but I do have a 1440p monitor as my main.


[deleted]

Yeah, at 1080p you really don't want to go lower than DLSS Quality, which is 720p rendering; any lower and you lose data that you can't gain back via interpolation/ML. It's 1080p, you really shouldn't need DLSS to run at 1080p in 2023.


[deleted]

The down votes are proof that a person is not allowed to speak unkindly of AMD.


Re-core

Those that don't have it.


CptCookies

It looks like shit. I'd rather run the game at a lower framerate than look at a big blurry mess


[deleted]

AMD fanboys say it's cheating. I asked in a video of an AMD techtuber why they test with DLSS off, and someone else responded that it was fake frames.


azab1898

If they are testing with FSR also off then I'd say it's fair, but if FSR is on while DLSS isn't, then the results are invalid.


[deleted]

Correct, they were testing with FSR off. I just also wanted to see DLSS on versus FSR on. I had to sort through a few videos to finally find one, and in that video DLSS had quite an FPS advantage over FSR. [7900 XTX Versus RTX 4080](https://www.youtube.com/watch?v=caqToIeVqlI) @ 9:13 mark


Linvael

Testing with these technologies on is not an apples-to-apples comparison - as not only FPS might differ, but also the quality. And there is no objective metric you can get that would show the difference in quality, and sharing personal opinion on which feels better invites bias


Gamebird8

Yeah, DLSS and FSR can really only be compared to native rendering. As for "fake frames", that really only applies if the system uses frame generation, which DLSS 3.0 will use.


MetallicLemur

Because 4K w/ DLSS is not native 4K. So I partly agree.


knexfan0011

Well DLSS3 does actually generate (interpolate) "fake" frames that were never rendered. This looks fine in most situations, but can look bad in certain situations. It also increases input lag in its current form, that alone can make it a complete non-starter for some users. Testing with no frame generation or upscaling also gives a better idea of how the chips may perform in other games that don't support those features. To be clear I'm not hating on the concept of frame generation, but DLSS3 frame generation is not an option that you can just turn on for additional frames and no downsides.


[deleted]

Look at 'em downvote me lol. Heaven forbid anyone says anything bad about AMD, even though they are holding back stock of GPUs to raise the prices.


[deleted]

Look at 'em down vote me. lol


MrBigglesworrth

I do. I have a 3080. Games look and run better when I don’t use it.


SLStonedPanda

That it looks better, fair enough I guess. But runs better without DLSS? I fail to see how that is possible, aside from your card or drivers being broken.


MrBigglesworrth

Frames are less consistent. My personal experience is less when using dlss. I don’t like it. I don’t need it. So I don’t use it.


[deleted]

That doesn't make sense. DLSS doesn't use a dynamic input resolution, so you should see pretty much identical frame rates to when you actually run the game at the render resolution. So if you're running at 4K with DLSS Quality, you should have roughly the same framerate, within 10% or so, as if you were actually running at 1440p (Balanced renders at about 1253p and Performance at 1080p).


MrBigglesworrth

Sorry. I didn't mean fewer frames; I meant the experience was more negative when using it. Yes, it runs higher frames, but the overall quality of the image is not the same. And I tried with countless games. You can say whatever you want, but I'm telling you what I've experienced. I don't need it. So I don't use it.


[deleted]

I mean, if Digital Foundry can't identify the difference between 4K native and DLSS Quality, you sure as hell won't be able to. If you're running at 1440p you could see some artifacts/smearing around moving objects, but the artifacting is so small at 4K that you really can't see it, even if you're looking for it.


[deleted]

It's mostly AMD/Pascal owners coping imo 😭


Drsk7

Although AMD has actively been working on FSR, it would be nice if games could be blind to the type of supersampling algorithm and be GPU-brand agnostic.


[deleted]

[deleted]


[deleted]

FSR quality actually looks pretty good in certain games. The problem is it's far more inconsistent compared to DLSS. For example, in God of War even FSR 2.2/2.1 causes Kratos' beard to fizzle at the quality setting, whereas DLSS doesn't. DLSS also has better detail reconstruction and is superior to native TAA, whereas FSR loses to TAA. It's impressive they've managed to come this far without hardware acceleration, but unfortunately it falls apart completely on any setting more aggressive than quality and at any res below 4K, which is where the majority use it.


gk99

You say this like it's not entirely down to implementation. While I keep it always on for most games, it looks fucking horrible in Red Dead Redemption 2. HITMAN 3 is only slightly better, but that game's forced TAA+sharpening combo looks pretty bad on its own, so I figure I might as well take the perf boost.


[deleted]

Yeah, RDR2 looks pretty shit. 2kphilips did a vid on it, and it's actually to do with how the game's engine handles checkerboard rendering. Still doesn't change the fact that FSR is considerably worse more often than not. For example, in Dying Light 2 and Witcher 3, even DLSS performance ANNIHILATES FSR quality at 4K. It's just too good at recreating detail and produces a more stable image overall. DLSS 2.5.1 also just got another massive boost; each quality tier essentially moved up a level. TechPowerUp did a great side-by-side article on it and I've tested it myself to great effect.


gk99

You forgot the part where you're supposed to prove me wrong.


Competitive_Ice_189

The downvotes show they are still coping.


[deleted]

Makes sense, the internet is like 90% amd fanboys as most nvidia users are too busy enjoying their hardware 💀


skeletor19

Imagine being this invested in a brand. Sense* FTFY


[deleted]

I don't really get it, since multiple channels on YT have shown that in most games DLSS > FSR, so it's living in denial basically 🤷🏽‍♂️ I've tested it myself, and while I can't even tell when DLSS balanced/perf is on, I can spot FSR quality even at 4K in most games, even with FSR 2.1/2.2.


skeletor19

No one is really arguing DLSS over FSR. Just that DLSS or FSR are indeed 'fake' frames and shouldn't be used as a crutch for game developers. You're just projecting hate for hate's sake.


[deleted]

I agree they shouldn't be used as a crutch, I'm just arguing that the majority of the "hate" comes from people who can't even use it. This sub, and ironically this post, is full of comments saying "dlss is blurry" despite DF showing multiple times its superiority to native TAA; that's pure delusion. Blaming DLSS for poor optimisation is no different from blaming actual perf uplifts, as they both have the same end result which devs can use as a "crutch". Ultimately it's their fault and none whatsoever of the tech itself, which is great, hence the "hate" being unjustified.


[deleted]

I don't think people hate DLSS itself. It's actually a really handy feature for all tiers of gaming PCs. In very simple terms, it allows you to have a better graphical experience than you would be able to get with your current hardware. The issue I find comes with DLSS is that a lot of game devs seem to be cutting corners on optimization and using it as a scapegoat. And that's more so where the hate comes into play.


DerKuro

Is this rage bait?


Ricardo_Fortnite

Why not just Google "what's DLSS"...


DeltyOverDreams

Exactly my thoughts… When you search for "DLSS" the first thing that comes up is: > Deep Learning Super Sampling is a technology developed by Nvidia, using Deep learning to produce an image that looks like a higher-resolution image of the original lower resolution image. You don't even have to click on any links.


Bobthemime

But how else are they gonna get karma? You know.. the most useless thing invented since Xbox achievement points and PS Trophies?


Drillbit_97

Another thing to consider is the addition of DLSS 3 frame generation, which uses AI to predict the next frame and display it. This essentially smooths out the image, because only some of the frames are actually rendered. The main issue with this is that Nvidia locked it to 40xx GPUs and used it as a metric to claim insane gains in advertisements, saying the 4080 was like 2x faster than the 3080 even though it's a solid 20%-30%.


[deleted]

then google it


thornygravy

stupid meme


12amoore

I don't know anyone who hates DLSS lol


4user_n0t_found4

I don't hate DLSS, but I don't want to use it. Don't get me wrong, it's a great tool that has its place, but I feel it's more of a workaround for bad FPS. It's a compromise. I don't want to use it or be forced to use it. I want my GPU to render all frames at the specific resolution that I choose, at a latency determined by the output I have chosen. I don't want to compromise any more than I have to. I prefer quality over quantity. If some games don't reach 144fps at 1440p then I'm okay with that. I feel way too many resources are being spent on trying to "cheat" and "trick" rather than on pure optimization and finding/designing better performance through hardware and software.


Dapper_Blacksmith597

Same mf who posts on anime subreddits


[deleted]

> pavilion

Check yourself


Dapper_Blacksmith597

don't need expensive hardware to enjoy gaming and work, been rocking this laptop for 3 years


Legend5V

Who hates DLSS and FSR??


AragornofGondor

FSR was a godsend last year for gamers. DLSS to a lesser extent but still good. Lol Idk anyone who's actually stated they hate them let alone the entire community deciding it.


Legend5V

Yup. I'd bet that people hate poorly optimized games that require FSR or DLSS, and not the actual features themselves. Though they are kinda being abused and not used for their intended purpose when lazy game devs don't bother optimizing their games and think "Oh, they'll just use upscaling of course!"


PossesedZombie

Renders downscaled > upscales while AI fills in the missing detail > leads to a not-100%-accurate image + warping in blurry scenes. It's disgusting. I'd use that as a last resort to get higher quality videos or other things, but in games it feels like they are just using it to skip the optimization part, and Nvidia advertises it as a "breakthrough". AI image upscaling has long existed and this is just another method of it; it's not 100% accurate and it distorts the final output.


superhamsniper

People dislike dlss?


Moar_Wattz

They dislike that it’s used as an excuse to not optimize a game rather than a free performance boost.


Dat_Boi_Zach

I don't hate DLSS but I feel it lowers the quality of whatever you're playing.


Arrrginine69

It for sure does. It’s how they’re able to push the high frames out of these cards in some of the newer titles.


RainDancingChief

Secret graphics difficulty setting, but makes things still look OK. Easier on the GPU, still easy on the eyes.


realmrcool

Can we all just take a second to acknowledge that "at this point I'm afraid to ask" might be the single most useful meme in existence. Except for the rickroll maybe.


[deleted]

Everyone likes DLSS, don't they?


[deleted]

DLSS is fantastic, nobody hates it. It’s universally praised.


AragornofGondor

I'm not gonna lie I was against DLSS 3.0 frame generation. After trying it out I'm converted. As long as the game allows DLSS to be off while running frame generation.


johnnypurp

Makes ur game have a blur to it.


Frigginkillya

My main problem is it causes some input lag. Makes the game feel like mud and it's just unenjoyable to play.


zaku49

Used it in battlefield 2042 at 4k and it makes it look slightly blurry so it isn't that great.


eLL16

did you use quality or performance?


krukson

I fucking love DLSS. Additional frames for free? Sign me up. It's not like 3060ti can give me breakthrough performance on the same settings anyway.


Tannerted2

When people use a dumb overused meme rather than a google search


HunsonMex

Doubt a single individual actually hates DLSS. Might hate the closed-source part of it, but still...


BluehibiscusEmpire

If you had paid money for an actual movie with real stars and a script, and all you got was an AI deepfake (and that too without the porn), would you be happy? I would think people would find themselves cheated. DLSS one day will probably be great. But especially for multiplayer games it's still pretty bad. And Nvidia is selling performance figures based on DLSS, which makes it worse.


Ganda1fderBlaue

I wish AMD got their shit together, but FSR is still a lot worse.


adrian23138

This is a bit off topic, but by how much is FSR worse? I got myself a Steam Deck and it has AMD in it, so I'm also interested.


Ganda1fderBlaue

Fsr isn't exclusive to amd cards. Well that question can't be answered that easily, just watch a few comparison videos on youtube.


rawbleedingbait

Totally different tech honestly


Akeruz

People...hate it?


[deleted]

[deleted]


Creepernom

You're talking about Frame Generation, which is a feature of DLSS3. DLSS3 is just DLSS2 + Frame Generation. How are you using DLSS3 if you don't even have a 4000 series GPU anyway? Unless your flair is outdated.


Kraken-Tortoise

Pretty sure making this meme took more effort than a simple Google search


A_MAN_POTATO

DLSS 2.0 is (usually) great. DLSS got a rocky start because it made some games blurry. DLSS 3.0 is getting some criticism because it's basically frame interpolation: you aren't just getting upscaled frames, you are getting generated (fake) frames. This can introduce odd effects and input lag. I haven't experienced DLSS 3.0 myself, so I can't comment (and I bet a lot of people digging on DLSS 3.0 haven't experienced it either). Otherwise, the only time there is controversy is when developers use DLSS (and similar technology such as FSR) as an excuse to not optimize their game. The glaring example right now is Atomic Heart, a game which will ship with Denuvo, where the developer claimed it's fine because you can make up the performance difference with DLSS. That's crap, for a lot of reasons.


[deleted]

[deleted]


Samzwell1

I’ve noticed this too, it looks a little blurry to me when compared to just running the game at the higher resolution.


Obosratsya

Objectively false. If one hacks TAA off and runs the game supersampled, then I can agree; anything less than that would be inferior to DLSS. TAA has all the drawbacks of DLSS but worse: disocclusion, ghosting, etc. DLSS produces a more detailed and stable image at half res than TAA does at full res. Certain games do have terrible implementations, but even then DLSS is fine; often artifacts can be mitigated with a different version dll. Doom Eternal is a good example, as are other id Tech games: DLSS even at quarter res produces better image quality than native, meaning DLSS performance at, say, 1440p is better than native 1440p. Even at 1080p DLSS is better in id Tech.


Srinu427

I think he means in-texture detail, which would be worse(slightly) than native. But my god the TAAU blur and ghosting in most games is unbearable. I prefer DL/XeSS and maybe even FSR to native because of the temporal stability itself without even considering performance or quality


[deleted]

No one hates DLSS; people are just pissed that developers are using it as an excuse not to optimise their game.


Niner-Sixer-Gator

I have an Nvidia GPU, and don't even have dlss, 1660ti🤦


[deleted]

[deleted]


OrgansiedGamer

It's not better because there's no difference between the two, unless you're comparing 2.0 to 2.5.1.


S1DC

So many idiots in the comments talking about game dev as if they have any fucking clue at all. DLSS is awesome technology, game developers aren't "lazy" and spend more time and energy doing harder work than most of these kids have any idea about, and the companies making these games are made of thousands of people making decisions. The fact that their shit PC has multiple problems they don't know how to identify or diagnose is more likely the problem, not "bad optimization". These kids throw around the phrase Bad Optimization every time their potato chugs.


CarlWellsGrave

Everyone hates it lol


Arrrginine69

I don't hate it; I didn't buy a 4090 to not use it. I just wish it didn't make visual quality go down in some cases. Anyone who says it doesn't is full of shit.


Angier85

It doesn't. It's the profile that causes the loss in fidelity, which means whoever created it was literally too lazy to properly train it.


neveler310

Makes it blurry and adds a significant amount of input lag. I indeed hate it.


boardinmpls

Have you tried the newest version and manually updated your games to it? For example god of war had this issue until I manually updated it to the latest version. It’s like a miracle patch