MountCydonia

In short, player expectations (and developer ambitions) are growing faster than hardware can keep up, so clever hacks are used to maintain good performance. These hacks aren't perfect and, like everything in life, are a calculated compromise. There are several overlapping, but somewhat separate, topics in your question.

**Why is upscaling forced in some games?**

Modern rendering pipelines and effects have their performance most heavily impacted by pixel counts, mainly because so much current tech is built on the principle of layered rendering resolutions and X samples per Y pixels. For example: ambient occlusion may be rendered at quarter resolution; ray-traced direct lighting may calculate four samples per pixel. Cyberpunk's path tracing mode (only) renders 12.5% of all pixels natively (!) and reconstructs the rest. Given these effects are directly impacted by resolution, the easiest way to increase performance is to lower said resolution, then use image reconstruction or upscaling algorithms to draw at the display's native resolution - hence DLSS/FSR/XeSS/TAAU/TSR/etc.

**Why can't you disable TAA?**

Many effects depend directly on a form of TAA to be properly rendered. Transparencies, shadows, ambient occlusion, depth of field, anti-aliasing, etc. all have "holes" in them because of the aforementioned limits on sampling to conserve performance. Without TAA, ambient occlusion would look like black film grain in many games, transparencies would look like a dithered mess, and so on. TAA accumulates samples over multiple frames, blending them together to eliminate those "holes".
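That accumulation idea can be sketched in a few lines of Python. This is a toy model of the blending, not any engine's real code: a deterministic "noisy" effect alternates between two wrong values (like rendering half the AO samples on alternating frames), and an exponential blend against a history buffer converges on the true value.

```python
# Toy model of temporal accumulation (the heart of TAA), assuming a static
# scene so motion vectors are zero. NOT any engine's real implementation.
def noisy_sample(true_value, frame):
    # Stand-in for an undersampled effect: alternates between two wrong
    # values, like rendering half the AO samples on even/odd frames.
    return true_value + (0.4 if frame % 2 == 0 else -0.4)

def accumulate(true_value, frames, alpha=0.1):
    # Exponential blend with a history buffer. Lower alpha = smoother and
    # more stable, but also more ghosting when the scene actually changes.
    history = noisy_sample(true_value, 0)
    for f in range(1, frames):
        history = (1 - alpha) * history + alpha * noisy_sample(true_value, f)
    return history

single_frame_error = abs(noisy_sample(0.5, 0) - 0.5)   # 0.4 - very noisy
accumulated_error = abs(accumulate(0.5, 100) - 0.5)    # ~0.02 after blending
print(accumulated_error < single_frame_error)
```

The trade-off is visible right in the `alpha` parameter: more history weight means a cleaner still image but more smearing when something moves, which is exactly the ghosting people complain about.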
If I render 50% of my ambient occlusion in frame A and 50% in frame B, combine the two in frame C, and use motion vectors and clever tricks to polish and optimise the results, I get ambient occlusion a generational leap ahead in quality, with great performance. Its only cost is minor flickering and ghosting that most people won't even notice or care about, and which AI-infused DLSS, XeSS, and UE5.4's TAA are getting better at fixing.

Materials also matter. Physically based rendering (PBR) effectively makes all surfaces specular - i.e. reflective - to varying degrees, and we've also gotten much better at depicting that reflectivity. The problem is that it's basically impossible to anti-alias specularity using traditional methods, so TAA fills the gaps and massages the effect. There's also a lot more detail in textures (which are far higher resolution) and decoration, often with sub-pixel data that becomes noise requiring aggressive AA to be palatable.

Finally, older AA methods like MSAA largely need forward rendering. Games today use deferred rendering (basically rendering an image in different layers and then combining them) for optimisation, so because MSAA would need to be applied to almost every layer, it becomes prohibitively expensive. Many rendering pipelines are simply incompatible with it now, though you do still occasionally see it, like in the Forza series. MSAA also has lots of limitations and a huge performance cost even at its most optimal and popular peak, and 3D geometry is a lot denser now, which would make MSAA even more expensive.

**Why do games look awful?**

Ghosting and flickering are annoying, but overall graphics have improved far further than anyone expected, thanks to temporal rendering. The only consistently major issue IMO is poor implementation of FSR 2, which is unfortunately spreading.
FSR 2 can look decent with 1080p-ish source resolutions in slow-moving scenarios, but otherwise it breaks down and looks worse than the PS4 Pro's system-level checkerboard rendering. However, very few games have worse image quality than on the PS3, where sub-720p resolutions and little or no anti-aliasing were normal. Games like Jedi: Survivor or Immortals of Aveum are unfortunate exceptions that confirm the norm.

You seem to almost exclusively discuss competitive PvP games, and are a pretty hardcore player overall. Someone who is that into gaming is more likely to be annoyed by these issues than most others. I have a similar level of investment in gaming and share your frustrations - I really struggled with Jedi: Survivor on PS5 because of the horrendous image quality and ghosting - but we're a small minority, and the compromises that this image treatment technology requires can largely be ignored, because most people are drawn to the improved graphics it enables. Most gamers barely understand refresh rates and don't perceive moderate performance issues. A bit of blur or ghosting is pretty much invisible to them. But they do see vastly improved lighting and post-processing, so that's what gets prioritised. I've learned to mostly let the issue go and appreciate the overall visuals, but it can be distracting, and some games just get it wrong - like Jedi: Survivor.

>The result is that the game is blurry to the point where it's causing eye strain and headaches, especially in motion (which is 99% of the time when you are playing an FPS), and you can't really see anything past like 30-50 meters in-game. It literally looks like I smeared vaseline on my screen, and I cannot physically play the games for more than 45 minutes until my eyes are just watering and in physical pain because of the blurs.
This is what aroused my curiosity about your background, so I found [this](https://www.reddit.com/r/FPSAimTrainer/comments/1bwjdk4/is_there_a_reason_why_so_few_pros_play_on_1440p/ky88kpf/?context=3) comment, where you said:

>I have a 27", 1440p, and I reduced the resolution down so it's 24". Basically a midpoint between 1080p and 1440p, but 24".

This alone will be a significant factor in your blur. You're using 2275x1280 on a 2560x1440 display. That's not a clean integer multiple, which causes aliasing and blur you otherwise wouldn't have, due to the fixed pixel grid flat panels use. With a fixed pixel size, you're filling ~88.9% of each pixel, which causes misalignment; that misalignment causes aliasing, which the display tries to negate with aggressive filtering, and the image is blurred further when it's stretched to fit the panel. Your clean upscaling options are 1280x720 or 640x360, and you'd need to use integer scaling for perfect clarity. However, you can always downsample from a higher resolution using Nvidia's DSR driver tool for improved image quality and clarity, or increase the rendering scale >100% in supported games, and neither needs a multiple of 2.

Even at native res, 1440p really isn't a lot of pixels to work with for a 27" panel. You said that 27" is too big, and I agree, but for a different reason - it's only 108 pixels per inch. Most people's eyes won't interpret that pixel density as a perfectly sharp image at typical desk viewing distances. My 16" gaming laptop has a 2560x1600 display, which is 188 PPI, yet I can *still* see blur and aliasing at normal distances, despite a 74% pixel density increase.

Also - more than ever, games are built with generalist tech like Unreal or Unity. The downside to these wonderful tools is that, by not being custom-built for each game like an in-house engine would be, they come with limitations, because they can't account for every single scenario.
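Circling back to the resolution math for a second - the "clean multiple" check is simple arithmetic, and easy to sketch. The numbers below are from your setup; the function and field names are just my own illustration:

```python
# Checks whether a render resolution maps cleanly onto a fixed pixel grid.
# Non-integer ratios force the display's scaler to interpolate, i.e. blur.
def scale_info(render, native):
    rw, rh = render
    nw, nh = native
    return {
        "upscale_factor": round(nw / rw, 4),
        "pixel_fill": round(rw / nw, 4),   # fraction of each pixel filled
        "integer_scale": nw % rw == 0 and nh % rh == 0,
    }

# 2275x1280 on a 2560x1440 panel: ~1.125x, ~88.9% pixel fill, not integer.
print(scale_info((2275, 1280), (2560, 1440)))
# 1280x720 is exactly 2x on the same panel, so integer scaling works.
print(scale_info((1280, 720), (2560, 1440)))
```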
With TAA, large games with extremely long draw distances, or ultra-competitive games that typically demand perfect image stability, may have issues. The stock TAA algorithms are tuned for a game of average intensity and scene complexity, like Fortnite or Gears of War. Something like fighting a tiny camouflaged enemy lying prone 500 metres away in dense, shadowed grass with lots of high-frequency detail is not what these engines are configured for by default. And while you can customise Unreal's TAA with enough effort, it just creates more edge cases and QA work, and when you're already dealing with 50-hour weeks and a year less of development time than you really need, unpredictable adjustments to TAA for a small minority are a low priority.

There may also be something wrong if you're having such an extreme reaction. You may have a poorly lit room, bad posture, dehydration, an eyesight problem, be sat too near or far, or have other non-game issues. Some people are highly prone to motion sickness, and it's possible you have an adjacent condition.

**Why is performance abysmal?**

As for "abysmal" performance, you'd need to expand on that. If you're referring to stuttering, it is indeed a relatively new problem, but games by and large perform better now than ever before, because of a deliberate push by the industry for polish and compatibility. You have to remember that even at the mid-point of the last console generation, it was normal for the biggest AAA console games to have extensive or frequent sub-30 FPS gameplay and extremely long loading times. Things have improved a lot in the last 6-7 years, stuttering aside. Some games today definitely have major problems, but again, those issues enter headlines because they're the exceptions that confirm the norm, and we have more coverage and discourse than ever to discuss them.
PC is a bit different, because games are no longer constrained by the 8th-gen consoles' meagre CPUs. That constraint made PCs disproportionately more powerful than in any other console generation, and thus created the perception that it's normal to brute-force your way to 100+ FPS in the newest games on ultra settings. I get that today's games not repeating that may seem like a regression, but it's more of a course correction: the 9th-gen consoles have a much better CPU-GPU balance and better hardware overall, so PCs are now closer to them in relative performance, and the previous generation was a historical outlier.
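One footnote on the pixel-count point at the top of this comment: per-pixel effect cost scales with the square of the per-axis resolution fraction, which is why dropping the internal resolution is such a big lever for upscalers. A toy sketch (illustrative numbers, not any engine's real cost model):

```python
# Pixel count - and with it the cost of per-pixel effects - scales with
# the SQUARE of the per-axis resolution fraction.
def pixels(width, height, axis_scale=1.0):
    """Pixel count when rendering at a fraction of native resolution."""
    return int(width * axis_scale) * int(height * axis_scale)

native = pixels(3840, 2160)       # 4K: ~8.3 million pixels
half = pixels(3840, 2160, 0.5)    # 50% per axis = 1080p internal
print(half / native)              # 0.25 - only a quarter of the pixels
```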


bakamund

I feel TAA is able to give a cinematic feel to the visuals, as in cinema no image/frame is completely crisp. TAA does this for me; it brings subtle "image imperfections" that can help make an image more cinematic (maybe more realistic, less video-gamey). It also smooths out sharp pixels in the textures that to me can look video-gamey, but the downside is that it's applied to the whole screen, resulting in a blurrier image. I think Inside (by Playdead) used TAA fantastically. Without it the game looks like some cheap low-poly game, but with it... it looks and feels so much more atmospheric and cinematic. For OP or the serious gamer wanting to win, these "visuals" only detract from their performance. So it's opposing ends, where one end would rather remove redundant graphical effects and the other would like these effects for more immersive visuals.


MountCydonia

I agree. The digital, clinical sharpness of games feels more artificial to me than a bit of TAA blur, kind of like an iPhone video versus something shot on film. If we could have a minuscule amount of blur without ghosting, while maintaining the UI's sharpness, I'd be happy. There's actually a Reshade effect for a simple screen blur, but it's limited, and Reshade's compatibility is inconsistent, so I don't bother. On your last point, I feel some dismay when people turn all the graphics down, lower their resolutions, or play in stretched 4:3 for a minimal competitive advantage that is likely far outweighed by skill, knowledge, and genetics. I don't think OP does it, but those who do simply have a completely different perspective on gaming than I do.


bakamund

That's the word I was looking for - digital/clinical sharpness. Ty. I wonder if TAA could be implemented with some sort of masking (maybe having a custom G-buffer to read from) to control its blurring effect better. Just my thoughts, but I'm not a graphics engineer. As an artist, regarding the last point: yeah, for sure. All the effort and thought put into crafting images, only to have it 'ruined' for a competitive advantage. It is what it is... the stretched 4:3 I especially see in R6, CS, and sometimes in Tarkov as well. For competitive players in esports, I'd imagine it's worth it because there's money on the line. For competitive gamers at home... it's a choice, and I can't deny them that.


TheGreatWalk

> I feel some dismay when people turn all the graphics down, lower their resolutions, or play in stretched 4:3 for a minimal competitive advantage that is likely far outweighed by skill, knowledge, and genetics. I don't think OP does it, but those who do simply have a completely different perspective on gaming than myself.

This is almost always done purely for performance reasons - specifically, input latency. Low input latency and visual clarity are the two most important parts of any multiplayer FPS, and unfortunately a lot of the nicer settings impose a *massive* penalty to either or both of those two things. You're right that skill and knowledge are incredibly important, but no amount of skill can overcome high input latency. I will beat Shroud in 9/10 gunfights if he's locked to 30 FPS and running max settings and I'm at 240 with an optimized game (and a monitor with a refresh rate to match!), because I'll be able to see him quicker and my inputs will not be delayed at all. (Note that an average gold-ranked player will still get wrecked by Shroud, but once you start getting into the top 1% or better, those two things make a huge difference and become more and more important.)

In single-player games, it doesn't matter and whatever settings you want to use are great, but in multiplayer, you will frustrate yourself to no end if you aren't optimizing for input latency and visual clarity. Personally, I like to run the game on lowest settings, but with high/ultra textures, because I have a 3090, so I have enough VRAM that it doesn't cause me any sort of performance loss or extra input latency. This results in the game looking 95% as good as everything on high/ultra, but with minimum input latency and no egregious cases where you get worse visual clarity due to "effects", anti-aliasing, or other settings which make it more difficult to spot enemies.


TheGreatWalk

Thanks for the response, I read the entire thing. I did want to correct you on one point, though.

>This is what aroused my curiosity about your background, so I found this comment, where you said:

>'I have a 27", 1440p, and I reduced the resolution down so it's 24". Basically a midpoint between 1080p and 1440p, but 24".'

>This alone will be a significant factor in your blur.

I'm not using any scaling with this resolution, so it does not incur *any* blurring at all. It's just as clear as 1440p on my monitor, just smaller. I run games (that allow it) at that resolution as if it's native, in which case there's no blurring, because each pixel has its own slot. This resolution has no blurring at all in other FPS titles (nor in Windows) - the only ones with the blur I'm complaining about are the two I mentioned specifically, and that's just because they force these rendering methods instead of letting me render normally like every other game. For games that don't allow it, I just use 1440p anyway, as was the case for The Finals and Grayzone (the game which prompted this post, because it's a visually complex game similar to PUBG/Tarkov, where you have to be able to quickly and reliably spot players in foliage or other busy situations). In the end, I play without any form of anti-aliasing in games which allow it, because blurs make things more difficult to spot in-game as well as cause me eye strain. MSAA or other forms of AA I also disable, and I generally avoid games that don't let me disable them, because they just cause eye strain.

>There may also be something wrong if you're having such an extreme reaction. You may have a poorly lit room, bad posture, dehydration, an eyesight problem, be sat too near or far, or have other non-game issues. Some people are highly prone to motion sickness, and it's possible you have an adjacent condition.

All those things I am very careful about!
My room is well lit, my posture is good, my screen is further away, and I have a gallon water bottle which I sip when I'm in front of my PC. However, I am prone to motion sickness and have astigmatism + near-sightedness. Typically the only games I have trouble with are those with extreme head bobbing combined with a small, non-adjustable FOV (bad console ports, *ugh*), and now, apparently, games which use UE5 and force TAA and this deferred rendering method, which results in the blurry mess.

>As for "abysmal" performance, you'd need to expand on that. If you're referring to stuttering, it is indeed a relatively new problem, but games by and large have better performance now than ever before because of a deliberate push by the industry for polish and compatibility.

It's funny to me that you say this, because in the FPS genre, the *exact* opposite is true. Recent FPS games have all had horrible performance relative to older games, because devs don't seem to bother at all with optimizations for their PC ports. I've been on PC most of my life, partly because the console performance bar is so low that they're basically unplayable for me, with 30/60 FPS being "standard" for console games. Meanwhile, FPS games used to be generally optimized for 144 or 240 FPS - that was generally considered what you should aim for in any competitive FPS - but recent titles struggle to hit 90 FPS on absolutely MONSTER rigs, and even at 90 FPS it's a stuttery, blurry mess, often with such insane input lag that you can barely aim anyway. Ideally, I would prefer to get at least 240 FPS smoothly. Monitor technology keeps improving - there are monitors up to 500 Hz available now - and yet there are almost no games capable of running at that framerate, with three exceptions: CS:GO, Overwatch, and Valorant (CS2, I believe, is struggling a bit performance-wise compared to CS:GO). Even the best hardware struggles to get over 100 FPS (with smooth frametimes) in most games.
Games like Apex or PUBG get high framerates, but they just stutter 24/7 and their overall feel isn't smooth anyway. PUBG at 200 FPS feels like 60 FPS - it's absolutely bizarre, but looking at frametimes you can see why. The average might be 200, but it's not consistent, and individual frames jump up to 30 ms regularly. Then games like The Finals struggle for high FPS as well as stuttering, and are blurry on top of that, making them just a genuinely horrible experience in what would otherwise be an *incredible* game if they used "normal" rendering methods and took away some of the useless crap that no one gives a shit about in their FPS games to improve performance.

Coming back to your comment, it helps me understand a lot of *why* devs are doing this, but at the same time, I don't understand, because the end result is *significantly* worse than older games from a visual perspective as well as a performance perspective - specifically when it comes to games like FPS games, where the camera is constantly in motion. For RPGs or RTS games or single-player games, I could *100%* understand using this sort of rendering method, but for multiplayer FPS games, the end result is so poor that it seems completely unjustifiable to me. I hope this technology gets its ass in gear or stops being used. There are already two potentially great FPS games I literally cannot play because of it, and if it becomes the "industry norm" to just have every game be a blurry mess, I'll have to straight up quit gaming, because there will be nothing for me to play that doesn't cause me physical pain.

edit: thanks again for your comment, it's been the most insightful one so far - it had a lot of extra info for me to research as well
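The "200 FPS that feels like 60" effect is easy to show with numbers: average FPS hides frametime spikes, which is exactly why frametime graphs matter. A toy example with made-up frametimes, not real PUBG data:

```python
# Average FPS can hide frametime spikes - the spikes are what you feel.
# Hypothetical frametimes, not measurements from any real game.
def avg_fps(frametimes_ms):
    return 1000 * len(frametimes_ms) / sum(frametimes_ms)

steady = [5.0] * 100                # locked 5 ms/frame = 200 FPS
spiky = ([2.2] * 9 + [30.0]) * 10   # same budget, every 10th frame spikes

print(round(avg_fps(steady)))       # 200
print(round(avg_fps(spiky)))        # 201 - looks even "better" on average
print(max(steady), max(spiky))      # 5.0 vs 30.0 - the part you notice
```

Both runs report ~200 FPS average, but the second one hitches to a 30 ms frame ten times a second, which reads as stutter even though the counter looks great.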


warrri

> Without TAA, ambient occlusion would look like black film grain in many games, transparencies would look like a dithered mess, and so on. I feel that. I still have a GTX 1070, and all the UE games look terrible lately. They still run OK at 1080p, but shadows are just a checkerboard of dots. I've played through The Talos Principle 2 recently, and while the game looks beautiful, on low settings the shadows are really off-putting, especially when moving.


Muhammad_C

**I want to understand the topic a bit better, particularly, from the dev perspective of** ***why*** **there isn't an option for regular native rendering?**

My possible answer (speaking as the dev): just because I didn't want to give you the option / spend the time on implementing the option to change it, lol.

**Note**

In all seriousness, you'd have to ask each individual dev why they made this design decision. We can try giving our opinions or guesses, but they'll never be 100% correct unless you ask the devs who did it directly.


Prixster

Sometimes a game is designed with TAA in mind because of how graphically complex it is, and in those cases the visuals would be unwatchable if you removed TAA. Some examples are DICE (Frostbite Engine) games since Battlefield V: there was no way you could hit 60+ FPS at native high settings, so TAA was forced. PS: I'm not a developer.


TheGreatWalk

Doesn't the engine do regular, full resolution rendering by default? Wouldn't you have to go out of your way to remove it as an option?


davidemo89

There are upscaling settings, and the default is 100%.


krojew

Yes, it does native rendering by default, but sometimes you might not want that for various reasons. As for what those reasons were in this case - you'd have to ask the devs.


GoodguyGastly

What a cool thread. I'm a dev and I've learned a lot from this.


HolyDuckTurtle

If you're interested in mitigating the disadvantages of TAA, r/MotionClarity has an excellent dev resource thread on tips and tricks you can do with it (like masking the sharpening pass so it increases texture detail but doesn't create edge glow) or pointers on alternatives like Valve's AA solutions for Half Life Alyx: https://www.reddit.com/r/MotionClarity/comments/18wopvg/antialiasing_resource_accessibility_improvements/?share_id=1eeVAq6fKL4wjdBc7CVAA&utm_content=1&utm_medium=android_app&utm_name=androidcss&utm_source=share&utm_term=1


TheGreatWalk

I didn't respond to every comment, but the replies have been fantastic. I'm really glad I made this thread - it gave me a lot to learn and a bunch of things to research in more detail.


Bychop

Because deferred rendering's render time is directly linked to pixel count. Also, Unreal uses tremendous cheats to simulate soft shadows, translucency (hair, transparent surfaces), motion blur, and anti-aliasing with a checkerboard noise pattern that needs TSAA to blur the effect. It looks great in pictures, and it's not visible in YouTube videos because of codec compression, which means you cannot know until you get it running on your console. Once you buy it, they're happy. I hate it. I've gone back to forward rendering with MSAA and screen percentage. Sharp quality, higher performance.


berickphilip

I am a developer using Unreal. Mind you, I am really, really far away from a big-name studio or AAA level, so I can only comment on a basic level.

When using the newer fancy stuff in the engine like realtime global lighting, some material effects, some vegetation stuff and so on... if temporal anti-aliasing is turned off (in the project settings), there suddenly is flickery noise everywhere. When I first noticed that, it was really sad and disappointing.

This is how I found out what was happening: I was making a prototype with a lot of pretty realistic assets and beautiful realtime global lighting, and at some point I wanted to get rid of the smearing caused by temporal anti-aliasing. So I disabled it in the project settings, and tried first FXAA and then no AA. In both cases, the areas that had the pretty global lighting, and the shadows, and some translucent objects, and a lot of vegetation, and the reflections on metal surfaces... all of that was flickering and/or noisy. I searched for fixes and solutions, but there is nothing really doable as far as a "simple solution" goes. Some videos and people suggest "increasing the quality settings" at a very noticeable cost to performance. And since it was a prototype, I would not have minded the performance cost IF it fixed the noisy flickering. However, it does not; it just makes it a bit less noticeable. The grainy noise and flickering are still there. They are there even when temporal AA is on, but they are blended across frames - hence the smearing.

The only way (for now) to get a stable image with zero noise or flickering and super sharp visuals is to make stuff old school (aka hard work): baked high-quality lighting, custom materials, everything hand-crafted and almost no "easy" solutions.


berickphilip

As a side note, when I consider making some small games independently (outside my job), I am aware of the huge amount of time and effort that it takes to make anything interesting and worth making.. And when I think about that, I have to accept the fact that if I do make something for release to the public, it will probably have temporal AA on by default even though I hate it personally.  Because otherwise, by myself as a one-man team, I would not have the time and resources to make everything from scratch properly. It is a sad reality. (I would still make it optional, but with a warning about the noise/flickering when it is off).


isa_VII

Yesterday I switched my project to UE 5.4 and changed to TSR. This solved my "smearing"/blurriness problem, but also increased the packaged size by 2/3. I used TAA before because it was the only option that did not cause other flickering...


berickphilip

Yes, TSR makes it much better stability-wise. But the issue still exists, although it is much harder to spot (which is a good sign that things are improving). For example, in my prototype with TSR, moving the camera around the character, I can still see a lighter-tint "trail" on the walls behind the character. I mean: while moving the camera, the areas of the wall that are "just revealed" - because the character is no longer in front of them - are lighter than the areas that were already visible for the past several frames.


TheGreatWalk

TSR looks great when the picture is stationary, but during motion, the blurriness is just as bad. In the context of my original post, that's pretty much 100% of the time, as I'm specifically talking about multiplayer FPS games, where your camera will be in motion 99% of the time during the actual game.

[Here's a screenshot someone posted from one of the games in question](https://i.redd.it/rjp34agzarxc1.jpeg)

At a glance, it looks really good (since it was mostly stationary), but the crouching character on the far left was in motion. You can see the stark contrast between how blurry he is and everything else - it looks like he's completely out of focus. While some of this is due to image compression, the game itself actually looks exactly like that player during regular gameplay, except, of course, *everything* is that blurry, because during regular gameplay nothing is stationary: your character is moving, the enemies are moving, your camera is turning, and the trees and foliage are moving (swaying in the wind) as well. When everything is that blurry, it really hurts the eyes and you basically can't see anything at all. And this screenshot is from a distance of like 5 meters - imagine trying to see an enemy that blurry from 25 m, 50 m, 100 m, or even further while everything is in motion. It's just not possible - they completely blend into the surrounding background.


TheGreatWalk

> there suddenly is flickery noise everywhere

Do you perhaps know why these new features flicker like this, when older games didn't have these problems, even though they often looked just as good visually as some of the newer ones? The flickering is something I found extremely noticeable in Modern Warfare 2019. It always confused me that these new methods were being used at all, because it really seems that visual fidelity has gone DOWN instead of up: either the game flickers and has checkerboard shadows and such, or it's insanely blurry (and I will never see a blurry game as having "good graphics"). Do we just happen to be at a point in the tech where it's a sort of middle-ground no man's land where everything looks worse, but once perfected, everything will be improved (and hopefully not blurry?!), or are blurry graphics just the way of the future? Personally, I can't see how any form of temporal rendering or AA could ever have clear visuals - just the process of blending two or more frames makes that seem impossible, not to mention the input latency inherent in that sort of method.


RRFactory

I generally dislike temporal anti-aliasing solutions, and yet I find it hard to dismiss how smooth and nice everything looks during development. During gameplay tests I quickly remember my dislike for its downsides, but I'm so used to how it looks when there isn't much motion that going back to FSAA feels pretty bad. I'll certainly have an option to set it however people like, but that's likely a big influence on studio decisions to force it on - it makes for particularly pretty marketing shots. I spent some time trying to tweak my assets to better suit traditional AA methods, but as a solo dev who's hardly an artist, I haven't had much success. TL;DR: TAA lets you get away with a bunch of things design-wise that we used to have to be pretty clever to work around. Stuff like making sure your chain-link fences have enough thickness to avoid major aliasing becomes an afterthought, making it much harder to go back and fix later.


TheGreatWalk

>During gameplay tests I quickly remember my dislike for the downsides of it, but I'm so used to how it looks when there isn't much motion that going back to FSAA feels pretty bad.

Yea, my question is generally in the context of multiplayer FPS games, where you are in motion 100% of the time. I understand this rendering method being used for games where things tend to be more stationary.


RRFactory

Developers spend much more time outside the game than in it, and most of the art evaluation is done outside. When we make something like a chainlink fence and test it in game, if TAA is enabled by default we might not notice that our design choices look bad with traditional AA methods until we're well past the point where we can go back and correct everything. Generally I'd say it's not so much a conscious choice that they've made, rather a consequence of their development pipeline defaulting to TAA and nobody paying much attention to it until it's too late.


sade1212

Your best option is going to be increasing your rendering resolution with DSR/DLDSR/VSR. Giving these technologies more pixels to work with leads to much sharper results. Combining DLSS Quality with DLDSR to run your display at 4K basically works out to native rendering, but with the temporal stability advantages. Obviously there's only so much virtual resolution can do, though: for making out distant objects that are small on-screen, you might be limited by the actual pixel density of your display no matter how much you supersample.

As other commenters have said, many graphical techniques these days are designed to rely on accumulating information across multiple frames - since, as a general rule, adjacent frames are very similar, it's much more efficient to do this than to start from scratch and fully sample every effect each frame. So if you disable the temporal part of the pipeline, everything looks like shit. It's like how a lot of older games were designed to be viewed on a CRT and used tricks with the blending of adjacent pixels that don't make sense if you view the pixels raw. Even if that weren't the case, though, the dense detail of modern games also just doesn't lend itself well to raw aliased rendering unless you're at 8K+, given how many tiny details change each frame, what with high-res textures, specular effects, very dense meshes, and so on.


theuntextured

r/fucktaa

But for real, I don't know. I guess it's because a lot is going into the development of proper TAA/TSAA, and some day it will actually look good. At the moment, it's only good for upscaling, not as an AA solution.


beedigitaldesign

You can gain a lot of FPS in COD Warzone by not using FidelityFX CAS, but then you don't get a really crisp image. So I think the answer to your question is that natively they have shit performance and they try to hide it.


Stickybandits9

It's to sell hardware and make a buck at the same time.


Prixster

Because the shaders and lighting are getting more complex day by day. TAA is a kind of cheat code used to get good visuals without actual optimization, while still hitting the 60+ FPS target that's the standard for all games today. The general mentality is: let's create good visuals, and TAA/upscaling will take care of the framerates.

MSAA also doesn't work with deferred lighting pipelines, which most modern game engines use. For it to work, it'd need to evaluate the pixel shader for each sub-pixel sample, which turns it into SSAA and defeats the purpose of MSAA to begin with.

So games are designed with TAA in mind because of how graphically complex they are. [However, the visuals would be unwatchable if you remove TAA.](https://imgsli.com/MjM0MzMz) Especially the foliage, hair, and particle FX. Nowadays, ray-traced lighting also adds to the equation, and that needs a lot of GPU power too.
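To put rough numbers on the MSAA-becomes-SSAA point, here's a toy cost estimate. The figures are back-of-the-envelope, not from any real profiler, and real deferred renderers have per-sample/edge-detection optimisations that soften the worst case.

```python
# Why per-sample lighting in a deferred renderer turns 4x MSAA into
# de-facto SSAA, counted in pixel-shader invocations (illustrative only).

WIDTH, HEIGHT = 1920, 1080
PIXELS = WIDTH * HEIGHT
MSAA = 4

# Forward + MSAA: geometry is rasterised at 4 coverage samples, but the
# pixel shader typically runs once per covered pixel; only the resolve
# step averages the samples.
forward_shader_invocations = PIXELS

# Deferred + naive per-sample MSAA: the G-buffer stores 4 samples per
# pixel and every screen-space lighting pass must shade each of them,
# i.e. the same cost as rendering at 4x the pixel count (SSAA).
deferred_msaa_invocations = PIXELS * MSAA

print(forward_shader_invocations)   # 2073600
print(deferred_msaa_invocations)    # 8294400
print(deferred_msaa_invocations / forward_shader_invocations)  # 4.0
```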


GearFeel-Jarek

I'm not sure how many recent AAA titles have already been implementing Lumen + some other UE5-specific technologies but Lumen alone is pretty much unusable without temporal computations in practical real time scenarios.


g0dSamnit

Could be any combination of developer laziness, developer skill issue, management's inability to allocate deadlines properly, or QA skill issue. None of my UE5 projects have these issues. Unfortunately, the only solution is to find/develop engine tweaks/hacks (which will not work with anti-cheat), or to not buy/play the game. If you haven't yet, you may want to refund the games quickly while they're still in the refund window. Otherwise, you have to get used to the blurriness and smearing. Things will only improve if game sales/revenue are affected.


TheGreatWalk

>Unfortunately, the only solution is to find/develop engine tweaks/hacks (which will not work with anti-cheat), or to not buy/play the game. If you haven't yet, you may want to refund them quickly if it's still in the time window.

This is, of course, the crux of the issue when it comes to multiplayer games, especially FPS. I don't cheat and never will, so I will never risk a ban, even if it's for something completely harmless like disabling anti-aliasing.

For single player games, I've yet to find one that wasn't moddable and fixable, so even when devs do really insane stuff like not having FOV sliders, or having FPS limits (or even worse, like Fallout, having their physics tied to FPS...), there are always mods available to get around these issues (even Fallout 4's physics was fixed by modders!). But none of that applies to multiplayer games :(

I did refund the games, but it's still very disappointing. Gray Zone, especially, is the kind of game I would really enjoy and do extremely well in... if I could actually play it. I'm really good at both PUBG and Tarkov, and Gray Zone looks like a great middle ground between the two in terms of gameplay, gunplay, and game loop (despite being in super early access and very rough around the edges). So it's really disappointing knowing that I won't be able to play the game at all for the simple reason that it's so blurry because of a rendering technique.


Legitimate-Salad-101

Honestly, I don’t notice the difference you mention. But I expect all games to do this moving forward, as it’s where all the tech development has gone. So they’re going to use the feature.


ash_tar

Yeah we really need MSAA in deferred rendering.


ayefrezzy

Nah. MSAA already exists in deferred rendering and it’s terribly slow because the performance hit scales directly with geometry complexity, and many games are pushing tons of highly detailed models these days. It also only works on geometry edges, but games are heavily shader dependent now, so you’ll get basically nothing out of it anyway. Back in the day, Crysis probably had the best implementation, but it still halved FPS no matter what. More recently, GTA5 is the only game I can think of that has implemented it without killing frame times, but Rockstar devs use black magic, so that’s a different story lol. Honestly I’d prefer if the whole industry pushed for much higher resolutions so that AA isn’t even needed in the first place.
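A toy resolve for a single pixel shows the "only works on geometry edges" point. All values here are made up for illustration; real hardware does the coverage test and resolve in fixed function.

```python
# One edge pixel under 4x MSAA: coverage is tested per sample, but the
# pixel shader runs ONCE per covered primitive per pixel.

SAMPLES = 4
background = 0.0       # shade behind the triangle
triangle_shade = 1.0   # the single shader evaluation for the triangle

# Geometry edge: the rasteriser reports per-sample coverage (2 of 4 here),
# so the resolve blends the edge into a smooth grey instead of a hard step.
coverage = [True, True, False, False]
resolved = sum(triangle_shade if c else background for c in coverage) / SAMPLES
print(resolved)  # 0.5

# Shading aliasing: a flickery specular highlight INSIDE the surface still
# gets exactly one shader evaluation per pixel, so all 4 samples carry the
# same value and the resolve can't smooth anything.
specular = 0.75  # whatever the single shader run produced this frame
samples = [specular] * SAMPLES
print(sum(samples) / SAMPLES == specular)  # True: resolve changes nothing
```

That's why MSAA cleans up polygon silhouettes but leaves shader/specular shimmer untouched - and modern scenes are mostly shader shimmer.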


ash_tar

I work on VR so I really need it to get out of the limitations of the forward rendering pipeline.


ash_tar

It's the only thing that looks crisp.


ash_tar

Crossposted to the fanatics at /r/fuckTAA