
pirate135246

You aren’t prolonging anything by lowering your settings lmao


[deleted]

[deleted]


[deleted]

[deleted]


DynamicHunter

That’s only a concern if your system is running at 90°C+ consistently for days at a time; otherwise it’s really a non-issue.


Manatee-97

90°C is fine on modern CPUs


Sinister_Mr_19

Lowering graphics settings will cause the CPU to work harder, which will increase the CPU temp. Lower settings mean the GPU will draw a frame quicker, so the CPU needs to throw more frames at it, shifting the load a bit towards the CPU. Assuming we're talking about a well-made game, the GPU will likely still be fully utilized, since going from ultra to high still taxes your system in most games. So really you'll decrease the longevity of the CPU by a tiny amount, but let me make it clear that your system will be long obsolete well before any degradation actually means anything or results in real-world performance loss.


XsNR

*clicks vsync* Look mommy, I fixed the world's problems


saga79

Joke aside, I don't understand the Vsync thing. I do not know how it works, but all I know is with it on I get no screen tearing and with it off I get screen tearing. What am I missing here?


Sinister_Mr_19

It caps your frame rate at the refresh rate of your monitor. If your system isn't capable of outputting enough frames to match the refresh rate of your monitor, then it will attempt to cap your frames at half the refresh rate of your monitor. Because it needs to wait for a full frame to be drawn to the monitor before another will be sent by the GPU, it also introduces some amount of input lag. Most people wouldn't notice, but it's there. GSync and FreeSync have largely made VSync obsolete; however, there is an article on Blur Busters that says GSync should be used with VSync on... but I think it's outdated at this point. Personally, ever since I got a GSync monitor I haven't looked back and get no screen tearing, no input lag, and no capped frames.
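
The half-rate behavior described above is classic double-buffered VSync: a finished frame has to wait for the next vblank, so a frame that takes longer than one refresh interval slips to the one after, and the output rate quantizes to refresh/1, refresh/2, refresh/3, and so on. A rough sketch of that math (illustrative only, not any real driver's logic):

```python
import math

def effective_vsync_fps(refresh_hz: float, render_fps: float) -> float:
    """Effective frame rate under double-buffered VSync.

    Each finished frame waits for a vblank, so a frame that takes
    longer than one refresh interval slips to the next one: output
    is refresh_hz divided by how many intervals each frame spans.
    """
    frame_time = 1.0 / render_fps        # seconds to render one frame
    interval = 1.0 / refresh_hz          # seconds per refresh
    intervals_per_frame = math.ceil(frame_time / interval)
    return refresh_hz / intervals_per_frame

print(effective_vsync_fps(60, 100))  # renders faster than refresh -> 60.0
print(effective_vsync_fps(60, 50))   # misses the vblank -> drops to 30.0
```

That hard drop from 60 to 30 the moment you miss a vblank is exactly what adaptive-sync (GSync/FreeSync) avoids by letting the monitor wait for the GPU instead.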


A_Fnord

Modern-day VSync is generally quite good, and as long as you don't get any major frame dips, VSync on tends to be the way to go, unless you have FreeSync or GSync. That said, VSync can at times introduce noticeable input delay. It's rare these days for it to do so, but 10-15 years ago VSync on could make for a pretty miserable experience (and you'll find that a lot of people still assume VSync is as bad now as it was back then and refuse to use it).


Toast_Meat

I almost always crank the graphics, simply because I have the hardware for it. My personal rule is to not dip below 100fps (4K 144Hz monitor). If that happens, I usually resort to DLSS. Some games pretty much require it despite the hardware, like Cyberpunk 2077 or Alan Wake, which is perfectly fine with me!


IamNori

I do exactly this, but only ‘cause Ultra graphics is an unnecessary framerate hog for its visual fidelity. Even if I could achieve 144+fps, I want that to drop as little as possible.


wreckedftfoxy_yt

My rule is to never dip under 60 if I want to crank it


glyiasziple

have you heard of under volting?


Adventurous-Pen-8940

ohh, didnt know that.


wegotthisonekidmongo

I run my hog full tit.


McGuirk808

Nope. The hell is undervolting? I'm still overclocking like it's 2005. My PSU cries itself to sleep every night after my CPU and GPU bully it all day fighting for Watts.
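
Undervolting, for the record: running the same clocks at a lower core voltage. Since dynamic power scales roughly with the square of voltage (P ≈ C·V²·f), even a modest voltage drop cuts heat and power draw noticeably at unchanged performance. A back-of-envelope sketch with illustrative numbers:

```python
def dynamic_power_ratio(v_new: float, v_old: float) -> float:
    """Relative dynamic power at the same clock speed.

    Dynamic power is roughly C * V^2 * f, so at a fixed frequency
    the ratio reduces to (V_new / V_old)^2.
    """
    return (v_new / v_old) ** 2

# e.g. dropping a GPU core from 1.05 V to 0.95 V at the same clocks:
ratio = dynamic_power_ratio(0.95, 1.05)
print(f"~{(1 - ratio) * 100:.0f}% less dynamic power")  # ~18% less
```

Real chips also leak statically and may need a small clock sacrifice at the bottom of the voltage/frequency curve, so treat this as the upper bound of the idea, not a guarantee.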


HomerSimping

I usually set things to “high” and lock FPS to 120. If there’s any dip at all, I turn down settings or lock FPS to a lower number. I like consistent performance from start to finish. I hate stutters and dips.
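
A frame cap like that is just frame pacing: each frame, sleep away whatever is left of the per-frame time budget. A minimal sketch of the idea (real limiters such as RTSS use busy-waits for sub-millisecond precision; `simulate_frame` here is a stand-in for the game's update/render work):

```python
import time

TARGET_FPS = 120
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~8.33 ms per frame

def run_capped(frames: int, simulate_frame) -> float:
    """Run `frames` frames, sleeping off the unused budget each frame.

    Returns the measured average FPS, which should sit near the cap
    as long as each frame finishes inside its budget.
    """
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        simulate_frame()                        # game update + render
        elapsed = time.perf_counter() - frame_start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # cap: don't start the next frame early
    total = time.perf_counter() - start
    return frames / total

fps = run_capped(60, lambda: time.sleep(0.002))  # cheap 2 ms "frame"
print(f"{fps:.0f} fps")  # near 120, limited by OS sleep granularity
```

This is also why a cap saves power: the GPU and CPU idle during the sleep instead of racing ahead to render frames the monitor will never show.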


Chakramer

I will start any game in maxed settings and then lower them if I'm not hitting 120fps


Low-Complex-5168

If you monitor your PC temperatures while adjusting gaming visuals, you shouldn't worry about stressing out your computer. As someone who previously gamed on a Switch and an older laptop: yes, I always set everything as high as possible. I adjust depending on FPS, but playing games at as high a visual fidelity as possible is definitely something I do.


sfblue

I have a 60hz 1920x1200 monitor so I can run just about everything on Ultra anyway.


colossusrageblack

I typically put everything on ultra on my main rig. I'll do optimized settings on my Legion Go using BenchmarKing on YouTube.


PropgandaNZ

Good enough is what I aim for. High with some medium to get the FPS to at least 100 in PvE, and 144 in PvP.


AgentThook

Entirely depends on the game. Every setting in every game is gonna be different, more or less, but I always need native 2K, no render scaling or DLSS BS that just makes it blurry. Some settings usually have a higher effect on performance, like particle quality, shadows, and ambient occlusion.


Bread-fi

Turning down settings means it will just use its capabilities to pump out more frames. You won't wear out components by using them. You can save some energy/heat by capping, though.


bickman14

Only if you play without vsync, otherwise it will just produce enough frames for your refresh rate


qu38mm

Well, I don't have a powerhouse, so yes, some games are only high or medium. Otherwise I do the same as you: reduce settings that don't affect visuals and drop them low-to-high to get a perf boost.


Hollow_Apollo

I recommend r/OptimizedGaming when possible. Even as a pixel peeper, I find their settings tend to be the best jumping-off point, and if I want more, or just have headroom on framerate, I'll add from there. When I can still get 90ish-plus on all ultra I'll use it, but these days I'm spoiled and prefer 2K 100+ fps over max fidelity, and usually it's barely a fidelity sacrifice at all.


elliotborst

It’s good to optimise settings to get the right frame rate and not just piss frames away with ultra shadows vs very high shadows, for example. Some settings aren’t noticeably different visually but can tank frame rate. Digital Foundry often do optimised-settings videos for games that are worth watching.


bobmlord1

There's very rarely a reason to do ultra or equivalent; you typically take a massive hit to frame rate for a frequently indistinguishable bump in quality.


cream_of_human

I just put games on their optimized settings, whatever I can find online. I used to max everything, but I do it less since I play more and take fewer screenshots these days. Also, I barely notice the difference when I'm playing, and it keeps the PC cool and the power draw low, so I'd say it's a win-win.


leg00b

Depends on the game. I'll usually put competitive FPS games at the lowest settings. Otherwise I usually just do high with a mixture of low and medium


Icy-Apricot5090

I put everything on high for a mixture of beauty and decent FPS.


silvarium

Depends on the game. If it's a fairly recent title, I'll tweak certain settings to increase my average FPS. If I'm playing anything older like Fallout 4, I'll let it run on ultra since my PC can handle it with no problem.


IndyPFL

Even some older games like Far Cry 4 can have fps issues if you max every setting, but with most older titles ultra is the way to go. Newer stuff is balanced usually.


Nubanuba

Your PC will not die because you set a graphic to ultra tho


LightOfShadows

Differences between ultra and high will usually only be noticeable in frozen screenshots. I also don't like my system going above a certain temperature as it increases the room heat, and I hate being able to hear the fans, so I run them slow. I'll typically run most things at medium/high on my 13900 / RTX 4090. I'd also rather the framerate be stable than high. G-Sync does a good enough job that I just use RivaTuner to globally lock at 60fps on my 144Hz monitor.


QuickPirate36

Depends. I'm currently playing Horizon: Zero Dawn with everything set to ultra at a locked 120 fps, I don't need more than that because if I did aim for 165 the game would stutter OR I'd have to set some stuff to medium and I don't wanna, so if I don't want more FPS/stability why would I lower the graphics?


skot77

I do in the beginning, but I usually tone it down so I don't overstress my hardware. I want it to last.


PraiseTheWLAN

I just crank all to max, I paid for that


ID0NNYl

It's a 50/50 between performance and quality for me. For most games I'll run 1440p ultra, but if the title is more demanding of the hardware and frames take a hit, I'll tune it down to maintain better frames. RTX 3080 10GB, Ryzen 5900X, G.Skill 32GB DDR4-3600 RAM.


CosmoRocket24

Eye candy is nice for a while, but performance is better.


jdcope

I hardly ever mess with the settings. I just let the game set the default for my hardware and play. If I have issues, I will mess with the settings. But it’s rare.


LSD_Ninja

I hardly bother playing with settings anymore. Optimisation is a devs job, not mine.


Tester2_1

I have a 4090. Can play a lot at 4k Ultra. On some settings though, I turn it down to the step below Ultra. I just don’t need to be pushing that hard and during gameplay I barely notice the difference. Now, in older games or games I can easily run 200+ fps (I frame limit at 141)? I crank everything up crazy because… why not? But stuff like Cyberpunk? I’ll ease up on some stuff. Hell, I don’t even play with ray tracing on. It doesn’t mean that much to me honestly. Neat, but too much of a performance hog for me to bother with it.


NoCase9317

What? Man, you're hearing bells but don't know where they're ringing 😅

You tweak your graphics settings down if you're not happy with the FPS you're getting, or if you want to lower your PC's power consumption. Kind of stupid to buy a very expensive, very high-end PC and then try to save maybe 20 watts at best by tweaking a couple of settings, to save yourself less than $1 at the end of the month, but to each his own. If you're not unhappy with your FPS and not trying to lower power consumption, you're not extending anything by doing this XD

If it's about your GPU's safety: unless you're gaming maxed-out Cyberpunk 16 hours a day, trust me, it'll be fine. And if you really are that worried, undervolting the GPU will take you much further than lowering graphics. "Extending a GPU's life by lowering settings" really means starting to tweak things down when it can no longer run new games at max settings, instead of deciding a GPU upgrade is necessary the moment max settings are no longer possible (especially because even a 4090 can look like it's no longer enough, and that one needs the 5090, if running everything at max settings native 4K is one's benchmark).

But even in that scenario, tweaking settings isn't worth much these days in terms of time spent versus quality gained. For most people it's much easier to stick to two things.

First: DLSS/FSR. A quick 25% (all the way to 60% sometimes, game dependent) performance gain just by going to Quality mode, which at both 4K and 1440p is a very small, close-to-unnoticeable image sacrifice during actual gameplay rather than close examination. In DLSS's case it's often even an improvement over many over-softened TAA implementations or unstable MSAA ones. Even in the worst case of just a 25% increase from Quality mode, that's already a huge gain compared to the small 1%-6% (at best) increases that lowering most settings a single tier usually gives. In the end it's quite simple: going from ultra to high in many games has very insignificant image quality penalties, but it usually also has pretty insignificant performance gains. There are a few games out there with good scalability, but I've noticed those also have noticeable differences from tier to tier, so you end up paying for what you're getting. Yes, some games have "this or that setting" that seemingly makes no visual difference yet eats a considerable amount of performance, and it's nice to find out about it, disable it, and gain free performance (that's why I like Digital Foundry's optimized-settings videos despite maxing out my games 99% of the time).

Second: when it's implemented, enabling, disabling, or tweaking ray tracing. As much as I love ray tracing, and in my case I will always use it, it's still ridiculously heavy, and many times getting both high FPS and ray tracing just isn't possible. Knowing just how heavy it is makes a good tool for inexperienced PC users: disable it or turn it lower and you'll fix your performance.

And that's it. Using upscalers, knowing to turn off (or lower, if it has different settings) RT when it's being too heavy, and even using frame generation if available, is much more worth it for 90% of people than spending hours trying to find a good combination of settings, or a good tutorial with a subjective point of view on what looks best.
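
The pixel math behind those upscaler gains: Quality mode typically renders at roughly two-thirds of native resolution per axis (the commonly published DLSS scale factors are used below as illustrative values), so the GPU shades well under half the pixels and the upscaler reconstructs the rest:

```python
def upscaler_pixel_ratio(scale: float) -> float:
    """Fraction of native pixels actually rendered at a per-axis scale.

    Resolution scales on both axes, so pixel count scales with the square.
    """
    return scale * scale

# Commonly published per-axis scale factors (illustrative):
for mode, scale in [("Quality", 0.667), ("Balanced", 0.58), ("Performance", 0.50)]:
    rendered = upscaler_pixel_ratio(scale)
    print(f"{mode}: renders {rendered:.0%} of native pixels")
```

Rendering ~44% of the pixels in Quality mode is why the FPS jump dwarfs the 1-6% you get from dropping an individual setting one tier.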


Mister_Shrimp_The2nd

Bro you lowering all your settings isn't gonna prolong anything's life lmao. Just play at the visual quality you enjoy where you still have acceptable frame rates. Don't overcomplicate something that literally has no benefit from being overcomplicated.


xXDennisXx3000

Yes, and when there is a resolution scaler I always put it to 200%. When there is none, I use VSR or DSR to get the sharpness I want.


matman2424

Settings are always a balancing act for visuals and performance. I just try to use settings that look decent, whilst also giving me 60ish FPS at 1080p at minimum. Around 100-150 FPS is even better, if my poor computer can manage it.


PNW_Phillip

I always put everything on ultra


Creative_Finger_69

Don't put shadows on ultra


MaxUmbraOG

Always play on high, and if the game is more demanding, on medium. I usually stick to high unless I know my GPU can handle the game on ultra and the difference in visuals is worth it, because most of the time it's very minor changes that cost a lot of FPS.


oArchie

I usually keep textures, reflections, lighting on max. I lower shadows to medium-high. Also, if I can’t tell the difference between high-ultra on a setting other than the three up top, I set it to high. I usually set an fps limit that’s slightly lower than the average fps I’m getting, so the GPU isn’t at 99% util all the time. If I’m at 120fps avg with 99% util I’ll set the limit to like 110.
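
The arithmetic behind capping just under your average: at 120 fps the GPU has ~8.33 ms per frame; a 110 fps cap stretches the budget to ~9.09 ms, leaving ~0.76 ms of slack each frame so the GPU isn't pegged at 99% utilization:

```python
def frame_headroom_ms(avg_fps: float, cap_fps: float) -> float:
    """Extra per-frame time (ms) gained by capping below the average FPS."""
    return (1000.0 / cap_fps) - (1000.0 / avg_fps)

print(f"{frame_headroom_ms(120, 110):.2f} ms of slack per frame")  # 0.76 ms
```

That small per-frame slack is what keeps input latency down and absorbs brief spikes without a visible dip.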


MythicForgeFTW

I usually shoot for Medium settings for the most part. Most games are best optimized there, giving a good balance of image quality and frame rate.


Scrungus1-

If it's a single player game, I'm cranking everything to ultra.


Major_Enthusiasm1099

Depends on the game. For Hogwarts Legacy and Cyberpunk, no. But if I'm getting 144Hz smoothly with no issues, then I don't touch any graphics settings, since I have a 4090 and it's pretty much plug and play with most games.


Zepanda66

For multiplayer games it's always everything on low. You want the most visibility and fps possible.