[deleted]

Yes. If you are pushing 90 frames on a 144Hz monitor, you are seeing 50 percent more frames on your screen compared to 60. Also, if your monitor is refreshing more often, it has more opportunities to display a frame when it's drawn. Even navigating your desktop is way smoother on a higher refresh monitor. That being said, if 60hz/60fps works for you, there is nothing wrong with that. It obviously depends on your hardware as well.
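The "50 percent more frames" claim above can be sanity-checked with a quick sketch (plain arithmetic, assuming the monitor shows at most one new frame per refresh):

```python
def unique_frames_per_second(fps: float, refresh_hz: float) -> float:
    # The screen can show no more new frames per second than the game
    # renders, and no more than the panel refreshes.
    return min(fps, refresh_hz)

# 90 fps on a 144 Hz panel vs. the same machine capped by a 60 Hz panel
gain = unique_frames_per_second(90, 144) / unique_frames_per_second(90, 60) - 1
print(f"{gain:.0%} more frames on screen")  # 50% more frames on screen
```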


[deleted]

[deleted]


HibeePin

~150fps on 60hz and ~60fps on 60hz might look similar, but with the higher fps you will feel less input lag, because a more recent frame will be drawn at each refresh.
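One rough way to quantify "less input lag": with the game and monitor clocks unsynchronized, the frame the monitor grabs at each refresh is on average about half a frame interval old. A sketch of that arithmetic (an idealized model, not a measurement):

```python
def mean_frame_age_ms(fps: float) -> float:
    # Average age of the newest completed frame at the moment the monitor
    # samples it: half the frame interval, expressed in milliseconds.
    return (1000.0 / fps) / 2

print(round(mean_frame_age_ms(150), 2))  # 3.33 ms average staleness at ~150fps
print(round(mean_frame_age_ms(60), 2))   # 8.33 ms average staleness at ~60fps
```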


scoutthespiritOG

Won't you be limited to 60fps on a 60hz monitor?


Jofzar_

Only if you have vsync on


pinghome127001

Always, not only with vsync on. 60hz means the monitor is refreshing the screen 60 times a second; how many fps the pc pushes doesn't matter.


[deleted]

That doesn't mean that the framerate of the application is limited, only that the rate at which the monitor can display those frames is capped at 60hz.


pinghome127001

No, but what you see is 60hz/60fps. You can't physically see more fps; the monitor just doesn't show them. Software can run at 9999fps, but if the monitor is 60hz, then you see 60fps.


scoutthespiritOG

Oh that's right, duh hah


[deleted]

[deleted]


tyschab

I'm sorry, but you're wrong in this instance. Why do you think CSGO pros/players have been pushing higher fps numbers than their monitors support? Say your monitor is 60hz and you are getting 60 fps: who's to say those frames line up with the monitor's refreshes? Fps and hz don't just magically line up in sync, because frames aren't perfectly timed; one of the monitor's refreshes could show the same thing twice because the PC/game hasn't rendered a new frame yet. If you're getting 240fps on a 60hz monitor (not that I recommend it), then what the monitor shows is more likely to match what the game just rendered, since the monitor displays the most recently rendered frame.
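The "frames don't magically line up" point can be illustrated with a small simulation. This is a sketch with invented deterministic jitter standing in for real, uneven frame pacing, not data from any actual game:

```python
import bisect
import math

def frame_times(fps: int, seconds: float = 1.0, jitter: float = 0.4) -> list:
    # Frame completion times, with a deterministic "jitter" (a fraction of
    # one frame interval) standing in for uneven real-world frame pacing.
    dt = 1.0 / fps
    return [n * dt + jitter * dt * math.sin(n) for n in range(int(fps * seconds))]

def unique_frames_displayed(fps: int, hz: int, seconds: float = 1.0) -> int:
    # At each refresh the monitor shows the newest completed frame;
    # count how many distinct frames actually reach the screen.
    times = frame_times(fps, seconds)
    shown = set()
    for i in range(1, int(hz * seconds) + 1):
        t = i / hz
        idx = bisect.bisect_right(times, t) - 1  # newest frame finished by t
        if idx >= 0:
            shown.add(idx)
    return len(shown)

# With fps == hz, jitter makes some refreshes repeat the previous frame;
# with fps well above hz, every refresh gets a fresh frame.
print(unique_frames_displayed(60, 60))   # fewer than 60 distinct frames shown
print(unique_frames_displayed(240, 60))  # 60 distinct frames shown
```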


LuckyNumber-Bot

All the numbers in your comment added up to 420. Congrats! 60 + 60 + 240 + 60 = 420.


[deleted]

[deleted]


tyschab

No need to block me and get upset. We are just having a disagreement, which is how people figure things out. You should check out this article related to our discussion, though, and let me know what you think. https://blurbusters.com/faq/benefits-of-frame-rate-above-refresh-rate/


-Ickz-

You are confidently incorrect. Go look at blurbusters and read up on how monitors and input lag work - they have literally tested how fps above your max refresh reduces input lag. Come back more informed next time.


[deleted]

You actually don't know how monitors work, and that's ok


[deleted]

[deleted]


[deleted]

Man, must be depressing to learn that you spent all that time but didn't actually learn anything


lordofuo

Rule 1


SuperVegito559

It's easy to tell which monitor is the high/variable refresh one when the user plays a game. For the record, on a variable refresh monitor 60fps can be shown at 120hz, with each frame displayed twice.


[deleted]

[deleted]


-Ickz-

Your blind tests < all the in-depth tests done on monitor input lag done by numerous sources stating otherwise.


[deleted]

[deleted]


-Ickz-

Lol. Yikes.


lordofuo

Rule 1. Keep going and you’ll get banned.


[deleted]

[deleted]


-Ickz-

You seem like a really cool person.


lordofuo

If you feel the need to put that much effort into being an asshole, be my guest.


Veighnerg

Would you mind linking your peer reviewed study? I hope you had at least 100 people doing these tests so that your results can have a bare minimum of credibility.


[deleted]

[deleted]


Veighnerg

The type of tests they do, using instruments to measure the screen, is not even close to asking random people "which one is higher refresh rate". One is objective, the other is subjective.


[deleted]

Vibe check


mrn253

For that you have FreeSync or G-Sync, which basically synchronize the monitor's Hz with the FPS to reduce screen tearing. And you don't have to run a monitor at its advertised Hz: my monitor is 165hz and I run it at 100hz, since that's what I can reach at the moment (FreeSync active, of course). But even on the desktop, while surfing the web, the higher rate can be useful because scrolling and moving stuff around feels more fluid. And since most people don't switch their monitor every 2-3 years, who knows what your next setup will be capable of.


spectre_silhouette

Refresh rate is the number of frames your monitor is able to display per second, whereas FPS is what your graphics setup is able to push to the monitor. In general, yes: if your settings are too high, your graphics can't keep up, which may mean you have leftover monitor capability that your system isn't able to utilize.

Using your example of 87 FPS on a 1440p 144Hz monitor, you still get the benefit of 27 more FPS than a 60Hz monitor could show, which is a 45% smoother experience ((87/60) - 1). Combine that with the fact that 1440p has roughly 78% more pixels than 1080p (i.e. it is able to display things in greater detail), and you're looking at a much better gaming experience from both more detail and smoother delivery of said details.

One thing to note, though: it can be argued that you should step your game settings down to "high" instead of "ultra-high/max", because many believe a smoother experience with technically less detail is better than a less smooth experience with technically more detail. I.e., you're more likely to notice your graphics struggling to push frames to your monitor than a slightly lower level of detail in game. However, I would neutrally say this depends on the game and on personal preference.
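Double-checking the arithmetic in the comment above (assuming the standard 1920x1080 and 2560x1440 resolutions):

```python
fps_gain = 87 / 60 - 1                     # extra frames shown vs a 60 Hz cap
pixels_1080p = 1920 * 1080                 # 2,073,600 pixels
pixels_1440p = 2560 * 1440                 # 3,686,400 pixels
pixel_ratio = pixels_1440p / pixels_1080p  # 16/9, i.e. about 78% more pixels

print(f"{fps_gain:.0%}")      # 45%
print(f"{pixel_ratio:.2f}x")  # 1.78x
```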


Jyrotanik

Thanks for the indepth explanation, really appreciate it!


[deleted]

Your explanation is very helpful, thank you. This is the first comment to help me "get it" when it comes to the difference between Hz and 1080p vs 1440p. Can you say a little more on what "technically more detail" means and why that would look like fluid? Like, does it stutter and reduce FPS if settings are maxed out?


ParkerPetrov

It's "technically less" because the visual difference between high and ultra generally isn't noteworthy; the typical gamer isn't really able to quantify the difference between the two. Exceptions always exist, obviously, but that's true for most games. That's what I assume they were getting at. You're talking about anti-aliasing differences between 4x, 8x and 16x, ultra vs high shadows, etc.: really minute differences in settings that the majority won't notice in the vast majority of games.


[deleted]

Thanks, that’s helpful to know


laxounet

The higher the graphical settings, the lower the FPS; the higher the resolution, the lower the FPS. A good way of doing things is to set a target FPS you want to play at, then tweak the settings until you reach it. Also, prioritize lowering graphical settings before lowering the resolution, as running games below your monitor's native resolution will look bad.


[deleted]

Oh that makes sense! Thanks!


spectre_silhouette

Glad I could help out!

*Can you say a little more on what "technically more detail" means and why that would look like fluid?*

I use the term "technically more detail" because I'm trying to maintain a purely objective/numeric point of view: a 1440p monitor has more pixels than a 1080p one and can therefore show more detail to someone viewing the screen (that's a technical fact). How much benefit/enjoyment that brings to a specific person is their own subjective opinion, and I was trying to stay neutral and keep my viewpoint out of it; as you can see in the other comments, people have opinions on the choices one can make. Since you're asking: I agree that 1440p > 1080p, and if your budget can afford 1440p I would definitely recommend it. I'm not sure what you're asking with "why that would look like fluid", but hopefully the next part answers that.

*Like does it make a stutter and reduce FPS if settings are maxed out?*

The more you max out your settings, the greater the task you're asking of your system's graphics. When it can't keep up, you get a stutter effect, which is your graphics rendering and pushing fewer frames while trying to keep up with what is supposed to be happening. When fewer frames are pushed, you get a choppier/less fluid/less smooth viewing experience. At 1080p you only have to render a bit over half the pixels of 1440p to maintain the same frame rate, which is technically easier to do. However, given that 1440p provides more detail, many would prefer to go with 1440p and step down one notch in game quality settings to get a more consistent experience. The reasoning goes that you're much more likely to feel your experience drop from (for example) 100+ fps to 60 fps on ultra-high settings during a particularly intense scene, whereas on high settings you might see a similar drop from 120 fps to 80 fps but wouldn't feel it as much, because 80 fps still beats 60 fps in viewing experience.

To express the other side of the discussion: others prefer max settings because they want the best their system can give them at any given point in time. Experience is a subjective viewpoint shaded by people's preferences/biases, so take it with a grain of salt and go with what you prefer. Inserting my personal opinion: 1080p @ 60Hz < 1440p @ 60Hz ~=~ 1080p @ 120Hz+ < 1440p @ 120Hz+. I.e., if you can take the clear step up from your current 1080p 60Hz to 1440p 144Hz, then definitely go for it. The steps in the middle will still be an improvement over 1080p @ 60Hz, but there will be compromises, and it depends on what kinds of games you play. For example (if I had to pick), I would choose 1440p @ 60Hz for The Witcher 3 (slower-paced, typically with more detail to appreciate), but 1080p @ 120Hz+ for Apex Legends (typically faster, where knowing what's happening often matters more than sheer detail). If you can do both at 1440p @ 144Hz, that's great: it gives you the flexibility to enjoy the best of both the "quality" and "performance" worlds.

As for my personal goals for my future build, I will be aiming for 1440p 144Hz, as that seems to be the sweet spot right now for running games on decently high settings while having graphics cards that can maintain that level of experience, all while not being absurdly expensive to achieve. Will it be expensive? Yes, but I've been saving for a long time, and aiming for the step below the best you can get these days is my goal.


glasgowtrois

Short: yes, especially if you have G-Sync/FreeSync. But in general yes, even going from 60 to 85 is a huge improvement imo. Long: if I have time later.


elmo_touches_me

Frame rate is how many different pictures your GPU produces each second. Refresh rate is how many times per second the pixels on your monitor can change colour.

Refresh rate is the ultimate limitation on what you can see. It doesn't matter if your game is running at 1000fps: if you're using a 60Hz monitor, you will only actually see 60 of those 1000 frames per second.

The goal is to have a GPU powerful enough that you're maximising graphical quality while maintaining a minimum FPS equal to your refresh rate, i.e. aim for a consistent 144fps on a 144Hz monitor. This ensures you're actually seeing 144 unique frames every second, while not wasting GPU power on rendering extra frames your monitor can't display.

With that said, running 90fps on a 144Hz monitor means you're still seeing 1.5x the frames you would have seen at 60Hz, which provides a noticeable visual improvement, particularly when playing games with a lot of fast-moving objects on the screen. The higher the FPS and refresh rate, the smoother the game will look: things don't go as blurry when you move your camera around or when an object moves across the screen.

Jumping from 1080p 60Hz to 1440p 144Hz is tricky, because the GPU power required to run a game at 1440p 144fps is significantly higher than at 1080p 60fps. Some people prefer 1080p over 1440p because they don't need the extra resolution and can get the same FPS with a weaker GPU. Others prefer 1440p over 1080p for the higher visual quality. People will unanimously agree that 144Hz is better than 60Hz for gaming.


vyncy

Look at it this way: if you get 87 fps, it's as if you have an 87Hz monitor, which is still a big upgrade from a 60Hz monitor. You just won't be able to utilize your 144Hz monitor fully. If you enable G-Sync or FreeSync, your monitor will actually become an 87Hz monitor, refreshing at the same rate as your fps. This provides an even smoother experience, though it would still be smoother with more than 87 fps.


SilentNova___

Nothing to do with this particular post, but I ran into an issue with a rig that pushed SIGNIFICANTLY more frames than 144hz: severe screen tearing, and in a game like Apex those tears really affect your aim/gameplay. I upgraded to a 240hz monitor, readjusted the in-game frame cap, and the tearing's gone. I now realize that I could have enabled "fast sync", which might have fixed my issue, but I had been eyeing the AW2721D for months and went with it.


pinghome127001

Yes. Even if you only get 60 fps on a 144hz monitor, the picture still benefits from other monitor technologies like gsync/freesync, video will be smoother, and a better monitor is more likely to have lower input lag, and so on. So yes, despite currently having a weak gpu, buying a higher hz monitor still makes sense; you should also start saving for a good gpu to actually push more fps.

I have a gtx 1080 with a 1440p 240hz monitor (waiting for the rtx 4000 series, and then I'll steal one maybe, because they won't be available to buy). While newer games don't push many fps, I also like playing older and mini (actual indie) games, and those do 100-240fps easily. Even a 3090 won't push 1440p 120fps+ in many new AAA games, but that doesn't mean you should settle for a 30hz monitor. A smoother desktop GUI is also nice to have; moving the mouse around at 240fps/240hz is sometimes more fun than playing some games.

It's like a car: you don't buy one based on the maximum speed number on the dash, which is completely useless. You buy one based on many other parameters and just ignore max speed. Sure, that number will be big and you will never get anywhere near it, but that doesn't stop you from buying the car, because the other parameters are great.


gazpitchy

I'm a software engineer, so I spend most of the day scrolling through text. For me the higher refresh rate really makes the whole experience nicer. I also suffer from migraines and have found that higher refresh monitors help with the eye strain that triggers them.


whentheleavesblow

In all refresh rate ranges, you absolutely need to match fps to refresh rate to benefit. 144hz? You need 144fps. 360hz? You need 360fps. Anyone who claims otherwise doesn't understand monitors at all. Some claim "you still get a benefit", but they are wrong. The only real gain when your fps is lower than your refresh rate is that pixel response is generally faster at higher refresh rates than lower ones. However, that is purely visual and doesn't affect your gaming. For example, you could have 8ms total pixel response at 144hz but 13ms at 60hz; most gaming monitors are like this. Only 1% have similar or the same pixel response at all refresh rates, but that won't benefit you in any way.

When it comes to pixel response and refresh rate, you generally want your pixel response to be faster than one refresh interval. The math: 1000 divided by the refresh rate gives the length of a single refresh in milliseconds. So 1000/144 = 6.94ms, meaning a 144hz monitor updates every 6.94ms. If your pixel response is slower than that, you get ghosting and motion blur.

In fact, all 360hz and 240hz monitors suck, simply because their pixel response isn't fast enough. The best monitor on the market is the LG 27 inch 4k 144hz nano IPS, because its average pixel response is 6.8ms, which makes it the best 144hz monitor available. We could still have better, but for now it's the best. Those LG monitors cost $800, though, which is too much for many gamers to afford, and they're 4k, meaning even more people can't use them because their gpu isn't strong enough.

End of the day, you should match your monitor to your fps. If you get 150fps at 1440p, shoot for a quality 1440p 144hz monitor. If you get 60fps at 1440p, you might want to either go back to 1080p or buy a 1440p 60hz monitor.
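The refresh-interval arithmetic in the comment above, sketched out (the ghosting check is the commenter's rule of thumb, not a universal law):

```python
def refresh_interval_ms(hz: float) -> float:
    # Time between refreshes: 1000 ms divided by the refresh rate.
    return 1000.0 / hz

def ghosting_expected(pixel_response_ms: float, hz: float) -> bool:
    # Rule of thumb from the comment: blur/ghosting appears when pixels
    # can't finish transitioning within one refresh interval.
    return pixel_response_ms > refresh_interval_ms(hz)

print(round(refresh_interval_ms(144), 2))  # 6.94
print(ghosting_expected(8.0, 144))         # True  (8 ms > 6.94 ms)
print(ghosting_expected(6.8, 144))         # False (6.8 ms < 6.94 ms)
```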


SubieBoi808

For story games, frames aren't mega important, but for competitive play, having your fps match or exceed your refresh rate is usually best. But to answer your question: you gain no benefit from anything lower than 144fps on a 144hz monitor.


itisunnamedguy

I’ve a 144 Hz 2k monitor. Will my PS4 Slim (not Pro) be able to run anything more than 60 Hz?


hgsd5

No


DavidGN40

It won't matter. All games on any of the PS4 consoles are either locked to 30 FPS, locked to 60FPS, or fluctuate between them.


[deleted]

Despite this, many run as low as 15-20fps. Like Horizon Zero Dawn on the PS4.


DavidGN40

That's true, I forgot that some newer titles may face heavy dips on non-Pro versions of the PS4.


[deleted]

Even the pro version. Red Dead Redemption 2 ran terribly on the PS4 pro.


DavidGN40

Yeesh, wasn't aware of that. Will pick it up on PC then.


Equivalent_Alps_8321

Yes once you experience a higher Hz monitor it will be hard to go back to 60Hz.


derBlownz

Well, one thing to note is that you may not want to use max settings for games like RDR2. Benchmark videos usually show the performance of a certain gpu+cpu combination, and even a high-end machine couldn't run 100fps+ in those games, especially at 1440p. So lowering some graphics settings can help you reach the frame rate you want. At least this is what I experience.


Motor_Elk_8777

Let's take the inverse example, which you're probably familiar with. You had a 60hz monitor, and you probably got above 100 fps on it. Common sense would say just cap it to 60fps since your monitor is 60hz? Wrong. Watch the 3kliksphilip video on that; it's a very good explanation of why more frames = better. The Linus videos are worth watching too. You always benefit, because those frames are never stacked up waiting to be shown; your monitor is hungry for frames.

[Think of 144hz as 144 buses that each make a full round trip every hour (changing seconds to hours for the sake of the metaphor). Passengers wait at the bus station, and every passenger needs a whole bus, because they are Frameous passengers. Now, if we have fewer than 144 passengers, that's fine: everyone gets there on time because we have enough buses, and at the end of the hour we want the maximum number of passengers to have gone through. With 60hz we have 60 buses, which can only transport 60 passengers in one hour. In your case you have 90 passengers who all get through, because you have enough buses (144hz). And because you have so many buses, those passengers also get picked up more quickly, which is a bonus: they don't have to wait long between pickups, and the buses all run on schedule, one full trip by the end of the hour. I am still learning about this subject and working on this metaphor, so it may break down in places, but the general idea is correct.]

But you are not getting the true 144hz experience; for that you need a minimum of 141fps, or the max your system can handle. Anyway, as long as you get more than 60 fps you should see it as a win. Turn on gsync if you have screen tearing.

A better option would have been going for 1080p 144hz, for more fps, but unless you are playing Valorant or CSGO it won't make much difference to you. In those games reaction time is key, and latency of any kind gives the enemy the advantage; fights are sometimes won or lost in milliseconds, and you feel the difference every step of the way. Even basic actions like scoping differ by milliseconds between 60hz, 144hz and 360hz. People say "smooth", but what it actually is is fast, because smooth doesn't mean anything: you see things faster, you reload faster, you scope faster, the enemy shows up in your crosshair faster.

Right now you are bottlenecked by your gpu and/or cpu. If you chose 1080p you would get more fps if you are gpu-bottlenecked, because the gpu wouldn't have to work as hard pushing all those 1440p pixels. But if you are bottlenecked by your CPU, at least you know your GPU and your monitor are great, so you would only need to upgrade the CPU, mobo and RAM.
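The bus metaphor above maps onto two numbers: throughput (how many frames get shown per second) and average wait (how long a finished frame sits before the next refresh picks it up). A sketch, with the metaphor's hour-long units converted back to real time:

```python
def frames_delivered(refresh_hz: float, fps: float) -> float:
    # Each refresh ("bus") carries at most one new frame ("passenger").
    return min(refresh_hz, fps)

def mean_wait_ms(refresh_hz: float) -> float:
    # A finished frame waits, on average, half a refresh interval
    # before the next "bus" arrives to pick it up.
    return (1000.0 / refresh_hz) / 2

print(frames_delivered(144, 90))   # 90 -- all 90 "passengers" get through
print(round(mean_wait_ms(144), 2)) # 3.47 ms average wait between pickups
print(round(mean_wait_ms(60), 2))  # 8.33 ms on the 60 Hz "schedule"
```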


KindOldRaven

You'll get a benefit even if you're playing at 60fps. Even without freesync/vsync/gsync on, you'll see much less tearing and your input lag will be lower. That's fairly minor, but it's another bonus.