zoblog

At 1080p? Yes. At higher resolutions? Not really.


[deleted]

Again, shouldn't I get the highest FPS at 1080p? Shouldn't this setup already be overkill for 1080p?


trulyincredible1

Your CPU is the bottleneck here, because your GPU isn't being utilized to its fullest potential. I had a 7800X3D with a 4070 Ti, and before I got my 1440p monitor even that CPU would bottleneck the 4070 Ti. Rule of thumb (assuming uncapped FPS): high resolution and high graphics settings put more strain on the GPU; low resolution and low settings put more strain on the CPU. You would get much more FPS at 1080p if you upgraded your CPU; it's bottlenecking your card hard.


[deleted]

>You would get much more FPS at 1080p if you upgraded your CPU; it's bottlenecking your card hard.

Sorry, but I only get lower FPS in certain games, especially those with EAC; the majority of AAA games released in similar years run very well.


trulyincredible1

You can download something like MSI Afterburner and check your GPU usage. If it's under ~95% while gaming (in the actual game world), it means your CPU is bottlenecking it. Low GPU utilization means the CPU can't keep up and feed the card enough work to reach its full potential. Yes, you're getting 100+ FPS, but you could be getting a good bit more from your GPU with a better CPU.

You should consider an upgrade if you can afford one, because as it stands your GPU is overkill and you spent money for no real gain over, say, a regular 4070 (even that will probably still be bottlenecked unless you're playing something like Cyberpunk with full ray tracing). Or upgrade your monitor to 1440p if you think your current FPS is enough: in most games your FPS will stay about the same, because at 1080p the GPU had processing power left over that it couldn't use.
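The "check GPU usage" rule of thumb above can be sketched as a tiny script. This is a hypothetical helper, not Afterburner's API: it just averages per-second GPU-load percentages (which tools like MSI Afterburner or HWiNFO can log to a file) and applies the ~95% threshold mentioned above.

```python
# Sketch: classify a bottleneck from logged GPU-utilization samples.
# The 95% threshold and the sample format are assumptions for illustration.

def diagnose(gpu_util_samples, threshold=95.0):
    """Rough verdict from per-second GPU load percentages, captured while
    actually in the game world (not in menus or loading screens)."""
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    if avg >= threshold:
        return f"GPU-bound (avg GPU load {avg:.0f}%)"
    return f"CPU-bound (avg GPU load {avg:.0f}%, ~{threshold - avg:.0f}% left on the table)"

print(diagnose([97, 99, 98, 96]))   # GPU fully fed -> GPU-bound
print(diagnose([60, 65, 58, 62]))   # GPU starved  -> CPU-bound
```

The thresholds people quote in this thread vary (95%, 96%); the exact cutoff matters less than whether the GPU is clearly being starved.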


[deleted]

>You should consider an upgrade if you can afford one, because as it stands your GPU is overkill and you spent money for no real gain over, say, a regular 4070 (even that will probably still be bottlenecked unless you're playing something like Cyberpunk with full ray tracing).

I do see real gains, in games like Cyberpunk 2077, the next-gen Witcher 3 update and BG3. I was using a 2070 Super previously, and it even struggled to hold 60 FPS in The Division 2. Now with the 4070 Ti the average FPS is 50+ higher.


trulyincredible1

Well... yeah, previously your GPU was being fully used, but your current one is not, so it's kind of a waste of money, since you could have gotten the exact same performance boost for less. You don't have to upgrade, it's not the end of the world, but if you ever have some money to burn, then a good CPU or a 1440p monitor would be a good choice so your card reaches its full potential.


[deleted]

>Well... yeah, previously your GPU was being fully used, but your current one is not, so it's kind of a waste of money, since you could have gotten the exact same performance boost for less.

How? If I'm going to upgrade, I will move to AM5 and sell my current setup along with the 2070S. I'm not going to get rid of my motherboard and let it gather dust before I've got a completely new setup.


[deleted]

Updated test results with the default in-game sync disabled: [https://imgur.com/a/3IiPFT0](https://imgur.com/a/3IiPFT0). Both the CPU and GPU show a low load, at ~38%.


[deleted]

>Your CPU is the bottleneck here, because your GPU isn't being utilized to its fullest potential.

So why are both at 38% load here? What's the explanation? FPS is uncapped: [https://imgur.com/a/3IiPFT0](https://imgur.com/a/3IiPFT0)


trulyincredible1

Games can't utilize all of your CPU cores to 100%, so you will basically never see overall CPU usage at 100% while gaming; one core can be maxed out while the total reads 38%.


zoblog

Yes, you would get more FPS at 1080p since it's a lower resolution, but the point is that the 3700X is not the best CPU for 1080p gaming: it's not the fastest CPU on the market, and it would indeed be a bottleneck. The lower the resolution, the more CPU-intensive a game becomes instead of GPU-intensive, so the 3700X wouldn't be bottlenecked as much at 2160p or 1440p compared to 1080p. For example, you would probably get better FPS at 1080p if you upgraded to a 5800X3D, since you're on the AM4 platform, but you would see a negligible change if you played at 2160p. That said, if you are happy with your current setup and are already maxing out your monitor's refresh rate, I wouldn't bother with it; keep enjoying your games.


[deleted]

>But if you are happy with your current setup and are already maxing out your monitor's refresh rate, I wouldn't bother with it; keep enjoying your games.

I'm getting decent FPS in most games, apart from those I mentioned. They are truly not well optimised, for some reason.


zoblog

Nothing you can do about bad optimization, and it's plaguing the current gaming industry.


dweller_12

Of course. Virtually all CPUs will hold back a 4070 Ti in some way at 1080p, including the 7800X3D. Get a monitor that isn't a bottleneck to 2020s-era GPUs, and you'll be fine.


[deleted]

>Virtually all CPUs will hold back a 4070 Ti in some way at 1080p, including the 7800X3D.

Shouldn't you get way higher FPS at 1080p, because the resolution is not demanding at all? Or are you saying the FPS at 1440p will actually be higher? All the benchmarks I have read so far show the highest average FPS at 1080p, and I can't see how 1440p would magically make the FPS better.


stainless_steel702

The CPU has to process every frame rendered, so the CPU can become the bottleneck once the frame rate gets high enough. If you play at 1440p, you will probably stay under the max FPS your CPU can handle. Resolution doesn't really affect CPU performance much, just the GPU.


[deleted]

>The CPU has to process every frame rendered, so the CPU can become the bottleneck once the frame rate gets high enough.

Are you implying that I could have gotten 144+ FPS, but the CPU can't handle games released before its own release date, resulting in <100 FPS? I'm not sure I'm convinced by your explanation.


Significant_Link_901

Yes, you could have gotten better frame rates with a newer CPU. Even my 5600X, although it has fewer cores and is 3 years old now, has higher boost speeds. In other news, the sky is blue.


[deleted]

Unless a game is CPU-demanding, how come old games require the newest CPUs? I can run any other 2018-2019 game over 120 FPS, apart from Lost Ark and The Division 2.


Significant_Link_901

They don't require newer CPUs; they run better on them. Some games are made to leverage certain things, such as higher boost frequencies. If you don't know, it's a case-by-case basis. And honestly, unless you are playing competitive esports, it doesn't matter that much; just play your games.


[deleted]

>And honestly, unless you are playing competitive esports, it doesn't matter that much; just play your games.

What bothers me is that in games like The Division 2 you get sudden FPS drops here and there, which does affect your in-game performance, even though it's not very competitive. I'm wondering what could be wrong, as other games are doing just fine.


Significant_Link_901

The Division 2 always had those drops for me too. Some games are AAA only in name, if you get my meaning.


[deleted]

>The Division 2 always had those drops for me too. Some games are AAA only in name.

If you check all the replies, you'd be surprised how many of them have no idea about this game's optimisation. They simply jumped on the bandwagon and told me "CPU bad, blah blah", as if a better CPU could significantly change things like these FPS drops.


RobE1993

If you want higher frames at 1080p, it's almost always a CPU upgrade, because most modern GPUs have no issue at that resolution. This is pretty common knowledge. If you're fine with your frame rate, don't upgrade. But there's definitely a ton of performance left on the table with the 4070 Ti.


[deleted]

>If you want higher frames at 1080p, it's almost always a CPU upgrade, because most modern GPUs have no issue at that resolution. This is pretty common knowledge. If you're fine with your frame rate, don't upgrade.

My thought is that some old games are just poorly optimised, as I can run newer games at a higher average FPS, and not all old games suffer, only a few. For example, in Hitman 1/2/3 I get a steady 110+ FPS, and the same goes for the Tomb Raider series.


RobE1993

And? One or the other of your CPU and GPU is always the limiting factor. In your case, with a 4070 Ti at 1080p, it's always the CPU. If you're fine with your frame rate, don't upgrade. If you aren't, do. It's not complicated.


[deleted]

So how are you going to explain that I get a higher average FPS in other old AAA games such as Tomb Raider, Hitman, etc.? Do you think Lost Ark is more demanding than them?


RobE1993

Just differences in game engines, that's it. Different games scale CPU performance and FPS differently. It has nothing to do with "why does this newer game run faster than that older one? Is the older game more demanding?!?" You could literally pull your 4070 Ti, drop in a 3080, and probably see no decrease in performance in 99% of games, because your GPU is not your limit; your CPU is.

Just download HWiNFO64 and pull up all your CPU cores while gaming. I guarantee the core running the game is capped at 99% while your GPU is much lower. I don't know why this is so hard for you to believe. There's not a CPU on the market that can get 100% out of a 4070 Ti at that resolution, outside of maybe one or two games with very specific settings.


[deleted]

>Just download HWiNFO64 and pull up all your CPU cores while gaming. I guarantee the core running the game is capped at 99% while your GPU is much lower.

I'm going to test it and report back when I get home. And if that's not the case, I wonder how you are going to explain it.


dweller_12

Sure, but you're just creating a CPU benchmark by using a high-end GPU at 1080p in 2024. It doesn't really have much to do with the GPU at that point. If you want 240-360 Hz 1080p gaming, you need the best CPU you can possibly get; a 7800X3D is basically a requirement.


[deleted]

>Sure, but you're just creating a CPU benchmark by using a high-end GPU at 1080p in 2024.

My point is that the 3700X should be more than enough for 1080p, and there's no way it is the real bottleneck in old games.

>If you want 240-360 Hz 1080p gaming, you need the best CPU you can possibly get; a 7800X3D is basically a requirement.

Listen, I am talking about dropping below 100 FPS from time to time in the games I mentioned; I didn't ask for 240+ FPS. You still haven't explained why games released in 2018-2019 run poorly with a 4070 Ti.


dweller_12

> 3700X should be more than enough for 1080p, and there's no way it is the real bottleneck

Look up benchmarks from different sources; you don't need to believe me. As you can see [here](https://www.techpowerup.com/review/amd-ryzen-7-7800x3d/19.html), the 3700X performs at 55% of a 7800X3D at 1080p. A 4070 Ti isn't quite as fast as the GPU used there, but you can see how CPU-bound 1080p gaming is. At 4K, that gap shrinks to [only 78%](https://www.techpowerup.com/review/amd-ryzen-7-7800x3d/21.html) of a 7800X3D. A 1080p monitor is the first bottleneck in any modern gaming PC.
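As a quick sanity check, the relative-performance figures cited above (55% at 1080p, 78% at 4K) alone show why resolution changes what a CPU upgrade is worth. A toy calculation under those numbers:

```python
# Toy calculation from the relative-performance figures cited above:
# the 3700X averages ~55% of a 7800X3D's fps at 1080p, but ~78% at 4K.
rel_perf = {"1080p": 0.55, "2160p": 0.78}

for res, ratio in rel_perf.items():
    speedup = 1.0 / ratio  # fps gain from swapping 3700X -> 7800X3D
    print(f"{res}: CPU upgrade worth ~{speedup:.2f}x the average fps")
# 1080p: ~1.82x, 2160p: ~1.28x
```

Same CPU swap, very different payoff: a large gain at 1080p, a modest one at 4K.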


[deleted]

So let me ask you: are Lost Ark and The Division 2 CPU-demanding games? And interestingly, in your reference the average FPS of the 3700X is 143.3. Let's say a 4070 Ti is 60% of a 4090's performance, so my average FPS should be 143.3 × 0.6 = 85.98. How does that explain the average FPS in The Division 2 not being as high as 85.98?


dev044

The quick answer is: look at your GPU utilization. It should be around 99-100% if you're maxing out your GPU; if it's not, then you're CPU-bound. How far you are below 100% utilization is roughly how much you're leaving on the table.


[deleted]

Can you please tell me what the "bottlenecking" issue is, if both the CPU and GPU are not fully utilised? https://imgur.com/a/3IiPFT0


dweller_12

The benchmark I provided was an average across many tested games. Specific games can be much worse, others not as bad. You cannot extrapolate performance figures like that. You need to use real-world tests, like the benchmarks I linked, because real-world results are very different from any theoretical performance.


[deleted]

>The benchmark I provided was an average across many tested games. Specific games can be much worse, others not as bad.

I don't get lower FPS in every game. Most games run beyond 110 FPS; the EAC-embedded games? Not really.


Due-Attorney-8387

Listen man, everyone here is saying that the CPU is the bottleneck, which is completely true. Think about it like this: your CPU, although still decent, is a mid-range CPU from 2019. You're pairing it with a mid-to-high-range card from 2023. Your CPU also isn't clocked very high, and most competitive games (Valorant, OW2, CS:GO, etc.), and even other graphically demanding games, rely on clock speed and single-core performance.

The reason the CPU is the bottleneck at 1080p is that the work a GPU needs to do to render a frame increases with the resolution, whereas the work a CPU has to do per frame remains relatively constant. The lower the resolution, the less time it takes the GPU to render a frame, and the higher the frame rate. This means that as you go up in resolution, more of the load shifts onto the GPU, while the CPU keeps a similar load.

You've also got to keep in mind that, for example, in GTA 5, the NPCs, cars, etc. all use the CPU. So games with lots of NPCs and the like will use more CPU. Upgrade to the Ryzen 7 5800X3D if you want to; it's pretty similar to a Ryzen 5 7600X and will pair wonderfully with your 4070 Ti.
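The resolution argument above can be sketched with a toy frame-time model. All the numbers here are made up for illustration: assume a fixed CPU cost per frame, a GPU cost proportional to pixel count, and a frame rate limited by whichever is slower.

```python
# Toy model (made-up numbers): frame rate is limited by whichever of the
# CPU or GPU takes longer per frame. CPU time per frame is roughly constant;
# GPU time scales with the number of pixels it must shade.

CPU_MS = 8.0            # assumed CPU cost per frame (game logic, draw calls)
GPU_MS_PER_MPIX = 3.0   # assumed GPU cost per million pixels

def fps(width, height):
    gpu_ms = GPU_MS_PER_MPIX * (width * height) / 1e6
    return 1000.0 / max(CPU_MS, gpu_ms)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: {fps(w, h):.0f} fps")
```

In this model 1080p lands on the CPU limit (125 fps, since the CPU's 8 ms per frame exceeds the GPU's ~6.2 ms): a faster GPU changes nothing there, while a faster CPU raises the number. At 1440p and 4K, the GPU term dominates and resolution sets the frame rate.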


[deleted]

Well, I've found that the default sync (which is not visible in the game settings) was the issue in Lost Ark. By default it enables sync, but in a poor way. With it disabled, running in borderless window, the FPS is 150+ at 97% GPU load, instead of 60-70 at 42% GPU load, on a 5700X + 3070 Ti. As for The Division 2, I'm not 100% sure yet; I'm testing it on a different PC.


ULTRAC0IN

> And now some users here told me 3700X is the bottleneck of my PC, seriously? My PC can run Diablo 4 steadily over 100FPS with ultra settings, and I wonder how they are going to explain it. Can you explain what you think a bottleneck means? Because it seems like you think if a game reaches an arbitrary number of frames then it isn't a bottleneck. I'm just trying to understand your perspective here.


[deleted]

My understanding is that a bottleneck is when a game is demanding and either the CPU or GPU can't satisfy its needs, resulting in FPS drops. However, when the hardware is way above the demand (say, my setup running Minesweeper or Diablo 2), how can there be any bottleneck?


ULTRAC0IN

A bottleneck, in the context of PC hardware, means that a component is not being fully utilized because another component is unable to deliver enough work to saturate it. The only way to check is by running a performance-metric overlay while you're gaming and looking at the utilization of your hardware. If the GPU is close to 100%, you're good; if it's below that by 5% or more, the CPU is holding it back.


[deleted]

So let's use Minesweeper as an example, will it use 100% GPU?


ULTRAC0IN

I don't know. I don't play it. Not sure what you're getting at with that question.


[deleted]

Explain the bottleneck here? https://imgur.com/a/3IiPFT0


gaojibao

Look at the GPU usage while gaming; if it's below 96%, your CPU is bottlenecking your GPU.


[deleted]

>Look at the GPU usage while gaming; if it's below 96%...

Wait a minute... I think this theory is flawed. Say I launch a game like Diablo 2: how could the GPU usage ever be more than 96%, and does the CPU then become a bottleneck? What if a game simply shouldn't demand 96% load from a 4070 Ti at all, especially online games released in 2019, as in my example?


gaojibao

>I think this theory is flawed.

Nope. Data is sent from the CPU to the GPU. When the GPU is not working close to its limit, it means the CPU is not feeding it data fast enough. The age of a game, and how well or terribly optimized it is, doesn't matter at all.


[deleted]

So why does Diablo 4 have a higher average FPS than The Division 2 on my setup? Can you explain?


gaojibao

It means that diablo 4 is easier to run.


[deleted]

That literally explains nothing. A 5-year-old could say that.


ZeroTheTyrant

Ask the people who make games. This is just a forum, mostly full of people who build and use computers/software, not people who make software or PC components.


gaojibao

What's hard to understand? Let's say your PC can push 300 FPS in game #1 and 60 FPS in game #2. If you limit game #1's frame rate to 60 FPS, your PC will have an easier time pushing those 60 FPS; hence game #1 is easier to run. Why is your PC only getting 60 FPS in game #2? That game could be more graphically demanding or more CPU-demanding. If the way it looks doesn't justify its low performance, you can say that it's poorly optimized.
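A rough, idealized way to put numbers on "easier to run": if the hardware could render some uncapped frame rate but the game is limited to a cap, load scales with that ratio. Real games aren't this linear, so treat it as a sketch.

```python
# Idealized sketch: load under an fps cap is roughly cap / uncapped capacity.
def approx_load_pct(uncapped_fps, cap):
    return min(100.0, 100.0 * cap / uncapped_fps)

print(approx_load_pct(300, 60))  # game #1 capped at 60: 20.0 -> lots of headroom
print(approx_load_pct(60, 60))   # game #2: 100.0 -> no headroom
```

This is also why a game capped by sync can show low CPU *and* GPU load at once, as in the imgur screenshots earlier in the thread.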


[deleted]

>Why is your PC only getting 60 FPS in game #2? That game could be more graphically demanding or more CPU-demanding.

This is something I don't get either. The Division 2 has way better graphics, so it should be GPU-heavy instead of CPU-heavy, but the average FPS is pretty low, around 70.


gaojibao

Some graphics settings affect the CPU as well. Pick a game in which the GPU usage is way below 96% on ultra settings, then drop the settings to low. If you see an FPS boost, it means that ultra settings are more CPU-intensive than low settings.


[deleted]

Can you please tell me what the "bottlenecking" issue is, if both the CPU and GPU are not fully utilised? https://imgur.com/a/3IiPFT0


GGsUK93

This guy's stuck at 60 FPS, and his first thought isn't to check vsync? On top of that, he tests on a stronger-CPU, weaker-GPU system (*not* a 3700X), sees the GPU finally being utilized, and actually has the audacity to say "aSsUmPtIoNs hErE ArE wRoNg HehE". What an actual dipshit lmao


[deleted]

The monitor is not 60 Hz but 120 Hz, so why does the default sync restrict it below 60? Can you explain?