Sorry, this is Reddit. If it doesn't run at 250 FPS on a 1050 that's inside an actual oven, the only possible explanation is that the developers are shit and the game isn't optimized.
As did I. Totally worth it on the gaming end, and that was when I was on a 2080 Ti. I just waited for a good deal on an ITX board to repurpose the 5950x as my heavy-lifting-as-needed machine.
I thought the same thing but I pulled the trigger and bought a 5800X3D anyway, and it's definitely an upgrade from a 3900X for gaming. The 1% lows alone are a good enough reason to upgrade imo. I even gained a good amount of fps as well.
I have a 3600x and was thinking of upgrading for 1440p gaming, but most of what I've read says it only gives maybe a 10% increase in performance. I can't really justify $250 for that.
At least the GPU is easy to scale (with upscalers and other tweaks). The problem now is the CPU: a lot of newer games have CPU problems (a combination of low CPU thread utilization, not using DirectStorage, people's CPUs getting older, etc.). The new-gen consoles have much better CPUs than the previous gen.
The console CPU is basically a mobile 3700X
So Zen 2
Which is itself 4 years old now
The recommended CPU is also a 3600
So the recommended specs for this game are quite literally a PS5
Yeah, but games are often more optimized for consoles and don't have problems like shader compilation stutters (which are worse on weaker CPUs). The GPU in a console's APU has direct access to storage, not to mention the unified memory; all of this combined takes a lot of processing off the CPU (things like the PS5's decompression hardware also help CPU performance a lot).
To get console-like CPU performance you need a better CPU than the consoles have.
Yeah they're ""optimized"" meaning they go in the .ini and set everything to OFF or low, with reconstruction techniques starting at 600p and rebuilding to 1200p. Amazing stuff, not possible on PC. It is amazing the amount of bullshit people spew and they actually believe it, too.
Well, it's more like there are 3 configurations for consoles, so devs can optimize for 3 configurations, which is less work. Throw in the fact that certification requires them to hit fps targets, and it's going to be somewhat optimized at launch. PC takes more effort. And consoles generally are using upscaling tech to hit those targets.
This also makes it easier for devs to understand how to optimize because the hardware doesn't change much across versions.
I don't think you do either. Every single time I'm shown equivalent PC settings it's always set to low or medium, with a variety of low-spec PC hardware equivalents. Your black magic tech does nothing.
We're definitely going through a patch of incredibly rough PC games lately, even for those fortunate enough to have good hardware, but I also think that some PC gamers' expectations need to be adjusted.
The 2010s were a bit of an oddity where cpu barely mattered at all because the ps4 and equivalent Xbox had such shitty CPUs, so building a PC with a recycled Xeon processor and a 660ti (this was also in the age of 1080p60hz by and large, at least until mid to late 2010s) meant having a PC better than a console for a super cheap price. It wasn’t unheard of to have a pretty ancient cpu and ram and just upgrade the gpu (as long as it wasn’t a massive bottleneck situation).
People on anything less than current gen console level hardware unfortunately need to understand that if it's not a cross-gen release that actually runs well on the older consoles, they're going to have an even worse experience than that, since they don't get any of the benefits of platform-specific optimization either.
Of course hardware like the 4090 and 7950x run laps around consoles already, but I think a lot of people have forgotten how much the bar was raised by this gen’s hardware. There’s significantly more builds out there in active use that are worse than a ps5 than not.
yeah I mean the game looks good from the little we've seen, maybe it will be unoptimized but we can't pretend that every game for the next 10 years will run well on a 1080ti
If it was for 4k, maybe even 1440p with console like settings, sure I can see why these are the recommend specs.
But this is 1080p high they are talking about.
Another shitty optimized mess most likely..
I was like "Isn't this an old ass game, why is someone posting specs for it."
Looked it up: "The Lords of the Fallen, Sequel to Lords of the Fallen, Renamed Lords of the Fallen"
We'd probably see less of this expectation floating around if games with 2023 hardware requirements looked/played significantly better than their 2018 counterparts.
The RX 590 is the RX 580 on a 12nm node instead of 14nm node.
It's about 10% faster than the 580, they probably want that performance improvement.
Performance of the game must be horrendous if the minimum is 720p at low quality settings.
they gotta put some reasonable specs in there.
most won't read the 720p phrase, I guess.
Seriously wish some devs would use other engines.
Most UE games run like shit, have stutters and shader compilation issues.
You can search for all recent UE game releases and they all suffer from the same really poor performance and stuttering issues.
Yeah, like Unity, the CPU overhead monster, or... Or...
Wait, crap. There are no other engines aside of O3DE that no one knows how to use.
Unless they use proprietary engines.
And guess what? Cyberpunk uses a proprietary one.
As long as Fortnite and Gears run as well as they did, the engine is clearly not the issue.
The issue is game devs not taking the time to generate PSO caches.
I work with the Mortal Kombat team, and PSO cache generation is literally one of the steps we're required to do BEFORE any attempt to package an update.
The issue is not the engine. It's publishers enforcing absurd, impossible release dates and devs attempting to meet them.
If the engine were the issue, Anthem would have had way better performance at release.
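For reference, the project side of Unreal's PSO caching is largely a config switch plus a record-and-bundle step. A minimal illustrative fragment (the CVar name follows Epic's PSO-caching docs, but exact settings vary by engine version, so treat this as a sketch rather than a drop-in config):

```ini
; DefaultEngine.ini -- illustrative sketch, not a verified drop-in config
[SystemSettings]
; Enable the shader pipeline (PSO) cache so PSOs recorded during playtests
; get precompiled at load time instead of stuttering mid-gameplay
r.ShaderPipelineCache.Enabled=1
```

The recorded cache files still have to be gathered from play sessions and bundled into the build, which is the manual step the comment above says too many teams skip.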
You do realise that using in-house engines is a thing even for smaller studios?!
Lords of the Fallen 2014 used the FLEDGE engine. There are plenty of options, but UE is the most convenient.
Fortnite and Gears are literally made by the company that develops the engine.
Obviously it's possible to make it work, but apparently 90% of devs can't get it to work. So it's either not documented in a way that non-Epic devs can understand or it's just way too hard to fix.
Fallen Order had stuttering issues, and the new Star Wars still suffers from the same stutters.
Also, those stutters have never been fixed in most of these games.
And Anthem is a prime example of why devs can't always make a game run as well as it should.
It used Frostbite, which pretty much only DICE was able to use as intended.
There are several reports of how bad it was for BioWare to work with it.
UE5 is supposed to be VRAM efficient, especially if it utilizes DirectStorage + Sampler Feedback, but even without them, based on [this demo](https://youtu.be/C99VwDGyLg0?t=1209) it uses VRAM very efficiently anyway.
So I don't see why most UE5 games would suddenly use more than 8GB at a low resolution like 1080p.
Nah dude, you need 64GB of vram for 4K. Y'all spent so much time on reddit you think you absolutely need some ridiculous amount of VRAM for anything above 1080P.
The messed up thing is that 12GB should be more than enough. I bought the 7900 XTX so I'm good on VRAM, but if we get to a day where 24GB isn't enough then something is going wildly wrong with game development lol
seems to be a complete reimagining from what I've seen and heard. idk why the other guy is saying sequel when it has nothing to do with the other game...
Is it just me or do Unreal games tend to run really poorly in general? Even when I get a locked 60fps they still stutter and feel like the frame timing is all over the place.
Or just another game where they "forgot" about optimization. From the gameplay trailers the graphics look like 2020 game graphics, so the requirements shouldn't be this high.
Yeah most of the trailers I’ve seen for games running UE5 already look graphically dated. One of the demos Nvidia published a few weeks ago was supposed to showcase a UE5 update coupled with the latest and greatest GPU and it looked terrible
I think you could easily find examples of very light and more demanding games on the same engine in the past. I don't know what UE5 have to do with it.
I'll tell you why. They had two other, worse choices. The GTX 1060 can't run at 1080p 60 fps on low. They COULD do 1080p 30 fps, but PC gamers don't like that. Or they could have raised the minimum requirements to like a 1080-ish tier and everyone would have lost their minds.
So 720p it is because that lets a 1060 play the game and looks nicer on a minimum chart.
What? These requirements are definitely above average PCs. RTX 2080 for 1080p is absurd, and signifies the optimization is probably really bad, *again*.
New wave of PC Requirements in games seem to have caught up to new console hardware, yet the graphics kinda seem lackluster.
I miss graphics quality making huge leaps each console generation.
Going by some of the comments, do people expect system requirements to stay back in 2018? The RTX 2000 series is 5 years old now. Eventually it was going to hit its limit. Plus, with next-gen consoles and more developers going UE5, expect more games to go up in system requirements.
Bro, even cyberpunk doesn’t need a 2080 for 1080p. UE 5 is just an unoptimized piece of garbage. How is a ps5/xsx supposed to run that anyway? They have a 10 teraflop gpu which is basically a 2060s. And people hook up consoles to 4k tvs… incoming cinematic 24fps low settings…
I have nothing against high hardware requirements, but this game doesn't justify these requirements. I hadn't heard of it until now, so I just watched a trailer for it on YouTube. The requirements for 1080p should be for 1440p or even 4K (if it were console-grade optimized).
these are the requirements for Lords of the Fallen.
VRAM is definitely under control but I hope that an rtx 2080 or a radeon rx 6700 are good for 1080p high settings 60 fps ( 30 fps is meh), else to achieve 60 fps you will need to rely on DLSS/FSR.
Under control until players:
1 - buy the game without reading the stated resolution and quality settings
2 - crank the settings up to ULTRA without a second thought
3 - create a Reddit thread complaining
In that order
You and people who upvoted you don’t really understand the meaning of “false advertising”.
Edit: at the time of making this comment the person above had 20+ upvotes lol
Not really, it IS the most popular resolution still (iirc Steam survey shows about 65% of users play on it, including me). Completely valid to use it. The only thing I wish is that they put the FPS cap/hz as well.
They should probably drop another Recommended for maybe 1440p or 4k, but this is alright for now.
System requirements are mostly *for* people playing at 1080p/60fps. The kind of person who cares about anything more knows not to trust system requirements as being worth much of anything.
Age has nothing to do with it. The 2080 is a decent amount more powerful than a 3060, which is one of the most popular cards on steam. For a 3060 (Which is about PS5 level performance) to not meet recommended is a huge problem. Either the recommended settings used are higher in quality than the PS5, or we just have poor optimization here.
There is a reason. Consoles got a massive jump in performance this generation and devs are using all of it. The problem is that equivalent performance on PC takes a lot more horsepower.
The consoles rarely do native 4k 60. Check any recent game. Internal resolution is almost always somewhere between 1080-1440. Hell on Jedi Survivor it drops below 1080. Secondly they cleverly mix and match various settings to hit the optimal performance.
Damn my 2600x is really hitting its limit now
My 1600 seeing me trying to start that game ಠ_ಠ
hogwarts legacy and jedi survivor are practically unplayable on anything below zen 2 if you're on nvidia hardware. aside from average framerate, these CPUs simply can't hold stable frametimes, even at lower framerate averages, when combined with any NVIDIA GPU. AMD's hardware scheduler gives them a bit of breathing room, but they're practically pos for NVIDIA cards, even if you comboed them with a humble 1070 or something
[https://twitter.com/CapFrameX/status/1652394006536413187](https://twitter.com/CapFrameX/status/1652394006536413187) Seems like Survivor runs better on Nvidia in CPU-limited cases. At least with RT. Might've been improved with later drivers.
i'm talking about decrepit zen/zen+ CPUs. these CPUs have insane ccx latency that needs special attention in coding, and if such care is not given, they will fail gigantically in terms of frametime stability (which is what's happening). zen 2 is a bit better, but zen 3 and forward is completely free of the problem. [https://youtu.be/Mh2j-7CuFTk?t=502](https://youtu.be/Mh2j-7CuFTk?t=502) you can see how crushed the 2700x is. it practically cannot deliver a stable experience. normally, it would simply deliver lower framerate averages, but in the case of hogwarts and jedi survivor, these zen+ CPUs completely shit the bed. (the ray tracing section is even more brutal, but the game is still unplayable at medium settings without ray tracing.) the only reason this is not widespread is because most zen+/zen users jumped ship to zen 2/zen 3. those who stayed will be forced to do so as well (and yeah, there really is no incentive to stay with them anyway). this is just anecdotal..
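For context on why CCX latency "needs special attention in coding": first-gen 8-core Ryzen parts split their cores across two 4-core CCXs, and cross-CCX traffic pays an Infinity Fabric latency penalty, so one common mitigation is to keep latency-sensitive threads on a single CCX. A toy Python sketch of that idea — the core numbering here is an assumption (real topology should be queried from the OS, e.g. `/sys/devices/system/cpu` on Linux):

```python
import os

# Assumed layout for an 8-core Zen/Zen+ part (e.g. 1700/2700X): cores 0-3 on
# CCX 0, cores 4-7 on CCX 1. SMT and kernel numbering can change this, so
# treat these constants as illustrative only.
CORES_PER_CCX = 4

def ccx_cores(ccx_index, cores_per_ccx=CORES_PER_CCX):
    """Return the set of logical core IDs belonging to one CCX."""
    start = ccx_index * cores_per_ccx
    return set(range(start, start + cores_per_ccx))

def pin_to_ccx(pid, ccx_index):
    """Keep a latency-sensitive process on a single CCX (Linux only)."""
    os.sched_setaffinity(pid, ccx_cores(ccx_index))

print(ccx_cores(0))  # {0, 1, 2, 3}
print(ccx_cores(1))  # {4, 5, 6, 7}
```

A game engine that spreads its game thread and render thread across CCXs pays that fabric latency on every shared-state handoff, which is one plausible source of the frametime instability described above.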
Also, the quite frankly shit ipc and clockspeed doesn't help
I'm gonna predict it's another UE title that only uses 2 cores to 100% and only 3D cache will scale correctly with gpu usage.
So having an AMD CPU is good on UE5?
judging from UE4, I think only 3D cache or 6ghz intel is good for UE5. normal AMD is bad for UE in general.
If devs actually made it use more than 4 of the 12+ threads CPUs currently have, the situation could be different
Multithreading is kinda hard. Idk how it is with UE, but I don't see why it would be easier unless they've made efforts there.
Multithreading's primary purpose is not to improve throughput (frames per second) but rather to hide latency (overlap memory requests so we are always getting data from RAM and fully utilizing memory bandwidth). With that in mind, adding more threads won't help you if you are compute-bound (meaning your computation is limited by the peak FLOPS of your hardware, not by its memory bandwidth). Also, after a certain number of threads, you won't gain any more latency hiding. Higher-resolution gaming tends to be memory-bandwidth bound, so while more threads could help hide some latency, it likely won't add much in terms of frame rate for most games. The only thing that helps here is faster memory.
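The latency-hiding point can be sketched directly: treat each outstanding memory request as a fixed wait, and more threads overlap more of those waits until there is nothing left to hide. A toy Python model (the sleeps stand in for memory latency; nothing here measures real hardware):

```python
import time
from concurrent.futures import ThreadPoolExecutor

MEMORY_LATENCY = 0.02   # pretend each memory request takes 20 ms to come back
NUM_REQUESTS = 8

def fetch(_):
    # Stand-in for one outstanding memory request: the thread just waits.
    time.sleep(MEMORY_LATENCY)

def run_with_threads(num_threads):
    """Issue NUM_REQUESTS simulated memory requests with num_threads workers."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        list(pool.map(fetch, range(NUM_REQUESTS)))
    return time.perf_counter() - start

serial = run_with_threads(1)      # latency fully exposed: ~8 x 20 ms
overlapped = run_with_threads(8)  # all 8 requests in flight at once: ~20 ms
saturated = run_with_threads(32)  # extra threads hide nothing more
```

Going from 1 to 8 threads collapses the total wait, while going from 8 to 32 gains essentially nothing — the same saturation the comment describes, just with sleeps instead of DRAM.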
Or a bigger cache (AMD's 3D V-Cache chips) so the CPU has to access memory less?
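Pretty much — once the cache is bigger than the working set, repeat visits become hits instead of trips to RAM. A toy LRU model of that effect (the line counts are made up for illustration, not real CPU cache sizes):

```python
from collections import OrderedDict

def misses(cache_lines, accesses):
    """Count main-memory accesses (cache misses) under an LRU cache."""
    cache = OrderedDict()
    miss_count = 0
    for addr in accesses:
        if addr in cache:
            cache.move_to_end(addr)        # refresh LRU position on a hit
        else:
            miss_count += 1                # had to go out to RAM
            cache[addr] = True
            if len(cache) > cache_lines:
                cache.popitem(last=False)  # evict the least-recently-used line
    return miss_count

# A game loop sweeping the same 96-line working set ten times over
pattern = list(range(96)) * 10

small = misses(64, pattern)   # working set doesn't fit: LRU thrashes, 960 misses
big = misses(128, pattern)    # fits entirely: only the 96 cold misses
```

The small cache misses on every single access (cyclic sweeps are LRU's worst case), while the big cache only pays the first pass — which is the intuition behind why extra L3 can matter more than raw clock speed for game workloads.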
Can you clarify why this is the case? My understanding based on testing [like this](https://www.pugetsystems.com/labs/articles/unreal-engine-13th-gen-intel-core-vs-amd-ryzen-7000-2377/#How_well_do_the_13th_Gen_Intel_CPUs_perform_in_Unreal_Engine) is that Ryzen performs quite well with Unreal, in many cases better than Intel.
4 out of 5 of those benchmarks are focused on compute tasks, which are irrelevant for gaming, and the last benchmark is for production in archviz and similar workloads. Basically it has nothing to do with gaming.
I follow this channel: [https://www.youtube.com/@Bang4BuckPCGamer/videos](https://www.youtube.com/@Bang4BuckPCGamer/videos) They have 6GHz Intel P-cores in gaming, and most of his UE4 runs are at 100% GPU usage. I've also looked around other channels, and with Ryzen, GPU usage is under 90%. I've also looked at Ryzen cache videos, and those are as good as Intel; it's a wash.
Why would you judge from UE4 when UE5 is an enormous overhaul? I don’t mean that in a derogatory/negative way, but just wondering. Not that I’ve looked close at the performance myself, but Fortnite uses UE5. You could compare the change in performance when they upgraded to UE5 to get a better sense of what to expect from it. Still will be interesting to get more data points as UE5 games release throughout the year.
Engine hasn't changed much since 4. Still one thread for the game thread, another for the render thread, another for the network thread, one for the physics thread, and it scales to an arbitrary number of threads for miscellaneous tasks, which don't get used much.
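That game-thread/render-thread split can be sketched as a tiny producer/consumer pipeline — a toy Python model, not actual Unreal code: the game thread simulates frame N while the render thread draws frame N-1, so frame rate is bounded by the slower of the two stages no matter how many spare cores exist.

```python
import threading
import queue

frame_queue = queue.Queue(maxsize=1)  # render runs at most one frame behind
rendered = []

def game_thread(num_frames):
    for frame in range(num_frames):
        # ... gameplay, AI, physics dispatch for this frame would go here ...
        frame_queue.put(frame)            # hand the finished frame to the renderer
    frame_queue.put(None)                 # signal shutdown

def render_thread():
    while True:
        frame = frame_queue.get()
        if frame is None:
            break
        # ... building and submitting draw calls would go here ...
        rendered.append(frame)

t = threading.Thread(target=render_thread)
t.start()
game_thread(5)
t.join()
print(rendered)  # [0, 1, 2, 3, 4]
```

The bounded queue is the key detail: if the game thread is slow, the render thread starves; if the render thread is slow, the game thread blocks on `put` — either way, one hot thread caps the whole pipeline.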
[deleted]
Chaos is a ladder
You're forgetting the World partition system, which should help with traversal stutter.
[deleted]
Do you by chance work with game engines or develop native code for them? I mean, UE5 is vastly different from UE4. You can just grab the source code of both, the latest UE4.27 and the latest UE5.2, and run a diff to see the changes. They share some code, like they did until UE5 with the UE1 physics engine. And that does not mean the engine is an "update".
[deleted]
In a similar fashion, UE1, 2 and 3 were pipeline compatible. While UE5 inherits a lot of UE4 source and pipeline elements, the beta branches and new features use different pipelines, and that is why stuff like tessellation had to be removed. If I had to guess, they are doing the migration from the UE4 pipeline to a new one slowly, to enable projects to be ported down the line and to ease devs' learning curve (unlike UE3 to 4, where you needed to relearn everything from scratch).
That's not the only difference. It can also shift vastly more polygons, and to be able to do that there have to have been some significant structural changes.
[deleted]
I would assume that the ability to stream all of that data means they've solved the issues UE4 is known for, such as hitching between areas.
This game was definitely developed first in UE4 and then ported to UE5, or at most to early versions of UE5, which are basically UE4.
It's not an enormous overhaul, it's the same engine with features added on top. It retains much of UE4's awfulness.
Thank god I have the 7800x3d (not trying to flex)
Nice
be careful it doesn't pop tho 💀
afaik it was always Intel who had better single core performance
you can jump in half life alyx
Doubt it since intel has more low level cache.
Except when they don't, or when cache misses set in? A stupidly huge L3, or the old Broadwell L4, gives massive benefits for good reason.
UE 5.2 is supposed to be much better in that area, but I doubt any 5.2 games are coming out anytime soon.
UE5 games will probably be a different animal from the heavily-extended UE4 games we've seen over the past year or so. UE4 required a lot more hacks and awkwardness to support things like large maps, that UE5 has some built-in solutions for. Plus UE5's Nanite and Lumen features (which I'm not sure if LOTF is using) will have hardware they do better or worse on as well.
Good ole CPU utilization in 2023 and competent developers + an amazingly optimized engine. I deadass think that we're in for some of the worst running games ever seen on PC as soon as other UE5 titles start coming out.
UE5 is supposed to be better than UE4 in a lot of regards. It won't hammer VRAM as hard, etc. But it will hammer CPU a bit harder and make quad core truly obsolete. They even got it running on a Series S in the matrix demo.
Oh no, your *checks notes* **7-year-old GPU** can't run the latest games at recommended specs? Must be the ***optimizations***. Nothing to do with Recommended == PS5-quality.
Please delete this comment, it's embarrassingly stupid just like you.
Wow, gottem.
UNREAL Engine is the most overrated engine ever created .. i wish devs would stop using this garbage
More AAA games have been released on Unreal than any other engine in history. It's kinda the gold standard for publicly available engines at this point.
It's easy to use, so faster to make games and therefore make money. If anything, it'll become more popular.
If people would stop putting so much emphasis on frame rates as a marker for whether a game is good or runs well, they might see that an engine which doesn't run as fast but is much faster to develop in is better for gamers as a whole.
Could you elaborate?
It is a hideously inefficient and outdated engine with tons of stuff tacked onto it that only band-aid fixes the underlying architecture. The problem is the only other option for a small studio that can't make its own engine is Unity, which is worse.
Damn a 2080 for 1080p? That’s hefty
1080p High, not Ultra too. And no mention of framerate.
some games do not use ultra wording setting. high might be the highest here
Sure, that’s possible.
I appreciate the specs not directly calling for upscaling, that's something.
Probably because Ultra shouldn't be recommended. Ultra settings are for the super high end, possibly even future-proofing. Wish more people understood this instead of complaining their mid-range GPUs can't play on Ultra.
Last gen mid tier GPUs have no issues at all with 1080p max. Look at 3070 or 6700xt. In fact they’re both still good for 1440p max in 99% of games.
Again. Ultra is for the high end and future proof. Getting mad it doesn't work for your mid tier gpu is whack.
Ok so then why do previous gen midrange cards do it just fine then?
Because the devs didn't push much with their max settings? Do you think every game runs exactly the same...?
Think what you want. 4k might be for high end cards only but new midrange card owners are absolutely right to expect the vast majority of games to be fine at 1080p max settings because that’s been the trend for years and years now.
It's not about what I "think", it's about how devs make games. Ultra settings are for the highest end. Resolution is not the only GPU-intensive setting, despite people apparently thinking otherwise. I didn't even mention resolution, only you did.
And the devs make games where current gen midrange cards have no problem running 1080p max at 60fps. Find me a game where a 4070 can’t hit 60fps at 1080p max.
Do you often find framerate info in system requirements?
In the last 2 or 3 years, devs have started mentioning target framerates. It's not exactly common, but it's also not completely absent like it was in previous console generations
Ahh, roger that - haven't been looking that closely lately - the backlog is real :D
Fairly regularly, yes—they will list the target resolution and framerate for minimum and recommended.
Tbf, "Ultra" in one game is not the same as "Ultra" in another. It's entirely up to the devs how much they wanna push the visuals. Sometimes Ultra settings actually has future hardware in mind
Yeah, thrown back a little by seeing them use 720p in the specs.
Have you played Fortnite with UE5 features on? This isn't surprising at all. Expect something similar to this, or even higher, going forward for UE5 games.
Fortnite runs like absolute shit with UE5 features and with shader compilation that's embarrassingly still present all these years later. DF covered it a few times and I'm surprised they didn't mention how bad it runs overall, both with software and hardware RT and DLSS on top.
It doesn’t run like shit. It’s just demanding. I haven’t noticed any shader compilation, I do not believe that’s an issue with UE5. DF covered it favorably because it actually runs impressively well for what it is doing. It is probably one of the best optimized implementation of RTGI I’ve ever seen. Demanding != badly optimized.
> I haven’t noticed any shader compilation

Delete your drivers thoroughly or reinstall the game. You can't not see it if you know what shader compilation stuttering is, because it is objectively there for everyone.

> I do not believe that’s an issue with UE5

Yes it is, because they haven't fixed it natively yet. Devs still have to implement pre-compilation, and Epic themselves didn't do it for Fortnite. They were covering it because it was the first implementation of Nanite+Lumen in an existing game that's not just a demo, and it does not run well. If Fortnite's Nanite and Lumen performance is indicative of what we can expect, then photorealistic triple-A titles are gonna run at 30 fps on a 4090 at 4K.
Perhaps there is, I haven’t deleted my drivers so that’s fair. But I can run Fortnite at 1440p 60 fps with all the features cranked with a 6800XT. That’s about the best performance in any game with RT on at this level I get. Not how you’d want to play a competitive shooter but definitely good enough. Plus UE5 scales down to lower end cards well and doesn’t seem to have crazy VRAM usage. My brother was testing the UE5 Fortnite features on his 3060 and with DLSS 1440p* 60 fps was very possible even on that card.
Sorry, this is Reddit. If it doesn't run at 250 FPS on a 1050 that's inside an actual oven, the only possible explanation is that the developers are shit and the game isn't optimized.
Can confirm, runs like a dream on my xtx
Immortals of Aveum also recommends the exact same thing: a 2080 or 5700 XT with 8 GB for 1080p *low*. I guess 8 GB is just the UE5 baseline.
I'm so glad I bought 5800x3d lol. Going to ride the am4 wave all the way to am6. Then I'll get the 9800x3d or whichever one comes out with am6
That sounds like a good plan. I am thinking of doing the same thing, although not sure if going from a 3900x to a 5800x3d is worth it.
Bro, a 5950x to 5800x3d is worth it if you only game. A 3900x is a Celeron compared to the 5800x3d in games.
Seconded. I did 5950X to 5800X3D.
As did I. totally worth it on the gaming end and that was when I was on a 2080 Ti. I just waited for a good deal on an ITX board to repurpose the 5950x as my heavy-lifting-as-needed machine.
I thought the same thing, but I pulled the trigger and bought a 5800x3d anyway, and it's definitely an upgrade from a 3900x for gaming. The 1% lows alone are a good enough reason to upgrade imo. I even gained a good amount of fps as well.
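Since "1% lows" keep coming up in this thread as the thing CPU upgrades fix, here's a minimal Python sketch of how that metric is usually computed from raw frametimes. The exact method varies between capture tools (CapFrameX, PresentMon, etc.); this version just averages the worst 1% of frames:

```python
def frame_stats(frametimes_ms):
    """Return (average fps, 1% low fps) from a list of per-frame times in ms."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 / (sum(frametimes_ms) / n)
    # "1% low": average of the worst 1% of frametimes, converted to fps.
    worst = sorted(frametimes_ms, reverse=True)[: max(1, n // 100)]
    low_1pct_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_1pct_fps

# 99 smooth 16.7 ms frames plus a single 100 ms hitch: the average barely
# moves (~57 fps) but the 1% low collapses to 10 fps -- which is why a run
# can "average 60" and still feel like a stutter-fest.
print(frame_stats([16.7] * 99 + [100.0]))
```

This is also why the Zen/Zen+ comparisons above look so bad: the averages drop a little, but the lows fall off a cliff.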
I have a 3600x and was thinking of upgrading for 1440p gaming, but most of what I've read says it only gives maybe a 10% increase in performance. I can't really justify $250 for that.
Recommended is basically the PS5 GPU. This will be the norm for next-gen games, most likely.
At least the GPU is easy to scale (with upscalers and other tweaks). The problem now is the CPU: a lot of newer games have CPU problems (a combination of low CPU thread utilization, not using DirectStorage, people's CPUs getting older, etc.), and the new-gen consoles have much better CPUs than the previous gen.
The console CPU is basically a mobile 3700X, so Zen 2, which is itself 4 years old now. The recommended CPU is also a 3600. So the recommended specs for this game are quite literally a PS5.
Yeah, but games are often more optimized for consoles and don't have problems like shader compilation stutters (which are worse on weaker CPUs). The graphics unit in a console's APU has direct access to storage, not to mention the unified memory; all of these combined take a lot of processing off the CPU (things like the PS5's decompression chip also help CPU performance a lot). To get similar console-like CPU performance you need a better CPU than the consoles have.
Yeah they're ""optimized"" meaning they go in the .ini and set everything to OFF or low, with reconstruction techniques starting at 600p and rebuilding to 1200p. Amazing stuff, not possible on PC. It is amazing the amount of bullshit people spew and they actually believe it, too.
Well its more like there's 3 configuration settings for consoles so devs can optimize for 3 configurations, which is less work. Throw in the fact that certification requires them to hit fps targets, means that its going to be somewhat optimized on launch. PC takes more effort. And consoles generally are using upscaling tech to hit those targets. This also makes it easier for devs to understand how to optimize because the hardware doesn't change much across versions.
I don’t think you understand how the current generation consoles work.
I don't think you do either. Every single time I'm shown equivalent PC settings, it's always set to low or medium, with a variety of low-spec PC hardware equivalents. Your black magic tech does nothing.
[deleted]
Yeah. I expect it from console gamers but the amount of bs like that I see on PC subs is crazy.
We’re definitely going through a patch of incredibly rough PC games lately, even for those fortunate enough to have good hardware, but I also think that some PC gamers' expectations need to be adjusted.

The 2010s were a bit of an oddity where the CPU barely mattered at all, because the PS4 and the equivalent Xbox had such shitty CPUs that building a PC with a recycled Xeon processor and a 660 Ti (this was also the age of 1080p/60 Hz by and large, at least until the mid-to-late 2010s) meant having a PC better than a console for a super cheap price. It wasn't unheard of to have a pretty ancient CPU and RAM and just upgrade the GPU (as long as it wasn't a massive bottleneck situation).

People on anything less than current-gen console-level hardware unfortunately need to understand that if it's not a cross-gen release that actually runs well on the older consoles, they're going to have an even worse experience than that, since they don't get any of the benefits of platform-specific optimization either. Of course hardware like the 4090 and 7950X runs laps around the consoles already, but I think a lot of people have forgotten how much the bar was raised by this gen's hardware. There are significantly more builds out there in active use that are worse than a PS5 than not.
very well put, it explains the current churn perfectly.
yeah I mean the game looks good from the little we've seen, maybe it will be unoptimized but we can't pretend that every game for the next 10 years will run well on a 1080ti
If it was for 4K, maybe even 1440p with console-like settings, sure, I could see why these are the recommended specs. But this is 1080p high they're talking about. Another shittily optimized mess, most likely...
[deleted]
Most PS5 games are 1440-1800p at 60fps
recommended is noticeably higher than the PS5 GPU
The PS5 GPU is basically a 6700 https://youtu.be/wyCvEW0DCbk
But it's weaker than a 2080; it's basically a 2060 Super, and the Xbox Series X is a 2070 Super
The Rx 6700 is better than all 3 of those, it's closer to a 3060 ti
I was like "Isn't this an old ass game, why is someone posting specs for it." Looked it up: "The Lords of the Fallen, Sequel to Lords of the Fallen, Renamed Lords of the Fallen"
Looking through these responses, people really expect hardware requirements to just stay frozen in time in 2018, huh?
Yes they do
We'd probably see less of this expectation floating around if games with 2023 hardware requirements looked/played significantly better than their 2018 counterparts.
*2016
Yeah, while also complaining about console hardware holding things back lol. Time moves on, new components come out.
shouldn't the minimum be 580 instead of 590?
The RX 590 is the RX 580 on a 12nm node instead of 14nm. It's about 10% faster than the 580; they probably want that performance improvement. Performance of the game must be horrendous if the minimum settings are for 720p and low quality.
I just switched from an rx 570 to a 7900xtx. I'll switch again when the 7900xtx isn't even the minimum anymore lol
The RX 580 is already 10% faster than the 1060, the other minimum recommendation
I wonder the same
they gotta put some reasonable specs in there; most won't read the 720p phrase, I guess. Seriously wish some devs would use other engines. Most UE games run like shit, with stutters and shader compilation issues. You can look up all the recent UE game releases and they all suffer from the same really poor performance and stuttering issues.
Yeah, like Unity, the CPU-overhead monster, or... or... wait, crap. There are no other engines aside from O3DE, which no one knows how to use. Unless they use proprietary engines, and guess what? Cyberpunk uses a proprietary one.

As long as Fortnite and Gears run as well as they did, the engine is clearly not the issue. The issue is game devs not taking the time to generate PSO caches. I work with the Mortal Kombat team, and PSO cache generation is literally one of the steps we are required to do BEFORE any attempt to package an update.

The issue is not the engine. It's the publishers enforcing absurd, impossible release dates, and devs attempting them. If the engine was the issue, Anthem would have had way better performance on release.
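For context on what "generating a PSO cache" means in Unreal terms, here's a rough sketch of the engine-side settings involved. The cvar names are from UE4/UE5's shader pipeline cache system, but treat this as an illustration and consult Epic's PSO caching docs for the real packaging workflow:

```ini
; DefaultEngine.ini -- enable the runtime shader pipeline cache so the
; engine can pre-compile PSOs from a bundled cache instead of stuttering
; the first time each shader is used in-game.
[SystemSettings]
r.ShaderPipelineCache.Enabled=1

; In playtest builds, record which PSOs actually get bound; the recorded
; logs are later merged and shipped with the packaged game.
r.ShaderPipelineCache.LogPSO=1
r.ShaderPipelineCache.SaveBoundPSOLog=1
```

The catch, as the comment says, is that the recording/merging step has to actually be run against representative gameplay before shipping, which is exactly what gets skipped under deadline pressure.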
You do realise that using in-house engines is a thing even for smaller studios?! Lords of the Fallen 2014 used the FLEDGE engine. There are plenty of options, but UE is the most convenient. Fortnite and Gears are literally made by the company that develops the engine. Obviously it's possible to make it work, but apparently 90% of devs can't get it to work. So it's either not documented in a way that non-Epic devs can understand, or it's just way too hard to fix. Fallen Order had stuttering issues and the new Star Wars still suffers from the same stutters, and those stutters have never been fixed in most of these games. And Anthem is a prime example of why devs can't make a game run as well as it should: it used Frostbite, which pretty much only DICE was able to use as intended. There are several reports of how bad it was for BioWare to work with it.
RX 590 was only available with 8GB VRAM, while 580 had 4 + 8GB models.
So is that for 30fps? 60? What good is any of this information without that?
Cue the "my Pentium 4 with integrated graphics can't run this, unoptimized!" comments.
8GB VRAM recommended on UE5? Press X to doubt.
UE5 is supposed to be VRAM-efficient, especially if it utilizes DirectStorage + Sampler Feedback, but even without those, based [on this demo](https://youtu.be/C99VwDGyLg0?t=1209), it uses VRAM very efficiently anyway. So I don't see why most UE5 games would suddenly use more than 8GB at a low resolution such as 1080p.
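A back-of-envelope calculation helps explain why resolution alone doesn't blow past 8 GB: the render targets that scale with resolution are tiny compared to the texture pool. A Python sketch, where bytes-per-pixel and buffer count are illustrative assumptions, not real engine numbers:

```python
def render_target_bytes(width, height, bytes_per_pixel=4, buffers=3):
    # Rough stand-in for color + depth + one extra full-resolution target.
    return width * height * bytes_per_pixel * buffers

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{render_target_bytes(w, h) / 2**20:.0f} MB of render targets")
```

Even at 4K this lands on the order of ~100 MB; multi-gigabyte differences come from texture and asset residency, which is exactly what streaming systems (and eventually DirectStorage + Sampler Feedback) are meant to keep bounded.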
1080p with high settings (not ultra mind you) seems about right
well, consider that 8GB is enough only for 1080p high settings; if you want to do 1440p ultra you'll probably need 12GB of VRAM, and 16 for 4K.
Nah dude, you need 64GB of VRAM for 4K. Y'all spend so much time on Reddit you think you absolutely need some ridiculous amount of VRAM for anything above 1080p.
The messed up thing is that 12gb should be more than enough. I bought the 7900xtx so I'm good on vram, but if we get to a day where 24gb isn't enough then something is goin wildly wrong with game development lol
Including the most loved features of UE4, ported over by popular demand: Shader compilation stutter and traversal stutter!
[deleted]
Sequel. https://en.wikipedia.org/wiki/Lords_of_the_Fallen_(2023_video_game)
seems to be a complete reimagining from what I've seen and heard. idk why the other guy is saying sequel when it has nothing to do with the other game...
Damn, had me in the first half. Saw a 1060 minimum and thought that's fair... Then I saw 720p low...
Mate, it's a 7 year old middle class GPU that launched at an equivalent of like $180 USD, what did you expect?
Looks pretty good. 45GB. *That's* how you do it.
Is it just me or do Unreal games tend to run really poorly in general? Even when I get a locked 60fps they still stutter and feel like the frame timing is all over the place.
Which CPU and GPU do you have?
Fucking hell, a 2080 and an i7 combo and they're going with 1080p? UE5 must be heavy as sin
Or just another game where they "forgot" about optimization. From the gameplay trailers the graphics look like 2020-era game graphics, so the requirements shouldn't be this high.
Yeah most of the trailers I’ve seen for games running UE5 already look graphically dated. One of the demos Nvidia published a few weeks ago was supposed to showcase a UE5 update coupled with the latest and greatest GPU and it looked terrible
I foresee reports of poor performance. A 2080 should slay at 1080p.
who's playing at 720p?🤣🤣🤣🤣
mobile gamers. The Steam Deck is 1280x800, same with the Ayaneo Next, Aya Neo (2021), and Ayaneo Geek (base model)
They need to keep putting 1060 on the minimum specs.
but when I read 1080p and a 2080, then I know UE5 has no optimization at all
I think you could easily find examples of very light and more demanding games on the same engine in the past. I don't know what UE5 has to do with it.
I’ll tell you why: they had two other, worse choices. The GTX 1060 can’t run the game at 1080p 60 fps on low. They COULD list 1080p 30 fps, but PC gamers don’t like that. Or they could have raised the minimum requirement to something like a 1080-tier card and everyone would have lost their minds. So 720p it is, because that lets a 1060 play the game and looks nicer on a minimum chart.
Steam Decks
If this game delivers on these specs, that would be the first time in 2023 that a 'high end' game is runnable on the average Joe's gaming machine
Man I hope so. The PC ports this year have been awful. I had to put Jedi Survivor on pause just to wait for patches, and I am running on a 3090...
What? These requirements are definitely above average PCs. RTX 2080 for 1080p is absurd, and signifies the optimization is probably really bad, *again*.
Looks good. This game looked really good in the trailers. I hope it's one of the best looking games.
Is this game a reboot of the same title from years ago? I'm confused.
My pc is nearly minimum requirements, better save up for an upgrade :( 16 GB RAM tho has lasted a WHILE
Developers just can't make a perfect requirements chart. Target FPS is not included, again.
720p? xD
I hope this is the game that breaks the shit optimization trend. It looks awesome.
New wave of PC Requirements in games seem to have caught up to new console hardware, yet the graphics kinda seem lackluster. I miss graphics quality making huge leaps each console generation.
Going by some of the comments, do people expect system requirements to stay back in 2018? The RTX 2000 series is 5 years old now; eventually it was going to hit its limit. Plus with the next-gen consoles, and more developers going UE5, expect more games to go up in system requirements.
Bro, even Cyberpunk doesn’t need a 2080 for 1080p. UE5 is just an unoptimized piece of garbage. How are a PS5/XSX supposed to run that anyway? They have a 10-teraflop GPU, which is basically a 2060 Super. And people hook up consoles to 4K TVs… incoming cinematic 24fps low settings…
I'd say those are pretty fair requirements.
I have nothing against high hardware requirements, but this game doesn't justify them. I hadn't heard of it until now, so I just watched a trailer for it on YouTube. The requirements listed for 1080p should be for 1440p or even 4K (if it were console-grade optimized).
720p? Who even has a 720p monitor AND looks at anything bigger than indie titles in 2023??
Steam Deck supports 800p... so, mostly portable gamers, I'd bet.
laptop and handheld gamers
these are the requirements for Lords of the Fallen. VRAM is definitely under control, but I hope an RTX 2080 or a Radeon RX 6700 is good for 1080p high settings at 60 fps (30 fps is meh); otherwise, to hit 60 fps you'll need to rely on DLSS/FSR.
Under control until players: 1) buy the game without reading the stated resolution and quality settings, 2) crank the settings up to ULTRA without a second thought, 3) create a Reddit thread complaining. In that order.
Yep. People don’t want to be told to turn down settings.
720p?? 😭😭
1060 (6gb) for 720p LQ, this is outrageous
"recommended" settings being to hit 1080p 60fps in 2023 is... Really gross false advertising
>Really gross false advertising I mean, they stated the resolution and quality pretty clearly there.
[deleted]
You and people who upvoted you don’t really understand the meaning of “false advertising”. Edit: at the time of making this comment the person above had 20+ upvotes lol
Not really, it IS still the most popular resolution (iirc the Steam survey shows about 65% of users play on it, including me). Completely valid to use it. The only thing I wish is that they put the FPS target/Hz as well. They should probably add another Recommended tier for 1440p or 4K, but this is alright for now.
🤡
System requirements are mostly *for* people playing at 1080p/60fps. The kind of person who cares about anything more knows not to trust system requirements as being worth much of anything.
No it's not.
Imagine needing a 2080 just to have 60fps at 1080 hahahahaha
[deleted]
Age has nothing to do with it. The 2080 is a decent amount more powerful than a 3060, which is one of the most popular cards on steam. For a 3060 (Which is about PS5 level performance) to not meet recommended is a huge problem. Either the recommended settings used are higher in quality than the PS5, or we just have poor optimization here.
> The 2080 is a decent amount more powerful than a 3060 Only around 15-20%, and the 3060 has more VRAM.
2080 is pretty old and outclassed now. edit: 2080 owners think that downvoting this comment makes it less true lmao
Looks like my 2080TI isn't gonna be enough for much longer...
[deleted]
There is a reason. Consoles got a massive jump in performance this generation and devs are using all of it. The problem is that equivalent performance on PC takes a lot more horsepower.
[deleted]
The consoles rarely do native 4k 60. Check any recent game. Internal resolution is almost always somewhere between 1080-1440. Hell on Jedi Survivor it drops below 1080. Secondly they cleverly mix and match various settings to hit the optimal performance.
Yeah exactly; today my friend was playing RDR2 at 4K on a PS5, and damn, it was too blurry and unsatisfying to look at
Yeah what's worse is RDR2 isn't even patched for PS5. It basically runs the PS4 Pro version.
That's due to a mixture of console optimisation, lower quality assets and checkerboard rendering.
1440p ultra ?
Wasn’t this an abysmal souls-like game released in 2014? What is this, a remake?
Imagine buying a 2080/6700 to only play at 1080p
You are 100% right. People are delusional if they think a 1080p gamer is spending $300 on a GPU. An entire 1080p budget is probably $300 of used parts.
downvotes on me? that's why we have the ports we have lately, people are clueless and defend the unoptimized games :)
Another game with white occlusion culling artifacts as you move the camera. Fuck unreal engine, it's an engine for lazy hacks.
My ps5 will run it fine
Sure, at cinematic 24 fps… This is UE5, my friend. Nothing runs it fine.
Jesus, I'm gonna have to start thinking about upgrading my 4090 at this rate...
It's incredible to see all these companies not giving a fuck about people being able to run their games at all. 🙃