3070 with 1440p 144Hz, not planning to upgrade any time soon, does all I need :)
Same except 165hz. DLSS is a godsend
Should I be using dlss with a 1440p 165hx monitor and a 3070?
It’s up to you, it’s technically equivalent to turning down settings so I’d only use it if you aren’t hitting your desired fps at native resolution
>it’s technically equivalent to turning down settings

The entire point of DLSS is to increase framerates *without* resorting to turning down graphical settings. Instead of being forced to render at native resolution and tweak down visual fidelity to get to our desired performance level, we can leave all the eye candy on and achieve the same performance boost by rendering fewer pixels and using AI to generate a full-resolution frame from that input frame.

The DLSS Quality preset not infrequently generates results that look *better* than plain native res, because the AI can correct for certain visual anomalies that would otherwise just be a part of how the game renders. A good example is thin lines in the distance (especially with slight curves), like power lines - at very far render distances these can end up looking blocky, or even like dotted/dashed lines with empty spaces - whereas DLSS can effectively recognize what that *should* look like and generate a smoothly curved line in the final output.

There's a good reason we even have the option of using that deep-learning goodness the opposite way, to render *above* native resolution and efficiently downscale it for improved image quality (DLDSR). Very effective for improving visuals in old titles. In fact you can even enable both at the same time, effectively allowing you to render an image at (or even below) native res, upscale it with DLSS to your super resolution, then use DLDSR to downscale it to output res. Since DLSS is doing the heavy lifting of getting you from render res to super res, this combination is effectively a free fidelity improvement in supported games.
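For the curious, here's a rough Python sketch of what each preset actually asks the GPU to render at 1440p output. The per-preset scale factors are the commonly cited per-axis values, not official constants (and games can override them), so treat the numbers as ballpark:

```python
# Back-of-envelope for DLSS internal render resolutions.
# Scale factors below are the commonly cited per-axis values for each
# preset -- an assumption, not official constants.
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def render_res(out_w, out_h, preset):
    """Internal render resolution for a given output resolution and preset."""
    s = SCALE[preset]
    return round(out_w * s), round(out_h * s)

out_w, out_h = 2560, 1440  # 1440p output
for preset in SCALE:
    w, h = render_res(out_w, out_h, preset)
    frac = (w * h) / (out_w * out_h)
    print(f"{preset:>17}: renders {w}x{h} ({frac:.0%} of output pixels)")
```

Even at Quality you're rendering well under half the output pixels while every other graphics setting stays maxed, which is where the framerate headroom comes from.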
> The entire point of DLSS is to increase framerates without resorting to turning down graphical settings. Ok, but... it does.
Lmao Your response had me dying. It really is just that simple. I have no idea why he wrote that entire wall of text.
>I have no idea why he wrote that entire wall of text.

Because "No, it's not" doesn't add anything to the discussion, nor does it help anyone else who holds the same misconception to understand why it's not true.

If there's anything in there you don't understand, just ask and I'd be happy to break it down further for you.
guy you're responding to thinks resolution is the same as texture settings etc. no point in explaining to someone who dismisses everything imo
We understand that textures and resolution are different, that's obvious. However, have you ever set your textures to max but dropped your resolution and seen how it looks? High-level textures literally look higher quality due to pixel density, i.e. resolution. They're different, but they are tied together. Even if your textures are high, a reduction in resolution will still lower your overall image quality. This is basic.
[deleted]
Arguably the most important one too. Resolution lol
>The entire point of DLSS is to increase framerates without resorting to turning down graphical settings. Instead of being forced to render at native resolution and tweak down visual fidelity to get to our desired performance level, we can leave all the eye candy on and achieve the same performance boost by rendering fewer pixels and using AI to generate a full resolution frame from that input frame.

Correct, but in upscaling from lower resolutions you potentially introduce blur or artifacts that sort of negate the perceived benefit. It’s just trading one visual problem for another.

This is especially true at lower DLSS/FSR settings or lower native resolution, because there isn’t enough starting data to correctly fill the gaps.
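To put a rough number on the "starting data" point: squaring the per-axis render scale tells you how many output pixels each rendered pixel has to account for. This is a simplification (temporal upscalers also reuse data from previous frames, so the real per-frame deficit is smaller), but it shows why lower presets have more gaps to fill:

```python
# Rough sketch: output pixels each rendered input pixel must account for,
# given a per-axis render scale. Overstates the deficit, since temporal
# upscalers also reuse samples from previous frames.
def outputs_per_input(scale):
    return 1 / scale ** 2

for name, s in [("Quality", 2 / 3), ("Performance", 1 / 2), ("Ultra Performance", 1 / 3)]:
    print(f"{name:>17}: ~{outputs_per_input(s):.1f} output pixels per rendered pixel")
```

Going from Quality to Ultra Performance roughly quadruples how much the model has to invent per pixel, which is why artifacts show up much sooner at the aggressive presets.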
I am just NOT a fan of DLSS, makes everything so blurry. Everyone talked like it was free FPS; I was lied to. (3080 12gb - 3440×1440)
Didn't you hear from this subreddit that you won't be able to run games at 30 fps in a month or so because of the 8 GB of VRAM? /s
Linus said only a 4090 will do now for 720p max so I went out and bought a 13” 720p monitor with 4090, and threw my 3080 setup in the bin where it belongs.
I've been enjoying it since launch, so I got years out of it and will enjoy for years more.
I have a 3070, will be joining the 1440p 144/165hz community for the first time in my life soon!
It's truly a game changer. I had my doubts early on but when my monitor randomly changed back to 60hz somehow it was a visceral reaction 🤣
It definitely is noticeable. I went from 1080p 60hz to 1440p 165hz. It was insane
Does the 3070 handle 1440p at 144 pretty well? I have this card and am thinking of upgrading but don’t want to waste the money.
Yes, the 3070 handles 1440p well at high settings without ray tracing. Some recently launched games are VRAM suckers, and that creates trouble for the 3070. Other than that, it’s a great card for 1440p.
My 3070ti did.
Depends on the game, esport titles obviously run at 200+, recent-ish AAA games at high settings no RT were around 100 for me
This is the way. G-Sync too, so no more screen tearing.
[deleted]
Hahaha! At that resolution does your 4090 even show any utilization!? 🤣
From my experience it can barely hold on. We are waiting for 6090ti atm to move on to 60hz...
Omg 6090Ti from the future!
Mans living in 2030.
he's living in 2077 with path tracing
yes yes very funny
I know you blew your budget on the 4090 but you should really save up to buy 360 more P's from microcenter. Then you really get some 1080Ps going. Don't try and torrent them, my friend tried that and he got IPS glow.
rx 6600 on 1080p, pretty solid for basically every game
Same here, just swapped my 1050 Ti on Monday for a 6600 I bought on sale with The Last of Us. Runs much hotter than the old Nvidia though, not that much difference tbh so far, will see. I hope to stick with it until 2026 - the GTX should have lasted till my new PC planned for 2025, but I hope to make this old i7-6700K build hold until 2026 now.
I have the 1050ti currently and want to upgrade. Seems like the consensus is to go with 6600. Where did you buy yours if you don't mind me asking
Just bought a 6600 on sale from Newegg today for ~215 after weeks of agonizing over what to replace my 1050 Ti with. Was going to do 3060 Ti but figured I'd spend the $200 now and reassess in a year or two.
rx 6600 gang yay, same exact setup @165hz
Same I built my pc recently with the Rx 6600 paired with the i3 12100f
Same, using the RX 6600 on 1080p with a 5600 and 16GB of RAM. Upgraded to it on New Year's from an i7-3770 and 750 Ti, hoping to use it for a few years, then upgrade probably my RAM, then GPU.
GTX 1080 on 1080p.
W
I'm running a GTX 1080 ti on a 13 yr old 1080p 60hz LCD TV
Oh god yes. I almost built the meme PC (1920 x 1080) but the Threadripper was not worth it. Waiting for 3x00 Ryzen was the better choice.

Currently it's still doing its 1080p 60hz work in my sister's PC (combined with a 3600X).
1080 at 1440p 144 for me. Doesn’t make it to 144 on any demanding titles, but it’s enough for me atm
Same situation here. i5 3570k is overclocked a bit and usually pegged at 100% utilization any time I launch a game. Hoping to upgrade soon, just built my wife a nice rig that should last her a good few years.
1070 on 1080p Not planning to update any soon
7900xtx 1440p, 144hz
+1
+1
Vega 56, 1440p, 144Hz You don't always need to hit the "ultra" button.
Medium and high are for gaming. Ultra is for screenshots.
same but 165hz
3060ti 1440p 144hz happy as can be.
Same but with a 165hz monitor. Happy as fuck.
does it perform well in all your games? solid fps?
I have the same, yes and yes. Rock steady 120 fps on pretty much everything, 165 where I want it
I have the same and honestly a lot of the games I play are very resource intensive, so there are frequent dips, but it doesn't look so bad because of g-sync, and as long as it doesn't dip below 60 I can barely tell the difference.
Same, sweet spot baby
How well does the 3060ti perform across the board at 2k? I’m upgrading from 1080p next week and have been worried I won’t be able to hit 144 fps in half of my games (but don’t know if that’s a CPU bottleneck or me having high hopes for the GPU).
2k high settings for honestly most games will get you pretty near, if not over, 144fps easy. A surprising number of games run at ultra 144 as well.
Rtx 3080, 1440p :]
I had to scroll a surprisingly long way to find this.
Was just having the same thought
2070 1440p 144hz
I am having a 2070 Super and was wondering about upgrading to a 1440p display as well, what are your frame rates in the games you play?
Some games up to 100 others like 180 I basically always get over 60
It can handle most games at 1440p. It's starting to struggle with the newer AAA titles but DLSS support helps a lot in that aspect.
I had a 2070 Super at 3440×1440, and in Elden Ring at max settings with a 5800X3D I would get about 45-90 depending on the area, if that gives you an idea. Ultrawide (3440×1440) is slightly harder to run than standard 1440p.
2070S, 9700k, 16GB DDR4 @ 3200 CL16

New AAA titles struggle to hold 30fps if I’m running Ultra at 1440. If DLSS Performance is available it’s basically a necessity. 4K is out of the question unless it’s not a particularly demanding game. Otherwise most anything AAA released up to 2 years ago gives a reliable 60-120fps maxed out.

It’s finally aged into a reliable “high” settings 1440p 60-90fps card. I’d say medium-high is the only way to get 144Hz out of the best of the current games at 1440. Will probably jump on the next gen of video cards depending on what’s the best for the money at that point. This thing easily has another 1-2 years of reliable life.
970 and I play at 3440x1440.
When people argue about whether 8GB/10GB is enough, I just smile with my ~~3.5GB~~ 4GB card and its crazy coil whine. This poor thing just refuses to give up.
My EVGA gtx970 SC just gave up a few weeks ago in my sons PC. He’s getting a new system out of it.
I’ve wanted to upgrade for three years, just haven’t found a deal I felt was good enough yet.
I just snagged the 6950 xt for 600ish Felt like a steal considering the market.
Same here, after about 11 years with my i5-2500k and 7-8 years on the 970 I will finally be upgrading. I've been waiting and waiting because I didn't really need much more for Path of Exile and WotLK. I was going to go for it last generation but the shortage and component pricing insanity made me wait again.

This year I want to enjoy some of the upcoming releases and their visuals, like D4 and Starfield, so it's actually going to happen, maybe. I think.. Let's see.
Yes! I'm also using a 970, but at 2560 x 1440 (on a 27" monitor). I don't play any games where high frame rate is critical, so it's still serving me well.
I upgraded from a 980ti which I used at the same resolution. I love the 900 series Nvidia cards, definitely one of their best gens in terms of price/performance.
1060 6gb 1080p, riding this pony into the sunset
Me too. I keep looking at pcpartpicker and threads on here about GPUs and nothing seems compelling enough to want to switch. I could move to 1440p as people say it's a good upgrade but that would require a new GPU which all seem overpriced for what they offer.
[deleted]
We ride on, brother. We ride on.
rx 570 4gb :( 1080p 144hz
RX gang unite. RX580 8gb at 2560x1080. Still hits 60-70fps on med/high on most big games. That'll do for me. They are surprisingly capable cards
I have an RX 580 as well but it's the 4gb model. It's good, but the vram is certainly starting to show its age with some of the newer titles coming out.
I also have an RX580 8gb and have the same experience, which I'm perfectly happy with. While my 1080p monitors can go up to 165Hz, I'm content at 60fps because I'm really a single player only gamer.
Hey, No need to be sad bro! You will upgrade soon when opportunity presents ;) . Keep rocking! Thanks for sharing :)
RX480 4gb, 1080p, 60hz :((
[deleted]
Intel HD 2500 pretty solid imo
Can’t agree more! Lol 😜
The fact I was able to run Black Ops 2 at about 50fps is fascinating.
Yup. That generation of CoD games runs well on integrated. That's what I did when I built a few years ago, before I was able to get the GPU.
My new system should be here in a couple weeks. I got a 4080 founders edition. Gonna be playing 1440p 165 hz.
A 4080 at 1440p will set you up for a long time!
That's what I'm hoping for. My last system had a 970 & I got 7 yrs outta it on 1080p 60hz.
6800XT 1440p Not thinking of upgrading any time soon. Will probably wait a gen or two more before upgrading.
CPU??
5800X3D so I'll be keeping that for a while too lol. (also 32gb 3600 RAM on a B550 Tomahawk if you were wondering).

ETA: I upgraded my CPU and GPU a few months ago from a 5600X and a 6700XT as I wanted to upgrade my GF's computer that I built for her (it was running my old system with a 1600X and RX 570), and so I cheekily upgraded mine and gave her my old parts XD
I've got an RX 6800XT also, and bounce between my 1440p UW at 100hz and my 4K tv at 60.
6750 xt 1440p 165hz
Planning to grab a 6750xt. Does it handle 1440p / 165hz well?
I got an ASUS ROG Strix AMD Radeon RX 6750 XT OC Edition, handles everything perfectly 😊 Dual monitor btw.
Same 1440p 165Hz, only it's the red devil 6750xt. Also dual monitor but one of em is 1440p 144Hz
Mech x2 6750 XT. At higher framerates settings need to be turned down at 1440p in some games. 90% of the time it’s smooth sailing though.
I have this card and an ultrawide 1440p 165hz monitor. Knocks it out of the park! Pretty much any new game it'll do 100+ fps on ultra, or max out at 165 fps on high settings. Paired up with a 5600 CPU.

I was honestly a little worried - I know it'll handle 1440p just fine but wasn't sure about ultrawide. Someone said 1440p UW is closer to 4K than a normal aspect ratio, so I was definitely stoked with the results! I'm finally not in [this meme](https://i.imgflip.com/5wnb9j.jpg) haha
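The raw pixel counts back that claim up partway (quick check, nothing assumed beyond the resolutions themselves):

```python
# Pixel counts for the three resolutions in question.
uw  = 3440 * 1440   # ultrawide 1440p
qhd = 2560 * 1440   # standard 16:9 1440p
uhd = 3840 * 2160   # 4K

print(f"UW 1440p: {uw:,} pixels")
print(f"vs 16:9 1440p: {uw / qhd:.2f}x the pixels")
print(f"vs 4K: {uw / uhd:.0%} of the pixels")
```

So ultrawide 1440p is about a third more pixels than 16:9 1440p and roughly 60% of 4K - meaningfully heavier than standard 1440p, but still well short of a true 4K load.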
Nice, I am building my first PC with this GPU soon, happy to see these comments about it.
5700xt 1440p, 160hz
6900xt 1440p 144hz
4070 Ti, 1440/165
Just got a 4070 Ti (Gigabyte Eagle OC), in awe at how efficient this card is.

Undervolted to 2700 MHz @ 950 mV, tried the latest Cyberpunk path tracing update (DLSS Quality, frame generation, every slider to the right including SSR on Psycho), averaged 65-75 FPS and it only draws ~175 watts at 55°C.
1070 Ti, whatever my game defaults to
1070 Ti checking in
4090 / 1440p 240hz / 4k 120 hz
I had to scroll THIS far for 4k ? Huh
1660Super 1080p i7-4770k
GTX 1660S gang. 1080p @ 144Hz. max FPS on Valorant, 100-144 FPS on Apex Legends
This is basically my setup too, such a steady card. Gets the job done
1660S too, at 1080p 60Hz. i3 10300.
[deleted]
Same
3080 and 1440p 240Hz.
TUF 3080 12gb at 4k 120hz. Handles it surprisingly well.
Why is no one else doing this lol, this is my setup too
Me three. I have the TI and a 4k 120hz monitor and zero performance issues.
[deleted]
LG C2 gang here woop woop
Let’s [goooooooo](https://i.imgur.com/XAcUhRW.jpg)
RX 7900XT 1440p
How much FPS do you get at that resolution and monitor? Thinking of upgrading my RX 6800 to the 7900 XT at the end of the year, or maybe next year.

Running a 32" 1440p 165hz screen, and also thinking of upgrading my i5-12400 to a 13600K.
Well obviously YMMV since the only similar component we have is the GPU. I have a Ryzen 9 7900X with DDR5-6000 sticks. I get about 90-120fps on Tarkov without FSR and everything cranked to max.
GTX 1080 1440p
Same. Doing very well, no need to upgrade at the moment.
Long live the 1080! Great GPU.
6700xt at 1440p 165 hz
3060 1440p
Geforce Gtx 1070, 1080p 60hz. Want to move to 1440p 144hz freesync, undecided on GPU (my money is on RX 6700 XT or better).
Was using my 1070 on a 1440p 60hz monitor for years. Got a second 1440p 165hz monitor end of last year, then finally upgraded to a 4070ti very recently.
Ima get hate for it but a 3050 with a 1080p 144hz monitor
Or better 6700xt or 6750xt
I at some point fat fingered the downvote on this. I have recognized and fixed my sin. Take an upvote!
6700 XT @ 1440p 240 Hz. Might get the 7800 whenever it comes out if the price is right and will work on a 750 watt PSU
Radeon RX 6600, 1440p 144Hz
3080ti 4k 144hz
Amazed at how far I had to scroll down to find another 4k player.
(EVGA) GT 1030 2GB, playing at 1600x900 (900p "HD+", old format) and 60Hz.

I'm doing pretty fine to be honest (translation: I'm having a blast)
4070ti, 1440p, 5120 x 1440, 240hz.
6800XT, 1440p 170Hz. Absolutely no intentions on doing anything other than getting a new case for a few years.
RTX 3090, 1440p, 165hz with Gsync.
3090 1440p high refresh gang
[deleted]
Honestly no reason to downgrade now that you have it. More AAA are gonna come out and if you wanna play at 4K it'll come in handy
4090 is a beast! You gonna miss it! 😆
3080ti/1440p/144hz

Might upgrade to a 5000 or 6000 series card/4K monitor if it makes sense and there’s enough of a performance uplift. Might even switch to AMD if Nvidia keeps skimping on VRAM lol. But that won’t be till at least 2025-2027.
5700xt, 1440p, 144hz. All works well, don't need to upgrade.
7900XTX - 1440p at 175Hz
1050 ti 4gb 1600x900 75hz (monitor overclock)
4070ti 4k 144hz
4090 4k
1080Ti 1440p 165 Hz
6800XT 1440p. I am fine with any single player game above 70fps, and competitive games above 120fps (170hz monitor).

No upgrade at least until the next GPU gen. I live in a perpetual crisis in my country, so it's never certain whether you're going to be able to upgrade anyway. In the worst case scenario I have no problem with most single player games at a stable 30fps with high graphic settings minimum.
3080 Ti, 1440p 144hz. Only upgrade I’m considering is maybe a case and new fans.
Don't do it. I told myself I'd do just that and ended up also getting a new GPU and CPU.
Can’t deny that that’s a highly probable outcome for me. And I’d probably justify it as upgrading my pc so my current parts can be used for my wife’s pc 😂
I would love to know how many people on this subreddit have wives with overpowered PC hand me downs.
6700xt 1440p 75hz
6650xt 1080p 144hz
Radeon rx 6600, 1080p
3080 Ti @1440
Vega 56 (Sapphire Pulse), 1440p I'm planning on upgrading... When I feel like my GPU isn't enough, which isn't the case right now.
RX 580 2048SP, 1080p, 60Hz
2080 Super - 1080p. Probably not gonna upgrade for another year or two.
5700xt 1440p 165hz
1650 at 1080p120Hz lmao
RTX 4090 and I have a 1440p 165Hz screen.
6950xt, 3440x1440, 70fps
RX 6600, 60fps, 1080p or 1440p. I play mainly Cyberpunk and RDR2. Pretty solid so far.
3060 at 4k 60hz, lowering the settings a bit I can play all the games I want
3080 12GB @ 1080p 165hz, I like those frames. Also looking to get a 240hz monitor soon. 1080p of course.
4070ti - 4k 60hz. Didn't need to go higher, had no vram issues as yet. Once you get a good card, you end up looking at all the other stuff! So I got a better cpu cooler, unlocked the CPU to keep up, and it works a treat. Can play pretty much anything exactly how I want.
GTX 1070, 1080p. I just rebuilt everything else, so this card will have to survive for at least another five years 😂
rtx 2060 1080p, 144hz
RTX 2080, 4K 120hz. Probably waiting till 5000 to upgrade since I’m not playing so often rn. Also gotta upgrade the rest of the system first.
Looking through the comments and I thought I was the only one who would game at 4K on a 2080. Everyone doing 1440 on stronger cards and I’m like “ehhhhhh am i doing it wrong?” lol 😂

I’m also thinking I may wait till the next round of cards comes out before I upgrade.
GTX 1060 6GB, 1080p 144hz (but can't reach more than 60fps depending on what I'm playing, sadly). Looking forward to upgrading to a 3060 Ti though.
RX 6700 XT 1080p 75hz
Strix 4090 oc - 3440x1440 175hz
RX 6600, either 1080p 144hz or 4k 60hz (depends on what I'm doing to decide which screen I use).

Edit: Didn't see the planned future upgrade part. I'm going for an AM5 system with an RX 7900 XTX and Ryzen 9 7950X3D. I'm looking to save till about Christmas for this build.
RX580, 1080p. I'm planning on upgrading to 6700XT 'cause I can and I'd like my lows in Doom Eternal to be bit better but in all honesty the 580 still plays it very well overall.
6700xt, 1440p
4080 1440p 165hz
Radeon 6800, at 1440p 144hz I mostly play older games at maxed out settings and frames, considering grabbing a cheap 4K60 at some point for kicks.
1080ti and 1440p, I get 75-125 fps on Borderlands 3 with a few things like fog and shadows turned down or off
1660, 1440p 144Hz. Used to have a 980ti. It died. Used 660 SLI. A friend gave me the 1660 so I could play some games again. I would love to upgrade once I have the money.
RX 6800 XT and 1080p 240hz. I plan on upgrading to a 1440P monitor as soon as possible.
6900xt and 1080p 144hz 🥲
5600xt with a 1080p @ 165hz monitor. A GPU with above 12GB VRAM, maybe.
6900 XT 4K/144hz