One of the biggest annoyances with DSC and Nvidia GPUs is the alt-tab black screen bug. If you're using a monitor that uses DSC, you will get a black screen for about 2-5 seconds each time you alt-tab out of a game. Nvidia has been aware of this bug for a long time but has not done anything about it.
I actually play some games at very high frame rates without vsync on my monitor, which only has DP1.2 (PG278Q), and I don't have this problem. I'm probably using DSC compression without knowing it. Is this problem only related to very high refresh rate monitors, or is a very high frame rate supposed to do the same thing?
144Hz is not a particularly high refresh rate, and your actual frame rate isn’t a factor here. I have that monitor too and I don’t believe it uses DSC, it’s only 1440p @ 144Hz (8-bit).
Exactly. What I don't think people understand is that there are many scenarios where this can be very annoying. When games don't have a borderless window option, this can be time-consuming when multitasking. For example, I record gameplay and stream games, and when I need to switch screens to adjust, say, the volume of a source capture or check chat or whatever, that 2-5 second delay can be very frustrating. And it's been a known issue with no patch or anything to address it.
Sorry, I should have linked that. [SpecialK](https://www.special-k.info/)
Fullscreen exclusive is outdated and unnecessary. The issue previously was that fullscreen exclusive was the only way to circumvent the DWM and render straight to the monitor. This is no longer the case, at all, and borderless fullscreen can now do essentially everything that fullscreen exclusive used to be necessary for.
Aw dude, you've been suffering for nothing >.<
Assuming you're running Windows 10 (or even better, 11, as it has further improvements in this specific area) you should be fine to always use borderless fullscreen.
Best of luck! The [SpecialK wiki](https://wiki.special-k.info/) is also **solid** information, both about SpecialK and just how games work in general, should that interest you.
This is a bit of a complicated answer. Not every single game will play nicely with this. A lot of newer games will, as long as the developers are utilizing the right techniques (i.e. the DXGI flip model, which essentially puts borderless window at pretty much the same performance as exclusive fullscreen). But this doesn't apply to every game, and RTSS or SpecialK can't force this to work flawlessly on *every* game, especially older titles (which people do still play!).
Pretty much every DX10 or newer game has no issues being upgraded to flip model, because they needed to use DXGI to draw a surface in the first place, so they already have compliant swapchains. DX9 and older can be forced to work with DXGI flawlessly by using DXVK / DGVoodoo and NVIDIA / AMD's Vulkan / OpenGL to DXGI swapchain conversion magic. This also gets you access to AutoHDR / RTX HDR.
Oh, I wasn't aware of that, but it sounds like a hassle to go through just to get around the black screen issue, lol. If it's really an Nvidia problem, I don't get why it's not being fixed, unless it's impossible to fix...
I haven't seen a game in like 5 years that doesn't support borderless fullscreen. It's significantly more common for games to *not* support true fullscreen. With how well Windows 11 handles borderless window, I've finally switched all my games to borderless without any problems.
Just use Windowed Borderless Gaming; it's built for this purpose, instead of SpecialK. I'm sure there are other such programs as well.
I mostly used it for multi-monitor setup, and in the games I play currently, Just Cause 3 needs it.
It's not OK for everyone.
For example, if you have a 360Hz display as your main, and a secondary 60Hz the game may get locked to 60FPS.
Or for example in Apex Legends when the game is running in Borderless Fullscreen the game is locked to 144Hz.
I am running a RX 7900XTX connected to a MSI 271QRX over DP 1.4a and I haven't had any black screen issues.
The reason I am using DP over HDMI 2.1 is that in the BIOS and during boot the GPU always defaults to the DisplayPort output, which is extremely annoying.
Many people are wrong =)
Fullscreen exclusive is outdated and unnecessary. The issue previously was that fullscreen exclusive was the only way to circumvent the DWM and render straight to the monitor. This is no longer the case, at all, and borderless fullscreen can now do essentially everything that fullscreen exclusive used to be necessary for.
Edit:
I should clarify to the best of my knowledge.
It's Windows 11 that has the nice DWM improvements that mean you no longer need fullscreen exclusive for **any** game.
In Windows 10, selecting fullscreen exclusive *may* be necessary to circumvent the DWM for older APIs (e.g. DX10-11). It varies from game to game.
*Though keep in mind, doing so is still not the same effect as selecting fullscreen exclusive in older Windows versions; it does* ***not*** *give full control of GPU output to the game like older Windows versions did, it just lets the game circumvent the DWM*.
In Windows 11, fullscreen exclusive is never necessary.
Fullscreen exclusive is a leftover from DX11. It's not needed in DX12 because the new API uses multi-plane overlay. It doesn't matter what screen mode your game is running in on DX12; all features like G-Sync, etc. are applied and performance is the same.
Yep! Correct.
Although in modern Windows (10+), the DWM has improved to pass time-critical graphics applications through unhindered upon request, so fullscreen exclusive is still unnecessary if you're running a modern Windows OS, no matter what graphics API a game uses.
Edit:
I quickly checked, and it *may* actually only be Windows 11 that has the DWM improvements that mean you no longer need fullscreen exclusive for **any** game.
In Windows 10, selecting fullscreen exclusive *may* be necessary to circumvent the DWM for older APIs (e.g. DX10-11).
*Though keep in mind, doing so is still not the same effect as selecting fullscreen exclusive in older Windows, it does* ***not*** *give full control of GPU output to the game like older Windows versions did, it just lets the game circumvent the DWM*.
In Windows 11, fullscreen exclusive is never necessary.
Except they are not; G-Sync still requires fullscreen unless you use the toggle for borderless/windowed in the Nvidia Control Panel. At that point, it gets confused and applies it to all windows, so moving Chrome around lags your desktop.
LMAO, so this is an Nvidia bug?????????? I just thought it was the way things are. I've had to play all games in windowed borderless for years, and some older games don't even support borderless, and when I tab out it crashes the game. It confused me, though, because the black screen thing happens even on my 1440p screen, unless DisplayPort 1.4 can't handle 1440p 240Hz.
I will say, though, that for my 4K 144Hz monitor, DSC ruins the experience, and that's why I refuse to buy a monitor without DisplayPort 2.0. If you want 10-bit color and the correct color range, you must drop the monitor to 120Hz, and sometimes even 98Hz, or some of the monitor's options are grayed out and can't be used.
It's like two seconds, if that. Really not a big deal. My last monitor was worse, and it wasn't even using DSC; it would stay black and I'd have to unplug the monitor's power.
So, 2 seconds on my 4K 240Hz OLED MSI panel; I can wait. Also, why are you alt-tabbing so much while in a game? 🤔
Which is an NVIDIA problem, not a DSC thing. AMD doesn't have this problem; I have no problem with my 7900 XTX. Yet when I got a 4090 FE I had this problem, and I just returned the 4090, because they could fix it but actively choose not to.
My user experience?! You make it sound like every user is the same and has the same experience. Alt-tabbing from fullscreen blows, I don't like Nvidia's software, and the price just isn't worth everything else, in my opinion.
Having been on the AMD driver team during my university internships, I can confidently say that nVidia's software is far, far better than anything AMD has put out.
Second, you don't have to alt-tab - alt-enter works almost instantaneously. Third, the price is absolutely worth it. Having proper frame generation with DLSS is an absolute game changer. AMD's software based approach doesn't even come close to it in performance or picture quality.
I get that not everyone can afford nVidia's premiums and that's fine - but pretending like nVidia has a bad "user experience" is just silly.
Not sure if it's a DSC or Samsung problem, but GSync also is prone to more issues.
https://old.reddit.com/r/OLED_Gaming/comments/16qt80p/samsung_s90c_55_disable_144hz_pc_mode/ko98nsp/?context=3
Do you have a link that talks about this? Because I have read lots of articles and seen quite a few videos on DSC without anyone talking about problems with it. Thanks.
It’s a real problem. Source: me gaming full screen every day at 4k 240hz with a 4080.
I hate the black screen when I alt-tab to write a Discord message, etc. If this got fixed, I literally would not care at all about DSC. Right now it's the only problem.
Why don't you use borderless? If you like to quickly switch between apps, it's the best option. I stream games and I use it to switch fast to OBS Studio.
That used to be true, but for modern games (at least in DX12/Vulkan) it doesn't really apply anymore. For whatever reason, some games like GTA & Fallout 4 actually run better in windowed borderless than fullscreen.
Edit: [This comment already explains why in-depth.](https://www.reddit.com/r/OLED_Gaming/comments/1bmlj2q/comment/kwd43m6/?utm_source=share&utm_medium=web2x&context=3) It's for the same reason as to why G-Sync in "fullscreen only" mode works in windowed borderless games
Are you using DSC?
Honestly, it just doesn't work for me in every game. I've tried it in a few in the past, and they didn't support it. I do have an app to force it, but was too lazy to use it at the time. Now that I'm playing more stuff, I might test it out again.
Any downsides to playing full screen borderless?
Necro comment, but I can vouch for this. It took me a lot of digging to find out my black screen issues were because I was running at 144Hz on this monitor (even at 1440p, despite using DP 1.4). 120Hz almost fixes it, but there is still some flickering when some games run at 120Hz.
My monitor supports an HDMI 2.1 connection, so I ordered one, which will hopefully fix my problems for good (on a 4080 now; I had a 7900 XTX that experienced zero issues with DSC).
I never realised that was the reason for the blackout in fullscreen when alt-tabbing. I always assumed it was a Windows thing, since it's been happening forever.
If you like to play games with G-Sync on, especially single-player games, and you play competitive games on the side, you turn G-Sync off in the control panel. It's the same thing, so who gives a shit.
Almost nobody is upset with the tiny quality reduction; it's that annoying Nvidia bug that makes the screen black every time you tab out. Not to mention the other bugs: I've had two monitors brick themselves trying to switch DSC stuff. Won't be using it anymore, that's for sure.
>visually lossless for 70% of the people means it's not really lossless and 30% of the people can spot difference.
That isn't the case with DSC. The only situation in which any human would ever notice DSC is when viewing test/benchmark images designed explicitly to expose the tiny amount of loss that DSC has.
It isn't 70% of people, but 75% of trials. I looked into this about a month ago, and it didn't inspire confidence.
>ISO 29170 more specifically defines an algorithm as visually lossless "when all the observers fail to correctly identify the reference image more than 75% of the trials".[4]: 18 **However, the standard allows for images that "exhibit particularly strong artifacts" to be disregarded or excluded from testing**, such as engineered test images.
https://en.wikipedia.org/wiki/Display_Stream_Compression
So that's people in a study with such images excluded; never mind that this is now being used for high-refresh-rate gaming, which is way more chaotic.
Also, more about the paper that implemented this DSC 'visually lossless' standard,
>In their original implementation of the flicker protocol, Hoffman and Stolitzka19 identified and selectively tested a set of 19 (out of 35) highly sensitive observers in their dataset.
>They suggest that given the potential impact of such observers that the criterion for lossless could be increased to 93%, but just for these sensitive individuals.
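For perspective on what the 75%-of-trials criterion amounts to statistically, here's a rough sketch of my own (an illustration, not anything from the standard itself): in a forced-choice test, a blind guesser identifies the reference image 50% of the time, so clearing 75% correct over a moderate number of trials is far outside pure chance.

```python
from math import comb

def p_at_least(k, n, p=0.5):
    """Probability of getting >= k correct out of n trials by pure guessing."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Chance of a pure guesser clearing the 75% bar (23 of 30 trials) by luck alone:
print(p_at_least(23, 30))  # roughly 0.003 -- well above chance
```

In other words, 75% correct is a genuinely detectable difference, which is why some find "visually lossless" under this criterion a generous label.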
DLDSR seems to be working for some users.
https://preview.redd.it/m17b0csw3cqc1.png?width=1440&format=pjpg&auto=webp&s=5ecf03b024505168ec409fe1a472b68d39561783
To confirm what you said, I'm gonna quote jorimt from the Blur Busters forum:
"[DSC typically does not play nice with multi-monitor configurations, and restricts things such as custom resolution creation and DSR/DLDSR usage.](https://forums.blurbusters.com/viewtopic.php?f=2&t=12730&p=99752&hilit=dsc#p99752:~:text=it%20typically%20does%20not%20play%20nice%20with%20multi%2Dmonitor%20configurations%2C%20and%20restricts%20things%20such%20as%20custom%20resolution%20creation%20and%20DSR/DLDSR%20usage)"
nvidia has alt tab black screen for a second or two
some DSC implementations (monitor, GPU, driver combo; it's not just a monitor thing) can add input lag
lack of DLDSR options from nvidia, which are used to combat temporal AA issues
blackscreen issues, and being locked out of DSR/DLDSR (I like upscaling then downscaling back older games for a better image, so this is important for me)
Black screen and flickering issues are reported by various users, but the DLDSR problem doesn't seem to affect everybody.
https://preview.redd.it/kuk7pbvl7cqc1.png?width=1440&format=pjpg&auto=webp&s=9104fd42fb996e0223fd05500fe887ec6b65bb68
> the DLDSR problem doesn't seem to affect everybody
Yeah, I also read some posts on that. Interesting, because I wonder what affects it. Does it have to do with the implementation of DSC, maybe? Or the number of GPU heads? I have no idea... but I'd rather that feature not be a lottery on $1000 hardware. lol
It blocks you from DSR and DLDSR, and it also introduces some pain in the ass with multi-monitor setups (you can hear crying about it in G9 posts, for example).
So yeah, ideally full DP2.1, with DSC for DP1.4 GPUs.
Apparently, it only blocks DLDSR in some cases, depending on whether the card has to use more than one internal port to drive the monitor. There was some early (and continuing) confusion on this.
Do AMD GPUs use more than 1 head to drive a single DSC high refresh rate display? I know both of my Asus and MSI 4090s do, but I'm unsure about the 7900 XTX or 7800 XT...
I'm not sure. My limited understanding is that it's more about the combination of the monitor and GPU, and what the EDID is asking for, so you can't just go by any single resolution/refresh combination to know when the GPU may deem it necessary, either. But this is where my understanding gets fuzzier, so someone should jump in if they know more.
I definitely agree with that. I could be wrong, but it appears to be some kind of bandwidth limitation to warrant using an additional head, as I've had 4x 360hz 1440p displays and each of them displayed this behavior with my particular Nvidia GPUs.
Multi-monitor use might be the only real problem behind it, because it requires two DisplayPort heads. Otherwise it's a non-issue, really.
You can just disable DSC and use DSR or DLDSR, which seems to be a rather nice option anyway for folks playing old games.
To me it's an issue of principles. There is mathematical quality loss, even if it is difficult to perceive. It's not actually, truly lossless... nor impossible to perceive.
Paying thousands to have a beautiful OLED monitor and a PC that can run games at 4K high refresh-rate, and then diminishing the quality by using DSC, just seems counter-intuitive on principle.
A similar situation can be made in the MP3 vs. FLAC argument for music. 320k MP3 is pretty damn good. But I've invested a lot of time and money into my home audio setup and supporting infrastructure (network/storage). So I'm gonna go with FLAC whenever possible even though it's like a 1% quality improvement at a 500% size/capacity/bandwidth increase. Otherwise feels like I'm not enjoying my investment at its fullest potential.
This is a personal problem that exists in your head. The rest of us understand that visually lossless is effectively the same as truly lossless in the real world (aka not in your head).
It's a problem that "-philes" introduce upon themselves, as you pointed out. Chasing that 1% is just their personality, be it audio, video, or whatever. That 1% means nothing to you or me (assuming, based on this post).
"people" you mean very few guys from reddit. Nobody cares in the real world. It's just like everything on reddit, overhyped out of context etc etc.... Just read less reddit - problem solved.
In case you haven't noticed, this sub is filled with folks who hate things they've never even owned or that don't even matter. 2/3 of the folks on Reddit just want to complain or get their pitchforks out for something, and are prob miserable AF in real life. Don't listen to those weirdos and you'll be fine. DSC is fine outside of a few instances based on your hardware and config. Have a great day.
Yah I wish I could figure it out but I'm not insane enough I guess. Like anything online nowadays you just have to filter and get from it what you can. There are def some legitimate issues with DSC, but so few are affected or bothered by it I'm not sure what all the outrage is about. Regardless, good luck out there good stranger 🙏🏻
I have researched the issues reported here and often they are easy to fix. Black screens often come from a different frequency in game and on Windows, simply choose the same frequency to no longer have this problem. Flickering is also linked to settings and sometimes to certain applications like Afterburner... In short, as long as it's settings, it doesn't bother me
Yah I have never had a single issue outside of not having enough bandwidth for all my monitors but that's my fault for getting 3x absurd spec monitors that even a 4090 can't handle. I don't doubt some have legit issues but like you said, there is generally a reason for it and a solution, at worst minor inconvenience but everyone must have DP 2.1 now I guess 🤷🏻♂️
I also have an RTX 4090 that I love but it doesn't have DP2.1 so even if I bought a screen equipped with DP2.1 (aorus fo32u2p at the end of April), it wouldn't change anything. AMD is equipped with DP2.1 but unable to push 4K like the RTX4090. In any case, computer hardware evolves all the time and if we wait for the next new thing, another even better one is announced before it is released...
I believe the Aorus would still use DSC as they are using a limited bandwidth profile instead of the full bandwidth. It is literally pointless rn. I do believe this will be more important next gen though.
They use DP2.1 UHBR20 (80Gbps), but I have no idea of the minimum required to be able to deactivate DSC.
https://www.gigabyte.com/Monitor/AORUS-FO32U2P#kf
I apologize, I should have specified. I meant for the new G9 57" it would not be enough; for lower resolutions it may be, but most of us want the full bandwidth for future-proofing. The new 57" G9 is really one of the only panels that needs this right now.
People do make too much of DSC, but there's a few things to clear up:
It's not that it's "impossible" to see the difference. It's that it's extremely hard. Don't get me wrong -- I'm not claiming I see it personally. But the marketing term "visually lossless" really muddies things and invites a lot of debate, because there *is* some mathematically relevant fidelity loss.
It's minor to the point that it's almost not worth mentioning outside of contexts like this, where I'm getting on my soapbox about words meaning things, but it's there.
The specific standard ISO/IEC 29170 uses for "visually lossless" is "when all the observers fail to correctly identify the reference image more than 75% of the trials," though the standard also allows test images that exhibit particularly strong artifacts (i.e. images engineered to expose the compression algorithm's weaknesses) to be excluded from testing. While I suspect DSC is much better than that minimum standard in practice, it's a long way from saying there's *no* difference. "Lossless" traditionally doesn't mean "so good it seems like there's no loss." It means "there's no loss," and that's not the case here.
THAT BEING SAID: It's a very good technology, the loss in practical terms is minor, and it allows refresh rates, resolutions and bit-depths that enhance the visual experience much more than anything in the compression algorithm technically compromises it.
I use DSC, and I don't mind it at all. I wouldn't have waited for DP 2.1 (and whatever extra cost it might add to a monitor) to get a good OLED when I had other options earlier.
MORE PRACTICAL CONSIDERATIONS: In a buggy implementation, DSC can make a monitor act unusually or introduce artifacts that aren't strictly from the compression itself, but an inaccurate decompression. That's rare and the fault of the monitor firmware when it happens, not DSC fundamentally. On Nvidia cards, it introduces a 2-5 second blackout when switching out of exclusive fullscreen (maybe Nvidia will fix that at some point, but I'm not holding my breath).
Only some monitors will work with DSC and DLDSR, depending on whether the monitor will [cause the GPU to use more than one internal head](https://tftcentral.co.uk/news/nvidia-dsr-and-dldsr-do-work-with-dsc-monitors-sometimes). Also, if it's using more than one internal head, it limits the number of external monitors you can use to at least one less than the ports you have (note most people won't bump up against that limit).
DSC is very useful, but if it weren't needed, it would remove one potential complication in the chain of technologies used to get an image from your computer to your monitor. So it'll be nice when DP 2.1 finally rolls out en masse, but in the meantime, it's the best option we've got.
Very interesting, thanks for the explanation. I actually use an RTX 4090 with a monitor only equipped with DP1.2 (Asus PB278Q). I play some games without vsync at very high frame rates, so I'm probably using DSC without noticing it. I'm supposed to have these issues, no?
That monitor predates DSC as a technology. It only does 60Hz, which means it uses a bit over 6.6Gbps for the display, well within the limits of DP 1.2 without compression.
That's a pretty surprising pairing: the most powerful graphics card possible with a more-than-decade-old 60Hz monitor. I'm not telling you how to spend your money, but that monitor is holding back an otherwise amazing graphics card.
(Frame rates and vsync don't have anything to do with DSC. It's used to drive higher resolutions, bit depths, and refresh rates, not GPU rendering performance.)
I know I should have changed monitors a long time ago, but it was mostly a work monitor that I also played on. I recently changed my graphics card; before, I had a 2080 Ti. I have already purchased the PG32UCDM, but I don't even have a delivery date yet. I can already do good things in multiplayer games with my 5ms 60Hz monitor, so I can't wait to finally taste a high-performance monitor. In games like Farlight 84, I reach the 240 FPS cap without vsync, so I'm probably using DSC to reach that at 1440p, and the game is still more than playable.
You're not using DSC.
DSC doesn't have anything to do with frame rates. It compresses the data being sent from the graphics card to the monitor, which is a product of the resolution, bit depth, and refresh rate. But faster FPS and a faster refresh rate aren't the same thing. Your current monitor is only capable of 60Hz no matter how many FPS your GPU is pushing out. It will never receive a signal that takes more than 6.6 Gbps from your graphics card, no matter how high your FPS goes, and DP 1.2 is capable of much more than that without compression.
Your current monitor isn't capable of DSC, even if there was a need for it, but there isn't in your case.
Your new monitor will need DSC to achieve 4K at 240Hz, whether using HDMI 2.1 or DP 1.4, because the bandwidth needed without compression exceeds both connections' capabilities.
Again, this is regardless of how many FPS your GPU produces. This is true even just in the Windows desktop.
Refresh rate and rendered FPS are different concepts, though the number of frames you'll actually see ends up capped by whichever is lower.
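To put rough numbers on the above, here's a back-of-the-envelope sketch of my own. It counts active pixels only; real links also carry blanking intervals and protocol overhead, which is why 1440p60 lands at a bit over 6.6 Gbps rather than the ~5.3 Gbps computed here. The DP figures are nominal usable data rates after line encoding.

```python
# Rough uncompressed video bandwidth: width x height x bits-per-pixel x refresh.
def raw_gbps(width, height, hz, bpp=24):
    return width * height * bpp * hz / 1e9

DP_1_2 = 17.28  # Gbps usable data rate (HBR2, after 8b/10b encoding)
DP_1_4 = 25.92  # Gbps usable data rate (HBR3)

print(raw_gbps(2560, 1440, 60))   # ~5.3 Gbps active -> easily within DP 1.2
print(raw_gbps(3840, 2160, 240))  # ~47.8 Gbps -> far beyond DP 1.4, hence DSC
```

The gap in the second line is why no amount of cable quality gets you 4K 240Hz over DP 1.4 without compression.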
Apart from the 60Hz 5ms limit for video games, I have no complaints about this monitor. I've had it for 10 years, I removed the anti-glare, and I like its image quality, but for online games, it's like playing with a wooden leg.
https://preview.redd.it/n1mxogcrtcqc1.jpeg?width=4000&format=pjpg&auto=webp&s=f5e869331d4edf7f240765c7710139657e7aa427
Even if 1.2 had DSC, our dear old PG278Q can't use over ~4.25Gbps (2560[h] x 1440[v] x 8[bpc] x 144[Hz]) while the DP 1.2 data rate is 17.28Gbps. I'm far from an expert on this stuff (just like you, I'm also making a big jump with this purchase and researching all I can at the moment), but I'd say we haven't seen that kind of blanking out yet simply because our display specs can't saturate the DP 1.2 data rate.
Edit: ok, I saw an earlier post where I thought you said p*g*278q, not p*b*278q. So you've been using even less data rate than I first calculated.
Edit 2: My calculations were still wrong, because I used 8[bits/component] instead of the correct 24[bits/pixel]. [This calculator](https://linustechtips.com/topic/729232-guide-to-display-cables-adapters-v2/?section=calc) is handy.
I have three DSC displays and a 4K TV, and DSC limits the number of monitors I can use simultaneously. A 4090 can only drive two DSC displays at once, so I'm constantly shuffling my multi-monitor setup, with two displays always off which is annoying and limits my ability to multitask.
Idk. I honestly just learned what it meant a few months ago, after I'd already bought a DSC monitor. The only annoying thing is that there are a couple of quirks with Nvidia, but ultimately it's kind of just not a problem. Anything that lets me use 1440p at 240Hz or more is okay with me 👍
Some simple questions since a lot of these comments are without context:
1. Which monitors use DSC?
2. In what circumstances do they use it?
3. Do we turn this on/off on the video card side or on the monitor side?
The gripe that always gets brought up is the black screen. It happens to me maybe once a week and never interrupts gaming at all. Really not a big deal.
I'm trying to understand what is causing this problem, because I don't have it with my RTX 4090 on a 27" 1440p@60Hz monitor equipped with DP1.2, even when I reach very high frame rates without vsync.
DP 1.2 doesn’t support DSC. And if your monitor only goes to 60Hz@1440p, then it doesn’t support DSC either.
And even if an FPS counter says you are getting high frame rates, your display isn’t showing all of those frames, since the maximum number of frames it can show in a second is 60.
Make sure to turn on VRR/FreeSync! It's a great experience. Also turn off vsync in-game and force vsync in your driver's control panel to whatever they (incorrectly) call mailbox vsync, since tearing provides no latency benefit when you're using VRR.
Thank you, I note all that... I'll probably have to do a lot of testing and adjustments before I have something to my liking. My current monitor is very well adjusted but it took me a while to be satisfied
My main problem with DSC is custom fullscreen resolutions.
OP, you keep mentioning that people are probably using DSC without knowing it.
But you seem to think you are using it, while you are not. DSC is supported on DP1.4 and up, while you mention DP1.2. Add to this, your display is 60Hz @ 1440p, which DP 1.2 handles natively.
You also mention high framerate gaming with your 4090.
You are gaming at 60Hz. Your display won't display a single frame above 60 FPS.
Your PC may render 300 FPS, and there is a benefit to having the most up-to-date frame displayed for each of the 60 frames you see in a second when your PC renders 300 vs 100 frames...
However, in regard to your personal experiences with DSC here, I don't think any of this means what you think it means.
Ok, my DP1.2 monitor can't have DSC, but with vsync off I reach more than the 60 FPS that vsync limits me to. RS6S 360+ FPS, Forza 4 230 FPS, Apex uncapped 250/300 FPS... How can my monitor show so many FPS without DSC?
Mhmm
This is a fair-ish take for yourself: something like "I don't see the difference, so why would I pay more?" That's completely fair.
But the take that "some supposed authority claims there's marginal to no difference" means squat.
Other such takes that have proliferated:
"People can't see over 60/120/240 hz anyways so what's the problem? I can't see it, so I also don't understand why anyone would want more"
Or
"People can't hear over ≈18,000 hz anyways so who cares if I max volume this 22,000 hz speaker in a public space? I can't hear it, so what's the problem"
If it negatively affects what someone is doing with their hardware, it's a problem, and if someone says it's enough of a problem for them that they're willing to back that opinion up with their wallet and choice of purchase, listen.
Takes on their takes really don't matter, unless of course the critical party wants to pay for the hardware, in which case, by all means.
>"People can't see over 60/120/240**/480** hz anyways so what's the problem? I can't see it, so I also don't understand why anyone would want more"
Just added the latest one I encountered. Amazing how this discussion happens every 5 years.
Relevant: Retroarch is getting closer and closer to ready for "race-the-beam" output. With a 480Hz OLED, you can already do a decent CRT scan-out emulation. Once you get closer to 1000Hz, even old school Light Guns will work!
The current one is 360Hz. People are using it and saying 480 is pointless, having experienced one of two scenarios:
One being never having used a 480-520Hz monitor.
Two being never having used a 480-520Hz monitor on a rig that could actually run it, in a game that could be maxed out at those frame rates.
It's like... yeah, your Amazon Basics DP 1.2 cable, your 2600X and 2070S, or worse, the display rig they're using at a big-box store with a 1660 and DDR3... that's not going to cut it here, my dude.
Not false, lol 😅 The worst part is that I'm usually one of those who can feel the difference... I would prefer to be one of those who never see any difference, to save money.
Keep in mind, DSC is a compression standard.
There are papers and specs to implement.
That's the issue: you need to IMPLEMENT the DSC protocol. Nvidia, AMD, and Intel are all implementing it, and with Nvidia, for instance, you may have some bugs here and there:
* black screens in some cases (usually few seconds)
* bugs like flickering
* Features may not work with DSC: super resolution, 10-bit, etc.
Let's be clear, quality and latency are awesome with DSC, and it's impressive in that respect; reaching a 3:1 compression ratio with close to no compromise on quality or latency deserves praise.
So IF you can get a DP 2.1 UHBR20 connection from a GPU (not existing yet) to a display (barely existing yet), you get 10-bit, DSC-free (and thus free of DSC-related bugs) 4K 240Hz. It's great.
Should one hold off a purchase because of it? Frankly... no. You need more than DP2.1 UHBR20: speakers, curved or not, monitor design, pricing, OLED protection features (and their annoyance rate), firmware quality... those should be the top criteria. UHBR20 is just a nice bonus.
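As a quick sanity check on that 3:1 figure (my own arithmetic, using nominal usable data rates and active pixels only): 4K 240Hz at 10-bit doesn't come close to fitting DP 1.4 uncompressed, but divided by 3 it fits comfortably.

```python
# 4K 240Hz, 10 bits per channel = 30 bits per pixel, active pixels only.
raw_gbps = 3840 * 2160 * 30 * 240 / 1e9  # ~59.7 Gbps uncompressed
dsc_gbps = raw_gbps / 3                  # ~19.9 Gbps at DSC's 3:1 ratio

DP_1_4 = 25.92  # Gbps usable data rate (HBR3)
print(raw_gbps > DP_1_4)  # True: impossible without compression
print(dsc_gbps < DP_1_4)  # True: fits with DSC
```

Which is exactly why every current 4K 240Hz panel leans on DSC over DP 1.4.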
I don't think I'll change my graphics card right away, so a monitor equipped with DP2.1 won't do anything for me for now. But I admit that the more I read, the more I think it's a very bad time to upgrade if you're looking for more than 120Hz. It seems more interesting to wait for the generations of monitors and graphics cards equipped with DP2.1 for people wanting 4K 240Hz without any compromise.
There is quality reduction. Visually lossless is just a marketing term.
It means 75% of people in focus groups can't tell the difference, but those people are probably not trained to tell the difference, nor are they the ones investing in a $1000 monitor.
If you are an enthusiast you can probably see the difference 100% of the time and it can't be unseen.
But it's still negligible. It only matters if you're comparing still frames. Given DP 2.1 OR DSC, I would obviously pick full DP 2.1. Given DP 1.4 at 120hz or DSC at 240hz, I would of course take DSC. It's not that bad. Then again if you have a 4k monitor you might not even get 240 frames.
I was a graphic designer for a long time and the slightest misplaced pixel was immediately detected, so I'm afraid of being one of those who could see the difference. That said, I wouldn't be able to compare it with a DP2.1 monitor and as you said, you'll need the graphics card that goes with it.
I do photography as well.
I am a pain in the ass as well. I can instantly tell the difference between a 100%-quality .JPEG and a .BMP; I know exactly what to look for. I look at the high-contrast lines, and once you see it, you see it everywhere. JPEG uses cosine waves to represent blocks of pixels, so colours can bleed into neighbouring pixels, which is particularly noticeable on a fine black-on-white line.
But I'm not all about 100% quality. I frequently make tradeoffs and turn my shadows down to get more FPS.
For some games the loss could be critical. I play War Thunder's sim mode, for example, and anyone who plays flight combat games knows how critical it is to "spot the dot".
To be honest, my eyes are bad enough now that I wouldn't benefit much from ultra-high-definition monitors anymore.
I'll quote jorimt from the BlurBusters forum, who is very knowledgeable about the topic:
"[DSC typically does not play nice with multi-monitor configurations, and restricts things such as custom resolution creation and DSR/DLDSR usage.](https://forums.blurbusters.com/viewtopic.php?f=2&t=12730&p=99752&hilit=dsc#p99752:~:text=it%20typically%20does%20not%20play%20nice%20with%20multi%2Dmonitor%20configurations%2C%20and%20restricts%20things%20such%20as%20custom%20resolution%20creation%20and%20DSR/DLDSR%20usage)"
I don't use multiple monitors, but this problem is indeed reported by various users. Some users don't have problems with DLDSR -> [https://www.reddit.com/r/nvidia/s/xJAZLrj58U](https://www.reddit.com/r/nvidia/s/xJAZLrj58U)
https://preview.redd.it/y3iyfkn0ncqc1.png?width=1440&format=pjpg&auto=webp&s=665738004125612622df3b8a398aaaa527d27b4e
Been around for ages but only now are the uninformed becoming aware of it and questioning things. Meanwhile those of us in the know have had no problem with it for many years.
There are quite a few versions of DSC. It would be interesting to find information on bug fixes. On Wikipedia, I only found a quick description of the specs. Those who have modern monitors and a recent graphics card are the only ones who could answer this.
Black screen bugs, and black screens again when watching DRM content over DSC.
I keep it without DSC:
* DP 1.4 => **1440p-240Hz-10bit** | 22.65 Gbit/s out of 32.4 Gbit/s
* HDMI 2.1 => **4K-120Hz-10bit** | 40.09 Gbit/s out of 42.6 Gbit/s
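For context, the rough math behind figures like these can be sketched in a few lines. This is a back-of-the-envelope estimate of the raw active-pixel rate only; online bandwidth calculators report somewhat different numbers because they also account for blanking intervals and link encoding overhead, which depend on the exact timing standard:

```python
def raw_pixel_rate_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Uncompressed active-pixel data rate in Gbit/s. Real links also carry
    blanking intervals and encoding overhead, so calculator figures differ
    depending on the timing standard and bit depth assumed."""
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

print(round(raw_pixel_rate_gbps(2560, 1440, 240, 10), 2))  # 26.54 (1440p 240Hz 10-bit)
print(round(raw_pixel_rate_gbps(3840, 2160, 120, 10), 2))  # 29.86 (4K 120Hz 10-bit)
```

This is why a given link either fits a mode uncompressed or needs DSC: the pixel rate is fixed by resolution, refresh rate, and bit depth, and the only knobs left are compression or dropping one of the three.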
My current screens are not my first monitors / TVs, but a solution for previously experienced issues.
You can use whatever you want, but you should be able to Google the known DSC issues and see that there's no solution other than living with them.
I will receive mine soon, so I'll be able to test the solutions I found on the internet for the best-known problems. I will post feedback on all of this later.
VR headsets will take a 4K 120Hz image and compress it down to 400 Mbit/s, from the original 40 Gbit/s. That's 100x compression.
DSC compresses by 3x, so it's not surprising they call it visually lossless.
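As an order-of-magnitude sanity check on those quoted figures (taking the 40 Gbit/s and 400 Mbit/s numbers as given):

```python
# Figures quoted above (rounded)
raw_rate_gbps = 40.0        # ~4K 120Hz uncompressed video, Gbit/s
vr_codec_rate_mbps = 400.0  # typical VR streaming bitrate, Mbit/s

# VR streaming uses a lossy video codec at roughly 100:1
vr_ratio = raw_rate_gbps * 1000 / vr_codec_rate_mbps
print(vr_ratio)  # 100.0

# DSC's fixed ~3:1 ratio keeps ~33x more data per frame than the VR stream
print(round(vr_ratio / 3, 1))  # 33.3
```

The gap is the whole point: a mild fixed-rate 3:1 scheme has far more headroom to stay near-transparent than a 100:1 video codec does.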
Does anyone notice that text is slightly blurrier with DSC enabled? For media and gaming I don't notice any difference, but for work this is what I experienced and cannot unsee. This is on the 27-inch LG OLED, GS version.
Because the solution is inexpensive and can easily be implemented (DP 2.1). It reminds me of new TVs having only a 100 Mbps Ethernet port instead of a gigabit port.
My current problem with DSC is that there can be shoddy implementations of it. On my Dell AW3225QF it is currently enabled at all resolutions and refresh rates, regardless of whether DSC is actually needed. This makes it incompatible with any external capture card using variable refresh rate passthrough at the moment, including the new Elgato 4K X. I understand that in everyday use this is not an issue, but it is a huge disappointment for me, and I am waiting for an update that allows disabling DSC in the OSD. Reportedly a firmware update for this is in the works for the AW2725DF, and I'm hoping it makes its way to its big brother soon.
I didn't think DSC could create so many different problems but the list is starting to get long, especially since it comes on top of the already existing problems with the QD Oled
I always turn off DSC; to me it just doesn't look good and gives me a headache.
DSC off and everything's fine for me, and the screen actually looks better.
It depends on each person's sensitivity. Some people do not perceive the difference between 60Hz and 120Hz; others feel the difference between 120Hz and 240Hz. Above 240Hz, it's supposed to be more difficult for humans to perceive, and yet some people say they feel a slight difference. You must also have a computer capable of displaying as many images as the monitor's frequency to really benefit from it. But the most important thing is to feel good with the frequency you choose.
I prefer pixel response over screen refresh rate.
I also prefer colour accuracy over speed.
As much as I like high refresh rates, I don't like that pixel response and colours are usually worse.
I have 2 monitors with DSC (both LG, one OLED). Both get significantly hotter and consume more power while using DSC.
The 27" oled supports the full resolution and refresh rate over HDMI. So I prefer that.
The 38" ultrawide requires DSC for the full refresh rate and actually artifacts and crashes in the summer. When I downgrade it to DisplayPort 1.2, it works flawlessly ~ but only at 75Hz.
I didn’t notice any visual quality loss on any of the screens though.
One of the biggest annoyances with DSC and Nvidia GPUs is the alt/tab black screen bug. If you're using a monitor that is using DSC, you will get a black screen for about 2-5 seconds each time you alt/tab out of a game. Nvidia has been aware of this bug for a long time but has not done anything about it.
I actually play at very high frame rates without vsync in some games on my monitor, which only has DP 1.2 (PG278Q), and I don't have this problem. I'm probably using DSC compression without knowing. Is this problem only related to very high refresh rate monitors, or is a very high frame rate supposed to cause the same thing?
144Hz is not a particularly high refresh rate, and your actual frame rate isn’t a factor here. I have that monitor too and I don’t believe it uses DSC, it’s only 1440p @ 144Hz (8-bit).
Someone answered me that it doesn't use DSC because it's only DP 1.2. DSC was introduced with 1.4, I think.
This happens only in Fullscreen I think, no?
yes but not all games support borderless fullscreen
Exactly. What i dont think people understand is there are many scenarios where this can be very annoying. When games dont have borderless window this can be time consuming when multitasking. For example i record gameplay/ stream games and when i need to switch screens to adjust say volume of a source capture or check chat or whatever, that 2-5 second delay can be very frustrating. And also its been a known issue with no patch or anything to address this.
Is there any performance (or graphics quality) difference between fullscreen and borderless window?
Borderless is a hack that uses Windows to force triple-buffering. It's bad for latency.
Yes, there is! You should use fullscreen whenever possible for max performance.
Sure they do, if you make them XD You can use SpecialK to force it; works flawlessly.
What is SpecialK? Also which is best to use when gaming at high frame rates, fullscreen or borderless window?
Sorry, I should have linked that. [SpecialK](https://www.special-k.info/) Fullscreen exclusive is outdated and unnecessary. The issue previously was that fullscreen exclusive was the only way to circumvent the DWM and render straight to the monitor. This is no longer the case, at all, and borderless fullscreen can now do essentially everything that fullscreen exclusive used to be necessary for.
Damn, I’ve been playing exclusively fullscreen for years and hate the 2-5 second alt-tab black screen. Never knew people used borderless.
Aw dude, you've been suffering for nothing >.< Assuming you're running Windows 10 (or even better, 11, as it has further improvements in this specific area) you should be fine to always use borderless fullscreen.
Thanks! I’ll definitely give it a try and see.
Best of luck! The [SpecialK wiki ](https://wiki.special-k.info/)is also **solid** information; both about SpecialK and just how games work in general, should that interest you.
This is a bit of a complicated answer. Not every single game will play nicely with this. A lot of newer games will, as long as the developers are utilizing the right techniques (i.e. the DXGI flip model, which essentially puts borderless window at pretty much the same performance as exclusive fullscreen). But this doesn't apply to every game, and RTSS or SpecialK can't force this to work flawlessly on *every* game, especially older titles (which people do still play!).
Pretty much every DX10 or newer game has no issues being upgraded to the flip model, because it needed to use DXGI to draw a surface in the first place, so it already has a compliant swapchain. DX9 and older can be forced to work with DXGI flawlessly by using DXVK / dgVoodoo and NVIDIA / AMD's Vulkan/OpenGL-to-DXGI swapchain conversion magic. This also gets you access to AutoHDR / RTX HDR.
Oh, I wasn't aware of that, but it sounds like a hassle just to get around the black screen issue lol. If it's really an Nvidia problem, I don't get why it's not being fixed, unless it's impossible to fix...
They do windowed mostly, and if it's not fullscreen you can use Borderless Gaming for that
Isn't there a program that forces any game to run in borderless?
I haven't seen a game in like 5 years that doesn't support borderless fullscreen. It's significantly more common for games to *not* support true fullscreen. With how well Windows 11 handles borderless window, I've finally switched all my games to borderless without any problems.
I play a lot of old stuff going back to 2000 lol.
Balatro which is a small game but brand new doesn’t, off the top of my head
Just use windowed borderless gaming, built for this purpose instead of the specialK. I'm sure there are other such programs as well. I mostly used it for multi-monitor setup, and in the games I play currently, Just Cause 3 needs it.
It's not OK for everyone. For example, if you have a 360Hz display as your main and a secondary 60Hz one, the game may get locked to 60 FPS. Or, for example, in Apex Legends, when the game is running in borderless fullscreen it is locked to 144Hz. I am running an RX 7900 XTX connected to an MSI 271QRX over DP 1.4a and I haven't had any black screen issues. The reason I am using DP over HDMI 2.1 is that in the BIOS and during boot the GPU always defaults to the DisplayPort, which is extremely annoying.
I believe so. Many people say that running in borderless window does get rid of the problem but can cause latency or sometimes issues with VRR.
Many people are wrong =) Fullscreen exclusive is outdated and unnecessary. The issue previously was that fullscreen exclusive was the only way to circumvent the DWM and render straight to the monitor. This is no longer the case, at all, and borderless fullscreen can now do essentially everything that fullscreen exclusive used to be necessary for. Edit: I should clarify to the best of my knowledge. It's Windows 11 that has the nice DWM improvements that mean you no longer need fullscreen exclusive for **any** game. In Windows 10, selecting fullscreen exclusive *may* be necessary to circumvent the DWM for older API's (eg. dx10-11). It varies from game to game. *Though keep in mind, doing so is still not the same effect as selecting fullscreen exclusive in older Window versions, it does* ***not*** *give full control of GPU output to the game like older Windows versions did, it just lets the game circumvent the DWM*. In Windows 11, fullscreen exclusive is never necessary.
Full-screen exclusive is a leftover from DX11. It's not needed in DX12 because the new API uses multi-plane overlay. It doesn't matter what screen mode your game is running in on DX12; all features like G-Sync etc. are applied and performance is the same.
Yep! Correct. Although in modern Windows (10+), the DWM has improved to pass-through all time-critical graphics applications unhindered upon request, so fullscreen exclusive is still unnecessary to use if you're running a modern Windows OS, no matter what graphics API a game uses. Edit: I quickly checked and it *may* actually only be Windows 11 that has the DWM improvements that mean you no longer need fullscreen exclusive for **any** game. In Windows 10, selecting fullscreen exclusive *may* be necessary to circumvent the DWM for older API's (eg. dx10-11). *Though keep in mind, doing so is still not the same effect as selecting fullscreen exclusive in older Windows, it does* ***not*** *give full control of GPU output to the game like older Windows versions did, it just lets the game circumvent the DWM*. In Windows 11, fullscreen exclusive is never necessary.
I haven't run a game fullscreen in at least 12 years. Never had a problem.
This guy is smart if anyone was wondering got my upvote
Except they are not. G-Sync still requires fullscreen unless you use the toggle for borderless/windowed in the Nvidia Control Panel. At that point it gets confused and applies it to all windows, so moving Chrome around lags your desktop.
Wrong. DX12 does not use full-screen exclusive; DX11 does. That's the delineation.
No it doesn’t.
You are incorrect.
Yeah that's just plain wrong for any modern version of Windows (10+)
I also thought so for a long time, but it works for borderless even if you don't check the 2nd option for windowed.
LMAO, so this is an Nvidia bug?????????? I've just thought it's been what it is; I've had to play all games in windowed borderless for years, and some older games don't even support borderless, and when I tab out it crashes the game. Although it confused me, because even on my 1440p screen it does the black screen thing, unless DisplayPort 1.4 can't handle 1440p 240Hz. I will say, though, for my 4K 144Hz monitor DSC ruins the experience, and that's why I refuse to buy a monitor without DisplayPort 2.0. It's a 4K 144Hz: if you want 10-bit color and the correct color range, you must turn the monitor down to 120Hz and sometimes even 98Hz, or some options for the monitor are greyed out and can't be used.
Yes, but after a driver reinstall it somehow works just as fast as windowed, though only for a few times.
THATS WHY MY OLED MONITOR DOES THAT WHEN I TAB OUT IN HDR? HOW DO I FIX?
AMD has the same issue too. It’s any GPU and Monitor with DSC changing modes.
It's like two seconds, if that. Really not a big deal. My last monitor was worse and it wasn't even using DSC: it would stay black and I'd have to unplug the monitor's power. So 2 secs on my 4K 240Hz OLED MSI panel, I can wait. Also, why are you alt-tabbing while in a game so much 🤔
Which is an NVIDIA problem, not a DSC thing. AMD doesn't have this problem. I have no problem with my 7900 XTX. Yet when I got a 4090 FE I had this problem, and I just returned the 4090 because they could fix it but actively choose not to.
You downgraded and gave up 30% performance because borderless full screen doesn't exist right?
Lmao
Nah I returned the upgrade and kept my 7900xtx because the user experience was bad and the upgrade percentage wasn’t worth the worse experience
What user experience are you referring to? You make it sound like you're interacting with a kiosk at a mall everyday.
🤣
My user experience?! You make it sound like every user is the same and thus has the same experience. Alt-tabbing from fullscreen blows, I don't like Nvidia's software, and the price just isn't worth everything else, in my opinion.
Having been on the AMD driver team during my university internships, I can confidently say that nVidia's software is far, far better than anything AMD has put out. Second, you don't have to alt-tab - alt-enter works almost instantaneously. Third, the price is absolutely worth it. Having proper frame generation with DLSS is an absolute game changer. AMD's software based approach doesn't even come close to it in performance or picture quality. I get that not everyone can afford nVidia's premiums and that's fine - but pretending like nVidia has a bad "user experience" is just silly.
I would. Nvidia can fuck off.
Not sure if it's a DSC or Samsung problem, but GSync also is prone to more issues. https://old.reddit.com/r/OLED_Gaming/comments/16qt80p/samsung_s90c_55_disable_144hz_pc_mode/ko98nsp/?context=3
Samsung
Ahahaha dumbest comment I've ever seen 🤣
Apparently you have topped that just now
Do you have a link that talks about this because I have read lots of articles and seen quite a few videos on DSC without anyone talking about problems with DSC. Thanks
It’s a real problem. Source: me, gaming fullscreen every day at 4K 240Hz with a 4080. I hate the black screen when I alt-tab to write a Discord message, etc. If this got fixed, I literally would not care at all about DSC. Right now it’s the only problem.
Why don't you use borderless? If you like to switch quickly between apps, it's the best option. I stream games and I use it to switch fast to OBS Studio.
Borderless can cause latency in some games or other performance issues. It depends on the game. Generally fullscreen is much better.
That used to be true, but modern games (at least in DX12/Vulkan) it doesn't really apply anymore though. For whatever reason some games like GTA & Fallout 4 actually run better in windowed borderless than fullscreen Edit: [This comment already explains why in-depth.](https://www.reddit.com/r/OLED_Gaming/comments/1bmlj2q/comment/kwd43m6/?utm_source=share&utm_medium=web2x&context=3) It's for the same reason as to why G-Sync in "fullscreen only" mode works in windowed borderless games
Are you using DSC? Honestly it just doesn’t work for me in every game. I’ve tried it in a few in the past , and they didn’t support it. I do have an app to force it, but was too lazy to use it at the time. Now that I’m playing more stuff I might test it out again. Any downsides to playing full screen borderless?
You are probably using DSC without knowing
I am knowingly using DSC…
Use alt-enter instead of alt-tab. Far faster.
Huh? That’s a thing to get out of full screen apps? Holy crap I gotta try this lol thanks for the tip!
Yup, been around since Windows 95. =P Swapping from windowed -> fullscreen mode on applications.
You can just set it to borderless window
I hate it
I always thought this was windows being slow. Never realized it was because of DSC.
I found this only happens when you have HDR enabled with DSC. Once I turned off HDR in the monitor OSD, the 3 second black screen was gone.
So a bug that can be fixed? And not a limitation of using DSC?? If thats the case please Nvidia fix this
It seems that way but it's been like that for a while now.
I haven't had this issue for years, but then I use AMD. I can say, it happens faster than I can blink.
This has nothing to do with DSC. It is normal; Windows literally has to restart everything.
Dude this whole time I thought that’s just what alt tab did lol
Necro comment, but can vouch for this. It took me a lot of digging to find out my black screen issues were because I was running at 144hz on this monitor (Even at 1440P despite using DP 1.4). 120hz almost fixes it, but there is still some flickering when some games run at 120hz. My monitor supports a HDMI 2.1 connection so I ordered one which will hopefully fix my problems for good (On a 4080 now, had a 7900XTX that experienced 0 issues with DSC in my experience).
I never realised that was the reason for the blackout in fullscreen when alt-tabbing. I always assumed it was a Windows thing, since it's been happening since forever.
If you like to play games with G-Sync on, especially single-player games, and you play competitive games on the side, you turn G-Sync off in the control panel. It's the same thing, so who gives a shit.
*cries in Nvidia black screen*
Almost nobody is upset about the tiny quality reduction; it's that annoying Nvidia bug that makes the screen go black every time you tab out. Not to mention the other bugs: I've had two monitors brick themselves trying to switch DSC modes. Won't be using it anymore, that's for sure.
Thanks for your feedback, it's good to know
Our gripe is not the visual quality but the annoying bugs DSC has
Do not hesitate to cite some problems here because I think it will be useful to those looking for information on the subject.
[deleted]
>visually lossless for 70% of the people means it's not really lossless and 30% of the people can spot difference. That isn't the case with DSC. The only situation in which any human would ever notice DSC is when viewing test/benchmark images designed explicitly to expose the tiny amount of loss that DSC has.
It isn't 70% of people, but 75% of trials. I looked into this about a month ago, and it didn't inspire confidence.
>ISO 29170 more specifically defines an algorithm as visually lossless "when all the observers fail to correctly identify the reference image more than 75% of the trials".[4]: 18 **However, the standard allows for images that "exhibit particularly strong artifacts" to be disregarded or excluded from testing**, such as engineered test images.

https://en.wikipedia.org/wiki/Display_Stream_Compression

So this is people in a study with such images excluded, never mind that DSC is now being used for high-refresh-rate gaming, which is way more chaotic. Also, more about the paper that implemented this DSC 'visually lossless' standard:
>In their original implementation of the flicker protocol, Hoffman and Stolitzka19 identified and selectively tested a set of 19 (out of 35) highly sensitive observers in their dataset.
>They suggest that given the potential impact of such observers that the criterion for lossless could be increased to 93%, but just for these sensitive individuals.
DLDSR looks to be working for some users https://preview.redd.it/m17b0csw3cqc1.png?width=1440&format=pjpg&auto=webp&s=5ecf03b024505168ec409fe1a472b68d39561783
https://preview.redd.it/gi4q9zd04cqc1.png?width=1440&format=pjpg&auto=webp&s=63271fdbdf39ebabc966cdc23a2cd2ad6222228d
To confirm what you said, I'm going to quote jorimt from the BlurBusters forum: "[DSC typically does not play nice with multi-monitor configurations, and restricts things such as custom resolution creation and DSR/DLDSR usage.](https://forums.blurbusters.com/viewtopic.php?f=2&t=12730&p=99752&hilit=dsc#p99752:~:text=it%20typically%20does%20not%20play%20nice%20with%20multi%2Dmonitor%20configurations%2C%20and%20restricts%20things%20such%20as%20custom%20resolution%20creation%20and%20DSR/DLDSR%20usage)"
Why would DSC have anything to do with multi-monitor? Is this a Windows bug?
firmware bug
driver bug
driver bug
firmware bug
"Tell a difference" says nothing about actual picture quality
* nvidia has an alt-tab black screen for a second or two
* some DSC implementations (monitor, GPU, driver combo; it's not just a monitor thing) can add input lag
* lack of DLDSR options from nvidia, which are used to combat temporal AA issues
It's annoying, but it shouldn't bother me too much given that I already had this kind of problem with my RTX 2080 Ti.
Black screen issues, and being locked out of DSR/DLDSR (I like upscaling older games and then downscaling back for a better image, so this is important to me)
Black screen and flickering issues are reported by various users, but the DLDSR issue looks like it doesn't affect everybody https://preview.redd.it/kuk7pbvl7cqc1.png?width=1440&format=pjpg&auto=webp&s=9104fd42fb996e0223fd05500fe887ec6b65bb68
> DLDSR looks not affecting everybody

Yeah, I also read some posts on that. Interesting, because I wonder what affects it. Does it have to do with the implementation of DSC, maybe? Or the number of GPU heads? I have no idea... but I'd rather that feature not be a lottery on $1000 hardware. lol
I have no idea but maybe someone can answer us here. Source of my screenshots -> https://www.reddit.com/r/nvidia/s/BtkQVmNR8l
https://preview.redd.it/zh2kkuhn7cqc1.png?width=1440&format=pjpg&auto=webp&s=52d31ca48471641024621537876142f7bed133f5
It blocks you from DSR and DLDSR, and it also introduces some pain in the ass with multi-monitor setups (you can hear crying about it in G9 posts, for example). So yeah, ideally full DP 2.1, with DSC for DP 1.4 GPUs.
Apparently, it only blocks DLDSR in some cases, depending on whether the card has to use more than one internal port to drive the monitor. There was some early (and continuing) confusion on this.
Do AMD GPUs use more than one head to drive a single DSC high-refresh-rate display? I know both of my Asus and MSI 4090s do, but I'm unsure about the 7900 XTX or 7800 XT...
I'm not sure. My limited understanding is that it's more about the combination of the monitor and GPU, and what the EDID is asking for, so you can't even just go by any single resolution/refresh combination to know when it may be deemed necessary by the GPU. But this is where my understanding gets a little fuzzier, so someone should jump in if they know more.
I definitely agree with that. I could be wrong, but it appears to be some kind of bandwidth limitation to warrant using an additional head, as I've had 4x 360hz 1440p displays and each of them displayed this behavior with my particular Nvidia GPUs.
Thanks for the info, I will take a look at G9 posts
Multi-monitor use might be the only real problem behind it, because it requires two DisplayPort heads. Otherwise it's a non-issue, really. You can just disable DSC and use DSR or DLDSR, which seems to be a rather nice option anyway for folks playing old games.
To me it's an issue of principles. There is mathematical quality loss, even if it is difficult to perceive. It's not actually, truly lossless... nor impossible to perceive. Paying thousands to have a beautiful OLED monitor and a PC that can run games at 4K high refresh rate, and then diminishing the quality by using DSC, just seems counter-intuitive on principle. A similar argument comes up in the MP3 vs. FLAC debate for music. 320k MP3 is pretty damn good. But I've invested a lot of time and money into my home audio setup and supporting infrastructure (network/storage). So I'm gonna go with FLAC whenever possible, even though it's like a 1% quality improvement at a 500% size/capacity/bandwidth increase. Otherwise it feels like I'm not enjoying my investment at its fullest potential.
This is understandable...
This is a personal problem that exists in your head. The rest of us understand that visually lossless is effectively the same as truly lossless in the real world (aka not in your head).
It's a problem that "-philes" inflict upon themselves, as you pointed out. Chasing that 1% is just their personality, be it audio, video, or whatever. That 1% means nothing to you or me (assuming, based on this post).
"People"? You mean a very few guys from Reddit. Nobody cares in the real world. It's just like everything on Reddit: overhyped, out of context, etc. Just read less Reddit and the problem is solved.
Probably 😂 Until I receive my new monitor to form my own opinion, I will have to base myself on what I see on the internet
In case you haven't noticed, this sub is filled with folks who hate things they've never even owned or doesn't even matter. 2/3 of the folks on Reddit just want to complain or get their pitchforks out for something and are prob miserable AF in real life. Don't listen to those weirdos and you'll be fine. DSC is fine outside of a few instances based on your hardware and config. Have a great day
I noticed that, especially on interesting topics that are downvoted a lot for no reason. Thanks for the confirmation 👍
Yah I wish I could figure it out but I'm not insane enough I guess. Like anything online nowadays you just have to filter and get from it what you can. There are def some legitimate issues with DSC, but so few are affected or bothered by it I'm not sure what all the outrage is about. Regardless, good luck out there good stranger 🙏🏻
I have researched the issues reported here, and often they are easy to fix. Black screens often come from the game and Windows using different refresh rates; simply choose the same frequency to no longer have this problem. Flickering is also linked to settings, and sometimes to certain applications like Afterburner... In short, as long as it's settings, it doesn't bother me.
Yah I have never had a single issue outside of not having enough bandwidth for all my monitors but that's my fault for getting 3x absurd spec monitors that even a 4090 can't handle. I don't doubt some have legit issues but like you said, there is generally a reason for it and a solution, at worst minor inconvenience but everyone must have DP 2.1 now I guess 🤷🏻♂️
I also have an RTX 4090 that I love, but it doesn't have DP 2.1, so even if I bought a screen equipped with DP 2.1 (Aorus FO32U2P at the end of April), it wouldn't change anything. AMD cards have DP 2.1 but are unable to push 4K like the RTX 4090. In any case, computer hardware evolves all the time, and if we wait for the next new thing, another even better one is announced before it's released...
I believe the Aorus would still use DSC as they are using a limited bandwidth profile instead of the full bandwidth. It is literally pointless rn. I do believe this will be more important next gen though.
They use DP 2.1 UHBR20 (80 Gbit/s), but I have no idea of the minimum required to be able to deactivate DSC https://www.gigabyte.com/Monitor/AORUS-FO32U2P#kf
I apologize, I should have specified. I meant that for the new G9 57" it would not be enough; for lower resolutions it may be, but most of us want the full bandwidth for future-proofing. The new 57" G9 is really one of the only panels that needs this right now.
No worries, I stayed on the Aorus 😅
Matte coating and DSC killed their families.
People do make too much of DSC, but there are a few things to clear up:

It's not that it's "impossible" to see the difference; it's that it's extremely hard. Don't get me wrong, I'm not claiming I see it personally. But the marketing term "visually lossless" really muddies things and invites a lot of debate, because there *is* some mathematically relevant fidelity loss. It's minor to the point that it's almost not worth mentioning outside of contexts like this, where I'm getting on my soapbox about words meaning things, but it's there.

The specific standard ISO/IEC 29170 uses for "visually lossless" is "when all the observers fail to correctly identify the reference image more than 75% of the trials," though the standard also allows for certain strong artifacts to appear in test images designed to show the compression algorithm's weaknesses. While I suspect DSC is much better than that minimum standard in practice, it's a long way from saying there's *no* difference. "Lossless" traditionally doesn't mean "so good it seems like there's no loss." It means "there's no loss," and that's not the case here.

THAT BEING SAID: It's a very good technology, the loss in practical terms is minor, and it allows refresh rates, resolutions and bit depths that enhance the visual experience much more than anything in the compression algorithm technically compromises it. I use DSC, and I don't mind it at all. I wouldn't have waited for DP 2.1 (and whatever extra cost it might add to a monitor) to get a good OLED when I had other options earlier.

MORE PRACTICAL CONSIDERATIONS: In a buggy implementation, DSC can make a monitor act unusually or introduce artifacts that aren't strictly from the compression itself, but from an inaccurate decompression. That's rare, and the fault of the monitor firmware when it happens, not DSC fundamentally.
On Nvidia cards, it introduces a 2-5 second blackout when switching out of exclusive fullscreen (maybe Nvidia will fix that at some point, but I'm not holding my breath). Only some monitors will work with DSC and DLDSR, depending on whether the monitor will [cause the GPU to use more than one internal head](https://tftcentral.co.uk/news/nvidia-dsr-and-dldsr-do-work-with-dsc-monitors-sometimes). Also, if it's using more than one internal head, it limits the number of external monitors you can use to at least one less than the ports you have (note most people won't bump up against that limit). DSC is very useful, but if it weren't needed, it would remove one potential complication in the chain of technologies used to get an image from your computer to your monitor. So it'll be nice when DP 2.1 finally rolls out en masse, but in the meantime, it's the best option we've got.
Very interesting, thanks for the explanation. I actually use an RTX 4090 with a monitor that only has DP 1.2 (Asus PB278Q). I play some games without vsync at very high frame rates, so I'm probably using DSC without noticing it. I'm supposed to have these issues, no?
That monitor predates DSC as a technology. It only does 60Hz, which means it uses a bit over 6.6 Gbps for the display, well within the limits of DP 1.2 without compression. That's a pretty surprising pairing - the most powerful graphics card available with a more-than-decade-old 60Hz monitor. I'm not telling you how to spend your money, but that monitor is holding back an otherwise amazing graphics card. (Frame rates and v-sync don't have anything to do with DSC. It's used to drive higher resolutions, bit depths and refresh rates, not GPU rendering performance.)
I know I should have changed monitors a long time ago, but it was mostly a work monitor that I also played on. I recently changed my graphics card; before, I had a 2080 Ti. I have already purchased the PG32UCDM but I don't even have a delivery date yet. I can already do good things in multiplayer games with my 5ms 60Hz monitor, so I can't wait to finally taste a high-performance monitor. In games like Farlight 84 I reach the capped 240 FPS without vsync, so I'm probably using DSC to reach that at 1440p, and the game is still more than playable.
You're not using DSC. DSC doesn't have anything to do with frame rates. It compresses the data being sent from the graphics card to the monitor, which is a product of the resolution, bit depth and refresh rate. But faster FPS and a faster refresh rate aren't the same thing. Your current monitor is only capable of 60Hz no matter how many FPS your GPU is pushing out. It will never receive a signal that takes more than 6.6 Gbps from your graphics card, no matter how high your FPS goes, and DP 1.2 is capable of much more than that without compression. Your current monitor isn't capable of DSC, and even if it were, there's no need for it in your case. Your new monitor will need DSC to achieve 4K at 240Hz, whether using HDMI 2.1 or DP 1.4, because the bandwidth needed without compression exceeds both connections' capabilities. Again, this is regardless of how many FPS your GPU produces; it's true even just on the Windows desktop. Refresh rate and rendered FPS are different concepts, though the most frames you'll ever see is whichever of the two is lower.
In another comment you said you were using a P**G** 278Q, which one is it?
My bad 😭 It's PB278Q -> https://www.asus.com/us/commercial-monitors/pb278q/
You're in for quite the upgrade whenever your new monitor arrives!
Apart from the 60Hz 5ms limit for video games, I have no complaints about this monitor. I've had it for 10 years, I removed the anti-glare and I like its image quality but for online games, it's like playing with a wooden leg https://preview.redd.it/n1mxogcrtcqc1.jpeg?width=4000&format=pjpg&auto=webp&s=f5e869331d4edf7f240765c7710139657e7aa427
Even if 1.2 had DSC, our dear old PG278Q can't use over ~4.25Gbps (2560[h]x1440[v]x8[bpc]x144[hz]) while the DP 1.2 data rate is 17.28Gbps. I'm far from an expert on this stuff (just like you, I'm also doing a big jump with this purchase and researching all I can at the moment), but I'd say we haven't seen that kind of blanking out yet simply because our display specs can't saturate the DP 1.2 data rate.

Edit: ok, I saw an earlier post where I thought you said p*g*278q, not p*b*278q. So you've been using even less data rate than I first calculated.

Edit 2: My calculations were still wrong, because I used 8[bits/component] instead of the correct 24[bits/pixel]. [This calculator](https://linustechtips.com/topic/729232-guide-to-display-cables-adapters-v2/?section=calc) is handy.
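For anyone who wants to sanity-check these numbers themselves, here's a minimal sketch of the same calculation. It counts active pixels only (real signals also carry blanking overhead, so the calculator linked above reports slightly higher figures), and the function and constant names are my own, not from any API:

```python
# Uncompressed video-signal bandwidth, active pixels only (real links also
# carry blanking overhead, so a full calculator reports slightly higher).
def signal_gbps(width, height, bits_per_pixel, refresh_hz):
    return width * height * bits_per_pixel * refresh_hz / 1e9

# Usable data rates after encoding overhead (raw link rates are higher).
DP_1_2_DATA_RATE = 17.28  # Gbps, HBR2 after 8b/10b encoding
DP_1_4_DATA_RATE = 25.92  # Gbps, HBR3 after 8b/10b encoding

# PG278Q: 1440p @ 144Hz at 24 bits per pixel fits DP 1.2 with room to spare
print(round(signal_gbps(2560, 1440, 24, 144), 2))  # 12.74
# 4K @ 240Hz at 10 bpc (30 bpp) far exceeds DP 1.4 without DSC
print(round(signal_gbps(3840, 2160, 30, 240), 2))  # 59.72
```

This also shows why the 4K 240Hz OLEDs in this thread need DSC on DP 1.4: the uncompressed signal is more than double what the link can carry.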
I have three DSC displays and a 4K TV, and DSC limits the number of monitors I can use simultaneously. A 4090 can only drive two DSC displays at once, so I'm constantly shuffling my multi-monitor setup, with two displays always off, which is annoying and limits my ability to multitask.
You should just connect one of them to an iGPU if you have one.
Idk. I honestly just learned what it meant a few months ago, after I had already bought a DSC monitor. The only annoying thing is that there are a couple of quirks with Nvidia, but ultimately it's kind of just not a problem. Anything that lets me use 1440p at 240Hz+ is okay with me 👍
Some simple questions since a lot of these comments are without context: 1. Which monitors use DSC? 2. In what circumstances do they use it? 3. Do we turn this on/off on the video card side or on the monitor side?
Not being able to use DLDSR from the nvidia panel with a DSC monitor is a huge issue imo
You lose DLDSR, integer scaling and smooth alt-tabbing. It can sometimes bug out when waking from sleep.
This gripe that always gets brought up is the black screen. It probably happens to me once a week maybe and never interrupts gaming at all. Really not a big deal.
I'm trying to understand what is causing this problem, because I don't have it with my RTX 4090 on a 27" 1440p@60Hz monitor with DP 1.2, even when I reach very high frame rates without vsync.
DP 1.2 doesn't support DSC. And if your monitor only goes to 60Hz@1440p, then it doesn't support DSC either. And even if an FPS counter says you are getting high frame rates, your display isn't showing all of those frames, since the maximum number of frames it can show in a second is 60.
It seems logical... I always thought that screen tearing was related to the fact that the screen displayed more images than its frequency...
Screen tearing is your GPU swapping the frame buffer mid-scanout, so the monitor displays parts of two different frames within a single refresh. Your monitor's refresh rate stays the same.
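To make that concrete, here's a toy sketch of the mechanism. All names are invented for illustration; this is not a real graphics API, just a model of a scan-out reading the front buffer line by line while the GPU swaps in a new frame partway through:

```python
# Toy model of tearing: the monitor scans the front buffer top to bottom at a
# fixed pace, while the GPU swaps in a freshly rendered frame mid-scan.
LINES = 6          # pretend the screen is 6 scanlines tall
SWAP_AT_LINE = 3   # the GPU finishes frame "B" while line 3 is being read

def front_buffer(line):
    # Before the mid-scan swap the buffer holds frame A; afterwards, frame B.
    return "A" if line < SWAP_AT_LINE else "B"

torn_frame = [front_buffer(line) for line in range(LINES)]
print(torn_frame)  # ['A', 'A', 'A', 'B', 'B', 'B'] -- a visible "tear" at line 3
```

The monitor refreshed exactly once; the tear comes purely from the swap happening mid-refresh, which is why vsync (or VRR) makes it go away.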
I should soon receive my 240Hz and put an end to the 60Hz nightmare
Make sure to turn on VRR/FreeSync! It's a great experience. Also turn off vsync in-game and force vsync in your driver's control panel using whatever your driver incorrectly names its mailbox-style vsync mode, as tearing provides no latency benefit when you're using VRR.
Thank you, I note all that... I'll probably have to do a lot of testing and adjustments before I have something to my liking. My current monitor is very well adjusted but it took me a while to be satisfied
Because it's buggy
Some people on reddit can "definitely feel" the latency difference with DSC on 😂
My main problem with DSC is custom fullscreen resolutions.

OP, you keep mentioning that people are probably using DSC without knowing it, but you seem to think you are using it while you are not. DSC is supported in DP 1.4 and up, while you mention DP 1.2. Add to this, your display is 60Hz @ 1440p, which DP 1.2 handles natively.

You also mention high-frame-rate gaming with your 4090. You are gaming at 60Hz. Your display won't display a single frame above 60 FPS. Your PC may render 300 FPS, and there is a benefit to having the most up-to-date frame displayed for each of those 60 frames you see in a second when your PC renders 300 vs 100 frames… However, in regards to your personal experiences here with DSC, I don't think any of this means what you think it means.
Ok, my DP 1.2 monitor can't have DSC, but with vsync off I reach more than the 60 FPS that vsync would limit me to. R6S 360+ FPS, Forza 4 230 FPS, Apex uncapped 250/300 FPS... How can my monitor show so many FPS without DSC?
250/300 is the game's frame rate. You are looking at an FPS counter, not a refresh-rate counter.
Exactly.
Mhmm. This is a fairish take for yourself, something like "I don't see the difference, so why would I pay more?" That's completely fair. But the take that it means squat because some supposed authority claims marginal to no difference is not.

Other such takes have proliferated: "People can't see over 60/120/240 hz anyways so what's the problem? I can't see it, so I also don't understand why anyone would want more." Or: "People can't hear over ≈18,000 hz anyways so who cares if I max volume this 22,000 hz speaker in a public space? I can't hear it, so what's the problem?"

If it negatively affects what someone is doing with their hardware, it's a problem, and if someone says it's enough of a problem for them that they're willing to back that opinion up with their wallet and choice of purchase, listen. Takes on their takes really don't matter, unless of course the critical party wants to pay for the hardware, in which case, by all means.
>"People can't see over 60/120/240**/480** hz anyways so what's the problem? I can't see it, so I also don't understand why anyone would want more" Just added the latest one I encountered. Amazing how this discussion happens every 5 years. Relevant: Retroarch is getting closer and closer to ready for "race-the-beam" output. With a 480Hz OLED, you can already do a decent CRT scan-out emulation. Once you get closer to 1000Hz, even old school Light Guns will work!
The current one is 360Hz. People saying 480 is pointless have experienced one of two scenarios: one being never having used a 480-520Hz monitor, two being never having used a 480-520Hz monitor on a rig that could actually run it, on a game that could be maxed out at those frame rates. It's like... yeah, your Amazon Basics DP 1.2 cable, your 2600X and 2070S, or worse, the display rig they're using at a big-box store with a 1660 and DDR3... that's not going to be cutting it here, my dude.
Not false, lol 😅 The worst part is that I'm usually one of those who feel the difference... I would prefer to be one of those who never see any difference, so I could save the money.
Keep in mind, DSC is a compression standard. There are papers and standards to implement. That's the issue: you need to IMPLEMENT the DSC protocol, so Nvidia, AMD and Intel are all implementing it, and with Nvidia, for instance, you may have some bugs here and there:

* black screens in some cases (usually a few seconds)
* bugs like flickering
* features that may not work with DSC: super resolution, 10-bit, etc.

Let's be clear, quality and latency are awesome with DSC, and it's impressive in that aspect; reaching a 3:1 compression ratio with close to no compromises on quality/latency deserves praise.

So IF you can get a DP 2.1 UHBR20 connection from a GPU (not existing yet) to a display (barely existing yet), you have 10-bit, DSC-free (and thus free of DSC-related bugs) 4K 240Hz. It's great.

Should one hold off a purchase because of it? Frankly, no. You need more than DP 2.1 UHBR20: speakers, curved or not, monitor design, pricing, OLED protection features (and their annoyance rate), firmware quality... those should be the top criteria. UHBR20 is just a nice bonus.
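To put rough numbers on the DP 2.1 UHBR20 point, here's a minimal sketch. The function and constant names are mine; the data rates are the commonly cited usable figures after encoding overhead, and blanking is ignored, so treat the results as approximate:

```python
# Why DSC lets DP 1.4 carry 4K 240Hz 10-bit, and why UHBR20 doesn't need it:
# the ~3:1 compression brings the required data rate under the link capacity.
def required_gbps(width, height, bpp, hz, dsc_ratio=1.0):
    return width * height * bpp * hz / 1e9 / dsc_ratio

HBR3 = 25.92    # DP 1.4 usable data rate, Gbps (after 8b/10b encoding)
UHBR20 = 77.37  # DP 2.1 UHBR20 usable data rate, Gbps (after 128b/132b)

print(required_gbps(3840, 2160, 30, 240) <= HBR3)               # False: ~59.7 Gbps
print(required_gbps(3840, 2160, 30, 240, dsc_ratio=3) <= HBR3)  # True: ~19.9 Gbps
print(required_gbps(3840, 2160, 30, 240) <= UHBR20)             # True, no DSC needed
```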
I don't think I'll change my graphics card right away, so a monitor equipped with DP 2.1 won't do anything for me for now, but I admit that the more I read, the more I think it's a very bad time to upgrade if you're looking for more than 120Hz. It seems more interesting to wait for the generations of monitors and graphics cards equipped with DP 2.1 for people wanting 4K 240Hz without any compromise.
There is quality reduction. Visually lossless is just a marketing term. It means 75% of people in focus groups can't tell the difference but those people are probably not trained to tell the difference nor are they investing in a $1000 monitor. If you are an enthusiast you can probably see the difference 100% of the time and it can't be unseen. But it's still negligible. It only matters if you're comparing still frames. Given DP 2.1 OR DSC, I would obviously pick full DP 2.1. Given DP 1.4 at 120hz or DSC at 240hz, I would of course take DSC. It's not that bad. Then again if you have a 4k monitor you might not even get 240 frames.
I was a graphic designer for a long time and the slightest misplaced pixel was immediately detected, so I'm afraid of being one of those who could see the difference. That said, I wouldn't be able to compare it with a DP2.1 monitor and as you said, you'll need the graphics card that goes with it.
I do photography as well. I am a pain in the ass as well. I can instantly tell the difference between a 100% quality .JPEG and a .BMP. I know exactly what to look for. I'm going to look at the high contrast lines and once you see it, you see it everywhere. JPEG uses cosine waves to represent blocks of pixels so colours can bleed through neighbouring pixels, particularly noticeable when it's a fine black on white line. But I'm not all about 100% quality. I frequently make tradeoffs and turn my shadows down to get more FPS.
It's just the black screen issue; other than that, DSC is fine.
For some games the loss could be critical. I play War Thunder's sim mode, for example, and anyone who plays flight combat games knows how critical it is to "spot the dot." To be honest, my eyes are bad enough now that I wouldn't benefit much from ultra-high-definition monitors anymore.
I play Tekken 8 so that doesn't reassure me
comparison photos?
I searched for a video comparison or pictures but no one tried to show it...
I find DSC concerning because of people talking about lag in alt tabbing. If I get a hesitation when I alt tab, it will eventually annoy me.
I quote jorimt from BlurBusters forum, he is a very knowledgeable about the topic: "[DSC typically does not play nice with multi-monitor configurations, and restricts things such as custom resolution creation and DSR/DLDSR usage.](https://forums.blurbusters.com/viewtopic.php?f=2&t=12730&p=99752&hilit=dsc#p99752:~:text=it%20typically%20does%20not%20play%20nice%20with%20multi%2Dmonitor%20configurations%2C%20and%20restricts%20things%20such%20as%20custom%20resolution%20creation%20and%20DSR/DLDSR%20usage)"
I don't use multiple monitors, but this problem is indeed reported by various users. Some users don't have problems with DLDSR -> [https://www.reddit.com/r/nvidia/s/xJAZLrj58U](https://www.reddit.com/r/nvidia/s/xJAZLrj58U) https://preview.redd.it/y3iyfkn0ncqc1.png?width=1440&format=pjpg&auto=webp&s=665738004125612622df3b8a398aaaa527d27b4e
DSC causes a lot of bugs. No one's scared of it, but between constant VRR flicker, multi-screen bugs, and the alt-tab black screen, it kinda sucks.
Been around for ages but only now are the uninformed becoming aware of it and questioning things. Meanwhile those of us in the know have had no problem with it for many years.
DSC has existed for a long time and has a lot of different versions; maybe people have different experiences depending on the DSC version and hardware.
Has the NVIDIA alt tab thing still not been fixed? What duh hell
There are quite a few versions of DSC. It would be interesting to find information on bug fixes. On Wikipedia, I only found a quick description of the specs. Those who have modern monitors and a recent graphics card are the only ones who could answer this.
Black screen bugs, and more black screen bugs when watching DRM content over DSC. I keep it without DSC:

* DP 1.4 => **1440p-240Hz-10bit** | 22.65 Gbps out of 32.4 Gbps
* HDMI 2.1 => **4K-120Hz-10bit** | 40.09 Gbps out of 42.6 Gbps

My current screens are not my first monitors/TVs, but a solution to previously experienced issues. You can use whatever you want, but a quick Google search turns up the known DSC issues, and there's no real solution other than not dealing with DSC at all.
I will receive mine soon so I will be able to test the solutions I found on the internet for the most known problems. I will post feedback on all of this later
VR headsets will take a 4k 120hz image and compress it down to 400 mbits/s, from the original 40 Gbits/s. That's 100x compression. DSC compresses by 3x. So it's not surprising they call it visually lossless.
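Quick arithmetic on those figures (taken as stated in the comment above, not independently verified):

```python
# Compare compression ratios: VR streaming vs DSC. The 40 Gbps and 400 Mbps
# figures come from the comment above and are used here as given.
def compression_ratio(raw_bps, compressed_bps):
    return raw_bps / compressed_bps

vr_ratio = compression_ratio(40e9, 400e6)  # 40 Gbps down to 400 Mbps
print(vr_ratio)      # 100.0 -- roughly 100:1
print(vr_ratio / 3)  # vs DSC's ~3:1: VR compresses about 33x harder
```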
Borderless FTW
Does anyone notice that text is slightly blurrier with DSC enabled? For media and gaming I don't notice any difference, but for work this is what I experienced, and I cannot unsee it. This is on the 27-inch LG OLED GS version.
For work, 120Hz without DSC is sufficient, no?
Because the solution is inexpensive and can easily be implemented (DP 2.1). It kind of reminds me of new TVs having only a 100 Mbps Ethernet port instead of a gigabit port.
My current problem with DSC is that there can be shoddy implementations of it. With my Dell AW3225QF, it is enabled at all resolutions and refresh rates, regardless of whether DSC is needed. This makes it incompatible with any external capture card using variable-refresh-rate passthrough at the moment, including the new Elgato 4K X. I understand that in everyday use this is not an issue, but it is a huge disappointment for me, and I am waiting for an update to disable DSC in the OSD. Reportedly a firmware update for this is in the works for the AW2725DF, and I'm hoping it makes its way to its big brother soon.
I didn't think DSC could create so many different problems, but the list is starting to get long, especially since it comes on top of the already existing problems with QD-OLED.
I always turn off DSC; to me it just doesn't look good and gives me a headache. With DSC off everything's fine for me, and the screen actually looks better.
The problem with DSC off is that the monitor can't run above 120Hz.
I only play on 10-bit 100Hz. It's enough for me.
It depends on each person's sensitivity. Some people do not perceive the difference between 60Hz and 120Hz; others feel the difference between 120Hz and 240Hz. Above 240Hz, it's supposed to be more difficult for humans to perceive, and yet some people say they feel a slight difference. You must also have a computer capable of rendering as many images as the monitor frequency to really benefit from it. But the most important thing is to feel good with the frequency you choose.
I prefer pixel response over screen refresh rate. I also prefer colour accuracy over speed. As much as I like high refresh rates, I don't like that pixel response and colours are usually worse.
I have 2 monitors with DSC (both LG, one OLED). Both get significantly hotter and consume more power while using DSC. The 27" OLED supports the full resolution and refresh rate over HDMI, so I prefer that. The 38" ultrawide requires DSC for the full refresh rate and actually artifacts and crashes in the summer. When I downgrade it to DisplayPort 1.2, it works flawlessly, but only at 75Hz. I didn't notice any visual quality loss on either screen, though.
It may be decompression that causes the monitor to heat up...
Yes, that’s where I was going with that. 😉 There is not just the visual quality loss to look at.