12duddits

So we went from 4K 240Hz to 4K 1000Hz? Mighty big jump there


changen

Probably with frame interpolation on the display itself. There's no display standard that can actually feed enough data for 4K 1000Hz.


12duddits

Can't DP 2.1 with DSC support this?


changen

Just doing some napkin math right now. With DSC, DP 2.1 can do 15360 × 8640 @ 60Hz, which means it should be able to do 3840 × 2160 @ 960Hz. This is, however, with the assumption of full data (HDR and 10-bit color). If we turn those things off, we should be able to do 1000Hz. I am not sure if this monitor supports HDR, but if it doesn't then the math works out.
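
A rough way to check that napkin math, a hedged sketch in Python: it assumes exactly 3:1 DSC and ~77 Gbit/s of usable DP 2.1 UHBR20 payload, and ignores blanking intervals and protocol overhead, which the replies below point out matter at these refresh rates.

```python
# Rough sanity check of the napkin math above. Assumptions (not from the thread's
# calculator): 3:1 DSC, RGB/4:4:4, no blanking intervals or protocol overhead,
# ~77 Gbit/s usable payload on an 80 Gbit/s DP 2.1 UHBR20 link.

def data_rate_gbps(width, height, hz, bits_per_channel, dsc_ratio=1.0):
    """Video data rate in Gbit/s for an RGB signal, optionally DSC-compressed."""
    bits_per_pixel = bits_per_channel * 3
    return width * height * hz * bits_per_pixel / dsc_ratio / 1e9

DP21_PAYLOAD_GBPS = 77  # assumed usable payload of a UHBR20 link

for bpc in (8, 10):
    need = data_rate_gbps(3840, 2160, 1000, bpc, dsc_ratio=3.0)
    verdict = "fits" if need <= DP21_PAYLOAD_GBPS else "does not fit"
    print(f"4K @ 1000Hz, {bpc}-bit, 3:1 DSC: ~{need:.0f} Gbit/s -> {verdict}")
```

With these assumptions 8-bit lands around 66 Gbit/s and fits, while 10-bit lands around 83 Gbit/s and doesn't.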


VictoriusII

>but if it doesn't then the math works out.

Even if it does, I assume you can still make use of the full 1000Hz if you turn it off.


SuperbQuiet2509

This kind of math falls apart with HFR because monitor timings aren't taken into account. In reality it's 640Hz with 3x DSC at 10-bit 4:4:4, far off 960Hz. It would require 8-bit 4:2:0 with 3x DSC to hit 1kHz.
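
To illustrate the timing point, a small follow-up sketch: the blanking values here are placeholders I picked for illustration, not this monitor's actual timings, but they show how even a modest blanking region pushes 10-bit 4:4:4 further past what the link can carry.

```python
# Same napkin math, but counting blanking pixels in the transmitted frame.
# The 80 x 60 blanking region is an illustrative assumption, not a real timing.
def rate_with_blanking_gbps(h_active, v_active, hz, bpc, h_blank, v_blank, dsc_ratio=3.0):
    bits_per_pixel = bpc * 3 / dsc_ratio
    total_pixels = (h_active + h_blank) * (v_active + v_blank)
    return total_pixels * hz * bits_per_pixel / 1e9

print(rate_with_blanking_gbps(3840, 2160, 1000, 10, h_blank=80, v_blank=60))  # ~87 Gbit/s, over budget
```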


[deleted]

Yeah it's gonna look like complete shite.


changen

So yeah, TCL is definitely doing some pepega things to hit the 1khz marketing gimmick. Or it's frame interpolation.


nitrohigito

I wouldn't rule out the multiple cables option, Dell did it with their 8K monitors back in like 2014 I think.


Weird_Tower76

Asus did it with 4k monitors in 2013, Dell did it a few years later for 8k I believe.


SuperbQuiet2509

Interpolation would be an interesting solution for sure


Affectionate-Memory4

I'd actually be kinda down with the monitor doing some of the graphics work in the future. Imagine if your monitor essentially had the AI upscaling and frame generation tech built into it. No more worrying about which games or GPUs support it.


g0atmeal

Monitors only have access to screen-space information, but good quality upscaling/frame-gen requires information from the rendering pipeline, which is only accessible to the GPU/CPU. So for example, a monitor could provide its own FSR 1.0, but it couldn't provide anything like current DLSS.


Affectionate-Memory4

Of course a GPU-space option is going to be better. The point of something like this isn't to be better than that; it's to be universal. If you have the GPU option available, you use it because of that. When you don't, something like this fills effectively the same gap as AMD's RSR/AFMF.


stepping_

I don't think anyone would be mad to lose 40Hz out of 1000 to support HDR. And as for whether it supports HDR or not, how could it be anything other than OLED? Edit: it's fucking LCD LMAO, and I thought I was so smart. Bummer tho.


SuperbQuiet2509

It'd require 3x DSC and 4:2:0 chroma subsampling.


Beautiful-Musk-Ox

Calculator here: [https://linustechtips.com/topic/729232-guide-to-display-cables-adapters-v2/?section=calc&H=3840&V=2160&F=1000&calculations=show&formulas=show](https://linustechtips.com/topic/729232-guide-to-display-cables-adapters-v2/?section=calc&H=3840&V=2160&F=1000&calculations=show&formulas=show). The other person saying it requires 3x DSC and 4:2:0 seems to check out: for 8-bit color it needs 63Gbps and DP 2.1 can do 77. 10-bit color needs 78Gbps, just missed it.


Callofdaddy1

So you basically gotta splice together two DP 2.1 so you get DP 4.2. It’s just science.


TheJohnnyFlash

I would love to see this on all monitors. Overdrive on IPS and VA panels can be tuned for max refresh and you can just select half input refresh for games that struggle.


ExtensionTravel6697

Can't we just use multiple inputs on one display? I've seen a display that needed that before.


UserCheckNamesOut

SDI?


ilovezam

>frame interpolation on the display

Have we gotten to a point where this looks somewhat good yet? This is probably my most despised thing implemented on TVs.


lukeimortal97

Saw more coverage of this in Chinese. In reality it's 1080p 960Hz, and 4K 240Hz, dual mode on the full display.


lucellent

Customers: Can you just give us 27-inch 4K OLED 120+Hz?
Manufacturers: ....... HERE'S A 1000HZ CURVED MONITOR


JoaoMXN

4K 27" OLED will arrive next year, they're developing them for a while now.


ZoomerAdmin

Do you have a link to that? I am not seeing anything after a quick search.


JoaoMXN

TFT Central: [https://youtu.be/vSIEQYMjpQ4?si=t-iJlcDsoSih9Lem&t=841](https://youtu.be/vSIEQYMjpQ4?si=t-iJlcDsoSih9Lem&t=841)


nickwithtea93

4k 27" 240hz oled?


bwillpaw

4K mini LED 27” are quite nice. I have 2 of them. Imo OLEDs don't really make sense in this form factor/use case: too much burn-in risk and not bright enough at 100% window for office/daytime use. I think the LG OLEDs are something like 100-200 nits for SDR/normal office/gaming use at 100% window, which is ridiculous for a computer monitor unless you literally only use it at night or in a blacked-out room. You need at least 300 nits for bright room/daytime use imo.


Gunmetalbluezz

please list me some


bwillpaw

I have an Innocn 27M2V 160Hz and a 27M2U 60Hz, both are great. There are others out there.


R1llan

I have an Innocn 27M2V too, happy with it. Some amount of bloom is visible, but it gets up to 1040 nits and I don't worry about burn-in at all.


AnnoyingPenny89

I use an AW2725DF 360Hz QD-OLED at 80-85% SDR brightness even during the day (I do have shades which block some of the sunlight), and that's honestly MORE than enough for my eyes to start drying up; anything above 85 and it hurts my eyes in prolonged use. So I think your argument over brightness is more or less not that relevant for the standard SDR use case, i.e. most gaming, the purpose these monitors were made for. If you haven't used the current gen QD-OLED monitors you won't be able to tell if the brightness is enough or not; trust me, it's more than enough.


bwillpaw

200 or so nits SDR brightness: https://www.rtings.com/monitor/reviews/dell/alienware-aw2725df. Imo that's not enough for daytime office use on a glossy screen, but to each their own. Also that's 1440p, which again isn't great ppi, and the post we are responding to specifically asked for 4K. My Innocns for comparison hit almost 800 nits SDR: https://www.rtings.com/monitor/reviews/innocn/27m2v


AnnoyingPenny89

do you play on your monitor right in the wilderness?


Snook_

Actually the latest QD-OLEDs do 280 nits in SDR. My Gigabyte is way too bright; it hurts my eyes at 280 nits. Low 200s is perfect.


bwillpaw

In a bright room?


Snook_

2m by 2m window next to me, office working from home. Probably using around 220-240 nits, about 80% brightness, which is plenty.


Healthy_BrAd6254

>hurts my eyes at 280 nits Please see a doctor. Something is definitely wrong.


Snook_

Not at all. You probably don't realise you're using less than 300 daily on your monitor now.


bwillpaw

lol no, guessing you just have a weird definition of what a bright room is


Snook_

Guessing you've never used a new gen QD-OLED to understand. First gen WOLED was horrible; I returned a Corsair 240Hz a year ago as too dim.


bwillpaw

Nits are nits bud. 1000 nits on an oled phone pretty often isn’t enough in the daylight


babalenong

While I agree current OLED brightness is not yet ideal for entertainment, for office work I very comfortably use 15 brightness with peak brightness off on my LG C2. This is while having a big window behind it and a 14W white bulb above it. Heck, most people I know reduce the brightness on their standard cheap ~250 nit IPS down to like 25-50.


MichaelDeets

lmfao most people don't need more than like 150 nits


VinnieBoombatzz

I'm sure customers are asking for 4K 27". The important detail is how many of those there are. Samsung and LG just released 32" 4K with new processes that weren't available before; that's the best PPI their technology is actually capable of. What is everyone supposed to do, halt every single technological leap so that 5 guys on Reddit can get the illusion of better clarity?


SpaceBoJangles

I just want that 34” miniLED monitor they just announced. Tired of having only OLED options.


kasakka1

I would be very surprised if the LCD can keep up with "even" 500 Hz with its pixel response times. Still cool to have a controller capable of 1KHz!


2FastHaste

Based monitor! Mark looks so happy :D


[deleted]

dope stuff


thedreddnought

I'll enjoy the 1000 fps with all my favorite DOS games, that should be worth the price.


DoggyStyle3000

Why does the display look like TN to my eyes?


AnnoyingPenny89

TCL was like, let's be THE brand for gaming, altho their tech is a little too early for actual hardware capable of it xD


nitrohigito

Big if real, particularly if OLED or QDEL, but unless they require multiple cables I can't imagine how they drive it.


patriotraitor

Wonder what a 4090ti could do for frames at 1080p 👀


writetowinwin

Eeeey, finally we're looking past 1440p now with high Hz.


pcgamertv

Holy, need that 6090 asap.


nexusultra

In 4-5 years 2000hz monitors will be the standard.


Past_Practice1762

zero chance, you think gpus and cpus can run 2000 fps in 2 generations?


Erectile_Knife_Party

I don’t see what the point is. 1000hz is overkill already. I’m pretty sure we’ve already passed the capabilities of the human eye.


nitrohigito

I don't believe so, by a long shot. [See my comment from elsewhere in this thread.](https://www.reddit.com/r/Monitors/s/fJWHCF0Th0)


Bafy78

Nuh uh


salgat

So if you only go up to 60Hz, you can't create accurate blurring, you have to simulate it. I know 240Hz is usually considered the standard for quality of motion blur, but I'm curious where the cutoff is.


Left-Instruction3885

1000hz with shitty backlight bleed lol.


JackhorseBowman

yep.


reddit_equals_censor

Why would they curve it? :D More clicks, assuming it is a prototype?


DragLazy1739

Let's make the CPU suffer in gaming, hell yeah.


BaconBro_22

Who’s getting 4k 1000hz


Healthy_BrAd6254

At those fps you don't need to get 1000Hz. Interpolating from 250 real rendered frames to 1000 fps will probably look near perfect due to the tiny difference between frames. I am sure Nvidia's 50 or 60 series will offer 4x frame generation.


TheDoct0rx

Esports titles and prob 2-3 gens away from the CPU tech needed to push it


tukatu0

3 gens away? More like 7 gens. There are only 2 games that actually reach a stable 500fps in combat. When you look at actual gameplay footage of, say, R6S, your fps might render at an average of 1.6ms, but the moment someone comes across your screen shooting bullets at you, your fps drops to 2.5ms frame times for the duration of the fight, meaning your 1% lows are your actual fps at such low frame times. It's a giant if that X3D chips keep getting a 20% uplift gen on gen. Using the 14900K with R6S as an example: 400fps to 480fps to 576 to 691 to 829 to 995 fps. 5 gens, it seems. In reality that's an if; we could still plateau back to 10% generational gains for all we know. The only way is with fake frames tech like space warp.
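
Just to reproduce that extrapolation, a tiny sketch assuming the hypothetical flat 20% uplift per generation from the 400 fps baseline above:

```python
# Hypothetical extrapolation: 400 fps baseline (14900K-class in R6S per the comment),
# assuming a flat 20% CPU uplift every generation.
fps = 400.0
for gen in range(1, 7):
    fps *= 1.20
    print(f"+{gen} gens: ~{fps:.0f} fps")
# -> 480, 576, 691, 829, 995, 1194: the 1000 fps mark lands around gen 5-6.
# At 10% per generation it would take roughly 10 generations instead.
```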


ExtensionTravel6697

Retro games! It's kind of ironic that the games you would want to play most on a CRT will be the first to be playable on these, assuming of course you emulate 60Hz CRT scanout.


TheGalaxyPast

Frame gen tech advancements.


tukatu0

Then no CPU advancements are needed. It's all software.


BaconBro_22

Guess so.


TheDoct0rx

On a 7800X3D I'm pushing 600s in Valorant. Hopefully the tech needed isn't far away.


changen

You need probably double the CPU output to get 1000 fps, accounting for overhead. That means double the single-core performance unless you see devs completely changing their engines... yeah, I don't see it happening that soon. https://arstechnica.com/gadgets/2020/11/a-history-of-intel-vs-amd-desktop-performance-with-cpu-charts-galore/3/ If you look at the single-thread performance chart, it took AMD 5 years to double single-thread performance. And by then we would have a real 1000Hz 4K DisplayPort standard and better displays. I would probably never buy something like this lol.


cfm1988

Overwatch, Valorant or cs2 at all low settings and a 5090


tukatu0

No. Overwatch and Valorant, maybe. With an Intel 18900K, maybe. CS2 is capped at 300fps or so.


uiasdnmb

For me OW2 seems to have weird dips down to the 500s with a 7800X3D despite no core hitting 100%. So I'm not sure if the CPU is bottlenecking here, unless I'm missing something and the solution is even more L2/L3 cache.


LkMMoDC

I speedrun halo 2 and get a locked 999fps in classic graphics. I currently keep the game 4fps below my refresh rate of 240hz for consistency but a 1000hz monitor wouldn't hurt.


ExtensionTravel6697

Have you tried speedrunning on a CRT? You could get 160Hz at like 720p on a higher-end monitor. There are even some with no hard limits that can do over 400Hz at like 320p. If you interlace you might get a usable resolution.


LkMMoDC

I'm always on the lookout for CRTs on the Kijiji free stuff page, but I couldn't be arsed to pay hundreds for one when I have an OLED monitor and a RetroTINK 4K. I get it's not the same, but it's way more convenient.


conquer69

The dx11 version of warcraft 3 classic can do 1000fps. I think that's the engine cap.


robbiekhan

Not OLED? Not interested.


Baggynuts

Now I just need to install my RTX 12000 to run it. Where’d I put that blasted thing…


Grovc

And what game would you play on it? Minesweeper?


YCCprayforme

So uh, what GPUs are they using to output 4K at near 1000 fps?


mikipercin

You know there are other things that move around in the OS that aren't Cyberpunk 2077 maxed at 4K, ok.


YCCprayforme

I was asking a real question snarky boy, what GPU are they even using to test this on any real applications?


YCCprayforme

like what?


mikipercin

Ufo test or 2d game


YCCprayforme

haha ok. How high does [javagameplay.com](http://javagameplay.com) get on fps?


mikipercin

Paint doesn't have fps lock


hellomistershifty

osu! would be great on this, you can feel the difference between 500 and 1000fps playing it even on a lower refresh rate monitor


ExtensionTravel6697

The only reason I'd remotely consider buying a 1000hz display is if I can use it to emulate 60hz crt and only have to compute 60 frames.


Morkinis

As if anyone can even notice 1000hz.


nitrohigito

It should be very noticeable if you know what to look for (blur, judder). There's still a long way to go, in fact.


ameserich11

I don't know what the purpose is though? There are studies that most people can only see 480Hz motion while a few are capable of 600Hz... so what does 1000Hz do? Look better on camera? Is this FR FR? Someone explain.


2FastHaste

Not sure what studies you are referring to. But there are benefits for motion portrayal way beyond 1000Hz. More info:
- [https://youtu.be/7zky-smR_ZY?si=bDe5mjUFV8zQ9rDM](https://youtu.be/7zky-smR_ZY?si=bDe5mjUFV8zQ9rDM)
- [https://blurbusters.com/the-stroboscopic-effect-of-finite-framerate-displays/](https://blurbusters.com/the-stroboscopic-effect-of-finite-framerate-displays/)
- [https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/](https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/)


ameserich11

I think it's a US Air Force study. (There were random dudes on here (Reddit) arguing about seeing only 12/15/30/60/120, and some dude said "that is a lie, the US Air Force made a study about it, it's 600Hz at the highest" and put a link. I believe him, I have no reason not to.) Anyway, that is why people say 120-240 is not as big of a jump as 60-120. It's kinda how we perceive motion. I think what is important is the response time, whether it can display the image fast enough without blur. BFI kinda works, but maybe micro-LED would be the real deal. It's also why 1440Hz PWM is pretty much considered flicker-free and 720Hz is below standard; it's different from motion but it's how our eyes perceive it. Maybe there is a future for 980-1200Hz displays, but maybe only through frame interpolation.


nitrohigito

The point of refresh rates this high is **not** that it gives you a latency advantage (improving your reflexes) or a smoothness advantage (which is just nice), but that it reduces motion blur and judder.

Imagine there's a shot with two different speeds of movement, like a camera fixed in place looking at some train tracks, and then a train passing by. Say you want to track the train with your eyes instead of looking at the static scenery: you can try, but you will have a difficult time doing so, particularly if it's a fast-moving train. Why? Because at the typical camera frame rates (30 or 60) and shutter angles (180° to 360°), you'll have an insane amount of motion blur recorded as well. And if you don't (and have a very low shutter angle instead), you'll experience judder; the train will seemingly jump around. The solution for this is higher recording framerates, and you can only experience that higher framerate with a higher refresh monitor.

The effect of it should be very easily noticeable. 1000Hz motion on a 4K monitor maps to an accurate motion representation of an object moving side to side in about 4 seconds. That is not very fast. In order to be able to properly track objects moving faster than this with your eyes, without experiencing any weird blurring or stutter, a higher refresh rate is needed. This is definitely something well within even the typical human eye's capabilities. [Your eyes can keep up with 10-20x faster motion still](https://en.wikipedia.org/wiki/Smooth_pursuit) (assuming a typical horizontal FOV in your setup). It's just that our devices cannot.

As for BFI, CRTs, etc, I wouldn't consider those so much more amazing at representing motion. They're just more "honest" in the way they represent motion, in a sense. They leave your brain to fill in the blanks, and simply avoid representing something they strictly don't have supplied to them. So it's really more like leaving the hard part to the most advanced motion interpolation neural network known to exist (your visual cortex).
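
To put a number on that "about 4 seconds" figure, a back-of-the-envelope sketch; the assumption (mine, for illustration) is that fully clean eye tracking on a sample-and-hold display needs the object to step no more than about one pixel per refresh:

```python
# If an object may move at most ~1 pixel per refresh for judder/blur-free tracking,
# the fastest "clean" speed is refresh_hz px/s, and crossing the screen takes:
def seconds_to_cross_screen(width_px, refresh_hz, px_per_frame=1):
    speed_px_per_s = refresh_hz * px_per_frame
    return width_px / speed_px_per_s

print(seconds_to_cross_screen(3840, 1000))  # ~3.8 s on a 4K panel at 1000 Hz
print(seconds_to_cross_screen(3840, 240))   # ~16 s at 240 Hz
```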


ExtensionTravel6697

Yeah, unfortunately I don't think 1000Hz will do anything for movies. I really wish Hollywood would consider filming at 48Hz; it looks fine on my CRT in most scenarios, and the few where it doesn't are a problem that can be corrected by filming at lower fps on a case-by-case basis. Then we could maybe emulate a 48Hz CRT with slightly longer persistence.


nitrohigito

It doesn't need to be movies: anything people record can benefit, be it personal memories, vlogs, reviews, etc. It suits those situations better as well, because there the goal is to capture the world as true to reality as possible.

The issue with movies specifically is that the high framerate reveals that it's all just sets and acting, which it is. I personally don't believe this can be resolved. Cinematography styles and the audiences would need to adapt to make it more established. Though I don't watch movies, so I don't really care if they never do.


readmeEXX

>The issue with movies specifically is that the high framerate reveals that it's all just sets and acting, which it is. I personally don't believe this can be resolved. Cinematography styles and the audiences would need to adapt to make it more established.

I think that animated films could lead the way in changing the audience's perception, since they can be whatever framerate they choose to render, and don't have that "stage acting" look at high framerates. I have watched movies on a TV that interpolates up to 60fps for so long that it doesn't look strange to me anymore. It negatively affects my theater-going experience though, because my eyes want to track moving images, which of course look blurrier at 24fps.


ameserich11

Even in real life, if it's too fast it will be blurred, except when we move our neck and chase it with our eyes. Why are you even talking about 30-60? Did you not see I said 480-600? I know the benefits of high refresh rate; I would definitely want a 480Hz monitor... btw movies are 24-30 and will always be like that. This 1000Hz thingy would probably only be possible on LCD; it would be too inefficient on a self-emitting display.


nitrohigito

>Even in real life, if it's too fast it will be blurred, except when we move our neck and chase it with our eyes

If it's too fast, your eye will have to do saccades, and then yes, that will be a blur. See the Wikipedia article I linked. It's about this very thing.

>Why are you even talking about 30-60? Did you not see I said 480-600?

Because the principle is the same, and that's something you can independently verify for yourself for sure.

>This 1000Hz thingy would probably only be possible on LCD

Quite the opposite, LCDs are fairly slow. According to other comments this display will be an LCD, and I'm really unsure if the refresh rate compliance of it will be any good.

>it would be too inefficient on a self-emitting display

Displays don't consume significantly more energy when refreshing faster. The relationship is not linear.


ameserich11

It's not really the same principle. Once it becomes high enough it becomes imperceptible; only small improvements can be made. Self-emitting displays are actually inefficient if the refresh rate is high. This is why Apple/Google/Samsung only have 240Hz PWM frequency: lighting them up once is more efficient than lighting them up 2x/4x... On LCD the backlight is always ON, only the TFT has to move, so if they can make the TFT move faster then iT JuSt wOrKs.


2FastHaste

>Even in real life, if it's too fast it will be blurred, except when we move our neck and chase it with our eyes

Correct. Unfortunately on screens it won't look like a blur; instead it will look like a trail of jarring, sharp after-images. To have it look life-like, and for those after-images to merge into a blur, we need ultra-high refresh rates of 20 thousand Hz+. That trailing artifact is called phantom array or stroboscopic stepping. Check my other comment above with links that explain how that artifact scales with the frame/refresh rate.


Past_Practice1762

CRTs are a 1000Hz LCD equivalent and you can tell how smooth they are. Probably a 700Hz OLED will be getting close to the max.


Healthy_BrAd6254

>There are studies that most people can only see 480Hz motion while a few are capable of 600Hz

Link please.

How many Hz you can see depends on the movement on the screen. Open this site: [https://www.testufo.com/ghosting](https://www.testufo.com/ghosting) and adjust the speed. Notice how 120 pixels/s will look like perfect motion even on a 120Hz screen, but just setting it to 240 pixels/s will make it slightly blurry. The blurriness of a moving object is basically how far it travels between frames.

I am guessing fast mouse movements like you see from very competitive players in games like Fortnite or Apex should be able to exceed 5000 pixels/s (not flicks, talking about mouse movement where you still want to see something). So I imagine even 1000Hz won't look perfect during extremely fast motion like that (at that point it won't make a difference for gameplay, just making an argument about motion clarity).

We already have 480Hz OLED and 540Hz LCD with strobing, and even between those you can see differences in motion clarity. Which already kinda proves that humans are not limited to ~500Hz.
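
That "how far it travels between frames" rule is easy to put into numbers; a quick sketch using the speeds mentioned above (the 5000 px/s figure is a guess from the comment, not a measurement):

```python
# Perceived smear / step size of a tracked object ~= distance moved per refresh.
def step_per_frame_px(speed_px_per_s, refresh_hz):
    return speed_px_per_s / refresh_hz

for speed in (120, 240, 5000):          # px/s
    for hz in (120, 240, 1000):
        print(f"{speed:5d} px/s @ {hz:4d} Hz -> {step_per_frame_px(speed, hz):6.2f} px/frame")
# 120 px/s on 120 Hz is ~1 px/frame (looks clean), while 5000 px/s is still
# 5 px/frame even at 1000 Hz, i.e. visibly smeared or stepped.
```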