piker 4 hours ago

The problem with using this kind of monitor for any work that others will view on their own monitors is that your perception of what looks good will be way off. For example, it's clear that a lot of the Rust UI framework developers have been working on Macs for the last few years. The font rendering in many of those frameworks looks bad once you plug them into a more normal-DPI monitor. If they hadn't been using Macs with Retina displays, they would have noticed.

  • guhcampos an hour ago

    This is more widespread than we like to admit.

    Developers writing software on 64GB M4 Macs often don't realize the performance bottlenecks of the software they write.

    Developers working over 1gbps Internet connections often don't realize the data gluttony of the software they write.

    Developers writing services on unlimited cloud budgets often don't realize the resource waste their software incurs.

    And to extend this to society in general.

    Rich people with nice things often alienate themselves from the reality of the majority of people in the World.

  • SeasonalEnnui 2 hours ago

    Yes! I’m glad to see this pointed out - when working on UIs, I regularly move them between 3 monitors with varying resolution & DPI. 4k @ 200%, 2K at 125%, and 2K at 100%. This reveals not only design issues but application stack issues with DPI support.

  • nine_k 4 hours ago

    As a designer, one should keep a couple of cheap, low-res monitors reset to the factory defaults for proofing what many users are going to see.

    • eb0la 3 hours ago

      I must confess I felt a lot of lust looking at the self color calibration feature.

      It is extremely useful if your work ends up on paper. For photography (edit: film and broadcast, too) it would be great.

      My use case is comics and illustration, so a self-color-correcting Cintiq or tablet would be great for me.

      • BolexNOLA 37 minutes ago

        I like having a color calibrated monitor, but at the end of the day it's about trusting my scopes too. Audio unfortunately has a perception element that, for some reason, doesn't seem to be as big of an issue with video. We have dB/loudness standards for a reason, but different stuff just sounds louder or softer no matter what.

        If it looks good on a mac laptop screen/imac and the scopes look right, it’s good for 99%+ of viewers. You can basically just edit visually off any Mac laptop from the last 10 years and you’ll probably be happy tbh.

    • sim7c00 3 hours ago

      This exactly. Same as people do for sound: listen in the car, over shitty headphones, etc. That's just quality control, not the fault of any piece of equipment.

      • mikepurvis 2 hours ago

        Yes this is universal in pro mixing setups, having filters or even actual physical hardware to provide the sound of stock earbuds, a crappy Bluetooth speaker, sound system in a minivan, etc.

  • sz4kerto an hour ago

    This is exactly how sound studios do mixing. They don't just use top-end monitors -- they generally also listen on low-end speakers that color sound in a way that's representative of what people have at home (hello, Yamaha NS-10).

    • Intermernet 40 minutes ago

      People used to buy NS-10s because they knew professional studios used them. They were then underwhelmed when they sounded worse than the hifi speakers they had at home.

      Many audio engineers live by the mantra "if it sounds good on NS-10s, it'll sound good on anything".

      We need such a touchstone for software engineers.

  • ponector an hour ago

    Also, it's not only about the screen resolution. Developers use powerful Macs and users have old Windows machines - the usability is different, but devs usually don't care. Works fine on my machine!

    I've reported many issues where, to reproduce them, the devs needed to enable 10x throttling in the browser. Or use a Windows machine.

    • throw0101d an hour ago

      > Developers use powerful Macs and users have old Windows machines - the usability is different, but devs usually don't care. Works fine on my machine!

      Part of what QA testing should be about: performance regressions.

  • mschuster91 an hour ago

    This is just as valid for mobile app and website development.

    When all you use for testing is Browserstack, local emulators and whatnot, and only the latest iPhone and Samsung S-series flagship, your Thing will be unusable for large parts of the population.

    Always, always use at the very least the oldest iPhone Apple still supports, the cheapest and oldest (!) Samsung A-series models still being sold in retail stores as "new", and at least one Huawei and Xiaomi device. And then, don't test your Thing only on wifi backed by your Gbit Wifi 7 router and uplink. Disable wifi and limit mobile data to 2G or whatever is the lowest your phone provider supports.

    And then, have someone from QA visit the countryside with long stretches of no service at all or serious degradation (think packet loss rates of 60% or more, latencies of 2 seconds+). If your app survives this with minimal loss of functionality, you did good.
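
    Before that field trip, you can already approximate those conditions at your desk. A rough sketch with Linux tc/netem (assuming a Linux box that routes your test device's traffic; the interface name eth0 is a placeholder for your own setup):

        import subprocess

        IFACE = "eth0"  # placeholder: the interface the test device's traffic goes through

        def tc(*args):
            subprocess.run(["tc", *args], check=True)

        # ~2 s latency with jitter plus 60% packet loss, like a bad rural connection
        tc("qdisc", "replace", "dev", IFACE, "root",
           "netem", "delay", "2000ms", "500ms", "loss", "60%")

        # ... exercise the app against this impaired link ...

        # remove the impairment again
        tc("qdisc", "del", "dev", IFACE, "root")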

    A bunch of issues will only crop up in real-world testing. The main bummer is stuff like opening a fresh SSL connection for each interaction instead of keeping a single socket to the mothership open... latency really, really eats such bottlenecks alive. Or forgotten async handling leading to a non-responsive main application. You won't catch that, not even with Chrome's network inspector - you won't feel the sheer rage of the end user who has a pressing need and is let down by your Thing. Even if you're not responsible for their shitty phone service, they will associate the bad service with your app.

    Oh, and also test out getting interrupted while using your Thing on the cheap-ass phones. WhatsApp and FB Messenger calls, for example - these gobble so much RAM that your app or browser will get killed by OOM or battery saver, and when the user's interruption is over, if you didn't do it right, your Thing's local state will have been corrupted or removed, leaving the user to start from scratch!

  • stephenr 4 hours ago

    Conversely if you only use a ~110 DPI display you won't know how bad it looks on a ~220 DPI display.

    The solution here is wide device testing, not artificially limiting individual developers to the lowest common denominator of shitty displays.

    • mrbungie 4 hours ago

      Yeah sure, as long as you have a lot of resources for testing widely.

      Still, if you were to make an analogy, you should target a few devices that represent the "average", just as it's done for (most) pop music production.

      • stephenr 4 hours ago

        > if you were to make an analogy you should target a few devices that represent the "average"

        For Macs, 220DPI absolutely is the average.

        • swiftcoder 2 hours ago

          I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from - my former employers equipped all the software engineers with dual-4K displays nearly a decade ago.

          One is hard-put to buy a developer-power laptop with a sub-2K display these days, even in the Windows world, and >2K displays have been cheap on desktop for a really long time.

          • gr4vityWall an hour ago

            I believe there are a lot of people using 1080p monitors because they bought them a while ago and they're still working fine. There are also a lot of lower-end 1080p monitors still being sold today.

            > One is hard-put to buy a developer-power laptop with a sub-2K display these days, even in the Windows world

            I personally see a lot of 1080p screens on new gaming laptops too. Lots of people get those for work from what I see with my peers. When I sold my RTX 3060 laptop with a 1080p screen, most buyers wanted it for professional work, according to them.

            > I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from

            If anything, this is exactly the place where I'd expect a bunch of people to be rocking an older Thinkpad. :)

e40 8 minutes ago

I’ve had a few ProArt monitors and they aren’t very high quality, IME. I had high-pitched whine and blinking off/on issues, on several Mac models, from iMac to Air to Studio. Yes, I tried a variety of cables. The Apple Studio monitor, while insanely priced, has been flawless for me, sitting next to a ProArt.

  • JKCalhoun 5 minutes ago

    I've often gone into an expensive display purchase with hesitation but then never regretted it: years later, when machines have moved in and out of my workspace, the display is still there.

martinald 3 hours ago

A friend and I were just chatting about how annoying it is that monitors stalled out at 4K. I think I got my first set of 4K monitors ~15 years ago (!) and there have been no improvements since then, resolution-wise, apart from high-end pro monitors.

Why is this? 5k/6k at 27" would be the sweet spot for me, and potentially 8k at 32". However, I'm not willing to drop $2k per monitor to go from a very nice 27" 4k to 27" 5k.

You can get 8K TVs for <$1000 now. And a Quest 3 headset has 2 displays at far higher PPI for $600.

  • rickdeckard 2 hours ago

    Because the vast majority of monitor sales volume is (public) tenders from companies buying in huge volumes, and those companies still mostly look for monitors below 4K (without fancy specs and without e.g. USB-C).

    If 4K reaches the mass market for those buyers, the specs will shift down and there will be room in the (much smaller) premium-tier monitor segment.

    Heck, even if you just want USB-C and an integrated webcam on an average display, the price hike compared to one without them is crazy, because everything except those basic office monitors is still niche production...

  • throw0101d an hour ago

    > A friend and I were just chatting about how annoying it is that monitors stalled out at 4K.

    There's been a bit of a 'renaissance' of 5K@27" in the last ~year:

    > In just the past few months, we've taken a look at the ASUS ProArt Display 5K, the BenQ PD2730S, and the Alogic Clarity 5K Touch with its unique touchscreen capabilities, and most recently I've been testing out another new option, the $950 ViewSonic VP2788-5K, to see how it stacks up.

    * https://www.macrumors.com/review/viewsonic-vp2788-5k-display...

    There are 15 monitors discussed in this video:

    * https://www.youtube.com/watch?v=EINM4EysdbI

    The ASUS ProArt PA27JCV is USD 800 (a lot less than $2k):

    * https://www.youtube.com/watch?v=ojwowaY3Ccw

  • 4ggr0 3 hours ago

    as a gamer 8k makes me sweat because i can't imagine what kind of hardware you'd need to run a game :O probably great for text-based work, though!

    • pornel 2 hours ago

      You don't really need 8K for gaming, but upscaling and frame generation have made game rendering resolution and display resolution almost independent.

      • jsheard 2 hours ago

        And if all else fails, 8K means you can fall back to 4K, 1440p or 1080p with perfect integer scaling.

        • layer8 2 hours ago

          Except that the hardware doesn’t necessarily offer perfect integer scaling. Oftentimes, it only provides blurry interpolation that looks less sharp than a corresponding native-resolution display.

          • jsheard an hour ago

            The monitor may or may not offer perfect scaling, but at least on Windows the GPU drivers can do it on their side so the monitor receives a native resolution signal that's already pixel doubled correctly.

  • layer8 2 hours ago

    The likelihood of dead pixels increases quadratically with resolution, hence panel yield drops correspondingly. In addition, the target audience who has hardware (GPUs) that can drive those resolutions is smaller.

  • swiftcoder 2 hours ago

    > and potentially 8k at 32"

    What's your actual use-case for this? I run a 32" 4K, and I have to stick my nose within a foot (~30cm) of the display to actually spot individual pixels. Maybe my eyesight isn't what it used to be

    I'd kill for a 40" 5k or 6k to be available - that's significantly more usable desktop real estate, and I still wouldn't be able to see the pixels.

  • mschuster91 25 minutes ago

    > A friend and I were just chatting about how annoying it is that monitors stalled out at 4K. I think I got my first set of 4K monitors ~15 years ago (!) and there have been no improvements since then, resolution-wise, apart from high-end pro monitors.

    Multiple reasons.

    The first one being yield - yes you can get 8K screens, but the larger they get, the more difficult it is to cut a panel with an acceptably low rate of dead/stuck pixels out of a giant piece of glass. Dead pixels are one thing and bad enough, but stuck-bright pixels ruin the entire panel because they will be noticeable in any dark-ish movie or game scene. That makes them really darn expensive.

    The second reason is the processing power required to render the video signal to the screen, aka display controllers. Even if you "just" take regular 8-bit RGB - each frame takes up 33 million pixels, so 796,262,400 bits. Per frame. Even at just 30 FPS, you're talking about 23,887,872,000 bits per second - nearly 24 gigabits/s. It takes an awful, awful lot of processing power just to shuffle that data from the link SerDes around to all the control lines and to make sure they all switch their individual pixels at the very same time.

    The third is transferring all the data. Even if you use compression and sub-sampling, you still need to compress and sub-sample the framebuffer on the GPU side, transfer up to 48 GBit/s (HDMI 2.1) or 77 GBit/s (DP 2.1) of data, and then decompress it on the display side. If it's HDCP-encrypted, you need to account for that as well - encrypting and decrypting at such line speeds used to be unthinkable even two decades ago. The fact that the physical transfer layer is capable of delivering such data rates over many meters of copper cable of varying quality is nothing short of amazing anyway.
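
    (For the curious, the same arithmetic as a tiny script, ignoring blanking intervals and link-encoding overhead:)

        WIDTH, HEIGHT = 7680, 4320
        BITS_PER_PIXEL = 24  # plain 8-bit RGB

        bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL
        print(f"{bits_per_frame:,} bits per frame")   # 796,262,400

        for fps in (30, 60, 120):
            print(f"{fps:>3} fps: {bits_per_frame * fps / 1e9:.1f} Gbit/s")
        # 30 fps: 23.9, 60 fps: 47.8, 120 fps: 95.6 Gbit/s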

    And the fourth is generating all the data. You need absurdly high definition textures, which requires lots of VRAM, lots of regular RAM, lots of disk I/O, lots of disk storage (your average AAA game is well beyond 100GB of data at-rest for a reason!), and then render power to actually render the scene. 8K has 16x (!) the pixels of regular FullHD (1080p).

    What's stopping further progress? Other than yield and simple physics (similar to microchips, the finer the structures get, the more difficult and expensive they are to make), the most pressing issue is human visual acuity - even a human with very good vision can only make useful sense of about 74 of the theoretical 576 megapixels [1]. As we already established, 8K is at 33-ish megapixels, so the next quadratic step up (16K, at roughly 133 megapixels) would already be far too detailed for 99.999% of humans to perceive.

    Yes, you could go for intermediate sizes. 5K, 6K, weird aspect ratios, whatever - but as soon as you go there, you'll run into issues with video content because it can't be up- or downscaled to such intermediates without a perceptible loss in quality and, again, a lot of processing power.

    [1] https://clarkvision.com/articles/eye-resolution.html

  • Paianni 2 hours ago

    The Asus PA27JCV is rather less than $2k...

  • littlestymaar 2 hours ago

    > A friend and I were just chatting about how annoying it is that monitors stalled out at 4K. I think I got my first set of 4K monitors ~15 years ago (!) and there have been no improvements since then, resolution-wise, apart from high-end pro monitors.

    It's mostly because the improvement over 4k is marginal. In fact, even from 1920x1080 it's not so big of a deal, which is why people keep buying such monitors in 2025.

    And the worst part is that the highest-spending consumer segment of PC parts, the gamers, can't really use high-resolution displays at their full potential because it puts such a burden on the GPU (DLSS helps, but the result is an even smaller improvement over 1920x1080 than regular 4K offers).

  • znpy 36 minutes ago

    Ah yes. It’s the same with memory… 8gb/16gb is incredibly common, even though 16gb memory was a thing in like 2008 already. It’s only with high end machines that you get 64/128gb memory, which should be much more common in my opinion.

ec109685 7 hours ago

I don’t get marketing people. The only link in the press release is to Adobe's Creative Cloud. Why aren't there two taps to buy the monitor with Apple Pay and have it shipped when it’s available?

> The redemption period ends August 31, 2026. For full details, visit https://www.asus.com/content/asus-offers-adobe-creative-clou....

Well, the monitor is €8,999, so maybe it’d be more than two taps for me:

> The monitor is scheduled to be available by October 2025 and will cost €8,999 in Europe (including VAT)

  • pjerem 4 hours ago

    Buy a €9k monitor and get 3 months of a cloud subscription for free. What a deal!

    • ryanjshaw 2 hours ago

      If you’re not careful, that adobe creative cloud sub will cost you more than the monitor when you try to cancel

  • gigatexal 7 hours ago

    Too rich for me. Also I don’t need a creative cloud sub. But I’m the wrong customer for such a monitor.

    I’ll wait till 8k becomes more of the norm for say 1-1.5k

    • nine_k 4 hours ago

      Human eye resolution is about 1 arcminute. The comfortable field of view is about 60°, or 3600 arcminutes. A 4K display should mostly suffice %)

      • gigatexal 3 hours ago

        But I run everything at 2x scaling on my Mac, so an 8K is effectively 4K of workspace.

qaq 7 hours ago

6K 32" ProArt model PA32QCV might be more practical for YN crowd at 1299 USD VS 8-9K USD PA32KCX will run you

  • tom_alexander 2 hours ago

    I'm not buying a new monitor with a decade-old version of DisplayPort. Non-oled monitors are products that last a long time (at least a decade) so if I bought this monitor, I'd still be using DisplayPort 1.4 from 2016 in 2036. I need UHBR20 on a new monitor so I can rest assured that I will have some lanes available for my other peripherals. I've already lived the hell of needing to dedicate all 4 lanes to DisplayPort, leaving only a single USB2.0 connection remaining for all my other peripherals to share[0][1].

    [0] https://media.startech.com/cms/products/gallery_large/dk30c2...

    [1] https://i.imgur.com/iGs0LbH.jpeg

  • retrac98 5 hours ago

    An aside - this monitor is proving surprisingly difficult to buy in the UK. Everywhere I look it seems to be unavailable or out of stock, and I’ve been checking regularly.

    Relatedly, I also don’t understand why a half-trillion dollar company makes it so hard to give them my money. There’s no option to order ASUS directly on the UK site. I’m forced to check lots of smaller resellers or Amazon.

    • ErneX 39 minutes ago

      Same in Spain, I got tired of looking for it.

    • brzz an hour ago

      Struggling with the exact same issue myself. If you do find a place to buy it, please let me know

    • qaq 5 hours ago

      It was the same in the US till maybe 2-3 weeks ago. Maybe they are slowly rolling it out to various markets.

  • WilcoKruijer 3 hours ago

    I've been enjoying the PA32QCV over the last couple of months. It's definitely not perfect, but the 220 PPI at 32 inches is just amazing to code on.

  • zokier 4 hours ago

    I'd imagine for most people the HDR perf difference is more noticeable than the resolution. This new monitor can do 1200 nits peak with local dimming, PA32QCV can only do 600 nits peak with no local dimming. Also Dolby Vision.

    • qaq 17 minutes ago

      I'd imagine most people can't spend 9,000 USD on a monitor

ChrisMarshallNY 3 hours ago

Nice monitor, but its target demographic is pretty small, and its price makes Eizo look cheap.

I’ve done a lot of color-calibrated work, and, for the most part, don’t like working in a calibrated system. I prefer good ol’ sRGB.

A calibrated system is a “least common denominator” system, where the least capable element dictates what all the others do. So you could have one of these monitors, but, if your printer has a limited gamut, all your images will look like mud, anyway, and printer technology is still pretty stagnant. There was a big burst of improvement in inkjets, but there hasn’t been much progress in a long time. Same with scanners. I have a 12-year-old HP flatbed that is still quite valid.

A lot of folks get twisted over a 60Hz refresh rate, but that’s not something I worry about. I’m old, and don’t game much. I also watch entertainment on my TV; not my monitor. 60Hz is fine, for me. Lots of room is my priority.

  • Scene_Cast2 an hour ago

    Where do you go for wide gamut prints? How do commercial printers compare to consumer printers in this regard?

    I'm working on a few wide gamut art pieces, and so far the test prints have been less than stellar. Disclaimer - I'm an amateur in this field.

    • ChrisMarshallNY an hour ago

      Inkjets are the best bang for the buck. I had good luck with higher-end Epson printers (with good gloss/matte photo paper). The ink is much better at remaining viable for a long time, and no longer freaks out whenever the relative humidity goes up.

      With inkjets, though, you need to keep using them. Otherwise, the ink clogs.

      Expensive process printers have wide gamuts. Laser printers basically suck. Xerox used to make decent color laser printers, but they had an odd “waxy” ink. Not sure if they still do it.

      I don’t think anyone does dye-sub printers, anymore. They used to be good.

fleventynine 10 hours ago

No mention of 120Hz; I'm waiting for a 6k or higher-density display that can do higher refresh rates.

  • dietr1ch 10 hours ago

    I was going to joke about 8k@120Hz needing like 4 video cables, but it seems we are not too far from it.

    [8k@120Hz Gaming on HDMI 2.1 with compression](https://wccftech.com/8k-120hz-gaming-world-first-powered-by-...)

    > With the HDMI 2.2 spec announced at CES 2025 and its official release scheduled for later this year, 8K displays will likely become more common thanks to the doubled (96 Gbps) bandwidth.

  • ryukoposting 9 hours ago

    I wouldn't hold my breath. Competing models seem to top out around 120 Hz but at lower resolutions. I don't imagine there's a universal push for higher refresh rates in this segment anyway. My calibrated displays run at 60 Hz, and I'm happy with that. Photos don't really move much, y'know.

    • eviks 9 hours ago

      > Photos don't really move much, y'know.

      They do when you move them (scroll)

      • justsomehnguy 8 hours ago

        And?

        Can you provide a ROI point for scrolling photos at 120Hz+ ?

        • eviks 6 hours ago

          Sure, give me your ROI point for an extra pixel and I can fit refresh rate in there.

        • klausa 7 hours ago

          It looks and feels much better to many (but not all) people.

          I don't really know how you expect that to translate into a ROI point.

    • klausa 9 hours ago

      I imagine your mouse still moves plenty though.

tombert 9 hours ago

I swore a blood oath that I would never buy an Asus product ever again, after three terrible laptops from them in a row, but holy hell do I kind of want this monitor.

My main "monitor" right now is an 85" 8K TV, that I absolutely love, but it would be nice to have something smaller for my upstairs desk.

  • mnw21cam 2 hours ago

    I have a fantastic Asus laptop that is 8 years old now and (after an easy battery replacement) easily does everything I want from it and feels nice and solid. I was so impressed that I recommended Asus to someone else, and what they got was pretty awful.

    So basically, YMMV. They make good stuff, and they make awful stuff.

  • 8cvor6j844qw_d6 6 hours ago

    What would you pick for your next laptop if you had to buy one?

    I had an Asus laptop, but the frequent security firmware updates for one of the Dell laptops I had make me think Dell might be a good candidate in terms of keeping up with security updates.

    Not sure about the current latest models from Asus/Dell/HP/etc., but I liked the fact that disassembly manuals are provided for older Dell and HP machines. I can hardly find disassembly manuals for Asus when I have to do maintenance such as swapping out thermal paste/pads and clearing out the heatsink fins.

    • speedgoose 6 hours ago

      I’m only one data point, but I also swear that I would never buy an Asus laptop again. If you are fine with the operating system, a MacBook Pro is the best in my opinion. It’s not even close.

      Otherwise I had okay Dell or Lenovo laptops. Avoid HP, even the high end Zbook ones. A framework might be worth a try if you have a lot of money.

      • sspiff 5 hours ago

        I have used a ZBook G1a for the past few months because it is the only laptop with AMD's Ryzen 395+, and while not thinkpad or XPS/Precision tier, the laptop has been perfectly fine.

        • xarope 4 hours ago

          I've been toying with getting one of these with 128GB of RAM. What's your opinion (especially since you have compared it to thinkpad/xps)?

      • simulator5g 5 hours ago

        You can also run Asahi Linux or Windows for ARM on Macs

        • sspiff 5 hours ago

          I run Asahi Linux as a daily. Support is imperfect and for a daily driver you can probably forget about using anything newer than an M2 at the moment. On my M2, missing features include USB-C video out and microphone support. Windows on ARM is worse and has zero drivers for Mac hardware as far as I know.

  • ssivark 8 hours ago

    What are the cons of having a large TV as a monitor? I've been considering something like this recently, and I wonder why is this not more common.

    • bee_rider 7 hours ago

      Someone mentioned the latencies for gaming, but also I had a 4K TV as a monitor briefly that had horrible latency for typing, even. Enough of a delay between hitting a key and the terminal printing to throw off my cadence.

      Only electronic device I’ve ever returned.

      Also they tend to have stronger than necessary backlights. It might be possible to calibrate around this issue, but the thing is designed to be viewed from the other side of a room. You are at the mercy of however low they decided to let it go.

      • ycombinete 4 hours ago

        You could probably circumvent this by putting the display into Gaming Mode, which most TVs have. It removes all the extra processing that TVs add to make the image "nicer". These processes add a hell of a lot of latency, which is obviously just fine for watching TV, but horrible for gaming or using as a pc monitor.

        • bee_rider 3 hours ago

          It was a while ago (5 years?), so I can’t say for certain, but I’m pretty sure I was aware of game mode at the time and played with the options enough to convince myself that it wasn’t there.

      • xnx 4 hours ago

        > horrible latency for typing

        Was this the case even after enabling the TV's "game mode" that disables a lot of the latency-inducing image processing (e.g. frame interpolation)?

        • sim7c00 3 hours ago

          Game mode is a scam. It breaks display quality on most TVs, and still doesn't respond as fast as a PC monitor with <1ms latencies... it might drop itself to 2 or 3 ms, which is still at least 2x or 3x slower.

          You can think "but that's inhumanly fast, you won't notice it", but in reality this is _very_ noticeable in games like Counter-Strike, where hand-eye coordination, speed and pinpoint accuracy are key. If you play such games a lot, then you will feel it if the latency goes above 1ms.

          • eurleif 3 hours ago

            Where are you finding monitors with <1ms input lag? The lowest measured here is 1.7ms: https://www.rtings.com/monitor/tests/inputs/input-lag

            • theshackleford 2 hours ago

              Most people lack an understanding of displays, and therefore of what they are quoting, and are in fact quoting the vendor's claimed pixel response time as the input lag.

              It’s gotta be one of the most commonly mixed up things I’ve seen in the last twenty years as an enthusiast.

              • sim7c00 an hour ago

                Well, at least I didn't misunderstand my own lack of understanding :D ...

                The part about feeling the difference in response times is true, but I must say the experience is a bit dated ^^ I see that higher-resolution monitors generally have quite slow response times.

                <1ms was from CRT times :D which were my main Counter-Strike days. I do still find noticeable 'lag' on a TV vs. a monitor, though I've only tested at HD (1080p) - I own only one 4K monitor, and my own age-induced latency by now far exceeds my display's latency :D

            • sim7c00 2 hours ago

              false advertisements :D

    • swiftcoder an hour ago

      Depending on the specific TV, small details like text rendering can be god-awful.

      A bunch of TVs don't actually support 4:4:4 chroma subsampling, and at 4:2:2 or 4:2:0 text is bordering on unreadable.

      And a bunch of OLEDs have weird sub-pixel layouts that break ClearType. This isn't the end of the world, but you end up needing to tweak the OS text rendering to clean up the result.

    • tombert 8 hours ago

      I'm sure there are reasons with regards to games and stuff, but I don't really use this TV for anything but writing code and Slack and Google Meet. Latency doesn't matter that much for just writing code.

      I really don't know why it's not more common. If you get a Samsung TV it even has a dedicated "PC Mode".

      • baq 4 hours ago

        "PC Mode" or "Gaming mode" or whatever is necessary - I can tell any other mode easily just by moving the mouse, the few frames of lag kill me inside. Fortunately all tvs made in this decade should have one.

    • sim7c00 3 hours ago

      High latency on TVs makes them bad for games etc., as anything that's sensitive to IO timing can feel a bit off. Even 5ms compared to 1 or 2ms response times is very noticeable for hand-eye coordination across input -> monitor.

      • puzzlingcaptcha 2 hours ago

        It sort of depends on what you perceive as 'high'. Many TVs have a special low-latency "game" display mode. My LG OLED does, and it's a 2021 model. But OLED in general (in a PC monitor as well) is going to have higher latency than IPS for example, regardless of input delay.

        • TheOtherHobbes 2 hours ago

          OLED suffers from burn-in, so you'll start seeing your IDE or desktop after a while, all the time.

          I have a couple of budget vertical Samsung TVs in my monitor stacks.

          The quality isn't good enough for photo work, but they're more than fine for text.

      • dahauns an hour ago

        In the context of this thread that's a non-issue. Good TVs have been in the ~5ms@120Hz/<10ms@60Hz world for some time now. If you're in the market for a 4K-or-higher display, you won't find much better, even among specialized monitors (as those usually won't be able to drive higher Hz with lower lag with full 4k+ resolution anyway).

    • terribleperson 8 hours ago

      If you play video games, display latency. Most modern TVs offer a way to reduce display latency, but it usually comes at the cost of various features or some impact to visual quality. Gaming monitors offer much better display latencies without compromising their listed capabilities.

      Televisions are also more prone to updates that can break things and often have user hostile 'smart' software.

      Still, televisions can make a decent monitor and are definitely cheaper per inch.

    • jmarcher 8 hours ago

      For me, on macOS, the main thing is that the subpixel layout is rarely the classic RGB (side by side) that macOS only supports for text antialiasing.

      If I were to use a TV, it would be an OLED. That being said, the subpixel layout is not great: https://pcmonitors.info/articles/qd-oled-and-woled-fringing-...

      • bestham 7 hours ago

        IIRC Apple dropped sub pixel antialiasing in Mojave or Sonoma (I hate these names). It makes no sense when Macs are meant to be used with retina class displays.

        • ahoka 5 hours ago

          A.K.A. workaround for a software limitation with hardware. Mac font rendering just sucks.

    • 112233 6 hours ago

      For me it's eye fatigue. When you put a large 4K TV far enough away that it's the same viewing angle as a 27" desk monitor, you're almost 1.5m away from it.

    • xeonax 8 hours ago

      I have been using a 43-inch TV as a monitor for the last 10 years, currently an LG. You get a lot of screen space, and you can sit away from the desk and still use it. Just increase the zoom.

    • monkpit 8 hours ago

      Usually refresh rate and sometimes feature set. And it’s meant to be viewed from further away. I’m sure someone else could elaborate but that’s the gist.

hoangtrannn 2 hours ago

Is there a good 5K monitor at 27" that does not burn the wallet? It's worth mentioning that it should also be very reliable, because these monitors seem to have issues after a while, especially burn-in.

  • def- an hour ago

    ASUS ProArt PA27JCV

cheema33 10 hours ago

There is a lot of marketing material at the linked page. But there is no mention of price or available sizes. Also, there is no link to purchase one. This is November. I can look these things up, but why link to a PR fluff piece if there's something more substantial available?

metaphor 9 hours ago

8K HDR implies that DSC becomes unavoidable... but DSC's "visually lossless" criterion relies on the human eye and is statistically subjective at face value.

Any domain experts know how that actually squares in practice against automated colorimeter calibration?

  • amarshall 8 hours ago

    DisplayPort 2.1 (which the monitor supports) provides sufficient bandwidth for 7680x4320@60 Hz 10-bit without DSC when using UHBR20. The press release unfortunately doesn’t clarify whether the monitor supports UHBR20 or only the lower UHBR10 or UHBR13.5 speeds. Of course, the GPU must also support that (Nvidia RTX 5000 only at the moment, as I believe AMD RX 9000 is only UHBR13.5).
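
    A rough back-of-the-envelope check of that claim (ignoring blanking intervals, which add a bit on top):

        # 8K @ 60 Hz, 10-bit RGB = 30 bits per pixel
        payload = 7680 * 4320 * 60 * 30 / 1e9
        print(f"video payload   : {payload:.1f} Gbit/s")   # ~59.7

        # DP 2.1 link rates: 4 lanes, 128b/132b encoding
        for name, lane_gbps in (("UHBR10", 10), ("UHBR13.5", 13.5), ("UHBR20", 20)):
            usable = 4 * lane_gbps * 128 / 132
            print(f"{name:<8} usable : {usable:.1f} Gbit/s")
        # UHBR10 ~38.8, UHBR13.5 ~52.4, UHBR20 ~77.6 -- only UHBR20 fits 8K60 10-bit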

    • a-french-anon 5 hours ago

      I believe you're right regarding AMD's lack of UHBR20 on its cards. Fingers crossed for their next gen!

      • jsheard an hour ago

        AMD's current workstation cards do support UHBR20, just not their consumer cards, even though it's the same silicon. Artificial segmentation on GPUs is nothing new, but segmenting on display bandwidth is a strange move, especially when the market leader isn't doing that.

  • altairprime 8 hours ago

    8K 60fps 4:4:4 8bpp uncompressed requires a 96gbit HDMI cable, which is labeled Ultra96 in HDMI 2.2 afaik: https://www.hdmi.org/download/savefile?filekey=Marketing/HDM...

    DisplayPort over USB4@4x2/TB5 at 120Gbps would be required for uncompressed 12bpp.

    • metaphor 8 hours ago

      Apologies for not tracking; the monitor in question is spec'd with HDMI 2.1 and TB4 I/O.

      • altairprime 6 hours ago

        Apologies never expected when it comes to USB and HDMI naming and spec stuff. I have to look them up every time.

        But that's 8K with DSC or... 24fps maybe, then? Weird oversight/compromise for such a pro color-focused monitor; perhaps Asus reused their legacy monitor platform. "8K HDR" at 24fps could be a niche for theater movie mastering, perhaps?

bob1029 4 hours ago

I tried a 32" 4k for a while but the form factor never worked for me. 8k seems absurd after working with that monitor.

27" 1440p is much easier to drive and live with day to day. I can still edit 4k+ content on this display. It's not like I'm missing critical detail going from 4k=>qhd. I can spot check areas by zooming in. There's a lot of arguments for not having to run 4k/8k displays all day every day. The power savings can be substantial. I am still gaming on a 5700xt because I don't need to push that many pixels. As long as I stay away from 4K I can probably use this GPU for another 5 years.

  • zokier 4 hours ago

    32" 4k is pretty much the worst of all worlds configuration. It is just dense enough that traditional 100% scale is not great, but not dense enough to get that super smooth hidpi effect either. I'd argue that for desktop monitors around 200 ppi is sweet spot, so 5k for 27" or 6k for 32".

    This 8K is a bit overkill, but I suppose it makes some sense to use a standard resolution instead of some random number.

    • jon-wood 3 hours ago

      These things aren't for use in an office setting where you're fiddling with a web browser, Excel, or writing software. They're for situations where colour calibration matters, so either designing for print, or working on video.

      Particularly for the people doing video an 8k display is great - that means you can have full resolution 4k video on screen with space around it for a user interface, or you can have a display with the 8k source material on it if the film was shot at that resolution.

    • stephenr 4 hours ago

      Can confirm. I use a Dell 6K 32", and it's frankly amazing. I still use an older Dell 4K 24" (rotated 90º) off to one side for email/slack/music but I just use the single 32" for ~90% of what I do.

  • bartvk 3 hours ago

    There are two instances where 32" is helpful. First, for Xcode and Android Studio, where you write UI code with the phone/tablet preview on the right, in both horizontal and vertical orientation.

    And second, for writing and research: recently I had to get a certificate for which I had to write a portfolio of old-fashioned essays. 32" or even 40" is extremely helpful for this. Basically I kept my screen organized in three columns, with the word processor on the left and two PDFs in the middle and on the right.

  • ThatMedicIsASpy 3 hours ago

    42" 4k 100%

    I don't want to ever go back but I got this 2020 Dell for 200. I don't want to pay 800-1400 if I ever have to replace it

  • baq 4 hours ago

    I HATE (yes, all caps) Apple for very actively discouraging 1440p as a useful resolution (as in, it is literally, not figuratively, painful to use out of the box). I'm a happy customer of BetterDisplay just to make it bearable, but it's still not as sharp as on any other OS.

polaris421 8 hours ago

This looks amazing for creators — 8K, HDR, and auto calibration in one screen!

cmgriffing 9 hours ago

I shudder to think how small the macOS ui text will be on this but I’m willing to find out.

  • Kerrick 8 hours ago

    For macOS, 8K should have a larger screen. This 8K monitor is 32 inches, which leaves us with a very awkward 275ppi. 42" would be 209ppi, which is great for 16.5" from your face. 48" would be 183ppi, which is great for 18.8" from your face (my preference). But at 32" and 275dpi, that would be a 12.5" viewing distance, which is far too close for a 32" monitor. You'd be constantly moving your neck to see much of the screen--or wasting visual acuity by having it further.
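
    (Those ppi figures are just the diagonal pixel count divided by the diagonal size; a quick way to reproduce them:)

        import math

        def ppi(w_px, h_px, diag_in):
            return math.hypot(w_px, h_px) / diag_in

        for size in (32, 42, 48):
            print(f'8K at {size}": {ppi(7680, 4320, size):.1f} ppi')
        # 32" -> 275.4, 42" -> 209.8, 48" -> 183.6

        print(f'5K at 27": {ppi(5120, 2880, 27):.1f} ppi')   # 217.6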

    macOS is optimized for PPIs at the sweet spot in which Asus's 5K 27" (PA27JCV) and 6K 32" (PA32QCV) monitors sit. Asus seemed to be one of the few manufacturers that understand a 27" monitor should be 5K (217ppi), not 4K (163ppi). 4K will show you pixels at most common distances. But if you follow that same 217ppi up to 8K, that leads to 40.5" not 32".

    My wife has a triple vertical PA27JCV setup and it's amazing. I've been able to borrow it for short stints, and it's nearly everything I've ever wanted from a productivity monitor setup.

    • numpy-thagoras 8 hours ago

      Yeah I currently daily drive a 43" monitor and it has been a life changer since I got it in 2022.

      I'm still happy with it, would kill for an 8K 43" 120hz monitor but that's still a ways away.

    • zakki 7 hours ago

      What is the right size for a 4K monitor, and the distance from our eyes? I have a 27" Skyworth monitor already. If I set the macOS resolution to 4K, the default font is too small. My distance from the monitor is around 16.5".

  • Cyphus 6 hours ago

    You can scale the UI according to your preferences, but the real problem is that if your monitor’s ppi is not close to the macOS sweet spot of 220ppi (or an integer multiple thereof) you’re going to have aliasing issues with text and other high contrast elements.

    https://griffindavidson.com/blog/mac-displays.html has a good rundown.

  • pugz 7 hours ago

    I recently (a couple of weeks ago) got the 6K version of this screen, the Asus PA32QCV. It has the same pixel density as my MacBook Pro, so the UI looks great. To be honest, it's enough screen real estate that I now operate with my laptop in clamshell mode.

    My only complaint is that the KVM leaves a bit to be desired. One input can be Thunderbolt, but the other has to be HDMI/DisplayPort. That means I need to use a USB-C cable for real KVM when switching between my two laptops. I'd like two cables, but four cables isn't the end of the world.

  • SamuelAdams 9 hours ago

    You can run it natively, but it is better to downscale to 4k or 1080p. I run three 5k versions of this monitor and they are all downscaled to 1440p. I get 1:1 pixel mapping so text looks crisp in every app except Microsoft Teams.

    • Tepix 7 hours ago

      Isn't downscaling the wrong term? You're still taking advantage of its native resolution.

  • BoorishBears 9 hours ago

    It'll look normal, maybe even a little big by default if the XDR is anything to go by

    OSX does great at scaling UIs for high resolutions

veridianCrest 8 hours ago

The specs look impressive, especially the 8K HDR and built-in color calibration. It’ll be interesting to see how it performs compared to Apple’s Pro Display XDR in real workflows.

guerrilla 9 hours ago

Why does it have blinders?

  • andrewstuart2 9 hours ago

    To prevent glare and reflections usually. Similar to how a lens hood functions.

inatreecrown2 3 hours ago

i long for a “Eizo” like quality monitor, 15 or 17 inch, with “retina” ppi count.

am i the only one who thinks that this would make sense?

  • jeroenhd 25 minutes ago

    There are a few displays like that, although quality will differ on your criteria of course. Many of them are marketed as "portable monitors", some specced for gaming, others for artists, some built to be cheap.

    ASUS ProArt PA169CDV, UPerfect 184T01, Lipa AX-60 (and AX-60T), UPerfect UFilm A17, UPerfect UGame J5, and two portable screens by Verbatim, just to name a few.

  • precompute 2 hours ago

    Hey, I think that's a great idea, too. 4K panels on phones (tiny!) exist for some absurd reason. But somehow there are no 22" 4K monitors. I think they probably don't sell well. Probably the same reason why all monitors are 16:9.

    • jeroenhd 28 minutes ago

      At one point there was the ASUS ProArt PQ22UC, but I don't think that panel was produced after that stopped selling.

      If you go slightly bigger, there are the ASUS ProArt PA24US, Japannext JN-IPS2380UHDR-C65W-HSP, ViewSonic VP2488-4K, AG Neovo EM2451, and the UPerfect UColor T3.

jbellis 10 hours ago

About twice the price of the Dell 8k.

jiggawatts 9 hours ago

This is a direct competitor to the Apple Pro Display XDR.

I wouldn’t be surprised if it comes in at a similar price point.

The sustained 1,000 nit HDR and Dolby Vision support suggest their target market is very specifically film color grading.

  • rainbaby 7 hours ago

    it’s already on sale in the Chinese market for about 70,000 CNY, so the price is likely around 9,000–10,000 USD.

efficax 9 hours ago

realistically what’s the point of all those pixels at 32 inches? 5k at 27 inches seems more than enough.

  • jeswin 8 hours ago

    If you need 5K at 27 inches, you need more at 32". But if you're saying that 32" is excessive, I think that's a personal preference. I would never personally go back to a smaller monitor (from 32) - especially as you grow older.

  • metaphor 8 hours ago

    Apparently, ASUS believes there's an addressable market willing to pay a premium for +26.5% color-calibrated ppi in larger form factor.