Apple Studio Display and Studio Display XDR (apple.com)
227 points by victorbjorklund 1 day ago | hide | past | favorite | 295 comments



I might be the only one, but to this date (dating all the way back to the first 5K iMac display in 2014), Apple is the only company that truly gets HiDPI desktop displays, with high-quality gloss and 200+ ppi at screens this large. In the meantime, popular and widely sold gaming screens with matte blur filters and mediocre ppi give me headaches and eye fatigue after a few hours of use. The prior-generation Studio Display is the only external display that has truly worked for text-heavy work with my eyes (including software engineering), and I'm sure the latest generation is fantastic as well.

The hardware is great, but the software is lacking. macOS only supports resolution-based scaling which makes anything but the default 200% pixel scaling mode look bad. For example, with a 27" 4K display many users will want to use 150% or 175% scaling to get enough real estate, but the image will look blurry because macOS renders at a higher resolution and then downscales to the 4K resolution of the screen.

Both Windows and Linux (Wayland) support scaling the UI itself, and with their support for sub-pixel anti-aliasing (that macOS also lacks) this makes text look a lot more crisp.
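For the curious, the downscaling step described above can be sketched in a few lines. This is a rough model of the pipeline, not Apple's actual implementation; the panel resolution and "looks like" modes are the illustrative 27" 4K case from the comment:

```python
def macos_scaled_pipeline(looks_like, panel=(3840, 2160)):
    """Model of macOS scaled modes: the desktop is rendered at 2x the
    chosen 'looks like' resolution, then that backing store is
    resampled down to the physical panel resolution."""
    backing = (looks_like[0] * 2, looks_like[1] * 2)
    downscale = panel[0] / backing[0]  # 1.0 means a pixel-perfect 1:1 mapping
    return backing, downscale

# "Looks like 2560x1440" (what many want on a 27" 4K panel):
print(macos_scaled_pipeline((2560, 1440)))  # ((5120, 2880), 0.75)

# Default "looks like 1920x1080" maps exactly 2:1 onto the panel:
print(macos_scaled_pipeline((1920, 1080)))  # ((3840, 2160), 1.0)
```

The 0.75 factor in the first case is the source of the blur: every backing-store pixel is resampled onto three quarters of a panel pixel, so edges land between physical pixels.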


I would love to see examples of this. I have a MBP and a 24" 4K Dell monitor connected via HDMI. I use all kinds of scaled resolutions and I've never noticed anything being jagged or blurry.

Meanwhile in Linux the scaling is generally good, but occasionally I'll run into some UI element that doesn't scale properly, or some application that has a tiny mouse cursor.

And then Windows has serious problems with old apps - blurry as hell with a high DPI display.

Subpixel antialiasing isn't something I miss on macOS because it seems pointless at these resolutions [0]. And I don't think it would work with OLED anyway, because the subpixels are arranged differently than on a conventional LCD.

[0] I remember being excited by ClearType on Windows back in the day, and I did notice a difference. But there's no way I'd be able to discern it on a high DPI display; the conventional antialiasing macOS does is enough.


I'm more surprised that you're using a 24" display at any resolution. Of course, everyone has different preferences, but that just seems ridiculously small considering how readily available larger displays are, probably at the same ppi and refresh rate.

I'm personally on the old 30" 16:10 2560x1600 form factor, and it's wildly better visually than the 27" 1440p screen by the same brand (all of them Dell) I use at the office.


> I'm more surprised that you're using a 24" display at any resolution

I have a 24" 4K Dell I bought when big 4K screens with good (measured) colors were still expensive. It's a very pleasant screen to use. Sure, it has less real estate than a bigger one, but this is somewhat mitigated by the fact that I can keep it closer to my eyes, so I can use smaller text.

I find it makes me more "focused" in a way. Can't have multiple windowfuls of crap visible at the same time. It's very practical for TWMs. It also works well in a dual screen scenario, for stronger separation when you need it, but I'm still not sure if a single bigger screen is better than two smaller ones for things like having docs up next to code for example.

I find I can't use two 27" or higher screens; they're just too big and I need to turn my head way too much for comfort. At work we have a 2x27" 4k setup, and I basically only use the screen in front of me. Lately I've been experimenting with pushing them very far away, but then I just need to increase text size and lose actual real estate.

> but that just seems ridiculously small considering how available larger displays are for the same ppi and refresh rate probably

I don't particularly care about refresh rates above 60 Hz (my laptop does 120 Hz, can see the difference, don't care). But I do care about PPI. Which screens are easily available with the PPI of a 4K 24"? I'd expect something like 5k 27" or 6k 32". These are very expensive (>1000 € for a crappy 27" Samsung, 2000 for a 32" Dell) and not that common, at least in France.
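Pixel density follows directly from the diagonal and resolution, so the comparison above is easy to check. Note the 32" 6K row assumes a 6016x3384 panel (the Pro Display XDR-class resolution); other 6K panels may differ slightly:

```python
import math

def ppi(diag_inches, width_px, height_px):
    """Pixels per inch from diagonal size and native resolution."""
    diag_px = math.hypot(width_px, height_px)  # diagonal in pixels
    return diag_px / diag_inches

for name, d, w, h in [
    ('24" 4K', 24, 3840, 2160),
    ('27" 4K', 27, 3840, 2160),
    ('27" 5K', 27, 5120, 2880),
    ('32" 6K', 32, 6016, 3384),
]:
    print(f'{name}: {ppi(d, w, h):.0f} ppi')
# 24" 4K: 184 ppi
# 27" 4K: 163 ppi
# 27" 5K: 218 ppi
# 32" 6K: 216 ppi
```

So a 5K 27" or 6K 32" actually comes out a bit denser than a 4K 24", which is consistent with treating them as the same "Retina-class" tier.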


> I'm personally on the old 30" 16:10 2560x1600 form factor

I sorta wish that form factor had taken off instead of 27" 1440p. The extra vertical space is really nice, and that seems to be the ideal PPI for 100% scaling IMHO.

I keep telling myself I'd like to get a 4K OLED display at the same PPI, but 40" seems to be conspicuously missing in every monitor lineup... at least at a price that will convince me to buy three of them, anyway.


Agree! I still have several (now discontinued) Philips 40 inch monitors, and that is the perfect size to do programming work. Very little scrolling needed while you work. But I would love to have a 40 inch in 4K+ instead of 2560x1600, why is no one making these? (I did get a Samsung 8K 50 inch, but that's too large for a multi screen setup)

Agreed. I'm hoping that some more decent 6k 32" screens come out this year, but they're still all 16:9 which just sucks imo

I took one of my dual 24" office monitors during Covid WFH and ended up keeping it when I quit that job. I use it as a second display alongside the MacBook which is on a stand.

I think the largest I would want at my current desk is 27". 30 is way too big for me. But more importantly I want something that matches the crispness of the MBP display, and 1440p and 1600p are too low res.


This [1] has good examples. 24" 4K is on the smaller side and so less noticeable than on larger displays like 27" or 32".

[1] https://bjango.com/articles/macexternaldisplays2/


I have a Macbook pro and a Linux machine attached to my dual 4k monitors.

Fonts on Linux (KDE Plasma on Wayland) look noticeably sharper than the Mac. I don't use subpixel rendering either. I hate that I have to use the Mac for work.


This is correct and also increasingly affecting me as my eyes age. I had to give my Studio Display to my wife because my eyes can't focus at a reasonable distance anymore, and if I moved back further the text was too small to read. I ran the 5K Studio Display at 4K scaled for a bit but it was noticeably blurry.

This would've been easily solved with non-integer scaling, if Apple had implemented that.

(I now use a combo of 4K TV 48" from ~1.5-2 metres back as well as a 4K 27" screen from 1 m away, depending on which room I want to work in. Angular resolution works out similarly (115 pixels per degree).)
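The "works out similarly" claim above can be verified from panel width and viewing distance. A 16:9 aspect ratio is assumed for both screens, and the distances are the ones the commenter gives:

```python
import math

def pixels_per_degree(width_px, diag_inches, distance_m, aspect=16/9):
    """Horizontal pixels per degree of visual angle at a given distance."""
    # physical width from the diagonal and aspect ratio
    width_in = diag_inches * aspect / math.hypot(aspect, 1)
    width_m = width_in * 0.0254
    # total horizontal visual angle subtended by the screen
    angle_deg = math.degrees(2 * math.atan(width_m / (2 * distance_m)))
    return width_px / angle_deg

print(round(pixels_per_degree(3840, 48, 1.75)))  # 48" 4K TV at ~1.75 m -> 114
print(round(pixels_per_degree(3840, 27, 1.0)))   # 27" 4K at 1 m -> 115
```

Both setups land at roughly 115 pixels per degree, matching the figure in the comment.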


All through the 2000s Apple developed non-integer scaling support in various versions of MacOS X under the banner of “resolution independence” - the idea was to use vectors where possible rather than bitmaps so OS UI would look good at any resolution, including non-integer scaling factors.

Some indie Mac developers even started implementing support for it in anticipation of it being officially enabled. The code was present in 10.4 through 10.6 and possibly later, although not enabled by default. Apple gave up on the idea sadly and integer scaling is where we are.

Here’s a developer blog from 2006 playing with it:

> https://redsweater.com/blog/223/resolution-independent-fever

There was even documentation for getting ready to support resolution independence on Apple’s developer portal at one stage, but I sadly can’t find it today.

Here’s a news post from all the way back in 2004 discussing the in development feature in Mac OS tiger:

> https://forums.appleinsider.com/discussion/45544/mac-os-x-ti...

Lots of folks (myself included!) in the Mac software world were really excited for it back then. It would have permitted you to scale the UI to totally arbitrary sizes while maintaining sharpness etc.


Yep, I played with User Interface Resolution app myself back then in uni. The impact of Apple's choice to skip non-integer scaling didn't hit me until a few years ago when my eyes started to fail...

> This is correct and also increasingly affecting me as my eyes age. I had to give my Studio Display to my wife because my eyes can't focus at a reasonable distance anymore, and if I moved back further the text was too small to read.

> (I now use a combo of 4K TV 48" from ~1.5-2 metres back as well as a 4K 27" screen from 1 m away, depending on which room I want to work in. Angular resolution works out similarly (115 pixels per degree).)

The TV is likely a healthier distance to keep your eyes focused on all day regardless, but were glasses not an option?


If you can get used to using it (which really just requires some practice), the screen magnifier on Mac is fantastic and most importantly it’s extremely low latency (by this I mean, it reacts pretty much instantly when you want to zoom in or out).

Once you get used to flicking in and out of zoom instead of leaning into the monitor it’s great.

As an aside, Windows and Linux share this property too nowadays. Using the screen magnifiers is equally pleasant on any of these OSes. I game on Linux these days and the magnifier there even works within games.


Oh man... I'm in the same situation wrt eyesight. Are you coding on the 4K tv? I have enough space to make that configuration work. TIA

Yep, 4K is plenty of resolution for me running Sequoia. But I run it at a simulated 1920x1080@2x, since at native 4K text would be way too small.

Thank you!

I just tested on my 4K display and 150% and 175% were not blurry at all. I'm on a 32 inch 4K monitor. Is it possible this information is out of date and was fixed by more recent versions of macOS?

Absolutely not fixed. Try looking at black text on a white background. It's not very obvious, but still a little annoying.

> For example, with a 27" 4K display many users will want to use 150% or 175% scaling to get enough real estate, but the image will look blurry

I use a Mac with a monitor with these specs (a Dell of some kind, I don't know the model number off the top of my head), at 150% scaling, and it's not blurry at all.


I also feel it's just fine. Not as amazing as the Apple displays, but I'll have to sit really close to make out the difference for text.

> For example, with a 27" 4K display

4K pixels is not enough at 27" for Retina scaling.

Apple uses 5K panels in their 27" displays for this reason.

There are several very good 27" 5K monitors on the market now around $700 to $800. Not as cheap as the 4K monitors but you have to pay for the pixel density.

There are also driver boards that let you convert 27" 5K iMacs into external monitors. I don't recommend this lightly because it's not an easy mod but it's within reason for the motivated Hacker News audience.


Apple still uses an ancient 450nm panel though; nowadays everyone and their dog has moved to 455-460nm ones. The 450nm one is considerably harsher on my eyes.

If your Mac goes bad it can be worthwhile. My friend gave me his pre-Retina 27" iMac, part of the circa-2008 generation of Macs whose GPUs all failed.

I removed all the computing hardware but kept the Apple power supply, instead of using the cheapo one that came with the LCD driver board I bought. I was able to find the PWM specs for the panel, and installed a cheap PWM module with its own frequency & duty-cycle display to drive it and control brightness.

The result is my daily desktop monitor. Spent way too much time on it, but it works great!


Have you ever seen a MacBook air's screen? Those use fractional scaling and look fine.

> macOS renders at a higher resolution and then downscales to the 4K resolution

That seems weird to me. I remember 20 years ago one of the whole points of macOS version 10 was display PDF, i.e. a vector based UI.


While the original OS X display model, Quartz, evolved from Display PDF via NextStep, I believe that it shifted back to pixel rasterization to offload more of the display stack onto the GPU.

Quartz Extreme?

John Siracusa, Ars Technica:

It's possible that existing consumer video cards could be coerced into doing efficient vector drawing in hardware. Apple tried to do just that in Tiger [note], but then had to back off at the last minute and disable the feature in the shipping version of the OS. It remains disabled to this day.

[note] https://arstechnica.com/reviews/os/macosx-10.4.ars/14

https://arstechnica.com/staff/2006/04/3720/


Yeah this is correct, I don't know why you're being downvoted. The decisions Apple made when pivoting their software stack to high-DPI resulted in Macs requiring ultra-dense displays for optimal results - that's a limitation of macOS, not an indictment of less dense displays, which Windows and Linux accommodate much better.

Wayland supports it (and Chrome supports it very well) but GTK does not. I run my UI at 200% scaling because graphical Emacs uses GTK to draw text, and that text would be blurry if I ran at my preferred scaling factor of 150% or 175%.

GTK uses Pango/Harfbuzz and some other components to draw text, all of which are widely used in other Linux GUI stacks. GTK/GDK do not draw text themselves, so your complaints are not with them directly.

I'm not asserting that text is being rendered incorrectly. I'm asserting that after rendering, the text is being downsampled.

This works with GTK for me at least. I've been using Gnome+Wayland with 150% scaling for almost 4 years now, and I haven't noticed any issues with GTK. Actually, my experience is essentially backwards from yours—anything Electron/Chromium-based needed a bunch of command-line flags to work properly up until a few months ago, whereas GTK apps always just worked without any issues.

If you're using a high-DPI monitor, you might not notice the blurriness. I use a standard 110-DPI monitor (at 200% scaling in Gnome) and I notice it when the scaling factor is not an integer.

Or more precisely, I noticed it eventually as a result of my being primed to notice it after people on this site insisted that GTK cannot handle fractional scaling factors.

Compared to the contents of a browser's viewport, Emacs and the apps that come with Gnome are visually simple, so it took me a year or 2 to notice (even on a standard 110-DPI monitor used at 150% and 175% scaling) any blurriness in those apps since the app I'm most conditioned to notice blurriness is my browser, and Chrome's viewport is resolution independent except when rendering certain image formats -- text is always non-blurry.

Yes, Chrome's entire window can be quite blurry if Xwayland is involved, but it now talks to Wayland by default and for years before that could be configured to talk Wayland, so I don't consider that worth talking about. If Xwayland is not involved, the contents of Chrome's viewport is non-blurry at all scaling factors except for the PNGs, JPGs, etc. For a long time, when run at a fractional scaling factor under Gnome (and configured to talk Wayland) the only part of Hacker News that was blurry was the "Y" logo in the top left corner, then about 2 years ago, that logo's PNG file was replaced with an SVG file and the final bit of blurriness on HN went away.


> If you're using a high-DPI monitor [...] I use a standard 110-DPI monitor (at 200% scaling in Gnome)

FWIW, I'm using a 184 DPI monitor with 150% scaling.

> you might not notice the blurriness. [...]

> Compared to the contents of a browser's viewport, Emacs and the apps that come with Gnome are visually simple, so it took me a year or 2 to notice

I'm pretty sensitive to font rendering issues—to the point where I've complained to publishers about their PDFs having unhinted fonts—so I think that I would have noticed it, but if it's really as subtle as you say, then maybe I haven't.

I do have a somewhat unusual setup though: I'm currently using

  $ gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer','xwayland-native-scaling']"
although that might not be required any more with recent versions. I've also enabled full hinting and subpixel antialiasing with Gnome Tweaks, and I've set the following environment variables:

  MOZ_ENABLE_WAYLAND=1
  QT_QPA_PLATFORM=wayland
  GDK_BACKEND=wayland,x11,*
  CLUTTER_BACKEND=gdk,wayland
  SDL_VIDEODRIVER=wayland
  SDL_VIDEO_DRIVER=wayland
  ECORE_EVAS_ENGINE=wayland_egl
  ELM_ENGINE=wayland_egl
  QT_AUTO_SCREEN_SCALE_FACTOR=1
  QT_ENABLE_HIGHDPI_SCALING=1
So maybe one of those settings would improve things for you? I've randomly accumulated most of these settings over the years, so I unfortunately can't really explain what (if anything) any of them do.

> Yes, Chrome's entire window can be quite blurry if Xwayland is involved, but it now talks to Wayland by default

Ah, good to hear that that's finally the default; that probably means that I can safely remove my custom wrapper scripts that forced those flags on.


Do you notice blurriness on MacOS when the Settings app (name?) has been used to change the scaling factor to a fractional value?

Sorry, but I haven't ever used a Mac, so I unfortunately can't answer that. I've used Windows with fractional scaling, and most programs aren't blurry there, but the few that don't support fractional scaling are really blurry.

I bought that original 5k iMac on release day in 2014. I was thrilled with that display, and stoked to see the entire display industry go the route of true quadruple-resolution just like smartphone displays did.

Sadly, it basically never happened. There was the LG display that came out a couple of years later. It didn't have great reviews, and it was like two thirds the cost of an entire 5k iMac.

It took Apple over 7 years to release their standalone 5k display, and there are a few other true 5k displays (1440p screen real estate with quadruple-resolution, not the ultrawide 2160p displays branded as "5k") on the market now with prices just starting to drop below 1,000 USD.

Unfortunately in that time I've gotten used to the screen real estate of the ultrawide 1440p monitors (which are now ubiquitous, and hitting ridiculous sub-$300 prices). As of now, my perfect display for office work (gaming, video/photo work, or heavy media playback are different topics) would be 21:9 with 1440p screen real estate with quadruple-resolution—essentially just a wider version of that original 5k iMac display.


I bought an LG Ultrafine 5k at the time and felt kind of stupid for spending so much on it. But nearly 10 years later... it's still my daily driver. Best ROI of any tech equipment I've bought. It changed how I think about it: not just the monitor, but having the speaker / camera / mic built in, all over one cable. It's been such a joy, when I bounce around the house, to be able to plug in / unplug so easily, or when I swap from work to personal laptop. It's such a simple setup. I'm definitely considering the Apple one, basically regardless of what it costs, once it's time. It's simply been too convenient to have a one-plug solution for the laptop that has everything I need, never breaks (my LG may be the exception here lol), and that has somehow taken forever to be superseded by something better.

Only thing that holds back that thought lately is that I'm suddenly spending more and more time in multi-pane terminals, and my screen real estate needs have dropped. The only two things I greatly miss now on my laptop are keyboard quality and general comfort (monitor height, etc.).


The iMac Pro is nearly 9 years old at this point. At the time, there was no other option for a retina-quality 27" display, but you could get a 4k 27" for $400.

A decade later, it boggles my mind that it's so hard to find a retina-class desktop monitor. The successor to the Cinema Display is basically an iMac, and priced like it. There have very recently been releases from ASUS and BenQ, but it still feels like an underserved niche, rather than standard expectation.

All that is to say: hard cosign.


You can get a 27 inch 5k from Asus for $750. A 31.5 inch 6K goes for around $1200. A 28 inch 4K is around $350-$400.

Anyone reading this, I am begging you to please thoroughly test anything that comes out of ASUS before committing. Maybe only purchase with a generous return policy and possibly insurance. They are decent panels, but everything around the panel is horrendous: random connection errors with different machines, poor UX for switching inputs, takes a millennium to boot up and connect to the screen, forget about any support, and if you use the built-in speakers you'd be better off with a tin can connected to your computer.

You get what you pay for with ASUS.


I've frankly had worse experiences with Samsung and better experiences with LG. The model I have is pretty bare bones, which is much better than the Samsung 27 inch 5k I had that just died on me after a couple of years. The LG 28 inch 4k is going on its 6th year. I think if I buy a 6K, I'll wait for the LG to come down in price a bit ($2k for LG vs $1300 for Asus on Amazon).

They all suck in their own ways. In my experience LG has random hardware failures (like one audio channel just dying; how, I don't know), is still kinda slow to boot (though this has gotten better), and their designs can be hit or miss (terrible stands, aesthetics that aren't ergonomic enough, etc.). Samsung has been better for me but suffers from variations of the above.

These brands all have glowing fans online pushing their products (the flamewars about ASUS made me hesitate to even comment), but they burn their reputations customer by customer, and I guess enough have been burned that Apple is able to maintain enough sales.


It was also really disappointing to see 24" 4k displays disappear from the market instead of becoming the new standard resolution for that size. A few years ago, there were several options including a cheap LG that was usually around $300 or less. Those all seem to be gone, likely for good, even though there are still plenty of 24" displays with 1080p and even a fair number with 1440p.

Indeed. I’m holding on to my 24” Dell P2415Q that I got like 10 years ago because it’s the perfect size for my desk and there just isn’t anything in that size to replace it with.

I've been very pleased with my ViewSonic VP2488-4K. A little steep for $550, but if you spend any significant time in front of the screen I think it's very much worth it. I'm planning to buy a second one.

The entire monitor market is completely dominated by televisions and it's really, really obvious.

The LG UltraFines were garbage, but got better over time as either the firmware improved or macOS added drivers that worked around the nonsense. For a while I ran with two of them on an iMac Pro with a 5K itself, but switched to a single Pro Display XDR with a laptop eventually. I'm very sad to see the 6K/32" form disappear; it's by far the best screen I've ever used.

Asus ProArt Display 6K PA32QCV

Since about six months ago, 4th quarter of 2025.

I haven't got one yet, but it has the magic Mac 218 dpi for $1289


The Studio Display shares a panel with the MSI MPG 271KRAW16

Worth noting that these (and the LG with the same panel) aren’t shipping yet.

Even the new one in this post?

Yes. That MSI monitor was unveiled at CES 2026, alongside several other monitors that use the same panel, such as the LG 27GM950-B.

That's pretty good. I think monitor sales have slowed overall, so now they can focus on higher-end stuff to make some money, even if it's for niche products at first.

I just saw a brand new display for 70 bucks at a store the other day; the margins must be extremely low.


I just want to know who's naming these things, it's been like this forever.

Why can't it be something simple?


> Why can't it be something simple?

Because monitors aren't simple. There are dozens of axes along which they can be scaled.

They have resolution (1080p FHD, 1440p QHD, 4K, 5K, 6K, 8K), aspect ratio (16:9, 8:5, 4:3, 3:2, 21:9, 32:9), refresh rate (60 Hz, 75 Hz, 120 Hz, 144 Hz, 165 Hz, 240 Hz, 360 Hz, 480 Hz, 1 kHz, and of course adaptive refresh rate tech including G-Sync), colour quality (depth and accuracy), contrast ratios for HDR, panel technology (LCD-TN, LCD-IPS, LCD-VA, OLED, QD-OLED, WOLED, and now RGB stripe OLED), backlight technology (CCFL, edge-lit LED, miniLED, microLED), connectivity (HDMI/DP, USB-B, USB-C, DP alt mode, Thunderbolt, 3.5 mm, and KVMs).

It's very hard to stuff all this information in one neat model number.

On the consumer's part it makes sense to understand these features and what is necessary for one's use case, filter monitors by said features, and note down the model numbers that satisfy the requirements.


But they make it like this. They also have the power of simplifying their offers.

Simplifying their offerings for the sake of the model number doesn't make any sense. Simplifying their offerings for other reasons might make sense, but the companies themselves would be the best judge of whether or not it makes sense for them.

I feel like they do it deliberately, so that you can’t easily research their products and find if they are out of date. They can sell you a monitor from 2012 as if it’s brand new, because you have no idea what it is.

So apple is just selling generic white labelled slop as a $5000 premium display?

> So apple is just selling generic white labelled slop

There are only a handful of flat-panel manufacturers worldwide: AU Optronics, Innolux, LG Display, Samsung Display, Sharp Display, and recently BOE Display. Apple has to use one of these, even for its bespoke, notched, curved iPhone/iPad displays.

This new 5K 2304-zone panel was developed by LG Display, and is not 'generic white-labelled slop' by any means. It is an extremely good panel in its own right, probably the bleeding edge of LCD technology today achieving top-notch responsiveness, contrast, and colour depth and accuracy.

That MSI monitor will probably retail for ~£800 as will the Asus and LG equivalents, which is not a trivial amount for a monitor. Apple just marked it up 3×, as they are prone to do for anything.


The Apple monitor will likely have better speakers, and I'm not even sure the others will have microphones at all. Apple also does a better job with color accuracy/consistency, at least historically. There's still a sizeable markup, but it's not entirely for nothing.

Back in the day (~15 years ago), when 4K monitors were unheard of and even Apple's high-end displays were still 1440p, you could get a bottom-dollar monitor using one of their panels (e.g. Yamakasi Catleap Q270) for about a third of the price. However, it came with no amenities, a single connector (dual-link DVI only), a questionably legal power cable, and no built-in scaling. The vendors, presumably to prevent refunds, even asked for your graphics card model before selling it to you, because it wouldn't work with low-end cards. Oh, and there were very few in the U.S., so you were typically getting them shipped straight from abroad, customs duties and all.

We've definitely come a long way.


Apple monitors are one of those things that are absolutely worth buying on release, but every month after that they get a worse and worse value.

After a few years, the "cheap ones" have usually caught up, if you're willing to do the research.


I disagree; the software and excellent integration in the ecosystem have always differentiated Apple, and even years later, models from ASUS are still headaches when it comes to everything outside the panel. It's like when gamers used to compare Apple spec by spec (i.e. CPU, RAM, disk) and valued all the software they provide at $0.

These days they still value software at $0 but the specs have become quite competitive and many times exceed what the rest of the market offers.


Sure, all I'm pointing out is the prices don't go down - so that you might as well buy as soon as they're released and get the most value.

Whereas with their laptops and almost everything else you might as well wait if you can, next year's is gonna be better and/or cheaper.


There’s a solid use case for matte screens. I use an 800R curved monitor and there’s absolutely no way that would work for me if it wasn’t matte. I know this because when I glance over at my coworker’s 1200R glossy screen it’s like looking in a funhouse mirror.

Edge use case I know.


Does gloss mean reflective? Like where I can see the room lights reflecting off my screen. I never considered the possibility that someone might consider that a good thing.

In an environment with little to no reflections, gloss looks so much better. It becomes truly transparent with no distraction. Matte displays always have a little frost to them.

If you do most of your computing in a prepared or controlled room, I can see the logic in that, although I think I'm not personally nearly sensitive enough to care.

For me though, I am frequently working in different rooms with arbitrary lighting situations. Net effect of the gloss is negative for me unquestionably.


This is a monitor, not a laptop. I pretty much set it down and never moved it again. In my case, a glossy glass screen is ideal.

What kind of environment is that? Maybe if you're a black person wearing black clothes, no glasses (maybe contacts are ok?) in a room with closed curtains, no lights and nothing reflective, sure.

I used to daily drive an Apple Thunderbolt Display (the last non-retina one, 2560x1440). That thing was atrocious. I could often see the reflections of my glasses, or a white glare if I was wearing a white shirt. At night, in a dark office (lights off, just whatever came in from the street).

I'm typing this on a matte "ips black" dell ultrasharp something-or-other at 10% brightness, wearing glasses, a white t-shirt, with an overhead light, and see no reflection or glare on my screen. There's no way in hell I'd go back to a shiny screen.

I understand "anti-glare" technology has improved. The most recent apple screen I've tested is an m1 mbp. It seems somewhat better than my 2013 mbp, but still a worse experience than my 2015 (or thereabouts) 24"@4k dell, which is pretty old technology. My 2025 lenovo has a screen that's much more comfortable to use inside.

Paradoxically, I'd say the one environment where I prefer my macs to my matte screens is in bright sunlight. Sure, there are more reflections than you can shake a stick at, but there's always an angle where you can see the part of the screen you want. You have to move around, which is obviously annoying, but you can see. The matte screens just turn to mush. Luckily for me, I hate being out in the sun, so I never encounter this situation in practice.

I think the "frost" you're talking about depends a lot on the screen implementation. I tested once an HP model, 27"@4k, and it did have such an effect. Anecdotally, it didn't handle reflections all that well, either. So maybe it's just a question of lower quality product?


Personally, I can't handle glossy displays, trying to read with reflections gives me a headache. Most other manufacturers offer both glossy and matte, except for Apple, because they know better.

The nano-texture matte finish is available as an option

You should try some of the newer OLED panels. They're all glossy and look really good.

Text sucks in oled displays. 200 ppi is not enough to make it look decent.

OLED smartphones have much higher ppi to deal with this.


Upcoming OLED panels are switching to vertical RGB stripe, similar to LCDs, which should fix the remaining text issues.

https://www.tomshardware.com/monitors/lg-display-reveals-wor...


WOLED handles text much better than QDOLED, I don't think anyone would say the 27" 4k versions "suck"

> Text sucks in oled displays.

Not anymore, as long as you make sure that any RGB antialiasing is turned off. Linux defaults to disabling this and doing only grayscale antialiasing, so it looks great on an OLED out of the box. Windows can be configured to do this.


I have no idea what you mean by "Linux defaults to" ... what possible Linux-wide global could there be for antialiasing? Apps are free to turn on different kinds of antialiasing for text rendering all by themselves.

Default configurations in font rendering on typical distributions.

Low-res is low-res. Curves on SVGs and vector graphics look terrible.

4k OLED text is great.

LG used to with the Ultrafine 5k (I believe it's discontinued now?)

I got a deal on a used one last year and I love it. It's the only monitor I've used plugged into a MacBook that didn't look notably off (worse) compared to the MacBook's display sitting next to it. Only thing a bit jarring is it's 60Hz but I can live with it.


The $1600 Studio Display is also 60hz, including this "brand new" one (which appears to be the exact same, just with a new web cam?)

Asus has picked up the 5k 27" monitor from LG, it's the $730 PA27JCV


I've been using a work-issued one since 2018, and my only complaint in 2026 is that some of its rear USB ports are failing.

You are not the only one.

I have an ASUS ProArt Display 27” 5K. And I somewhat regret it.

I love the pixel density. But I don’t love the matte finish. Which is apparently a controversial take. But I really don’t. I like the crisp pop of typography you get with a glossy display. And, for UI design, the matte finish just doesn’t “feel” like the average end-user experience. I am constantly pushing Figma between my laptop display and my monitor to better simulate what a design will look like on an average glossy LCD or OLED display.


I've got that display, too, and quite like it. Matte finish is essential (IMO) if you're annoyed by reflections.

> In the meantime popular and widely sold gaming screens with matte blur filters and mediocre ppi give me headache and eye fatigue after a few hours of use.

I presume you also mean "when used for text heavy work" here, yes? Or do you mean that these displays tire out your eyes even when used "for what they're for", i.e. gaming? (Because that's a very interesting assertion if so, and I'd like to go into depth about it.)


Agreed.

I constantly see people saying Apple displays are a terrible value. Last Apple display I had was the Thunderbolt 27 but from now on I'm sticking with Apple.

I've had nothing but issues with non-Apple monitors as well. Customer service ime is non-existent if you need a repair. For something I rely on to get work done, I'm starting to think the premium is worth it.


> Apple is the only company that truly gets HIDPI desktop displays with high quality gloss and 200+ ppi at screen this large.

And somehow they completely forgot how to seamlessly work with displays in general. Connect multiple displays via Thunderbolt? Nope. Keep layouts when switching displays? No. Running any display at more than 60Hz? No. Remember monitor positions? No.


Great news. Apple announced a 120hz display today.

There are other 120Hz displays than Apple's.

There are even 240Hz displays.

IIRC Apple couldn't get above 60Hz even on third-party displays they explicitly advertised.


I have an Alienware AW2721D and my M series Macs have no problem driving it at 240hz. macOS picks up that it’s a GSync display and supports VRR on it too.

I could never get my two ASUS displays to work at anything but 60Hz

My other setup has an ASUS PA278CGV as a secondary monitor and the MBP hooked up to it drives it at 144hz no problem.

Make sure your dock, dongle, and/or cables aren’t bottlenecks.


> Make sure your dock, dongle, and/or cables aren’t bottlenecks.

I've switched docks, dongles, cables, to no avail.

Support also varies a lot between M chips, and Thunderbolt often doesn't support high refresh rates https://support.apple.com/en-us/101571

I can't remember now the actual setup I had, sadly


My MacBook M3 Air & Pro laptops can run two QHD displays with one at 240 Hz and the other at 120 Hz. What it can't do is run either above 60 Hz with HDR enabled. But for my use cases, I've never needed more than 60 Hz anyway.

There are 5k displays at 240hz?

How many 27” 5k 120hz+ high PPI are shipping right now? Reddit is particularly clowning on this for the refresh rate and completely ignoring the resolution.

This is a workstation-class monitor for people using these machines to make money. It's not a gamer toy monitor. People on Reddit don't get this. Apple's monitors are fantastic for those of us who use our computers to make money and need high quality. I am not playing video games on the same machine I use to make money.

Driving my LG oled at 120hz over HDMI. What?

?

Both of my LG ultrawides work at 144Hz?


(I think) what you are thinking of was something introduced around the Catalina>Big Sur transition, when the Pro Display XDR was introduced.

At the time, people were "marveling" at the magic of Apple, and wondering how they did the math to make that display work within bandwidth constraints.

The simple answer was "by completely fucking with DP 1.4 DSC".

I had at the time a 2019 (cheesegrater) Mac Pro. I had two Asus 27" 4K HDR 144Hz monitors, that the Mac had no problems driving under Catalina.

Install Big Sur. Nope. With the monitors advertising DP 1.4, my options were SDR@95Hz, HDR@60Hz. I wasn't the only one, hundreds of people complaining, different monitors, cards, cables.

I could downgrade to Catalina: HDR@144Hz sprung back to life.

Hell, I could on the monitors tell them to advertise DP 1.2 support, which actually improved performance, and I think I got SDR@120Hz, HDR@95Hz (IIRC).

So you don't deserve downvotes on this. Apple absolutely ignored standards and broke functionality for third party screens just to get the Pro Display XDR (which, ironically, I own, although now it's being driven by an M2 Studio, versus the space heater that was the Xeon cheesegrater).


I was using a dell S3225QC with 120 hz and even variable rate with macbook m1 pro. No hdr with 120 or variable rate though, only at 60.

So the $1600 Studio Display does not have 120hz.

Here’s some monitors you can buy at that price point:

- 6k 32” monitor (similar PPI) (Acer PE320QX)

- most high-end 4k displays (even OLEDs) with 144hz+ refresh rate

32” 4k isn’t great PPI, but it’s still fine PPI, at a reasonable distance. Double the refresh rate is a much more noticeable improvement to me than 40% better pixel density, at a distance where retina matters a bit less than laptops & handhelds. And you can get that for less than half the cost

Plus, you can get it with multiple outputs & KVM to switch between MacBook & PC. And still run it off a single USB C cable.


Do you notice 120Hz and above when doing office tasks? I'd much rather have improved resolution and PPI rather than 120Hz for that use case.

120 Hz vs 60 Hz? Night and day. Immediately noticeable just by moving the mouse pointer. Would expect improvements in scrolling to be apparent to even the most casual passers-by.

120 Hz can also noticeably improve frame pacing for 24p video*.

120 Hz vs 144 Hz? Barely noticeable when flipping between the two. Not sure if I'd pass an ABX test with 100% accuracy.

Can't speak for 240 Hz or higher, as I haven't used them.

* Though 119.88 Hz is probably a better default for this since most non-DCI "24p" video is still 23.976 FPS; this is changing, but until browsers and streaming apps support VRR for video, I'm not convinced this is a good thing due to the mountain of legacy 23.976 FPS content.
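The pulldown arithmetic behind this is easy to check. In the sketch below, 23.976 and 119.88 approximate the exact NTSC rates 24000/1001 and 120000/1001; an integer refreshes-per-frame ratio means perfectly even pacing, anything else means judder:

```python
# How many display refreshes each film frame gets. An integer ratio
# means every frame is shown for the same duration (even pacing);
# a non-integer ratio forces uneven pulldown (judder).
for display_hz in (60.0, 120.0, 119.88):
    for video_fps in (24.0, 23.976):
        ratio = display_hz / video_fps
        even = abs(ratio - round(ratio)) < 1e-6
        print(f"{display_hz} Hz / {video_fps} fps = {ratio:.3f} -> "
              + ("even" if even else "judder"))
```

This shows why 119.88 Hz is arguably the better default: 120/23.976 is 5.005, slightly off, while 119.88/23.976 is exactly 5.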


> 120 Hz vs 60 Hz? Night and day.

It's night and day when you're going back and forth between looking at them and wiggling your mouse around in circles. But after a few seconds of being focused on your work, you're not thinking about it anymore.

Being able to watch 24fps video without non-integer frame weirdness is the only real advantage outside of twitch-reaction gaming.


I disagree. 120hz makes typing, mousing, etc. noticeably more responsive. I never stop noticing it. I never liked having to use 60hz all the time once LCDs were replacing CRTs. The original iMac didn't even let you choose 60hz to run the desktop at -- it only offered higher refresh rates in the menus. (Games could set the display to 60hz if they really wanted to.)

i'm currently on a 60hz laptop screen and keep finding desktop switching and scrolling jarringly choppy, it's also harder to read while scrolling or panning around a map/pdf

everything feels much more responsive on 120hz+

especially noticeable with typing. and scrolling.


I don't notice it at all, on my laptop or phone. Even when having one monitor at 60 and one at 120 next to it.

Only when looking at demo pages to show off high refresh rates can I tell.

Though what I do notice is raising the mouse's polling rate from 125Hz to 250Hz.


Yes. Even 90 Hz is a noticeable improvement over 60 Hz. I wouldn’t pick it over high-DPI, though.

Yes, absolutely

100% yes

Very obvious when scrolling text and moving windows around, for example.

any animation work

> So the $1600 Studio Display does not have 120hz.

Usually these exist only to bump the price of the pro model.


I was hoping for OLED or dual-OLED based monitors, especially for this price point but I’d want this slightly lower than the XDR price. Sequoia+Tahoe seems like they’ve been laying the groundwork for OLED macs — removing the menu bar background and making text dynamically change colour, moving/cycling backgrounds, liquid glass reducing the effect of static UI elements, etc.

I personally wouldn’t buy a new LCD-based display anymore at this price. There are flaws inherent to the technology that affect all of my recent Apple displays (Studio Display, M1 iPad Pro, M1 Pro MBP, M4 Pro MBP). After using OLED TVs and OLED iPhones for years, it’s very difficult to look past LCD’s issues (edge yellowing+dimming specifically affects all my Apple screens more than I am happy with).

There are no reviews/studies on long-term aging of Apple’s LCD displays, so all of this should be taken with a grain of salt, maybe my devices are just unlucky.

I don’t know if the Pro XDR line is better or how that would carry over to the Studio XDR. I haven’t seen many complaints about the Pro XDR, but the Studio Display form factor has a different cooling design which would affect longevity.

I will say I can never go back from retina resolution text, and that alone has made the experience of Studio Display good. If we could get OLED it would be perfection. I think I would have to see the XDR in practice to be convinced, but 120hz requiring a whole new computer does make it a non-starter for me.


Along similar lines, there's no way I would buy an OLED at this price point. If I'm dropping $3k on a monitor, it needs to be a technology that lasts, not a technology that wears out over time.

Current gen OLEDs almost don't wear out (saying this as an OLED owner). To see the wear you need to have a completely black room and the wear is unnoticeable unless you're specifically looking for it. You don't need to spend 3k, 1k is enough.

Ah, you should update wikipedia then: https://en.wikipedia.org/wiki/OLED#Lifespan

> In 2016, LG Electronics reported an expected lifetime of 100,000 hours

23 years for an older generation OLED seems fine to me, I don't understand the problem here?


The US Department of Energy report from the same year reports far lower numbers, which I'd be more inclined to trust since they are impartial / not trying to market a product.

True, but those numbers are from 2016, 10 years ago. For a more apples to apples comparison see [1] [2] [3].

[1] https://youtu.be/H43wnV-v7V0

[2] https://youtu.be/RbEgQrigiLc

[3] https://youtu.be/AZfwHcMLorY

In my case, 3205 hours of use:

- 428 pixel cleans

- 1/3 brightness (my room is pretty dim and I often code during the night)

- static control on

- pixel shift on

- apl low

- sub-logo dim on

- corner dim on.

During the day I am not able to see any burn in. During the night it's unnoticeable unless you're looking for it. And it's only visible on gray backgrounds, unnoticeable during normal use. My phone (Nothing Phone 2) fails to capture it no matter how hard I try (even during the night).

The only issue I had was at 2417 hours and it was vertical white stripes like this: [4] but they were completely gone after a manual pixel clean. No issues since. I am never going back, worth every penny I spent.

[4] https://www.reddit.com/r/gigabyte/comments/1gyv1db/fo32u2_ve...


That doesn’t sound very reassuring. 3205 hours, or a little over a year at 8 hours a day. Be generous and call it two years of use. You’re babying it with low brightness, dynamic dimming, etc. etc. and the fact that there’s anything, even if you have to “look for it”, is not a good sign.

I've had it for 1y 4mo.

> You’re babying it with low brightness

That's the same brightness I was using on my IPS. And if you watched the videos then you'd know that those people use OLEDs at "almost max brightness" and see no burn in.

> dynamic dimming

Such features are unnoticeable during normal use and most of them are defaults.

> the fact that there’s anything, even if you have to “look for it”

Again, this is only noticeable if your room is completely black and you're staring at gray content.

To counter your argument: IPS has much worse backlight bleeding, which is very, very visible during normal use. To quote you: "the fact that there's anything, is not a good sign".

It's weird how you call OLEDs bad but completely forget about IPS downsides, and I'm not gonna even start on VA.

At the current state, OLED wins.


I _have_ watched those videos and they show burn-in after an alarmingly short amount of use. My current IPS monitor has been going strong for the past decade. I expect monitors to last at least that long. Get back to us about your burn-in after 8.6 more years.

Adding one more reference, here is a recent post to /r/monitors showing burn-in after 2 years of constant use: https://www.reddit.com/r/Monitors/comments/1pf0tmi/here_is_m...

And as a personal anecdote, I've experienced burn-in on my pixel 3a after 2 years. When switching to a full-screen solid grey, you could clearly see the bottom button bar with the home/back buttons.


I bought an LG 32" 4k OLED for $999 and it's hands down the best display I've ever used. No burn in even with lots of static browser/terminal windows for days and days. The fact that it's $3k and _not_ OLED is insulting.

I believe these monitors are meant for professionals, which means it is going to be used in bright office buildings. That means running the display at high brightness which is the worst case for OLED since they degrade faster at higher brightness. Quoting wikipedia:

> A US Department of Energy paper shows that the expected lifespans of OLED lighting products goes down with increasing brightness, with an expected lifespan of 40,000 hours at 25% brightness, or 10,000 hours at 100% brightness
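Translating those lifespan figures into years of service makes the concern concrete. A quick sketch, assuming a hypothetical office duty cycle of 8 hours/day, 250 days/year (the duty cycle is my assumption, not from the DoE report):

```python
# Years of service implied by the quoted DoE lifespan figures,
# assuming 8 hours/day * 250 days/year = 2000 hours/year of use.
hours_per_year = 8 * 250
for brightness, lifespan_h in [("25%", 40_000), ("100%", 10_000)]:
    years = lifespan_h / hours_per_year
    print(f"{brightness} brightness: {years:.0f} years")  # 20 and 5 years
```

So at office-level brightness you'd be looking at roughly 5 years, versus ~20 years if you keep it dim.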


> If I'm dropping $3k on a monitor, it needs to be a technology that lasts, not a technology that wears out over time.

I bought my OLED TV when the fearmongering was at its peak, and it still works perfectly with zero burn-in. So it is definitely possible. I bought the TV 8 years ago.


Yeah my LG C9 looks great, minor dimming where the captions are, but that’s it.

In the 7 years since they’ve gotten better, with micro lens arrays and stuff to improve brightness without heat causing faster decay.

RTINGs has some great content on TV longevity, but I haven’t seen anything for monitor workloads.


It's mind-boggling that Apple is calling the base 27-inch Studio Display, with the same 4-year-old panel but some new accessories slapped on, an "upgrade".

The base 27" wasn't even a new display 4 years ago, it's the same thing they were shipping in iMacs before that. It dates back to like 2017?

The 5k iMac was introduced in 2014. There was one change in 2015 that added P3 color gamut, so it appears to have been the exact same LG-manufactured panel for at least 11 years.

Oh, and if you want to utilize 120Hz on the XDR display, you're going to have to replace your perfectly functioning Mac.

> Mac models with M1, M1 Pro, M1 Max, M1 Ultra, M2, and M3 support Studio Display XDR at up to 60Hz. All other Studio Display XDR features are supported.


Almost certainly due to bandwidth limitations on older versions of Thunderbolt. Full bit depth HDR 5k @ 120hz requires some absurd data throughput.
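A back-of-envelope sketch supports this (active pixels only, ignoring blanking and protocol overhead; the link payload figures are approximate):

```python
# Uncompressed data rate for 5K (5120x2880) at 120 Hz with 10 bits per
# channel RGB (30 bits/pixel). Active pixels only; real display timings
# add a few percent of blanking overhead on top.
raw_gbps = 5120 * 2880 * 120 * 30 / 1e9
print(f"{raw_gbps:.1f} Gbit/s")  # 53.1 Gbit/s

# Approximate usable payload of the DisplayPort link tunneled by each
# Thunderbolt generation (HBR3 uses 8b/10b, UHBR20 uses 128b/132b coding).
for link, cap_gbps in [("TB3/TB4 (DP 1.4, HBR3)", 25.92),
                       ("TB5 (DP 2.1, UHBR20)", 77.37)]:
    print(link, "-> fits uncompressed" if raw_gbps <= cap_gbps
               else "-> needs DSC")
```

So over the older 40 Gbit/s Thunderbolt generations, 5K HDR at 120 Hz only works with DSC compression; TB5 can carry it uncompressed.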

I don’t think so. My M3 Pro is on the list as supporting 120 hz but it only has Thunderbolt 4.

Also the base M4 doesn’t have Thunderbolt 5 and it supports 120 hz.


> My M3 Pro is on the list as supporting 120 hz

Can you point me to said list? All I could find was:

> Mac models with M1, M1 Pro, M1 Max, M1 Ultra, M2, and M3 support Studio Display XDR at up to 60Hz. All other Studio Display XDR features are supported.

And The Verge reports:

> There’s also support for adaptive sync that can adjust between 47Hz and 120Hz (if it’s connected to an M4 Mac or later, or the M5 iPad Pro)

I got an M3 Max and was strongly considering upgrading my old monitor, but if I can't do 120hz, I'll just wait until I upgrade my laptop as well.


I’ll give you an anecdote: my work laptop is an M3 Pro MBP, and my Dell U4025QW works just fine with it over Thunderbolt at 120Hz VRR

That monitor has a noticeably lower pixel count.

Dell U4025QW: 5120 x 2160 = 11,059,200 vs Apple Studio Display XDR: 5120 x 2880 = 14,745,600

So your display has 25% fewer pixels.


It’s quite possible this is running with a reduced color space (chroma subsampling). Degradation happens automatically based on available throughput and most people don’t notice.

For desktop use? Chroma subsampling is obvious. DSC compression, on the other hand, is not. DisplayPort and HDMI support both.

It’s obvious if you use a test pattern and/or know what to look for: https://testufo.com/chroma

I had no idea what it was for the longest time. As it turns out, macOS frequently enables it even when it’s unnecessary, and without any way to override.


> Can you point me to said list?

There’s no list per se. The MacBook Pro (2021 and later) is listed as supported. The M3 Pro and M3 Max are not listed as only supporting 60Hz, while the M3 and M1 Pro are.


They did say M3, not M3 Pro. You're probably okay.

(Notice how they listed the M1 chips individually.)


I don't really see your point. The chips mentioned do not have enough bandwidth on display outputs to support the monitor at 6K@120Hz. If anything, I find it surprising that Apple supports running the display in 60Hz mode instead of telling people to go pound sand and buy new Macs.

I got the Kuycon G32P and it’s an incredible alternative. 32in + 6K for less than 2k$

Also works great with other sources like an Xbox

I used a Pro Display XDR as my daily driver at work and the difference is minimal


I'm really after higher refresh rate than 60, but it seems it would cost me an arm, leg, both kidneys and my newborns to get it at 5k or more resolution.

Take a look at: "AOC AGP277KX" OR "LG 27GM950B", both can do 5k @ 165 Hz

I own this as well, and while I appreciate the lower cost, there is simply zero comparison to my gen 1 Studio Display. The gloss and sheen on the Kuycon means it only works in dimly lit rooms.

Nano texture in mixed lighting scenarios is worth every penny even on a lower resolution and lower refresh rate panel.


Do you own the matte display version or the default one?

The matte. It's offensive.

They sell a matte version, the G32X.

That's a hefty premium to pay while still missing high refresh rates and high nits, but the field of high-density options is so thin that there's not much else to go for if resolution density is the goal.

I can attest to the greatness of the Kuycon G32P; <1.5k€ in my case.

Hah, the absolute shamelessness of that design and the site is crazy!

only 1/4th the brightness

Pretty lame that the Studio Display with a height-adjustable stand is still 400 Euro more. My biggest regret is getting my Gen 1 Studio Display without.

Also the non-XDR is only a small upgrade otherwise, no 120Hz, no HDR, only Thunderbolt 5 and a new camera. Finally a downstream Thunderbolt port though.

This is all after 4 years?


VESA mounts are only a few bucks and give you even better height and tilt adjustment. You also get desk space back. I have a shorter desk (24" vs typical 30" depth) and I have two monitors and a laptop mounted on 3 VESAs and I can extend them so that the monitor edge is inline with the desk edge, giving me the same 24" that a 30" desk would have with a monitor stand.

Which mount do you have? I've got a 24" as well and I've never imagined I'd fit 2 monitors.

Herman Miller's Jarvis [1]. I'm probably paying up for the brand, but I got it installed a few years ago (with the nano-textured Studio display), and it works beautifully.

[1] https://store.hermanmiller.com/home-desk-accessories/jarvis-...


I just use some old textbooks to raise the height of the display:

- Design Patterns by the Gang of Four

- Modern C++ Design by Andrei Alexandrescu

- Code Complete from the Microsoft Press

That's enough old paper to raise the display height to a comfortable level.


I do the same, though ideally the height would be different between my desk's sitting and standing positions.

> Also the non-XDR is only a small upgrade otherwise, no 120Hz, no HDR, only Thunderbolt 5 and a new camera. Finally a downstream Thunderbolt port though.

The camera is still 12MP but offers Desk View. Maybe this is a feature unlocked by the improved onboard A-series chip (A19?).

I wouldn't sniff too hard at Thunderbolt 5. It doubles throughput from 40 Gbps to 80.

Would have loved refresh above 60Hz but then who's gonna get the XDR?


Yeah, if they put everything on the lower-end device then nobody would buy the higher-end device.

> Pretty lame that the Studio Display with a height-adjustable stand is still 400 Euro more.

just buy a nice one on amazon for $100, it's still VESA mounts


Insanity that a monitor that expensive is stuck at 60Hz

Super disappointed that the base model doesn't get 120hz. I own the old model and it's great, but I will have to look for an alternative 5k display with 120hz refresh rate. There are a few on the market now, and I won't pay 3.5k for 120hz.

So it seems the new Studio Display XDR is the only display on the market that offers:

- 5k resolution at HIDPI (27inch)

- 120hz refresh rate

- TB5 and single cable connectivity.

There are a couple of other HIDPI displays at 5k with 120hz refresh rate but they don't do TB5.


I was hoping for a 6k 32inch model.

But even so, these 2 new monitors still don’t support multiple inputs.


I'm also a little bummed that they seem to have dropped the Pro Display XDR. I wanted a 32" display as the main display, and then use my existing two Studio Display vertically as secondary on each side.

I guess we're going to see how the support for DP Alt-Mode will be, as I'm not sure how much bandwidth that can provide, so 120Hz might be out of the question. But for now that has been a simple way to get around the lack of multiple display inputs, you just needed a separate KVM switch for it.


Current hardware and standards have them backed into a corner.

No Mac today supports 6k 10-bit @ 120Hz because the DisplayPort 2.1 standard can't handle it uncompressed and that's the best Macs offer. HDMI 2.2 just came out last year and would likely be able to handle it over a TB5 cable, but again, no hardware support.
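A quick sanity check of that bandwidth claim (active pixels only; the UHBR20 payload figure is approximate, and the Pro Display XDR's 6016x3384 resolution is assumed for "6K"):

```python
# Uncompressed 6K (6016x3384) at 120 Hz, 10-bit RGB (30 bits/pixel),
# counting active pixels only.
raw_gbps = 6016 * 3384 * 120 * 30 / 1e9
print(f"{raw_gbps:.1f} Gbit/s")  # 73.3 Gbit/s raw

# DP 2.1 UHBR20 carries roughly 77.4 Gbit/s of payload. The raw figure
# leaves only ~4 Gbit/s of headroom, and real timings with blanking
# intervals eat through that, hence the need for DSC compression.
print("headroom before blanking overhead:",
      round(77.37 - raw_gbps, 1), "Gbit/s")
```

Even the raw active-pixel rate sits within a few percent of the link's payload capacity, so an actual uncompressed signal with blanking overhead doesn't fit.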

So say that Apple did update the Pro Display XDR, what would it have exactly? More dimming zones for sure, the new Studio XDR has 4x the dimming zones. But they are clearly not confident in OLED tech for standalone monitors yet, so no OLED.

Anyway, their updated XDR would be shipping with the same ol' 60Hz. Reviewers and social media and tech nerds would rip them to shreds, it'd be a PR clownshow. I can already see the "Apple really expects us to pay $7k for a 60Hz monitor in 2026" viral posts.

And Apple being Apple would never explain why a monitor is lacking a feature like 120Hz, because it would mean acknowledging people had higher expectations. So we get an expensive 5k 120Hz monitor instead.


I just want to natively hook up a PS5 without capture card latency... I would've bought a Studio Display years ago but can't bring myself to purchase a $2000 device-locked monitor.

> Still no support for multiple inputs

It looks like a nice display, but that’s a deal killer for me.


I've been pretty happy with my ASUS ProArt PA32QCV (32", 6k, but only 60Hz). Kinda infuriating that Apple doesn't let you adjust third-party monitor brightness though (and my work disallows apps like BetterDisplay).

Thank you, that’s exactly the one I’m going to get now, I was just waiting for these from Apple to be announced to make the decision.

I’ve owned my nano-textured XDR since launch (with the stand), and I love it.

As the years have gone by, the only upgrade I've wished for was a 120Hz refresh rate for some very limited design work - but 120 is really still not widely adopted, so it’s really a non-issue for me.

The new XDR is smaller, has a less ergo stand, and also loses the beautiful lattice etchings on the rear which I often admire.

The XDR was overdue for a refresh, it’s nice the price dropped some, but I won’t be upgrading for now.


Too small… I got used to my 4K Philips OLED 42" that I hung directly on the wall in front of my desk (no stand at all)… USB-C cable also charges the MacBook. This size is so good to work with; so much screen estate.

I agree, and use a 55" LG OLED TV similarly. Got it on sale for $1,300.

Especially nice in a small apartment where I use the same display for video, gaming, and desktop.

No USB-C, but HDMI works better for long cable runs anyway, so I can keep my (non-laptop) computers in the other room and just "dock" my wireless input devices to a USB-C charger as needed.

Thunderbolt would be even worse, as even if I could somehow get Thunderbolt out of an Nvidia GPU, I'm not aware of any devices that would allow switching between multiple Thunderbolt inputs, and 4 sufficiently long optical Thunderbolt cables would probably cost more than the display itself.

As for crisp text, I'll replace it with a 120 Hz 8K display in a few years if the price is right. In the mean time, I value screen real estate far higher (and dislike multi-monitor setups).


You're using the pixels for something different than the target audience.

People who want a Studio Display want retina crispness. If you enjoy a 42" 4k, you're more concerned with real estate than image fidelity.

I'm happy with a 65" 4K TV in my living room, but a 4K 27" monitor is borderline too low-res for computer work. Same pixel count, but different use cases.


I think I’m absolutely the target audience: I’m a designer, programmer, animator. Crispness at 4k is still quite good at 1m distance from my face. I’d buy it without hesitation if it came much, much larger.

42 inches! That's a lot of viewing area.

Indeed! The big monitor is about 1m from me, the median a bit below my eyes. The laptop on which I type on sits in-between and the two screens align almost perfectly (optically). This setup works well for me and I feel it’s very ergonomic. That's why I can't go back to tiny (<32") screens anymore.

You could get something smaller but have it closer to your face than 1m?

The sort of “visual impact” a screen can have is mostly a function of what percentage of your FOV it consumes.

People think they’ve got a bunch of screen real estate when they buy a big TV to use as a monitor… and then they use it at twice or more the distance of a regular monitor.


So Apple essentially introduce a new (middle) price point in their displays:

  $1,500  Studio Display
  $3,300  Studio Display XDR  <-- NEW
  $6,000  Pro Display         <-- DISCONTINUED ???
Apple is amazing at "laddering" people up to the next higher tier.

EDIT: It appears the Pro Display has been discontinued.


Do they still sell the Pro Display? https://www.apple.com/pro-display-xdr/ redirects to the Studio Display XDR now.

it seems like the Pro Display XDR is discontinued. The webpage for that now redirects to the Studio Display XDR

There is a note at the end of the linked announcement:

”Studio Display XDR replaces Pro Display XDR and starts at $3,299 (U.S.) and $3,199 (U.S.) for education.”


I can't find it either.

Which means they don't have a 32" display option if true.

Maybe it will also be updated, but on a different day this week?


On the announcement page, they say "Studio Display XDR replaces Pro Display XDR" in the footnotes, so doubtful.

They could be considering the new high end display a different product rather than a refresh (for marketing purposes at least).

I recall the XDR being announced alongside the last Mac Pro redesign. No new Mac Pro yet, so maybe they’ll announce the new large display whenever that is announced?


> Studio Display XDR replaces Pro Display XDR

How does a 5k display replace a 6k display? Are they giving up on 6k? Disappointing.


It's impossible for them to support a 10-bit 6k@120Hz monitor with current hardware and keeping the old one around would be embarrassing. The Pro 5k will probably sell better/be more profitable anyway.

It's a smaller display

I was curious to see the "Innovative DICOM Medical Imaging" section. I wouldn't have thought that Apple would be interested in niche applications like viewing radiology imaging, but I guess they're probably interested in any cost-insensitive market for these since they're so expensive.

At a local hospital the radiologists have been all Mac for a long long time. They refused to give it up and resisted all attempts to get them to switch. So it doesn’t surprise me at all.

Yeah, in my first job I was an Apple technician for a company that supplied DICOM solutions to radiologists, both in hospitals and standalone.

I thought it was weird they spent so much money on Apple hardware when most of what we sold was servers that would be hidden anyway. But they do like OsiriX; once a solution is established in those fields, they stick with it, very conservative professions obviously...


Interesting, I would've guessed that they would have been forced onto Windows since time immemorial.

Entirely unsurprised that someone would refuse to give up their workflow, though! I've rarely found a user with specific needs who wants to change literally anything else about their system, since what they have works for them.


It's probably an easy win for them. It also might have been a good target when they were ideating on specs. Having these pro certifications gives the devices a halo of premium quality.

Regular consumers probably don't buy these displays in bulk, when you can get very nice displays for less than half the price that are 98% the same on specs.

So targeting checkbox-compliance for places like hospital systems is probably an easy win for generating and keeping some long-term contracts.


> you can get very nice displays for less than half the price that are 98% the same on specs.

Can you recommend any displays with PPI and brightness equivalent to the studio display, with 120Hz+ refresh rates? I was waiting for this announcement to buy a studio display because I thought they might bring 120Hz to the base model, but $3300 is a lot to spend on a single display. I have an original studio display and a high refresh rate 4K OLED monitor, and they are both compromises unfortunately.


https://www.bestbuy.com/product/asus-rog-strix-27-dual-mode-...

I haven't found a glossy competitor, or even one with the same HDR spec, but this is the closest I could find so far.


The price point is super painful. 2k would have been bad enough but I would have considered it. It’s a no go at $3,300 for me.

I don't think you can get a DICOM-certified display at 5K and 27" for half the price. Probably like $1k less but that's it - and if you're a radiologist making $300k+ you're not going to want to cheap out on a display.

If you're a radiologist making $300k+ you're going to want to use certified displays so that you don't get sued for using non-approved devices for diagnostic use, and that's going to cost you maybe $6k for a 21" monitor.

https://www.monitors.com/products/jvc-cl-s500-rn?variant=427...

$3300 for a 27" display is ridiculous in comparison.

(Acknowledging that the link I provided is for a pair of monitors, but also those monitors are half price because they're refurbished)


No I'm saying regular consumers don't care about DICOM certification. They care about the other 98% of the specs, and can find a suitable alternative.

This also keeps their development targets at the state of the art.

Sad, but not surprising to see Apple discontinue the Pro Display XDR. Hard to go back to 5K once you’ve used 6K.

I vaguely recall an Apple rumor from the last few months about 3 new display model numbers, 2 of them being 27" and one of them being 32"... so still possible a Pro Display XDR refresh is on the horizon.

"Studio Display XDR replaces Pro Display XDR and starts at $3,299 (U.S.) and $3,199 (U.S.) for education."

They'd just call it something else or simply add a new size option for Studio Display XDR like they do Macbooks.

Studio Display XDR Pro

Available July 2026 - calling it now.


The pixel density is the same I believe - I guess their theory is that 5K is more fungible than a large 6K display since people looking for more real estate can daisy chain the 5K displays.

I have two 27" 5k displays (both more than 5 years old, so they're not HDR or 120Hz).

I know I'm privileged but my biggest issue with them isn't the HDR or 120Hz. It's that the seam between them causes me to not be able to use that "middle" real estate.

So I was side-eyeing a 6k display cuz it would have most of the benefits of a dual 4k but more real estate and more flexibility in windowing.

The curved displays also look quite promising (like the Neo G2), but not feeling like spending money when I have two perfectly good monitors that already work.


Curved displays are polarizing - I definitely am not a fan although some people I know love them. In the olden days of dual monitors when the aspect ratios were more square it made a lot more sense to have them side by side. Now with the aspect ratio so wide it's weird having dual displays side by side. I'm actually tempted to move back to a single 27" display (currently have dual 24" 4k with one oriented vertical and one horizontal).

Same pixel density, but smaller monitor though.

The only monitor on the market of this size and resolution that I am aware of that has really high brightness and works well when I work outside on the terrace.

Really glad Apple is building it.


Are you being cheeky or do you really drag a monitor outside?

> Featuring extensive connectivity to support a variety of workflows, Studio Display XDR includes two Thunderbolt 5 ports and two USB-C ports.

That is not extensive connectivity. That’s the bare minimum one might credibly expect.

If I were to consider buying a display like this, I would want at least two and preferably more inputs and at least a DisplayPort input. Not everything in the world is USB-C, especially when discrete GPUs are involved.


If I had to guess: with so many built-in devices (speakers, microphone, webcam) on top of any external ones you connect, having multiple inputs, especially an input that can't possibly connect your computer to those devices, virtually guarantees that some users will complain it doesn't work. I believe there is a similar reason why USB-C hubs rarely have downstream USB-C ports. When you do find one, they always have several reviews complaining that it doesn't work with 3 hard drives and 2 monitors plugged in.

As long as we're here:

What are people's current favorites for a 5K 27" screen that doesn't cost as much as a whole damned computer?


I got this https://www.samsung.com/uk/monitors/high-resolution/viewfini... and am pretty happy with it. I got it fairly cheap with a student voucher (I think ~650 GBP).

It had some coil whine initially but that has gone. There's a load of nonsense software in it, but I just have it disconnected from the internet and only use it as a monitor. The webcam is not useful but I don't use that either.

This was a couple of years ago - I think that there are a lot more options available now?


I have been using three VP2788-5K monitors for 6 months. Much better than 1440p or 4K monitors IMO. I spend most of the day in Teams meetings and looking at code.

Text is very crisp at this DPI. The built in thunderbolt dock works reliably.

It is annoying how the cables stick down at the bottom of the monitors. A few right-angle adapters help with that.


I've tried the LG UltraFine and LG UltraGear (w/144hz) ... still went with the Studio Display. Expensive, but my previous Thunderbolt display gave me 12 solid years – hoping the SD does the same.

Edit: Also consider the price of speakers, camera, hub, power, and the "it just works" factor.


> power

Most modern USB-C / TB displays seem to come with an integrated "dock" which provides power. But many provide a laughably low wattage.

Also, many come with external power bricks for some reason, which are a special kind of PITA with their short, permanently attached cords.


Ultimately, I'm beholden to whatever I can get work to pay for, unless I want to pay out of pocket to subsidize the capex of a multi-trillion dollar conglomerate.

A lot of people are saying nice things about Kuycon displays, but no personal experience with those. Otherwise I think Asus has good offerings, cannot remember the name though

Also JapanNext offers a 6k screen for <$1000, but it hasn't started shipping yet to my knowledge


Get a broken 27" iMac, rip out the guts, and slap in a converter board that adds a bunch of inputs. It's not nearly as difficult to build as most of the blogs make it out to be.

Mine's not even broken, but it is corp, so sticking an Aliexpress board in there is ill-advised. Hence, in the market for a 5k when I've had a perfectly good one on my desk since 2017.

BenQ PD2730S.

Asus ProArt

I can second this. I bought two of the 5K 27" ProArt monitors plus a Thunderbolt hub to be my home setup, all for less than the price of one Studio Display, and it has been working perfectly.

That was the one I'd heard of, but I was shocked to see how low the Amazon reviews are (3.8). A highly-rated review said it had horrible clarity, like a greasy phone screen.

You haven't had problems, and would buy it again?


As someone who is _very_ sensitive to screen texture (I cannot tolerate grain; I own a glossy Dough Spectrum 27" 4K for that reason): from what I gathered, yes, the newest 5K and 6K panels use an unusually harsh coating, which I am certain I would not be able to use whatsoever.

The clarity seems fine to me, and I bought it specifically because I want crisp text. Maybe those people had defective units, I'm not sure.

I've been using an LG Ultrafine 27MD5KL-B for years, and it works pretty flawlessly once I set up BetterDisplay with it. This is my primary work setup, and I think I paid around a grand for at MicroCenter some time back. It has worked great.

I keep hoping someone will release a nice monitor that’s monitor shaped (16:10) instead of TV shaped (16:9). That’s part of why early 2000s Cinema Displays are so great. Not to mention the last great Mac laptop before it all went south — the 2015 MBP

Is buying a used 32" XDR worth it if we want a 32" apple display? or is the tech not as good now?

- at worst you're getting a 7 year old monitor

- the new 5k XDR has 4x as many dimming zones so it would objectively look better than the old XDR

- not sure what the market for used 6k XDRs is like, but there's a good chance you'd be paying new 5k XDR monitor price for an old 6k XDR used monitor


The smaller XDR has better brightness and Thunderbolt 5, so it depends on what you are looking for.

Really, a $3300 Mini-LED display in 2026?

Show me a calibrated 27" HDR display with 2000 nits peak for under $3,300. Not a gaming monitor. The closest you can find is a Lilliput UQ31, which has half the nits.

And which supports DICOM calibration, which normally costs you >$5k for a smaller (e.g. 21") display.

It's now vastly cheaper to buy a Mac and a 27" Studio Display XDR than it is to buy a single 21" DICOM display for your clinic. Heck, it's not much more expensive to buy two SD XDRs than to buy one standard DICOM display.


I feel like if they can profitably sell a Mini-LED in a $1400 14" Macbook Pro, they can find a way to sell a larger one in a 27" display for under $3300…

I have the last-gen Studio Display, pretty great during the day (the nano-texture is astonishing), but just looks like trash at night when the backlight overwhelms the blacks.

My guess was that the “Studio Display 2” would introduce Mini-LED, and then a “Pro Display 2” would have the high-refresh and maybe 32". Wake me up in five years, I guess.


This looks like a new iMac Pro minus the computer. It's a shame they don't have anything where you can just dock your iPhone Pro to one of these to run macOS.

Or at the very least pair a Bluetooth mouse or trackpad to an iPhone for remote desktop use.

Was hoping to upgrade from my 2017 iMac, and this new Studio Display is quite a bummer. 9 years later and it's basically the same display spec (a bump from 500 to 600 nits).

I may (think that I) have a 29 year old mind, but my eyes are at least their true 49 years of age, so I don't feel like I could do anything less than a 32 inch monitor, especially if I'm paying a premium.

I really can't believe they discontinued the Pro Display XDR.. what is wrong with them? A company the size of Apple, surely must have the resources to update it every couple of years.

DisplayPort 2.1 cannot reliably drive 6K at 120Hz

With DSC it can drive even 8K@120Hz.

Yes, but can you call a monitor that relies on lossy compression to display in its native resolution professional?
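For rough context, here's my own back-of-envelope link-budget math (assumed panel timings and a typical ~3:1 DSC ratio, not official spec-table numbers), which is consistent with both claims above:

```python
def gbit_per_s(w_px, h_px, hz, bits_per_channel):
    """Uncompressed RGB video payload in Gbit/s (active pixels only)."""
    return w_px * h_px * hz * bits_per_channel * 3 / 1e9

# DP 2.1 UHBR20: 80 Gbit/s raw, ~77.37 Gbit/s effective after 128b/132b coding
link_budget = 77.37

six_k_120 = gbit_per_s(6016, 3384, 120, 10)    # ~73 Gbit/s before blanking overhead
eight_k_120 = gbit_per_s(7680, 4320, 120, 10)  # ~119 Gbit/s uncompressed

# 6K@120 10-bit nearly fills the link even before blanking intervals are added,
# which is why uncompressed 6K120 is marginal; with ~3:1 DSC even 8K120 fits.
print(f"6K@120 10-bit uncompressed: {six_k_120:.1f} Gbit/s vs {link_budget} budget")
print(f"8K@120 10-bit with ~3:1 DSC: {eight_k_120 / 3:.1f} Gbit/s")
```

So the uncompressed 6K stream leaves almost no headroom for blanking, while DSC brings even 8K comfortably under the UHBR20 budget.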

As sort of a tangent, am I the only one who has had bad experiences doing what the woman in the press release is doing? Ya know, touching the laptop while it's connected to external devices via Thunderbolt and/or USB-C.

Sure, most of the time the cable seems secure enough to maintain connection when I accidentally nudge the laptop. But every once in a while, when I slightly shift the laptop here or there, flicker and everything goes batshit. The monitor loses connection, so maybe (depending on config) the laptop screen changes resolution and then eventually reconnects and flickers and changes back. Or the network drops out (if I'm connected to Ethernet over Thunderbolt). Or a program freaks out because the drive it was using disappeared. Or the laptop really freaks out and kernel panics.

Like I said, it doesn't happen a ton, but it's happened a handful of times over the years, just enough that now I always use an external mouse and keyboard with a docked laptop to avoid such nonsense.


Happens to me with HDMI, not USB-C

A $1600 60hz display in 2026 just feels extortionate.

The Studio Display XDR seems nice, but I wish they would have kept a 32" option.


Especially since a very similar, if not exactly the same, panel from the XDR will be in monitors from other brands for a fraction of the price (like the LG 27GM950B).

I just tried to look up the power usage for the XDR and they only list voltage, no amps or watts.

Did I miss something?


Oddly thankful my current Studio Display doesn’t feel left behind by this upgrade and thus I feel no need to upgrade

Since the base model is still 60Hz, I'm struggling to pick between the base model or a Kuycon G32P. Can anyone on here help?

It shouldn't be a struggle. If you need colour quality (e.g. content creation/consumption) get the Studio Display. If you need real estate (e.g. technical work or programming) get the Kuycon.

As someone who likes bright monitors, I'm excited to try the 2000 nit peak brightness! Are there any comparable monitors to the XDR brightness wise?

If you can put up with wide curved panels the 49" variant of the Samsung Odyssey Neo G9 from 2021 offered HDR2000 as a 5120x1440 display (basically 2x27" 2560x1440).

I was heartbroken that all of the flat-panel, normal-aspect monitors in that family since have had other severe tradeoffs, and it's only the curved ultrawides that were given the better specs.


Maybe overkill, but is anyone using it for coding? How is the experience so far?

wow, the prices have come down. I inherited the old Pro XDR display when my father passed away a couple of years ago: I think he paid $6K for the display and another $1K for the stand.

Off topic, but Apple seems to be dropping hardware costs and capability, relying more on subscriptions, the App Store, and cloud now? On an impulse buy, I bought the entry-level MacBook Air at Best Buy about two months ago because it was $200 off list price. Amazingly capable laptop for $800.


It's cheaper but also 27" 5K instead of 32" 6K.

I think it's kind of weird that they didn't just do two size options with similar specs.


For that base display, it is essentially the same as the previous monitor with the addition of Thunderbolt 5.

I wish it came in an ultrawide format.

Daisy chaining finally supported.

> Studio Display XDR replaces Pro Display XDR and starts at $3,299 (U.S.) and $3,199 (U.S.) for education.

My father-in-law is a monitor engineer. He is insanely gifted. We were in a Taiwanese factory together years ago and I asked him what it would cost to build the Pro Display XDR today. I will never forget his answer…

“We can’t, we don’t know how to do it.”


My father-in-law is a monitor engineer. He is insanely gifted. We were in a Korean factory together years ago and I asked him what it would cost to build the Pro Display XDR today. I will never forget his answer…

“A lot less than you paid for it.”


Bit disappointed with this release.

I have an iMac 5K that I want to replace with a Mac mini or Studio, but monitors for Macs are difficult, since you need 5K at 27" or even 6K at 32" to keep the pixel density, and then there aren't many options.

It's getting better, but prices are very high, and a big issue with the 5K 27" monitors is that they are stuck at 60Hz. Apple announced an approved offering here with the XDR, but it's very expensive, and ideally I'd like to upgrade to 6K at 32" with 120Hz, but that just seems unobtainable at the moment.


anyone else still using their 30" cinema display from 2003?

I have a 2000 22" Cinema Display. The software brightness control even works. https://kalleboo.com/linked/cinema-display.jpg

I was until quite recently. Bought a cheap 4k panel to replace it. I was really sick of the number of adapters to keep it going, plus it was never particularly bright and had low contrast.

I was keeping mine alive on life support until about two years ago when I updated to the Samsung 5K display.

Loved the extra screen real estate of the 30" ACD and it's a beautifully designed product that I enjoyed having on my desk.

In its last year or two, backlight wear started to make the colors uneven. Blues were less vibrant and reds had tint issues.

It was also difficult to justify the power draw; it had a 150W power supply.


Not using, but I still have it. I get it running every couple of years and it's striking how dim it is compared to modern monitors. Yet I just can't bring myself to dispose of it or the 2007 Mac Pro it's attached to, despite them having absolutely zero utility.

Does this still not support multiple inputs / devices?

Nope. Still does not. I have 2 macs on my desk and no simple way to connect them to a single Apple display! It's a glaring hole that to me suggests they have no idea who their market is for these.

I might be missing how this differs from the previous model.

https://en.wikipedia.org/wiki/Apple_Studio_Display#Technical... has a good table. The short story is the 2nd-generation Studio Display has some minor noted user-facing changes but isn't that big of a difference. The Studio Display XDR is a bit of a merge of that and the old Pro-level feature set.

Few enough differences so that if I could get an old Studio Display at a discount, I would. But right now it seems the old one is still full price where it's available.

32" when?

This is awesome! $3299 is a great price drop. I’m moving countries soon and wasn’t going to bring my old monitor, so this is perfect timing.

It’s a smaller monitor though, the discontinued one was 32 inch 6K resolution, this one is 27 inch 5K resolution.

But it’s the same pixel density.
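For what it's worth, the density claim checks out, assuming the usual panel resolutions (5120x2880 for the 27" and 6016x3384 for the discontinued 32"):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch from the panel's pixel counts and diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

studio_xdr = ppi(5120, 2880, 27)  # 27" Studio Display XDR
pro_xdr = ppi(6016, 3384, 32)     # discontinued 32" Pro Display XDR

print(round(studio_xdr), round(pro_xdr))  # prints "218 216"
```

So ~218 vs ~216 ppi: effectively the same density, just less total real estate.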


Are there any details on whom Apple is sourcing the panel from? LG and MSI have both shown off 5K monitors at 165Hz with 2304 dimming zones recently.

I said about two years back I’d wait to upgrade my 1080p monitor until Apple shipped a high refresh rate one. I knew the monkey’s paw would curl but at nearly $5000 CAD that’s a hard no.


Apple just doesn't seem to be firing on all cylinders anymore. The price for this thing is outrageous compared to the competition, and the competition isn't lagging very far behind. It's certainly pretty and I'm sure it's an incredible piece of tech, but $1,600 on the low end and $3,600 on the high end is just not going to sell in this environment. While the competition has always started with the minimum viable product at a low price and iterated on the product, Apple's approach has been the opposite: maximum possible product, then try to iterate the price point down. The problem is that the competition is now encroaching on their product-quality territory, and the offer doesn't seem as tempting. For example, see the ASUS ProArt, which has arguably better specs with the addition of HDR10 for $799. Or the BenQ MA270S, which you can buy two of for $1,800.

Curious to see if the XDR works at 120Hz on Windows, and if so, whether there's a KVM switch that would work with it.

Probably not worth the hassle, but I wish there was literally any other display manufacturer out there with premium build quality.


XDR = LCD

Of course, even 5 layer tandem OLED would struggle to hit specs like 1000 nits sustained full panel brightness.

So what? Any LCD also struggles with black levels. They have advantages and disadvantages. The point is more that Apple tries, like TV manufacturers, to hide the LCD designation by instead coming up with a creative but misleading acronym. "XDR" in this case. This never happens with OLED. Which shows that manufacturers believe that most people care more about black level contrast than maximum brightness.

So... it helps you understand that the main selling point of the monitor is its range of brightness? XDR stands for "Extreme Dynamic Range" - not sure how that's misleading or why sticking "LCD" into the title helps anyone figure out what the features of the panel are, as most LCDs have awful range.

Using OLED directly in names works because OLED panels inherently have common features like blacks, contrast, and lack of blooming over other panel types. They can have other aspects added in (e.g. tandem OLED or the like) but by saying it's OLED you include these base things without needing a unique term.

Saying other panels are LCD tells you next to nothing as there are so many types which can be paired in so many ways which can all have completely different characteristics. It's not a conspiracy to hide the truth that people must only want OLED displays, it's an attempt to say something rather than nothing in the name of the monitor so you know what it actually is beyond "not an OLED".

If it were about trying to hide something the subhead wouldn't say mini-LED.


> So... it helps you understand that's the main selling point of the monitors is the range of brightness?

Doesn't help much because LCDs getting significantly brighter than OLEDs is normal.

> XDR stands for "Extreme Dynamic Range" - not sure how that's misleading or why sticking "LCD" into the title helps anyone figure out what the features of the panel are as most LCDs have awful range

The dynamic range of this LCD is still not necessarily higher than of an OLED because dynamic range depends both on maximal and minimal brightness. So "extreme dynamic range" is highly uninformative, while "OLED" or "LCD with local dimming", or something like that, wouldn't be.

XDR is simply a marketing term. Apple is doing the same thing here as TV manufacturers who come up with ever new fantasy acronyms.



