Blurry rendering of games on Mac (colincornaby.me)
345 points by bangonkeyboard 14 hours ago | 228 comments
pezezin 12 hours ago [-]
Am I the only one who finds screens with rounded corners and notches really stupid? We had to struggle for decades with CRTs and their funky geometry, and when we finally get displays with perfect geometry, we botch them again to make them look... cooler?
crazygringo 12 hours ago [-]
Don't think of notches as something stupid that takes away from screen area.

Think of them as something that allows the overall screen area to increase, as bevels shrink.

And then when the corners of the screen are so close to the corner of the laptop, and the corner of the laptop is rounded, it looks weird if the corner of the screen isn't rounded. Like a square peg in a round hole.

Fortunately, it's all context-dependent. So when you watch a video that is inherently rectangular on a Mac, that takes precedence. It's only shown below the notch, and the rounded corners on the bottom disappear.

So it's kind of the best of all worlds. A bigger screen with round corners for actual work (the notch is not particularly objectionable in the menu bar), and a slightly smaller screen with rectangular corners for video (and games too, I assume?).

eviks 8 hours ago [-]
> then when the corners of the screen are so close to the corner of the laptop, and the corner of the laptop is rounded, it looks weird if the corner of the screen isn't rounded.

The rounded laptop corners are a similar design decision. And it doesn't look weird to me at all, unlike rounded corners on windows, especially when those introduce visible desktop garbage in the corner areas that the no-longer-perfectly-covering rectangles leave exposed.

Etheryte 2 hours ago [-]
Rounded corners are not simply an aesthetic design choice, they make the device more durable as there's less of a pinch point when you drop, snag or mishandle the device.
mcdeltat 10 hours ago [-]
Is bevel size really so important when it's already measured in mm? Personally I like a bit of bevel because I don't want the screen going to the edge of the device. The edge of the device is for holding, not interaction.
crazygringo 9 hours ago [-]
No, screen size is important. For the same size laptop, less bevel means more screen.

And I don't know about you, but I don't hold the screen half of my laptop. Not unless the laptop is closed. So I don't see how that's a concern.

mcdeltat 6 hours ago [-]
My point was that at some point maximising screen size becomes less important than other usability factors. For phones, I definitely do not want the screen near the edge, because fingers are there. For laptops, true, it's less of an issue, although while opening my MacBook my finger is about 1mm from touching the screen.

And the other comment about wasting screen space is funny. Yeah, we need 1mm of extra phone screen space when 60% of most webpages are covered with ads (separate problem, but still amusing in combination).

mikestew 7 hours ago [-]
For the love of $DEITY, it’s bezel! :-)

https://www.howtogeek.com/762611/what-is-a-bezel/

mcv 33 minutes ago [-]
I've often wondered about that, because I often see them used interchangeably (and not just here). But you're absolutely right. Here's a random site I found that explains it: https://www.difference.wiki/bevel-vs-bezel/

Is the similarity of the words the only reason they often get mixed up? I think another factor is that flat surfaces surrounded by an angled bevel are fairly common. For example, I just noticed that the bezel of one of my monitors is also beveled.

pezezin 8 hours ago [-]
For a cell phone, maybe. For a laptop, we are talking about a couple of millimetres on each side max; it is not a big deal.
userbinator 7 hours ago [-]
...and then the same "more space!!!111" people run apps which fill their screen with useless whitespace.
lightedman 32 minutes ago [-]
Hi, laptop repair tech here.

You want more bezel. Screens with less bezel have less overall edge protection and will break more easily due to opening or closing forces, or impacts.

Bezel-less screens are fragile pieces of junk.

incrudible 4 hours ago [-]
First of all, you cannot argue anyone out of thinking it looks stupid or ugly. That is a visceral subjective experience, and the extra space or whatever does nothing to make up for it.

Now for the bezels: mind that the radius of an equidistant inset circle (or squircle) converges to zero (for a corner inset by distance d from an outer radius r, the inner radius is max(0, r - d)). So to avoid having a large inner radius, avoid a large outer radius. The MacBook is not a tablet; you do not hold its corners in your hand.

However, at some point Apple must have decided that the squircle is its entire visual identity and that hard corners on the XY plane are bad. That creates design problems it would not otherwise have.

The fact that developers are led into the trap of not pixel-matching to the display, however, just shows a lack of attention to detail.

troupo 7 hours ago [-]
The notch reduces the space available in the top menu bar. And since Apple is still incapable of creating built-in Bartender-like functionality, you end up with less space on screen.

It's an actual objective fact of life.

You don't get a larger 15"/16"/17" screen. You get a screen that size minus the notch, because of a psychotic obsession with thinness. And then they struggle to compensate for that with barely working software workarounds that don't cover even half of the cases.

vvillena 3 hours ago [-]
The screen size advertised by Apple measures the "full screen" area, the undisturbed 16:10 rectangle of pixels. I just took measurements on both a 14- and a 16-inch MacBook Pro. The screen we get is indeed slightly larger.

If you want to avoid the extra space, it's as easy as using a 16:10 resolution. The menu bar will drop down into the 16:10 space.

phire 4 hours ago [-]
If the notch was replaced with a 1cm bezel, then the entire top menu bar would move down by roughly 1cm, and I'd have less screen space for actual content. In some ways, a bezel could be considered to be a "notch" that takes the entire width of the screen.

Personally, I've never run out of space in my menu bar, so the notch gives me 1cm of extra screen space.

It's nothing to do with thinness. It's about packing the largest possible display into the laptop's width/height. Sure, you could argue they should just make the laptop 1cm taller for that bezel, but then why not add a notch and get 2cm of extra screen height?

troupo 4 hours ago [-]
> If the notch was replaced with a 1cm bezel, then the entire top menu bar would move down by roughly 1cm, and I'd have less screen space for actual content.

That perceived 1cm is largely meaningless for content. And you get less space in the top menu bar.

> Personally, I've never run out of space in my menu bar

I have 27 icons in my menu bar. Not because I collect them, but because quite a few apps add their icons there and I use a few of them.

On the laptop screen it manages to show 10.

IntelliJ IDEA has 12 top-level menus (I swear they had more). On a laptop, the top menu bar manages to show 10 items to the left of the notch, and has to move two more to the right. This both splits the menu for no reason and reduces the space for icons even further.

The notch has been around for 4 years now, and Apple still hasn't provided a solution for the problem they introduced.

And, of course, when you want to truly take advantage of "more content" you can't, because the "safe screen space" without the notch is still squarely below the notch, and apps have to be very careful to actually use that, or the notch will get in the way.

> It's nothing to do with thinness.

Yes, it does. In this case, the thinness of bezels.

> Sure, you could argue to just make the laptop 1cm higher for that bezel

Yes, you could do that if you didn't have an institutional psychosis about thinness everywhere.

gommm 1 hour ago [-]
> The notch has been around for 4 years now, and Apple still hasn't provided a solution for the problem they introduced.

As a lot of people told you, you can just disable it. I've been doing that for 4 years: just set your resolution to a 16:10 ratio and you're good to go. The resolution is exactly the same as it was before they introduced the notch.

Personally, I like the fact that Apple gives us the choice. I dislike the notch and prefer my menu bar below it because I use apps like IntelliJ. My wife likes the notch and keeps it. So both of us can have what we want.

Maybe Apple could have made it slightly easier to disable by offering an explicit option instead of making you choose a 16:10 resolution but, to be honest, most of the people who dislike it tend to be power users who can figure it out.

phire 2 hours ago [-]
If the notch bothers you so much, then why not disable it?

It's actually optional; the functionality is built into macOS (just check "show all resolutions" in display settings and pick the notchless resolution). You get that bezel you want, along with the full menu bar.

> Yes, it does. In this case with thinness of bezels.

Device thickness is not the same thing as bezel thinness. If you make a phone/laptop thinner, all you get is less battery life. If you make bezels thinner, you get more display area.

danieldk 3 hours ago [-]
To be honest, I wouldn't be surprised if it was requested and/or greenlit by Apple managers who only use their Macs for Safari or Mail. There can't be another explanation, because the menu splitting and icon disappearance is pretty infuriating.

Luckily, I have mostly used Macs with external screens over the last few years. But it ticks me off every time I actually use a MacBook as a laptop.

monsieurbanana 5 hours ago [-]
I don't understand your logic unless you're implicitly saying the camera needs to go away; but then you should say so (so that I can properly disagree with you).

The notch makes for a smaller menu bar, but without the notch there would be no menu bar there at all; it would take the space underneath instead.

troupo 5 hours ago [-]
> I don't understand your logic unless you're implicitly saying the camera needs to go away

Yup. We never had a camera before the notch.

> The notch makes for a smaller menu bar but without the notch there would be no menu bar there

Yup. Before the notch, we had neither a menu bar that could comfortably fit most menu items even in professional apps, nor a camera.

BTW, you are literally saying "The notch makes for a smaller menu bar". Imagine if I had written that as the first sentence of my comment; then there would have been no misunderstanding.

ylk 4 hours ago [-]
The screen is a 16:10 screen with some extra pixels added next to the notch. By default, the system uses a resolution of 1512x982 (14"), which you can change to 1512x945 (16:10) to move the menu bar below the notch and end up with black pixels next to the notch.
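For apps that want to detect this programmatically, here's a minimal Swift sketch using NSScreen.safeAreaInsets (macOS 12+); the subtraction mirrors the 982 -> 945 difference described above:

    import AppKit

    // The top inset is the height of the strip beside the notch on
    // notched MacBooks; it is zero on other displays.
    if let screen = NSScreen.main {
        let insets = screen.safeAreaInsets
        var safe = screen.frame
        safe.size.height -= insets.top   // drop the strip beside the notch
        print("full:", screen.frame.size, "safe:", safe.size)
    }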
troupo 4 hours ago [-]
"If you go make weird contortions and workarounds you might just find a semi-working non-solution to a problem that didn't exist until Apple introduced it".

Also my response here: https://news.ycombinator.com/item?id=44909996

Yokolos 3 hours ago [-]
What MacBook doesn't have a camera?
danieldk 3 hours ago [-]
They are being sarcastic, because the parent post put forward a false dichotomy: either a notch with a camera, or no notch but no camera.
anonymoushn 5 hours ago [-]
my menu icons are hiding behind the notch lol
dlivingston 10 hours ago [-]
I've got a MacBook with a notch and I literally never notice it. That's a common sentiment among MacBook users, btw -- I very rarely see people complain about the notch. I'm sure there are some, but they seem to be a minority.
bbrks 2 hours ago [-]
I frequently notice when my menubar items overflow and get truncated behind the notch with no built-in way to actually see them!

You have to jump through hoops with janky tools[1] just to see and access the icons silently hidden because they overshot the notch.

[1] https://macmenubar.com/menu-bar-managers/

bombcar 9 hours ago [-]
You didn’t lose pixels to the notch, you gained pixels beside it!
gwd 4 hours ago [-]
Look, given current technology there are exactly three options:

1. Don't have a front-facing camera on your laptop. Your actual laptop screen can be as close to the edge of the laptop case as technology will allow, with no bezel.

2. Have a front-facing camera on your laptop, with no notch. Your actual laptop screen is a clean rectangle, but has to be 5mm shorter. There's a 5mm strip at the top of your laptop that can't be used for anything.

3. Have a front-facing camera on your laptop, with a notch. Now the actual laptop screen has a "dead space" in the middle at the top, but can again be as close to the edge of the laptop case as technology will allow. There's a narrow strip at the top that can't be used for anything, but the areas to the left and right of the notch can be.

(A number 4 would be to somehow have a front-facing camera that operates without needing to displace screen area. It's not clear how this would work without some complicated mechanism to extend the camera out of the top of the screen, which would come with its own problems.)

Now, the vast majority of the time, you're going to be using your Mac in a windowed environment, with menus on the left, indicators on the right, and absolutely nothing in the middle.

In the case of #2, this menu bar has to take real estate from the top of your shorter screen, meaning that your windows are all 5mm shorter. #3 allows the menu bar and indicators to occupy the strip which in #2 is completely dead, freeing up extra space for your actual applications.

And the key thing is this: for modern full-screen games, in #3 you (apparently) can't use the areas to the side of the notch; but that's the same situation you're in with #2.

IOW, as another commenter has said: the notch design doesn't take away screen in the middle; it adds screen space beside the notch.

That said, the API here seems obviously mad: what's the point of offering a resolution if the system is going to silently rescale it behind your back? It should either offer only the resolutions it won't rescale, or throw an error when you request one taller than the safe area.

2 hours ago [-]
dvfjsdhgfv 2 hours ago [-]
While I agree in general with your comment, a small nitpick on this:

> There's a 5mm strip at the top of your laptop that can't be used for anything.

Well, it can - on mine I have a physical switch that allows me to block the lens, and it saved me a few times. (I just wish I had a similar one for the microphone.)

bee_rider 12 hours ago [-]
Are the screens OLED? The phones are...

IMO the notch is pointless, but they need space for the front camera. With OLED they can just turn the pixels off when it suits the application and it becomes like a big bevel, which was the alternative anyway.

buildbot 5 hours ago [-]
My M1 MacBook Pro turns off the display backlight around the notch as needed; for example, right now in full-screen Safari you couldn't tell there is a notch/menu bar area at all. It's actually already as good as it can be. Free extra space!

Bezel not bevel FYI.

MBCook 11 hours ago [-]
Expected in the next year or two on the pros, but not yet.
userbinator 9 hours ago [-]
I've always preferred the look of non-antialiased fonts on an LCD for this reason - you can finally actually see the sharp square corners of a pixel, yet somehow many like the blurry smoothness of antialiasing to make the screen more like a CRT.
Findecanor 5 hours ago [-]
I prefer text to be hinted and antialiased even at low resolutions on a CRT. The first time I saw this, on BeOS in the 90s, I thought it was a big step up. You get corners snapped to pixels without jagged lines in between. However, hinting makes text wider or narrower depending on font size and resolution.

Apple chose not to have hinting so as to make the text more dimensionally accurate. That did look blurry ... so Apple then doubled the screen resolution.

userbinator 7 hours ago [-]
Please explain why you disagree?
bowsamic 6 hours ago [-]
People obviously think your comment is sarcastic
floppyd 5 hours ago [-]
I don't have a notched laptop screen, but I do like rounded corners; they make screens feel just a little bit more natural. My MacBook is older and doesn't have rounded corners, but I got used to them so much that I ended up simulating them in software.
jeffhuys 5 hours ago [-]
The menu bar is at the top on a Mac. I see it as the top bezel becoming screen: not taking away any screen real estate, but actually FREEING up space for apps. I never see the notch anyway, because my wallpaper is black.
trinix912 5 hours ago [-]
Until you want to do something full screen; then it might be a glaring black rectangle at the top of the screen. The proper way to do it is to hide the camera in the thin bezel or somewhere else (inside the LCD, are we there yet?). This is just a lazy solution that designers probably even thought looked iconic.
jeffhuys 3 hours ago [-]
I get what you mean, but to me it just looks like it did before there was a notch, like there's no screen there. I can't really tell where the bezel ends and the screen starts if it's all black. It just looks like a large bezel.

Guess it's just a matter of taste. I'm never bothered by it.

msh 5 hours ago [-]
But if you hide it in a small bezel or behind the LCD you will get significantly worse camera quality.
jeffhuys 3 hours ago [-]
The quality is already insane if you think about how thin it has to be, with "glass" in front of it and the aluminium behind it. Pinch it with your fingers and imagine a CAMERA being thinner than that...

I feel like I'm the only one that's still amazed by tech sometimes. I still look at planes and feel amazed that we puny humans did that. It's never good enough.

AceJohnny2 7 hours ago [-]
It goes deep.

obligatory: https://www.folklore.org/Round_Rects_Are_Everywhere.html

julik 3 hours ago [-]
No, you are not the only one
simondotau 2 hours ago [-]
For what it's worth, the Macintosh has had rounded corners since its inception in 1984. They were just bitmaps in the screen corners, still addressable by software (and even by the mouse cursor in any context), but they demonstrate an aesthetic heritage.

These software rounded corners disappeared when CRTs gave way to flat panels, but they are indisputably part of the Apple design aesthetic.

bigstrat2003 10 hours ago [-]
Nope. I think they're incredibly stupid as well.
jama211 6 hours ago [-]
You realise you can use the largest rectangular section of the screen as a rectangular screen, right? Play a video and you'll see. It's the best of both worlds.
behnamoh 13 hours ago [-]
This just shows how little Apple cares about gaming on the Mac. It's so sad that I spent thousands on multiple Mac devices (MBP, Mac Studio, etc.) only to be bottlenecked not by hardware, but by Apple's shitty approach to gaming software.

I know why they do it though. Apple can't take their undeserved 30% cut of Mac games the same way they take their iOS cuts.

We had a teacher years ago who said something that remains true to this day: "everything is about money, especially the ones that appear to have non-monetary reasons."

wlesieutre 13 hours ago [-]
I think their bigger problem is that there's shit for documentation. You get a big list of function signatures, and if you actually want to know how you're supposed to use anything, you find the WWDC session from 4 years ago and hope it's still accurate.
whstl 58 minutes ago [-]
I can attest to that. I used to freelance doing porting work about 10 years ago, and the main way of getting a stable event loop for a game on macOS is not clearly documented anywhere, at least not anywhere I could find.

Libraries like SDL, GLFW and Sokol handle this for you, and I had to poke inside them to actually figure out how to get the same performance as other games.

In a nutshell, the trick is simple: you call [NSApplication finishLaunching] and run a while-loop yourself instead of calling [NSApplication run].
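A rough Swift sketch of that loop (assuming an already-configured NSApplication; renderFrame is a hypothetical placeholder for the per-frame game update):

    import AppKit

    // Manual pump: finishLaunching() instead of run(), then drain
    // pending events and render a frame on each iteration.
    let app = NSApplication.shared
    app.setActivationPolicy(.regular)
    app.finishLaunching()

    var running = true
    while running {
        // Drain all pending events without blocking.
        while let event = app.nextEvent(matching: .any,
                                        until: Date.distantPast,
                                        inMode: .default,
                                        dequeue: true) {
            app.sendEvent(event)
        }
        // renderFrame()  // hypothetical per-frame update/draw
    }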

internetter 10 hours ago [-]
A WWDC session, I might add, with a mildly relevant title and rarely a hyperlink from the relevant documentation, let alone a timestamp.
grishka 10 hours ago [-]
But if you find anything with the dark-blue "documentation archive" header, then you know it's some good stuff. Though sometimes outdated.
trinix912 5 hours ago [-]
We've gone from the monstrous Inside Macintosh tomes, to that early OS X documentation (now in the documentation archive), to this thing that seems programmatically generated from doc comments.
grishka 1 hour ago [-]
Next they'll start generating documentation from code using LLMs...
trinix912 5 hours ago [-]
What's even worse is that they used to have great documentation, then for some nonsensical reason archived it and replaced it with this wannabe-JavaDocs crap with vague function explanations that leave more questions open than they resolve.

They now tell people to watch WWDC videos, as if those relatively short videos contained the same amount of information as proper API documentation.

spike021 5 hours ago [-]
honestly i don't see this as Apple-specific but more of a sign of the times. i've seen a few libraries and vendor platforms we use at work with the same style of documentation (really lack thereof).
jiggawatts 4 hours ago [-]
Microsoft's documentation quality very noticeably fell off a cliff about a decade ago. Now most of their public APIs and SDKs have hundreds of thousands of "documentation" pages that are just the name of the function with spaces added between the words. There might also be a listing of the parameters, which helpfully tells you that "string param1" is called "param1" and takes a string as the data type.
xcf_seetan 12 hours ago [-]
Somebody told me that the quality of a company is directly proportional to the quality of its documentation...
makeitdouble 11 hours ago [-]
Boeing's documentation is probably excellent.

In general it is assumed that documentation is the boring part, and that paying attention to it is a sign of quality. But where do you put people who prefer writing and/or teaching to thinking long and hard about product issues?

unscaled 10 hours ago [-]
Perhaps the biggest issue with the 737 MAX was lack of documentation. Boeing consciously chose to completely omit several new features (most notoriously MCAS) from the pilot manuals in order to make it seem like nothing important had changed that would require retraining 737 NG pilots on a simulator.

In a sense, this is far worse than what most tech companies do, since it is a life-and-death issue and Boeing made a very conscious effort to hide details, rather than just being lazy.

cpgxiii 9 hours ago [-]
One could say that "the point" of MCAS was to be an implementation detail, something that you would often deliberately hide from software documentation so that users don't design around internal details.

There's something of a history of aerospace vendors omitting "implementation details" that end up contributing to serious accidents (e.g. if you get an Airbus far enough out of the normal envelope protections, you lose stall warning), and an equally sordid history of flight and maintenance crews improvising procedures to the observed (rather than designed/specified) behavior of aircraft systems.

Arguably, the single biggest systematic risk in the current pilot training system is that crews overlearn the implementation details of their training, rather than the actual principles and flight manuals (e.g. training that inadvertently drills quick engine shutdowns, when the consequences of shutting down the wrong engine in reality are much more serious).

sofixa 6 hours ago [-]
> There's something of a history of aerospace vendors omitting "implementation details" that end up contributing to serious accidents (e.g. if you get an Airbus far enough out of the normal envelope protections, you lose stall warning),

And Airbus control laws and protections are well defined and studied by pilots training for them.

reactordev 11 hours ago [-]
You encourage them to blog about it, and you aggregate the posts into documentation - somehow. (I'm sure there are people for that, or an AI, or something.) They had to explain it when they implemented it - so have them explain it in a post. Confluence. Somewhere.

You're right though.

serf 9 hours ago [-]
>Boeing's documentation is probably excellent.

it isn't; it's just ISO/NADCAP conforming.

eru 10 hours ago [-]
Tech writer. Google famously employs quite a few of them for internal documentation.
jiggawatts 4 hours ago [-]
I've read one of the manuals for the 787 management software stack, and it was... beautiful. It read like a novel. I read it twice on the train for the simple pleasure of it.
thfuran 12 hours ago [-]
I guess MathWorks is one of the best companies ever.
dlivingston 10 hours ago [-]
Microsoft has excellent documentation. So that's one counterexample. :)
cosmotic 10 hours ago [-]
It was excellent. It was a mess the last time I looked.
userbinator 9 hours ago [-]
Their documentation for the very newest stuff contains very obviously AI-generated inaccurate and misleading crap.
userbinator 7 hours ago [-]
Don't believe me? Go look at it yourself. They've even "edited" the old docs with AI slop.

https://learn.microsoft.com/en-us/windows-hardware/drivers/u...

"The characteristics of the endpoint determine the size of each packet is fixed and determined by the characteristics of the endpoint."

noname120 4 hours ago [-]
That sentence has been there _at least_ since Dec 2022; stop seeing AI everywhere.

https://web.archive.org/web/20221214171028/https://learn.mic...

WithinReason 6 hours ago [-]
If they used AI the sentence would be coherent
sofixa 6 hours ago [-]
No they don't. Just last week I was looking for something, found a Google result from Microsoft's docs, but opening it resulted in a page saying they're moving docs around, this one isn't being moved, fuck off. No content, no link to the content.
saagarjha 9 hours ago [-]
Most of this stuff uses APIs more than four years old, so you're probably not getting those videos unless you look at third-party rehosts.
makeitdouble 11 hours ago [-]
They sure don't care about games on Mac, but I think this specific issue is more due to trying to do "magic" for better or worse.

Introducing the notch creates this "here but not really usable" area that is specific to some displays, and Apple touts it as something that will just work thanks to their software layer.

Yet every app has to deal with it one way or another, and most apps that care about screen real estate or resolution will need hacks.

It will be the same for every feature that pushes the boundaries and offers an abstraction layer as an answer to how the issues will be solved.

mort96 5 hours ago [-]
No, not every app needs to deal with it, at least in principle. Apple could literally have provided a resolution list which only covers the 16:10 resolutions that aren't covered by the notch, and then games literally wouldn't have known that they were running on a machine with a notch.

The problem here is 100% that Apple, for some reason, provides both the 16:10 "safe" resolutions and the actual whole-screen resolutions. All those non-16:10 resolutions are a trap, because they cause the compositor to downscale the window to fit below the notch anyway -- they serve literally no purpose. There is no situation where a user would want a window that's 16:10.39 or whatever and then downscaled to fit within the bounds of the 16:10 safe area, so there is no reason to provide those 16:10.39 resolutions in the list of display modes.

Apple botched the implementation, but the hardware would've been perfectly reasonable and caused no issues or complexities for either users or developers if Apple hadn't botched the software.
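You can see the trap list for yourself; here's a quick Swift sketch enumerating modes via CoreGraphics (assuming the built-in panel is the main display; the 16:10 check is illustrative):

    import CoreGraphics

    let display = CGMainDisplayID()
    if let modes = CGDisplayCopyAllDisplayModes(display, nil) as? [CGDisplayMode] {
        for mode in modes {
            let ratio = Double(mode.width) / Double(mode.height)
            // 16:10 modes fit the safe area; taller ones extend beside the notch.
            let label = abs(ratio - 1.6) < 0.005 ? "16:10 (safe)" : "taller than 16:10"
            print("\(mode.width)x\(mode.height)  \(label)")
        }
    }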

TechSquidTV 9 hours ago [-]
I'm on an M3 Ultra Mac Studio. A $4K computer. And I'm not sure why, but if I try to stream anything with StreamLabs (even with no game open), the recording is laggy and choppy. I just don't understand how that's possible.
tbolt 2 hours ago [-]
OBS works fine. StreamLabs could just be bad software.
Shorel 2 hours ago [-]
Macs were bad for gaming long before the Mac App Store existed.

Steve Jobs was dismissive of gaming, even when PC gaming was on the rise; that's the reason.

latexr 13 hours ago [-]
> I know why they do it though. Apple can't take their undeserved 30% cut of Mac games the same way they take their iOS cuts.

Why would Apple deliberately sabotage the experience? They would gain nothing from it. That argument makes even less sense when you consider that most of the games mentioned in the article are on the Mac App Store, where Apple can take their cut.

https://apps.apple.com/us/app/control-ultimate-edition/id650...

https://apps.apple.com/us/app/shadow-of-the-tomb-raider/id14...

https://apps.apple.com/us/app/riven/id1437437535

https://apps.apple.com/us/app/cyberpunk-2077-ultimate/id6633...

https://apps.apple.com/us/app/stray/id6451498949

This is an obvious case of Hanlon’s Razor. Anyone who develops for Apple platforms and has had to file Feedbacks is aware of Apple’s incompetence and lack of care.

goosedragons 12 hours ago [-]
Except for the handful of Mac ports exclusive to the Mac App Store, who with a lick of sense would choose to buy from there? Steam, GOG and Epic are all more feature-rich, have sales and third-party resellers more often, and throw in the PC version too.

On iOS there is no choice.

lowkj 11 hours ago [-]
When No Man’s Sky first announced a Mac port, they promised to release on Steam and App Store. I waited a year after the Steam release for the App Store release which never came.

I am a very casual gamer and sometimes weeks go by without playing. My first experience with Steam was with Civ VI. Long flight, no internet, great I’ll play some Civ! But instead of opening up Civ, I was forced to open Steam which would then not allow me to play my own game because I hadn’t authenticated recently enough for them. Or I would try to play and Steam would say, oh first you need to download some huge update before you’re allowed to play your single player entirely offline game.

I know theoretically GOG is supposed to solve this issue, but no Mac game I wanted was available there. Finally Cyberpunk 2077 launched on multiple stores and I bought it on GOG. Even then, the default option became to use the GOG launcher. If I wanted a DRM-free download, there was some immediately complicated, off-putting set of instructions: downloading something like 20+ files, etc.

App Store experience, I click download, it downloads. I open it, it opens the app and not some launcher. Everything just works.

acomjean 22 minutes ago [-]
I’ve been a long time iOS user. A fair amount of my purchased software won’t run anymore as it wasn’t updated from 32->63 bit change. I had some Mac software in the same boat. App Stores have some ease of use advantages, but it “just works” till at some point it doesn’t. I’ll angree it’s annoying to have be internet connected to use the software. FWIW the steam deck seems to work with games offline, so maybe they fixed some of those issues?
talldan 8 hours ago [-]
I guess that's true, though I don't think you have to launch games through the Steam app; they just try to make it convenient to do so.

You can also right-click the game and 'Browse local files' and the game's regular executable is usually right there.

I'm currently playing the Oblivion remake, and launch that through a mod manager rather than Steam (though on Windows), even though the game was installed via Steam.

bigyabai 11 hours ago [-]
Surely this isn't your policy for all software. I've known many people that defend the App Store on iOS this way, but nobody that uses a Mac exclusively for App Store software.
latexr 4 hours ago [-]
That doesn’t matter. We’re not discussing “all software”, we’re discussing games. You can simultaneously dislike the Mac App Store and still prefer it over Steam. Those ideas aren’t contradictory.
latexr 4 hours ago [-]
That is not only completely irrelevant to the point, it’s also wrong and inconsiderate of the preferences of others. I agree with the other commenter that the Mac App Store has advantages in terms of experience, especially if you’re not constantly gaming.

* Steam is constantly updating, every time you open it, and until recently videos would almost always fail to play on my Mac.

* The Epic Games launcher is so atrocious that calling it “feature-rich” feels like a bad joke. I find it so bad that I don’t even open it to get the free games, opting instead for the website, and even then I am super selective about any game I get (fewer than 10%) because I always think I’ll have to deal with that app. In its current state, there is zero chance I’ll ever buy a game on there, all because of the app. My “favourite feature” is how if you queue a bunch of games to install and then set a few others to uninstall, those are added to the same queue and you have to wait for the installs to finish before the uninstalls get a chance. So if you are low on disk space, you now have to cancel each of the installs one by one and tell them to start again, so they are added to the bottom of the queue.

* GOG Galaxy was the biggest disappointment. I was expecting to like it but it only lasted an hour on my machine before I trashed it. It felt old and incomplete.

goosedragons 2 hours ago [-]
It's not irrelevant. How much is Apple really making off Mac game sales from the MAS?

Compared to the MAS, Epic is a good launcher for games. Take a look at CP2077. On the MAS you can't just download the language you need; you have to get all of them. This increases the download by 60GB. No other platform has this issue. So it ends up being 160GB, which is nuts, and more than half the storage of a base model M4 Mac. It's insanely barebones and half-assed for gaming.

whateveracct 11 hours ago [-]
It's a shame because a Mac Mini is a solid gaming computer.
lostlogin 11 hours ago [-]
I just replaced a NUC 9 Ghost Canyon with a Mac mini.

The difference in power consumption is insane. The NUC was 67 watts; the mini is 4-10. I don’t have enough data for a long-term average on the mini yet, but it’s ludicrously efficient.

6SixTy 10 hours ago [-]
Is it just me, or is a 6-year-old product on 14nm against whatever this is not exactly the most level of comparisons?
galaxy_gas 9 hours ago [-]
It's been across the board for me. I have Ryzen 6xxx-8xxx mini PCs that are 45-60W, and my 5-year-old M1 Mini is comparable at a fraction of the power draw.
protimewaster 9 hours ago [-]
I would think a low TDP Ryzen chip shouldn't be that much different. Were those the 15W TDP Ryzens?
galaxy_gas 9 hours ago [-]
30W and 54W. iGPU performance is also better on the Mac compared to the non-65W Ryzens.

The good thing is that the Ryzen boards can be upgraded on RAM and Storage but that seems to be changing with soldered DDR5 on many mini PCs now.

conception 9 hours ago [-]
The Mac Mini is, I believe, the most efficient consumer computer ever made. It uses about as much energy as most computers' Ethernet ports.
TheDong 7 hours ago [-]
> Mac Mini is i believe the most efficient consumer computer made ever

My raspberry pi is sitting at 3 watts, and runs linux software just fine.

A Mac Mini uses multiple times that, and can barely run any software I care about (i.e. linux).

They're both consumer computers, and yes there's a massive difference in their capabilities, but "most efficient ever" is a really strong claim.

ZaoLahma 6 hours ago [-]
The raspberry pi is great. Absolutely love them. Been running them as dev boards (robotics) and servers (at home) for 10+ years. Last time I truly, genuinely, felt excited about Christmas was when my SO gave one to me back in 2014 or so.

But I wouldn't compare the experience to a Mac Mini, and I wouldn't call a raspberry pi a "consumer computer". It solidly falls into the amateur dev board category. Heck, it doesn't even have a power button.

nativeit 11 hours ago [-]
I don’t try to game on my laptop, and I have always thought “gaming laptops” are somewhat akin to “racing-spec BarcaLoungers”. That’s why I’ve never understood why so many people bitch about not being able to game on Macs. If you want to play video games, you should probably buy a console or a desktop PC.

White-hot take: you’re allowed to own a PC and a Mac. They aren’t like matter and antimatter, you won’t collapse the galaxy or anything.

freehorse 11 hours ago [-]
People are complaining because the hardware is capable, but other reasons make gaming on a Mac an annoyance; not because MacBooks aren't powerful enough to run games. And to be fair, MacBooks can run games perfectly well, and indeed a lot of titles perform great, even if most have to use some sort of compatibility layer (Rosetta 2, Vulkan-to-Metal, etc). Why should I get a PC just for gaming if my machine can handle it? Especially something like a big-form-factor gaming PC that takes up quite a bit of space in a room.

And this is not a matter of laptop vs desktop, because most of these issues (not the notch) will be present on a Mac Studio too.

In fact, Apple itself has started advertising gaming on Macs at WWDC in recent years. So it is only fair that people complain about it.

But in general, gaming on Macs is perfectly feasible (maybe not the latest, most demanding game at max settings, but most of the stuff). You may miss certain graphics features that aren't available on macOS, but otherwise performance is not bad. Even playing Windows versions of games through Whisky or CrossOver is perfectly feasible.

Fade_Dance 11 hours ago [-]
>That’s why I’ve never understood why so many people bitch about not being able to game on Macs.

Well, that's where you're wrong then. It's perfectly possible to game on a laptop. Over the last decade, with the development of the Thunderbolt ecosystem and docks, it's become a very low-friction endeavor as well.

There is a myriad of quality "gaming" laptops out there, but obviously that's not for everybody, and that's one of many reasons why workstation laptops exist (basically the same thing - the ability to run at high loads for extended periods - without the gamer aesthetic).

There are many casual gamers out there who don't want to shell out for a dedicated gaming device, nor should they, when a laptop is a perfectly adequate device to game on. It's not that hard to comprehend.

eru 10 hours ago [-]
I mostly agree.

> There are a myriad of quality "gaming" laptops out there, but obviously that's not for everybody, and that's one of many reasons why workstation laptops exist (basically the same thing - ability to run at high loads for extended periods of time - without the gamer aesthetic).

For the last few years, I thought 'gaming' was more GPU heavy and 'workstations' were heavier on the CPU and RAM?

Though I wonder how much that has changed recently or will change soon: after all with the rise of (local) AI, perhaps more work will move to the GPU?

delta_p_delta_x 5 hours ago [-]
> For the last few years, I thought 'gaming' was more GPU heavy and 'workstations' were heavier on the CPU and RAM?

On workstations you can get both. The Dell Precision 7XXX, HP Zbook Fury, and Lenovo P5x/P7x series are the primary high-end notebook workstations, and they are all nearly infinitely configurable with a myriad of CPU, GPU, memory, storage, display, and connectivity options.

I myself have a Precision 7560 from 2021 that has a Xeon W 11955M, and an RTX 3080 Laptop GPU that's roughly between a desktop 3060 and 3070 in performance due to the power limit of 90 W.

Before that I had a Precision 7530 from 2018 that had a Xeon E 2176M.

Both Precisions have 4 DDR4 slots for up to 128 GB, 3 M.2 slots (and an additional one for the WiFi module), and full repair/service manuals online.

Fade_Dance 10 hours ago [-]
I think you are correct, although some workstations can be specced with the "big" GPUs found in gaming laptops. On the "gaming" side, the hulking neon-plastic gamer aesthetic has somewhat given way to a new popular class of gaming laptop that is more sleek, and those machines often have the mid-range GPUs that you'd find in a workstation. It's all fairly blurred now.
nottorp 3 hours ago [-]
> There are a myriad of quality "gaming" laptops out there, but obviously that's not for everybody

If you're talking x86, only if you're okay with hearing loss :)

Or really have no choice. Student who has space/funds for one device and it has to be portable, for example.

eru 10 hours ago [-]
Different people have different trade-offs. Some people like the laptop form factor and want to game on it.

Just like people like the handheld form factor of the Nintendo Switch, but it's still entirely fine for them to complain about the system's low specs. Especially when the Steam Deck and the ROG Ally show that you can do better.

latexr 4 hours ago [-]
Not everyone has the extra money or space to spend on another device just to play games from time to time. I don’t even own a TV, I’m not going to get one, plus a console, plus rearrange my living room and get new furniture to house them, to play games occasionally. Nor will I buy an extra expensive, bulky, and noisy desktop machine and monitor, plus (again) rearrange my furniture.

Additionally, laptops are more than capable of playing demanding games these days.

makeitdouble 11 hours ago [-]
> If you want to play video games, you should probably buy a console or a desktop PC.

Push this logic one or two notches further and people should write and build code only on a desktop with an e-ink portrait monitor.

Specialization has its place, but asking a generic computing device to gracefully handle most applications isn't a big ask IMHO.

They're not running the latest top-notch games or anything exotic; it should be fine with no tinkering.

nottorp 3 hours ago [-]
> If you want to play video games, you should probably buy a console or a desktop PC.

> White-hot take: you’re allowed to own a PC and a Mac.

I do own a console, a PC and two Macs. But for what I've paid for the Macs, Apple should make it easy for me to play games on their hardware if I so choose.

There is some hope: Lies of P is actually very playable on the minimum requirements (M2/16GB RAM). And you get the Mac version on Steam along with the Windows version, so you're not limited to a single platform.

wyre 11 hours ago [-]
Valid and correct take, but I don’t want to own a Mac and a PC. I’d like to be able to run casual games on my MacBook. I don’t game often or heavily enough to warrant purchasing a rig for it.
chickenzzzzu 11 hours ago [-]
White-hot take: you are allowed to own an outhouse and a urine only toilet. They aren't like matter and antimatter, and so on.
esseph 7 hours ago [-]
I'm so glad someone else said what I was thinking.
monkeyelite 8 hours ago [-]
> "everything is about money, especially the ones that appear to have non-monetary reasons."

There are countless other incentives as tangible as money, including meaning, status, security, fame, etc.

If you spend time around people with money you will find they will happily trade it to achieve these.

What this belief signals most strongly to me is your class.

behnamoh 8 hours ago [-]
> What this belief signals most strongly to me is your class.

What this "classism" mentality signals is your sense of superiority due to the amount of $ you have in a bank.

monkeyelite 7 hours ago [-]
Class is not money - and you're turning a non-personal comment into a personal one.
behnamoh 7 hours ago [-]
> non-personal comment

You literally said "What this belief signals most strongly to me is your class.". So that wasn't personal?

monkeyelite 6 hours ago [-]
No - this is a general statement. People who are saying that everything is about money are revealing they don’t appreciate other motivations.
capitol_ 6 hours ago [-]
But how does that relate to their class? The working class desires money just as much as the owning class.
rustystump 5 hours ago [-]
Depending on the culture, it very much is. Money is a direct signal of status in a way most other things are not, as it correlates to resources, which nature selects for.

In the US of A, money is the ultimate status symbol and is often used as the measuring stick for the value of one's ideas, personhood, and everything in between.

I cannot stand it, but it is what it is.

bowsamic 6 hours ago [-]
Which class does it signal to you?
mvdtnz 12 hours ago [-]
> I spent thousands on multiple Mac devices

I mean... yeah why would they change?

eru 10 hours ago [-]
Generally, if you lower prices you can sell more volume.

The profit maximising trade-off for Apple might however be where they are right now. Not sure.

jama211 6 hours ago [-]
This is exactly the old-man-shakes-fist comment that I unfortunately expected to see at the top of a Hacker News thread. Not everything is a big conspiracy.
bowsamic 6 hours ago [-]
That companies do things to make money is not a “big conspiracy”.
eru 10 hours ago [-]
> We had a teacher years ago who said something that remains true until today: "everything is about money, especially the ones that appear to have non-monetary reasons."

That's more of an aspiration than a statement of fact.

E.g. I'd happily have organised a crowdfunding campaign to give Vladimir Putin a few dozen billion dollars to bribe him away from starting a war. And if you look at Russia's finances, that's effectively what happened: he (and even more so his oligarchs) predictably lost untold billions in revenue and profit over this war.

Also, migration from poor to rich countries increases workers' pay so much that you could charge them a hefty sum for the privilege and they would still benefit from coming. However, voters by and large don't like it, even if you were to distribute the proceeds amongst them (as a "citizen's dividend" or whatever).

They have non-monetary reasons.

See https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&d... for an interesting collection of historical cases drawn from such diverse sources as Apartheid and Post-Apartheid South Africa, Malaysia, Nazi Germany.

wetpaws 12 hours ago [-]
[dead]
Razengan 11 hours ago [-]
What? This seems like just games not defaulting to the correct resolution for the display they're on. The first thing I do in every game I play (on any platform) is go through the settings, and make sure the resolution matches the screen's physical resolution. If the game is dumb about it, I choose a smaller multiple or run in windowed mode or on an external display.
iwontberude 13 hours ago [-]
to be fair there is only so much you can do inside the power envelope of 100W or so
jsheard 12 hours ago [-]
The Xbox Series S only uses about 80W under full load, and that's built on TSMC's old 7nm process. Apple's bleeding-edge 3nm silicon should be able to do even more within a similar power envelope.
semiquaver 12 hours ago [-]
100W TDP ought to be enough for anyone.
gchamonlive 13 hours ago [-]
Not really. If you compare them to gaming rigs, they have radically different architectures, so you can't really compare them by TDP or power requirements. They don't emit the same heat or require the same amount of power per TFLOP. And I wouldn't be surprised if TFLOPS didn't translate into actual compute room for shaders either.

Even if they did, 100W should be room enough to play relatively recent titles, especially indie ones. Nothing really excuses Apple's contempt for the gaming market.

theandrewbailey 13 hours ago [-]
> Even if they did, 100w should be room enough to play relatively recent titles

The Steam Deck runs on 45W, and that's plenty of power to have fun.

rafram 11 hours ago [-]
I briefly owned a Steam Deck (before returning it), and it seems like most users tone down their expectations a lot compared to PC gaming. 30fps at the Deck’s low resolution seems to be the norm for recent games. Enough power to have fun, sure, but I think people would rightfully pan it if it weren’t Valve.
terribleperson 2 hours ago [-]
Expectations are lowered appropriately, I think. There are a number of other handhelds on the market. The high-end PC handhelds that came out reasonably close to the Steam Deck have to trade away a lot of battery life to perform better than it. Valve was clearly optimizing for battery life, and I don't think it was a bad choice. The Switch 2 isn't even held back by x86, and its battery life is still quite painful.

Now, handhelds with newer hardware are definitely going to trounce the Steam Deck without having to trade away battery life, but I think Valve did the best they could at that price point and time.

serf 9 hours ago [-]
absolutely agreed.

you swallow the taste of the terrible hardware of a steam deck for the support of valve/proton/devs and the ecosystem.

Rohansi 12 hours ago [-]
The included charger is 45W, but the chip consumes less.
DaiPlusPlus 13 hours ago [-]
> Nothing really excuses Apple from this contempt it has for the gaming market.

Considering how the typical self-identifying "gamer" conducts themselves online, I think Apple might be on to something...

neutronicus 11 hours ago [-]
What, you mean you aren't champing at the bit to get death threats over a bug in a piece of software for which 70 dollars was paid three years ago?
Almondsetat 13 hours ago [-]
While the differences between ARM and x86 CPUs are quite substantial, Apple's iGPUs follow the same architecture as everyone else's.
ziml77 13 hours ago [-]
Not mentioned here about WoW is that they considered the notch important enough to add an option for the UI to avoid it. It calls a function in C_UI to get the safe region, and then resizes UIParent to fit within that region while still rendering the game up to the true top of the display.
pdpi 13 hours ago [-]
WoW has always been exceptionally good at treating macOS like a first-class citizen. It's a shame Blizzard has stopped supporting macOS for their newer games.
cosmic_cheese 13 hours ago [-]
Back in the day they even implemented near-zero-performance-impact video recording that leveraged AVKit/QuickTime as a Mac-exclusive feature, which was pretty neat. It let me get silky recordings while Windows-using guildmates’ videos skipped and stuttered from having to run both WoW and much heavier third-party recording software.
rarepostinlurkr 7 hours ago [-]
If anything, it's an example that games can be amazing on macOS if developers take the time to learn and use the system. WoW on macOS is a far superior experience to WoW on Windows.
burmanm 29 minutes ago [-]
By what standard is it superior to WoW on Windows? Starting from the available hardware, the Windows world is far superior for WoW, resulting in better visuals and more consistent fps, which is a good start to an enjoyable experience.

Also, I really can't think of a way Mac's mouse settings in WoW work better than the Windows ones. Scrolling and mouse movement both have more settings to customize under Windows compared to the Mac (even with tools such as Mos to help with the bad external mouse handling on macOS).

So what exactly is so superior? I play WoW on both platforms, and when I have a choice (that is, when playing at my home desktop; at my summer cabin I only have the Mac option) it's always Windows I select. It's nice that it works on the Mac so I can play everywhere, since I only have a Mac laptop, but the occasional weird graphics bugs, worse performance, and odd rendering issues displacing my WAs and other stuff just don't inspire confidence in Mac gaming compared to Windows. And the mouse movement... ugh.

nottorp 3 hours ago [-]
Heh, I play MMOs in a window, if I play them at all.

WoW on Linux with Wine used to be a superior experience to WoW on Windows, because the system didn't lock up for half a minute when I alt-tabbed.

[Haven't played WoW in a long time though; maybe on Windows 10 and up you have an easier time alt-tabbing between heavy 3D apps.]

dontlaugh 13 hours ago [-]
Perhaps one or two people on that team have always had personal macs.
odo1242 5 hours ago [-]
This has been my experience with software dev teams and Macs lol (the software’s quality on MacOS is directly proportional to the number of team members who have Macs)
debugnik 13 hours ago [-]
> But World of Warcraft is an older game using the legacy CoreGraphics display services full screen API. That API actually allows World of Warcraft to draw into the notch.

Not knowing much about Macs, I would have thought games were supposed to render full screen around the notch for immersion but respect the safe area for UI and gameplay. Are they supposed to leave a menu bar on macOS?

> Control gets around the issue by just making up its own resolutions.

That's hilarious. I wonder if they had trouble enumerating resolutions and gave up, or if they simply couldn't be bothered.

freehorse 11 hours ago [-]
> Are they supposed to leave a menu bar on macOS?

Depends also on how the specific game is implemented, but often that area is just black and inaccessible, as in you cannot even move the cursor there. It is as if the screen no longer includes the area around the notch, i.e. how the screen would be if there were no notch, like on the M1 Air.

debugnik 11 hours ago [-]
Thanks for explaining! I see this is common, then. If I'm understanding the docs for NSPrefersDisplaySafeAreaCompatibilityMode[1] correctly, this is a compatibility mode that apps can opt out of to behave the way I expected, and it can be force-disabled from Finder.

So I guess rendering around the notch is the intended experience, but devs don't seem to know or care enough to opt out, and the bug here is that enumerating display resolutions doesn't take this compatibility mode into account.

[1]: https://developer.apple.com/documentation/bundleresources/in...
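For reference, opting out is a single entry in the app's Info.plist (the key name comes from the linked docs; the enclosing dict is the usual plist boilerplate):

    <key>NSPrefersDisplaySafeAreaCompatibilityMode</key>
    <false/>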

freehorse 10 hours ago [-]
Hah, I was not aware that a user can actually select that in Finder. Really useful to know. Thanks for sharing!
reactordev 14 hours ago [-]
Oh it’s not just Apple…

This was an issue I also discovered on the Xbox 360 in 2008. TVs have overscan, and depending on that setting, your resolutions will be off.

However, at the time, we couldn't create render targets that matched the overscan safe area. XNA added a screen SafeArea rect to help guide people, but it was still something you had to consciously develop for.

Now, we can create any back buffer size we want. It's best to create one 1:1, or use DLSS with a target of 1:1 to the safe area, for best results. I'm glad the author went and reported it, but ultimately it's up to developers to know Screen Resolution != Render Resolution.

Anyone using wgpu/Vulkan/AppKit/SDL/GLFW/etc needs to know this.
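On macOS, one way to keep the render target 1:1 is to size the drawable to the window's backing pixels; a minimal Swift sketch, assuming a layer-backed content view and an existing CAMetalLayer:

    import AppKit
    import QuartzCore

    // Size the drawable 1:1 with the window's backing pixels so the
    // compositor never has to rescale the frame.
    func matchDrawableToBacking(window: NSWindow, layer: CAMetalLayer) {
        guard let view = window.contentView else { return }
        let scale = window.backingScaleFactor
        layer.contentsScale = scale
        layer.drawableSize = CGSize(width: view.bounds.width * scale,
                                    height: view.bounds.height * scale)
    }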

DaiPlusPlus 13 hours ago [-]
If I understood you correctly... you wanted to be able to render to a slightly smaller surface to avoid wasting graphics compute time, but that's still going to be upscaled to 1080 for the HDMI scanout, and then mangled again by the TV's overscan - which to me feels like introducing problems more severe than whatever problem you were trying to solve in the first place.

(Besides, TV overscan is a solved problem: instead of specifically rendering a smaller frame games should let users set a custom FoV and custom HUD/GUI size - thus solving 3 problems at once without having to compromise anything).

reactordev 12 hours ago [-]
No, your TV says it's 1080 but it's not, it's 1074... This is a solved issue now, but it wasn't when HDMI was first introduced. The Xbox 360 suffered from red rings of death, Microsoft hated Linux, and C# was cool.

Basically, if you rendered an avatar image in the top left of the screen, perfectly placed on your monitor, on the TV its head would be cut off. So you change to the safe-area resolution and it's perfect again (but on your monitor, safe area and screen resolution are the same - except on Apple's screens, apparently). Make sense?

You can see how, if your screen says it's 4K but is really short 40 pixels, and you render at 4K, the screen will shrink the image by 40 pixels and introduce nasty pixel artifacts. TV overscan goes the other way. Interesting find by the author about the notch.
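(For illustration, the classic title-safe arithmetic in Swift; the 3.5% per edge is the traditional assumption, not a measured value:)

    // Derive an overscan-safe render size from a nominal resolution.
    func safeRenderSize(width: Int, height: Int,
                        overscanPerEdge: Double = 0.035) -> (w: Int, h: Int) {
        let w = Int(Double(width) * (1.0 - 2.0 * overscanPerEdge))
        let h = Int(Double(height) * (1.0 - 2.0 * overscanPerEdge))
        return (w, h)
    }
    // safeRenderSize(width: 1920, height: 1080) -> (1785, 1004)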

rustystump 13 hours ago [-]
Read up on G-buffers and deferred rendering. Usually one doesn't do everything at full resolution until the final output, and even then it is often better these days to use fancy upscaling.

Many games do let users set the things you mention, but it is not always so simple. For example, handling rounded edges and notches is a huge pain.

blackguardx 12 hours ago [-]
I'm surprised this article focused solely on blurry rendering, when mouse pointer location in "fullscreen" Mac games is also commonly affected by this bug, or whatever we are calling it. You have to dive into an OS menu for each game to tell it to render fullscreen below the notch to fix it. It should be a global accessibility setting, but isn't for some reason.
losthobbies 2 hours ago [-]
Part of me decided to get the M3 MacBook Pro because I wouldn't be able to game on it so much - I needed something reliable for a Cybersecurity Masters, and I knew that if I got a powerful Windows laptop I would be tempted to play games.

Mine is a lovely machine for the most part, but it overheats a lot when gaming, so much so that it worries me. So I don't bother.

AndriyKunitsyn 14 hours ago [-]
Which says more about the size of the market for gaming on Mac. It's small, and that's unfortunate.
diath 14 hours ago [-]
It's actually really small: according to the Steam Hardware Survey, Macs are only 1.88% of Steam users, which is less than Linux, which is probably why most developers don't care.
zamadatix 13 hours ago [-]
Between the deprecation and stagnation of OpenGL on the platform, the removal of 32 bit support completely, the refusal to natively support Vulkan in favor of Metal, and the switch to ARM based systems... I can't believe it's still that "high".
benoau 12 hours ago [-]
Don't worry, they plan to gut Rosetta 2 so it will only support Mac games from the Intel era, that should help shrink that number!

> Rosetta was designed to make the transition to Apple silicon easier, and we plan to make it available for the next two major macOS releases – through macOS 27 – as a general-purpose tool for Intel apps to help developers complete the migration of their apps. Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.

https://developer.apple.com/documentation/apple-silicon/abou...

cosmic_cheese 11 hours ago [-]
Their compatibility layers being intended as exclusively transitional can be frustrating, but on the other hand it's also why nearly every Mac app now has a proper aarch64 build. If it were known that Rosetta was going to remain part of the OS forever, many apps would've never been ported and would've carried an effectively permanent 30-60% performance penalty and been a disproportionate drag on battery life.

On the other side of the fence in Windows land, adoption of features much more trivial than an architecture change is absolutely glacial, because devs know they can lean on backwards compatibility in perpetuity.

Apple’s approach is perhaps too aggressive but Microsoft’s is far too lax. I don’t think it’s realistic to dump a binary and expect it to work forever with no maintenance.

zamadatix 9 hours ago [-]
Even if Windows wanted to push hard to get off x86 they still lack ARM hardware worth moving to in the first place. I wouldn't be surprised if the 13" M1 MacBook alone shipped more units than all Windows 8/8.1/10/11 ARM devices combined. The 32 bit -> 64 bit app migration for x86 was also much slower on Windows but, again, there wasn't any real performance gain to be had until your app needed more than 4 GB of RAM in a single process, so no real pressure there either.

The moment Windows actually has non-x86 hardware which people actually want to buy I bet native app support comes pretty quick.

sofixa 5 hours ago [-]
> The moment Windows actually has non-x86 hardware which people actually want to buy I bet native app support comes pretty quick.

Snapdragon X laptops are here already, and reviews are generally positive (of course, use and workload dependent, nobody should get one to game, but for web browsing and long battery life it's perfect).

rarepostinlurkr 7 hours ago [-]
I don't follow how Rosetta 2 supporting Mac games from the x86 era is going to "shrink" that number.

macOS x86 games + macOS arm64 games == same number of games. What's the loss angle here?

troupo 7 hours ago [-]
> the refusal to natively support Vulkan in favor of Metal

1. Metal precedes Vulkan

2. Hardly anyone supports Vulkan https://www.carette.xyz/posts/state_of_vulkan_2024/

account42 4 hours ago [-]
This is of course horseshit. Vulkan was being developed before Metal and Apple was even part of that development - they only dropped out once they decided to do their own thing instead.

Vulkan is the primary rendering API on Android and Linux these days and is also well supported on Windows by GPU vendors. Applications that don't use it yet generally don't because they don't need it and still use OpenGL (ES) (or, for Windows applications, D3D instead).

troupo 3 hours ago [-]
> This is of course horseshit. Vulkan was being developed before Metal and Apple was even part of that development

It's fascinating that history is changing so fast now that even events of 10 years ago can be claimed to be something they weren't.

- 2013: AMD starts developing Mantle, and it is only available as part of AMD Catalyst, and only on Windows.

- 2013 (at least): Apple starts working on their own graphics API.

- June 2014: Metal 1.0 on iPhones is announced by Apple (which means it had already been in development much earlier; that's why I wrote "2013 at least" above)

- July 2014: Khronos starts work on Vulkan

- August 2014: The announcement of the Vulkan project and the call for participation.

- June 2015: Apple announced Metal on Macs.

- Sometime in 2015: AMD discontinues Mantle and donates it to Khronos.

- December 2015: Vulkan 1.0 specification announced by Khronos group

- February 2016: The full spec and SDK for Vulkan 1.0 released

So, reality: Metal had been released well over a year before Vulkan even had a specification.

> also well supported on Windows by GPU vendors.

Keyword: GPU vendors. Not by Microsoft.

> Vulkan is the primary rendering API on Android and Linux these days

But not on Macs, iPhones, Windows, XBox, and Playstation.

And yet, "omg why doesn't Apple support this late-to-the-scene, quite shitty API that it must support because we say it must".

egypturnash 13 hours ago [-]
I do my work on a Mac. I game on a game machine. Which right now is a Steam Deck. In the past it's been PS4, 360, Gamecube, PS2, etc.

I think the only game I've put serious time into on my Mac was Hades 1, which I pretty much finished before the console ports happened.

komali2 8 hours ago [-]
I went the other direction: moved away from mac, windows, and game consoles into a monster PC running Manjaro. Now I can invest all my money into one machine and see benefits in all aspects of my life: work, personal projects, gaming, video editing, photo editing. I just stream from it to other devices if I'm mobile.
vlovich123 12 hours ago [-]
Mac users might prefer to get their games through the default Mac App Store rather than Steam, which could plausibly distort the numbers.
add-sub-mul-div 13 hours ago [-]
It would be smaller than that overall because Steam stats are incomplete: they don't count all the Game Pass users. I haven't opened Steam in the six or so years I've been taking advantage of Microsoft providing a few hundred games for $10/month.
bluedino 2 hours ago [-]
When I was into pixel art games and 8 bit emulation, this caused me to stick with the MacBook Air instead of the Pro, because it had the traditional 1440x900 screen without funky scaling.
neuroelectron 2 hours ago [-]
I find it hilarious that everyone is forced to carry around Tim Apple's big gay Macs. You're going to love my dongles and notches. Put it in your backpack so the screen snaps like a cracker! Actually a keyboard that hardly has any travel is superior. Imagine the efficiency as your fingers delicately prance around the buttons. Now carry around a tiny purse for your wireless earphones, dear. Your wireless mouse has a penis and lays on its back! So cute.

Now strap this heavy glass and steel to your face and gesture in the air like a fruit; an Apple, if you will. Don't you feel creative? :D

Rob_Polding 4 hours ago [-]
It’s a good thing that most people don’t buy Macs for gaming, as they suck compared to any other platform. Anyone who expects a good experience is fooling themselves as they won’t get it! If you want to experience games as the developer intends, don’t play ports that use compatibility layers, just buy a console and play natively!
esperent 4 hours ago [-]
> If you want to experience games as the developer intends, don’t play ports that use compatibility layers

A few months ago I would have assumed you're correct.

But recently I've played some games on my laptop with Lutris on Ubuntu which uses Wine or Proton under the hood.

The performance and stability are excellent. Although I haven't done any testing, subjectively it feels superior, or at least equal, to Windows. I've played several intense titles that push my laptop's GPU (Nvidia 3070m) to its limit, most recently the new Indiana Jones game (which is excellent, FYI).

extraisland 3 hours ago [-]
It depends on the game, the distro, the version of the game, the kernel, and the windowing system as to how well it performs. I am using Debian, and while a lot of games work well, some games are an absolute PITA to set up properly.

It is easier for me to reboot into Windows.

liminal-dev 4 hours ago [-]
The Steam Deck runs games on a compatibility layer (Wine) and it’s an excellent console.
Toutouxc 4 hours ago [-]
I’m pretty sure that if a developer releases their game on macOS, it plays, by definition, as the developer intended.

I have both a powerful gaming PC with Windows and all that, and a PS5, but some games I just like on my MacBook.

extraisland 3 hours ago [-]
A lot of games are ported to Mac / Linux by a third party. The quality of that port is dependent on the company and the budget for porting it. If I were a game developer, I would never do a Mac port.
carstenhag 5 hours ago [-]
Meta: the name on HN was "Blurry game rendering on Mac" or something. Now we have the same title as in the article - clickbait-sounding. Why?
diebeforei485 12 hours ago [-]
How can this be fixed without breaking existing software? Re-ordering the list?
shermantanktop 12 hours ago [-]
They could do what Windows has done and build OS code that checks whether the running application is a known legacy game, then lies to the game about various capabilities so that the game runs well and looks good.

Not sure how much of that is still around, but it was rampant for many years and likely a key to Windows' success in gaming.

cesarb 11 hours ago [-]
> They could do what Windows has done, and build OS code that checks if the running application is a known-legacy game, and lie to the game about various capabilities so that the game runs well and looks good.

Or, even simpler (and AFAIK modern Windows does that too): if the running application doesn't say in its application manifest "I'm a modern application which understands the new things from operating system versions A, B, C and features X, Y, Z", you know it's a legacy application and you can activate the relevant compatibility shims.
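Roughly what that looks like in practice: a fragment of a Windows application manifest declaring OS awareness via supportedOS entries. Sketch only, from memory; the GUID is the widely published Windows 10/11 one:

    <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
      <compatibility xmlns="urn:schemas-microsoft-com:compatibility.v1">
        <application>
          <!-- "I understand Windows 10/11"; absent this, the loader
               may apply legacy compatibility behaviors -->
          <supportedOS Id="{8e0f7a12-bfb3-4fe8-b9a5-48fd50a15a9a}"/>
        </application>
      </compatibility>
    </assembly>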

account42 4 hours ago [-]
They already do that, linked from another comment here: https://developer.apple.com/documentation/bundleresources/in...

They just need to adapt the API to also filter the list of resolutions when the compatibility mode is on.
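Something like this sketch, presumably (using the public CGDisplayCopyAllDisplayModes; the aspect-ratio matching and the tolerance are my guess at what the filter should key on):

    import CoreGraphics

    // Sketch: when safe-area compatibility mode is on, only surface modes
    // whose aspect ratio matches the letterboxed safe area rather than
    // the full notched panel.
    func modesMatchingAspect(display: CGDirectDisplayID,
                             targetAspect: Double,
                             tolerance: Double = 0.01) -> [CGDisplayMode] {
        guard let all = CGDisplayCopyAllDisplayModes(display, nil)
                as? [CGDisplayMode] else { return [] }
        return all.filter { mode in
            let aspect = Double(mode.pixelWidth) / Double(mode.pixelHeight)
            return abs(aspect - targetAspect) <= tolerance
        }
    }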

sira04 9 hours ago [-]
League of Legends in borderless mode renders at the resolution that's set in BetterDisplay instead of the real one, which makes it very blurry. In fullscreen it renders correctly, so I'm forced to use fullscreen.
int0x29 10 hours ago [-]
The font this website uses is so thin that parts of a number of letters drop out.
rendaw 2 hours ago [-]
Yeah, IMO the tops on all the letters are too thin. Reading it I felt like I was reading something printed when the ink was running out...

(Upholding HN tradition here)

musicale 6 hours ago [-]
> The problem with Apple laptops is they have a notch at the top of the display

Well, yes.

eviks 9 hours ago [-]
> through NSScreen’s safeAreaInsets

How is vague "safety" better than a simple descriptive rect_below_notch?

zerocrates 8 hours ago [-]
"Safe area" is terminology that has a history of being used with displays: defining the portion of a TV screen where viewers could be reliably expected to be able to see content and on-screen graphics without them getting clipped off by the overscan.

I assume they also wanted to choose something that would still be named appropriately if they had other kinds of notches or things in the future (and that even now it already accounts for, say, the curved corners where things also can't display).

eviks 8 hours ago [-]
Since "no-one" knows the history, the design go back to prioritizing the basics of comprehension
jama211 6 hours ago [-]
Loads of people know it as “safe area”, just because you hadn’t heard it before doesn’t mean it’s not an incredibly common term… I was shocked you hadn’t heard of it. It’s consistent across the industry.
eviks 6 hours ago [-]
I've heard of that term before; just because you don't have an actual argument, there is no need to make up ignorance in others and feign shock. You're also mistaking two different types of knowledge: what the term means versus its "TV screen" history. I was explicitly referencing the latter.
interpol_p 8 hours ago [-]
Safe area can account for things that are not just a notch. It's used across Apple platforms to indicate anything that might need to occupy a dedicated region on the screen: notch on iPhones, home indicator, iPadOS traffic light buttons, menu bar, curved edges of displays, and so on

Your container views can extend the safe areas for their children as well. In our apps, which allow users to run their own custom projects, we increase the safe area for our UI so that users can avoid it in their own rendering

Safe area is a fairly neat and functional API. The unfortunate thing is the older `CGDisplayCopyAllDisplayModes` API is just lumping all resolutions together
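The "extend for children" part is the additionalSafeAreaInsets API; a minimal sketch of the pattern (the class name and the 60pt figure are made up):

    import UIKit

    // Sketch: a container reserves 60pt of its own chrome at the top, so
    // embedded user content sees a correspondingly larger safe area.
    final class ProjectHostViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            // Children now see safeAreaInsets.top grow by the toolbar height.
            additionalSafeAreaInsets.top += 60
        }
    }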

eviks 8 hours ago [-]
Even then, you can still have something descriptive like unobstructed_area for more general cases. "Safety" is too generic.

But also, you don't need to degrade the Mac dev experience for games in order to accommodate iPads the games will not be developed on; aliases exist.

dkiebd 4 hours ago [-]
How is unobstructed area, a term you just made up, better than safe area, a term that has been standard for decades?
eviks 4 hours ago [-]
That's obvious: the descriptive term is obvious in what it represents on your screen, so it has low cognitive overhead for all the new generations of people learning these APIs. Encoding bad practices for decades, on the other hand, just reflects the pains of the old generations.
zamadatix 14 hours ago [-]
Also consider setting NSPrefersDisplaySafeAreaCompatibilityMode and just leaving control over self-letterboxing to a toggle in the settings (with whatever default you prefer).
nottorp 3 hours ago [-]
Hmm, I just realized something. The article author seems to be unaware that Apple makes desktops too, and that laptops can be connected to external monitors, so not every game will be played on a screen with a notch.
MBCook 13 hours ago [-]
Interesting article, but I think the demonstration image isn’t doing its job. Neither side really looks good to me. They both look roughly the same.
bschwindHN 11 hours ago [-]
Right? If it's vertically squashed, then at least draw the dividing line vertically so we can see a better difference! But yeah, both images seem like a janky rendering of an antialiased circle.
munificent 11 hours ago [-]
The bottom left side is showing the circle with the artist-created rough texture that it's supposed to have. The top right shows that that edgy-but-sharp texture has been softened by anti-aliasing.
Tepix 4 hours ago [-]
Yeah but the difference isn't as stark as one would expect really.
thadk 13 hours ago [-]
How's Factorio?
chainwax 12 hours ago [-]
I played all the way through Space Age on an M1 MacBook Air, no problem.
TylerE 9 hours ago [-]
Factorio has an excellent native ARM port. It's a model citizen.
mushufasa 14 hours ago [-]
Interesting -- I ran into this recently playing Baldur's Gate 3 and was curious about the technical details. My fix was that I had an external monitor, and I just reset the resolution to match it. (By default, though, the monitor was showing up blurry, with the wrong aspect ratio.)
freehorse 11 hours ago [-]
Baldur's Gate 3 groups the resolutions by aspect ratio. I assume it queries the resolutions, computes the aspect ratios, and then displays them grouped by those. This results in some weirdness, with odd ratios with huge coprime numerators/denominators, but as long as the right resolution is detected somewhere, all you basically need to do is select the right aspect ratio for your screen.
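That would explain the weird buckets. A sketch of (what I assume is) the grouping, which also shows how 1366x768 ends up labeled 683:384:

    // Sketch: group modes by exact reduced ratio, which is presumably how
    // 1366x768 lands in a lonely "683:384" bucket instead of "~16:9".
    func aspectLabel(_ width: Int, _ height: Int) -> String {
        func gcd(_ a: Int, _ b: Int) -> Int { b == 0 ? a : gcd(b, a % b) }
        let d = gcd(width, height)
        return "\(width / d):\(height / d)"
    }

    let modes = [(2560, 1440), (1920, 1080), (1366, 768)]
    let grouped = Dictionary(grouping: modes) { aspectLabel($0.0, $0.1) }
    // ["16:9": [(2560, 1440), (1920, 1080)], "683:384": [(1366, 768)]]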
TheJoeMan 13 hours ago [-]
I’m disappointed none of the proposed fixes are for CGDisplayCopyAllDisplayModes to have the “first” option on the list be the BEST option, taking into account the notch. The author hinted that many games pick the first option, so rather than demanding all those publishers add new code, Apple could make the easy path the happy path.
andrewmcwatters 14 hours ago [-]
Yes!

I remember first implementing this in Planimeter Game Engine 2D, we got a massive resolution list from SDL (through LÖVE, which is what we're built on).

If I remember correctly, we filtered the list ourselves by allowing users to explicitly select supported display ratios first, then showing the narrowed list from there. Not great. Technically there's a 683:384 ratio in there.[1]

But it did enough of the job that users who knew what resolution they wanted to pick in the first place didn't have to scroll a gargantuan list!

[1]: https://github.com/Planimeter/game-engine-2d/blob/v9.0.1/eng...

chris_wot 12 hours ago [-]
In typical Apple fashion, they have ignored the feedback for several years. Nice work, Apple!
everyone 3 hours ago [-]
Notches are dumb as shit. E.g. when making a cross-platform mobile app, you just have to leave a portion of the screen at the top completely empty, because various different phones could have a notch or a front-facing camera in any position up there. On mobile, where screen real estate is tight, we just have to give up a really important fraction of the screen because of this madness of non-rectilinear screens.
nixpulvis 12 hours ago [-]
The notch is such a wildly stupid idea I can't even begin with it. I actually kinda liked the direction they were going with the keyboard Touch Bar... but they killed that.
emmelaich 12 hours ago [-]
What's the alternative? A camera on top of the screen? Fragile, unless you could fold it away. Which would be another point of failure.
nixpulvis 2 hours ago [-]
The alternative is what literally every other laptop does: just have a rectangular screen. Put the camera outside, please and thank you.

iPhones have the same problem. Ever played a fullscreen game and had part of the UI cut off by the notch or island? Yep, it's a problem.

account42 4 hours ago [-]
Might be unthinkable to some, but an alternative could also be to have NO camera. People already have a phone with one if they really need it.
bigstrat2003 10 hours ago [-]
A bezel would be far preferable to a notch.
msh 5 hours ago [-]
Why? Unless I am running a fullscreen app, the notch gives me more screen space. Remember that the Mac has the menu bar at the top of the screen at all times, so that space would not be available to normal windowed apps anyway.
nixpulvis 2 hours ago [-]
I hide my menubar by default. I like using my space for stuff I'm actively looking at.

I was watching someone code in vim on a new Mac the other day, and the notch covered part of the first line of their editor! How does this not completely break you as a human? Maybe I'm a little OCD, idk.

Isn't there just like an option to change the resolution and fix this? Apple should really have a "disable notch" mode.

wahnfrieden 8 hours ago [-]
No it wouldn’t be
wahnfrieden 12 hours ago [-]
No it's not
frumplestlatz 12 hours ago [-]
The notch grants more usable vertical physical pixels than you'd have otherwise, with the only "downside" being a small unusable area in the center where the camera/mic/other sensors are placed.

I put "downside" in quotes because the alternative is just not having those vertical pixels at all, and having the screen end below the sensors.

nixpulvis 2 hours ago [-]
The problem is, by granting more vertical space in such a way that requires additional application logic to avoid overlaps, you've taken a sacred abstraction (the rectangular screen) and shattered it.

Most applications do not handle this well, and I'm willing to bet that will continue to be the case for the foreseeable future.

I can see the allure. If you're a default macOS user with the menu bar not set to hide when not in use, that space fits nicely. The problem still remains for fullscreen apps, though.

And it's not even really a hardware problem. Apple could fix this in software. Just make using the unsafe region opt-in for developers instead of the default. You want to paint in the unsafe region? Call NSAllowUnsafeRegionResolution(). Boom, done. By default, when the menu bar isn't shown, expose a rectangular viewport.

internetter 10 hours ago [-]
> with the only "downside" being a small unusable area in the center where the camera/mic/other sensors are placed.

The notch is wider than need be. As far as I can tell the main reason is aesthetics. But I agree, it is better than nothing

Notch: Camera, ambient light sensor, and a camera status LED

Dynamic island: camera, infrared camera and flood illuminator, proximity sensor, and an ambient light sensor... also half the width and height

JelteF 4 hours ago [-]
It's kinda funny that an article complaining about blurry rendering is written in one of the most illegible fonts I've ever seen.