
Same here, it's a freaky experience. I run mostly from the browser but have a couple of things going at the same time. One thing I noticed is the blazing-fast typing on the Mac. There is an ever-so-small lag on Windows, barely perceptible when using it. But when you type on the Mac you notice the difference, and the brain just feels good using it. I have no idea if this is a real thing, but it's similar with scrolling: you touch the trackpad and things move, independent of the load I'm running. Now that many applications are becoming M1-compatible, I'm thinking of replacing my Windows box with a Mini plus two external hard drives. I can't imagine what will happen when they release a new processor. This one does more than enough.


One of the reasons iOS feels smoother than Android is that the render loop of the OS is decoupled from the app. Apps aren't allowed to introduce jank, so if you're scrolling a webpage while content is loading, iOS will be way smoother. I think this is also why they can achieve such low input latency, for example with the Apple Pencil, which has much lower latency than the Surface Pen or Android styluses. I had a 120Hz Android phone for over a year, and while the frame rate when scrolling is slightly worse on iOS, overall the OS feels more fluid to me. On a 120Hz iPad there's no comparison.

I am speculating here as I don't know for sure, but I remember iOS started as a derivative of OS X, so this may be the case for macOS as well. So I think it's not your imagination; it's a different input and render architecture than Windows or Android.


Android has had a separate "render thread" for ages, though it's per app and runs in the app process. Some animations run on it, but many do not. You can't access it directly from within the app.

The thing macOS on M1 does very cleverly is scheduling threads on cores. IIRC all UI threads run on high-power cores, while all background tasks run on low-power ones. So they never interfere with each other. iOS probably does the same; Android probably does none of this.
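On Linux you can approximate this kind of big/little placement by hand with CPU affinity. A minimal Python sketch, assuming a hypothetical core layout where cores 0-3 are the high-power cluster (the real numbering is SoC-specific, so treat `PERF_CORES` as an illustration, not a fact about your machine):

```python
import os

# Hypothetical layout: cores 0-3 are the high-performance cluster.
# Real big.LITTLE core numbering varies by SoC -- check
# /sys/devices/system/cpu/ on your device before relying on this.
PERF_CORES = {0, 1, 2, 3}

def pin_to_perf_cores():
    """Restrict this process to the assumed performance cores,
    falling back gracefully where affinity isn't supported."""
    try:
        os.sched_setaffinity(0, PERF_CORES)  # 0 = current process
    except (AttributeError, OSError):
        return None  # e.g. macOS has no sched_setaffinity
    return os.sched_getaffinity(0)

print("running on cores:", pin_to_perf_cores())
```

This is only a user-space approximation of what the kernel's scheduler does automatically with energy-aware scheduling; the point is that keeping interactive threads off the slow cores is a placement decision, not magic.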


> The thing macOS on M1 does very cleverly is scheduling threads on cores. IIRC all UI threads run on high-power cores, while all background tasks run on low-power ones. So they never interfere with each other. iOS probably does the same; Android probably does none of this.

This feature has been in the Linux kernel for ages[1]. Android and ChromeOS are based on the Linux kernel, and have had this feature for quite some time. This is nothing new.

[1] https://community.arm.com/developer/ip-products/processors/b...


So why does Linux UX feel slower?


Because in many cases, it is. Unfortunately, there isn't as much communication between the GUI people and the kernel people in the Linux community as there is between those same groups at Apple Inc. Not to mention, there are multiple competing groups of GUI people in the Linux community making coordination across these levels difficult. Also, there are many competing interests working on the kernel who might oppose kernel-level optimizations which favor desktop usage at the cost of, for example, server usage. As a result of these and many other factors, Linux's desktop UX remains far less optimized when compared to macOS's desktop UX.

As with most of Linux's rough edges, however, this is trivially fixable if you're technical enough. Of course, that's exactly macOS's advantage. If you want the best experience on macOS, you don't need to be technical.

Personally, I run Linux with a tiling window manager and lots of custom keybindings and little scripts. Having used the default macOS experience a few times at work (on machines far more powerful than my dinky laptop), I can assure you that my highly customized setup feels far more responsive. On the flip side, it required a lot of up-front investment to get it to this point.


I never felt a UI responsiveness difference between Linux and macOS, but Windows (including freshly installed on powerful many-core machines) is a different story. The number one reason I ever switched from Windows to Linux is the latter always feeling way more swift: UI responsiveness always remaining perfect, and some background tasks also working much faster. And I never actually used lightweight WMs, only KDE, GNOME and XFCE. The first time I noticed some slowness in Linux was on a Raspberry Pi 4, with the default LXDE.


This.

I think the main advantage with macOS is that it's above all else a system designed to be used interactively, as opposed to a server, so they don't have to put out a configuration "good enough for most things".

I also run Linux with a tiling window manager on an old machine (3rd gen i7), and it flies. One thing that made a huge difference in perceived latency for me was switching the generic kernel with one having Con Kolivas' patch set [0].

I'm using Arch and sign my own EFI binaries, so I don't care about out of tree patches, but for Ubuntu users and similar who don't want to mess with this, there's an official `linux-lowlatency` package which helps [1].

---

[0] https://wiki.archlinux.org/title/Linux-ck

[1] https://packages.ubuntu.com/search?keywords=linux-lowlatency...


X-Windows: …A mistake carried out to perfection.
X-Windows: …Dissatisfaction guaranteed.
X-Windows: …Don't get frustrated without it.
X-Windows: …Even your dog won't like it.
X-Windows: …Flaky and built to stay that way.
X-Windows: …Complex non-solutions to simple non-problems.
X-Windows: …Flawed beyond belief.
X-Windows: …Form follows malfunction.
X-Windows: …Garbage at your fingertips.
X-Windows: …Ignorance is our most important resource.
X-Windows: …It could be worse, but it'll take time.
X-Windows: …It could happen to you.
X-Windows: …Japan's secret weapon.
X-Windows: …Let it get in your way.
X-Windows: …Live the nightmare.
X-Windows: …More than enough rope.
X-Windows: …Never had it, never will.
X-Windows: …No hardware is safe.
X-Windows: …Power tools for power fools.
X-Windows: …Putting new limits on productivity.
X-Windows: …Simplicity made complex.
X-Windows: …The cutting edge of obsolescence.
X-Windows: …The art of incompetence.
X-Windows: …The defacto substandard.
X-Windows: …The first fully modular software disaster.
X-Windows: …The joke that kills.
X-Windows: …The problem for your problem.
X-Windows: …There's got to be a better way.
X-Windows: …Warn your friends about it.
X-Windows: …You'd better sit down.
X-Windows: …You'll envy the dead.

https://donhopkins.medium.com/the-x-windows-disaster-128d398...


> like Sun’s Open Look clock tool, which gobbles up 1.4 megabytes of real memory!

It's funny to read this in an era when smartphones come with 6 GB of RAM to compensate for developers' laziness and unprofessionalism.


Almost all major distributions use Wayland as of today.


Move on, it's 2021...


I use Plasma Desktop and it's been more responsive than macOS was, so I don't know.


On the exact same hardware Linux always felt much faster to me. I remember doing stuff like resizing Finder windows vs. KDE's Dolphin: Finder would be all janky and laggy, and KDE wouldn't miss a frame.



Yeah, my go-to for speeding up Macs is to throw Linux on them.


>the render loop of the OS is decoupled from the app

Can you elaborate on this? If you do too much work on the main thread in iOS, it's going to hang the UI. Isn't the main thread the "render thread"? Do scroll views have some kind of special escape hatch to get off the main thread for continuing to scroll if the main thread is blocked with loading the content?


I believe the point is that the "main thread" of your iOS application is not the main thread of the OS. They're totally decoupled.


Err, same with Android? And every OS ever. That's just standard process isolation. Or am I misunderstanding something?


There are some deep-dive articles on the way the input loop works. But OP is correct, and that's the reason iOS feels smoother. Android has a lot more UI lag.


I don't think Apple has any special tricks for the input loop.

Some Android phones really do have input lag, but it is not caused by CPU load. For example, on my phone, there is approximately 100-150 ms of lag between tapping the screen and registering the touch. The lag is not caused by CPU load, but by a slow touch sensor.

I don't think Apple has any smart code-optimization tricks. Either they have a faster touch sensor, or some people just believe that if something is made by Apple then it is of better quality.

Here is a comparison of input lag in games on iOS and Android [1], and it shows similar results: 80 ms between a tap and the reaction on screen.

[1] https://blog.gamebench.net/touch-latency-benchmarks-iphone-x...


They do! See https://devstreaming-cdn.apple.com/videos/wwdc/2015/233l9q8h... and the corresponding WWDC session on them minimizing the input-to-display time.


This is off-topic, but I love that the title of the slides is "233_Advanced Touch Input on_iOS_04_Final_Final_VERY_Final_D_DF.key".


I wonder why somebody working at Apple wouldn't just use git for that?


<laughs in Windows 3.1>


UIView animations do not run on the main thread, and will not be blocked if the main thread is blocked. This does help a bit with keeping the OS feeling smooth, but it is far from the only reason.


This is not entirely accurate. iOS indeed uses a client-server UI model, kind of similar to X11. Along with submitting “widget” hierarchy updates, it also supports submitting “animations”. The downside is that the animation states are not truly accessible by the actual app after it submits them.

The scrolling animation is 99.9% of the time implemented as a client-side animation timer submitting non-animated hierarchy updates to the server. It’s common to have janky scrolling.


> Along with submitting “widget” hierarchy updates, it also supports submitting “animations”.

Is that how all these third-party iOS apps have completely consistent "pop" animations on long press?


No, that would be OS-provided GUI components (or sometimes manual reimplementation), similar to how most win32 apps had the same right-click menu behavior.


Off-topic: I see the lack of standardized OS-provided GUI components in Unix/Linux distros as the main root cause of low Linux adoption. I'm assuming there isn't such a thing, since I haven't ever been able to notice any consistency in the GUI of any Linux distro and/or any GUI app that runs on Linux :P But I may be totally wrong.

On-topic: they should build an M1 app that simulates loud coil whine at low-CPU usage, so you can feel like you're using a Dell XPS.


Well, there are common components, it's just that there are multiple standards (Qt, GTK, wx, etc)...

I'm using a tiling window manager with mostly GTK apps, so pretty much all menus and such look the same. The worst offenders are Firefox and IntelliJ, although they have improved a bit lately.

However, I'm not sure that this is the reason for lack of adoption. Windows has always been a patchwork of interface design, every other app having their own special window decorations and behavior, including MS' own apps (thinking of Office in particular). Also, seemingly similar elements, such as text inputs, behave differently in different places. For example ctrl-backspace sometimes deletes the previous word, sometimes it introduces random characters.


The unique thing about this is that it takes a widget from the app, and expands it out while blurring the rest of the screen. So it’s not just an OS gui component.


Blurring happens inside the UI server process. Here is a related technique in macOS: https://avaidyam.github.io/2018/02/17/CAPluginLayer_CABackdr...

Basically, it's like an iframe — you declare what contents this portion of screen should have, and the render server composes it onto the screen. The render server is shared between all apps (just like in X11), so it can compose different apps with dependent effects (like blur). Apps don't have direct access to the render server's context, just like a webpage doesn't have any way to extract data from a foreign iframe.


I wonder what made Apple give the iPad Pro such an awful screen, though, considering all the software optimizations that they do. I got the M1 iPad a month ago, and the screen has absolutely horrendous ghosting issues. What is this, an LCD from 2001? Just open the Settings app and quickly scroll the options up and down: white text on the black background leaves such a visible smudge. It bothers me massively when scrolling through websites or apps in dark mode; it honestly doesn't feel like an Apple product to me. I haven't seen this issue on any screen in the past 10 years, and here this brand-new (and very expensive) iPad with a 120Hz screen has something that looks like 50-100ms gray-to-gray refresh time.


This is a very interesting anecdote, considering the M1 iPad Pro is supposed to have one of the best screens available on any device of that form factor, with the same XDR display technology as their $5000+ monsters. Every reviewer has mentioned the display as the selling point. I have looked at them in person and have been debating buying one, but next time I'm at an Apple Store I'll want to see if I can replicate what you're seeing.

You might be experiencing the mini-LED effect where the back lighting is regionalized, which isn’t ghosting but can be noticeable.


It's on the 11" model, so definitely not a mini-LED issue.


Edit: the parent commenter does not have a miniLED iPad.

If you have the 12.9” M1 iPad, the ghosting is likely due to the new miniLED backlight which was introduced with that model.

This backlight is not static, but tracks to content in order to increase effective contrast (similar to FALD backlights on higher end LCD TVs). If the backlight does not track the LCD content fast enough, there can be ghosting.

In addition, since the LEDs are large compared to pixels, you can sometimes see haloing, particularly in small bits of white on a black background.

Overall, while the display is great for video, especially HDR video, it has some problems with UI and (white on black) text.


It's on the 11" model, so regular old LCD model, not the new fancy microled.


No issues on my iPad; you might want to take it to Apple.


I’m guessing it must be an issue with those new miniLED screens, or some other significant generational problem. I have two older 12.9” iPad Pros, one 1st generation and one 2nd generation (the first with the 120hz display). They are both excellent displays with no such issues with motion or ghosting.


I’m worried the upcoming upgraded MBP will only have this option. Although I read they released an update that should minimize this issue. Have you tried that?


To this note, I've noticed huge discrepancies in display quality in iPads. They were passed out in my last year of high school, and each one even appeared to have slightly different color grading. The whole thing was pretty funny to me, especially since I have no idea how artists would be able to trust a display like that.


The display on the mini-LED iPad Pro is Apple's first more-or-less own display. The panel itself is LG's, I believe.


Recently I visited the Apple store and compared the latest 11 inch iPad Air (A14 processor) and iPad Pro (M1 processor) models side-by-side. I loaded up Apple’s website in Safari and scrolled. The Pro is noticeably butter-smooth, while the Air stutters. My iPhone also stutters in the same way, but it’s never bothered me before. It’s only after looking at the performance of the M1-driven iPad Pro that I knew to look for it. And I previously had a similar experience noticing how much smoother my iPhone was than my old Android phones!

I don’t know for sure the processor is the difference, this is just a report of my observations.


That's not because of the processor. It's because the newest iPad Pro has a 120hz display and the other devices that you were comparing to have a 60hz display.


While 60Hz screen refreshes are less smooth than 120Hz, the small (but noticeable) difference due to refresh rates wouldn't correctly be described as "stutter".

The M1 processor makes a real difference. I have both the M1 and the most recent non-M1 iPad Pros.


Yeah you really don’t notice 60Hz scrolling until you try 120Hz scrolling. It’s a bit like you didn’t notice Retina displays until you tried one then looked at a pre-retina display. It’s crazy how you adapt to perceive things once you see something better/different.


Maybe I should hold off on the iPad Pro until I can get 120Hz on all my devices.


This is not really true, at least in any sense that really differs from other OSes. And, if you watch closely, you will notice that poorly-behaving apps running in the background can and will introduce jank in the foreground process. Since iOS lacks good performance monitoring, this (along with battery consumption) has historically been the easiest way to figure out if an app is spinning when it shouldn't be.


I'm sorry, I have a Samsung Galaxy S20 Ultra with 120Hz. Using my girlfriend's iPhone 12 makes me dizzy.

120hz is something you don't notice much when you enable it, but you definitely notice when it's gone.


Typing lag is such a sad result of all our modern computing abstractions.

https://www.extremetech.com/computing/261148-modern-computer...


Hm. I've always thought it was more of a result of our current display technology? Digital displays buffer an entire frame before they display it. Sometimes several frames. And the refresh rate is usually 60 Hz so each buffered frame adds a delay of 16 ms. CRTs on the other hand have basically zero latency because the signal coming in directly controls the intensity of the beam as it draws the picture.

Anyway, is it any better on displays that have a higher refresh rate? I feel like it should make a substantial difference.
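The parent's arithmetic checks out, and it's easy to put numbers on the buffering penalty. A quick sketch (the frame counts are illustrative, not measured from any particular display):

```python
def buffer_latency_ms(refresh_hz, buffered_frames):
    """Each frame buffered in the display pipeline adds one full
    refresh interval of delay before a pixel change becomes visible."""
    return buffered_frames * 1000.0 / refresh_hz

# One buffered frame at 60 Hz costs ~16.7 ms ...
print(round(buffer_latency_ms(60, 1), 1))   # ~16.7
# ... while the same single frame of buffering at 120 Hz costs half that.
print(round(buffer_latency_ms(120, 1), 1))  # ~8.3
# "Sometimes several frames": three frames at 60 Hz is already 50 ms.
print(buffer_latency_ms(60, 3))             # 50.0
```

So yes, a higher refresh rate shrinks every buffered stage proportionally, which is part of why 120Hz panels feel so much more immediate even when the pipeline depth is unchanged.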


CRTs are potentially worse. It takes the electron beam 16 ms to paint the screen. If the electron beam is halfway down the screen and you change a pixel right above where the beam just painted, you'll have to wait 16 ms before you see anything change.

All CRT displays attached to computers in the last 40 years were driven from memory buffers just like LCDs, and those buffers were typically only allowed to change while the electron beam was "off", i.e. moving from the bottom of the screen back to the top. Letting the buffer change while the beam is writing results in "tearing" the image, which was usually considered a bad thing.


> CRTs are potentially worse.

Video game aficionados would like to have a word with you:

https://www.wired.com/story/crt-tube-tv-hot-gaming-tech-retr...

To be fair, much of this is the color and shape rendering, where pixel art had been tailored for CRTs.

Twitchy gamers do swear by "zero input lag" but are perhaps just nostalgic; the difference is likely to be 8ms vs. 10ms:

“Using the industry-standard definition of input lag, 60Hz CRTs don't have 0ms input lag. 60Hz CRTs have 8.3ms of input lag…”

https://www.resetera.com/threads/crts-have-8-3ms-of-input-la...
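That 8.3 ms figure is just the average scanout position: on a 60 Hz raster, a given pixel waits anywhere from zero to one full frame for the beam to come around, so half a frame on average. A quick check of the arithmetic:

```python
def crt_scanout_lag_ms(refresh_hz):
    """Average wait for the beam to reach a given pixel: half of one
    refresh interval (the worst case is a full frame, the best is ~0)."""
    frame_ms = 1000.0 / refresh_hz
    return frame_ms / 2

print(round(crt_scanout_lag_ms(60), 1))  # ~8.3, matching the quoted figure
```

By the same definition, the faster 85-110 Hz CRTs people remember fondly averaged closer to 5-6 ms, which is where some of the "zero lag" reputation comes from.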


As you said that article seemed to be more about the appearance of objects on a CRT than lag, and I kind of agree with the nostalgia crowd in that respect. But [raster] CRT lag is always going to be 16ms (worst case) and will never be better, while LCDs can in theory run much faster as technology improves.

If we shift the discussion to vector CRTs (which have no pixels) such as the one the old Tempest [0] game used, the CRT has a major advantage over an LCD and the lag can in principle be whatever the application programmer wants it to be. I miss vector games and there's really no way to duplicate their "feel" with LCDs.

[0] https://en.m.wikipedia.org/wiki/Tempest_(video_game)


> CRTs are potentially worse. It takes the electron beam 16 ms to paint the screen.

Back when I had CRTs, 60Hz displays were the older, less-common, cheapo option. I'm having a hard time remembering a CRT display that wasn't at least 75Hz (I believe this was the VESA standard for the minimum to be flicker-free), but most of the monitors I used had refresh rates in the 80-90Hz range. I remember a beautiful higher-end CRT that had a refresh rate around 110Hz.

85Hz gives you a frame time of 11ms, which doesn't sound much better, but is a 30% improvement over 16ms.


Before multi-sync CRTs and SVGA, 60Hz was not the "cheapo" option.


I don't think you can get a display slower than a TV, and they do in fact update at ~60Hz (or 50Hz, depending on region). Of course you're probably only getting VGA, 240p, or less in terms of pixels.


> CRTs are potentially worse. It takes the electron beam 16 ms to paint the screen. If the electron beam is halfway down the screen and you change a pixel right above where the beam just painted, you'll have to wait 16 ms before you see anything change.

This is exactly the same as LCDs though, no? LCDs are also drawing an entire frame at a time, they're not "random access" for lack of a better term. There's just typically no image processing going on with a CRT* though, so there's no inherent latency beyond the speed of the electron gun and the speed of light.

*I'm aware there were some later "digital" CRT models that did analog->digital conversion, followed by some digital signal processing on the image, then digital->analog conversion to drive the gun.


I don't think that LCDs buffer anything. I've experienced screen tearing, which should not happen with buffering. Most applications implement some kind of vsync, which does indeed introduce buffering and related delays.

Best option is to use adaptive sync and get rid of vsync. But support for this technology is surprisingly not mature, it works mostly in games.


Screen tearing happens when the software swaps buffers on the GPU while the GPU is in the middle of reading out a buffer to send to the monitor. That tearing has nothing at all to do with whatever buffering is or isn't happening in the display itself, because the discontinuity is actually present in the data stream sent to the display.


See also this talk by John Carmack about buffering problems in input and output stacks.

https://www.youtube.com/watch?v=lHLpKzUxjGk


Sounds broken? Return it.


You have what I'd call latency sensitivity, or jank sensitivity. The most important thing in these environments is actually not how fast they are (or how low the latency is), but the latency being consistent. With Windows and Android there is micro-jank everywhere. I know because, while most people don't notice, I feel a small pin-prick of pain on the right back of my head every time it happens.


Depending on what you're coming from, it could just be that you haven't been using modern hardware. I agree some is faster than others. I notice a difference going from my Ryzen 5900X to my i9-11900K; typing and everything is definitely smoother on the latter machine. Part of it is just software advantages: Apple designed the compiler and built the system for the M1, while on Windows and Linux, Intel has a boatload of engineers working on their behalf.

The single core performance of my 11900K is really hard to beat at 5.3GHz, and with 7 of 8 cores hitting that frequency simultaneously (which it hits all the time in desktop use) it's just buttery smooth. My guess is that you just came from an older system. The advances in CPUs (along with good compiler programmers working on that architecture's behalf) in my experience are larger than reviews and colloquial recommendations would suggest.


Just FYI from a person who's used Macs since December 1984 and Windows since DOS, the instant UI responsiveness of Macs (when using any input device), regardless of load, was ALWAYS a feature, even going back to the very first Mac. I always noticed the lag on Windows and couldn't believe it didn't bother anyone... which of course it didn't, because it's all most people knew!




