You're basically saying "let's consider reinventing the internet", which I think is not reasonable or a good idea.
I think one problem with the comments on this post is that they don't emphasize enough that the hit is to system-call-heavy workloads. This is NOT a global slowdown of all compute. The hit is to system call performance, and it's big, no question, but I don't think we need to throw out the internet's structure in order to address that.
One option is to make fewer system calls. For a program that is basically scanning a file system that might be harder to do today, but for tons of other programs it isn't, and with io_uring we have a very reasonable escape hatch for optimizing syscall-heavy workloads.
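To make the "fewer system calls" point concrete, here's a toy sketch in Python (not io_uring itself, just the underlying idea): the same 1 MiB file read with a small buffer versus one large buffer, counting how many read(2) calls each approach issues. The file size and buffer sizes are arbitrary choices for illustration.

```python
import os
import tempfile

# Create a 1 MiB scratch file to read back.
data = b"x" * (1 << 20)
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(data)
    path = f.name

def count_reads(path, bufsize):
    """Read the whole file via os.read, counting read(2) calls issued."""
    fd = os.open(path, os.O_RDONLY)
    calls = 0
    try:
        while True:
            chunk = os.read(fd, bufsize)
            calls += 1  # every os.read is one syscall, including the EOF read
            if not chunk:
                break
    finally:
        os.close(fd)
    return calls

small = count_reads(path, 4096)     # 4 KiB buffer: 256 reads + 1 EOF read
large = count_reads(path, 1 << 20)  # 1 MiB buffer: 1 read + 1 EOF read
os.unlink(path)
print(small, large)
```

When syscalls get more expensive, the gap between those two numbers is exactly the kind of overhead you can engineer away, and io_uring pushes further in the same direction by letting you submit many operations with a single syscall (or none, with polling).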
This might sound silly, but this sort of thing happens constantly - programs are optimized around assumptions about what's fast and what's slow. For example, if you built a program in the 2000s you'd do everything you could to avoid the disk, aggressively caching. In 2020 disks are insanely fast, and the cost of that caching can be worse than just optimizing your disk usage.
I don't know about the average website, but from what I can tell most sites that aren't total shit are pretty 'clean'. With an adblocker, especially so.
> running code you can't trust
We do that already. There are literally dozens of mitigations in place based on that assumption.
> You're basically saying "let's consider reinventing the internet", which I think is not reasonable or a good idea.
It's absolutely reasonable to require the user to give a web page permission before it starts executing powerful, potentially dangerous code on their computer. Also, JavaScript is not "the internet", nor is it even really "the web" - which for most of its history was relatively benign HTML documents that you read - and things have frankly been on a downward trajectory in terms of safety and usability since we started changing that.
Powerful JS is a relatively recent phenomenon in the history of the web, and it exists largely under the theory that we can do it safely with correct design. As someone who has to moderate code on untrusted web sites as part of my work, I can just say it's not working out very well, even leaving aside stuff like Spectre/Meltdown.
I don't think this would solve anything. This is like putting Word macros behind a "do you want to run this macro?". Yes, yes they do want to run it.
If people were willing to put every webpage behind a "do you want to run this webpage" we'd see them all using noscript already - virtually no one wants that experience.
> largely under the theory that we can do it safely with correct design
I blame operating systems and hardware vendors tbh. It's their job to make this safe, and they do a pretty bad job of it. Browser vendors have had to pick up a massive amount of slack to try to compensate, to the extent that browser teams have to make major patches to the Linux kernel.