>> My nightmare scenario is that the Web grows into a rich application platform in an operating system-neutral way, and then a company like Siemens or Matsushita comes out with a $500 “WebMachine” that attaches to a TV. This WebMachine will let the customer do all the cool Internet stuff, plus manage home finances (all the storage is at the server side), and play games. When faced with the choice between a $500 box (RISC CPU, 4-8Mb RAM, no hard disk, …) and a $2K Pentium/P6 Windows machine, the 2/3rds of homes that don’t have a PC may find the $500 machine pretty attractive!
> You have to remember that in democratic societies citizens talking with each other is very important. We've lost a lot of that with the mass media. Now we have an opportunity for citizens to create their own communications with each other. So when these big deals with the big companies and the big governments carve up this new territory, I feel it's very important that we keep a kind of "social green belt", that we keep the ability for citizens to talk amongst each other.
A lot of thin clients came out in the 1990s and they were garbage. Web browsers are ridiculously inefficient (10x or more the cycles to draw a similar UI), yet thin clients had something like 1/10th the performance and 1/4th the RAM, so net performance would be 20x-100x worse than a PC for 1/4th the price.
WebTV was kind of an exception to this; it was very fast due to the split browser architecture and putting very little content on the screen. IMO their amazing engineering was wasted on the low end of the market.
VNC/Sun Ray was a much better idea, but only within a LAN.
Web appliances are a far cry from tablets and smartphones. Those are both basically the exact opposite of a web appliance: they're network-connected and can store data remotely, but they have significant client-side power. They also have significant local storage, at least compared to web appliances.
The old web appliance concept was an extremely low powered device with no local storage. It had just enough power and storage to run a browser. The concept is more like a Chromebook/Chrometop than a tablet or smartphone.
The web appliance concept was predicated on some early to mid 90s assumptions about technology trends:
1. Storage was expensive so just burn everything into a ROM. Saving data wouldn't be a thing as you'd bookmark sites or post data to websites.
2. The Pentium (and associated support chips) was expensive and would stay expensive.
3. The web would basically consist of static images and text. Input would be done with POSTs from form controls.
4. The web wouldn't adopt any vendor-specific technologies, and plug-ins wouldn't exist.
With these predicates, some low-powered RISC SoC running a browser would have been a decent appliance. WebTV and the Audrey weren't great products, but they weren't terrible either.
Yes, but still not seeing the forest for all the trees.
Yes, a phone has substantial computing power - compared to old technology. But it has nothing in storage or processing power against the data centers of Netflix, Facebook and Google. It's very thin in relative terms.
Indeed, even native apps go to almost comical lengths to not use the CPU on the phone. Granted, they use it, but for inconsequential things like eye candy. Voice recognition, "your" social graph, etc. are on the remote server. It's almost like there is a law that nothing of importance may be computed on the local device.
> Indeed, even native apps go to almost comical lengths to not use the CPU on the phone.
This is needed in order to save battery life. But Netflix/FB/Google do not have "substantial" processing power per user. They're providing a centralized service to a very large userbase, and that comes with a requirement to economize on both compute and storage.
So a smartphone is like a web appliance because Google and Netflix run data centers? That doesn't make any damn sense. Consumer devices, PCs included, have always been outstripped in power and storage by data centers. That does not make them de facto thin clients.
Smartphones conserving power doesn't make them thin clients either. It just turns out that modern smartphone SoCs are vastly overpowered for many computing tasks. They scale core clocks very quickly and even have multiple execution cores tasked with different execution needs (big.LITTLE configurations). Smartphones also have limitations larger systems don't, like the lack of memory swap space and tight thermal envelopes. A modern smartphone is a minor miracle of computing for the amount of shit packed into a pocket-sized device that runs all day on an internal battery.
Most mobile applications tether to some cloud component for synchronization. It's rarely because they can't do the work locally. Your camera app is doing tons of processing on the device to deliver decent pictures from very constrained optics and sensors. Google and Apple both do lots of ML on device and have dedicated co-processors for that task, iOS more so than Android. Games obviously do all their work locally. The bloated bullshit that is modern JavaScript web apps is all run locally even if data comes from a network. Many smartphones can record 4K 60fps video! Have you used a smartphone in the past decade?
No aspect of smartphones makes them any more thin clients than a full PC. They certainly don't resemble the 90s thin client/web appliance.
These could do virtually no work locally and had no local persistent storage. They were slightly upscaled X terminals. The web appliance concept would run a web browser little more capable than the early Netscape browsers. What JavaScript you did find would be "document.write" statements doing trivial client-side things like displaying the current time or image rollovers. There was very little CSS, no AJAX, no tabs, few even allowed multiple windows, no plugins, and no video.
Most didn't even support frames so you couldn't do frame/iframe reloading tricks to get new data from the server. It was all full page reloads with either GET or POST requests.
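For a sense of scale, the "trivial client side things" above could be sketched like this; the names are mine, not from any particular period page:

```javascript
// 1990s-style "image rollover": precompute the two image URLs and swap
// an <img> src on mouse events -- roughly the limit of what JavaScript
// was used for at the time. (Hypothetical file-naming convention.)
function rolloverSrc(base, isOver) {
  return base + (isOver ? "_over.gif" : "_off.gif");
}

// In a period page this would be wired up inline, something like:
//   <img name="nav" src="nav_off.gif"
//        onmouseover="document.images['nav'].src = rolloverSrc('nav', true)"
//        onmouseout="document.images['nav'].src = rolloverSrc('nav', false)">
// Everything else -- submitting data, fetching new data -- meant a full
// page reload via a GET or POST of an HTML form.
```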
The original WebTV box was pretty much the expression of web appliance talked about in the article. It had a 100MHz RISC CPU with 2MB of RAM and 2MB of ROM. It had a tiny amount of Flash memory (1MB IIRC) to store downloaded e-mail and TV schedules. That was a fraction of the power of the typical PC of the same vintage and the gap only widened as the 90s went on.
So just because Google runs data centers doesn't mean an iPhone is a thin client terminal. Just because a device can browse the web does not make it equivalent to an old WebTV box.
I'm not following how that's different from WebTV. The memo predicts a product like WebTV and that product materializes almost instantly. In the enterprise space, 'thin client' was already every second word out of Larry Ellison's mouth.
> When faced with the choice between a $500 box (RISC CPU, 4-8Mb RAM, no hard disk, …) and a $2K Pentium/P6 Windows machine, the 2/3rds of homes that don’t have a PC may find the $500 machine pretty attractive!
WebTV wasn't the compelling product the author was concerned about. The open internet and more open hardware won then (to MS's benefit, for the author).
But the modern phone is that more compelling alternative.
The open internet winning was to Microsoft's detriment at the time, which was why they fought it so hard and why IE was so broken but difficult for corporate users to abandon. Microsoft wanted ActiveX to win.
There are a lot of things that turned out to be more compelling than WebTV but predicting WebTV doesn't make the author 'prescient' about them either. The claim was about the prescience not the compellingness.
Ok but the comment I was responding to did not say that.
As to the MS thing, I'm not sure that's all that accurate either. It's certainly striking that someone wrote this and presented it at a high level within Microsoft when they did. But it also isn't that far from the way the future of the web was framed outside of Microsoft. If anything, it's a reminder of the outsized role Microsoft used to play just in 'mindshare'. A great deal of discussion about the internet and the web revolved around how it might affect Microsoft, how it might (or might not) be an opening for Microsoft competitors, and how Microsoft would respond. I don't think either Google or Apple today looms quite as large in people's perceptions.
What the original author was worried about was a world where it was no longer the case that 99% of consumers had to pay OEM license fees to Microsoft to do these things. And by and large that came to pass - the web is a free platform.
One comment on this being a good prediction of what we see in 2020. Yes, it was obvious in the 1990s that the web was going to be a platform for apps. That was why browsers were such a big piece of the antitrust case against Microsoft. The charge made against Microsoft was that it was better to write to a browser API than to the Windows API, and that's why Microsoft did what it could to kill browser competition with a Windows-only browser. That way, even if you wrote to the browser API, you were still writing your app for Windows.
Java transformed the way people thought about these things. Microsoft didn't like Java and they didn't like competing browsers because they were reasonable platforms.
Super interesting to read about all the technologies (... MediaView? MOS? RPC for... webpage-like things?) & the way they imagined the future looking before the web (... which is so obvious to us now that we can't even imagine anything else). E.g. they imagined webapps as getting a view of the file system, which is actually how Android apps turned out to work; not web apps though (... except maybe some very recent APIs?). Also, "payments" (for which we still don't have a standardized solution; we sell user data instead).
Also, knowing CSS, their statement of "layout for the web is much easier than for Visual Basic" is kinda... endearing.
As a lifetime CRUD programmer/architect, I have to say that the current web standards have been nasty to productivity-oriented CRUD. Desktop GUIs are still more productive and easier to develop for. We de-evolved.
The web really needs a GUI markup standard, something like YAML but with more interactive features. Emulating real GUIs via HTML/DOM/CSS/JS keeps failing in practice. They "break" too easily when new brands/versions of browsers come out. There are too many layers to get right, and browser vendors don't always cooperate. A "GUI browser" that focuses on GUIs and only GUIs is needed, so that distractions from other domains, such as social networks, don't get in the way.
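To make the idea concrete, here's a purely hypothetical sketch of what such a declarative GUI description might look like; the element names and fields are invented, not any real standard:

```javascript
// Hypothetical declarative GUI tree of the kind a "GUI browser" might
// consume: plain data, no layout engine, no CSS cascade.
const screen = {
  type: "window", title: "Edit Customer",
  children: [
    { type: "textbox", id: "name", label: "Name", maxlen: 40 },
    { type: "button", id: "save", label: "Save", action: "submit" },
  ],
};

// A "GUI browser" would walk this tree and map each node to a native
// widget, instead of emulating widgets in HTML/DOM/CSS/JS. A trivial
// walk, counting widgets, looks like this:
function countWidgets(node) {
  return 1 + (node.children || []).reduce((n, c) => n + countWidgets(c), 0);
}
```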
Java applets and Flash failed at their attempts because they tried to be too many things at the same time, making them too big to patch quickly when security problems were found. Do one job and do it well.
>> If Microsoft is to influence the Web, we must have broad, standards-based Web support in our products — we have to be the product supplier of choice for all key existing Web technologies — clients, servers, and publishing tools, at a minimum.
I honestly feel the lack of this is what killed IE in the eyes of developers.
IE was the dominant browser for many years because it was widely acknowledged to be the best, by developers and everyone else. Their market share was >95%. After MS "won" the browser wars, they let IE stagnate for years while Firefox continued to implement new features.
After >6 years of only critical security updates, the world had moved on.
My personal beef was alpha channel support for PNG.
What killed IE (I am not an IE hater) is the amount of extra work you had to invest to get a site to work. I've recently done some work that had to support IE9-11. I consider myself pretty good at writing front-end code that will work with IE, and there are many gotchas that I had forgotten about. The debugger is painful to use even on IE11.
That's not it at all. You're talking about the IE tax in a post-IE world. Look at US v Microsoft in 2001 and the early Browser Wars. IE had crushed Netscape (via shady business practices) and had market domination. For a long time, all projects were IE-first, then you would worry about niche upstarts like Firefox. It wasn't until the launch of Chrome and Safari and the resurgence of Apple and OSX that there was any plausible alternative. It was at that point that standards and compatibility really became paramount. IE wasn't "extra work", it was the primary objective.
As others have said here, this history has been retconned quite a bit. TBH, people talk about the lawsuit back in 2001, but my memory, hazy as it is, is that outside of the internet it was barely spoken of.
IE was a much better browser than Netscape and until Firefox, Mozilla wasn't really used outside of Linux distros.
> For a long time, all projects were IE-first, then you would worry about niche upstarts like Firefox.
This isn't quite true. Many of the early adopters of Firefox were web devs and enthusiasts, and I typically developed there first (mainly because of Firebug), checking as I went along that IE worked. However, I started my career just after IE7 was released.
> It wasn't until the launch of Chrome and Safari and the resurgence of Apple and OSX that there was any plausible alternative.
No, not really (at least not in the UK). I noticed things shifting after the prevalence of smartphones. Chrome came out much later than Firefox, and things were already starting to shift in Firefox's favour (non-IE browsers were getting up to 30% of traffic).
> IE wasn't "extra work", it was the primary objective.
That really wasn't my experience at all. But as I said I started my career near the end of IE's dominance.
Hear, hear. I look back at supporting Netscape 4 in the late 90s as equally painful as supporting IE6 in the late 2000s. It's difficult to imagine now, but Internet Explorer was vastly superior during that time. For example, the Netscape 4 window couldn't even be resized without reloading and redrawing the page from scratch.
This was actually one of the things I was going to reply with: it also crashed significantly less on Windows than Netscape 4 and ran faster, up until Active Desktop started to clog the whole thing up.
Certainly it was the primary objective initially. IE was ultimately killed by the strategic decision to stop putting much resource into developing it after they'd already killed Netscape Navigator (whether that was due to complacency or fear of more antitrust action if they looked like they were trying to build on their browser monopoly). But for a long time, that just meant everybody using an outdated browser.
The bigger problem was that MS wasn't all that bothered about migrating old IE users to new IE. When IE7 came out five years after IE6, it didn't work on legacy Windows and wasn't pushed in an XP update or service pack, so unlike its predecessor it peaked at less than 50% market share[1]. At that stage Firefox was far from the dominant browser, but the delay had created a reason for it to exist, and its market share grew to be non-trivial and ultimately overtook the second most popular MS browser.

A combination of IE 6, 7 and 8 still had a majority market share for a very long time, but since they were each very different browsers with large shares, designers had to go to the effort of supporting multiple platforms anyway, rather than just lazily optimising for the latest IE. Which meant they also bothered to support Firefox etc., which meant W3C standards-compliant browsers were viable alternatives for late adopters when web designers started serving 'outdated browser' warnings to IE6 users and the EU slapped MS with the Browser Choice ruling. Especially since it was often easier for end users to get a non-IE browser to work. The first time I installed Firefox at work was when tech support wouldn't let me upgrade to IE7...
[1] It also wasn't the best browser of its time, unlike its predecessor in 2001. But that didn't matter to the many people who just carried on using IE6.
Having developed for the web back then, I can say IE was simply a better browser than Netscape at the time. Once it started to stagnate that stopped being true, but many of the things we take for granted today (XMLHttpRequest, for example, which first shipped in IE 5) were invented in IE.
The debugger in IE11 is so painful to use. It's an absolute nightmare. It takes forever to launch, it freezes, freezes the whole browser, I don't understand how it's acceptable for it to be so sluggish when running on a top of the line development workstation.
The app I work on has to support IE11 and it's easily the worst part of the job. Regressions often happen because someone accidentally used some JS keyword/feature that's not supported. The build takes forever because of IE, so there is a flag IE_BUILD that's false most of the time, until someone has to test in IE, at which point they turn it on and wait 10 minutes for the thing to build. We had to implement theming, and for every other browser we used CSS variables, but for IE we had to implement a hacky/complicated solution where we compiled customer-specific CSS files.
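A minimal sketch of what that kind of compile-step fallback might look like; the function and template here are assumptions, not the actual implementation:

```javascript
// Browsers with custom-property support get one stylesheet that uses
// var(--name); for IE11 a build step expands the variables into a
// static, customer-specific CSS file. Unknown variables are left as-is.
function expandTheme(cssTemplate, theme) {
  return cssTemplate.replace(/var\(--([\w-]+)\)/g, (match, name) =>
    name in theme ? theme[name] : match
  );
}

// Hypothetical per-customer theme fed to the build:
const template = ".btn { background: var(--brand); }";
const ieStylesheet = expandTheme(template, { brand: "#0078d4" });
// ieStylesheet -> ".btn { background: #0078d4; }"
```

The downside, as the comment notes, is that every theme change means recompiling and reshipping a CSS file per customer rather than flipping one variable at runtime.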
With regards to IE-specific code, generally the approach I would take would be:
1) Code normally and check the compatibility of functions against the minimum version of IE. I then polyfill the browser if I understand the polyfill (some polyfills are quite large). Where I don't understand what the polyfill is doing, or it is quite large (more than 40 or 50 lines), I implement a specific workaround instead and try to put that logic in a separate function. E.g. for dataset on Elements, I have a function that detects whether dataset exists and otherwise just uses setAttribute with some regex munging, as 90% of the time I only need to read and write dataset values and I rarely care about DOM mutations.
2) Regarding styling: my approach is to style everything up and then use specific overrides for the things browsers can't do.
I also tend to avoid IE-specific builds, as you end up supporting almost two code bases. I have years of intuition about what will and won't work in outdated browsers, though I appreciate that this isn't always possible or desirable depending on the frameworks/libs used, etc.
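A rough sketch of the dataset workaround described in (1) might look like this; the helper names are mine:

```javascript
// Convert a camelCase dataset key to its data-* attribute form
// ("userId" -> "data-user-id"), matching how el.dataset maps names.
function toAttrName(key) {
  return "data-" + key.replace(/[A-Z]/g, (c) => "-" + c.toLowerCase());
}

// Read/write helpers that prefer el.dataset but fall back to
// get/setAttribute when it's missing (e.g. old IE).
function getData(el, key) {
  return el.dataset ? el.dataset[key] : el.getAttribute(toAttrName(key));
}

function setData(el, key, value) {
  if (el.dataset) el.dataset[key] = value;
  else el.setAttribute(toAttrName(key), value);
}
```

As the comment says, this covers plain reads and writes; it deliberately ignores the parts of the real dataset behaviour (live DOM mutation observation) that a full polyfill would need.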
That's not the nature of the argument. We've strayed from thick clients to thin clients, where the user has no control over hardware or data. This is strictly worse from an ownership and control perspective than the early web.