Thank you for that article! I have been feeling like that for many months, and whenever I point it out to co-workers or friends, they mention the Apple-universe as the ultimate achievement of interconnectivity. And in a one-vendor universe, the world may be okay right now. Apple did a fine job there.
However, interconnectivity between devices (and software) of different vendors seems to get worse and worse; standards seem to have become irrelevant. Time to market is the only thing that matters and long-term customer satisfaction and durability seem to be of no importance any more. They just don't care about integration with other vendors any more.
When I saw Minority Report (2002) a few years ago, I thought dragging windows and applications across devices with a gesture would be a possibility in the not-too-distant future. Now, in 2016, this not-too-hard-to-develop feature seems almost impossible to imagine. Sharing content between devices is utterly painful or even impossible: copying large files between computers in the same Wifi without going through the Internet; playing a video from your Android phone on a Samsung TV; moving application state from your laptop to your desktop PC when you leave work; playing music from your Android phone in a brand new Audi via Bluetooth ... All of these things are absolutely achievable if vendors worked together or standards were to be developed/followed. Right now, though, it just looks like technology fragmentation is getting worse every day.
I was going to bring this up too, and you got close, but you're just saying it's a lack of standards and I'd say it's something far more nefarious and intentional: brand lock-in.
Brands are what's missing from all the sci-fi films where this stuff works. In Minority Report's D.C. crime lab, the displays aren't Samsung monitors hooked up to iPads, and THAT is what makes them work so seamlessly. No vendor wants to let you buy from other vendors, so naturally the only time you get a seamless experience is when you sell your soul to the devil of your choice (in my case, Apple), and then your iPhone, your Macbook, your iPad and your Apple TV all work miracles right before your eyes.
If I ever get an Android phone, it will never sync up properly like the iPhone does. And I just have the garden-variety stuff; if you buy any of this IoT crap, you damn well better hope that your given manufacturer will be around long enough to make all the things you want, and that they offer good support.
I have a lot of plans to make our home smart once we buy one, but when I do it will be built on open source modules that I can modify and control, and more importantly SERVICE myself when they inevitably break down after a time.
This is capitalism at its finest. Everyone just redoes the same things, occasionally with a slight spin on it followed by heavy marketing.
No one cares about interoperability; no one cares whether all the data and content we're generating today will be accessible in a few decades or centuries.
Standards bodies are being taken over by company interests and we're still piling stuff on top of a technology stack designed with a ~50 year old technology landscape in mind.
No one wants to think further than next quarter's profits.
> No one cares about interoperability; no one cares whether all the data and content we're generating today will be accessible in a few decades or centuries.
It's not a consequence of capitalism alone. Capitalism wouldn't result in that if its actors were rewarded for interoperability and future-proof access to data. Sadly, customers don't care about these two aspects (or long-term durability of a product, or serviceability), so almost no company makes any effort to provide them.
> Our industry has gone to shit.
It's hardly specific to IT/consumer electronics. You have the same with cars, tools, clothing, everything really.
> "customers don't care about these two aspects (or long-term durability of a product, or serviceability)"
Customers have no realistic way of assessing those things, especially in a market where all the products are obsoleted every year or so. It's Akerlof's "Market for Lemons" all over again.
A vast number of the consumers in this case are still woefully ignorant of technology, but that is changing: younger customers are more demanding of tech companies and have less loyalty than any group before, meaning these companies will have to start changing or be replaced by ones using more open frameworks.
It's not that kind of ignorance, it's that you literally can't assess the quality of a product before buying it and testing it. That information is not available. As others have pointed out, UL and CE impose certain categories of pre-sales testing to guarantee certain safety properties. But there's no guarantee that the product will actually work in particular ways and continue to do so.
You can't expect that everyone should, before buying a product, perform their own accelerated-life testing and code security audit.
(I think we've yet to have the statutory rights lawsuits where people's IoT devices stop working after a couple of years and they claim this is a "manufacturing defect", which under UK law must be warrantied for six years.)
I'm not even talking about security - that's a whole other dog and pony show, and you're right about it; the fact that there's no kind of oversight there is insane. We make sure car manufacturers install airbags, yet identity theft, thanks to the garbage mechanisms in place to handle it, is almost certainly as damaging to you as going through a windshield, if not more so - especially when you consider that IoT startups can give away your life savings and then just ¯\_(ツ)_/¯ when you ask for their help.
But I agree, there is woefully little information even about what I was talking about which was interoperability, or even just functionality. Pretty much your only recourse is to look into reviews of existing products from a given manufacturer and hope that said review wasn't bought and paid for.
While I'd love this development to be true, in my experience it's the opposite:
Younger customers are at least as ignorant as the older generation, if not worse.
And long-term loyalty is even less of a concern: it's all about the new gadget/social platform/etc. of the day.
You all are not seeing the forest for the trees. Using a sci-fi film as the yardstick for what interoperability should look like is silly.
(This is in response to the general angst over interoperability, not the IOT space, which I agree is underwhelming at the moment. However, in what I assume to be opposition to most people in this thread, I am hugely optimistic (if not yet invested).)
Sure, every once in a while something that would appear to be easy turns out not to be, and that's really annoying. But most things work well most of the time. Transferring state between devices? That's the cloud. My email is in sync 100% of the time between 4-5 different computers and devices of different brands and OSs. Dropbox takes care of my personal files, box.com of those for work. I frequently collaborate on documents in both Google Docs and Quip. Not too long ago, transferring files between Mac and Windows was hard (I think Macs had a proprietary compression program that wasn't available on Windows?) - luckily that's entirely in the past. USB is ubiquitous; FireWire, PS/2, ADB, serial, all gone. A Mac keyboard works on a Windows box, and vice versa. Bluetooth definitely has kinks, but it also has a lot of "just works" along very long stretches. With a few annoying exceptions, people using Linux, Macs and Windows can work together seamlessly. Websites (with a few annoying exceptions) generally work well in all major browsers on all major platforms, on desktop, tablets or mobile (iOS, Android and Windows). I got a new wifi printer a few days ago - after joining the network, it was automatically available on our computers. Took maybe five minutes, no messing about with IP addresses and drivers.
I can cast to a Chromecast on my TV from my Mac, my Android and random guests' iPhones (I can also "cast" YouTube directly to the TV's YouTube app, but it's pretty flaky - and that's some fancy standard, as opposed to the Chromecast; it's just that Sony is shit at implementing the standard). I can play music from Spotify from my laptop or smartphone on a Denon speaker in the kitchen, a "GramoFon" wifi sound device attached to my "dumb" stereo, and my dad's "smart" Marantz stereo.
Interoperability is doing fine, but yes, it's messier than it might have been if some magic omniscient body had come up with clean standards for all this. That doesn't mean that it's not there.
> Transferring state between devices? That's my butt.
No, it isn't. It's for transferring state between instances of the same application (or group of applications by the same vendor) across devices.
> Dropbox takes care of my personal files, box.com of those for work. I frequently collaborate on documents in both Google Docs and Quip.
Did you try to make them work together? Oh, you can't really, because Google Docs decided to be cloud-first and you don't have files with actual data on your hard drive anymore. You can't open them in a third-party application anymore.
Here's the thing - seamless operation is getting worse than it was a few years ago. A lot of that came from the push to cloud and mobile - even with proprietary formats on the desktop, you could work with the files using third-party apps, because the data was actually on your hard drive. Now the cloud services have locked your data in, and they're giving you access only through a channel of their choosing (which is usually a lowest-common-denominator webapp).
What we see now is a lot of companies trying to commoditize each other. Third-party software developers try to commoditize the platform makers by routing the data through the Internet. So sure, Spotify works and syncs up nicely between your iPhone, Android tablet, Macbook and Windows PC. But why on Earth do I have to use the Spotify app to listen to music, instead of - I don't know - Foobar2000? That's right - because files are not there.
> Using a sci-fi film as the yardstick for what interoperability should look like is silly.
Actually, I think it's very good and sane. It's a perfect yardstick, because we get to ignore all the market forces and imagine how things could work if they were designed to be actually useful. And then we can ask ourselves why things are not like this, and how to make them more like this.
Your comment quotes mseebach as saying "Transferring state between devices? That's my butt." I think you left Cloud-to-Butt turned on in your browser. That's not always a good move when reading HN.
You're speaking entirely in terms of actions that you can't perform, rather than outcomes. I use Google Docs and Quip because what I want to achieve is collaboration on a text document. None of that requires me to make the two work together. That said, I just checked: Quip lets me export to Word, PDF, Markdown, HTML and LaTeX - Google Docs to Word, ODF, RTF, PDF, TXT, HTML and ePub. I think I might just find a way.
I used Spotify to illustrate interoperability between brands, but all of the mentioned devices support (many) other sources as well, including DLNA.
> we get to ignore all the market forces and imagine how things could work if they were designed to be actually useful.
Movies show things that are pretty, not useful. It's a flat-out cliché that practically anything that happens in any movie (not just in tech, but pretty much any field) looks ridiculous to people who actually know a little about what's going on.
dude. in all of these examples, you're the interoperability stack. everything requires intervention from you. it's supposed to be the internet of things, not the internet of my things that i make work somehow.
go to someone else's house and make all this work without significant amounts of effort. that is what i want. i should just be able to walk into someone's house and say "hey check this out, ok google - play the last youtube video i saw on the biggest screen in the house" and it should work, dammit. that's the future i want
> But most things works well most of the time. Transferring state between devices? That's the cloud.
And all we had to give up was security, privacy, reliability, longevity, speed and more money. :-(
Unfortunately, as with so many adverse consequences when IT goes wrong, most non-technical people don't really understand the risks until something bad happens to them, and by then it's too late. In fact, these days with the trend for trying to outsource IT instead of maintaining in-house expertise, even a lot of technical staff don't seem to understand or properly control the risks. Just look at how many businesses grind to a halt every time one of the major cloud services has a significant outage.
The move to Internet-hosted services and subscription-based products is entirely understandable from the industry's point of view: it gives them lots of new ways to exploit their customers and make more money.
However, from the customer's point of view, I think we would be much better off if we invested more effort in decentralisation, standardisation and interoperability, and "private clouds" and VPNs. There are few advantages for customers to having important functionality reliant on a very small number of huge service providers, as opposed to having many smaller providers able to offer compatible variations and having options for self-hosting with decent remote access and backup provisions.
Unfortunately, we seem to have reached a kind of equilibrium now where the huge players are so utterly dominant in their industries that disruption is all but impossible. Their worst case is that they buy out any potential serious threats before they're big enough to become actual threats, but much of the time, the lock-in effects create sufficient barriers to entry to protect the incumbent anyway. There is no longer effective competition or disruption in many IT-related markets, just a lot of walled gardens where you pick your poison and then drink as much of it as they tell you.
I'm sorry to say I don't see any easy way to break the stranglehold the tech giants now have and get some competition and interoperability back into the industry. It's going to take someone (or possibly a lot of someones) offering products and services that are both competitive in their own right and built with a more open culture in mind to disrupt the status quo now, and it's hard to see either startup businesses or community-led efforts achieving escape velocity any time soon.
"The cloud" only works when there is unlimited high speed broadband available everywhere. That's not even remotely close to being reality, and so long as the ISPs in the US are allowed to continue down their current path, it will never be reality.
> when you sell your soul to the devil of your choice (in my case, Apple) and then your iPhone, your Macbook, your iPad and your Apple TV all work miracles right before your eyes.
Even knowing I'm going to hell, one button press to chuck my twitch stream from my phone onto mum's apple tv makes it feel worth it.
Oh geez, right? Or just tapping a show or movie and having the Apple TV show it straight away, or using the Apple TV as a sound output for a MacBook...
This might just be me, but I just don't have that good an experience with Android, which is why I switched.
Presumably your Apple TV doesn't say "Apple TV (100)" like mine does when I connect to it. I've got some sort of virulent strain of discoveryd and it's awful.
This happens if your Apple TV is on the LAN and the WLAN at the same time and they are connected either because they are one network or you use avahi-bridge between them. It thinks the other interface is someone else using the same name so it adds these numbers to make a unique name.
Out of curiosity, have you tried a factory reset? If not, I'd get in contact with Apple's tech support; they've always been remarkably helpful to me, and they don't make you jump through hoops of fire to get help.
Exactly what I thought when he brought up playing videos from Android phones on Samsung TVs. The whole thing works, but only when your TV and phone are from the same vendor.
I dunno, even in the ideal universe of Apple devices talking to each other I have headaches. AirDrop is my main bugbear - sending files (pictures) from my iPhone to my MacBook Air should be as easy as it gets, but AirDrop has failed to discover the MBA for the last month or so; there's no direct Bluetooth file transfer as far as I can see; Mail kinda works as a bit of a kludge, but warns if the attachment is over a certain size. And iCloud is something I'd rather not touch (Apple's cloud services can go take a hike, after Apple Music decided that a huge chunk of my music should be deleted forever).
I had phone->computer file transfer working via Bluetooth back on a sketchy old pre-Android Motorola, if such a simple workflow fails then what hope is there for anything more complicated. </rant>
edit: sorry this is slightly OT, it's irritated me for a while and this felt at the time like a good time to share/vent!
I think Apple's problem with this stuff is that they want it to be so seamless that they refuse to acknowledge the possibility of failure. If you're lucky they'll tell you that something went wrong, but they almost never say what. In many cases they just swallow the error silently. I run into this with AirPlay a lot. When it works, it's easy, but when it fails it's a total mystery as to why. Fixing it involves either retrying until it works, or rebooting whatever is handy in the hope that you'll stumble across whatever's responsible.
This pattern seems to be ingrained in Apple at a deep, genetic level.
Their product integration is the most obvious example, but it's far from the only one. Everyone knows that Apple computers "just work", and that when they don't you throw them in the bin because everything is too connected for easy repair.
Even bureaucratically, I once spent two months trying to convince Apple that my MBP existed. They swore no machine with that serial number had been made, and flatly refused to service it until I gave them the 'real' number.
None of these things get resolved without forcing a real human to acknowledge the issue you're facing. The one thing I'd say in Apple's defense is that it's mere hubris, which I find far more pleasant than Google's support outlook of "yeah, it's broke, now go to hell."
That is deeply true, and has been for decades. In the old days (90s) the Mac world often smirked at the ubiquitous "Abort Retry Fail?" refrain that Windows users often encountered.
But even then, I noticed that the Apple equivalent tended to be, "...". Not much different today.
I hadn't thought about it until you said, but that is exactly the issue - I've run into problems on OS X that have been silent/invisible but which I've been fortunate enough to have root-caused by digging through logs. Sadly that's not an option on iOS :(
When I switched to Mac from Windows about 5 years ago (because I decided to become an iOS developer), this is the biggest thing I noticed. There was zero feedback, or just a quick beep out of the speakers. The anthropomorphic equivalent would be malevolent staring.
That's weird to me as a lifelong Mac user because that's what I experience using Windows. My father needed help with something on his Windows machine, so I sat down and double clicked on an app. I got the hourglass for about 2 seconds and then ... nothing. The computer looked like it wasn't doing anything. No feedback. No icon in the dock-like window at the bottom, just nothing. I double clicked a few more times, and eventually the first instance showed a window. I think other instances may have started up in the meantime, slowing things down even more.
AirDrop was always hit or miss for me. I think it's even worse than file shares on Windows, which worked for me 80% of the time at best.
It seems that the main issue here is peer discovery, but I'm still somewhat astounded nobody has managed to produce an easy-to-use service discovery that works.
Unfortunately, AirDrop is two things: a file sharing service over Bluetooth and a file sharing service over wifi. It only recently got somewhat unified, which allowed you to use it to transfer files between a Mac and an iPhone, but it doesn't seem to be quite ready yet.
Bluetooth is not fast and reliable enough for sharing modern file sizes but there is also no guarantee two phones that are near each other will share a wifi network.
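The "astounding" part is that the core of LAN peer discovery is genuinely simple. As a rough illustration only (a toy protocol sketched for this thread, not how AirDrop or mDNS actually work), one side can broadcast a probe over UDP and peers can answer with their advertised name:

```python
import socket
import threading

# Toy LAN peer-discovery sketch (a hypothetical protocol, not AirDrop's):
# a prober broadcasts "DISCOVER"; peers answer with their advertised name.
PORT = 50007  # arbitrary port chosen for this illustration
PROBE = b"DISCOVER"

def serve_name(sock: socket.socket, name: str, stop: threading.Event) -> None:
    """Answer discovery probes with our name until told to stop."""
    sock.settimeout(0.2)
    while not stop.is_set():
        try:
            data, addr = sock.recvfrom(1024)
        except socket.timeout:
            continue
        if data == PROBE:
            sock.sendto(name.encode(), addr)

def discover(timeout: float = 1.0) -> list[str]:
    """Send a probe and collect the names of peers that answer in time."""
    probe = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    probe.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    probe.settimeout(timeout)
    # On a real LAN this would go to the broadcast address, e.g. 192.168.1.255;
    # loopback keeps the sketch self-contained.
    probe.sendto(PROBE, ("127.0.0.1", PORT))
    peers = []
    try:
        while True:
            data, _ = probe.recvfrom(1024)
            peers.append(data.decode())
    except socket.timeout:
        pass
    probe.close()
    return peers

# Bind before starting the thread so the peer is ready when we probe.
peer = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
peer.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
peer.bind(("", PORT))
stop = threading.Event()
threading.Thread(target=serve_name, args=(peer, "my-macbook", stop), daemon=True).start()

found = discover()
stop.set()
```

The hard parts that this sketch skips - authentication, pairing UX, NAT, radios that aren't on the same network - are exactly where real products fail.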
That's the funny thing: when this specific issue started, discovery worked exactly _one_ way - the MBA would see the iPhone, but the iPhone would refuse to see the MBA. Puzzling at best!
I'm a relatively avid software user, and I'm having a hard time agreeing that lacking such features is a big problem. I have almost never thought that I would need such a feature, and haven't heard anyone else miss it.
My point is that it's very understandable that such a possibility would be useful for some people, and that they would very much like that workflow. It's just that if they are a minority, the industry will not prioritize such development - and that's a good thing. It means they will develop more-needed features.
In a way, everything you describe already exists in the form of web apps. Log into Gmail on your PC, do some things there, then switch to an iPad and open the same address there - voilà, you have the same app on multiple devices. It will even sync flawlessly...
Regarding file sharing - I can only speak for the technologies that I've been using personally, but Samba, Dropbox, FTP, SFTP, Bluetooth file sharing and Bluetooth audio have all been working amazingly well for me... It seems like a perfect example of a developed and implemented standard which most devices agree on. What am I doing wrong?
> My point is that it's very understandable that such a possibility would be useful for some people, and that they would very much like that workflow. It's just that if they are a minority, the industry will not prioritize such development - and that's a good thing. It means they will develop more-needed features.
I disagree that this is a good thing. IT is an extreme example of the fact that people don't know what they want until you show it to them. For all you can tell, seamless transfer of files between multiple devices could be a feature people couldn't imagine living without if they had it. But you won't see mass complaints about the lack of such features, because people have stuff to do, and they adapt their workflows to the capabilities of the tools they know - not the other way around.
You're right and wrong. You're right in that people don't know what they want. But you're wrong that industry doesn't give it to them - in this case, Dropbox did, and we (techies) said "why would you want that, you can just [X, Y, Z]?" while regular people flocked to it because it was just so much better.
Sure, it didn't include direct LAN sync to begin with (it does now), but that's the kind of "perfect is the enemy of good" implementation detail that the vast majority of people couldn't care less about.
Yeah, I meant mostly to convey that part you say I'm right about :). I.e. I believe that a lot of useful tools appear because some people want to solve a problem for themselves, and only then others discover the value in it.
Doesn't Dropbox achieve exactly that, and in almost the perfect way possible?
But yes I get your point, there are surely some areas for which no good solutions exist yet and users don't know that it would be very useful.
Dropbox is great (I've been a happy user for many, many years, and for the last year I've been also a paying happy user). But it solves a different problem - the problem of keeping your files accessible between many machines. Machines you own. The problem of direct file transfer, aka. "I want this file to get from this device to that device (any device - whether mine or my friend's) as fast as possible" remains unsolved.
That's not true, it used to exist in the form of FolderSync, a productized version of Microsoft's SyncToy, which later became part of the Windows Live brand (IIRC as Windows Live Sync.)
Then it went away and never came back. Why? Because Microsoft released OneDrive, a DropBox-like cloud storage system, and the 100% free FolderSync was a competitor to it. Microsoft can make money selling OneDrive, they can't make money selling FolderSync, so it's gone.
Basically, the product you're lamenting the absence of used to exist, but no longer does because nobody can make money from it.
I used to use FolderSync to exchange multi-gigabyte video files with my friends while we were doing video editing, and it was an amazing awesome product. It was dead-easy to set up, traversed NAT and firewalls without any troubles, maxed-out whatever internet connection it had access to, and used the LAN connection if possible. Now we'll never see anything like it again.
... anyway, TLDR: the problem doesn't "remain unsolved", it was solved but is now no longer solved.
> I thought dragging windows and applications across devices with a gesture would be a possibility in the not-too-distant future. Now, in 2016, this not-too-hard-to-develop feature seems almost impossible to imagine.
Moving a running process from one machine to another, seamlessly and instantly, is not at all easy on current architectures. Not that it's impossible (well, it might be impossible to do it instantly in all cases), but it would be a hell of a lot of work, even across a single platform, and have a lot of unpleasant snags to deal with. (What happens when important app/system settings are different on your desktop and laptop?)
You assume a specific implementation. In the movie, we only see the display moving from one screen to another. That's a problem that has been solved since the '80s with X-Window.
More recently, all enterprise application UI are actually web UI. So that's just moving a browser window from one screen to another - basically a refined Apple "Handoff".
In a more hardcore fashion, we can move a whole VM almost seamlessly. At some point that could be the case with containers too, and it may not be that far-fetched to move a single process after all.
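Reduced to its core, "moving application state" is just checkpoint-and-restore. A toy sketch (the state dict here is hypothetical; real migration, as in vMotion or CRIU, also has to capture memory, file descriptors and live network connections, which is where the hard snags live):

```python
import pickle

# Minimal checkpoint/restore sketch of application-state migration.
# The state dict is a made-up example of what an editor might carry.
state = {"open_document": "report.txt", "cursor": 1337, "dirty": True}

blob = pickle.dumps(state)     # checkpoint - this blob would travel over the network
restored = pickle.loads(blob)  # restore on the destination device
```

The point of the sketch: serializing cooperative, self-contained state is trivial; it's the state entangled with the OS and network that makes whole-process migration hard.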
X-Window from the 80s… Are you trying to give us PTSD flashbacks?
Let’s see how many of these words sound familiar: Xinerama, Zaphod, XRandR.
Not to mention network transparency. So slow, so fragile, so never ever going to support audio or USB devices or video acceleration. I see Microsoft RDP and I weep. I wonder how the Sun Ray protocol compares.
Moving processes is possible, though impractical when devices have different processors. See the enduring appeal of things like Continuum for smartphones. Moving just parts of application state, like Apple’s Continuity, tends to have vendor lock-in and third-party adoption issues. So, similar problems as what a practical IoT ecosystem faces.
> all enterprise application UI are actually web UI
Yes, and so are most consumer apps, and "moving state" to another device is simply a question of IM'ing a link. Sure, the UI in the movie looks way cooler, but it's a movie.
That works for the barely double-digit percentage of apps which store all necessary state in the URL. In most cases, it's more like share a link, reauthenticate, get an unhelpful error page, use the navigation to get back to where you were, learn that the work you did first wasn't saved at all or that their eventual consistency means "same day" (e.g. iCloud), and hopefully they don't have some halfhearted attempt at locking which will prevent you from continuing. I suspect that if this feature ever arrives it'll be streaming rendered video like CarPlay/etc. because that's the only thing the device vendor can count on.
The point isn't that this is uncharted waters technically but that too many companies decided a good user experience isn't compatible with their desired profit margins. In some cases like security and bug fixes that might change due to regulation but that's far from certain and it's really hard to imagine that extending to broad interoperability.
Yes, for an app to successfully transfer state, it needs to be able to transfer state, that's a tautology.
But for webapps that want to be able to transfer state, the mechanism is the URL, and for those apps, this works perfectly and unceremoniously well today.
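The mechanism is simple enough to show in a few lines. A sketch of the "state lives in the URL" pattern (the editor, its base URL and its parameters here are all hypothetical): serialize view state into query parameters, and any device that opens the link can restore it.

```python
from urllib.parse import urlencode, urlparse, parse_qs

def make_link(base: str, state: dict) -> str:
    """Encode app state into the URL's query string."""
    return f"{base}?{urlencode(state)}"

def restore_state(link: str) -> dict:
    """Recover the state dict from a shared link."""
    query = parse_qs(urlparse(link).query)
    return {k: v[0] for k, v in query.items()}

# A hypothetical document viewer keeping page and zoom in the URL:
link = make_link("https://example.com/doc/42", {"page": "7", "zoom": "150"})
state = restore_state(link)
```

Of course, this only transfers what the app chose to put in the URL - which is exactly the limitation the reply above points out.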
The point I was responding to was your assertion that this was already true of most enterprise and consumer webapps:
> > all enterprise application UI are actually web UI
> Yes, and so are most consumer apps, and "moving state" to another device is simply a question of IM'ing a link.
That's a great aspirational goal but it's simply not something which most people can assume will work – I still routinely find apps from major companies where you can't even use the back button within the same session!
Exactly - it's all already there, working seamlessly, most of the time even for free. For the users who really want/need this functionality, it seems to be very cheap to just research those options and configure them (even if they don't come preconfigured by default on a new OS installation).
So maybe it's just not such a sought-after feature?
Xen demonstrated this (for hypervisor-managed running processes) prior to 2005. It was one of the selling points of their virtualisation process, though it turned out that live-system migration had a few additional hiccups in it. Mostly works now.
If you're looking at back-end-mediated stuff, this is pretty much what happens when you synchronise browser sessions across devices, modulo the rate of transfer. The key is managing state intelligently.
This is also done, mostly, on server-side infrastructure, where front-end systems have little to no state on them -- individual client requests come through, state is managed usually in the datastore itself.
You're right. Maybe it's not as easy as I made it seem there. It's definitely possible, if you think of VMware's vMotion (live migration). That implies a shared hypervisor _by VMware_, of course, ... but it shows that technology-wise it's doable. When I saw vMotion for the first time, it blew my mind. Now all I can think of is how cool it would be to have that among different devices (laptop->desktop->smartphone->..).
vMotion is cool, but it works on isolated VMs, which is an easier job than migrating an application running on a desktop and integrated with the rest of the system (unless you completely isolate all applications like on the mobile OSs, but then you lose other things).
I don't get it; how can one be sceptical of something which has been happening for centuries? How many scribes copying books by hand do you know? The interoperability issues - which are mostly a matter of politics (in a broad sense), not technology - will certainly affect how fast certain jobs can be automated, but that automation does and will happen is undeniable.
> will certainly affect how fast certain jobs can be automated, but that automation does and will happen is undeniable.
But that's a relevant point. The main question in the automation debate is not whether it happens, the question is whether job destruction due to automation happens fast enough to outpace the usual job creation mechanisms (appearance of new market segments etc.).
You're forgetting the main point (at least as I read it) of the article: automation for buildings already exists; our office has a ton of the stuff. The thing is, the automation hardware we're using easily goes for about $30-40k per room depending on what you're doing - our media and presentation rooms are each in that range. They work fantastically, and in six years have never needed a huge amount of service aside from an occasional tire kick. The problem is Joe Schmo wants that same experience for his house, and it's just not going to happen at the price point that a lot of IoT products are hitting.
The sad thing is I would totally be up for IoT products that cost more, because then I would at least have an idea that whoever built it built it to last, instead of with the accountant standing over their head.
I am well aware of how much automation exists. The amount of work we are doing is going up - 40 years ago it was mostly men in the workforce, now both men and women work.
Interconnectivity is a double-edged sword: it sometimes precludes innovation. To interconnect, you need an agreed-upon spec, which constrains what you can do. If you can think of a better way to do something, you may not be able to implement it.
As an example, IMAP lets you use any client app with any server. But IMAP lets a message belong to only a single folder. Gmail, on the other hand, lets you apply multiple labels to an email, which doesn't map well to IMAP. Gmail also lets you star a particular mail in a thread while applying a label to the entire thread - these don't map well to IMAP either. Neither does priority inbox, for example. And so on. Which is why you get a second-rate Gmail experience if you use IMAP.
Standards and protocols sometimes preclude innovation.
I don't want to live in a world where everything is interoperable, because that's a world where everyone is forced to conform to a straitjacket. That doesn't mean, of course, that interoperability is completely useless. It's a matter of balance. I don't want too much interoperability or too little.
The point I'm making is that interoperability has a cost. It's not all good.
There is no economy without consumption and there is no consumption without waste. Incompatibility makes for a great waste.
Edit: The technology industry needs things not to work for it to be profitable. Imagine if things just worked. Dropbox, Box, Google Drive, and hundred other solutions would never be paid for and would not be needed. Waste creates jobs.
Android had Beam, which just worked so well to transfer practically anything. Recently it fails to transfer photos about 50% of the time on the latest and greatest Android devices.
Btw, Apple is probably just as bad as everyone else; the only reason things seem to work in the Apple universe is because the life expectancy of an Apple device is 2 years max, so they can just focus forward.
The best way to transfer files to another computer on the same wifi is still `python -m SimpleHTTPServer`.
Old things always seem to work better than new ones. If we deprecate the 3.5mm jack, we are doomed.
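(For anyone on Python 3, where `SimpleHTTPServer` is gone, the same trick is `python3 -m http.server`. The programmatic equivalent - a sketch that serves the current directory and, for demonstration, fetches from itself over loopback:)

```python
import http.server
import socketserver
import threading
import urllib.request

# Serve the current directory over HTTP, like `python -m SimpleHTTPServer`
# (Python 2) or `python3 -m http.server` (Python 3) do from the shell.
handler = http.server.SimpleHTTPRequestHandler
httpd = socketserver.TCPServer(("", 0), handler)  # port 0 = pick any free port
port = httpd.server_address[1]

threading.Thread(target=httpd.serve_forever, daemon=True).start()

# Any device on the same wifi can now fetch files from http://<your-ip>:<port>/
status = urllib.request.urlopen(f"http://127.0.0.1:{port}/").status
httpd.shutdown()
```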