> Apple's advertising is built on security and privacy — allowing other App Stores on their system would only open up users to invasive tracking & attacks.
But why? If sideloading requires explicit user action and acknowledgment of danger, why would this affect their brand of safety and privacy in any way? The users who want a safe controlled environment can easily choose to stay in that environment. I just do not understand this argument.
Because these users, regardless of what they previously clicked, will expect Apple to support it. And end-to-end support is kind of one of the big deals about the iPhone.
This references an even older blog post (that has been lost to time) which was quoted:
> Look at the scenario from the customer’s standpoint. You bought programs X, Y and Z. You then upgraded to Windows XP. Your computer now crashes randomly, and program Z doesn’t work at all. You’re going to tell your friends, “Don’t upgrade to Windows XP. It crashes randomly, and it’s not compatible with program Z.” Are you going to debug your system to determine that program X is causing the crashes, and that program Z doesn’t work because it is using undocumented window messages? Of course not. You’re going to return the Windows XP box for a refund. (You bought programs X, Y, and Z some months ago. The 30-day return policy no longer applies to them. The only thing you can return is Windows XP.)
This is the same sort of thing that Apple faces with iOS. If an application breaks when the system is upgraded, it is the system's fault, not the application's (at least in the minds of many consumers), and it is also considered the operating system vendor's responsibility to fix the problem.
(This is part of why the Catalina upgrade was such a big deal: it was known ahead of time that it would break a lot of things.)
If someone gets an app from a hypothetical third-party store and it breaks on an update of the operating system (or the store lets you download an app that doesn't run), it's seen as the fault of the company that wrote the OS, even though there is nothing that company can do about it - they can't even refund the app.
I appreciate the dialog. I'm still struggling with it though and I'm wondering if we have different premises. This seems like it assumes that users are complete buffoons, and aren't capable of understanding a simple message like, "if you enable this feature, you open yourself up to possible security holes. Apple also makes no guarantees that software installed in this way will work, either now or in the future."
At a minimum it seems like the system is designed around the lowest common denominator of user at the expense of more power users.
I used to work tech support at a big tech company. I have little faith in the technical literacy of people outside of those who have specifically studied the issue and done an informed risk analysis on what they want to do.
I feel (especially in today's world) that people are too willing to accept risks that put themselves and others in danger without being informed of the implications, or they maintain a "yeah, it will never happen to me" attitude.
That willingness to take risks is especially prevalent in younger demographics. With Fortnite, when Epic's installer had users disable a security check and load the game from another site, there were numerous copies of the software circulating with malware installed, because people ignored the risk in pursuit of what they wanted. https://www.theguardian.com/games/2018/aug/10/fortnite-on-an...
If you are a power user and want those features, jailbreaking the phone and doing whatever you want with it is an option. Or maybe not using an iDevice at all, and going with something more open.
There are a lot more people out there that want the training wheels on their technology experience than there are power users.
Personally, after having a Linux system that I built myself and compiled kernel patches for back in the day, I'm glad I have the experience, and I'm quite happy to let Apple handle that now rather than spend my own time on that level of software verification and device administration.
On the phone itself I've got lots of personal information: credit cards tied into NFC, email, and IoT controls. And while I'm not going to take risky actions with my phone, I am confident that others will take those risks. Since privacy and security are part of Apple's brand identity, allowing people to take those risks works against that identity.
One of the frequent comments on HN in the past is "HN may not be the targeted demographic."
But it's not just users who behave this way. There was a major outcry from some developers about Apple removing Carbon, even after 12 years of deprecation and no updates. There was a lot of support for them in developer-centric communities like HN. An example: [1]
Considering that even developers are not too understanding, it's no wonder people assume non-technical users will react the same way.
I really don't see a good solution for that. Even if Apple open-sourced Carbon, I doubt Carbon users would be able to pick up the slack, since they had 12 years to update but couldn't (or 20, if you consider that Carbon was always marketed as a stopgap/compatibility solution).