I worked at a place that tested software releases on a VM of every supported operating system, including OS X. We didn't have any Apple hardware, because no one wanted to deal with that, but someone had brought in the chassis of an old Apple computer and the host computer was inside it. We didn't run it by any lawyers or anything, but as far as we could tell, running OS X inside a computer that had all of its guts replaced was entirely within the license requirements.
Except in this case, we pulled a single plank off of the ship, burned the rest of it, and nailed the plank to a brand new ship built in a competing shipyard.
The UK has been slowly codifying laws over time, transitioning from common law to written law. In the Americas, we kind of jump-started that process, and are far more focused on the law as written, to the point that the positioning of a comma can have million-dollar implications: https://www.nytimes.com/2006/10/25/business/worldbusiness/25...
If a literal interpretation of what you wrote allows for something, even though it's clear that you hadn't intended to do so, then it is going to be allowed.
It's been a while since I've looked at it, but I believe it says something along the lines of "Apple-branded systems". Putting an Apple logo sticker on a Hackintosh was a common thing to do for this reason.
1. I got tired of waiting 2h for my app to get notarized because
2. I can't sell it on the App Store in the EU... because
3. the App Store Connect page gets stuck at their DSA compliance form (it's been 10 days).
And, to add insult to injury, the whole thing could be a PWA, without any compromises in the UX whatsoever.
I misread the title, but I still posted this comment as an example of confirmation bias* in the orange book for posterity. Time to step away from the computer!
* (sunk cost fallacy)
I can't sign into Apple without going incognito in Chrome. I put in my email, and it throws an error before asking me for my password. It's not an extension, it's not a cookie, idk what it is.
And then when I do get past the password, it sends an OTP to a Mac Mini I never use, and I have to tap around to get it to generate an SMS code. No option for external TOTP, and no way to remove the unused Mac Mini from OTP without signing out of it.
I've made an Apple developer account, paid $100, and then it kicked me out; after logging in, it still said I hadn't paid yet. I almost paid again before realizing it actually had charged me.
It also took me an hour to figure out how to get it to send the OTP to a phone instead of an old broken MacBook.
Google also gives me a ton of issues with having multiple accounts. Go to the calendar app with account 2, switch to desktop mode so I can actually click on the meeting invite, and now I'm logged back into account 1. Similar issues trying to use any other Google service and have to use
I don't understand how these kinds of things aren't priority #1.
I've got this with some sites. How long have you had that Chrome profile? They seem to collect cruft somewhere and it stops some sites from working. On my main Chrome profile I can't use one of my banking sites, any OpenAI site uses 100% of CPU, other sites only hold the login cookies for a few minutes. I've tried disabling every extension.
None of these issues on my other profiles or in incognito.
I have the exact same problem. It’s saying something about not being able to confirm my identity? I took a look at the dev tools and it’s apparently making a request to a server which returns an error.
It only works in incognito because it’s using a different ip address there…
Beats me. Same computer, same browser. Don’t know whether it’s the exact same address though. Cleared cookies already. Still only working in incognito.
A PWA on iOS is just a cached web page. Safari remains pretty crippled with regards to the APIs (bluetooth, usb, filesystem, etc...) that make local apps attractive in the first place. Apple is fine with letting people cache web pages, they're not fine with stuff that might displace the app store.
And for that I'm quite thankful, if all the stuff that apps could do were possible on the web it would make the web a far far scarier place than it is.
I avoid apps as much as possible due to all the nefarious tricks they play, even with all the sandboxing and review they go through. Without those constraints, I can't imagine the hell that we'd be in.
Which is fine, for the 90% of people that spend their time on the 70% of common features and interact only with the screen and headphones and internet.
But sometimes people like to do stuff like configure their QMK keyboards or load new firmware for their EdgeTX drone radios or make bootable USB sticks, all tasks that work just fine in easily deployed PWAs on every client platform in existence, except iOS.
For small developers of small-yet-oddball client apps, PWAs are an absolutely magnificent platform. Write once, deploy once, run... everywhere-but-an-iPhone. It really sucks that Apple's devices are crippled like this.
Edit to reply to this bit:
> Without those constraints, I can't imagine the hell that we'd be in.
Again, that hell is literally every other platform on the planet. It's only Safari that is "protected". In point of fact browser permissions management on this stuff tends strongly to be stricter and less permissive than app permissions, which are much less visible.
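That visibility difference is easy to demonstrate: on the web, each of these hardware capabilities is a separate, detectable, permission-gated API. A minimal sketch of how a web app might probe for them (the `describeCapabilities` helper name and the mock objects are mine, invented for illustration; in a real page you would pass the global `navigator`):

```typescript
// Probe a navigator-like object for the hardware APIs that PWAs rely on.
// On iOS Safari these typically come back false; on desktop Chromium, true.
type NavigatorLike = {
  bluetooth?: unknown; // Web Bluetooth
  usb?: unknown;       // WebUSB (what keyboard configurators tend to use)
  serial?: unknown;    // Web Serial (common for firmware flashers)
};

function describeCapabilities(nav: NavigatorLike): Record<string, boolean> {
  return {
    bluetooth: "bluetooth" in nav && nav.bluetooth != null,
    usb: "usb" in nav && nav.usb != null,
    serial: "serial" in nav && nav.serial != null,
  };
}

// Mock the two platforms to show the difference side by side.
const chromiumLike: NavigatorLike = { bluetooth: {}, usb: {}, serial: {} };
const iosSafariLike: NavigatorLike = {};

console.log(describeCapabilities(chromiumLike));  // every flag true
console.log(describeCapabilities(iosSafariLike)); // every flag false
```

The point of probing rather than user-agent sniffing is that each capability is individually gated: even where an API exists, the browser still prompts the user per device, which is the stricter permission model described above.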
> While originally developing iPhone prior to its unveiling in 2007, Apple CEO Steve Jobs did not intend to let third-party developers build native apps for iOS, instead directing them to make web applications for the Safari web browser.[10] However, backlash from developers prompted the company to reconsider,[10] with Jobs announcing in October 2007 that Apple would have a software development kit available for developers by February 2008.[11][12]
And the irony of this is that a lot of the apps in the App Store are hybrid apps: basically web apps with a thin native wrapper around them, because it's just so much less of a hassle to develop for both iOS and Android that way, and because, if you're coming at it as an outsider, Swift is such a ball-ache to deal with compared to other languages and stacks.
So PWAs would have been more than fine but, unfortunately, that ship has long since sailed, and Apple makes way too much money from the App Store for a course change.
It came out in the Epic vs. Apple trial that 90% of App Store revenue comes from in-app purchases in pay-to-win games. The only money Apple is making from these "could have been a web app" apps is for things like Uber, where you can use Apple Pay (not in-app purchases), and that carries the same credit card fees regardless.
If it’s only mean old Apple, where are all of the great Android PWAs and why do developers decide to make native Android apps?
It looks, from the cited sources, like developers wanted to write apps, and Apple chose to allow this in a way that let it keep control of what was installed.
They didn’t have much of a choice. By that time people had already developed jailbreaks, and Cydia, an app store in its own right, was thriving.
Before Apple’s App Store launched, my iPhone was running all sorts of other apps and alternative launchers.
Apple had to move fast to keep things from getting too out of control.
Over the years, as the vulnerabilities in the OS were closed and iOS added features, the need or desire to bother with jailbreaks and 3rd party pirate app stores dropped. I haven’t thought about it in many years.
Perens had accepted a position as senior Linux/Open Source Global Strategist for Hewlett-Packard, which he describes as leaving Apple "to work on Open Source. So I asked Steve: 'You still don't believe in this Linux stuff, do you?'" And Perens still remembers how Steve Jobs had responded.
"I've had a lot to do with building two of the world's three great operating systems" (which Jobs considered to be NeXT OS, MacOS, and Windows). "And it took a billion-dollar lab to make each one. So no, I don't think you can do this."
Perens says he later "won that argument" when Jobs stood onstage in front of a slide that said "Open Source: We Think It's Great!" as he introduced the Safari browser.
That's interesting!
However, I would argue Jobs sadly won that argument: no open-source OS has emerged for phones, nor made a major push on PCs, in the almost 30 years since that exchange.
Yes, some software has shipped in that form, but it took the big three to push the Linux-based server clouds, Google to push it on phones, tablets, and laptops, and now Steam to make a push for the average gamer.
This is not to discredit the work being done outside those labs, which very much builds on work done for free or by foundations; however, the first versions just didn't capture a majority of their available markets, which the OSes Jobs mentioned very much did, as have the others from billion-dollar labs since.
Apple neutered the web as best they could to force you to use their rails.
I'm still angry they killed flash. There has never been a better platform for non-technical folks, kids especially, to make animation, games, and mini apps, and deploy them as single binary blobs.
A single swf file could be kept and run anywhere. For the younger generation: imagine right clicking to download a YouTube video or a video game you'd see on itch.io. And you could send those to friends.
You could even embed online multiplayer and chatrooms into the apps. It all just worked. What we have now is a soup of complexity that can't even match the feature set.
Flash was cool, but it was also a spectacular dumpster fire. Honestly, I'm sort of glad Google and Apple killed it. Yes, it was an amazing medium, but it feels almost like Adobe kept thinking about it as an animation studio and didn't care to run it as an application platform with all the concerns that entails (i.e., security). And support for anything that's not Windows, while technically present, was abysmal. HTML5, with all its sins and warts, is a better platform, even if it has a much higher entry barrier.
During the Flash era, creativity flourished. It was accessible, too. Seven year olds could use it.
Flash was getting better and better. It could have become an open standard had Jobs not murdered it to keep runtimes off the iPhone. He was worried about competition. The battery and security issues were technical problems, and fully solvable.
The companies that filled the web void - Google and Apple - both had their own selfish reasons not to propose a successor. And they haven't helped anyone else step up to the plate. It would be impossible now.
Imagine if apps for mobile could be deployed via swf. We'd have billions of apps, and you could just tap to download them from the web.
Smartphones might have pushed us forward, but the app layer held us back.
The 1990s and 2000s web saw what AOL and Microsoft were trying to lock us into and instead opted for open and flexible.
Platformization locked us into hyperscaler rails where they take a cut of everything we do. This has slowed us down tremendously, and a lot of the free energy and innovation capital of the system goes to that taxation.
The thing is, HTML5 is far more technically capable than Flash ever was. It was competitive even at the time: Flash's main thing was 2D vector graphics, but iOS Safari has supported both Canvas and SVG since at least 2010, possibly from day one.
But the creation tools and the culture never really lined up the same way, and developers focused on creating apps instead.
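As a concrete illustration of that capability claim: the shape tweening Flash was known for maps onto a few lines of plain SVG with a SMIL `<animate>` element, no plugin required. A sketch (the `tweenedCircleSvg` helper name and the shape/values are invented for illustration):

```typescript
// Build an SVG document containing an animated vector shape -- the kind of
// timeline tween that was Flash's bread and butter, as declarative markup.
function tweenedCircleSvg(fromR: number, toR: number, seconds: number): string {
  return [
    `<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100">`,
    `  <circle cx="50" cy="50" r="${fromR}" fill="tomato">`,
    // SMIL animation: the radius tweens from fromR to toR and loops forever.
    `    <animate attributeName="r" from="${fromR}" to="${toR}"`,
    `             dur="${seconds}s" repeatCount="indefinite"/>`,
    `  </circle>`,
    `</svg>`,
  ].join("\n");
}

const svg = tweenedCircleSvg(10, 40, 2);
console.log(svg.includes("<animate")); // prints true
```

Dropped into a page, the browser renders and animates this with no runtime at all, which is roughly the point being made about HTML5's raw capability versus Flash's.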
For non-games, HTML has always been technically superior. iOS Safari may have a long history of rendering bugs, but it beats Flash/AIR, which always looked very out-of-place even on desktop.
I do wonder what would have happened in an alternate universe where either Flash or HTML5 took off on mobile instead of apps. We would have both the upsides of openness, and the downsides of worse performance and platform integration and the lack of an easy payment rail. Pretty much the same situation we still see on desktop today.
We wouldn't have had the same "gold rush" from the early App Store, which happened in large part because of the ease of making money. There would probably be more focus on free stuff with ads, like Android but more so.
I second everything except the part about Adobe being behind Flash, which IMO is what killed it in the first place (with ten years of hindsight, I can say this confidently). I still do creative, non-standard work, but in a free way, using pure vanilla JS (via Haxe).
Adobe's mistake was keeping the system proprietary instead of letting it be free. Since then, I've left that ecosystem and what a relief!
(I know I'm mixing different levels here, and my personal experience isn't really an argument).
ps: HTML's scope is way more advanced than whatever Flash could have become.
> Imagine if apps for mobile could be deployed via swf. We'd have billions of apps, and you could just tap to download them from the web.
No, they wouldn't. We've forgotten just how bad and sloppy Flash apps were. The handful of companies that used Adobe Flex turned out awful POSes that barely worked. It occupied the same space that Electron does today: bloated, slow, and permitting cheap-ass devs to utilize cheap talent to develop 'apps' with all the finesse of a sledgehammer.
As a kid I loved flash, I was making interactive apps in AS2/3 in high school. But I watched in horror as it became the de facto platform for crapware
> It occupied the same space that Electron does today
This. Except Electron crap at least runs on top of a well-designed and relatively reliable platform (HTML/Chromium), and sometimes the crap even offers an actual PWA version, with all the sandbox benefits a real browser has to offer. Flash didn't even have that.
And let's be realistic, there will always be demand for a crap-running platform for vendors that don't care (or just have their core values elsewhere).
> And let's be realistic, there will always be demand for a crap-running platform for vendors that don't care (or just have their core values elsewhere).
My kingdom for some way of gatekeeping platforms so that entities like this are forbidden from participating
- Lack of gatekeeping was THE advantage that made Web viable and competitive against traditional media.
- You can't gatekeep crapmakers without also gatekeeping that kid in his parent's basement with an awesome idea.
- Crapmakers with enough money will punch through any gatekeeping.
- Sometimes you have to accept that vendors don't care. You can't expect a transport company to give too much love to its timetables app. Yes, they are expected to hire someone competent to do it, but the "someone competent" also rarely cares. Still better than having no access to the timetables.
No, there was gatekeeping, it was knowledge. You had to be knowledgeable enough to work the system. You had to have the time to dedicate to learning the system and how the internet and how computers worked. Those twin gates kept the internet as it was in its early days.
Unfortunately, every peabrained entrepreneur saw that and began eroding the moat until it was gone. The knowledge required to build things has been on a steady decline, and now, with AI, that decline is complete. Now, every fucking hack with an "idea" is not only able to act on it, but acts like they are as good as the people who paid a heavy price to get to the same level through years of study and hard work.
This myth that Apple "killed" Flash on mobile should die. When Flash finally came to Android in 2010-11, it required a phone with a 1 GHz processor and 1 GB of RAM, and it barely ran on that.
The first iPhone came with 128 MB of RAM and a 400 MHz CPU; it couldn't even run Safari smoothly. If you scrolled too fast, you would get a checkerboard while you waited for the page to render. An iPhone with those specs didn't come out until 2011.
Adobe was always making promises it couldn't keep. The Motorola Xoom was supposed to be the "iPad killer" that could run Flash, but Adobe was late, leaving the Xoom in the unenviable position that, at launch, you couldn't view the Xoom home page on the Xoom itself because it required Flash.
Flash was cool, but the plugin was full of bugs and a constant source of pretty serious vulnerabilities. I too miss the flash games era of the web at times, but it wasn't some utopian thing.
Macromedia Flash was indeed a beautiful, innovative piece of software. HTML5 still doesn't match the ease and usability that Flash offered for creating and deploying content online. But after its acquisition by Adobe, it just ever so slowly went downhill. It should have been open-sourced.
> I'm still angry they killed flash. There has never been a better platform for non-technical folks
Capcut and Roblox would like words. No, that's kinda just wrong. Content generation for non-technical folks has never been easier or more effective. Flash is just something nerds here remember fondly because it was a gateway drug into hackerdom. Some of us are older and might feel the same way about Hypercard or TurboPascal or whatnot.
Absolutely. Apple had the balls to be the first major tech company to take the first material step to actually end the security nightmare that was Flash for good.
I'm sure it was the security of Flash that worried them, and not the fact that a third party they couldn't extort was encroaching on their walled garden.
> And since our philosophy is to provide software for our machines free or at minimal cost, you won't be continually paying for access to this growing software library.
I don't know how it was when Apple was a start-up, but I have never considered macOS or Apple's office suites "free" or cheap. The way I rationalised purchasing an Apple device was by telling myself that Apple hardware is overpriced because the price includes the accompanying software. Of course, now, as Apple slowly shifts to a hybrid subscription model, you will indeed be continually paying for Apple software...
They forgot to mention that the growing software library is also shrinking as they deprecate support for older OS versions and hardware. On the one hand they go to heroic lengths (fat binaries, Rosetta 2) to enable a migration to a new hardware platform but get bored in ~5 years and drop support.
I don’t think dropping legacy support is due to boredom. It’s what allows them to keep moving forward without being saddled by every decision from the past.
How long should they have kept PPC or Classic support?
Microsoft is in a funny position. Backward compatibility is seen as a competitive advantage, especially in the enterprise market. However, it’s that very compatibility that makes people avoid adopting new technologies, because why bother? We see Microsoft throw so many things against the wall, and almost nothing sticks. Meanwhile, Apple tells devs to jump and they ask how high. Devs know Apple is going to cut support, so it’s update your apps or be left behind.
To really make a change, a person needs to be all-in. Dual booting Windows and Linux/macOS, for example. This is a sign a person isn’t all-in and they don’t really make the change, or it takes significantly longer. When a person goes all-in and burns the boats, they are forced to find new solutions and make the changes needed to make the new thing actually work.
The ARM chips in later iPhones and all M-series Macs physically don’t have the hardware to run 32-bit software.
Should they still be supporting PPC software? 68K software? Why not old Apple // software for good measure?
The last time I counted, in 2012, there were 12 ways to define a string in Windows, and you had to convert back and forth between them depending on which API you were calling. There are so many one-off hacks keeping Windows running (see Raymond Chen’s blog) that it’s a house of cards.
It’s been half a century of Apple. At this point, if FireWire, Flash, and a half dozen other things didn’t convince you that Apple deprecates and then removes old functionality pretty rapidly, I don’t know what to say.
If only those trillions of dollars of market cap and hundreds of billions of dollars in revenue could support...a couple dozen small teams maintaining legacy support. For the old hardware, pretty decent open source emulators exist that can run older versions, like all the way back to MacOS 7. It can't be that hard to keep the pilot light on for those old things.
It's a product choice. If you want long-term backwards compatibility, Windows is probably the best option, perhaps the Win32 API on Linux via WINE. Overall, backwards compatibility is a drag on future development. In general, I prefer to drop it myself unless that's the crucial feature. In this case, I'd say I'd make the same choice. Those last few bps of users can simply remain on old software/hardware.
A lot of corporate "philosophies" are actually just business models. There have been times between then and now they charged for the OS. They do charge for other software. But largely it's been a good business model for them.
In the 2000s I remember the OS releases being $130, which (depending on exactly what year you start from) is equivalent to $200-250ish today.
Not a yearly cadence because back then they only released a new OS version when it was done and had features worth releasing, but even every two years that wasn't a cheap update.
This anecdote from history feels timely given the recent shift of Apple’s iWork suite (Pages, Numbers, Keynote) from being bundled with Macs to being a freemium subscription.
I appreciate that the software and updates are made "free" to me, and it may be their right to disallow "downgrades" and have time-limited windows for redemption. However, as a developer for their platform, it is quite frustrating that these restrictions are at odds with industry practice to guarantee support for older OS versions than current. I cannot purchase a new iPhone, put iOS 18 on it, install my app, and test updating the iPhone to 26. This can have very real negative consequences for the very same shared customers of mine and Apple's.
I know people are rightly amazed by Woz’s engineering prowess, but it’s fascinating to see Steve’s fingerprints all over the Apple I. Look at the product commitments and they’ll ring a bell:
- It’s all in one
- Hassle free to set up
- Something that usually doesn’t work (cassette board) now just works
They rightly identified the hobbyist market (I want to tinker) was actually the smaller market within a larger one. Seems obvious in hindsight. It wasn’t obvious then.
Which is another myth that needs to die. Apple had a couple of billion in the bank from a loan they had secured, and they lost much more than the measly $250 million that Microsoft invested. Not to mention that Apple
Not really. The Apple I was discontinued within a year of release; if you saved that money until 1978, you could get an Apple II that would be supported for almost 20 years, give or take.
Part of the reason the Apple I is so rare, is that Apple offered an Apple I trade in program. Apple would destroy the boards of Apple Is that were traded in for Apple IIs.
I used to muse that if I had put the money I spent on computer gear back in the day into woodworking tools instead, I'd not only have a bigger, better shop than Norm Abram, but all of the tools would probably still work.
The Mac of Theseus
These sort of letter-of-the-law arguments don't tend to do well in court in my very limited experience (UK). But I love the essence of it!
I would love to hear more about the exact license wording that allows this.
A few years ago a lot of legacy "AppleLink" email accounts were culled without remorse.
You cannot even change the password of an apple ID without logging into a macOS or iOS device.
Funny, I had this problem yesterday when I tried to login to Apple Business Manager. Thought I had messed up but worked fine in incognito.
Exact same problem, exact same error. Glad to know I'm not alone!!!
Sorry, how is it that you make a Chrome incognito window use a different IP address?
That sounds like a good magic trick.
I use PWAs on iOS and they're pretty great. That was the original plan for apps on iOS, before Apple was pressured into creating an app store.
Not really, as long as they need permission granted
Who pressured Apple and why?
I had not even heard of app stores before then, IIRC, unless you count Linux repos.
https://en.wikipedia.org/wiki/App_Store_(Apple)
Less than a year?
Doesn't really sound like Jobs was putting up much of a fight there.
This week Bruce Perens (who wrote the original Open Source definition) remembered talking to Steve Jobs about Open Source back in 2000.
https://thenewstack.io/50-years-ago-a-young-bill-gates-took-...
Perens had accepted a position as senior Linux/Open Source Global Strategist for Hewlett-Packard, which he describes as leaving Apple “to work on Open Source. So I asked Steve: ‘You still don’t believe in this Linux stuff, do you?'” And Perens still remembers how Steve Jobs had responded.
“I’ve had a lot to do with building two of the world’s three great operating systems” — which Jobs considered to be NeXT OS, MacOS and Windows. “And it took a billion-dollar lab to make each one. So no, I don’t think you can do this.”
Perens says he later “won that argument” when Jobs stood onstage in front of a slide that said “Open Source: We Think It’s Great!” as he introduced the Safari browser.
That's interesting! However, I would argue Jobs sadly won that argument: no open-source OS really emerged for phones, nor was there a major push on PCs, in the almost 30 years since that exchange.
While, yes, some software has come in that form, it took the big three to push Linux-based server clouds, Google to push it on phones, tablets, and laptops, and now Steam to make a push for the average gamer.
This is not to discredit the work done outside those labs, which very much builds on work done for free or by foundations. However, the first versions just didn't capture a majority of the available markets, which the OSes Jobs mentioned very much did, as have the others from billion-dollar labs since.
Have you built a PWA solution for it? If not, why not?
> the orange book
?
> the whole thing could be a PWA
Apple neutered the web as best they could to force you to use their rails.
I'm still angry they killed flash. There has never been a better platform for non-technical folks, kids especially, to make animation, games, and mini apps, and deploy them as single binary blobs.
A single swf file could be kept and run anywhere. For the younger generation: imagine right clicking to download a YouTube video or a video game you'd see on itch.io. And you could send those to friends.
You could even embed online multiplayer and chatrooms into the apps. It all just worked. What we have now is a soup of complexity that can't even match the feature set.
Flash was cool, but it was also a spectacular dumpster fire. Honestly, I'm sort of glad Google and Apple killed it. Yes, it was an amazing medium, but it feels almost like Adobe kept thinking of it as an animation studio and didn't care to run it as an application platform with all the concerns that entails (i.e. security). And support for anything that's not Windows, while technically present, was abysmal. HTML5, with all its sins and warts, is a better platform, even if it has a much higher entry barrier.
The security issue could have been addressed by simply running it in a sandbox.
Creativity fell off a cliff with HTML5.
During the Flash era, creativity flourished. It was accessible, too. Seven year olds could use it.
Flash was getting better and better. It could have become an open standard had Jobs not murdered it to keep runtimes off iPhone. He was worried about competition. The battery and security issues were technical problems and fully solvable.
The companies that filled the web void - Google and Apple - both had their own selfish reasons not to propose a successor. And they haven't helped anyone else step up to the plate. It would be impossible now.
Imagine if apps for mobile could be deployed via swf. We'd have billions of apps, and you could just tap to download them from the web.
Smartphones might have pushed us forward, but the app layer held us back.
The 1990s and 2000s web saw what AOL and Microsoft were trying to lock us into and instead opted for open and flexible.
Platformization locked us into hyperscaler rails where they take a cut of everything we do. This has slowed us down tremendously, and a lot of the free energy and innovation capital of the system goes to that taxation.
The thing is, HTML5 is far more technically capable than Flash ever was. It was competitive even at the time: Flash's main thing was 2D vector graphics, but iOS Safari has supported both Canvas and SVG since at least 2010, possibly from day one.
But the creation tools and the culture never really lined up the same way, and developers focused on creating apps instead.
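To make the comparison concrete: Flash's bread-and-butter was scripted 2D vector shapes, and that maps pretty directly onto open web primitives like SVG. A minimal sketch (the function names here are made up for illustration, not from any library) that generates the kind of star shape a Flash timeline tool would draw, as a plain SVG string:

```javascript
// Compute the vertices of a star by alternating between an outer
// and an inner radius -- classic Flash-era vector drawing, done in
// plain JavaScript with no plugin required.
function starPoints(cx, cy, outer, inner, spikes) {
  const pts = [];
  for (let i = 0; i < spikes * 2; i++) {
    const r = i % 2 === 0 ? outer : inner;          // alternate radii
    const a = (Math.PI * i) / spikes - Math.PI / 2; // start pointing up
    const x = (cx + r * Math.cos(a)).toFixed(1);
    const y = (cy + r * Math.sin(a)).toFixed(1);
    pts.push(`${x},${y}`);
  }
  return pts.join(" ");
}

// Wrap the points in a standalone SVG document string.
function starSVG(size = 100) {
  const points = starPoints(size / 2, size / 2, size / 2, size / 5, 5);
  return `<svg xmlns="http://www.w3.org/2000/svg" width="${size}" height="${size}">` +
         `<polygon points="${points}" fill="gold" stroke="black"/>` +
         `</svg>`;
}

console.log(starSVG(100));
```

Dropped into any page (or saved as an .svg file), that renders a scalable vector star in every modern browser, which is roughly the point: the capability survived Flash, even if the approachable authoring tools didn't.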
For non-games, HTML has always been technically superior. iOS Safari may have a long history of rendering bugs, but it beats Flash/AIR, which always looked very out-of-place even on desktop.
I do wonder what would have happened in an alternate universe where either Flash or HTML5 took off on mobile instead of apps. We would have both the upsides of openness, and the downsides of worse performance and platform integration and the lack of an easy payment rail. Pretty much the same situation we still see on desktop today.
We wouldn't have had the same "gold rush" from the early App Store, which happened in large part because of the ease of making money. There would probably be more focus on free stuff with ads, like Android but more so.
I second everything except the fact that Adobe was behind Flash, which IMO is what killed it in the first place (with ten years of hindsight, I can say this confidently). I still do creative, non-standard work, but in a free way using pure vanilla JS (using Haxe). Adobe's mistake was keeping the system proprietary instead of letting it be free. Since then, I've left that ecosystem and what a relief!
(I know I'm mixing different levels here, and my personal experience isn't really an argument).
ps: HTML scope is way more advanced than whatever Flash could have been.
> Imagine if apps for mobile could be deployed via swf. We'd have billions of apps, and you could just tap to download them from the web.
No they wouldn't. We've forgotten just how bad and sloppy flash apps were. The handful of companies that used Adobe Flex turned out awful POS that barely worked. It occupied the same space that Electron does today -- bloated, slow, and permitting cheap-ass devs to utilize cheap talent to develop 'apps' with all the finesse of a sledgehammer
As a kid I loved flash, I was making interactive apps in AS2/3 in high school. But I watched in horror as it became the de facto platform for crapware
> It occupied the same space that Electron does today
This. Except Electron crap at least runs on top of a well-designed and relatively reliable platform (HTML/Chromium), and sometimes the crap even offers an actual PWA version with all the sandbox benefits a real browser has to offer. Flash didn't even have that.
And let's be realistic, there will always be demand for a crap-running platform for vendors that don't care (or just have their core values elsewhere).
> And let's be realistic, there will always be demand for a crap-running platform for vendors that don't care (or just have their core values elsewhere).
My kingdom for some way of gatekeeping platforms so that entities like this are forbidden from participating
pls dont
- Lack of gatekeeping was THE advantage that made Web viable and competitive against traditional media.
- You can't gatekeep crapmakers without also gatekeeping that kid in his parent's basement with an awesome idea.
- Crapmakers with enough money will punch through any gatekeeping.
- Sometimes you have to accept that vendors don't care. You can't expect a transport company to give much love to their timetables app. Yes, they're expected to hire someone competent to do it, but the "someone competent" also rarely cares. Still better than having no access to the timetables.
No, there was gatekeeping, it was knowledge. You had to be knowledgeable enough to work the system. You had to have the time to dedicate to learning the system and how the internet and how computers worked. Those twin gates kept the internet as it was in its early days.
Unfortunately, every peabrained entrepreneur saw that and began eroding the moat until it was gone. The knowledge required to build things has been in steady decline, and now with AI that decline has completely destroyed it. Now, every fucking hack with an "idea" is not only able to act on it but acts like they're as good as the people who paid a heavy price to reach the same level through years of study and hard work.
As a side note, Apache Royale is still alive (or is it?).
<https://royale.apache.org>
> The battery and security issues were technical problems and fully solvable.
Seriously? Is that why I ran all my desktop browsers with flashblock even before the iPhone was out?
Dare to tell me Adobe was feverishly working in secret on reducing pointless CPU usage and saving my battery?
This myth that Apple “killed” Flash on mobile should die. When Flash finally came to Android in 2010-11, it required a phone with a 1GHz processor and 1GB of RAM, and it barely ran on that.
The first iPhone came with 128MB of RAM and a 400MHz CPU; it couldn’t even run Safari smoothly. If you scrolled too fast, you got a checkerboard while you waited for the page to render. An iPhone matching Flash's requirements didn’t come out until 2011.
Adobe was always making promises it couldn’t keep. The Motorola Xoom was supposed to be the “iPad killer” that could run Flash, but Adobe was late, leaving the Xoom in the unenviable position that at launch you couldn’t visit the Xoom home page on the Xoom itself, because it required Flash.
Flash was cool, but the plugin was full of bugs and a constant source of pretty serious vulnerabilities. I too miss the flash games era of the web at times, but it wasn't some utopian thing.
Macromedia Flash was indeed a beautiful, innovative piece of software. HTML 5 still doesn't match its features vis-à-vis the ease and usability that Flash offered in creating and deploying content online. But after its acquisition by Adobe, it just ever so slowly went downhill. It should have been open sourced.
> I'm still angry they killed flash. There has never been a better platform for non-technical folks
Capcut and Roblox would like words. No, that's kinda just wrong. Content generation for non-technical folks has never been easier or more effective. Flash is just something nerds here remember fondly because it was a gateway drug into hackerdom. Some of us are older and might feel the same way about Hypercard or TurboPascal or whatnot.
Just like Microsoft before them.
But flash specifically deserved to die.
Absolutely. Apple had the balls to be the first major tech company to take the first material step to actually end the security nightmare that was Flash for good.
I'm sure it was the security of Flash that worried them, and not the fact that a third party they couldn't extort was encroaching on their walled garden.
On the other hand you're okay with Adobe having that level of control over the web?
Maybe one day we'll see a JS/WASM framework that is just as portable.
Ironically, Macromedia / Adobe didn't try to assert any control back then. They were even opening the standard, IIRC.
They learned this much later after learning the game from Meta, Google, and Apple.
The text was mangled by some OCR software. The ad can be found as an image on Wikimedia: https://commons.wikimedia.org/wiki/File:Apple_1_Advertisemen...
The full sentence:
> And since our philosophy is to provide software for our machines free or at minimal cost, you won't be continually paying for access to this growing software library.
I don't know how it was when Apple was a start-up, but I have never considered macOS or Apple's office suites "free" or cheap. The way I rationalised purchasing an Apple device was by telling myself that Apple hardware is overpriced because the price includes the accompanying software. Of course, now that Apple is slowly shifting to a hybrid subscription model, you will indeed be continually paying for Apple software ...
They forgot to mention that the growing software library is also shrinking as they deprecate support for older OS versions and hardware. On the one hand they go to heroic lengths (fat binaries, Rosetta 2) to enable a migration to a new hardware platform but get bored in ~5 years and drop support.
"Growing software library" it ain't.
I don’t think dropping legacy support is due to boredom. It’s what allows them to keep moving forward without being saddled by every decision from the past.
How long should they have kept PPC or Classic support?
Microsoft is in a funny position. Backward compatibility is seen as a competitive advantage, especially in the enterprise market. However, it’s that very compatibility that makes people avoid adopting new technologies, because why bother? We see Microsoft throw so many things against the wall, and almost nothing sticks. Meanwhile, Apple tells devs to jump and they ask how high. Devs know Apple is going to cut support, so its update your apps or be left behind.
To really make a change, a person needs to be all-in. Dual booting Windows and Linux/macOS, for example. This is a sign a person isn’t all-in and they don’t really make the change, or it takes significantly longer. When a person goes all-in and burns the boats, they are forced to find new solutions and make the changes needed to make the new thing actually work.
The ARM chips in later iPhones and all M series Macs physically don’t have the hardware to run 32 bit software.
Should they still be supporting PPC software? 68K software? Why not old Apple // software for good measure?
The last time I counted, in 2012, there were 12 ways to define a string in Windows, and you had to convert back and forth between them depending on which API you were calling. There are so many one-off hacks keeping Windows running (see Raymond Chen’s blog) that it’s a house of cards.
It’s been half a century of Apple. At this point if FireWire, Flash, and a half dozen other things didn’t convince you that Apple deprecates then removes old functionality pretty rapidly I don’t know what to say.
If only those trillions of dollars of market cap and hundreds of billions of dollars in revenue could support...a couple dozen small teams maintaining legacy support. For the old hardware, pretty decent open source emulators exist that can run older versions, like all the way back to MacOS 7. It can't be that hard to keep the pilot light on for those old things.
You just can’t imagine my loss from not being able to use my 7-device-deep SCSI chain, including my Zip and Jaz drives.
It's a product choice. If you want long-term backwards compatibility, Windows is probably the best option, perhaps the Win32 API on Linux via WINE. Overall, backwards compatibility is a drag on future development. In general, I prefer to drop it myself unless that's the crucial feature. In this case, I'd say I'd make the same choice. Those last few bps of users can simply remain on old software/hardware.
There was discourse in the 1970s about whether software should all be free or if paid software would be better. Apple and Micro-Soft had different perspectives: https://en.wikipedia.org/wiki/An_Open_Letter_to_Hobbyists
A lot of corporate "philosophies" are actually just business models. There have been times between then and now they charged for the OS. They do charge for other software. But largely it's been a good business model for them.
In the 2000s I remember the OS releases being $130, which (depending on exactly what year you start from) is equivalent to $200-250ish today.
Not a yearly cadence because back then they only released a new OS version when it was done and had features worth releasing, but even every two years that wasn't a cheap update.
Not related at all: oh my, chez.com still exists? That's the very first website I made, in 2000: http://w2000.chez.com/
That was my first surprise as well...
https://archive.is/dJvc
This anecdote from history feels timely given the recent shift of Apple’s iWork suite (Pages, Numbers, Keynote) from being bundled with Macs to being a freemium subscription.
https://www.macrumors.com/2026/01/28/apple-updates-keynote-n...
Green PCB Prototype #0 Apple I just sold yesterday for $2.75m
https://news.ycombinator.com/item?id=46843037
What's up with all of the weird typos, such as:
"APPLE Computer Compagny"
"Palo Atlt"
Probably OCR'd with no editing.
it appears to be a website in the french tongue
I appreciate that the software and updates are made "free" to me, and it may be their right to disallow "downgrades" and have time-limited windows for redemption. However, as a developer for their platform, it is quite frustrating that these restrictions are at odds with industry practice to guarantee support for older OS versions than current. I cannot purchase a new iPhone, put iOS 18 on it, install my app, and test updating the iPhone to 26. This can have very real negative consequences for the very same shared customers of mine and Apple's.
What do any of these comments have to do with this advertisement for the Apple1?
"Compared to switches and LED's, a video terminal can display vast amounts of information simultaneously."
The beginning of the end.
Really. You start with 40x24 characters and after a little span of time end up doomscrolling.
> "you won't be continually paying for access to this growing software library."
Well... the apple used to be sweet and has turned pretty sour over the years...
Expandable to 65K. I don’t recall seeing SI units used in this context until hard disk manufacturers did it years later.
I know people are rightly amazed by Woz’s engineering prowess, but it’s fascinating to see Steve’s fingerprints all over the Apple I. Look at the product commitments and they’ll ring a bell:
- It’s all in one
- Hassle-free to set up
- Something that usually doesn’t work (cassette board) now just works
They rightly identified the hobbyist market (I want to tinker) was actually the smaller market within a larger one. Seems obvious in hindsight. It wasn’t obvious then.
Makes me wonder who printed their motherboards early on
Interesting to think that:
>If Microsoft never bailed Apple out, this wouldn't be on the front page today
>If Apple didn't have the greatest marketing team of all time and nail the iPod commercial, this wouldn't be on the front page today
>If Apple had charged competitive prices for the iPhone, rather than making it a Veblen good, this wouldn't be on the front page today.
If I could only consider how much luck is involved in life, it might make setbacks feel better.
Which is another myth that needs to die. Apple had a couple of billion in the bank from a loan that they secured and they lost much more than the measly $250 million that Microsoft invested. Not to mention that Apple
At $666.66 this must have been a diabolic deal!
It would be a cancellable offense today
~$3,800 in 2026 dollars.
Why, for $3800, you can now get a brand new Apple computer with a million times the RAM!
This was because Woz liked repeating digits.
https://youtu.be/pJif4i9NRdI @2:05
Including 8K of "RAM memory", brought to you by the DRD Department!
More devilish
Same thing.
Not really. The Apple I was discontinued within a year of release, if you saved that money until 1978 then you could get an Apple II that would be supported for almost 20 years give-or-take.
Part of the reason the Apple I is so rare, is that Apple offered an Apple I trade in program. Apple would destroy the boards of Apple Is that were traded in for Apple IIs.
* Not that there were really many to begin with.
What was the reasoning behind that?
Probably to reduce support costs.
I recall my junior high school had only Apple IIs in 1995.
But a very real one if you'd bought it and kept it until now.
Even better, what if I had invested that money in Apple stock instead? :)
I used to muse that if I'd put the money I spent on computer gear back in the day into woodworking tools instead, I'd not only have a bigger, better shop than Norm Abram, all of the tools would probably still work.
Hard to believe that all this (product and ad) was done by kids barely out of their teens.