John, MacStories’ Managing Editor, has been writing about Apple and apps since joining the team in 2015. He also co-hosts MacStories’ podcasts, including AppStories, which explores the world of apps, MacStories Unwind, a weekly recap of everything MacStories and more, and MacStories Unplugged, a behind-the-scenes, anything-goes show exclusively for Club MacStories members.
Apple published new developer docs explaining how apps can integrate with Apple Intelligence’s text summarization and priority classification of emails and messages thanks to new Core Spotlight APIs in iOS 18.4 and macOS 15.4. (Link)
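For context, the integration described in those docs builds on Core Spotlight’s standard donation flow: an app indexes its messages as searchable items so the system can work with them. Here’s a rough sketch of what donating a message looks like; the identifiers (unique ID, domain, message text) are made up for illustration, and the iOS 18.4-specific summarization and priority attributes described in Apple’s docs aren’t shown.

```swift
import CoreSpotlight
import UniformTypeIdentifiers

// Describe the message's content so the system can index it.
// (.message is a standard UTType; the values below are placeholders.)
let attributes = CSSearchableItemAttributeSet(contentType: .message)
attributes.title = "Lunch plans"
attributes.textContent = "Want to grab lunch at noon tomorrow?"
attributes.authorNames = ["Alex"]

// Wrap the attributes in a searchable item with a stable identifier.
let item = CSSearchableItem(
    uniqueIdentifier: "message-42",
    domainIdentifier: "com.example.messages",
    attributeSet: attributes
)

// Donate the item to the on-device index.
CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}
```

Apps that already donate content this way are presumably well positioned for the new Apple Intelligence hooks, since the system reads from the same index.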
Legendary Apple designer Susan Kare, known for creating the original Macintosh icons, has released a collection of 32 new retro-inspired designs as physical keycaps and pendants in collaboration with Asprey Studio. The only issue: with a starting price of $650, this may be too rich even for Stephen. (Link)
As we wait for Siri to gain the features promised last June by Apple, I thought it would be worthwhile to look further into the future. I’ve been thinking beyond Siri in part because I’m skeptical that Apple will be able to pull off what was shown off last year. The technical challenge alone is immense and will require a massive leap forward in Siri’s capabilities, far beyond any past update in my memory. But the challenge runs deeper, to something that may be even harder to solve: Siri just hasn’t been very good for years. People have formed habits. They know they can get the weather, set a timer, or turn on the lights with Siri, but I’d wager that most people don’t do much more than that. I sure don’t.
We should know later this year whether Apple can pull off its plans for Siri, but I think it’s worth considering what could be next for Apple Intelligence. On this week’s episode of AppStories, I mentioned that I expect AI to fade into the background of products the same way APIs do for regular users. AI, whether it’s been branded like Apple Intelligence or not, is really a feature that can play different roles in different contexts.
The latest app by indie developer Sindre Sorhus is an extremely simplified text editor for iPhone and iPad that embraces minimalism as its core feature. As the name implies, the app is a barebones window where you can type in plain text. (It’s worth noting that Plain Text Editor uses the new document browser in iOS 18, which we also previously saw in Runestone.) The app supports common file formats like .txt, .md, and .csv, and it includes options for line length and word counts. What caught my attention is the app’s ‘Brain dump mode’, which lets you write without the ability to delete or edit text, helping you stay focused on just getting words down on the page. I don’t know if that’s the sort of thing that can work for me, but I continue to be fascinated by Sindre’s experiments with apps.
My Markdown database of every article ever published on MacStories.
The age of AI will reward digital hoarders in ways that search never has.
As a computer user, I grew up on search. As a law student at the dawn of the '90s, I cut my teeth on Westlaw and Lexis terminals. Back then, both services – enormous databases of court decisions and related materials – could only be accessed via thin client terminals. I’d type in a query and wait as the screen filled line by line with my search results. It was a slow process, so being good at search paid off.
Peek is a new iPhone app that lets developers track their App Store sales data and trends. The app integrates with App Store Connect and displays key metrics like active subscriptions, revenue trends, and top-performing apps through a widget-based UI (both inside the app and with actual widgets on the Home Screen). What caught my attention is how Peek handles data visualization: you can view sales information in your local currency, set custom time ranges, and receive notifications when new data comes in. It’s worth a look if you’re an Apple developer.
After one year of ownership, Castro’s developers share a behind-the-scenes look at the improvements made to the podcast app’s backend infrastructure, including faster feed updates and increased reliability. (Link)
In a fascinating deep dive into their Apple Health data export, Kieran Healy discovers nearly eight million entries of personal health metrics collected by the iPhone and Apple Watch since 2018. Stay until the end for the graphs. (Link)
The Apple Watch is a close cousin of the Apple TV in my gadget life. By that, I mean that I don’t tinker with it or think about it much. I rely on both for a handful of features that work well enough to meet my needs.
With my original Apple Watch Ultra, the features I rely on are notifications and health and fitness tracking. I keep up with important notifications from my Watch and appreciate knowing what’s happening without having to always look at my iPhone. I also use it to track my sleep and exercise.
I’m a research nerd. I love chasing down details about whatever interests me, diving into the minutiae of a topic. However, the trouble with research is that it’s also easy to get lost in unproductive rabbit holes and waste a lot of time. That’s especially true when the topic is something I know little or nothing about. For example, a recent research session that started with the goal of finding technical specifications for Sony PlayStation Portable (PSP) models turned into a meandering tour of subreddits and PSP modding websites.
That got me thinking about how I could optimize the way I handle this sort of early-stage research. Wandering the Internet is fun, but I don’t always have the time for it. So I started exploring ways I could avoid distractions, which led to a quest for greater flexibility in my research workflow. What I ended up with was a collection of web services and apps across the Mac, iPhone, and iPad, with Google Gemini at the center.