Already finding myself using this multiple times a day to help speed up my workflow.
Someone’s messaged me to confirm a dinner date?
key shortcut + 'add this to my cal' = done
Follow me here if you want to see how this develops and what I experiment with. macOS first, and iOS eventually too.
Sure to be a fun journey (especially trying to do something similar on iOS!).
Also wondering: how can this help users who have accessibility needs?
Natural language + real-time voice + action/interaction chaining = potential to control the whole OS conversationally.
Thinking about adding “memory” too — recall what you’ve seen across apps you approve.
stuff like:
“what was that recipe dina sent last week?”
Permissioned, scoped, and useful.
Lots left to do: faster processing, streaming responses, and a more optimised data format and upload flow.
Also adding more tools: alarms, note search, and filesystem access. Maybe even hooking directly into the Intents/actions that Shortcuts.app uses.
“What does ‘IUO’ mean here?”
Context-aware answers based on what you’re actually looking at — documents, emails, code, etc.
“Save Dina’s birthday to her contact and send a quick happy birthday message.”
One ask. Done.
“Add this to my calendar”
No copy-pasting, no explaining, no tapping around — just ask.
Been building a macOS app that does a lot of what Apple Intelligence promised — but actually available now.
It can 'see' what you're looking at and take action across apps. It can hook into your contacts, calendar, SMS, and email to varying degrees. Only with your permission, of course.
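For the calendar side, a minimal sketch of what a permission-gated flow could look like on macOS with EventKit (the event details here are hypothetical, standing in for whatever the app parses out of a message):

```swift
import EventKit

let store = EKEventStore()

// Ask the user for calendar access first; nothing happens without consent.
store.requestAccess(to: .event) { granted, error in
    guard granted, error == nil else { return }

    // Hypothetical event pulled from "add this to my cal" on a dinner message.
    let event = EKEvent(eventStore: store)
    event.title = "Dinner with Dina"
    event.startDate = Date().addingTimeInterval(60 * 60 * 24)        // tomorrow
    event.endDate = event.startDate.addingTimeInterval(60 * 60 * 2)  // 2 hours
    event.calendar = store.defaultCalendarForNewEvents

    try? store.save(event, span: .thisEvent)
}
```

Just a sketch of the shape of it, not the app's actual implementation. On macOS 14+ this would use the newer `requestFullAccessToEvents` API instead.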