TECH&SPACE

Siri’s Multi-Request Trick: Finally Catching Up to 2018

Published: Apr 12, 2026 at 02:05 UTC · Cupertino, United States · macrumors.com

  • Siri’s new multi-command parsing mirrors 2018 Google Assistant
  • Apple Intelligence repackages old AI promises with iOS 27
  • Real-world utility hinges on unproven NLP advancements

Apple’s Siri is getting a feature competitors rolled out years ago: the ability to handle multiple requests in one query. According to Bloomberg (via MacRumors), iOS 27 will let users chain commands like “Get directions to the museum and text them to Alex”—something Google Assistant demonstrated in 2018. The move is part of Apple Intelligence, the company’s belated push to inject generative AI into its ecosystem after years of playing catch-up.

The hype machine will call this a leap forward. The reality? It’s a basic functionality upgrade masked as innovation. Current Siri can’t even parse “Set a timer for 10 minutes and play jazz”—a task Amazon’s Alexa has handled since 2017. Apple’s framing this as an efficiency win, but the real question is whether it’ll work seamlessly or just add another layer of “Sorry, I didn’t get that” frustration.
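The core problem is deceptively simple to sketch and hard to do well. A toy illustration of multi-intent handling—splitting a compound utterance on conjunctions and tagging each clause with keyword rules—shows the shape of the task. This is not Apple’s implementation; production assistants use trained NLP models precisely because naive splitting breaks on phrases like “peanut butter and jelly.” All names here are invented for illustration.

```python
import re

# Hypothetical keyword-to-intent table; a real assistant would use
# a trained intent classifier, not substring matching.
INTENT_KEYWORDS = {
    "timer": "set_timer",
    "play": "play_media",
    "directions": "navigate",
    "text": "send_message",
}

def split_commands(utterance: str) -> list[str]:
    # Naively split a compound query on "and" / "then" between clauses.
    parts = re.split(r"\s+(?:and|then)\s+", utterance.lower())
    return [p.strip() for p in parts if p.strip()]

def classify(clause: str) -> str:
    # First keyword hit wins; everything else is "unknown".
    for keyword, intent in INTENT_KEYWORDS.items():
        if keyword in clause:
            return intent
    return "unknown"

query = "Set a timer for 10 minutes and play jazz"
for clause in split_commands(query):
    print(clause, "->", classify(clause))
# set a timer for 10 minutes -> set_timer
# play jazz -> play_media
```

Even this crude version surfaces the hard part: deciding where one intent ends and the next begins, and doing it reliably across accents, noise, and ambiguous phrasing—which is exactly where Siri has historically stumbled.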

Early signals suggest the feature relies on under-the-hood NLP improvements, but Apple’s track record with Siri’s accuracy is spotty at best. The demo-to-deployment gap is wide, and Cupertino has said nothing about release dates or device compatibility. If this is the marquee AI upgrade for iOS 27, the bar was already on the floor.

The feature everyone else had years ago, now with Apple’s signature delay

The competitive angle is clearer than the tech. Apple isn’t just late—it’s entering a market where Google’s Gemini and Microsoft’s Copilot have spent years refining multi-step workflows. Even Samsung’s Bixby (yes, Bixby) supports follow-up commands without re-triggering the wake word. Apple’s advantage? A captive audience of iPhone users who’ve tolerated Siri’s limitations for over a decade. The risk? That tolerance may finally wear thin if this feels like another half-baked beta feature.

Developers aren’t holding their breath. GitHub threads and Apple’s own forums show skepticism about Siri’s backend improvements, with many noting that Shortcuts already offer clunky workarounds for multi-step tasks. The real test isn’t whether Siri can handle two commands—it’s whether it’ll understand them correctly on the first try, in a noisy café, while your AirPods are at 30% battery.

If this works as advertised, it’ll be a modest quality-of-life bump. If it doesn’t, it’ll be another data point in the ‘Apple’s AI is always next year’ timeline. The company’s WWDC 2024 promises for Apple Intelligence were heavy on demos, light on deployment details. History suggests caution.

Tags: Siri · Apple · Virtual Assistant · Natural Language Processing · Voice Recognition