iOS SiriKit

Demystifying Siri, Part 7: Intents UI

At the end of Part 6 we finally reached our goal – our iOS device is able to solve a Countdown numbers game using voice alone. To round things off, let’s add a nice bit of UI that displays our calculation in NumberRace branding.

In Part 5 we created an Intents extension. At the same time we also created an Intents UI extension. Let’s open the project group up and take a look at the files therein.

Intents UI folder structure
Files in our Intents UI

Our folder structure looks pretty much like any standard bare-bones app – we have an Info.plist, a default storyboard, an entitlements file and a view controller.

Creating a view is pretty much the same as for any other app. We can connect text labels and other UI components to IBOutlets in our view controller. We’ll also add in some image assets from the main app – that means ensuring that we check the UI extension in the Target Membership list for the relevant NumberRace asset catalogs.

Intent View Controller.
Intent View Controller

The final thing we need is the code.

We have two UILabels in our view – a target and a solution – so let’s open up IntentViewController.swift and add them to our class:

class IntentViewController: UIViewController, INUIHostedViewControlling {
    @IBOutlet var lblTarget: UILabel!
    @IBOutlet var lblSolution: UILabel!

Setting the size

Next, we want to set our view to a particular height; otherwise our UI extension will take up much of the screen. Open up IntentViewController.swift and add the following computed property to the class:

    var desiredSize: CGSize {
        return CGSize(width: self.extensionContext!.hostedViewMaximumAllowedSize.width, height: 320)
    }

This will display our view with a height of 320 points – just enough for us to show the target number and our solution. Finally, let’s add the code that does the work:

    func configureView(for parameters: Set<INParameter>, of interaction: INInteraction, interactiveBehavior: INUIInteractiveBehavior, context: INUIHostedViewContext, completion: @escaping (Bool, Set<INParameter>, CGSize) -> Void) {
        guard let intent = interaction.intent as? SolveGameIntent else {
            completion(false, Set(), .zero)
            return
        }
        if interaction.intentHandlingStatus == .success {
            if let response = interaction.intentResponse as? SolveGameIntentResponse {
                lblTarget.text = String(Int(truncating: intent.target ?? 0))
                var spokenResult = response.spokenResult?.replacingOccurrences(of: ", ", with: "\n") ?? ""
                spokenResult = spokenResult.replacingOccurrences(of: "divided by", with: "÷")
                spokenResult = spokenResult.replacingOccurrences(of: "times", with: "×")
                spokenResult = spokenResult.replacingOccurrences(of: "plus", with: "+")
                spokenResult = spokenResult.replacingOccurrences(of: "minus", with: "−")
                lblSolution.text = spokenResult.replacingOccurrences(of: "equals", with: "=")
                completion(true, parameters, desiredSize)
                return
            }
        }
        completion(false, parameters, .zero)
    }

You’ll notice that I’ve written some code to convert the spoken result into mathematical notation. My original idea was to send a spoken result to Siri and a version with mathematical notation to the UI. I tried to add a mathematicalResult to my response in Intents.intentdefinition, but I can’t seem to use it unless it’s also spoken by Siri.

The Properties and Response Templates section of our SolveGame intent.
The “mathematicalResult” property can’t be accessed because it’s not mentioned in the response template

Apple’s documentation suggests adding an NSUserActivity with additional information, but the details are a little scant and I have a workaround that I’m reasonably comfortable with.
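Incidentally, the chain of replacingOccurrences calls in the view controller could be collapsed into a single table-driven helper. Here’s a sketch – the function name is hypothetical, but the phrase table mirrors the replacements used above:

```swift
import Foundation

// A sketch of a table-driven version of the spoken-to-mathematical
// conversion. The function name is hypothetical.
func mathematicalNotation(from spokenResult: String) -> String {
    let replacements: [(phrase: String, symbol: String)] = [
        (", ", "\n"),
        ("divided by", "÷"),
        ("times", "×"),
        ("plus", "+"),
        ("minus", "−"),
        ("equals", "=")
    ]
    return replacements.reduce(spokenResult) {
        $0.replacingOccurrences(of: $1.phrase, with: $1.symbol)
    }
}
```

This makes it easy to add further phrases later without another line of boilerplate per symbol.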

And we’re done! Not only do we have a spoken result, but a visual one too.

The UI for our Numbers game solver.
The numbers game solver UI.

Clicking anywhere in the UI will take us to the solver in the NumberRace app. You’ll recall that in Part 3 we implemented the ability to restore a user activity, so that our Solver opened automatically in our app in response to a shortcut. Clicking on the UI now triggers the same process, and we can now see our solution in the app itself.

And that’s it! There’s plenty more to explore in SiriKit but I think I’ve reached the limit of what I wanted to achieve. There were some rough edges but we’ve worked around them. There’s been some exploration and some head-scratching – hope you found it all useful.

iOS SiriKit

Demystifying Siri, Part 6: Enums in Custom Intents

At the end of Part 5 we came tantalisingly close to our goal of implementing a Countdown numbers game solver in Siri. But we hit a snag. Siri recognises the user saying “three” but not “eight”. What is going on here? Let’s take a look at the screenshots:

There’s a clue there. When I say “three”, Siri identifies my spoken input as a number 3 and moves on. When I say “eight”, Siri identifies my spoken input as the word “Eight” and doesn’t match any of the enum values. Could this inconsistency between words and numbers cause the issue? Time to get stuck into Apple’s documentation for SiriKit enums.

A screenshot of the entirety of Apple's SiriKit Enumerations documentation - 'no overview available'
SiriKit enumerations – or maybe not

Hmm. Plan B, then, is to try to fill in the gaps in the documentation.

How Siri resolves enumerations

When creating a new enum case, there are a few different fields to fill in. Which of them, if any, does Siri use to compare with your spoken phrase?

Let’s create a new enum case. The case value is .red but we’ll give it a display name of blue. We’ll also add a speakable match of green. I’ve added pronunciation hints for clarity but they shouldn’t be needed. What happens now if I say ‘red’, ‘green’ or ‘blue’?

A test enum case with identifier .red, display name blue and alternative speakable match green
A test enum

The answer is… nothing matches!

  • When I say ‘red’, Siri renders this as ‘Read’, which doesn’t match.
  • When I say ‘blue’, Siri can’t identify a matching case. To be fair, this is a display name only, though I have no idea why a display name needs a pronunciation hint.
  • When I say ‘green’, Siri can’t identify a matching case. I’d have expected this to match, given that this is described as an “Alternative Speakable Match”.

Ok, so perhaps the case value is a promising lead. Let’s tweak it slightly…

A test enum case with identifier .dishwasher, display name kettle and alternative speakable match spoon
Another test

What happens now? ‘Kettle’, ‘toaster’, ‘spoon’ and ‘knife’ don’t get matched, but ‘dishwasher’ does. Funnily enough, so does ‘dish’. And ‘washer’. But not ‘dishwasher tablets’.

Every day is a learning day

We can now therefore conclude the following:

  • Siri can identify an enum case if all or part of the enum value is spoken.
  • Setting the Display Name and Alternative Speakable Matches appears to have no effect on what is matched.
  • If Siri decides on a different interpretation of your spoken input (e.g. ‘red’ vs ‘read’ or ‘seven’ vs ‘7’) then there is no match.

(All of the above is current as of iOS 13.7.)

The lack of implementation of Alternative Speakable Matches is a surprise but can be confirmed by looking inside the custom generated class for InitialNumberResolutionResult – there’s no sign of our values. (Have a look for yourself by right-clicking on an instance of InitialNumberResolutionResult and selecting Jump to Definition.) A Google search for Alternative Speakable Matches turns up very little too, so maybe we’ve reached the seldom-visited frontier of SiriKit functionality.

So far, so frustrating. But we can use the partial matching of case values to our advantage. As a workaround, let’s change our case values to .one1, .two2, .three3 and so on. Does this work, I wonder?


Yes! It’s a little clumsy but finally we can match numbers consistently. And now we have a way of providing input parameters to Siri, and have Siri respond with a result. We’re there or thereabouts now – everything else is just window dressing.
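For reference, after the renaming the generated enum would look roughly like this. This is a sketch – the real type is generated by Xcode from the intent definition (as an @objc public enum), and the case list here is abbreviated:

```swift
// Sketch of the Xcode-generated InitialNumber enum after the rename.
// Each case name contains both the word and the digit, so whichever
// transcription Siri produces ("three" or "3") partially matches the
// case value. Raw values match the game numbers themselves.
enum InitialNumber: Int {
    case unknown = 0
    case one1 = 1
    case two2 = 2
    case three3 = 3
    // ...through ten10 = 10, then the large numbers:
    case twentyFive25 = 25
    case fifty50 = 50
    case seventyFive75 = 75
    case oneHundred100 = 100
}
```

Keeping the raw values equal to the game numbers also pays off later, when we pass `rawValue` straight into the solver.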

Talking of which, recall when we created our Intents extension, back in Part 5. We kept that Include UI Extension box checked – to round everything off, in Part 7, we’ll find out why.

iOS SiriKit

Demystifying Siri, Part 5: Intents Extensions

After a somewhat circuitous route we are now finally ready to implement our interactive voice-based interface to NumberRace. At the end of this part, Siri will ask us for a target number and six initial numbers and read out a solution.

Mea culpa

First, though, a confession. Is this really the best use of a voice interface? According to Apple’s Human Interface Guidelines, Siri Shortcuts should be used to accelerate common actions. The guidelines suggest designing intents that require as few follow-up questions as possible. We have seven parameters to complete before Siri is able to provide a solution. Are we really saving a user time by providing a voice-based interface with seven questions to answer? Well, in my defence, we could theoretically run our shortcut on a HomePod or Apple Watch so we’re making our solver available on more devices. We’ll press on and see how cumbersome our shortcut is in action.

Intents Extensions

Our app interacts with Siri through an Intents extension. This extension does the job of retrieving the responses from Siri, identifying missing parameters, and supplying the results of our intent.

We’ll be adding a new Intents extension target to our project shortly, but first there are a few things we need to do before we get coding. Firstly, we’ll create an App Group so that the app and the Intents extension can share resources between them. Use the Apple Developer Portal to create an App Group and assign it to one or more App IDs. Add the App Group to your project target by navigating to the Signing & Capabilities section of your target’s settings in Xcode.

Secondly, you’ll need to ensure that the Siri capability is enabled. While you’re in the Signing & Capabilities section, click on the + Capability button and select Siri from the list.

The + Capability button in Xcode
The + Capability button in Xcode

In my NumberRace app, I have a Solver class that does all the calculations needed to generate a solution. It’d be great if this code could be shared by both my app and the Intents extension. I’ve therefore moved this class into a framework to save having to include it in both targets.

I won’t trouble you with the steps involved in moving your code into a framework as there are plenty of tutorials online. I used this one from Ray Wenderlich.

A new target

Next, go to File > New > Target… and select Intents Extension. Then click Next.

The New target dialog box in Xcode with Intents Extension selected
Intents extension

Let’s call our extension NumberRaceIntents and we’ll check the Include UI Extension box while we’re at it.

Next, add our App Group to the NumberRaceIntents and NumberRaceIntentsUI targets.

You’ll recall that in Part 3 we spoke about our .intentdefinition file being translated into custom generated classes. We’ll need to ensure that our new extensions have access to these classes. Click on Intents.intentdefinition and, in the right hand pane under Target Membership, ensure that our new extensions are checked and that Public Intent Classes is selected for each.

The Target Membership pane of Intents.intentdefinition, with Public Intent Classes selected for our three targets

Finally, we need to specify the list of intents that our extension will handle. Go to the Target settings, and in the General section, add SolveGameIntent to the list of supported intents.

The Supported Intents section of our Intents target settings.
The Supported Intents

Writing some code

Returning to our Intents extension, a default IntentHandler.swift has already been created. In this default handler, we’ll check whether our SolveGame intent has been invoked and use a Solve Game-specific handler to perform all the tasks we need. We’ll create that specific handler in a moment, but for now, let’s change our intent handler code to the following:

class IntentHandler: INExtension {
    override func handler(for intent: INIntent) -> Any {
        // If you want different objects to handle different intents,
        // you can override this and return the handler you want for that particular intent.
        guard intent is SolveGameIntent else {
            fatalError("Unhandled intent type: \(intent)")
        }
        return SolveGameIntentHandler()
    }
}

Our handler will react to requests from Siri when it needs assistance in handling our intent. This means:

  • Resolving intent data – letting Siri know that data is missing or invalid so that Siri can ask the user for clarification
  • Handling the request once all data has been resolved, passing the results back to Siri for reading out

Let’s start with the first of those items – resolving data.

Resolving data

Let’s create a new file in our Intents target, SolveGameIntentHandler.swift. We’ll start off by resolving our initial numbers. Add the following code to our file:

import Foundation
import Solver

class SolveGameIntentHandler: NSObject, SolveGameIntentHandling {
    func resolveInitialNumber(_ initialNumber: InitialNumber) -> InitialNumberResolutionResult {
        if initialNumber.rawValue != 0 {
            return InitialNumberResolutionResult.success(with: initialNumber)
        } else {
            return InitialNumberResolutionResult.needsValue()
        }
    }

    func resolveNumber1(for intent: SolveGameIntent, with completion: @escaping (InitialNumberResolutionResult) -> Void) {
        completion(resolveInitialNumber(intent.number1))
    }

Here we’ve written a function to resolve an initial number, resolveInitialNumber(), and we’re using it to resolve our first initial number, number1. If we have a valid initial number then the rawValue of our InitialNumber enum is greater than 0 and we can send a success result to Siri’s completion handler. All is well. If the rawValue is 0, then Siri wasn’t able to match our spoken input to one of the enum values, and so we send a needsValue result; this tells Siri to ask the user again for a number.

We can resolve numbers 2-6 in the same way, so I won’t trouble you with the duplication.
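Each of the remaining resolvers simply forwards its own parameter to the shared resolveInitialNumber() function – resolveNumber2 is representative of the other five:

```swift
    func resolveNumber2(for intent: SolveGameIntent, with completion: @escaping (InitialNumberResolutionResult) -> Void) {
        completion(resolveInitialNumber(intent.number2))
    }
```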

Resolving the target is a similar process:

    func resolveTarget(for intent: SolveGameIntent, with completion: @escaping (SolveGameTargetResolutionResult) -> Void) {
        if let target = intent.target {
            completion(SolveGameTargetResolutionResult.success(with: Int(truncating: target)))
        } else {
            completion(SolveGameTargetResolutionResult.needsValue())
        }
    }

If the target exists then we can send the success result. Otherwise our intent needs a value and Siri will ask one more time.

Handling data

We have all our data, and it’s valid – what now? The next stage is to handle the data, perform the calculation, and pass the result, in the form of one or more parameters, to Siri. First, though, we need to define those result parameters. In our app’s target let’s reopen Intents.intentdefinition. Click on the Response item on the left. We’ll add two strings as our response properties –

  • howManyAway which contains a description of how close the solver came to solving the game, and
  • spokenResult which describes the solution.

If the solving process was successful we can add those parameters to our success response template, as per the screenshot below.

The solve game intent response, showing result parameters and the spoken phrase on success.
Our Solve Game intent response.

Let’s make use of those parameters now. Let’s open our Solve Game intent handler and add the following piece of code. It’s not an elegant piece of code, but hey.

    func handle(intent: SolveGameIntent, completion: @escaping (SolveGameIntentResponse) -> Void) {
        let (closest, result) = Solver.helper.getResult(numbers: [
            intent.number1.rawValue,
            intent.number2.rawValue,
            intent.number3.rawValue,
            intent.number4.rawValue,
            intent.number5.rawValue,
            intent.number6.rawValue
        ], target: Int(truncating:!))

        let howManyAway = closest == 0 ? "I've found a solution" : "I could only find a solution \(closest) away"
        completion(SolveGameIntentResponse.success(howManyAway: howManyAway, spokenResult: result))
    }

The principle is the same as before – sending a success result to Siri’s completion handler when the request is handled. Here we have a Solver class that does all the work in generating a solution, and we pass the howManyAway and spokenResult parameters to Siri in our success result.

Lost in translation

Let’s try it out!

Siri asks for our first number – let’s say 3:

A Siri conversation where a number is selected and a new number is requested
Our first number is recognised…

So far so good. Siri is now asking for a second number – let’s say 8:

A Siri conversation where some input has been given but the same number is requested
…but our second number is not

Eh? Siri is asking us for the second number again. This is a problem – what is going on here and how do we fix it? In Part 6 we’ll journey into the dark heart of Apple’s documentation and attempt to bridge the gaps we find therein.

iOS SiriKit

Demystifying Siri, Part 4: Suggestions

In Part 1 of this series we were resigned to the fact that a user wasn’t immediately able to make use of Siri to solve numbers games using NumberRace. Our user has to go through the rigmarole of creating a shortcut and adding a spoken phrase before Siri is even aware of this functionality. You’ll recall that this is a problem with custom intents; Siri’s inbuilt system intents, such as booking a ride or playing music, do not suffer from this problem. Is there anything we can do to reduce the friction in creating a shortcut?

Let us suppose that our hypothetical user is a fan of TV game show Countdown and uses our app to solve numbers games while it is broadcast. Wouldn’t it be great if we could automatically provide a prompt at that time every weekday? Fortunately we can, using suggestions.

Creating intent suggestions

You’ll recall that in Part 1 we discussed that it is possible, as a developer, to notify Siri that a user has carried out a particular activity in your app. Siri is then able to make predictions about when an activity is likely to be used and then present it as a suggestion on the lock screen or the search screen of the user’s phone at the appropriate time.

We’ll create some suggestions by re-opening our Intents.intentdefinition file. Select the SolveGame custom intent and scroll down to Suggestions.

We’ll create three types of suggestion:

  • a suggestion with the target and six initial numbers populated
  • a suggestion with the target populated only
  • a suggestion with no parameters populated

Firstly, ensure that Intent is eligible for Siri Suggestions is checked. Then we’ll add our suggestions – each with our different parameter combinations. For each suggestion type, click on the + sign below the Supported Combinations box. Enter the parameter combinations and give a summary for each. When all is done we should end up with something resembling the screenshot below.

Donating intents

Next, we need to inform Siri whenever the solver is used in our app. In order to do this we need to ensure a donation occurs whenever the user taps on the ‘Solve’ button. Let’s get this done by creating a function:

    func donateSolveGameIntent() {
        if #available(iOS 12.0, *) {
            let intent = SolveGameIntent()
            let interaction = INInteraction(intent: intent, response: nil)
            interaction.donate(completion: nil)
        }
    }

If we have iOS 12 and above, we can create a SolveGameIntent, populate it with a target and initial numbers (although here we haven’t done that) and donate it to Siri.

When the user presses ‘Solve’ in the solver, we ensure that this function is run.

    @IBAction func solveScreen() {
        donateSolveGameIntent()
        // ...code to run the solver
    }

It’s important that, when donating an intent to Siri, the intent data in the donation matches a parameter combination in the intent definition. If a matching parameter combination is not found then a suggestion will not be made.

We’ve left our intent data blank because it’s very unlikely that a user is going to run the solver with the same numbers and target each time. We could have populated our intent data with a target and initial numbers, but there’s no benefit in doing so. But if there were some regular pattern to our intent data, Siri could recognise it and come up with a suggestion tailored to our needs, using the most relevant parameter combination we defined in our intent definition file.

For example, let’s suppose we use a banking app to manually make a regular payment to the milkman every week. The payee stays the same but the amount varies each time. The app donates that intent with the payee and amount data. Siri is able to recognise that the payee is the same, discard the amount, and display a generic “Pay the milkman” suggestion on our phone’s lock screen.
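In code, that donation might look something like the following sketch. The PayBillIntent class and its payee and amount properties are hypothetical – stand-ins for whatever the banking app defined in its own .intentdefinition file:

```swift
import Intents

// Hypothetical: a banking app's custom intent, with payee and amount
// parameters defined in its intent definition file.
func donatePayment(payee: String, amount: Double) {
    let intent = PayBillIntent()
    intent.payee = payee
    intent.amount = NSNumber(value: amount)
    let interaction = INInteraction(intent: intent, response: nil)
    interaction.donate { error in
        if let error = error {
            print("Donation failed: \(error.localizedDescription)")
        }
    }
}
```

Donate this every time the user pays, with the real amounts, and Siri takes care of spotting the recurring payee.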

Let’s see if that works…

Donating intents in action

Fortunately we don’t have to wait an age while Siri collates all our new intent donations. We can make a change to our settings to see suggestions as they happen. Open the Settings app, then select Developer. Scroll down to Display Recent Shortcuts and ensure that this setting is enabled. Also ensure that Display Donations on Lock Screen is enabled.

The Developer settings screen
The Developer settings screen

Open NumberRace, then our solver, then click on Solve. The next time our lock screen appears, we have a suggestion! This suggestion also appears in our Shortcuts app, where we can attach a custom phrase. Or we can navigate to the Siri & Search section of the Settings app to do the same thing.

A Siri suggestion - Solve a numbers game - on a lock screen
A Siri Suggestion

Having a conversation

We’ve travelled quite far on our journey, but we still haven’t reached our destination – for Siri to ask us for numbers and to read out a solution. We’ll do this in Part 5. Will we then finally reach our goal? (Spoiler alert: this is a six, maybe seven, part series, so no.)

iOS SiriKit

Demystifying Siri, Part 3: Restoring User Activity

In Part 2, we created a custom intent in SiriKit to allow us to open our NumberRace app when our Solve Game shortcut is invoked. Next, we’re going to update NumberRace so that our solver is opened and populated with the data we provided.


Firstly our application needs to respond to the invocation of an intent. In order to do this we need to implement the application(_:continue:restorationHandler:) method in our application delegate. In this method we identify which intent was invoked and pass any data to one or more view controllers to process. We’re finally going to do some coding! Open up AppDelegate.swift and add the following:

    func application(_ application: UIApplication, continue userActivity: NSUserActivity, restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
        guard userActivity.activityType == "SolveGameIntent" else {
            return false
        }
        guard let window = window,
            let rootViewController = window.rootViewController as? SwitchViewController else {
                return false
        }
        restorationHandler([rootViewController])
        return true
    }

My root view controller is the main menu screen and I’ve decided that it should be the lucky recipient of the intent data. Why? Well it’s the only view controller that I’m sure will be instantiated in our app at all times. From there we’ll open my solver’s view controller, predictably named SolverViewController, and pass in our data at the same time.

A segue about segues

An aside: the NumberRace app has a number of view controller scenes – scenes that appear while playing the game, scenes for changing settings, and so on. The scenes are connected with a number of UIStoryboardSegues. I’ve written a helper class called ViewCoordinator that handles the task of opening our solver view controller (SolverViewController) no matter where you left the app. You may be in the middle of a game, or in a settings screen – ViewCoordinator attempts to perform and unwind the segues necessary to open the solver.

The ViewCoordinator code is outside the scope of this blog post series. (By which I mean that I am too embarrassed to share it as it’s not the most elegant thing I’ve written. If I knock it into shape I’ll update this blog post with the code.)

Restoring user activity state

When the line

restorationHandler([viewController1, viewController2, ...])

is executed in a continueUserActivity function, all the restoreUserActivityState(_ activity: NSUserActivity) functions in the view controller list are run.

Therefore, for us, we now need to add such a function to our root view controller.

    override func restoreUserActivityState(_ activity: NSUserActivity) {
        if #available(iOS 12.0, *) {
            guard let intent = activity.interaction?.intent as? SolveGameIntent else {
                return
            }
            ViewCoordinator.helper.solverData = (
                number1: intent.number1.rawValue,
                number2: intent.number2.rawValue,
                number3: intent.number3.rawValue,
                number4: intent.number4.rawValue,
                number5: intent.number5.rawValue,
                number6: intent.number6.rawValue,
                target: Int(truncating: intent.target ?? 0)
            )
            ViewCoordinator.helper.openSolver()
        }
    }

The code here is pretty straightforward. Firstly, we’ll check that our custom intent is of type SolveGameIntent before proceeding. We’ll then add our intent data to ViewCoordinator. Finally, by calling my openSolver() function our ViewCoordinator will handle all the segues that need to be performed to bring our solver into view.

I’ve also created a SolverData data type. I could’ve passed around the SolveGameIntent intent as is, but custom intents are only supported in iOS 12 and above. It seemed more straightforward to convert the data into a tuple instead of adding iOS version checks everywhere in the code.
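For reference, SolverData is nothing more than a typealias for a tuple – a sketch, with field names matching how the data is used:

```swift
// A plain tuple type for passing solver parameters around without
// referencing the iOS 12-only intent classes. (Sketch: the typealias
// name and field names match how SolverData is used in the post.)
typealias SolverData = (
    number1: Int, number2: Int, number3: Int,
    number4: Int, number5: Int, number6: Int,
    target: Int
)
```

Because it’s just a tuple of Ints, any view controller can consume it without an availability check.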

Incidentally, when you create an intent definition file, Xcode translates this into custom generated code behind the scenes. You can see this generated code by right-clicking on SolveGameIntent appearances in your code and selecting Jump to Definition. I thought it worth pointing out here in case you were wondering how Xcode is aware of intent classes.

Trying it out

And that’s all we need! Let’s try it out and create a shortcut…

A screenshot of the Shortcuts app, displaying an example Solve Game shortcut
A Solve Game shortcut

If we run the shortcut by pressing Play, the application continueUserActivity function is activated, our root view controller’s restoreUserActivityState() function is called and then openSolver() opens our solver and populates the values.

The NumberRace Solver screen with the correct fields populated.
The NumberRace Solver screen

So far so good! Our restoration handler is working as expected. But how can we publicise the fact that this functionality is available to a user? The answer lies in suggestions, which we’ll cover next, in Part 4.

iOS SiriKit

Demystifying Siri, Part 2: Creating a Custom Intent

In Part 1 of our series on exploring Siri’s capabilities we discussed the possibility of creating a custom intent to make some of our app’s functionality accessible via Siri. At the end of this part we should be able to access our shortcut through the Shortcuts app, and by speaking a custom phrase in Siri.

Disclaimer: at the time of writing, Xcode 11.6 and iOS 13 are current. Things may change in future versions!

Creating a new custom intent

I have an Xcode project called NumberRace. In order to create a custom intent we need to add a SiriKit Intent Definition File to this project. From the Xcode menu, select File > New > File… and, from the dialog box that appears, select “iOS” from the top and, in the “Resource” area, select SiriKit Intent Definition File. (In the diagram below, it’s on the right hand side in the middle row.)

The File chooser in Xcode.
Choosing the SiriKit Intent Definition File

This file can hold one or more intents so you can save the file in your preferred location as Intents.intentdefinition. There are no intents defined at the moment so let’s remedy that by clicking on the + sign at the bottom left.

A blank intent definition file
A blank intent definition file

Choose New Intent from the context menu.

The New Intent menu
Choosing a New Intent

We can now begin defining our custom intent. We’ll call it SolveGame – changing the name in the left hand pane has nicely updated the title in the right hand pane.

We now need a category. Remember that Siri only supports certain verbs in specific categories, for example Order, or Show, and choosing a category here influences how Siri talks about your intent. We’ll use a generic verb for now. Run will do. We’ll add a description here too.

Finally we don’t need user confirmation. If we were making a payment through an intent, a confirmation step is needed to make sure that the amount and payee are correct and there are no nasty financial slip-ups. In our case, there’s no harm done if any of the information is incorrect, so we can leave this box unchecked.

The Custom Intent section
Custom Intent properties


We now need some parameters. To enable the solver to do its job, we’ll need:

  • A target number – a three digit integer between 100 and 999;
  • Six initial numbers – each number can be either a smaller one from 1 to 10, or a larger one – 25, 50, 75, 100
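These constraints can be captured in a small validation helper – a sketch for illustration only, not part of the intent definition itself:

```swift
// The valid pool of initial numbers: 1-10 plus the four large numbers.
let validInitialNumbers = Set(1...10).union([25, 50, 75, 100])

// A sketch of the game's constraints: a three-digit target and six
// numbers drawn from the valid pool.
func isValidGame(target: Int, numbers: [Int]) -> Bool {
    return (100...999).contains(target)
        && numbers.count == 6
        && numbers.allSatisfy { validInitialNumbers.contains($0) }
}
```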

Our target number is easy to define. Click on the + sign and type target. Its type is Integer and it is user-facing. We can supply default, minimum and maximum values, and a phrase to request the parameter when we eventually hook our intent up to Siri’s voice assistant.

The Parameters section of our custom intent

But what about our initial numbers? We could use six integers to represent the six initial numbers, but we need to be a bit more restrictive than that as the user should only be able to select the numbers 1-10, 25, 50, 75 or 100. I’m going to opt for an enum so that the user can choose only those valid values. This may be a bit cumbersome but we’ll try it and see how it works.

Creating an enum

Click on the + sign at the bottom left of the pane, and select New Enum. Give it a name of InitialNumber. Next, create all the cases – one per possible value. I may live to regret this, but anyway that’s how I’ve done it, as pictured in the screenshot below.

For each possible number we have a case .value<x> with an index matching our number. You’ll notice that I’ve also added a pronunciation hint. I’m not sure this is needed as I hope Siri can pronounce numbers!

Let’s move back to our Solve Game intent and add our initial number parameters. We’ll make each initial number parameter user-facing, so that the user can supply the value in Shortcuts.

We’ve updated the Siri Dialog too, ready for when we have our interactive voice interface.

Our parameters fully completed with initial numbers and targets
Our completed parameters

Configuring Shortcuts

The next section determines how our custom intent appears in the Shortcuts app. We’re going to make the intent user-configurable, which means that the user can choose to run the shortcut with some, all or none of our parameters pre-filled in.

For example, a user could create a shortcut to always solve a numbers game with a target of 342. The shortcut would prompt the user for the initial number values before handing control over to NumberRace.

Or the user could populate all the values in the shortcut. In that case, when the shortcut is run, the solver is opened with the same values each time. Not that there’s much point to that, but still – it’s possible.

We’ll add a summary – type in Solve a numbers game with a target of target using number1, number2, number3, number4, number5 and number6. Ensure that when you type the words in bold that you select the parameter from the box that appears. Bingo – we have an intent that can be used in the Shortcuts app!

The Shortcuts App section of our custom intent
Specifying our shortcut

Incidentally, Apple’s Human Interface Guidelines suggest avoiding punctuation in the summary – something we haven’t quite managed here. So eventually we may have to come up with another suitably pithy sentence. That’s for another day though.

We should now have enough information to test our intent. Let’s build our project, open Shortcuts and see if it works.

Open the Shortcuts app and click on the Create Shortcut button.

The New Shortcut screen in the Shortcuts app
Creating a shortcut in Shortcuts

Click on Add action and select our app. Success! Our Solve Game intent is listed.

The NumberRace app in the Shortcuts app
Choosing an intent

Select that action and we can configure it how we wish.

The NumberRace solve game intent awaiting some parameters
Configuring our intent

Let’s choose a target number and ask the user every time for the initial numbers…

Our configured Solve Game intent in the Shortcuts app
Our completed shortcut. Note the speech bubble next to some of the parameters – Shortcuts will ask for those values each time the shortcut is run

Clicking on the Play button will take us through our shortcut and, sure enough, we get asked for some initial numbers…

The Initial Numbers dialog box in the Shortcuts app
Choosing a number

The NumberRace app opens… and gets stuck on the main menu.

The NumberRace menu screen

Progress! Sort of!

We’ve created a shortcut and it gathers the correct values. But our app doesn’t know what to do next. It’s time to write some code. We’ll do this next, in Part 3.
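As a rough preview of the kind of code Part 3 will cover: when our shortcut hands control to the app, the intent arrives wrapped in an NSUserActivity. The sketch below assumes the generated SolveGameIntent class from our intent definition; the parameter names (target, number1 and so on) and the presentSolver helper are hypothetical placeholders for our own code.

```swift
import Intents
import UIKit

// A sketch (not the final Part 3 code): unwrap the Solve Game intent
// from the user activity that launched the app, then hand its values
// to a hypothetical presentSolver helper on the app delegate.
extension AppDelegate {
    func application(_ application: UIApplication,
                     continue userActivity: NSUserActivity,
                     restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
        // The interaction carries the intent that Shortcuts gathered for us.
        guard let intent = userActivity.interaction?.intent as? SolveGameIntent else {
            return false
        }
        // Pass the gathered target and initial numbers to the solver screen.
        presentSolver(target: intent.target,
                      numbers: [intent.number1, intent.number2, intent.number3,
                                intent.number4, intent.number5, intent.number6])
        return true
    }
}
```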

iOS SiriKit

Demystifying Siri, Part 1: Introduction

The year 2020 is full of surprises – me rummaging through Siri documentation is one of the smaller ones. I had an idea for a new feature in one of my iOS apps and it quickly turned into an expedition into all things Siri. To save you embarking on a similar journey, I thought I’d attempt to share what I’ve learned – and the missteps along the way – using a real-world app as a case study.

About NumberRace

Back in 2010, I released NumberRace, a fast-paced version of the Numbers Game round from mid-afternoon UK game show Countdown. Over the years I’ve intermittently added new features as time allows, including support for Auto Layout and Apple Watch.

This year, during The Situation We Find Ourselves In, I revived the app and finally got round to implementing a solver. Given a target number and six initial smaller numbers, the solver would attempt to find a way of reaching the target number by adding, subtracting, multiplying and dividing the initial smaller numbers.
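The solving algorithm itself boils down to a search. As a rough illustration of the idea (not NumberRace’s actual implementation), a brute-force solver can repeatedly pick two numbers, combine them with one of the four operations, and recurse – keeping intermediate results to positive whole numbers, as the Countdown rules require:

```swift
// A minimal brute-force sketch of a Countdown-style solver: pick two
// numbers, combine them with +, -, × or ÷, and recurse on the reduced
// list until the target appears. Returns the steps taken, or nil if
// no exact solution exists.
func solve(_ numbers: [Int], target: Int, steps: [String] = []) -> [String]? {
    if numbers.contains(target) { return steps }
    guard numbers.count > 1 else { return nil }
    for i in 0..<numbers.count {
        for j in 0..<numbers.count where i != j {
            let (a, b) = (numbers[i], numbers[j])
            // Remove the two chosen numbers (higher index first).
            var rest = numbers
            rest.remove(at: max(i, j))
            rest.remove(at: min(i, j))
            // Only operations yielding positive whole numbers are allowed.
            var candidates: [(Int, String)] = [(a + b, "\(a) + \(b)")]
            if a > b { candidates.append((a - b, "\(a) - \(b)")) }
            candidates.append((a * b, "\(a) × \(b)"))
            if b != 0 && a % b == 0 { candidates.append((a / b, "\(a) ÷ \(b)")) }
            for (value, step) in candidates {
                if let solution = solve(rest + [value], target: target,
                                        steps: steps + ["\(step) = \(value)"]) {
                    return solution
                }
            }
        }
    }
    return nil
}
```

With only six numbers the search space is small enough that this naive approach is workable, though a real implementation would prune duplicate states for speed.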

The solver in NumberRace, showing a completed solution
The NumberRace numbers game solver

With the solver in place, inspiration struck. Wouldn’t it be great if Siri could take you through the solving process interactively? Siri could ask for the target number and the initial numbers, and read out the solution. A few weeks ago I started to investigate how to achieve this goal.

Siri defined

I usually think of Siri as ‘that thing that replies when you talk to your phone’, but it may be more helpful to consider Siri as ‘a virtual assistant that helps make actions available when you need them’. There are a number of different features accessible through the set of developer APIs known as SiriKit, not all of them voice-related. Exploring some of these features may help us towards our goal. Here goes.


The Shortcuts app

A shortcut is a link to an action (or, in Apple terminology, an intent) that can be handled by your app. In the case of NumberRace, we could create a ‘Solve Game’ intent that can be used to present the solver immediately to the user. A user can add this intent to the Shortcuts app and use it to present the solver. Furthermore, a user can associate a shortcut with a spoken phrase in the Shortcuts app so that the solver could be presented by saying ‘Hey Siri, Solve Game’. Or even ‘Hey Siri, Do a Clever Number Thing’ if the user decided to choose a flashier spoken phrase to open the shortcut.

But all this assumes that the user is familiar enough with the Shortcuts app. How else can we make our solver more accessible to the user?


Donations and suggestions

As a developer, you can configure your app to notify Siri that a user has carried out a particular activity. These notifications are called donations. In our NumberRace app, for example, every time a user clicks on our ‘Solve game’ button, the app can donate a ‘Solve Game’ intent to Siri. Siri keeps track of these donations and works out whether there’s a regular pattern to them. It can then predict when the user is likely to make use of our intents and present this as a suggestion on the lock screen or the search screen. Our hypothetical user may be a fan of Countdown who watches the show at the same time every weekday. In that case, Siri may make the ‘Solve Game’ suggestion available on the user’s lock screen at the time when Countdown is broadcast.
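As a sketch, a donation might look like the following. INInteraction and its donate method are the real SiriKit API; SolveGameIntent is the class that would be generated from our intent definition.

```swift
import Intents

// A minimal sketch of donating our intent to Siri, e.g. called from
// the 'Solve game' button handler. SolveGameIntent is the class Xcode
// generates from our intent definition.
func donateSolveGameIntent() {
    let intent = SolveGameIntent()
    intent.suggestedInvocationPhrase = "Solve Game"

    let interaction = INInteraction(intent: intent, response: nil)
    interaction.donate { error in
        if let error = error {
            // Donations can fail, e.g. if Siri suggestions are disabled.
            print("Donation failed: \(error.localizedDescription)")
        }
    }
}
```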

A Siri suggestion on the iOS lock screen
A Siri suggestion on the lock screen of an iPhone

Turning suggestions into shortcuts

Can we convert suggestions into shortcuts? That is, can we attach a custom spoken phrase to a suggestion so that we can access it through Siri’s voice interface? The answer is yes, but the way of doing this is somewhat cumbersome.

A developer can add an ‘Add to Siri’ button in the app’s UI. Clicking on this button will prompt the user for a spoken phrase and, once provided, a shortcut is created. For my app, I could add an ‘Add to Siri’ button on the solver screen and this will take the user through the process of adding the Solve Game intent as a shortcut. I’m not sure that adding the button, at a cost of some real estate and added complexity to the UI, is worth the trouble.
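For the record, wiring up the button is straightforward – INUIAddVoiceShortcutButton is the real IntentsUI control for this, and the sketch below assumes our generated SolveGameIntent class:

```swift
import IntentsUI
import UIKit

// A sketch of placing an 'Add to Siri' button on the solver screen.
// Tapping it walks the user through recording a phrase for the intent.
func addSiriButton(to view: UIView) {
    let button = INUIAddVoiceShortcutButton(style: .blackOutline)
    // Wrap our intent in an INShortcut so the button knows what to add.
    if let shortcut = INShortcut(intent: SolveGameIntent()) {
        button.shortcut = shortcut
    }
    button.translatesAutoresizingMaskIntoConstraints = false
    view.addSubview(button)
    button.centerXAnchor.constraint(equalTo: view.centerXAnchor).isActive = true
    button.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor,
                                   constant: -20).isActive = true
}
```

The view controller hosting the button would also adopt INUIAddVoiceShortcutButtonDelegate to present the add and edit view controllers Siri provides.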

Alternatively, the user could perform the same task manually, by navigating to ‘Siri & Search’ in the Settings app and recording a custom phrase there. Again, this doesn’t feel particularly frictionless, relying on the user to visit the Settings app to perform this task.


Intents

We’ve been using the term intents a lot already, but what are they exactly? Intents enable developers to use Siri to provide an interactive voice interface to their apps. They come in two flavours: system intents, and custom intents.

System intents

Apple introduced system intents in iOS 10. System intents are limited to handling common tasks such as sending messages, paying bills, or playing media content. (There’s a full list of tasks over at Apple’s Human Interface Guidelines site.) When a user says, for example, ‘Hey Siri, use WhatsApp to send a message’, Siri is able to interpret this as a Send Message system intent. Siri guides the flow of conversation, asking for more information interactively and reaching out to the target app to supply the data.

Unsurprisingly, there isn’t a system intent for solving a numbers game. That’s where custom intents come in, and these were introduced in iOS 12.

Custom intents

Custom intents can be written to handle actions not covered by system intents. This gives us a little more freedom to do what we want. However, a custom intent needs to belong to a particular category, so that Siri is able to respond using the correct verbs. For example, setting the “Order” category will ensure that Siri says order-related things during the interaction. (Again, the full list of categories is available at the Human Interface Guidelines site.)

Again, there’s no category specific enough for our solver, but there is a handy generic category. We’ll use the verb “Run” in that category for now.

Intents and shortcuts

There is a drawback, however, when it comes to custom intents. While Siri can easily interpret sentences related to system intents, such as ‘use WhatsApp to send a message’, the same isn’t true of custom intents. Even though we specify the dialogue needed to handle our custom intent, Siri cannot currently invoke it without extra work on the user’s part. The user needs to create a shortcut first and attach a specific phrase to it; speaking that phrase then invokes the custom intent.


So where does that leave us? My desired end goal is already unachievable. It is not possible for a user, after installing the app, to say, ‘Hey Siri, use NumberRace to solve a numbers game’ and have Siri interpret the phrase ready for NumberRace to do its thing.

I’m therefore going to have to revise my expectations downwards – the user needs to create a shortcut to access the NumberRace solver through Siri. It’s not ideal, but we can also use suggestions to present the solver to the user – assuming of course that the suggestion is surfaced in the first place. We’ll be looking at suggestions in more detail in Part 4.

First steps

We now have the first steps of a vague plan. First, we’ll create a custom intent to implement a shortcut. We’ll have a crack at this in Part 2, and see how far it takes us.