No one likes to admit when they’re wrong. That’s true for you and me, and it’s especially true for big companies like Apple. The thing is, when you’re willing to admit when you made a mistake, it goes a long way towards building trust. And trust is, by far, your brand’s most valuable asset.
Today, Apple apologized for how it had handled recorded snippets of users’ voice interactions with Siri, the company’s digital assistant. In a statement, the company said that “we realize we haven’t been fully living up to our high ideals, and for that we apologize.”
You might remember that Apple, like pretty much every other tech company, recently admitted that it used contractors to listen to and transcribe these recordings in an effort to improve the artificial intelligence-powered service. Making matters worse is the fact that the company hadn’t disclosed this practice, and contractors often heard false activations that revealed personal information and other private conversations.
Earlier this month, Apple paused its review program and ended its relationship with the contractors involved. Now, it appears to be taking the next step, which started with an apology.
That’s actually pretty remarkable. It’s not often that companies say, “I’m sorry. We messed up.” Sure, they sometimes say a lot of words that vaguely sound like “I’m sorry,” but rarely are they this direct. Apple basically called itself out, saying that it wasn’t living up to its own standards, and that it owed customers an apology for a problem it caused.
Along with the apology, maybe the even bigger news here is that Apple announced a series of steps it plans to take moving forward, including:
- The company will no longer retain recorded Siri interactions, but will use computer-generated transcripts instead.
- Apple will allow users to opt in to having their audio samples included in the company’s efforts to improve the product. Users will also be able to opt out at any time after that.
- Apple will only allow its employees (not contractors) to listen to audio samples, and will delete any “inadvertent trigger” of Siri.
This is a big deal for a lot of reasons, but mostly because Apple will now allow users to ‘opt in.’ This is exactly how it should work.
There are perfectly legitimate reasons why Apple would want to listen to recorded snippets of Siri interactions. That’s one of the only ways it can really know how accurate the AI is at understanding user requests and providing the right information, since it takes a human to review and correct the results. I don’t know anyone who would argue that’s unreasonable.
But Apple is changing the default from an unspoken opt-in to one where people are given the choice to participate, instead of simply offering some opaque way of opting out. Companies offer opt-out because they know most people won’t go to the trouble of changing whatever the default setting is, meaning people stay in whether they really want to or not.
Every tech company handling sensitive data should do exactly this. Don’t just let people opt out, or delete their history, or make a request to no longer be recorded. Make the default position the thing that’s best for the user, even if it makes your job a little harder.
Then, make your case for why your practice is worth it to the customer, and let them decide to participate or not.