Resist Google’s Attempts to Make You a Robot
We’re training machines to think this is how humans actually converse
Users of Gmail—and there’s a good chance that’s you—have noticed an “upgrade” in the service recently: the opportunity to respond to a message with one of a few suggested short phrases. You may have been on the receiving end of such an autoreply already, and perhaps you have used one too. They’re easy, convenient, often intuitive, and can go a long way toward reducing inbox clutter. A few articles have even appeared celebrating the new feature.
There are certainly times when a short programmatic response to an email does the trick. Email threads near the end of their useful life are best killed off efficiently—when all you need to do is accept or reject a simple proposal, confirm a time or place of meeting, or indicate that you have completed a task. But these situations by no means represent a majority of emails, and giving in to the temptation to treat more complicated or nuanced emails the same way doesn’t often have a happy ending.
Here’s an email I received from a friend a few weeks ago:
Gmail suggested I respond to this message in one of the following ways:
I can think of circumstances in which I might want to use one of these responses: for example, if my intention were to permanently alienate my good friend, or to signal that my descent into dementia had begun. But, on that day, neither seemed quite on message, so I responded the way a friend should—with a considerate and sympathetic expression of my thoughts and feelings about this development in his life.
This is an extreme case, but it illustrates the cluelessness of Google’s algorithmic approach to the complexities of human communication. The choices it presents will probably improve over time, because you will be helping to train them. Even if you choose not to use one of Gmail’s autoreplies, you still supply a useful datapoint for Google’s gargantuan heap of big data—one that says, “well, that didn’t work.”