When someone experiences sexual harassment in the workplace, one of the hardest things to do afterwards is report it to HR. Shame and fear of retribution are often to blame.
Julia Shaw thinks the answer is to talk to a machine.
Earlier this year she and two engineers from Switzerland and Germany created Spot, an online bot that asks a harassment victim about their experiences, using the so-called cognitive interview technique.
It asks about when and where an experience of harassment happened, who did it, and whether there were any witnesses. The answers can be anonymous, or include names. Shaw’s team keeps the information on their system for up to 30 days before deleting it, in case a victim wants to come back for it.
Once a user has spent about 10 minutes or more talking to the bot, it collates their answers into a PDF with a formal-looking cover sheet, which can be emailed to human resources or anyone else they choose.
In the roughly three weeks since Shaw’s team launched the bot, around 220 people have chatted to it about their experiences, while more than 3,000 have visited the TalkToSpot site.
“We are very proud of these figures,” says Shaw, who developed the tool with co-founders Daniel Nicolae and Dylan Marriott. “They represent a great starting point. We are now proactively working to increase the reach of our tool.”
There are other online services for reporting harassment, including the STOPit app that employees can use to report harassment incidents, or Callisto, an online platform for college students to file an encrypted report about harassment. Callisto uses an escrow-style approach of being a neutral third party for victims to communicate through, which Spot will eventually do too, Shaw says.
As a chat bot, Spot might be better than humans at asking the right questions, she adds. It uses the same technique police use to help victims of crime remember what happened and is “the first cognitive interview bot.”
Spot is pretty young. Shaw co-founded it in July 2017 at All Turtles, a startup “studio” founded by former Evernote CEO Phil Libin.
She first met Libin at a tech conference in Britain last summer. Intrigued by her work, including a book she wrote about false memories called “The Memory Illusion,” he invited her to San Francisco to work with engineers who were interested in using artificial intelligence to help people build their own memory palaces.
Shaw says that it is easy today for those in authority to inadvertently mislead harassment victims when asking questions, particularly when it comes to emotional events. Part of her work involves training judges and lawyers “not to ask leading questions, but how to ask the right questions.”
Asking the right questions matters, and simply writing an incident down after it happens can be an important early step, says Sarah Chilton, a partner at London employment law firm CM Murray. “A lot come to us and say ‘X, Y and Z happened to me,’ and you say, ‘Did you make a note of what happened?’ And they say, ‘No.’”
Often people don’t want to go through the process of writing things down because the incident was so traumatic, Chilton adds.
Answering questions from a neutral and inanimate chatbot might make that easier.
Chatbots were all the rage in Silicon Valley from early 2016, when Facebook announced its bot platform for Messenger, but the trend has since gone through the same “hype cycle” as other new forms of tech, and fizzled out a little.
Startups and developers back then designed bots for games, transactions, weather reports and news. (At Forbes, we even launched our own, short-lived, news bot on Telegram.)
Developers have since realised that bots are more useful when they facilitate some kind of conversation. Eugenia Kuyda, the co-founder of bot startup Luka, for instance, originally designed her company’s bot to give people restaurant recommendations.
“Then we realised the best way to find a restaurant is a graphical interface,” she says. “The less steps in conversation, the better.”
For employers and legal firms alike, anything that makes it easier to report harassment “is a good thing,” says CM Murray’s Chilton, who has seen myriad harassment cases but admits that what does get reported is the tip of a large iceberg.
Since launching Spot, Shaw has been fielding calls from large employers interested in making a similar service available to their own staff for processing verifiable reports of harassment. She sees a potential business model for Spot in a tool for companies that is “coming this summer.”
“It’ll use analytics and pattern recognition, to figure out the health of your corporate culture,” she explains. “Which parts of your company are you getting the most reports from? You’re getting a lot of reports from Tuesday at 2pm. The analytics and tracking of what’s going on in companies is hugely popular because AI and tech can pick up patterns that humans can’t.”
“If an employer could identify a pattern where one employee’s name keeps coming up, that’s useful information and allows them to do something about it,” says Chilton.
But in many cases, harassment victims will want to stay anonymous, which means HR can’t ask the alleged perpetrator to respond to any specific allegations — and that makes it difficult to properly investigate their claim, she adds. “If they don’t have the victim’s name, there’s a limit to what they can do… For us to seek redress for someone we’d need to tell a company who they are.”
That may be a problem even technology can’t solve.
Published in Forbes. By Parmy Olson.