In an era when laptops take pictures, phones track your movements, and digital assistants listen for instructions at home, people are increasingly worried about the sensors they are letting into their lives.
“If you see how many users are doing things like putting tape over the cameras on their laptops, that suggests there's something we can do to help them feel more comfortable,” says Mary Baker, a senior researcher in HP’s Immersive Experiences Lab.
In response, Baker has been leading an effort at HP Labs to understand what exactly people are concerned about when it comes to interacting with today’s digital devices and to imagine ways in which those concerns might be addressed.
She began with a survey of HP consumers from a wide range of backgrounds, asking them to describe their attitudes toward the smart digital assistants that are gaining popularity with families across the world.
“To me that was a good place to focus because it's a new technology, so a lot of people are thinking about why they might or might not want to adopt it,” Baker notes. “What I found was that while the top reason for not buying an assistant was that people weren’t sure they really needed one, the second biggest was all about security and privacy – the word ‘creepy’ came up in lots of the comments.”
Indeed, it became clear that many people worry that these devices are enabling something or someone to listen in on them or see them without their knowledge.
That spurred a follow-on study in which Baker interviewed a smaller group of users in depth about their attitudes toward sensing technologies and challenged them to create simple prototypes of devices that would assuage their concerns.
“We wanted to know what it might take for people to just look at a device and know intuitively how private they are with respect to it,” says Baker. “Is it obvious to them how they would control it? Can they trust those indicators and controls?”
Significantly, interviewees said they could not trust an LED “recording” indicator. Instead, they preferred solutions that physically blocked a sensor or separated it from the device, making it evident that the sensor was not currently in use.
“So while tech companies spend a lot of time trying to hide sensors, users might prefer us to make their behavior more obvious,” Baker suggests.
These insights clearly have implications for any company interested in creating devices that users trust to protect their privacy, and Baker and her HP Labs colleagues have been sharing their findings with HP’s product groups. Most recently, Baker, along with Jim Mann from the Office of the Chief Engineer and Cath Sheldon from Customer & Market Insights, led a workshop for teams from each of the company’s major business units. The workshop, sponsored by Chief Engineer Chandrakant Patel, offered an opportunity to discuss and share information about the design features most likely to reassure users, and it has already prompted new inventions around sensor privacy.
She and her colleagues Eric Faggin and Hiro Horii have also shared a variety of conceptual sensor solutions developed by Immersive Experiences Lab engineers in response to her survey research. These include microphone units that must be physically manipulated before they work and clasps that cover cameras when not in use.
While considerations like complexity and manufacturing cost are always major determinants of final designs, teams across HP now have a better understanding of how consumers are likely to respond to sensors in future HP devices.
“We want the users’ experience with HP products to be associated strongly with protection of their privacy,” Baker says. “That’s what this research is all about.”