We launched our first research paper today: Lying and Hiding in the Name of Privacy (PDF here) by Mary Hodder and Elizabeth Churchill.
Our data supporting the paper is here: Addendum Q&A. Shortly we’ll upload an .xls of the data for those who want to do a deep dive into the results.
We all know that many people hide information, submit incorrect data, click away from sites, or refuse to install an app on a phone. Most of us have done it. But how many people? How often is this happening?
We’re at IIW today, and of course the age-old dilemma is playing out in sessions: one guy in the room says, “People will click through anything; they don’t care about privacy.” The next says, “People are angry and frustrated, and they don’t like what’s happening.” But what’s real? What’s right?
We conducted this survey to get a baseline about what people do now as they engage in strategies to create privacy for themselves, to try to control their personal data.
The amazing thing is: 92% hide, lie, or refuse to install or click, at least some of the time. We surveyed 1,704 people and had an astonishing 95% completion rate for this survey. We also had 35% of respondents writing comments in the “comment more” boxes below the multiple-choice answers, which is also astonishingly high.
People expressed anger, cynicism, and frustration. And they said overwhelmingly that the sites and services that ask for data DON’T NEED it, except when something has to be shipped from a seller. People don’t believe the sites; there is distrust. The services have failed to convince the people they want as users that anything necessary is happening with that data, and the people who use the services are mad.
We know the numbers are high, likely because many people have had no way to give feedback on this topic. So when we offered the survey, people vented.
But we think it also indicates the need for more qualitative and quantitative research on what is true now for people online. We want more nuanced information about what people believe and how we might fix this problem. Many sites look only at user logs to figure out what is happening on a site or with an app, and so they miss this problem and the user feelings behind it. We want to see this studied much more seriously, so that people stop making conflicting statements at conferences, so that developers stop claiming users don’t care, and so that business models are developed that think differently from the current one, where sites and services simply take personal data. We want to get beyond the dispute over whether people care, to real solutions that involve customers and individuals in ways that respect them and their desires when they interact with companies.
5 comments
People as global citizens need to be given access to a robust platform for participating truthfully in ongoing global research of important matters, with no commerce introduced bias and with strong legal endorsement of privacy laws for individual safety: Smart, mindful, transparent use of such Big Data would be a global change agent for the better. It has to be established despite the resistance that would be posed by those who rely on hidden information for personal income, power and status.
I agree that services ask for too much personal data and that people often give false information in response. But the URL below describes what can happen when a site offers a fully anonymous, privacy-protecting ability to say whatever you want. Food for thought.
http://www.dailydot.com/news/anthony-stubbs-ask-fm-teen-suicides/
This is great stuff! In fact, it is very much in line with our recent findings. We looked as to why people refrain from disclosing or start faking their details. Unlike the Customer Commons study, we relied on an experiment rather than a survey. It turns out that fairness has a consistent and significant effect on the disclosure and truthfulness of data items such as weekly spending or occupation. Partial support was found for the effect of effort and sensitivity.
Malheiros, Preibusch, and Sasse. “Fairly truthful”: The impact of perceived effort, fairness, relevance, and sensitivity on personal data disclosure. To appear in: 6th International Conference on Trust & Trustworthy Computing (TRUST 2013). http://preibusch.de/publications/Malheiros-Preibusch-Sasse__effort-fairness-relevance-sensitivity_disclosure_draft.pdf
Thanks, UK Researcher! Very interested in reading the paper and checking out your focus on experimenting with getting good or bad data and engendering trust.