User:Sidstamm/Notes July 2014 SOUPS
Revision as of 20:33, 17 July 2014
These are not the greatest notes, but the main takeaways are covered.
I was unable to attend all sessions, so there are some papers presented at SOUPS I did not summarize below.
Main conference site: http://cups.cs.cmu.edu/soups/2014/
Day 1: Workshop (Privacy Personas and Segmentation)
tl;dr: People can't agree on best ways to segment large populations by privacy posture or needs.
Urban & Hoofnagle: The Privacy Pragmatic as Vulnerable.
This work critiqued Alan Westin's segmentation (Fundamentalists/Pragmatists/Unconcerned) and suggested people's concerns are related to how informed they are. "Perhaps underinformed individuals are vulnerable since many privacy pragmatists are underinformed."
The authors claimed to find some logical flaws in Westin's segmentation. Pragmatists are simply whoever is neither Fundamentalist nor Unconcerned -- really the catch-all segment. And there's a gap between what consumers understand about data flows and what they want (their preferences).
The authors:
- Tested how informed each of Westin's segments was, and found Fundamentalists were significantly more informed about privacy risks
- Found that all groups reject information-intensive business models
- Reduced the segmentation to two segments, resilient and vulnerable, where Fundamentalists are resilient and everyone else is "vulnerable".
See their short paper for 10 suggestions on how to improve the segmentation. But better segmentation may be hard. Is it too hard? When is it useful?
They recommend Jennifer King's paper on this subject, as she ran many statistical tests on the authors' data (also in this workshop).
Soren Preibusch: Managing Diversity in Privacy Preferences: How to Construct a Privacy Typology.
The author here wants to reduce the complexity of peoples' privacy preferences -- make it easier for them to choose by kick-starting with typing.
Too often there are privacy/functionality trade-offs. People will often prefer functionality over privacy.
A good typology has (1) reliability and (2) predictive power. It means that people are classified into a type correctly and the type strongly indicates the individual's preferences.
The author clustered some survey respondents into a strangely arbitrary number of clusters. He also tried Factor Analysis to partition based on activities.
Nothing was reliable and predictive. No solution presented. Suggested a typology would be useful but only if done right with strong, reproducible science.
Author asserts that privacy preferences are strongly related to personality traits, which leads him to typing since personality typing is reliable and predictive.
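The two typology criteria can be made concrete with a toy check; the type names, re-test data, and preference labels below are invented for illustration and are not from the talk:

```python
# Toy check of the two typology criteria:
# (1) reliability      -- does a person land in the same type on re-test?
# (2) predictive power -- does the type predict their actual preference?
# All data here is invented; "guarded"/"open" are hypothetical types.
people = [
    {"type_t1": "guarded", "type_t2": "guarded", "prefers_privacy": True},
    {"type_t1": "open",    "type_t2": "open",    "prefers_privacy": False},
    {"type_t1": "guarded", "type_t2": "open",    "prefers_privacy": True},
    {"type_t1": "open",    "type_t2": "open",    "prefers_privacy": True},
]

# Fraction of people classified the same way both times.
reliability = sum(p["type_t1"] == p["type_t2"] for p in people) / len(people)

# Fraction of people whose type correctly predicts their preference.
predictive = sum((p["type_t1"] == "guarded") == p["prefers_privacy"]
                 for p in people) / len(people)
```

Per the talk, a typology is only useful when both numbers are high; the author found neither held in his data.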
Pam Wisniewski: Profiling Facebook Users' Privacy Behaviors
Pam defines privacy as how we manage social interactions. There are lots of things you can do in Facebook to manage boundaries and interactions. Nobody has yet analyzed disclosure decisions (what you share and when) in relation to people's privacy settings.
This author studied people's use and frequency of use for each privacy behavior (such as managing news feeds), then classified users based on their results. Did Confirmatory Factor Analysis of the results.
Next, she ran a few MFAs, chose six classes based on stats. The largest group in this clustering was "Privacy Balancers" (much like the pragmatists in Westin's segments). http://usabart.nl/chart/
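The classification step can be loosely sketched with a toy clustering pass; the behavior names, Likert-style values, and the k-means stand-in (the talk described factor-analysis-based classes, not k-means) are all assumptions for illustration:

```python
import random

# Hypothetical per-user frequencies (0-4) for three privacy behaviors:
# restricting the news feed, untagging photos, pruning friends.
# Names and data are invented, not from the paper.
users = [
    (0, 0, 1), (1, 0, 0), (0, 1, 0),   # low-engagement users
    (4, 3, 4), (3, 4, 4), (4, 4, 3),   # heavy privacy managers
    (2, 2, 2), (2, 1, 2), (1, 2, 2),   # in-between "balancers"
]

def kmeans(points, k, iters=20, seed=0):
    """Toy k-means, standing in for the paper's model-based classes."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # Assign each user to the nearest cluster center.
            i = min(range(k), key=lambda c: sum((a - b) ** 2
                                                for a, b in zip(p, centers[c])))
            groups[i].append(p)
        # Recompute each center as the mean of its members.
        centers = [tuple(sum(d) / len(g) for d in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

centers, groups = kmeans(users, k=3)
```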
Takeaways:
- Privacy strategies extend beyond disclosure decisions
- Studying feature awareness vs privacy behavior
- Decisions depend on more than awareness level
Kovila Coopamootoo: An approach to modeling privacy concerns and behavior via mental models
Attitude is evaluation of something that changes your behavior. (lost the train of thought during slide flipping)
Mental models are maps of cognition made up of cognitive associations. Time + context dependent. Can help with prediction and facilitate interaction with computer systems.
But mental models are not directly accessible. So the authors asked people questions (indirect ones) to extract peoples' attitudes towards social nets, things like IP addresses and bank accounts.
Mental models could help validate or identify segments -- new way to segment the population.
Lynn Coventry: Perceptions and Actions
This was a talk by a psychologist. Much of it went over my head.
- theory: Behavior is based on rational choice
- theory: behavior is planned
- theory: perceived threats and behavior as a result is "coping"
- theory: behavior is learned
- theory: change is a process.
Lynn wants to know what are the environmental, social and personal influencers on privacy decisions. Can we influence behavior based on product design?
Paper had a survey to determine risk groups based on behaviors.
Takeaway: While people said they intended to behave cautiously, the authors did not observe a difference in behavior.
Lydia Kraus : Privacy and Security Knowledge for influencing mobile protection behavior.
Premise: People lack understanding of mobile device security.
Two questions:
- Is knowledge related to concerns?
- Does knowledge lead to behavior change?
The authors studied smartphone (android) users
- 11 questions based on recommendations from various web sites
- coded answers for correctness and calculated score by summing results
- measured Global Info Privacy Concern by asking questions about privacy
- measured behavior based on questions about behavior
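The scoring and correlation steps can be sketched roughly as follows; the answer data, concern ratings, and the plain Pearson correlation are illustrative stand-ins for the authors' actual instrument and statistics:

```python
# Hypothetical data: each participant answered 11 knowledge questions
# (coded 1 = correct, 0 = incorrect) plus a 1-5 concern rating.
# All values are invented for illustration.
participants = [
    {"answers": [1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0], "concern": 2},
    {"answers": [0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0], "concern": 4},
    {"answers": [1, 1, 1, 1, 1, 0, 1, 1, 1, 0, 1], "concern": 3},
    {"answers": [0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0], "concern": 5},
]

def knowledge_score(answers):
    # Sum the correctly answered questions, as in the paper's coding.
    return sum(answers)

def pearson(xs, ys):
    # Plain Pearson correlation coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

scores = [knowledge_score(p["answers"]) for p in participants]
concerns = [p["concern"] for p in participants]
r = pearson(scores, concerns)   # near zero would match "not correlated"
```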
Takeaways:
- Knowledge and Concern are not correlated
- Behavior is influenced by both knowledge and concern.
(Author described limits to their methods including biased sample and questions)
Bart Knijnenburg: Information Disclosure Profiles for Segmentation and Recommendation.
Transparency and control are intended to empower BUT:
- Simple notices are useless and detailed ones are too complex
- Informing users makes them more wrong
- People want control but eschew the hassle
- Decision bias.
Many people lack the resources to navigate the privacy space. Privacy nudges are promising, but what is the right direction? Need to move beyond one-size-fits-all.
Idea: use a recommender system (like Netflix's) to find out what determines user choices.
Disclosure behaviors are multidimensional; profile users based on behavior, not attitude.
The authors clustered users based on types of disclosures (actions). Used Mixed Factor Analysis.
Idea: Privacy adaptation procedure: (1) predict behaviors, (2) provide tailored support when the prediction is uncertain.
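A minimal sketch of that two-step idea, with invented disclosure history and arbitrary uncertainty thresholds (none of this is from the paper):

```python
# Hypothetical disclosure history per item type: 1 = disclosed,
# 0 = withheld. Data, item names, and thresholds are illustrative.
history = {
    "location":  [1, 1, 1, 0, 1, 1],
    "contacts":  [0, 0, 1, 0, 0, 0],
    "interests": [1, 0, 1, 0, 1, 0],
}

def adapt(item, low=0.25, high=0.75):
    """Step 1: predict the disclosure decision from past behavior.
    Step 2: only ask the user when the prediction is uncertain."""
    p = sum(history[item]) / len(history[item])
    if p >= high:
        return "auto-allow"      # confident prediction: disclose
    if p <= low:
        return "auto-deny"       # confident prediction: withhold
    return "ask-user"            # uncertain -> provide tailored support

decisions = {item: adapt(item) for item in history}
```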
Maija Poikela: Locate! When users expose location.
What influences location disclosure?
- Who is requesting
- What is the reason
- Who I am
Previous studies have been hypothetical, these authors wanted data. The authors developed an app called Locate!. Participants would receive messages requesting location and the user could allow, allow "blurred" location, deny or "cheat" with a fake location. User could also set context like work or home to describe where they were.
Participants chose 6 names from their address books. Study spoofed requests from these people with reasons ("Where are you, need to see you ASAP at work", etc). Randomized the defaults (location, fuzz, etc) to identify deliberate actions in responses. Presented short questionnaire after each disclosure to identify why they disclosed what they did.
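The randomized-defaults trick can be sketched like this; the option names and request fields are assumptions, not the app's actual design:

```python
import random

# The study randomized which response was pre-selected so that a
# response matching the default can be separated from a deliberate
# choice. Option names here are assumptions.
OPTIONS = ["exact", "blurred", "deny", "fake"]

def make_request(requester, reason, rng):
    return {
        "requester": requester,
        "reason": reason,
        "default": rng.choice(OPTIONS),  # randomized default response
    }

def is_deliberate(request, response):
    # A response that differs from the default must be deliberate;
    # one that matches the default is ambiguous.
    return response != request["default"]

rng = random.Random(42)
req = make_request("coworker", "need to see you ASAP at work", rng)
```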
Authors are working on a new questionnaire.
Results: participants did not share more accurately with people they felt closer to. The opposite was true. Also asked if people used protection from threats for mobile. Those who said "yes" did not share precise location as often.
Higher education of subjects did not affect willingness to disclose. "Who" has no effect, nor does the reason for disclosure. Subject or context does have an effect.
Takeaway: This study indicates who requests a location disclosure (and why) has no effect, though context does, but they need more data.
Sebastian Schnorf : A Comparison of Six Sample Providers Regarding Online Privacy Benchmarks.
UX research at google.
Fielded a set of questions to different survey platforms, including mail and phone surveys.
Takeaway: Hard to get secretive people in a privacy-focused survey. Random samplers are better quality because of this.
Marc Busch: Is This Information Too Personal? Relationship between privacy concerns and personality.
Personality matters because it may influence design of a system. As in other talks, "one-size-fits-all" privacy fails.
Recent studies are specific and narrow.
Takeaway: Only 3.8% of the variance in privacy concerns is explained by personality traits.
Janine Spears : I have nothing to hide, thus nothing to fear.
What about the person who has no privacy concern?
D. Solove makes a case for why privacy matters even if you have nothing to hide. "Nothing to hide" is a myopic view that privacy equals secrecy. These people trust data collectors (blindly) and are unaware of the extent of tracking.
Instead, shift discussions to implications of over-disclosure (when data is not suppressed). What are the long-term implications of inadvertent shares? How do you educate the user? How do nudges work?
- How does this persona type affect others around him with different types?
- How quickly do this person's views change, like when there is something to hide?
To illustrate why privacy matters, start with a zip code and shopping list and then ask:
- What inferences can be made from this info?
- What are implications of these inferences?
Also, "Do you wear clothes?"
Day 2: SOUPS main track
Keynote: Chris Soghoian -- Sharing the blame for the NSA's dragnet surveillance program
This talk was about government spying on people/suspects. People don't buy things or generally expect to be "attacked" (I don't buy a laptop based on thinking I'll be raided by the FBI in the future).
Phones.
The Supreme Court ruled recently that we have a reasonable expectation of privacy on phones and other portable digital devices.
At the US border, authorities can inspect and image any of your devices (but not make you enter your password).
Mobile developers don't advertise security as a selling point. It's hard to weigh the security benefits of various apps when the developers don't say anything about how they secure things.
Apple did describe how its security works on iOS: with a PIN set, your device is encrypted -- strongly.
Apple and Google also have mechanisms to bypass any encryption with a warrant.
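To see why the PIN matters, here is an illustrative sketch (not Apple's actual scheme): a key derived purely from a 4-digit PIN falls to offline brute force, which is why iOS additionally entangles a hardware-bound per-device key:

```python
import hashlib

# Illustrative only, NOT Apple's actual design: derive a key from a
# 4-digit PIN with PBKDF2. Round count is kept low for the demo.
def derive_key(pin: str, salt: bytes, rounds: int = 1_000) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, rounds)

salt = b"per-device-salt"        # assumed salt, not hardware-bound
key = derive_key("1234", salt)   # hypothetical user PIN

# Without a secret tied to the device hardware, every 4-digit PIN can
# be tried offline against a captured disk image:
recovered = next(p for p in (f"{n:04d}" for n in range(10_000))
                 if derive_key(p, salt) == key)
```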
Desktop.
Windows limits which consumer versions (Home/Pro/Ultimate) get disk encryption. Windows 8.1 has it for all versions, but in the past the option was not packaged with Home. Apple offers it to all Mac OS X users. Defaults and incentives are not there to benefit the majority of people. This is default security for the rich.
We know how to fix this, but security isn't reaching poorer users.
Tech can protect us when the law can't. So we should have protection tech.
Mail.
GPG is not usable. Glenn Greenwald couldn't use it when he needed to protect a source.
Nothing has changed since "Why Johnny Can't Encrypt."
What about email subjects and attachment names? PGP doesn't help obfuscate these.
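A quick demonstration of that metadata leak: even with an encrypted body (placeholder ciphertext below, not real PGP output), the Subject header and the attachment's filename remain readable on the wire:

```python
from email.message import EmailMessage

# Build a message whose body is "encrypted" (a placeholder here) and
# inspect what still travels in cleartext. Addresses are hypothetical.
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "Q3 layoffs plan"   # not protected by PGP
msg.set_content(
    "-----BEGIN PGP MESSAGE-----\n...\n-----END PGP MESSAGE-----\n"
)
msg.add_attachment(b"...ciphertext...",
                   maintype="application", subtype="octet-stream",
                   filename="layoff-list.xlsx.pgp")  # filename leaks too

wire = msg.as_string()   # what a network observer or server sees
```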
Existing tools do not suit the needs of non-technical users. The market forces are against default/easy-to-use crypto.
- Data loss concerns
- Business model (data mining companies)
- government and law enforcement pressure
- Lack of market power in the orgs that want to make change