Schneier taxonomy of social networking data

See below for 'A Revised Taxonomy of Social Networking Data'...

I'm not sure the 'disclosed' vs 'entrusted' distinction (in terms of
'your pages' vs others' pages) is very sustainable given the
complexity of rules and APIs on different sites, but the intent is
clear, and the general approach seems both useful and timely...

Dan


---------- Forwarded message ----------
From: Bruce Schneier <schneier@schneier.com>
Date: Sun, Aug 15, 2010 at 7:46 AM
Subject: CRYPTO-GRAM, August 15, 2010
To: CRYPTO-GRAM-LIST@listserv.modwest.com


                CRYPTO-GRAM

              August 15, 2010

             by Bruce Schneier
     Chief Security Technology Officer, BT
            schneier@schneier.com
           http://www.schneier.com


A free monthly newsletter providing summaries, analyses, insights, and
commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit
<http://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at
<http://www.schneier.com/crypto-gram-1008.html>.  These same essays
and news items appear in the "Schneier on Security" blog at
<http://www.schneier.com/blog>, along with a lively comment section.
An RSS feed is available.


** *** ***** ******* *********** *************

In this issue:
    A Revised Taxonomy of Social Networking Data
    News
    WikiLeaks Insurance File
    NSA and the National Cryptologic Museum
    Schneier News
    Book Review: How Risky Is It, Really?


** *** ***** ******* *********** *************

    A Revised Taxonomy of Social Networking Data



Lately I've been reading about user security and privacy -- control,
really -- on social networking sites. The issues are hard and the
solutions harder, but I'm seeing a lot of confusion in even forming
the questions. Social networking sites deal with several different
types of user data, and it's essential to separate them.

Below is my taxonomy of social networking data, which I first
presented at the Internet Governance Forum meeting last November, and
again -- revised -- at an OECD workshop on the role of Internet
intermediaries in June.

1.  Service data is the data you give to a social networking site in
order to use it. Such data might include your legal name, your age,
and your credit-card number.

2.  Disclosed data is what you post on your own pages: blog entries,
photographs, messages, comments, and so on.

3.  Entrusted data is what you post on other people's pages. It's
basically the same stuff as disclosed data, but the difference is that
you don't have control over the data once you post it -- another user
does.

4.  Incidental data is what other people post about you: a paragraph
about you that someone else writes, a picture of you that someone else
takes and posts. Again, it's basically the same stuff as disclosed
data, but the difference is that you don't have control over it, and
you didn't create it in the first place.

5.  Behavioral data is data the site collects about your habits by
recording what you do and who you do it with. It might include games
you play, topics you write about, news articles you access (and what
that says about your political leanings), and so on.

6.  Derived data is data about you that is derived from all the other
data. For example, if 80 percent of your friends self-identify as gay,
you're likely gay yourself.
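
As a rough illustration of how the six categories differ -- along the
two axes of who created the data and who controls it once it exists --
here is a minimal Python sketch.  The category names come from the
taxonomy above; the creation and control assignments are my own
illustrative assumptions, since, as noted below, real sites grant
different rights for each data type.

    from enum import Enum

    class SocialData(Enum):
        SERVICE = "service"        # data you give the site in order to use it
        DISCLOSED = "disclosed"    # what you post on your own pages
        ENTRUSTED = "entrusted"    # what you post on other people's pages
        INCIDENTAL = "incidental"  # what other people post about you
        BEHAVIORAL = "behavioral"  # what the site records about your activity
        DERIVED = "derived"        # what is inferred from all of the above

    # (created_by_you, controlled_by) -- illustrative assignments only;
    # real sites assign these rights differently, as discussed below.
    PROPERTIES = {
        SocialData.SERVICE:    (True,  "the site"),
        SocialData.DISCLOSED:  (True,  "you"),
        SocialData.ENTRUSTED:  (True,  "another user"),
        SocialData.INCIDENTAL: (False, "another user"),
        SocialData.BEHAVIORAL: (False, "the site"),
        SocialData.DERIVED:    (False, "the site"),
    }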

There are other ways to look at user data. Some of it you give to the
social networking site in confidence, expecting the site to safeguard
the data. Some of it you publish openly and others use it to find you.
And some of it you share only within an enumerated circle of other
users. At the receiving end, social networking sites can monetize all
of it: generally by selling targeted advertising.

Different social networking sites give users different rights for each
data type. Some are always private, some can be made private, and some
are always public. Some can be edited or deleted -- I know one site
that allows entrusted data to be edited or deleted within a 24-hour
period -- and some cannot. Some can be viewed and some cannot.

It's also clear that users should have different rights with respect
to each data type. We should be allowed to export, change, and delete
disclosed data, even if the social networking sites don't want us to.
It's less clear what rights we have for entrusted data -- and far less
clear for incidental data. If you post pictures from a party with me
in them, can I demand you remove those pictures -- or at least blur
out my face? (Go look up the conviction of three Google executives in
an Italian court over a YouTube video.) And what about behavioral data?
It's frequently a critical part of a social networking site's business
model. We often don't mind if a site uses it to target advertisements,
but are less sanguine when it sells data to third parties.

As we continue our conversations about what sorts of fundamental
rights people have with respect to their data, and more countries
contemplate regulation on social networking sites and user data, it
will be important to keep this taxonomy in mind. The sorts of things
that would be suitable for one type of data might be completely
unworkable and inappropriate for another.

This essay previously appeared in IEEE Security & Privacy.
http://www.schneier.com/essay-322.html

First version:
http://www.schneier.com/blog/archives/2009/11/a_taxonomy_of_s.html


** *** ***** ******* *********** *************

    News



The NSA's Perfect Citizen:  In what creepy back room do they come up
with these names?
http://www.schneier.com/blog/archives/2010/07/the_nsas_perfec.html

Someone claims to have reverse-engineered Skype's proprietary
encryption protocols and has published pieces of them.  If the crypto
is good, this is less of a big deal than you might think.  Good
cryptography is designed to be made public; it's only for business
reasons that it remains secret.
http://techcrunch.com/2010/07/08/skypes-innermost-security-layers-claimed-to-be-reverse-engineered/
or http://tinyurl.com/2744pf3
http://www.enrupt.com/index.php/2010/07/07/skype-biggest-secret-revealed
or http://tinyurl.com/2ewy55y

There's an embedded code in the U.S. Cyber Command logo, and it's been
cracked already.
http://www.wired.com/dangerroom/2010/07/solve-the-mystery-code-in-cyber-commands-logo/
or http://tinyurl.com/2fxy46w
http://www.computerworld.com/s/article/9179004/Researcher_cracks_secret_code_in_U.S._Cyber_Command_logo
or http://tinyurl.com/24m9kmd

Violating terms of service may be a crime.
http://www.schneier.com/blog/archives/2010/07/violating_terms.html

From the U.S. Government Accountability Office: "Cybersecurity: Key
Challenges Need to Be Addressed to Improve Research and Development."
Thirty-six pages; I haven't read it.
http://www.gao.gov/new.items/d10466.pdf

Two interesting research papers on website password policies.
http://www.schneier.com/blog/archives/2010/07/website_passwor_1.html

Interesting journal article evaluating the EU's counterterrorism efforts.
http://www3.interscience.wiley.com/cgi-bin/fulltext/123574424/PDFSTART

A book on GCHQ, and three reviews.
http://www.amazon.com/exec/obidos/ASIN/0007278470/counterpane/
http://www.theregister.co.uk/2010/06/15/gchq_review/
http://www.birminghampost.net/life-leisure-birmingham-guide/postfeatures/2010/07/05/privacy-terrorism-and-the-surveillance-secrets-of-gchq-65233-26790952/
or http://tinyurl.com/3xa5gnp
http://www.economist.com/node/16537028?story_id=16537028

More research on the effectiveness of terrorist profiling:
http://www.pnas.org/content/106/6/1716.full

Stuxnet is a new Internet worm that specifically targets Siemens WinCC
SCADA systems, which are used to control production at industrial
facilities such as oil rigs, refineries, and electronics plants.  The
worm seems to upload plant information (schematics and production
data) to an external website.  Moreover, owners of these SCADA systems
cannot change the default password, because doing so would break the
software.
http://news.cnet.com/8301-27080_3-20011159-245.html
http://www.pcworld.com/businesscenter/article/201468/eset_discovers_second_variation_of_stuxnet_worm.html
or http://tinyurl.com/2b3s7dz
http://www.scmagazineus.com/stuxnet-malware-threat-continues-targets-control-systems/article/175092/
or http://tinyurl.com/2ebphdx
http://blogs.computerworld.com/16578/first_true_scada_malware_detected
http://www.infoworld.com/d/security-central/siemens-warns-users-dont-change-passwords-after-worm-attack-915?page=0,0&source=rss_security_central
or http://tinyurl.com/2bzqwts
http://www.wired.com/threatlevel/2010/07/siemens-scada/

The Washington Post has published a phenomenal piece of investigative
journalism: a long, detailed, and very interesting exposé of the U.S.
intelligence industry.  Pity people don't care much about investigative
journalism -- or facts in politics, really -- anymore.
http://projects.washingtonpost.com/top-secret-america/articles/
My blog entry, with lots of links and reactions.
http://www.schneier.com/blog/archives/2010/07/the_washington.html

An article from The Economist makes a point that I have been thinking
about for a while: modern technology makes life harder for spies, not
easier.  It used to be technology favored spycraft -- think James Bond
gadgets -- but more and more, technology favors spycatchers.  The
ubiquitous collection of personal data makes it harder to maintain a
false identity, ubiquitous eavesdropping makes it harder to
communicate securely, the prevalence of cameras makes it harder to not
be seen, and so on.  I think this is an example of the general
tendency of modern information and communications technology to
increase power in proportion to existing power.  So while technology
makes the lone spy more effective, it makes an institutional
counterspy organization much more powerful.
http://www.economist.com/node/16590867/

Here's a book from 1921 on how to profile people.
http://www.schneier.com/blog/archives/2010/07/1921_book_on_pr.html

WPA cracking in the cloud.
http://blogs.techrepublic.com.com/security/?p=4097
http://www.wpacracker.com/index.html
http://www.wpacracker.com/faq.html

In related news, there might be a man-in-the-middle attack possible
against the WPA2 protocol.  Man-in-the-middle attacks are potentially
serious, but it depends on the details -- and they're not available
yet.
http://www.networkworld.com/newsletters/wireless/2010/072610wireless1.html
or http://tinyurl.com/27tcv6r
http://webcache.googleusercontent.com/search?q=cache:VArK7JzNMyUJ:www.hackforums.net/archive/index.php/thread-321253.html+hack+wpa+via+fake+ssid&cd=2&hl=en&ct=clnk&gl=us&client=safari
or http://tinyurl.com/29kkw72

Okay, this is just weird: a pork-filled counter-Islamic bomb device.
http://www.schneier.com/blog/archives/2010/07/pork-filled_cou.html

DNSSEC root key split among seven people:
http://www.schneier.com/blog/archives/2010/07/dnssec_root_key.html

Security vulnerabilities of smart electricity meters.
http://www.schneier.com/blog/archives/2010/07/security_vulner.html

Hacking ATMs to spit out money, demonstrated at the Black Hat conference:
http://www.wired.com/threatlevel/2010/07/atms-jackpotted/
http://www.technologyreview.com/computing/25888/
http://www.computerworld.com/s/article/9179796/Update_ATM_hack_gives_cash_on_demand
or http://tinyurl.com/39b3yvo

The business of selling fear in the form of doomsday shelters.
http://www.schneier.com/blog/archives/2010/07/doomsday_shelte.html

Seems there are a lot of smartphone apps that eavesdrop on their
users.  They do it for marketing purposes.  Really, they seem to do it
because the code base they use does it automatically, or just because
they can.  (Initial reports that an Android wallpaper app was malicious
seem to have been an overstatement; the developers were just
incompetent, inadvertently collecting more data than necessary.)
http://www.schneier.com/blog/archives/2010/08/eavesdropping_s.html

Meanwhile, there's now an Android rootkit available.
http://www.examiner.com/x-39728-Tech-Buzz-Examiner~y2010m7d31-Researchers-release-rootkit-tool-for-Android-phones-at-Defcon-conference
or http://tinyurl.com/2ceuxgx

Location-based encryption -- a system by which only a recipient in a
specific location can decrypt the message -- fails because location
can be spoofed.  Now a group of researchers has solved the problem in
a quantum cryptography setting.  Don't expect this in a product
anytime soon.  Quantum cryptography is mostly theoretical and almost
entirely laboratory-only.  But as research, it's great stuff.
http://www.sciencedaily.com/releases/2010/07/100726162123.htm
http://arxiv.org/PS_cache/arxiv/pdf/1005/1005.1750v1.pdf

More brain scanning to detect future terrorists:
http://www.schneier.com/blog/archives/2010/08/more_brain_scan.html

Coffee cup disguised as a camera lens; yet another way to smuggle
liquids onto aircraft.
http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&item=280544719941

Ant warfare.
http://www.wired.com/dangerroom/2010/08/gallery-ant-warfare/all/1

There's a new paper circulating that claims to prove that P != NP.
The paper has not been refereed, and I haven't seen any independent
verifications or refutations.  Despite the fact that the paper is by a
respected researcher -- HP Labs' Vinay Deolalikar -- and not a crank,
my bet is that the proof is flawed.
http://www.hpl.hp.com/personal/Vinay_Deolalikar/
http://science.slashdot.org/story/10/08/08/226227/Claimed-Proof-That-P--NP
or http://tinyurl.com/2d2nw4e
http://www.allvoices.com/contributed-news/6476401-vinay-deolalikar-explains-the-proof-that-p-np
or http://tinyurl.com/34bvmjx

Good information from Mikko Hypponen on the Apple JailbreakMe vulnerability.
http://www.f-secure.com/weblog/archives/00002004.html
http://blog.iphone-dev.org/
Apple has released a patch.  It doesn't help people with older-model
iPhones and iPod Touches, nor does it work for people who've jailbroken
their phones.
http://www.f-secure.com/weblog/archives/00002007.html
http://support.apple.com/kb/HT4291

"Facebook Privacy Settings: Who Cares?" by danah boyd and Eszter Hargittai.
http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/3086/2589
or http://tinyurl.com/37y3u7v

The UAE is threatening to ban BlackBerrys.  It's a complicated story,
and I have much to say in my blog post:
http://www.schneier.com/blog/archives/2010/08/uae_to_ban_blac.html

Security analysis of smudges on smart phone touch screens.
http://www.usenix.org/events/woot10/tech/full_papers/Aviv.pdf

Cloning retail gift cards.
http://www.oregonlive.com/beaverton/index.ssf/2010/08/beaverton_man_steals_thousands_from_stores_by_cloning_gift_cards.html


** *** ***** ******* *********** *************

    WikiLeaks Insurance File



WikiLeaks has posted an encrypted 1.4 GB file called "insurance."
It's either 1.4 GB of embarrassing secret documents, or 1.4 GB of
random data as a bluff.  There's no way to know.

If WikiLeaks wanted to prove that their "insurance" was the real
thing, they should have done this:

    * Encrypt each document with a separate AES key.

    * Ask someone else to publicly choose a random document.

    * Publish the decryption key for that document only.

That would be convincing.
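
For the curious, here is a minimal Python sketch of that proof scheme.
It assumes the third-party "cryptography" package and AES-256-GCM; the
filenames, the cipher mode, and the locally simulated public challenge
are illustrative assumptions, not details of the actual insurance file.

    import os
    import secrets
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Hypothetical documents standing in for the real archive.
    documents = {
        "cable-001.txt": b"first secret document",
        "cable-002.txt": b"second secret document",
        "cable-003.txt": b"third secret document",
    }

    # Step 1: encrypt each document under its own random 256-bit key.
    keys, published = {}, {}
    for name, plaintext in documents.items():
        key = AESGCM.generate_key(bit_length=256)
        nonce = os.urandom(12)
        keys[name] = key
        published[name] = (nonce, AESGCM(key).encrypt(nonce, plaintext, None))

    # Step 2: an outside party publicly names one document at random
    # (simulated here with a local random choice).
    challenge = secrets.choice(sorted(published))

    # Step 3: publish the key for that document only.  Anyone holding the
    # ciphertexts can check that it decrypts to something real, while the
    # rest of the archive stays sealed.
    nonce, ciphertext = published[challenge]
    revealed = AESGCM(keys[challenge]).decrypt(nonce, ciphertext, None)
    assert revealed == documents[challenge]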

In any case, some of the reported details might be wrong.  The file
might not be encrypted with AES-256.  It might be Blowfish.  It might
be OpenSSL.  It might be something else.

http://wikileaks.org/wiki/Afghan_War_Diary,_2004-2010
http://www.wired.com/threatlevel/2010/07/wikileaks-insurance-file/
http://www.theregister.co.uk/2010/08/02/wikileaks_insurance/
http://cryptome.org/0002/wl-diary-mirror.htm

Weird Iranian paranoia:
http://english.farsnews.com/newstext.php?nn=8905131636


** *** ***** ******* *********** *************

    NSA and the National Cryptologic Museum



Most people might not be aware of it, but there's a National
Cryptologic Museum at Ft. Meade, at NSA Headquarters.  It's hard to
know its exact relationship with the NSA. Is it part of the NSA, or is
it a separate organization?  Can the NSA reclassify things in its
archives?  David Kahn has given his papers to the museum; is that a
good idea?

A "Memorandum of Understanding (MOU) between The National Security
Agency (NSA) and the National Cryptologic Museum Foundation" was
recently released.  It's pretty boring, really, but it sheds some
light on the relationship between the museum and the agency.

http://www.governmentattic.org/3docs/MOU-NSA-NCMF_2010.pdf


** *** ***** ******* *********** *************

    Schneier News



None this month.  Summers are always slow.


** *** ***** ******* *********** *************

    Book Review: How Risky Is It, Really?



David Ropeik is a writer and consultant who specializes in risk
perception and communication.  His book, How Risky Is It, Really?: Why
Our Fears Don't Always Match the Facts, is a solid introduction to the
biology, psychology, and sociology of risk.  If you're well-read on
the topic already, you won't find much you didn't already know.  But
if this is a new topic for you, or if you want a well-organized guide
to the current research on risk perception all in one place, this is
pretty close to the perfect book.

Ropeik builds his model of human risk perception from the inside out.
Chapter 1 is about fear, our largely subconscious reaction to risk.
Chapter 2 discusses bounded rationality, the cognitive shortcuts that
allow us to efficiently make risk trade-offs. Chapter 3 discusses some
of the common cognitive biases we have that cause us to either
overestimate or underestimate risk: trust, control, choice, natural
vs. man-made, fairness, etc. -- 13 in all.  Finally, Chapter 4
discusses the sociological aspects of risk perception: how our
estimation of risk depends on that of the people around us.

The book is primarily about how we humans get risk wrong: how our
perception of risk differs from the reality of risk.  But Ropeik is
careful not to use the word "wrong," and repeatedly warns us against
doing so.  Risk perception is not right or wrong, he says; it simply is.  I
don't agree with this.  There is both a feeling and reality of risk
and security, and when they differ, we make bad security trade-offs.
If you think your risk of dying in a terrorist attack, or of your
children being kidnapped, is higher than it really is, you're going to
make bad security trade-offs.  Yes, security theater has its place,
but we should try to make that place as small as we can.

In Chapter 5, Ropeik tries his hand at solutions to this problem:
"closing the perception gap" is how he puts it; reducing the
difference between the feeling of security and the reality is how I
like to explain it.  This is his weakest chapter, but it's also a very
hard problem.  My writings along this line are similarly weak.  Still,
his ideas are worth reading and thinking about.

I don't have any other complaints with the book.  Ropeik nicely
balances readability with scientific rigor, his examples are
interesting and illustrative, and he is comprehensive without being
boring.  Extensive footnotes allow the reader to explore the actual
research behind the generalities.  Even though I didn't learn much
from reading it, I enjoyed the ride.

How Risky Is It, Really? is available in hardcover and for the Kindle.
Presumably a paperback will come out in a year or so.  Ropeik has a
blog, although he doesn't update it much.

http://www.amazon.com/exec/obidos/ASIN/0071629696/counterpane/

David Ropeik:
http://dropeik.com/

My essay on the feeling and reality of security:
http://www.schneier.com/blog/archives/2008/04/the_feeling_and_1.html

My essay on the value of security theater:
http://www.schneier.com/blog/archives/2007/01/in_praise_of_se.html


** *** ***** ******* *********** *************

Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing
summaries, analyses, insights, and commentaries on security: computer
and otherwise.  You can subscribe, unsubscribe, or change your address
on the Web at <http://www.schneier.com/crypto-gram.html>.  Back issues
are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to
colleagues and friends who will find it valuable.  Permission is also
granted to reprint CRYPTO-GRAM, as long as it is reprinted in its
entirety.

CRYPTO-GRAM is written by Bruce Schneier.  Schneier is the author of
the best sellers "Schneier on Security," "Beyond Fear," "Secrets and
Lies," and "Applied Cryptography," and an inventor of the Blowfish,
Twofish, Threefish, Helix, Phelix, and Skein algorithms.  He is the
Chief Security Technology Officer of BT BCSG, and is on the Board of
Directors of the Electronic Privacy Information Center (EPIC).  He is
a frequent writer and lecturer on security topics.  See
<http://www.schneier.com>.

Crypto-Gram is a personal newsletter.  Opinions expressed are not
necessarily those of BT.

Copyright (c) 2010 by Bruce Schneier.

Received on Sunday, 15 August 2010 10:58:54 UTC