Privacy Identity Innovation (pii2012) Wrap Up—It’s All About Trust and Transparency

By Mary Ludloff

I have purposely given myself a week to reflect on pii2012 before blogging about it because there was a lot of information to absorb. As I look back on the conference, two themes come to mind: trust and transparency. Where to begin? Well, as Jim Adler (@Jim_Adler) tweeted:

“#pii2012 is the best collection of geeks, wonks, and suits.”

This was a conference chock full of interesting ideas, opposing views, and, at times, heated debate. One of the most interesting panels featured four people from very different backgrounds and points in their lives discussing the effect social media and the digital world have on how they conduct themselves online. There is a great post by Marina Ziegler (@Marina_Z) that covers it in depth, but like Marina, I felt that this was the key takeaway:

“The panel, while made up of experts, provided some very direct and honest understanding of how the average person is unaware, and how there is still work to be done in helping average folks understand how to navigate their social online lives, even down to the applications they use online.”

When you are a part of those “geeks, wonks, and suits” you sometimes forget that the “average person” is grappling with how to navigate some very choppy privacy waters. This is something that Terence and I have had many discussions about before, during, and after publishing our book on privacy and big data: How can we help the mainstream public understand how their data is collected and used? And let’s be clear here: the mainstream public includes some very techie folks! I have chatted with lots of people in the high tech industry (in finance, marketing, product development, and services) and many of them have expressed great surprise at how much data is collected about them and how public that data is.

Which leads me to Alison Cerra’s talk on “Identity Shift: How Technologies Change Who We Are, What We Do and Whom We Trust.” Alison shared some of the research from her book (co-authored with Christina James), “Identity Shift: Where Identity Meets Technology in the Networked-Community Age.” It’s a fascinating read (I’m about halfway through) and when you have the time I highly recommend it. One of the major themes of the book and talk was how we approach technology as we mature through life stages (teen, emerging adult, parent, and so on) and the premium we all put on trust in both its explicit form (privacy policies and processes) and its implicit form (perceived trust violations that may not be “real” in terms of adherence to stated policies). Put simply: we are willing to pay more for products and services that we trust. Of course, building trust is not just about “talking the talk.” It’s far more about “walking the walk.”

This was one of the two themes we focused on during my panel, “Brand Reputation: The Role of Privacy in Communications.” Moderated by Barry Hurd (@barryhurd), the panel also featured co-panelists Mike Whitmore (@mikewhitmore) and Leigh Nakanishi (@LeighNakanishi), and we had an energetic conversation about:

  • How privacy can be a competitive differentiator and what we, as marketers, should be doing to build trust with our customers.
  • How trust needs to be continually earned throughout our relationships with our customers.

In other words, trust is a dynamic that we must constantly address, and transparency, in terms of how we collect and use personal information, is the key to establishing and maintaining it. Sounds simple, right? But in practice, not so much, as SceneTap (the name of both the company and its bar application) discovered during its recent launch:

“On May 14th, SF Weekly published a list of all the San Fran bars that were installing these data-mining cameras and the Internet went wild. Thousands shared the post and hundreds commented to say they would boycott any bar in the city that uses the technology. “Thanks for telling me which bars I should NEVER go to,” is currently the top-ranked comment. The citizen uproar seems to have worked; a few San Francisco bars that had originally partnered with SceneTap have now pulled out due to negative publicity surrounding privacy concerns.”

Now Kashmir Hill, in Forbes, scolded San Franciscans and the media for taking an alarmist view of what is already happening:

“If you’ve been at a bar lately (or a restaurant, or a bank, or a fast food place, or out on a street in most fairly urban environments), and looked around, you’ve probably noticed a few cameras. Surveillance cameras are everywhere. They are a part of life these days, and in that way, we are always “spied on.” SceneTap is empowering in that we actually get to benefit from the data being produced by some of those cameras.”

Kashmir has a valid point—we are being surveilled all the time (a point made during the protecting civil liberties panel that I moderated)—but this is not about surveillance. It is about trust and transparency and what happens when we, as organizations and companies, fail to understand our customers’ concerns. When questioned by the media about their privacy policy, this is what SceneTap said:

“Individual privacy is a huge concern, to be sure. We actually met with the FTC on this very issue a few months back to help start the conversation. Bottom line – this technology is new, and it needs to be used responsibly.”

Translation: “You should just trust us to do the right thing.” Consumer reaction: “So you’re not going to tell me anything about how you might store those images, etc., but you expect me to trust you. I’m going to another bar.” Now, to SceneTap’s credit, once the brouhaha ensued, they were more specific about what they did with the images and the data associated with them. But the damage was done, and the opportunity to be perceived as a “good privacy citizen” was lost. And honestly, as a marketer I am stunned that they did not realize how their application would raise privacy concerns.

SceneTap may have blown it, but Spokeo certainly understands what’s at stake in terms of privacy and behaves responsibly. During our book interview tour, Terence and I liked to tell people to spokeo themselves (yes, I am using it as a verb) in order to see what public information is easily obtained about them. Why Spokeo? Well, as I told Emanuel Pleitez (@EmanuelPleitez), Spokeo’s Chief Strategy Officer, at one of the conference’s receptions:

“Your company has an easily understood privacy policy that explains in plain English where data is pulled from and a simple opt-out to remove one’s name from the listing that also cautions the consumer that opting out of your service does not remove the underlying data.”

In contrast to SceneTap, Spokeo understands the concerns that consumers may have about what it does and has created a framework where it explains how data is collected and used and offers an easy opt-out for those who do not want to be listed (unlike many other personal data aggregators, where opting out is made to be as difficult as possible—believe me, I’ve tested them all). This is a great illustration of what Barry, Mike, Leigh, and I were talking about during our panel: in order to build trust you must be transparent about the data you collect and how you use it. Of course, as we mentioned during the panel several times, you have to have processes in place to ensure that “you do everything you say you’re doing in your privacy policy” as well as a cross-functional team that is constantly asking (and answering) what privacy issues may arise from new products, services, or features.

By the way, Emanuel was one of many who participated in “fireside chats,” a one-on-one interview format in which he talked about the people search market generally and Spokeo specifically. There were many chats scattered throughout the conference, which gave all the attendees the opportunity to better understand specific business models as well as issues, such as the work Twitter does to combat fraud and abuse on its service.

The “With Big Data Comes Big Responsibility” panel, featuring Terence, Elizabeth Charnock, Ken Dreifach (@ZwillGen), and Kevin Marks (@kevinmarks), and moderated by Declan McCullagh (@declanm), also addressed the importance of trust and transparency. Key to this discussion was what big data enables, or, as Terence put it:

“What’s exciting is our ability to take disparate data sets, analyze them, and find correlations.”

This set the stage for an animated discussion of how big data can be used for “good” or can be abused and what role vendors should play in the privacy debate. As regular readers of our blog know, Terence and I believe that our industry (big data) and our company (PatternBuilders) must always weigh the benefits and risks of what we do against privacy concerns. If we want our customers and the public at large to trust that we are “good” privacy citizens, we must talk the talk and walk the walk.

This reminds me of another discussion I had with someone at one of the conference receptions. He asked me this: “How do you work with your customers when it comes to big data and privacy issues?” We get asked this question a lot because big data can be uncharted territory that leads to lots of interesting unintended consequences, particularly when it comes to PII. For the most part, companies are very protective of their internal data sets and understand that those sources must remain private and secure. However, they may not understand the implications when they want to aggregate those data sets with others. One of our responsibilities is to ensure that our customers do not become a SceneTap. In other words, we tackle the issues of PII and privacy throughout our working relationship so that they understand the risks, and we work with them to ensure that they are building explicit and implicit forms of trust with their customers.

Suffice it to say, this was a great conference and, lucky for all our readers, much of it was recorded on video and will be available to the public (I will tweet when the videos are posted). While trust and transparency were the key themes, I will leave you with this thought: We are all in this together. We may work for different companies, pursue diverse business models, or provide much needed guidance in a host of areas, but we are also consumers. Whatever we create, however we behave, whether we are good privacy stewards or just plain creepy, we will all reap what we sow in this ever-changing digital world.


Tagged: big data, pii2012, Privacy, Privacy and Big Data
