
Getting Ahead of Digital Data: the cart’s before the horse and it’s rolling away!

Human history is punctuated with examples of new science and technologies gaining powerful momentum before society considered the repercussions of their applications and established guidelines for their uses.

American drivers were bumping along in Model T Fords for several years before they were required to obtain licenses, and moviegoers had been buying tickets for decades when motion picture and television rating systems were introduced.

Our uniquely human drive to discover, invent, and improve is a wondrous thing, but we can get ahead of ourselves by adopting advances before considering the potential for undesirable consequences or taking measures to avoid them (Nobel laureate Alexander Fleming, who discovered penicillin, predicted antibiotic resistance as a result of misuse but his warnings went unheeded for generations).

How did personal computing become personalized ads?

Progress empowers; it enables and enriches. It also introduces new challenges as we see now with the rise of big data and the “personal information economy.” Our ability to capture and crunch data has leapfrogged ahead of a framework to guide its responsible use.

Early triumphs of the digital age arose from computing power—the ability to grind through calculations at an unprecedented rate. The advent of personal computing saw word processing replace the typewriter and the introduction of desktop publishing.

Then came the Internet and email, web search and browsing, e-commerce, and eventually social networks. Each of these developments contributed to the next one and each incrementally encroached upon our online privacy.

Today data, much of it personally identifiable information (PII), drives a significant portion of the global economy and contributes inestimably to our daily activities and interactions.

Retail transactions, traffic apps, fitness trackers, private communications, even media consumption involve surrendering various fragments of data that can easily be combined to create rich profiles and to identify and locate users with great specificity.

We cede this personal data in exchange for convenience, or so the argument goes. Yet in the absence of a universal, or at least widely adopted, ethical framework to guide the responsible use of data, we expose ourselves to questionable manipulation and outright abuse. It’s very difficult to know where to draw the line.

To move forward, we must first step back

Perhaps it will become clearer if we step back and take the long view. It’s fair to say that we are collectively realizing the need to reexamine the very concept of privacy, to redefine it in light of changes wrought by information technology, just as we had to redefine labor in the industrial age to address child welfare, public health, and urbanization.

The modern factory became emblematic of the industrial age, embodying both its promise and peril. The Internet represents the multifarious face of today’s technology: globalization, speed, connectivity, convenience, and scale, but also unintended exposure, inconsistent regulations, and every imaginable scam.

Automation, as a driver of the industrial age, transformed both manufacturing and labor. Initially, in a rush to reap its benefits, we failed to account for the human factor and treated workers as machines.

Appalling conditions, occupational hazards, inhumane hours, and child labor gave rise to a spate of new legislation that, in effect, stepped backward to identify the best way forward.

Debating ethical use, responsibility, and regulations

We find ourselves at a similar juncture today as we debate how to use data responsibly. Having rushed headlong into our current state, we must now retreat and reconsider the physical and tacit boundaries that once demarcated private spaces.

We must unravel each strand of a complicated topic to evaluate issues of access, encryption and surveillance, data privacy, the protection of student data, data ownership, and security.

Fierce competition among data brokers and the potential for anti-trust actions against those with the most valuable troves indicate just how high the stakes have become.

So what next?

We all hold the reins

Having acknowledged that, left unbridled, our digital world grows increasingly vulnerable (data breaches and the hacking of Wi-Fi-enabled toys starkly illustrate its darker side), we can direct our forward momentum toward the regulation it needs.

It’s not an all-or-nothing issue. We’ve traveled too far down the path of progress to roll back the conveniences we’ve come to enjoy.

So let’s balance economic benefits with individual rights by agreeing to basic ground rules: broadly speaking in the form of data ethics, and more narrowly in the form of specific implementations, e.g., architectures, privacy policies, and business models.

It seems neither practical nor desirable to eliminate completely the capture and use of personal data. Our expectations and habits have changed. But it is entirely within our power to demand and create a code of ethics, to pull back on the reins a bit and return things to a workable order.


Privacy matters!


5 Privacy Trends for 2016: a battle for big data, bandwidth, and ad blocking

As we look ahead to the coming year, our eyes are inevitably drawn to the digital landscape and the billions of personal data points that map its contours.

Nearly every what-to-watch-in-2016 list refers to data privacy. And nearly every one points to a significant shift in the balance of control over personal data: tipping away from AdTech and toward consumers.

To relinquish or control, that is the question

People are bristling at the unbridled collection and use of data about their behavior online, their every move through physical space, and literally thousands of facets of their “persona” (up to 4,000 data points on a single user—one journalist asks whether he could come up with that many data points on his spouse!).

And we consumers are footing the bill: the frenetic pop-ups and “vexing videos” that plague our mobile screens have voracious appetites for bandwidth, sometimes consuming more than the content itself.

Cross-device tracking using digital fingerprinting represents a particularly egregious invasion of personal space.

Yet consumer opinion remains divided, largely along generational and cultural lines, about the risks and benefits of permitting data collection.

Some, especially “digital natives,” are accustomed to letting their private lives spill out in full view of the online public. (Though cybersecurity specialists predict that Millennials will take a closer look at privacy.) Many others shrink from the spotlight, wondering what really lies behind the glow.

Of course, there is no immutable law of information technology declaring that we must relinquish our personal data and privacy in order to participate as digital citizens. We can demand control.

Blocking, faking, refusing

Has data-driven personalization reached its limit? It certainly has met its match in ad-blocking technology and consumers’ evasive strategies.

  • Symantec’s State of Privacy 2015 finds that 33% of consumers in the UK provide fake data and 53% avoid posting personal data online.
  • A Pew Research Center study shows that 24% of American Internet users provide inaccurate information about themselves and 57% have refused to provide information irrelevant to the transaction at hand.
  • The dizzying rise of ad-blocking software (198 million active ad blockers globally, including 34% of 16-to-24-year-old Internet users) illustrates our collective frustration with increasingly intrusive advertising strategies.

Big data is bittersweet

Big data has many worthwhile and legitimate uses but the anonymization of personal data is notoriously difficult and the data collected often far exceeds what is needed for a given service or transaction, for example:

  • identifying individuals based on retail transactions (as few as 4 data points provided 90% accuracy!)
  • seeking excessive permissions (some Android apps request up to 235 permissions; the average app requests 5)
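The first bullet above reflects a well-documented effect: combining just a few quasi-identifiers rapidly shrinks the "anonymity set" of people who match them, until most records are unique. Here is a toy Python sketch of that idea; the records and attributes are entirely hypothetical, not drawn from any real study.

```python
from collections import Counter

# Hypothetical records: (zip_code, birth_year, gender, purchase_day)
records = [
    ("10001", 1985, "F", "Mon"),
    ("10001", 1985, "F", "Tue"),
    ("10001", 1985, "M", "Mon"),
    ("10002", 1990, "F", "Mon"),
    ("10002", 1990, "F", "Mon"),
]

def anonymity_sets(records, num_attrs):
    """Group records by their first num_attrs attributes and
    return the size of each group (its 'anonymity set')."""
    return Counter(rec[:num_attrs] for rec in records)

def fraction_unique(records, num_attrs):
    """Fraction of records that are uniquely identified
    by their first num_attrs attributes."""
    sets = anonymity_sets(records, num_attrs)
    unique = sum(1 for size in sets.values() if size == 1)
    return unique / len(records)

# With one attribute, every record hides in a crowd;
# with all four, most records stand alone.
print(fraction_unique(records, 1))
print(fraction_unique(records, 4))
```

Even this tiny example shows the pattern: each attribute added to a profile splits the crowd further, which is why "anonymized" transaction data is so hard to keep anonymous.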

Forks in the road: what can we do?

We can choose more palatable paths through the digital world. Consider these six alternatives:

  • Matching content to channel: Differentiate content types and select communication channels that are aligned with their attributes: e.g., broadcast public content; choose user-to-user or authenticated access for private content; delete temporary content when it becomes irrelevant; archive permanent content for posterity.
  • Managing our own personal data: Ask users to define the privacy parameters of their online presence based on the context of what is being served (e.g., search results, e-retail, social content, branded content, academic research, professional content, etc.). Researchers in EdTech have already taken steps down this path, granting students greater control over what personal data is displayed on a given page. They call it “sovereign source identity.”
  • Re-defining regulatory frameworks: Support national and international laws that promote more transparent terms of service, explicit opt-in, the right-to-be-forgotten, and what law professor Lawrence Lessig calls systems that draw on personal data for “single-use purposes.”
  • Favoring private-by-design: Appeal to consumers by offering inherently private, secure devices like ReVault’s wearable data storage, Purism’s laptop, or the Blackphone 2.
  • Data minimization: Do not collect sensitive information if it isn’t needed for a given service and delete it once it is no longer relevant. Store personal data locally rather than in the cloud. (Data minimization will be critical for the Internet of Things.)
  • Permission-based advertising: Encourage permission marketing rather than interruption marketing. The former is not a new idea but it may enjoy a renaissance. Rather than pushing intrusive ads to consumers, marketers and advertisers may offer them something in exchange for their attention or action.
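The data-minimization principle in the list above can be sketched in a few lines of code. This is a hypothetical illustration, not a real service: the field names and the 30-day retention window are assumptions chosen for the example.

```python
import time

# Hypothetical shipping service: collect only the fields the
# transaction needs, and delete records once they expire.
REQUIRED_FIELDS = {"name", "address"}     # needed to ship an order
RETENTION_SECONDS = 30 * 24 * 3600       # assumed 30-day retention window

def minimize(submitted: dict) -> dict:
    """Keep only the fields the service actually needs,
    silently discarding extras (birth date, phone, etc.)."""
    record = {k: v for k, v in submitted.items() if k in REQUIRED_FIELDS}
    record["_collected_at"] = time.time()
    return record

def purge_expired(records: list, now: float) -> list:
    """Drop records whose retention window has passed."""
    return [r for r in records if now - r["_collected_at"] < RETENTION_SECONDS]

# Extra fields never enter storage in the first place:
order = minimize({"name": "A. User", "address": "1 Main St",
                  "birth_date": "1985-01-01", "phone": "555-0100"})
```

The point of the sketch is that minimization happens at the moment of collection, not as an afterthought: data that is never stored cannot be breached, sold, or subpoenaed.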

All of these options implicitly treat personal data as a monetizable asset. Given that we are the source of this in-demand resource, shouldn’t we exercise our right to determine its value and the conditions of its exchange?

Shouldn’t we demand more than the simple convenience that data controllers point to as the current trade-off? (An Annenberg School for Communication survey reveals that most Americans don’t buy this “tradeoff fallacy” anyway.)

More bandits, more breaches

As we explore these options, cybercriminals will continue to test our systems’ vulnerabilities relentlessly and will penetrate inadequate defenses. The incidence of data breaches continues to increase (780 in the U.S. in 2015), as does the sophistication of the attacks. Those seeking unauthorized access to personal data are devising increasingly subtle ploys. Social-engineering fraud preys on our gullibility and turns our socially-shared information against us.

Leaky connections

The profusion of connected devices spawned by the Internet of Things (IoT) will expose still more of our data to additional “controllers” and attacks. The Gartner Group estimates that the number of connected things will reach 25 billion by 2020.

And the range of entities seeking to use information about our behavior and demographic data keeps expanding: note the granularity of voter profiling in the current U.S. presidential race. Psychographic, behavioral microtargeting is providing candidates’ campaigns with detailed information gleaned from voters’ “Like” patterns on social media.

As with any data stored in the cloud, these records can be leaked. A researcher was able to access 191 million voting records from one database a few weeks ago and an additional 56 million records from another.

New rules of the road

We can all expect to be rated on our data-ethics performance and our reliability vis-à-vis privacy and security. Driven by both consumer pressure and the risks of cybercrime, businesses will continue to adapt by creating new roles (chief privacy officer), adopting new regulations, developing new privacy-enhancing technologies (so-called PETs), and implementing new policies and training, all addressing data security and data ethics. The economic and reputational risks of failing to do so could be crippling.

So who says privacy is dead? 93% of Americans feel that it is important to control who can get information about them and 90% feel it’s important to control what information is collected. Those numbers unequivocally refute any claim to privacy’s demise. To ignore them is akin to junk food marketers asserting that healthy eating is dead.

Privacy will be dead when we digital citizens give it up. Nothing indicates that moment is near.


Read additional perspectives on what to expect in the privacy space in 2016:

  • Mary Meehan writing about consumer culture in Forbes
  • Christos K. Dimitriadis writing about cyber-risk trends in TechInsider
  • Victor Pineiro writing about social-media marketing trends in AdAge
  • Global design firm Fjord predicts: “big data will get some manners.” Let’s hope they are correct.

Photo credit: Russell Johnson

Try bitpuf!