Human history is punctuated with examples of new science and technologies gaining powerful momentum before society considered the repercussions of their applications and established guidelines for their uses.
American drivers bumped along in Model T Fords for years before they were required to obtain licenses, and moviegoers had been buying tickets for decades before motion picture and television rating systems were introduced.
Our uniquely human drive to discover, invent, and improve is a wondrous thing, but we can get ahead of ourselves by adopting advances before considering their potential for undesirable consequences or taking measures to avoid them. (Nobel laureate Alexander Fleming, who discovered penicillin, predicted that misuse would breed antibiotic resistance, but his warnings went unheeded for generations.)
How did personal computing become personalized ads?
Progress empowers; it enables and enriches. It also introduces new challenges as we see now with the rise of big data and the “personal information economy.” Our ability to capture and crunch data has leapfrogged ahead of a framework to guide its responsible use.
Early triumphs of the digital age arose from computing power—the ability to grind through calculations at an unprecedented rate. The advent of personal computing saw word processing replace the typewriter and the introduction of desktop publishing.
Then came the Internet and email, web search and browsing, e-commerce, and eventually social networks. Each of these developments contributed to the next, and each incrementally encroached on our online privacy.
Today data, much of it personally identifiable information (PII), drives a significant portion of the global economy and contributes inestimably to our daily activities and interactions.
Retail transactions, traffic apps, fitness trackers, private communications, and even media consumption all involve surrendering fragments of data that can easily be combined to create rich profiles and to identify and locate users with great specificity.
We cede this personal data in exchange for convenience, or so the argument goes. Yet in the absence of a universal, or at least widely adopted, ethical framework to guide the responsible use of data, we expose ourselves to questionable manipulation and outright abuse. It’s very difficult to know where to draw the line.
To move forward, we must first step back
Perhaps it will become clearer if we step back and take the long view. It’s fair to say that we are collectively realizing the need to reexamine the very concept of privacy, to redefine it in light of changes wrought by information technology, just as we had to redefine labor in the industrial age to address child welfare, public health, and urbanization.
The modern factory became emblematic of the industrial age, embodying both its promise and peril. The Internet represents the multifarious face of today’s technology: globalization, speed, connectivity, convenience, and scale, but also unintended exposure, inconsistent regulations, and every imaginable scam.
Automation, as a driver of the industrial age, transformed both manufacturing and labor. Initially, in a rush to reap its benefits, we failed to account for the human factor and treated workers as machines.
Debating ethical use, responsibility, and regulations
We find ourselves at a similar juncture today as we debate how to use data responsibly. Having rushed headlong into our current state, we must now step back and reconsider the physical and tacit boundaries that once demarcated private spaces.
Fierce competition among data brokers and the potential for anti-trust actions against those with the most valuable troves indicate just how high the stakes have become.
So what next?
We all hold the reins
Left unbridled, our digital world becomes increasingly vulnerable; data breaches and the hacking of Wi-Fi-enabled toys starkly illustrate its darker side. Acknowledging this, we can direct our forward momentum toward accepting that regulation is needed.
It’s not an all-or-nothing issue. We’ve traveled too far down the path of progress to roll back the conveniences we’ve come to enjoy.
So let’s balance economic benefits with individual rights by agreeing to basic ground rules: broadly speaking in the form of data ethics, and more narrowly in the form of specific implementations, e.g., architectures, privacy policies, and business models.
It seems neither practical nor desirable to eliminate the capture and use of personal data entirely. Our expectations and habits have changed. But it is entirely within our power to demand and create a code of ethics, to pull back on the reins a bit, and to return things to a workable order.