In last week’s House Committee on Energy and Commerce hearing on “Protecting Consumer Privacy in the Era of Big Data,” the Center for Democracy and Technology’s Nuala O’Connor called attention to a five-year grey area in U.S. privacy law, more commonly referred to as “the teenage years.” The Children’s Online Privacy Protection Act applies to companies that offer services to children under the age of 13. Meanwhile, the age of adulthood in the U.S. is 18. So, teens aged 13 through 17 are treated — in U.S. privacy law — the same as adults. They have the same ability to consent, face no restrictions on how an organization collects or uses their data, and have the same access to purchase and download things as their parents. At 13.
When my kids were 13, they made lots of decisions without considering the consequences; they still do at 14 and 15, and I bet they will continue to for a few more years (but hopefully not too many!). They are also, by nature of their age, exceptionally susceptible to anything that raises their social status. My 15-year-old is now an “ambassador” for two companies that direct-messaged him on Instagram. He’s also “sponsored” by another for skiing. As far as I can tell, these companies have found a way to get my son’s personal information, and now he markets their products for them. In exchange, he gets discounts on things he buys from them (except he’s never bought anything), and he can tell his friends he’s sponsored, or that he’s an ambassador for a company. I wonder: How many teenagers have freely given their information in exchange for this social capital without any parental input?
It’s not illegal, but it certainly seems to take advantage of a demographic that is protected in myriad other ways in our society. If he were 12, they couldn’t do that. First, he wasn’t on Instagram at 12 (though many kids that age are), and second, COPPA would have made it illegal for these companies to collect his personal information without my consent as his parent. Which brings me to my point: Just because something is legal doesn’t mean it’s ethical.
There are laws all over the world about what personal information organizations can collect, how they can collect it and what they can do with it. But sometimes, like in the case above, a path paved with good intentions (protecting kids’ privacy) doesn’t go the full distance. Maybe in 1998, when it was written, COPPA did the job. Today, however, kids have an online presence (often literally) the minute they emerge from the womb, and what’s possible online now was just a twinkle in some developer’s eye back then. The laws just can’t keep up.
And yet, all too often, legal compliance is the sole guiding light for organizations’ privacy programs. This mindset is what has turned our notice-and-choice framework into a like-it-or-leave-it framework; what’s made the current self-regulatory programs ineffective; what’s allowed for discrimination within our platforms and algorithms; and what created the sense of urgency that brought us the California Consumer Privacy Act.
Alternatively, a privacy program that respects the rights of consumers will look beyond what’s legal and consider the company’s code of conduct, code of ethics, risk tolerance and consumer expectations to build a culture of privacy throughout the organization, in line with its values. It’s also a program that won’t have to constantly reinvent itself for the next new law coming down the pike. Of course, there will be tweaks here and there for specific requirements within specific laws, but an ethics-based privacy program will result in a high level of legal compliance, and much, much more.