A Near Miss for Minnesota’s Children
What we can learn from the North Star State’s botched attempt at social media regulation.
The good intentions of legislators do not guarantee good legislation. This maxim has proved true in Minnesota, where well-meaning lawmakers nearly minted a new law that would have blocked young Minnesotans' access to important online resources and threatened their online privacy. Intended to shield children from the very real harms of social media usage, the bill proposed to ban tech companies from suggesting new content to minors. Regrettably, its overbroad language and one-size-fits-all logic betrayed its authors' ignorance of the workings of the digital world. One shouldn't attempt open-heart surgery with a backhoe; internet regulation likewise requires an educated, careful, and precise touch.
Luckily for the young people of Minnesota, however, the legislative session just ended, and this shoddy policy is on hold for the moment. Nonetheless, state officials around the country currently have an appetite for internet regulation, especially regulation of children's use of social media, and it is worth taking the time to understand where they are going wrong. The Minnesota legislation was filled with oversights of the most rudimentary kind, and Minnesotans, along with everyone else, should understand these errors fully. Botched attempts at internet regulation are likely coming to a state near you!
The ignorance of legislators was plain even from the original bill's official summary (i.e. the summary of the legislation before it was folded into a larger omnibus package): "The bill would require anyone operating a social media platform with more than one million users to require that algorithm functions be turned off for accounts owned by anyone under the age of 18" (emphasis added). But this makes no sense. A social media algorithm is just a bit of computer code by which information is prioritized and distributed to users. Every website that displays content uses an algorithm — even if that algorithm simply prioritizes content in a reverse-chronological manner (i.e. newest content first) without regard for user consumption habits. Minnesotans may regulate how algorithms behave in their state, but the idea of "turning off algorithm functions" is frankly laughable.
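To make the point concrete, here is a minimal sketch (in Python, with invented sample data) of the supposedly algorithm-free feed described above. Even a feed that ignores user behavior entirely still runs an algorithm: it sorts posts newest-first.

```python
from datetime import datetime, timezone

# Hypothetical posts; on a real platform these would come from a database.
posts = [
    {"title": "Post A", "created": datetime(2022, 5, 1, tzinfo=timezone.utc)},
    {"title": "Post B", "created": datetime(2022, 5, 3, tzinfo=timezone.utc)},
    {"title": "Post C", "created": datetime(2022, 5, 2, tzinfo=timezone.utc)},
]

def reverse_chronological_feed(posts):
    """A 'no algorithm' feed is still an algorithm: sort newest-first,
    paying no attention to what the user has clicked or watched."""
    return sorted(posts, key=lambda p: p["created"], reverse=True)

for post in reverse_chronological_feed(posts):
    print(post["title"])
```

There is no way to "turn off" this code and still show a feed; a lawmaker can only choose *which* ordering rule the code applies.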
Let’s get to the text.
First off, the bill's language was simply too broad. Its definition of "social media platform" included not only social media platforms, but any internet service that "allows users to create, share, and view user-generated content." The bill would therefore have applied to any service that lets users publish their own content: social media platforms, video platforms (like YouTube), online book sellers (like Goodreads), and any website with a comments section. (Search engines, internet service providers, and email providers are excepted.)
Such dubiously labeled “social media platforms” were to be barred from recommending user-generated content to users under the age of 18. While it is true that algorithms regularly recommend harmful content to minors, young people — like everyone else — often benefit greatly from suggested content, as well. High school students research their course material on YouTube. Aspiring musicians dive into Spotify rabbit holes to learn from the greats. Young people fight off isolation by discovering communities and support systems with others who share their interests. Examples abound.
Simply put, Minnesota was about to prevent all children, regardless of individual circumstances, from accessing any of the benefits of suggested content. Under such a ban, parents would be unable to make case-by-case decisions tailored to the uniqueness of their child, decisions that can curtail the negative aspects of internet usage while preserving access to the useful parts. Even children of the same age often have different personality traits and maturity levels — the degree of internet access appropriate for two sixteen-year-olds may be wildly different — and it certainly makes no sense to regulate kindergartners and high school seniors in the same manner. This bill threw not only the baby, but the whole tub, out with the bathwater.
Now on to the privacy concerns. To comply with the bill, tech companies would have to establish the age and location of all users. And companies would have to be sure that information provided by users was accurate, for they would be statutorily liable even if they merely had "reason to know" that a Minnesotan minor was using their service. Therefore, tech companies would likely require that users authenticate their age and location with still more sensitive personal data.
Indiscriminately spreading personal data around the internet is dangerous. Private actors and foreign spies regularly and effectively hack their way to your personal data. What’s more, private data companies get rich selling off the data that you and I voluntarily — though often unknowingly — provide to the websites we visit and apps we use. The buyers of this private market data, which include the feds and the Chinese government, may do with it what they will. And supposedly “anonymous” data isn’t really that “anonymous.”
If lawmakers insist on further regulating how children interact with social media, they must first educate themselves on how the internet works and do the hard work of writing considered, targeted legislation. All efforts must treat their constituents' privacy with care. Blind spelunking into tech regulation only makes the digital world worse.
Better solutions, however, may come from private actors. Parents must step up and more closely monitor their children’s online activity, and tech companies must create more detailed settings options with which parents can limit the content children can view. This allows for tailored solutions based on individual needs and therefore avoids the inevitable externalities of government involvement. Taking the time to persuade parents and tech companies to be responsible may not be as sexy as bugling for a cavalry charge of state intervention, but its results will be far better in the long run.
A National Privacy Crisis
Minnesota politicians aren't alone in pushing for measures which will endanger online privacy and security. Congress is developing a series of bills, including the American Innovation and Choice Online Act and Open App Markets Act (which mandate that devices allow the dangerous process of sideloading and compel data portability), the SHOP SAFE Act and the INFORM Act (which require third-party online sellers to disclose incredibly sensitive data), and the EARN IT Act (which threatens the existence of end-to-end encryption). Many of these bills will also be disastrous for online marketplaces, tech innovation, and American consumers in general.
Despite the complex, often counterintuitive, and ever-changing nature of the modern tech space, the oldsters who run our country are tragically self-assured that their various regulatory shenanigans will have nary an unintended consequence. To disabuse the generally benign — but painfully ignorant — gerontocracy of this notion, citizens need to call their elected officials and vote on tech issues at the ballot box.