Facebook is the largest community in the world. It is also one of the least democratic institutions on earth. That’s why Facebook needs a mayor.
In non-virtual communities – meaning “IRL” (In Real Life) physical cities and states – where people interact face-to-face daily, societies have developed self-governing structures and policing institutions to serve and protect them. Private companies like Facebook, however, were not organized around democratic ideas or social justice principles. Despite the often lofty mission statements of social media companies, they are businesses built for one reason: to make money. Oodles of it.
Thanks to the “network effect,” unplanned but highly profitable communities have grown on these internet platforms to number in the billions. Greater in size than any nation-state. More politically powerful than any party or person. They cross borders and span the globe.
As they have grown, so have the scale of their problems and the onus of their responsibilities. The network effect cuts both ways. What has not grown apace has been the capacity to deal with the downside of a digital community’s size and scale. Systems are inadequate to manage communities’ dark side. The bigger these diverse social networks get, the less responsive they are to their complexity and breadth. Worse, they are undemocratic.
Many people reside, interact, and organize within these social networks more than they do in their physical, local community. How do they participate in Facebook’s governance? Who reviews and settles their grievances? And when did they give up their rights to safety and representation? By clicking “I agree” to a Terms of Service Agreement they never read?
By reacting with a sad-face emoji, individuals might feel at times as if they are lodging a meaningful political protest, but an anonymous algorithm can willy-nilly silence a voice or mute dissent.
Whom do you call when you want to fill the propaganda potholes or arrest violence-inciting cyberbullies? Who monitors and manages the daily self-donations of personal, private data? Which social media platforms get to decide 21st century campaign finance laws?
To function healthily, these platforms need representational roles for their global communities. Mayors, sheriffs, judges, educational boards, regulatory bodies. These platforms must democratically open up to typical community representatives and roles to manage cross-border political speech, for example, or develop bodies to hear and settle accusations of libel. It’s time to recognize that Facebook has become a real global commons that needs a real public governing structure.
Today, however, Facebook denizens are not Facebook citizens.
Sure, people can drop out of the community and live a Walden-like life in the analog outback with other digitally disconnected and disaffected humans. But everyone knows that even if they now choose to leave digitized society, it is impossible to erase their past and purge the digital breadcrumbs of their previous searches and interactions.
Surviving in a 21st century society means a dependency on digital life, whether banking, finding a job, or staying connected with friends and family. We are neither willing nor able to make the trade-off between being a digital subject and an analog citizen.
Regardless, the choice should not be binary. Why should we be subservient to a big data behemoth and “voluntarily” relinquish the rights we have accrued since 1776? Since when is “I Agree” enough to strip us of a participatory role and make us subject to psychographically advanced targeting of our preferences and person? Who’s looking out for us?
Tech leaders like Facebook’s CEO Mark Zuckerberg have argued they should be entrusted with this role. Zuckerberg has made a personal commitment to defend democracy. Despite his good intentions, anyone who understands democracy’s evolutionary history should feel slightly uncomfortable when someone powerful says “trust me” or that he alone can fix a problem.
Further, while digital “platforms” are entrusted to secure the data and dreams of billions, they need to recognize they are not immune to the vagaries of a market that values them on projected revenue and future growth. The last few weeks have reminded us that tech stocks can be volatile and markets fickle. A market’s normal functioning can lead to failure.
What happens if a social media company collapses under the weight of debt, the failure of leadership, or the loss of potential growth? Who owns the data of the dispossessed? How does that community re-form itself and find new expression and connection? This is not a theoretical question.
The virtual world is made up of tentative topographies and ephemeral communities. In 1994, Apple Computer (disclosure: I used to consult for Apple) hosted a social site called “eWorld,” where I established an identity and built a community. I lost both when eWorld went bell-“e”-up after two years. If only I could have called eWorld’s mayor to complain or offer help.
In a future social media democracy, I would own my data, ask my representatives to shut out political advertising paid for by Russian rubles, and vote for my teenage boys to learn less about condom-snorting and more about civics. I bet I’m not alone.
Markos Kounalakis, Ph.D. is a senior fellow at Central European University and visiting fellow at the Hoover Institution. Contact him at firstname.lastname@example.org or on Twitter @KounalakisM.