In recent years, social media companies have expanded their reach across the globe. Platforms like Facebook occupy an ever-larger share of our daily lives, yet as their influence grows, the treatment they receive from users and governments has remained largely stagnant and is in urgent need of change. A series of conflicts between Facebook and its users has raised pressing questions about how Facebook and other social media platforms should operate.
One of the primary challenges social media companies face is protecting the data privacy of their users, and Facebook is no stranger to the issue. For example, Facebook was forced to pay $650 million to settle a 2015 lawsuit over facial recognition and the infringement of customer privacy (Isak and Hanna). No sooner had that case been resolved than a new data privacy scandal emerged: Facebook faced backlash for providing unauthorized access to personally identifiable information (PII) to Cambridge Analytica (Isak and Hanna). The company shared the data of more than 87 million of its users with Cambridge Analytica without their knowledge. This sheer lack of transparency is the greatest concern for users who believe their data is secure. By putting its customers at risk, Facebook threatens the privacy and well-being of many American citizens, and it exemplifies the new ways social media and tech companies are impacting society.
So, where do we go from here? How do we resolve this issue and protect users' data privacy? The answer lies in regulation, but the difficulty is finding the proper means of regulating. At present, social media companies operate primarily through self-regulation: they write the policies, interpret their own rules, and enforce them as they see fit. Companies manage their platforms through terms and conditions, which users accept when creating an account. However, these terms are rarely transparent or easy to understand, and because each company designs its own, there is no uniform policy in place. Social media platforms are left to act with their own interests in mind while ultimately putting users at risk. An alternative would be to put the government in charge of regulation. The three branches of the U.S. government already create legislation, interpret the law, and enforce these rules on American citizens and businesses. Under government regulation, a uniform set of policies could be established, and every social media platform would have to conform to them. The primary concern, however, is that both companies and users are wary of government control: platforms would lose power in the shift away from self-regulation, and users may resent increased government oversight in their lives.
Self-regulation and government regulation are both practical options, but each presents its own set of problems. What is needed is a compromise in which the power to create, interpret, and enforce regulatory policy is not concentrated in a single body. A third option that strikes this balance would be to regulate social media through a third-party governing body, such as the United Nations, empowered to draft legislation for companies to adhere to. Social media companies would then interpret these policies and apply them to their platforms' terms and conditions, and individual governments would be left to enforce the policy and ensure that companies follow the rules. This solution finds a middle ground between self-regulation and government regulation: companies would know what guidelines to follow, and governments would hold a position of oversight. The result would be a transparent system among social media companies, users, and governments, making this form of regulation the most viable solution.
Facebook’s disregard for customer privacy proves that new forms of regulation are needed, whatever the final solution may be. Corporations like Facebook cannot be trusted to operate with users’ best interests in mind; social media platforms are failing to protect the data and privacy of their consumers. Moreover, data privacy is only one facet of Facebook’s irresponsible behavior. Fake news still circulates on the platform, targeted ads are served to users, and the company’s monopolistic practices have prompted antitrust lawsuits against it. The problems with Facebook only continue to expand and to challenge the current legal infrastructure. It is time for new policies, new regulations, and new solutions.
Sources Cited: Isak and Hanna. “User Data Privacy: Facebook, Cambridge Analytica and Privacy Protection.” Computer, vol. 51, no. 8, 2018.