The Lowy Institute has raised concerns about the concentration of power in the hands of a few tech giants, which it argues is driving 'democratic erosion' in the online realm. It proposes a distinctive solution: involving ordinary citizens in the moderation of online content.
In an era where a handful of multinational corporations control most digital platforms, accountability to the general public has weakened. Researcher Lydia Khalil suggests that allowing everyday digital users and tech experts to form platform councils could produce a more legitimate consensus on content moderation.
This approach would distribute responsibility for content moderation and user access among technology companies, government bodies, and the wider population. It aims to address the challenges posed by disinformation, polarization, and extremism online while ensuring democratic rights like freedom of expression are upheld.
Moreover, this model could extend to informing government regulation of emerging technologies like AI. Recent events, such as the Australian eSafety Commissioner fining a tech company for failing to tackle online abuse material, highlight the need for effective oversight mechanisms.
While governments contemplate legislative measures to combat misinformation and disinformation, public consultations have shown a range of views on proposed laws. The delay in enacting such laws reflects the complexity of balancing online safety with democratic rights, such as freedom of expression.
In the ongoing debate, both government officials and opposition leaders are considering amendments to proposed legislation to address concerns around definitions, exemptions, and religious freedom protections. The aim is to combat harmful online content without stifling legitimate expression or impeding public discourse.
As the digital landscape continues to evolve, the idea of ordinary citizens shaping the governance of online platforms is gaining traction. By involving a diverse range of stakeholders in decision-making, proponents hope to achieve a more balanced and inclusive approach to online content moderation.