Supernet Limited Co-signs Agreement With MENA Telecom Operator

Search engines, online banking, social-media platforms, and large language models are examples of AI systems that can be beneficial, but they can also be used to spread hate speech, false information, and disinformation. A panel is needed to understand and address the impact of emerging information technologies on global social, economic, political, and natural systems, similar to how organisations such as the IPCC conduct assessments of global environmental change.

An Intergovernmental Panel on Information Technology, like the IPCC, would have more leverage when it comes to persuading technology companies to share their data. It would also have credibility in non-Western countries, which is increasingly important as the impacts of digital communication technologies play out in different cultural contexts.

A non-profit organization called PeaceTech Lab in Washington DC is assembling a panel with a similar charge to the body we are proposing, but we question whether such a group can operate with independence.

It is challenging to foresee the long-term effects of digital information technologies, such as machine-learning algorithms intended to direct police towards “high-crime areas” or to advise landlords on rental pricing.

Generative AI systems can undermine people’s conceptions of proof, evidence, and veracity; machine-learning algorithms can create cartel-like pricing dynamics and entrench biases in the criminal justice system.

Lowering or eliminating language barriers could create a more level playing field in international science, but generative text systems such as ChatGPT could also undermine public understanding of science by enabling the industrial-scale production of texts containing untruths and irrelevant information.

Threats of legal action or actual lawsuits have prevented various groups from learning more about how digital information technologies are affecting society. Technology companies are increasingly using strategies to obstruct outside scientific research and sway public opinion.

In 2021, Meta, the company that owns Facebook, sent a cease-and-desist notice to researchers from New York University who had developed a browser extension to gather information on targeted advertising on the platform. This has discouraged others from performing similar work.

What technology corporations publish from their own research teams is carefully curated. Meta has published research on the advantages of Facebook for those who are grieving, but not much on its own analysis of potential negative effects.

According to documents provided to the US Securities and Exchange Commission in 2021, Facebook’s ranking algorithm valued emoji reactions five times more highly than “likes” for three years, despite internal data showing that posts that drew the “angry” emoji were more likely to contain potentially harmful and false content.
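The kind of weighting described above can be sketched as a simple scoring function. This is an illustrative toy, not Facebook’s actual implementation: the reaction names and the uniform five-to-one weight are assumptions based only on the ratio stated in the documents.

```python
# Hypothetical sketch of reaction-weighted ranking. The source states only
# that emoji reactions counted five times as much as a "like"; everything
# else here is illustrative.
REACTION_WEIGHTS = {
    "like": 1,
    "love": 5,
    "haha": 5,
    "sad": 5,
    "angry": 5,
}

def engagement_score(reactions):
    """Sum weighted reaction counts for one post."""
    return sum(REACTION_WEIGHTS.get(name, 0) * count
               for name, count in reactions.items())

posts = [
    {"id": "a", "reactions": {"like": 100}},
    {"id": "b", "reactions": {"angry": 30}},
]

# Rank posts by score: 30 "angry" reactions (score 150) outrank 100 likes
# (score 100), illustrating how such weighting can favour outrage-driven posts.
ranked = sorted(posts, key=lambda p: engagement_score(p["reactions"]),
                reverse=True)
```

The point of the sketch is that even a modest multiplier lets a small number of strong reactions dominate a much larger number of mild ones.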

Companies have steered research agendas as awareness of the issues surrounding potentially harmful and misleading content has grown, shifting more of the blame onto individual users.

Academic researchers have collaborated with Twitter, Facebook, and Jigsaw (a Google think tank) to develop warning labels for deceptive content, but such labels are unlikely to have a significant impact on the overall spread of false information.

Before sharing data, companies frequently request a detailed project description, which gives them complete editorial control.

Independent researchers need data about platform design to quantify and reduce harms, yet little such information is available to them.

The IPCC, established in 1988, assesses climate change and greenhouse-gas emissions; the IPBES, founded in 2012, assesses biodiversity and ecosystem services. Both bodies are responsible for combining already-collected data, gathering more information, synthesising that knowledge, and disseminating it to decision-makers. Their global assessment reports raise awareness and promote evidence-based policy.

The use of digital information technologies is a worldwide issue with elusive effects that cut across generations and continents. Efforts to better manage online information ecosystems have focused on implementing guard rails.

The US AI Bill of Rights promises to give people choices regarding their privacy and freedom from AI-related harm, but it is ambiguous as to how harm could be accurately assessed and avoided.

To ensure proper stewardship, an infrastructure must be developed to compile and summarise the current state of knowledge about the potential societal effects of digital communication technologies.

The best chance of achieving this is an Intergovernmental Panel on Information Technology: a body, akin to the IPCC, that would include professionals in policy, law, the physical and social sciences, engineering, the humanities, government, and ethics.

Similar to the IPCC and the IPBES, the objective would be to provide a knowledge base to support the decisions of actors including governments, humanitarian organisations, and even businesses. The goal would not be to establish an international consensus on how to manage the digital world or to issue regulatory recommendations.

Reports from the IPCC and the IPBES can draw on common presumptions about sustainability, natural disasters, and food security.

Researchers looking into the effects of digital technologies have limited access to data and must contend with a rapidly shifting target, because businesses routinely conduct A/B tests on their users to gauge the impact of interface changes.
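The A/B testing mentioned above typically works by deterministically bucketing each user into a variant and comparing a metric between groups. The sketch below illustrates the mechanism under assumed names (the experiment label, variant names, and metric are all hypothetical, not any platform’s real code):

```python
# Minimal sketch of deterministic A/B bucketing: hash the user and
# experiment identifiers, and use the hash parity to pick a variant.
# All identifiers here are illustrative.
import hashlib

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user into variant 'A' or 'B'."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Simulated per-user engagement metric, split by assigned variant.
metrics = {"A": [], "B": []}
for uid in (f"user{i}" for i in range(1000)):
    metrics[assign_variant(uid, "new_feed_layout")].append(1.0)

# Hashing makes assignment stable: the same user always sees the same
# variant, so repeated exposure during the experiment is consistent.
```

Because assignment depends only on the hash, a platform can run many such experiments concurrently without storing per-user state, which is part of why the interface a given researcher observes may differ from what other users see.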

An intergovernmental panel representing the interests of UN member states, like the IPCC, could identify areas where current levels of transparency are not yielding sufficient insight and encourage regulators to implement legislation that fosters greater accountability, transparency, and auditing.

Although nations may not agree on how platforms and services should be restricted or deployed, negotiation between them requires a clearer understanding of the situation and of the available policy options.

Shared objectives, such as those enshrined in international human-rights treaties and norms, as well as emerging formulations of people’s rights in the face of a rapidly evolving digital environment, would serve as the foundation for an Intergovernmental Panel on Information Technology.

It might compile data on fraud, improper election interference, or manipulation on social media, or analyse the knock-on effects of unrestrained financial dynamics. Progress will require negotiation, but some of the biggest corporations and most powerful governments may have short-term interests that conflict with it.

The COVID-19 pandemic has made clear the online information ecosystem’s capacity to mobilise thousands of researchers to address societal issues. To ensure that digital communication technologies foster positive dynamics, they must be managed and adapted.