The Impact of YouTube’s Supreme Court Case on ChatGPT and AI


The US Supreme Court is soon to decide whether to weaken a powerful legal shield protecting internet companies, and its ruling could also affect rapidly developing technologies such as OpenAI's AI chatbot ChatGPT, in which Microsoft is a major investor. The decision could help determine whether organizations that develop generative AI tools can be held liable for claims such as defamation or privacy violations.

The case before the court does not directly involve generative AI, but it raises the question of whether the algorithms behind features such as YouTube's video recommendations and tools like ChatGPT should be treated in a similar way. Specifically, the Supreme Court is weighing whether the US Communications Decency Act of 1996, which grants digital platforms legal protection for hosting user content, also applies when companies use their algorithms to target users with recommendations.

Views on the issue differ. Carl Szabo of NetChoice, a tech industry trade group, argues that AI does not truly create anything new; it merely collects and reorganizes existing data. On the other side, Democratic Senator Ron Wyden, who helped write the law while serving in the House of Representatives, believes the protection should not apply when companies use AI tools to create content.

If the court weakens the law, AI developers risk exposure to a wave of lawsuits, which could slow the development of these technologies. While the rules could vary by context, the debate ultimately centers on organizations' liability for content generated by AI tools like ChatGPT or Google's Bard. According to experts, the court may need to find a middle ground, distinguishing between tools that merely organize existing data and those that create genuinely new material.


Cameron Kerry, a fellow at the Brookings Institution, said the same questions apply to AI-based chatbots. If the court decides that responsibility should rest with the developer, companies could be pushed to build safer products, according to Hany Farid, a professor at the University of California, Berkeley.

OpenAI and Google did not comment on the issue. The Supreme Court is expected to announce its decision by the end of June.

