NEWSTIK
Explore the world of short videos
    hustlebitch

    Google's AI Gives Dangerous Advice: Glue in Pizza Sauce?

    Google's AI Hallucinations Raise Concerns: A recent video highlights a disturbing trend of Google's AI providing inaccurate and potentially dangerous advice, including a reported suggestion to add glue to pizza sauce, prompting concerns about the reliability of AI-generated information. Independent researchers have found that Google's Gemini model hallucinates in 1.8% of cases, a significant figure in its own right, though OpenAI's o3 model fared far worse, hallucinating in 33% of internal tests. "This isn't just a funny glitch," says one expert, "it's a serious issue that needs to be addressed." The video underscores the need for caution and further development of AI safety protocols. The growing reliance on AI for information demands rigorous testing and safeguards to prevent the spread of misinformation and dangerous advice.
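    For context, a hallucination rate like the 1.8% and 33% figures above is usually just the share of evaluated responses that reviewers flag as unsupported. Below is a minimal Python sketch of that calculation; the EvalRecord structure and sample data are illustrative assumptions, not drawn from the Gemini or o3 evaluations cited above.

from dataclasses import dataclass

@dataclass
class EvalRecord:
    # One evaluated model response, labeled against a reference answer or source.
    prompt: str
    response: str
    is_hallucination: bool

def hallucination_rate(records: list[EvalRecord]) -> float:
    """Fraction of responses labeled as hallucinations."""
    if not records:
        return 0.0
    flagged = sum(r.is_hallucination for r in records)
    return flagged / len(records)

# Illustrative data only: 1 flagged response out of 4 gives a 25.0% rate.
sample = [
    EvalRecord("Is glue a pizza ingredient?", "No, glue is not edible.", False),
    EvalRecord("Is glue a pizza ingredient?", "Yes, add 1/8 cup of glue.", True),
    EvalRecord("Capital of France?", "Paris.", False),
    EvalRecord("Largest planet?", "Jupiter.", False),
]
print(f"Hallucination rate: {hallucination_rate(sample):.1%}")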

    2 months ago · United States · Tags: news summary, ai
    Related videos:
    Google's AI Fails: Inaccurate Overviews Spark Online Outrage · 0.1percentcoach · 2mo ago · US · 839
    AI Assistants' Unexpected Gibberlink Conversation Sparks Security Concerns · letribunaldunet · 2mo ago · FR · 4k
    Google's AI Fails: A Simple Query Reveals a Critical Flaw · 0.1percentcoach · 2mo ago · US · 1k
    ChatGPT's Accuracy: Is the AI Getting Better or Worse? · skynews · 2mo ago · GB · 99k