Google rolled out AI Overviews across the United States this month, exposing its flagship product to the hallucinations of large language models.

  • FaceDeer@fedia.io · 30 days ago

    It’s not, actually. Hallucinations are things that effectively “come out of nowhere”: information that was in neither the training material nor the provided context. In this case Google’s Overview is presenting information that really is in the provided context, so these aren’t hallucinations; the AI is doing what it’s being told to do. The problem is that Google isn’t doing a good job of providing it with the right information to summarize.

    My suspicion is that, since Google is running this AI on every search, it has had to cut back the resources allotted to each individual call, which means the model is given only a small amount of context to work from. Bing Chat does a much better job, but it draws from many more search results and is given room to say a lot more about them.
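
For what it’s worth, here is a minimal sketch of the pattern FaceDeer is describing: a fixed context budget decides how many retrieved snippets the model ever sees before it is asked to summarize. The function name, prompt wording, and character budget are all hypothetical (real systems budget tokens, not characters), and nothing here reflects Google’s actual pipeline.

```python
# Hypothetical sketch of retrieval-augmented summarization. The point is
# that the model only "knows" what fits in the context budget; everything
# below is illustrative, not Google's real setup.

def build_overview_prompt(query: str, snippets: list[str], budget_chars: int) -> str:
    """Pack retrieved search snippets into a fixed-size context, then
    ask the model to summarize *only* that context."""
    context, used = [], 0
    for snippet in snippets:
        if used + len(snippet) > budget_chars:
            break  # a small budget silently drops the remaining results
        context.append(snippet)
        used += len(snippet)
    return (
        f"Summarize the following search results for the query {query!r}. "
        "Use only the text below.\n\n" + "\n---\n".join(context)
    )

# With a tight budget, a joke post that happens to rank first may be the
# *only* context the model sees; faithfully summarizing it is not a
# hallucination, just garbage in, garbage out.
prompt = build_overview_prompt(
    "is glue good on pizza",
    ["[reddit joke post] ...", "[food safety site] ...", "[recipe blog] ..."],
    budget_chars=40,
)
print(prompt)  # only the first snippet survives the 40-character budget
```

Run as written, the budget admits only the first snippet, so the “summary” would be grounded entirely in the joke post, which is exactly the failure mode being debated: faithful summarization of bad context rather than invention from nowhere.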