• VHS [he/him]@hexbear.net
    11 months ago

    CNET’s record is terrible: first the crapware bundling, then the factually incorrect articles written by machine learning algorithms, and now this.

  • AutoTL;DR@lemmings.world [bot]
    11 months ago

    This is the best summary I could come up with:


    In the memo, CNET says that so-called content pruning “sends a signal to Google that says CNET is fresh, relevant and worthy of being placed higher than our competitors in search results.” Stories slated to be “deprecated” are archived using the Internet Archive’s Wayback Machine, and authors are alerted at least 10 days in advance, according to the memo.

    “These metrics include page views, backlink profiles and the amount of time that has passed since the last update,” the memo reads.

    A comparison between Wayback Machine archives from 2021 and CNET’s own on-site article counter shows that hundreds — and in some cases, thousands — of stories have disappeared from each year stretching back to the mid-1990s.

    Red Ventures, a private equity-backed marketing firm that owns CNET, didn’t immediately respond to questions about the exact number of stories that have been removed.

    Red Ventures has applied a ruthless SEO strategy to its slate of outlets, which also includes The Points Guy, Healthline, and Bankrate.

    In the wake of the revelation that CNET had been quietly publishing AI-generated stories, and the resulting errors in those stories, Red Ventures temporarily paused the content and overhauled its AI policy.


    I’m a bot and I’m open source!
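
    (Aside on the Wayback Machine comparison mentioned above: here’s a rough sketch of how one could count unique archived CNET article URLs per year using the Internet Archive’s public CDX API. The /news/ path pattern and the year range are assumptions for illustration, not CNET’s actual URL structure.)

    import json
    import urllib.request

    def yearly_unique_captures(year):
        # Query the Wayback Machine CDX API for unique captured URLs
        # under a (hypothetical) article path in a given year.
        # A trailing * makes this a prefix match.
        query = (
            "https://web.archive.org/cdx/search/cdx"
            "?url=cnet.com/news/*"
            f"&from={year}0101&to={year}1231"
            "&output=json&fl=original&collapse=urlkey&limit=50000"
        )
        with urllib.request.urlopen(query) as resp:
            rows = json.load(resp)
        # With output=json, the first row is a header; an empty result is [].
        return max(len(rows) - 1, 0)

    for year in (2019, 2020, 2021):
        print(year, yearly_unique_captures(year))

    Comparing counts like these against the figures on CNET’s own on-site article counter is roughly the comparison the article describes.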

  • usrtrv@lemmy.ml
    11 months ago

    Wouldn’t using robots.txt do the same thing without deleting content?
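    Something like this would keep crawlers away from the old stories without deleting anything (the paths here are hypothetical; CNET’s real URL structure may differ):

        User-agent: *
        # Hypothetical paths for deprecated stories
        Disallow: /news/1996/
        Disallow: /news/1997/

    One caveat: Disallow only stops crawling, so pages already in Google’s index can linger in results. Dropping a page from the index while keeping it online takes a noindex robots meta tag or an X-Robots-Tag response header, and crawlers can only see those if the page stays crawlable.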