After its website was crippled for nearly a month by a cyberattack, the Internet Archive announced on Monday that it had restored one of its most valuable services—the Save Page Now feature that allows users to add copies of webpages to the organization’s digital library.
In a social media post, the Internet Archive said that web pages users had attempted to save since October 9 are now beginning to be archived, although it did not provide an estimate of when the process would be completed. So, if you were worried that all of that election coverage was in danger of disappearing, the Archive says it’s handling the backlog. And if you stopped archiving because it was down, get back to work.
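If you would rather script your captures than use the web form, a single request to the public Save Page Now endpoint is enough. Here is a minimal Python sketch; the target URL is a placeholder, and anonymous saves are rate-limited:

```python
import requests

# Ask the Wayback Machine to capture a page via the public
# Save Page Now endpoint. The target URL below is just an example.
target = "https://example.com/some-election-coverage"
resp = requests.get(f"https://web.archive.org/save/{target}", timeout=120)
resp.raise_for_status()

# On success the request resolves to the archived snapshot; its final URL
# includes the capture timestamp, e.g. /web/20241111123456/...
print("Archived at:", resp.url)
```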
The organization had been operating its collection in read-only mode since October 21 as it steadily worked to restore services.
Founded in 1996, the Internet Archive is a nonprofit based in San Francisco that provides access to historic web pages, digitized books, and a variety of other media that it has collected through partnerships with hundreds of physical libraries and other organizations.
Its unparalleled collection currently contains 835 billion web pages, 44 million books and texts, 15 million audio recordings, 10.6 million videos, 4.8 million images, and 1 million software programs.
I don’t know what kind of architecture web.archive.org has, but when it was offline, I thought we should really have something distributed that would let people store and host copies of the websites that are important to them.
Doesn’t i2p do something similar to this? I don’t know much about it, but I remember reading about it and thinking it’s like BitTorrent except no one person has the entire file, or something like that.
100PB on i2p is a funny idea, but it’s not necessarily a bad one.
Didn’t you mean IPFS? I2P is a mixnet, like the Tor network.
TIL it’s I2P and not L2P
You can save the WACZ files?
IPFS seems similar to what you’re looking for.
(See: a copy of Wikipedia on IPFS being censorship-resistant and globally distributed)
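For anyone curious what that looks like in practice, here is a rough sketch of pushing a locally saved snapshot (a WACZ export, say) to IPFS so others can pin and re-host it. It assumes a local Kubo (`ipfs`) daemon is installed and running, and the file name is just a placeholder:

```python
import subprocess

# Add a locally saved page archive to a running IPFS node so others
# can pin and re-host it. Assumes the `ipfs` CLI (Kubo) is installed
# and `ipfs daemon` is running; the file name is a placeholder.
snapshot = "my-archived-site.wacz"

# `ipfs add -Q` prints only the resulting content identifier (CID).
cid = subprocess.run(
    ["ipfs", "add", "-Q", snapshot],
    check=True, capture_output=True, text=True,
).stdout.strip()

print(f"Shareable, content-addressed copy: https://ipfs.io/ipfs/{cid}")
```

Because the CID is derived from the content itself, anyone who cares about that page can run `ipfs pin add <cid>` and keep an identical replica alive, which is exactly the “everyone hosts the sites they care about” idea upthread.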