Wikipedia is under attack — and how it can survive
-
The fact that you can download the entirety of the site for 111 GB sounds pretty damn impressive to me.
It doesn’t actually include all the media, and – I think – edit history. It does give you a decent offline copy of the articles with at least the thumbnails of images though.
Edit: If you want all the media from Wikimedia Commons (which may also include files that are not in Wikipedia articles directly) the stats for that are:
Total file size for all 126,598,734 files: 745,450,666,761,889 bytes (677.98 TB).
according to their media statistics page.
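As a quick sanity check on those units, here is a small Python sketch converting the quoted byte count; it comes out to roughly 745 TB in decimal units but roughly 678 TiB in binary units, which suggests the page’s “677.98 TB” label is really a base-2 figure:

```python
# Sanity check on the Wikimedia Commons total quoted above.
total_bytes = 745_450_666_761_889

print(f"{total_bytes / 10**12:.2f} TB  (decimal, base 10)")  # 745.45 TB
print(f"{total_bytes / 2**40:.2f} TiB (binary, base 2)")     # 677.98 TiB
```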
-
Wikipedia needs to leave the US at the least. Billionaires and AI pose a threat to all humans, and Wikipedia is no exception.
And the Internet Archive too
-
This post did not contain any content.
Paywall; DR
-
It doesn’t actually include all the media, and – I think – edit history. It does give you a decent offline copy of the articles with at least the thumbnails of images though.
Edit: If you want all the media from Wikimedia Commons (which may also include files that are not in Wikipedia articles directly) the stats for that are:
Total file size for all 126,598,734 files: 745,450,666,761,889 bytes (677.98 TB).
according to their media statistics page.
Nice stats. I always wondered. I get the feeling that ~678 TB is a little bit more than ~111 GB.
-
Nice stats. I always wondered. I get the feeling that ~678 TB is a little bit more than ~111 GB.
Like, at least 7GB bigger.
-
It doesn’t actually include all the media, and – I think – edit history. It does give you a decent offline copy of the articles with at least the thumbnails of images though.
Edit: If you want all the media from Wikimedia Commons (which may also include files that are not in Wikipedia articles directly) the stats for that are:
Total file size for all 126,598,734 files: 745,450,666,761,889 bytes (677.98 TB).
according to their media statistics page.
Dear god, are we still using base 2 for file sizes? At least use TiB like a reasonable person.
-
And the Internet Archive too
However, I wonder how this would work. As far as I know, the Internet Archive has “Library” status and rights in the US (and only in the US), which grants it the right to archive material and offer it for download in ways that would otherwise not be legal. That does not mean everything provided there is legal. So leaving the US could actually hurt the Internet Archive, or maybe its users in the US.
I would be glad if anyone with more insight into this topic could tell me one or two things about it.
-
Dear god, are we still using base 2 for file sizes? At least use TiB like a reasonable person.
It doesn’t matter in this case, as long as it is documented (and it is by the unit).
-
However, I wonder how this would work. As far as I know, the Internet Archive has “Library” status and rights in the US (and only in the US), which grants it the right to archive material and offer it for download in ways that would otherwise not be legal. That does not mean everything provided there is legal. So leaving the US could actually hurt the Internet Archive, or maybe its users in the US.
I would be glad if anyone with more insight into this topic could tell me one or two things about it.
It definitely cannot go to the EU. I don’t believe any EU country permits private online libraries.
Plus, the entire Wayback Machine would be considered systematic copyright infringement, since the Internet Archive doesn’t obtain permission prior to archival. And if you don’t have permission, then it is automatic copyright infringement.
-
It definitely cannot go to the EU. I don’t believe any EU country permits private online libraries.
Plus, the entire Wayback Machine would be considered systematic copyright infringement, since the Internet Archive doesn’t obtain permission prior to archival. And if you don’t have permission, then it is automatic copyright infringement.
There are still Nordic countries outside the EU. Switzerland appears to be heading in the wrong direction as well, so I might suggest the Seychelles.
-
It doesn’t matter in this case, as long as it is documented (and it is by the unit).
To be clear, I’m fine with RAM being base 2 – it’s rather difficult for it not to be given the structure – but for fixed storage, this is an old-school measurement that only gets worse with each order of magnitude.
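For what it’s worth, here is a small Python sketch of how the gap between the binary and decimal prefixes grows with each magnitude, which is the “gets worse” effect described above:

```python
# Ratio of each binary prefix to its decimal counterpart.
for power, pair in enumerate(["KiB/kB", "MiB/MB", "GiB/GB", "TiB/TB"], start=1):
    ratio = 1024**power / 1000**power
    print(f"{pair}: {ratio:.4f} (+{(ratio - 1) * 100:.1f}%)")

# KiB/kB: 1.0240 (+2.4%)
# MiB/MB: 1.0486 (+4.9%)
# GiB/GB: 1.0737 (+7.4%)
# TiB/TB: 1.0995 (+10.0%)
```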
-
The fact that you can download the entirety of the site for 111 GB sounds pretty damn impressive to me.
Text is light. Images are a bit heavier, but there aren’t too, too many.
-
Does The Verge have this big font, or is something broken on my end?
You can download the entirety of Wikipedia for offline usage, BTW. I do this with an application called Kiwix (https://kiwix.org/en/):
- Click “All Files” in the left menu of the program.
- In the bottom search bar (there is one top bar and one bottom bar), type “wikipedia” to show only the entries matching the search.
- Then click on the “Size” header to sort all entries by size. Usually the biggest one is the most complete.
- Now “Download” it (I already have it, so it says “Open” for me).
Note that the big one at 111 GB contains all English-language Wikipedia articles with images. The 43 GB one should be the same, I think, but without images. There are many other variants too, varying in content, theme, and even build date. For example, the one labelled “1m Top” contains only the top 1 million articles.
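For anyone who would rather script the download than click through the Kiwix app, a minimal Python sketch along these lines should work. The mirror URL and the exact ZIM filename below are assumptions based on the Kiwix download repository layout (filenames change with each monthly build), so check the current listing first:

```python
# Stream a Wikipedia ZIM file to disk using only the standard library.
import shutil
import urllib.request

# NOTE: hypothetical filename; the actual build name changes monthly,
# so look it up on the Kiwix download listing before running this.
url = ("https://download.kiwix.org/zim/wikipedia/"
       "wikipedia_en_all_maxi_2024-01.zim")
destination = "wikipedia_en_all_maxi.zim"

with urllib.request.urlopen(url) as response, open(destination, "wb") as out:
    shutil.copyfileobj(response, out)  # streams the ~111 GB file in chunks

print(f"Saved to {destination}")
```

The resulting .zim file can then be opened in Kiwix like any other downloaded entry.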
The problem with this solution is that it leaves out the most important part of Wikipedia of all: the editors. Wikipedia is a living document, constantly being updated and improved. Sure, you can preserve a fossil version of it. But if the site itself goes down, that fossil will lose value rapidly, and it’s not even going to be useful for creating a new live site, because it doesn’t include the full history of articles (legally required under Wikipedia’s license) and won’t be the latest database dump from the moment Wikipedia shut down.
-
Does The Verge have this big font, or is something broken on my end?
You can download the entirety of Wikipedia for offline usage, BTW. I do this with an application called Kiwix (https://kiwix.org/en/):
- Click “All Files” in the left menu of the program.
- In the bottom search bar (there is one top bar and one bottom bar), type “wikipedia” to show only the entries matching the search.
- Then click on the “Size” header to sort all entries by size. Usually the biggest one is the most complete.
- Now “Download” it (I already have it, so it says “Open” for me).
Note that the big one at 111 GB contains all English-language Wikipedia articles with images. The 43 GB one should be the same, I think, but without images. There are many other variants too, varying in content, theme, and even build date. For example, the one labelled “1m Top” contains only the top 1 million articles.
Thanks for sharing this. I started hosting a local copy of several wiki sources last weekend, once this news broke.
Another commenter said downloading is missing out on the best part of Wikipedia, the ongoing editing. Which, while true, is also going to be a weak point.
How many of those amazing editors are going to stick around when their full-time job becomes combatting obvious right-wing bullshit, when they have to submit gov ID to have an account on the site, and when common sense and fairness become a crime?
Wikipedia was a high point for humanity. Whatever comes next, I’d like to preserve a little piece of it.
-
Does The Verge have this big font, or is something broken on my end?
You can download the entirety of Wikipedia for offline usage, BTW. I do this with an application called Kiwix (https://kiwix.org/en/):
- Click “All Files” in the left menu of the program.
- In the bottom search bar (there is one top bar and one bottom bar), type “wikipedia” to show only the entries matching the search.
- Then click on the “Size” header to sort all entries by size. Usually the biggest one is the most complete.
- Now “Download” it (I already have it, so it says “Open” for me).
Note that the big one at 111 GB contains all English-language Wikipedia articles with images. The 43 GB one should be the same, I think, but without images. There are many other variants too, varying in content, theme, and even build date. For example, the one labelled “1m Top” contains only the top 1 million articles.
The best thing is that it works flawlessly on the mobile apps as well, and there is also a build with roughly the 1 million most relevant articles, which is just a few gigabytes.
-
Some solution is better than no solution. I don’t mind having a ‘fossil’ version for a pinch. We got along okay with hardcover encyclopedias pre-internet, and this is not that different, except that it still relies on electricity. (I have different, more valuable books on hand if we ever wind up THAT fucked.)
-
Some solution is better than no solution. I don’t mind having a ‘fossil’ version for a pinch. We got along okay with hardcover encyclopedias pre-internet, and this is not that different, except that it still relies on electricity. (I have different, more valuable books on hand if we ever wind up THAT fucked.)
My point is that the alternative isn’t “no solution”; it’s “the much better database dump from the Internet Archive or the Wikimedia Foundation or wherever, the one that a new Wikipedia instance actually would be spun up from, not the one that you downloaded months ago and stashed in your closet.”
The fact that random people on the Internet have old, incomplete, static copies of Wikipedia doesn’t really help anything. The real work that would go into bringing back Wikipedia would be creating the new hosting infrastructure capable of handling it, not trying to scrounge up a database to put on it.
-
Wikipedia is not at risk of being shut down; the danger is malevolent editors bringing the culture war inside it and destroying “truth”. While it would be great to keep Wikipedia as it is, “they” are coming for it; Wikipedia doesn’t get to be excluded from the war. For now, the best we can hope for is that it will survive, but the best we can do is save local copies of Wikipedia in case the worst happens. Which isn’t shutdown, but corruption.