Wikipedia:Frequently asked questions/Downloading and linking
How do I download Wikipedia content?
Regular dumps of the database can be found at this address: http://download.wikimedia.org/
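Dump files on that server are conventionally organized by wiki name, dump date, and file name. As a sketch, the helper below composes such a URL; the path layout and the example file name are assumptions based on common usage, so browse the index page to find the actual files:

```python
# Sketch: compose the conventional URL of a database dump file.
# The path layout <base>/<wiki>/<date>/<filename> and the example file name
# are assumptions -- check the dump site's index for what actually exists.
from urllib.parse import urljoin

DUMP_BASE = "http://download.wikimedia.org/"

def dump_url(wiki: str, date: str, filename: str) -> str:
    """Return the URL for one dump file under the conventional layout."""
    return urljoin(DUMP_BASE, f"{wiki}/{date}/{filename}")

print(dump_url("enwiki", "latest", "enwiki-latest-pages-articles.xml.bz2"))
```

A downloader would then fetch that URL with any HTTP client and verify the published checksum before use.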
How do I download Wikipedia software?
Wikipedia runs on MediaWiki. To install MediaWiki on your own server, go to the MediaWiki website and scroll down to the "Download" area; the latest stable release is listed at the top of that section. Download the software and extract it with any archive utility.
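If you prefer to script the extraction step, the Python standard library can unpack the release tarball. A minimal sketch; the archive name below is a placeholder for whichever version you actually downloaded:

```python
# Minimal sketch: unpack a downloaded MediaWiki release archive.
# "mediawiki-core.tar.gz" is a placeholder file name -- substitute the
# tarball you fetched from the MediaWiki site's "Download" section.
import tarfile

def unpack_release(archive: str, dest: str = ".") -> None:
    """Extract a .tar.gz release archive into the dest directory."""
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest)  # creates a mediawiki-<version>/ directory

# e.g. unpack_release("mediawiki-core.tar.gz", "/var/www")
```

The extracted directory is the web root you then point your web server at, following the installation instructions on the MediaWiki site.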
What kind of computer system do I need to run Wikipedia software?
Minimum system requirements and installation instructions can be found on the MediaWiki website under "Manual:Installation".
Can I fork individual Wikipedia articles?
Wikipedia considers each article to be an individual document. Moreover, for the purposes of creating derivative works of individual articles, Wikipedia considers a direct link back to a particular article to be in full compliance with the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA), provided your derivative work is also licensed under CC-BY-SA.
Most (but not all) of our articles can also be reused under the GNU Free Documentation License (GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts). See Wikipedia:Copyrights#Reusers' rights and obligations for how to identify text which is not available under the GFDL.
How much storage is required for a copy of the English Wikipedia?
The data dump of all pages and their histories is a several-gigabyte file which, according to the decompression warnings on the database download page, could theoretically expand to at least a terabyte. Paul Swanson's How to Mirror Wikipedia, however, suggests a much more modest size. The images on Commons amount to several terabytes (see Commons:Commons:MIME type statistics), but not all of these will be needed, and a script can be used to fetch only the files referenced in Wikipedia articles. Alternatively, you can now use InstantCommons. Wikipedia itself had about 420 GB of images as of 2008.
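Because the decompressed dump may not fit on disk, one practical check before committing storage is to stream-decompress the archive and only count bytes, never writing the expanded text out. A minimal sketch using Python's standard bz2 module; the dump file name in the comment is illustrative:

```python
# Sketch: measure how large a .bz2 dump expands to without writing the
# decompressed output to disk -- only one chunk is held in memory at a time.
import bz2

def decompressed_size(path: str, chunk_size: int = 1 << 20) -> int:
    """Stream-decompress the bz2 file at `path`; return its expanded size in bytes."""
    total = 0
    with bz2.open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            total += len(chunk)
    return total

# e.g. decompressed_size("enwiki-latest-pages-articles.xml.bz2")
```

The same streaming pattern works for processing the dump's XML directly out of the compressed file, which is how most mirroring scripts avoid needing the full decompressed size on disk.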