MediaWiki backup

{{Template:MediaWiki Installation}}


{{Wiki version| wiki_engine=MediaWiki | version=1.31.0 }}

'''Easy backup method'''


Input all page names into [[Special:Export]] and export all of the content as an XML file. Steps:
# Generate the list of all page names:
#* For newer MediaWiki: use the [https://www.mediawiki.org/wiki/Extension:OneColumnAllPages Extension:OneColumnAllPages - MediaWiki] extension
#* For MediaWiki 1.21.2: I modified the [http://www.mediawiki.org/wiki/Extension:DynamicWikiSitemap Extension:DynamicWikiSitemap] into [http://wiki.planetoid.info/lists.phps list.php]; the partial original code and my change are shown below
# Copy the list of all page names to [[Special:Export]] (or script the whole procedure against the API, as sketched after the code notes below)
# Export the ''latest'' revision of all pages (choosing to include all the old revisions makes the export take much longer)

Partial original code of the output section (this fragment sits inside the extension's do ... while loop over the page rows):
<pre>
# -----------------------------------------------------
# Start output
# -----------------------------------------------------
?>
<url>
<loc><?php echo fnXmlEncode( "http://" . $wgServerName . eregi_replace('\$1', $sPageName, $wgArticlePath) ) ?></loc>
<lastmod><?php echo fnTimestampToIso($row_rsPages['page_touched']); ?></lastmod>
<changefreq>weekly</changefreq>
<priority><?php echo $nPriority ?></priority>
</url>
<?php } while ($row_rsPages = mysql_fetch_assoc($rsPages)); ?>
</urlset>
</pre>


I changed the output part so that it prints only the page name, one per line:
<pre>
# -----------------------------------------------------
# Start output
# -----------------------------------------------------
?>
<?php echo fnXmlEncode( $sPageName ) ?>

<?php } while ($row_rsPages = mysql_fetch_assoc($rsPages)); ?>
</urlset>
</pre>
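
The whole procedure (steps 1&ndash;3) can also be scripted against the MediaWiki Action API instead of pasting titles into the [[Special:Export]] form. The following is only a rough sketch under a few assumptions: the wiki's <code>api.php</code> is reachable at the hypothetical URL <code>https://wiki.example.org/api.php</code>, anonymous reads are allowed, PHP's <code>allow_url_fopen</code> is enabled, and anonymous clients may list up to 500 titles and export up to 50 titles per request.
<pre>
<?php
// Step 1: collect every page title (main namespace by default; set
// 'apnamespace' to cover other namespaces), following the API continuation.
$apiUrl = 'https://wiki.example.org/api.php';   // hypothetical URL, adjust to your wiki
$titles = [];
$params = ['action' => 'query', 'list' => 'allpages', 'aplimit' => 500, 'format' => 'json'];
do {
    $data = json_decode(file_get_contents($apiUrl . '?' . http_build_query($params)), true);
    foreach ($data['query']['allpages'] as $page) {
        $titles[] = $page['title'];
    }
    $params['apcontinue'] = $data['continue']['apcontinue'] ?? '';
} while ($params['apcontinue'] !== '');

// Steps 2-3: export the *latest* revision of the collected pages as XML,
// 50 titles per request, one file per batch (re-import the files one by one
// via Special:Import or maintenance/importDump.php).
foreach (array_chunk($titles, 50) as $i => $batch) {
    $query = [
        'action'       => 'query',
        'titles'       => implode('|', $batch),
        'export'       => 1,   // export the current revisions of the listed pages
        'exportnowrap' => 1,   // return bare <mediawiki> export XML, no API envelope
    ];
    $xml = file_get_contents($apiUrl . '?' . http_build_query($query));
    file_put_contents(sprintf('wiki-backup-%03d.xml', $i), $xml);
}
</pre>
If you have shell access to the server, MediaWiki's maintenance script <code>dumpBackup.php</code> (with <code>--current</code> for the latest revisions only) is another way to get the same kind of XML dump without going through the web interface.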
 
 
'''Other approach: Database backup''' (see the mysqldump sketch after the links below)
* [http://www.mediawiki.org/wiki/Manual:Moving_a_wiki Manual:Moving a wiki - MediaWiki] / [http://meta.wikimedia.org/wiki/Restore_Database Restore Database]
* [[MySQL_commands#Exporting_data_of_database.2Ftable_into_MySql_sql_file | Exporting data of database/table into MySql sql file]]
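
For the database route, the usual tool is <code>mysqldump</code>. The sketch below is only an outline with placeholder credentials; the real values are whatever <code>$wgDBserver</code>, <code>$wgDBname</code>, <code>$wgDBuser</code> and <code>$wgDBpassword</code> say in your <code>LocalSettings.php</code>, and the <code>images/</code> directory plus <code>LocalSettings.php</code> itself still need to be copied separately.
<pre>
<?php
// Rough sketch only: dump the wiki database to a .sql file with mysqldump.
// The values below are placeholders -- use the database settings from your
// LocalSettings.php ($wgDBserver, $wgDBname, $wgDBuser, $wgDBpassword).
$host    = 'localhost';
$db      = 'wikidb';
$user    = 'wikiuser';
$pass    = 'secret';
$outFile = 'wikidb-backup.sql';

// Requires the mysqldump client to be installed on the server.
$cmd = sprintf(
    'mysqldump --host=%s --user=%s --password=%s %s > %s',
    escapeshellarg($host),
    escapeshellarg($user),
    escapeshellarg($pass),
    escapeshellarg($db),
    escapeshellarg($outFile)
);
exec($cmd, $output, $exitCode);
echo $exitCode === 0 ? "Dump written to $outFile\n" : "mysqldump failed (exit code $exitCode)\n";
</pre>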




'''Similar approach'''
* [http://www.chieftain.idv.tw/archives/2008/01/25/1337.html Reflection » Blog Archive » Export all the page names on your Mediawiki-powered Wiki]
 
* [https://errerrors.blogspot.com/2020/03/alternative-approach-to-upgrade-mediawiki.html Alternative approach to upgrade MediaWiki]


[[Category:MediaWiki]]
