MediaWiki backup
Applies to: MediaWiki 1.6.7, 1.11.0, 1.15.0+
Easy backup method: feed a list of all page names into Special:Export
- Generate a list of all page names
- I modified the Google Sitemaps script into list.php. Part of the original code is as follows:
# -----------------------------------------------------
# Start output
# -----------------------------------------------------
?>
<url>
<loc><?php echo fnXmlEncode( "http://" . $wgServerName . eregi_replace('\$1',$sPageName,$wgArticlePath) ) ?></loc>
<lastmod><?php echo fnTimestampToIso($row_rsPages['page_touched']); ?></lastmod>
<changefreq>weekly</changefreq>
<priority><?php echo $nPriority ?></priority>
</url>
<?php } while ($row_rsPages = mysql_fetch_assoc($rsPages)); ?>
</urlset>
I changed the output part as follows:
# -----------------------------------------------------
# Start output
# -----------------------------------------------------
?>
<?php echo fnXmlEncode( $sPageName ) ?>
<?php } while ($row_rsPages = mysql_fetch_assoc($rsPages)); ?>
</urlset>
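If the Google Sitemaps script is not at hand, the same list can be produced by querying the page table directly. The following is only a minimal sketch, not part of the original list.php: the host, credentials, database name, and empty table prefix are placeholder assumptions that should be replaced with the values from LocalSettings.php, and it only covers the main namespace.

<?php
// Minimal sketch (not the original list.php): print all main-namespace
// page titles, one per line, in a form that Special:Export accepts.
// The connection details below are placeholders; use the values from
// LocalSettings.php ($wgDBserver, $wgDBuser, $wgDBpassword, $wgDBname)
// and add your table prefix to "page" if $wgDBprefix is set.
$db = new mysqli('localhost', 'wikiuser', 'secret', 'wikidb');
if ($db->connect_error) {
    die('Connection failed: ' . $db->connect_error);
}

header('Content-Type: text/plain; charset=UTF-8');

// page_namespace = 0 restricts the list to the main namespace; pages in
// other namespaces would need their textual prefix (e.g. "Talk:") added.
$rsPages = $db->query('SELECT page_title FROM page WHERE page_namespace = 0 ORDER BY page_title');
while ($row = $rsPages->fetch_assoc()) {
    // Titles are stored with underscores; spaces match what the
    // Special:Export text box normally shows.
    echo str_replace('_', ' ', $row['page_title']), "\n";
}
$db->close();

Saving this as a separate script and opening it in a browser gives a plain-text list that can be copied straight into the next step.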
- Copy the list of all page names into Special:Export
- Export the latest revision of all pages. (If you choose to include all old revisions, importing the content will take much longer.)
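The copy-and-paste step can also be scripted. The sketch below is an assumption rather than part of the original tip: it POSTs the saved page list to Special:Export using the commonly documented form fields (pages for the newline-separated titles, curonly to limit the dump to the latest revisions) and writes the returned XML to a file. The wiki URL and the file names pagelist.txt and wiki-backup.xml are placeholders.

<?php
// Sketch: send the page list to Special:Export and save the XML dump.
// Assumptions: the wiki is reachable at http://example.org/w/index.php,
// it allows anonymous exports, and the list generated above was saved
// to pagelist.txt. The "pages" and "curonly" names follow the usual
// Special:Export form fields; verify them against your wiki's version.
$endpoint = 'http://example.org/w/index.php?title=Special:Export&action=submit';
$pages    = file_get_contents('pagelist.txt');

$ch = curl_init($endpoint);
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => http_build_query(array(
        'pages'   => $pages,  // newline-separated page titles
        'curonly' => '1',     // latest revision only, as recommended above
    )),
));
$xml = curl_exec($ch);
if ($xml === false) {
    die('Export failed: ' . curl_error($ch));
}
curl_close($ch);

file_put_contents('wiki-backup.xml', $xml);
echo 'Saved ' . strlen($xml) . " bytes to wiki-backup.xml\n";

The resulting XML can later be restored through Special:Import or the importDump.php maintenance script.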