Here are some tips to increase XWiki's performance.


If you need high availability, or if the load on your XWiki instance is too high, you can configure XWiki in a cluster to spread the load.

Standalone Solr

By default XWiki uses an embedded instance of Solr for ease of use, but if you find yourself struggling with very slow search you should probably give an external Solr instance a try. You can add debug=true to the URL of the search to check how much time is spent in Solr itself, to know whether improving Solr speed would really change anything (the issue might also come from the UI on the XWiki side).

See Performance Guide in Solr module documentation.

Slow random number generation on UNIX

The library used for random number generation in Oracle's JVM relies on /dev/random by default for UNIX platforms.

Although /dev/random is more secure, it's possible to configure the JVM to use /dev/urandom instead of the default if random number generation is too slow.

To determine if your operating system exhibits this behavior, try displaying a portion of the file from a shell prompt:

head -n 1 /dev/random

If the command returns immediately, you can keep /dev/random as the default generator for Oracle's JVM. If the command does not return immediately (i.e. it blocks), use one of the following solutions to switch to /dev/urandom:

JVM setup

  1. Open the $JAVA_HOME/jre/lib/security/ file in a text editor.
  2. Change the line:
        to read:
  3. Save your change and exit the text editor.
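The change in question can be sketched as follows (the java.security file name and the securerandom.source property are the standard Oracle/OpenJDK ones, not shown on this page; verify against your JVM):

```properties
# In $JAVA_HOME/jre/lib/security/java.security
# change:
#   securerandom.source=file:/dev/random
# to:
securerandom.source=file:/dev/./urandom
```

The extra "/./" works around older JVMs that silently treat file:/dev/urandom as an alias for /dev/random.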

Command line parameter

The same effect can be obtained with a system property on the Java command line (usually set in the application server configuration).
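For example (the flag below is the standard JVM system property for overriding the entropy source; CATALINA_OPTS is Tomcat-specific, adjust for your application server):

```shell
# Append the entropy-source override to the JVM options (Tomcat example).
# The "/./" indirection works around older JVMs that silently remap
# file:/dev/urandom back to /dev/random.
CATALINA_OPTS="${CATALINA_OPTS:-} -Djava.security.egd=file:/dev/./urandom"
echo "$CATALINA_OPTS"
```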

Gzip compression and caching of static pages

We're working on making these features part of the XWiki core (see XWIKI-2022). While waiting for this to be natively implemented, the recommended solution is to set up an Apache Web Server in front of your servlet container and install/configure the following modules:

Modify your Apache configuration file to load the different modules:

LoadModule expires_module /usr/lib/apache2/modules/
LoadModule deflate_module /usr/lib/apache2/modules/
LoadModule proxy_module /usr/lib/apache2/modules/
# Depends: proxy
LoadModule proxy_ajp_module /usr/lib/apache2/modules/

Alternatively, you can run the following commands as root (or with sudo):

a2enmod deflate
a2enmod proxy_ajp
a2enmod expires

and configure your different modules as described below:

Mod Deflate Configuration

vwwwpro-1:~# cat /etc/apache2/conf.d/deflate
<Location />
   # Insert filter
   SetOutputFilter DEFLATE

   # Netscape 4.x has some problems...
   BrowserMatch ^Mozilla/4 gzip-only-text/html

   # Netscape 4.06-4.08 have some more problems
   BrowserMatch ^Mozilla/4\.0[678] no-gzip

   # MSIE masquerades as Netscape, but it is fine
   # BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

   # NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
   # the above regex won't work. You can use the following
   # workaround to get the desired effect:
   BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html

   # Don't compress images
   SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary

   # Make sure proxies don't deliver the wrong content
   #Header append Vary User-Agent env=!dont-vary
</Location>

On Debian, the Apache 2 configuration file for mod_deflate is located at /etc/apache2/mods-enabled/deflate.conf.

Mod Expire Configuration

vwwwpro-1:~# cat /etc/apache2/conf.d/expires
<Location /xwiki/skins/>
       ExpiresActive on
       ExpiresDefault "access plus 1 day"
</Location>

<Location /xwiki/bin/skin/>
       ExpiresActive on
       ExpiresDefault "access plus 1 day"
</Location>

Mod Proxy AJP Configuration

ProxyRequests Off
<Proxy *>
    Order deny,allow
    Allow from all
</Proxy>
ProxyPreserveHost On
ProxyPass /xwiki ajp://

where ajp:// must be completed with the internal address (host and AJP port) of the Servlet container where XWiki is running.

If you use Tomcat (7), you need to enable the AJP connector in /etc/tomcat7/server.xml. Comment out the HTTP connector by wrapping it in <!-- -->, ideally adding a note explaining why:

    <!-- disable to use ajp connector instead
    <Connector port="8080" protocol="HTTP/1.1"
               redirectPort="8443" />
    -->

Then uncomment the AJP connector by removing the <!-- --> markers and add URIEncoding="UTF-8" to it, again with an explanatory comment:

<!-- Activate ajp connector for apache proxy_ajp -->
<Connector port="8009" protocol="AJP/1.3" redirectPort="8443" URIEncoding="UTF-8"/>


Memory

You need to configure your Servlet container so that XWiki has enough memory. You'll need to tune the value to your needs: check the logs and see if there are any "out of memory" errors. Here are some good default values:

  • For Java 8 (i.e. XWiki >= 8.1). Notice that there's no permgen anymore in Java 8:
    • Small and medium installs: A minimum of 1024MB (-Xmx1024m)
    • Large installs: 2048MB or beyond (-Xmx2048m).
  • For Java 7 (i.e. XWiki < 8.1)
    • Small installs: A minimum of 512MB of heap memory and 196MB of permGen (-Xmx512m -XX:MaxPermSize=196m)
    • Medium installs: 1024MB for the heap and 196MB of permGen (-Xmx1024m -XX:MaxPermSize=196m)
    • Large installs: 2048MB (or beyond) for the heap and 196MB of permGen (-Xmx2048m -XX:MaxPermSize=196m).
You should not increase the memory beyond what you need: a larger heap means more objects in memory at any time, so the JVM's automatic Garbage Collector has to work harder, which can result in performance degradation in XWiki (a full GC pauses the application for a longer time).

Note that storing attachments with the default (in database) storage mechanism is very memory intensive. See the administrators guide to attachments for more information about memory cost and the alternative filesystem based attachment store.

Also note that uploading a lot of pages can trigger out-of-memory (OOM) errors due to scheduled watchlist jobs. For example, uploading 1 million pages will trigger OOM errors even when the JVM is configured with 2GB of heap space. For this kind of load we recommend disabling (unscheduling) the Watchlist jobs (in /xwiki/bin/view/Scheduler/) before uploading the pages. To track progress on this issue, see XWIKI-10594.

If you use HSQLDB as the wiki database, be aware that the full content of the database is stored in memory and thus the memory requirements are higher. See HSQLDB installation page for more details.

For your information here are the values used for the site:

CATALINA_OPTS="-server -Xms800m -Xmx1480m -XX:MaxPermSize=222m -Dfile.encoding=utf-8 -Djava.awt.headless=true -XX:+UseParallelGC -XX:MaxGCPauseMillis=100"

Database Indexes

Make sure you've set up the database indexes. This is especially important once you start having lots of documents.

Large number of users

When you have a large number of users it's recommended to turn on the implicit "All Group", i.e. to consider that all users are members of XWiki.XWikiAllGroup by default. This is achieved by editing the xwiki.cfg file and setting:
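As a sketch, the property in question is presumably the standard one from xwiki.cfg (the exact name is not shown on this page; verify it against the comments in your version's file):

```properties
xwiki.authentication.group.allgroupimplicit=1
```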

Then you should remove all the XObjects from the XWikiAllGroup page, but keep the page itself, since otherwise you won't be able to set permissions for this group. This prevents XWiki from having to load all of that page's XObjects representing the users (thousands of them if you have thousands of users).

Also make sure that the XWikiAllGroup is listed in the xwiki.users.initialGroups property (it's there by default if you haven't touched that property):

#-# List of groups that a new user should be added to by default after registering. Comma-separated list of group
#-# document names.
# xwiki.users.initialGroups=XWiki.XWikiAllGroup


Panels

Some panels take more resources than others. For example, the Navigation panel should NOT be used on wikis with a lot of documents since it displays all documents in the wiki. That panel may be optimized in the future, but for now it was only meant as a starting point. A better approach is to use a "Quick Links" panel, as set up in the default XWiki Enterprise wiki since version 1.1 (the default usage of the Navigation panel was removed in that version).


Robots.txt

If your wiki is open on the Internet, it'll be crawled by search robots (GoogleBot, etc.). They will call all the URLs, especially the resource-hungry ones like PDF/RTF exports. You need to protect against this: configure a robots.txt file and serve it from the root of your web site.

An example:

User-agent: *
# Prevent bots from executing all actions except "view" since:
# 1) we don't want bots to execute stuff in the wiki!
# 2) we don't want bots to consume CPU and memory
# (for example to perform exports)
# Note: You may want to allow /download/ if you wish to have
# attachments indexed.
Disallow: /xwiki/bin/viewattachrev/
Disallow: /xwiki/bin/viewrev/
Disallow: /xwiki/bin/pdf/
Disallow: /xwiki/bin/tex/
Disallow: /xwiki/bin/edit/
Disallow: /xwiki/bin/create/
Disallow: /xwiki/bin/inline/
Disallow: /xwiki/bin/preview/
Disallow: /xwiki/bin/save/
Disallow: /xwiki/bin/saveandcontinue/
Disallow: /xwiki/bin/rollback/
Disallow: /xwiki/bin/deleteversions/
Disallow: /xwiki/bin/cancel/
Disallow: /xwiki/bin/delete/
Disallow: /xwiki/bin/deletespace/
Disallow: /xwiki/bin/undelete/
Disallow: /xwiki/bin/reset/
Disallow: /xwiki/bin/register/
Disallow: /xwiki/bin/propupdate/
Disallow: /xwiki/bin/propadd/
Disallow: /xwiki/bin/propdisable/
Disallow: /xwiki/bin/propenable/
Disallow: /xwiki/bin/propdelete/
Disallow: /xwiki/bin/objectadd/
Disallow: /xwiki/bin/commentadd/
Disallow: /xwiki/bin/commentsave/
Disallow: /xwiki/bin/objectsync/
Disallow: /xwiki/bin/objectremove/
Disallow: /xwiki/bin/attach/
Disallow: /xwiki/bin/upload/
Disallow: /xwiki/bin/download/
Disallow: /xwiki/bin/temp/
Disallow: /xwiki/bin/downloadrev/
Disallow: /xwiki/bin/dot/
Disallow: /xwiki/bin/svg/
Disallow: /xwiki/bin/delattachment/
Disallow: /xwiki/bin/skin/
Disallow: /xwiki/bin/jsx/
Disallow: /xwiki/bin/ssx/
Disallow: /xwiki/bin/login/
Disallow: /xwiki/bin/loginsubmit/
Disallow: /xwiki/bin/loginerror/
Disallow: /xwiki/bin/logout/
Disallow: /xwiki/bin/charting/
Disallow: /xwiki/bin/lock/
Disallow: /xwiki/bin/redirect/
Disallow: /xwiki/bin/admin/
Disallow: /xwiki/bin/export/
Disallow: /xwiki/bin/import/
Disallow: /xwiki/bin/get/
Disallow: /xwiki/bin/distribution/
Disallow: /xwiki/bin/imagecaptcha/
Disallow: /xwiki/bin/unknown/
Disallow: /xwiki/bin/webjars/
# Don't index sandbox content since it's sample content
Disallow: /xwiki/bin/view/Sandbox/
# Don't index Admin space since it contains Admin stuff.
# Note that the Admin space is protected by permissions
# anyway but this acts as a safety net to not have private
# info leaked on the internet ;)
Disallow: /xwiki/bin/view/Admin/
# Don't index Stats data (just because it's not useful and
# those pages are a bit CPU intensive)
Disallow: /xwiki/bin/view/Stats/
# Don't index Panels data (because we don't want it
# indexed on the internet)
Disallow: /xwiki/bin/view/Panels/

Another example:

# It can also be useful to block certain spaces from crawling,
# especially if those spaces don't provide new content
Disallow: /xwiki/bin/view/Main/
Disallow: /xwiki/bin/view/XWiki/
# On the other hand you would like to have your recent (public) changes included
Allow: /xwiki/bin/view/Main/Dashboard


For Tomcat 6, the robots.txt file should be placed in the $TOMCAT/webapps/ROOT folder with permissions 644 applied:

-rw-r--r--  1 root  www  1478 Jan  8 15:52 robots.txt
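A minimal sketch of installing the file (the demo below uses a temporary directory as $TOMCAT so it can run anywhere; in production, point TOMCAT at your real Tomcat home instead):

```shell
# Demo: install robots.txt into a ROOT webapp with 644 permissions.
TOMCAT=$(mktemp -d)
mkdir -p "$TOMCAT/webapps/ROOT"
printf 'User-agent: *\nDisallow: /xwiki/bin/edit/\n' > "$TOMCAT/robots.txt"
install -m 644 "$TOMCAT/robots.txt" "$TOMCAT/webapps/ROOT/robots.txt"
ls -l "$TOMCAT/webapps/ROOT/robots.txt"
```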

To verify that the robots.txt file is accessible and working as desired, use one of the online robots.txt checkers.


Statistics

The statistics module is off by default since it's quite database-intensive; if you don't need it, leave it off. Note that the overhead is much smaller starting with XE 1.4M2: statistics are now put on a queue and written to the database in one go from a separate thread.

Document Cache

You can tune the document cache in the xwiki.cfg configuration file. The best value depends on how much memory you have; the higher the better. A reasonable value is 1000.
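For example, in xwiki.cfg (the property name below is the standard one; double-check it against the comments in your own file):

```properties
#-# Maximum number of documents to keep in the document cache
xwiki.store.cache.capacity=1000
```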

Cache Macro

It's possible to perform selective content caching by using the Cache Macro.

LESS CSS Performance

LESS is a preprocessor used to generate the CSS files of skins and skin extensions. See the Performance section of the LESS module documentation to learn how to optimize its cache and how to set the number of simultaneous compilations your server can handle.

Rendering cache

Some pages are complex to render (they may aggregate outside data, for example, or perform complex and slow queries). For these pages you can use the rendering cache.

Configuration based

Pages can be cached (i.e. their rendered content cached) to speed up display. The configuration is done with the following configuration options:

#-# [Since 2.4M1]
#-# Indicate if the rendering cache is enabled.
#-# Default value is false.
# core.renderingcache.enabled=true

#-# [Since 2.4M1]
#-# A list of Java regex patterns matching full documents reference.
# core.renderingcache.documents=wiki:Space\.Page
# core.renderingcache.documents=wiki:Space\..*

#-# [Since 2.4M1]
#-# The time (in seconds) after which data should be removed from the cache when not used.
#-# Default value is 300 (5 min).
# core.renderingcache.duration=300

#-# [Since 2.4M1]
#-# The size of the rendering cache. Note that it's not the number of cached documents but the number of cached results.
#-# (For a single document several cache entries are created, because each action, language and request query string
#-# produces a unique rendering result)
#-# Default value is 100.
# core.renderingcache.size=100

You can force a page to refresh using refresh=1 in the URL.

Since 6.2 it's also possible to programmatically refresh any document's cache using the com.xpn.xwiki.internal.cache.rendering.RenderingCache component:

import javax.inject.Inject;
import org.xwiki.model.reference.DocumentReference;
import com.xpn.xwiki.internal.cache.rendering.RenderingCache;

@Inject
private RenderingCache renderingCache;

renderingCache.flushCache(new DocumentReference("xwiki", "MySpace", "MyCachedDocument"));

Merge the CSS files

To reduce the number of requests and files downloaded by the browser, it can help to merge all XWiki CSS files into a single one. See the Merge CSS Script.

Set up NginX

If you experience heavy loads on your wiki, you could try using NginX.

NginX is used to serve static content: images, JavaScript, stylesheets, etc., but it can also be used as a reverse proxy to pass requests down to the Servlet container (e.g. Tomcat on port 8080).

Unlike Apache, which instantiates a new process for every static file, NginX serves all static content from the same process, and thus gives you extra performance "for free".

For more info on setting up NginX check this guide.


Backlinks

While a pretty neat feature, keeping track of backlinks has a medium impact on document save time and a minor impact on document load time. If you feel that your wiki does not need backlinks, you can safely disable them with the following line in xwiki.cfg:
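The line in question is presumably the standard xwiki.cfg property (name assumed, not shown on this page; verify in your file):

```properties
xwiki.backlinks=0
```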



Versioning

Versioning is one of the key features of any wiki system, but it greatly affects the database size and the document update time. If you are sure your wiki does not need to keep track of all changes and you will never need to revert documents to a previous version, you can add the following line in xwiki.cfg:
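The line in question is presumably the standard xwiki.cfg property (name assumed, not shown on this page; verify in your file):

```properties
xwiki.store.versioning=0
```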

Custom Mapping

In some cases you may not want to rely on XWiki's generic database schema for storing XClass data and instead you'd like to provide your own optimized table. For these use cases you can use Custom Mapping.


Disable LDAP sub groups search

By default, when loading an LDAP group, each member is searched and loaded to figure out whether it's a group (and if so, its sub-group members are loaded in turn, etc.). Since 7.2, if you know there are no sub-groups in your LDAP groups, you can disable this behavior and speed up the handling of big groups considerably using the xwiki.authentication.ldap.group_sync_resolve_subgroups property in the xwiki.cfg configuration file.
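For example, in xwiki.cfg (the value 0 to disable is an assumption; check the property's comments in your file):

```properties
#-# Don't check whether LDAP group members are themselves groups
xwiki.authentication.ldap.group_sync_resolve_subgroups=0
```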

Performance tree

Since 7.1 it's possible to directly get a tree of time spent in each step of the request by using debug mode.


Monitor plugin

More of a developer-oriented feature, XWiki can monitor its own code, reporting the time spent in each sub-component activated during a request. While the monitoring code isn't time-consuming, it increases memory consumption a bit, and the create/start/stop/log/destroy calls are spread all around the code, so you will save a lot of method calls by disabling it. You can do that by setting the following line in xwiki.cfg:
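The line in question is presumably the standard xwiki.cfg property (name assumed, not shown on this page; verify in your file):

```properties
xwiki.monitor=0
```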


1.0 rendering cache using Velocity in the document content itself

You can add the following to a document's content to cache it after it is rendered. The cache is refreshed whenever the content of the document changes, and the cache takes the URL into account, so it is pretty safe to set a long cache duration for all documents that don't contain scripts gathering data from the wiki. For example, to cache the rendered content for 60 seconds you would add:


Since 1.5M2, you can set the default rendering cache duration for all pages in xwiki.cfg:

## cache all rendered documents for one hour
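The property itself is not shown here; it is presumably xwiki.rendering.defaultCacheDuration (an assumption, verify against the comments in your xwiki.cfg):

```properties
xwiki.rendering.defaultCacheDuration=3600
```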

Setting the default cache duration to a large value and manually disabling the cache in dynamic pages can really speed up the wiki, since rendering is one of the most time-consuming processes.

Wiki syntax features for XWiki Syntax 1.0

If you're using XWiki Syntax 1.0 and you don't plan to use all of the markup features, like the strikethrough filter, the automatic HTTP links filter, or the SVG, Laszlo and style macros, you can disable them in xwiki-core-*.jar/META-INF/services/com.xpn.xwiki.render.*. Wiki markup parsing is the most costly part of the rendering process, so every disabled feature counts.

Note that this has no effect if you're using another syntax, such as XWiki Syntax 2.x.

Created by VincentMassol on 2007/09/20 12:08
