Below are a few tips and tricks that you may find useful when you use Mayavi2.
Often you write Mayavi scripts to render a whole batch of images for an animation, and find that each time you save an image, Mayavi “raises” the window to make it the active window, disrupting your work. This is needed since VTK internally grabs the window to take the picture. Occluding the window will also produce blank or incorrect images.
If you already have a Python script, say script.py, that sets up your visualization and that you run like so:
$ mayavi2 -x script.py
Then it is very easy to have this script run offscreen. Simply run it like so:
$ mayavi2 -x script.py -o
This will run the script in an offscreen, standalone window. On Linux, this works best with VTK-5.2 and above. For more details on the command line arguments supported by the mayavi2 application, see the Command line arguments section.
When using mlab you will want to do this:
mlab.options.offscreen = True
before you create a figure and it will use an offscreen window for the rendering.
Another option for offscreen rendering is to click on the scene and turn on the “Off screen rendering” option. Or from a script:
mayavi.engine.current_scene.scene.off_screen_rendering = True
This will stop raising the window. However, this may not be enough. Please see below on the situation on different platforms.
Windows: If you are using win32 then off screen rendering should work well out of the box. All you will need to do is what is given above.
Linux and the Mac: there are several options to get this working correctly and some major issues to consider:
If you have VTK-5.2 or above, the offscreen rendering option should let you generate the pictures without worrying about occluding the window. There are still situations where this does not work – try it, and if you get blank windows, you have a problem. For example:
from mayavi import mlab
mlab.options.offscreen = True
mlab.test_contour3d()
mlab.savefig('example.png')
If this produces a clean image (even if you switch desktops or cover any windows produced), you should be golden. If not you should consider either using a virtual framebuffer or building VTK with Mesa + OSMesa to give you a pure software rendering approach.
VTK uses OpenGL for all its rendering. Under any conventional Unix (including Linux), you need an X server running to open a GL context (especially if you want hardware acceleration). This can be a problem when rendering on a headless server. As mentioned in the above paragraph, on a desktop, using the default server may also be a problem as it interferes with your ongoing work.
A good workaround is to use the virtual framebuffer X server for X11 like so:
Make sure you have the Xvfb package installed. For example under Debian and derivatives this is called the xvfb package.
Create the virtual framebuffer X server like so:

Xvfb :1 -screen 0 1280x1024x24 -auth localhost
This creates the display “:1” with a screen of size 1280x1024 at 24 bpp (the 24 bpp is important). For more options check your Xvfb man page.
Export the display to :1 like so (on bash):

$ export DISPLAY=:1
Now run your Mayavi script. It should run uninterrupted on this X server and produce your saved images.
This probably will have to be fine tuned to suit your taste.
Many Linux systems (including Ubuntu and Debian) ship with a helper script xvfb-run for running headless. The following command can run a Python script with Mayavi2 visualizations headless:
xvfb-run --server-args="-screen 0 1024x768x24" python my_script.py
Beware that you shouldn’t call mlab.show or start the main loop in the script, otherwise the script will run endlessly, waiting for interaction in a hidden window.
If you want to use Mayavi without the Envisage UI or even a traits UI (i.e. with a pure TVTK window) and do off screen rendering with Python scripts, you may be interested in the Offscreen example. This simple example shows how you can use Mayavi without the Envisage-based Mayavi application and still do off screen rendering.
If you are using mlab, outside of the Mayavi2 application, simply set:
mlab.options.offscreen = True
Sometimes you might want to run Mayavi/VTK completely headless on a machine with no X server at all and are interested in pure offscreen rendering (for example for usage on the Sage notebook interface). In these cases one could use Mesa’s OSMesa library to render offscreen. The downside is that you will not get any hardware acceleration in this case. Here are brief instructions on how to build VTK to do this.
Build a recent version of Mesa. 7.0.4 (as of this writing) is known to work, as is 7.2. We assume you download MesaLib-7.0.4.tar.bz2.
Untar, and change directory to the new directory created. We call this directory $MESA henceforth.
Run make configs/linux-x86, changing the config file as per your configuration (run make alone to see the list of options). Note: 7.2 has a ./configure script that you can run instead.
Get VTK-5.2 or later (CVS will also work).
Run ccmake path/to/VTK.
Now select advanced options ‘t’.
Set VTK_OPENGL_HAS_OSMESA ON
Configure: press ‘c’
Set the OSMESA_INCLUDE_DIR to the $MESA/include dir
Set OSMESA_LIBRARY to $MESA/lib/libOSMesa.so
Similarly set the OPENGL_INCLUDE_DIR, OPENGL_gl_LIBRARY=$MESA/lib/libGL.so, OPENGL_glu_LIBRARY, and OPENGL_xmesa_INCLUDE_DIR.
Set VTK_USE_OFFSCREEN to ON if you want offscreen rendering all the time. This will never produce an actual mapped VTK window, since the default value of the render window’s offscreen rendering ivar will be set to True in this case.
Set any other options you need, like VTK_USE_GL2PS, USE_RPATH, etc.
Configure again (press ‘c’) and then generate ‘g’.
Note that if you do not want to use ccmake and would like to do this from the command line you may also do (for example):

cmake \
    -DVTK_OPENGL_HAS_OSMESA=ON \
    -DVTK_USE_OFFSCREEN=ON \
    -DCMAKE_INSTALL_PREFIX=/path/to/vtk-offscreen \
    -DVTK_WRAP_PYTHON=ON \
    -DPYTHON_EXECUTABLE=/usr/bin/python2.5 \
    -DPYTHON_LIBRARY=/usr/lib/libpython2.5.so \
    -DBUILD_SHARED_LIBS=ON \
    -DVTK_USE_GL2PS=ON \
    -DOSMESA_INCLUDE_DIR=/path/to/Mesa-7.2/include/ \
    -DOSMESA_LIBRARY=/home/path/to/Mesa-7.2/lib64/libOSMesa.so \
    -DOPENGL_INCLUDE_DIR=/path/to/Mesa-7.2/include \
    -DOPENGL_gl_LIBRARY=/path/to/Mesa-7.2/lib64/libGL.so \
    -DOPENGL_glu_LIBRARY=/path/to/Mesa-7.2/lib64/libGLU.so \
    path/to/VTK/
Run make and wait till VTK has built. Let us say the build is in $VTK_BUILD.
Now install VTK or set PYTHONPATH and LD_LIBRARY_PATH suitably. Also ensure that LD_LIBRARY_PATH points to $MESA/lib (if the Mesa libs are not installed on the system); this ensures that VTK links to the right GL libs. For example:

$ export PYTHONPATH=$VTK_BUILD/bin:$VTK_BUILD/Wrapping/Python
$ export LD_LIBRARY_PATH=$VTK_BUILD/bin:$MESA/lib
Now, you should be all set.
Once this is done you should be able to run the mlab examples offscreen. This will work even without an X display.
With such a VTK built and running, one can simply build and install Mayavi2. To use it in a Sage notebook, for example, you’d want to set ETS_TOOLKIT='null' and set mlab.options.offscreen = True. That’s it. Everything should now work offscreen.
Note that if you set VTK_USE_OFFSCREEN to ON then you’ll by default only get offscreen contexts. If you do want a UI you will want to explicitly set the render window’s off_screen_rendering ivar to False to force a mapped window. For this reason if you might need to popup a full UI, it might be better to not set VTK_USE_OFFSCREEN=ON.
A developer may wish to customize Mayavi by adding new sources, filters or modules. This can be done by writing the respective classes and exposing them via a user_mayavi.py or a site_mayavi.py as described in Customizing Mayavi2. A more flexible and reusable mechanism is to create a full-fledged Mayavi contrib package in the following manner.
Create a Python package, let’s call it mv_iitb (for IIT Bombay specific extensions/customizations). The directory structure of this package can be something like so:

mv_iitb/
    __init__.py
    user_mayavi.py
    sources/
        ...
    filters/
        ...
    modules/
        ...
    docs/
        ...
The two key points to note in the above are the fact that mv_iitb is a proper Python package (notice the __init__.py) and the user_mayavi.py is the file that adds whatever new sources/filters/modules etc. to Mayavi. The other part of the structure is really up to the developer. At the moment these packages can add new sources, filters, modules and contribute any Envisage plugins that the mayavi2 application will load.
This package should then be installed somewhere on sys.path. Once this is done, users can find these packages and enable them from Tools->Preferences (the UI will automatically detect the package). The user_mayavi.py of each selected package will then be imported the next time Mayavi is started. Note that this will be usable even from mlab.
Any number of such packages may be created and distributed. If they are installed, users can choose to enable them. Internally, the list of selected packages is stored as the mayavi.contrib_packages preference option. The following code shows how this may be accessed from a Python script:
>>> from mayavi.preferences.api import preference_manager
>>> print preference_manager.root.contrib_packages
>>> preference_manager.configure_traits()  # Pop up a UI.
For more details on how best to write user_mayavi.py files and what you can do in them, please refer to the examples/mayavi/user_mayavi.py example. Please pay particular attention to the warnings in that file. It is a very good idea to ensure that the user_mayavi.py does not implement any sources/modules/filters and only registers the metadata. This will avoid issues with circular imports.
There are three ways a user can customize Mayavi:
- Via Mayavi contributions installed on the system. This may be done by enabling any found contributions from the Tools->Preferences menu on the Mayavi component, look for the “contribution settings”. Any selected contributions will be imported the next time Mayavi starts. For more details see the Extending Mayavi with customizations section.
- At a global, system wide level via a site_mayavi.py. This file is to be placed anywhere on sys.path.
- At a local, user level. This is achieved by placing a user_mayavi.py in the user’s ~/.mayavi2/ directory. If a ~/.mayavi2/user_mayavi.py is found, the directory is placed in sys.path.
The files are similar in their content. Two things may be done in this file:
- Registering new sources, modules or filters in the Mayavi registry (mayavi.core.registry.registry). This is done by registering metadata for the new class in the registry. See examples/mayavi/user_mayavi.py to see an example.
- Adding additional envisage plugins to the mayavi2 application. This is done by defining a function called get_plugins() that returns a list of plugins that you wish to add to the mayavi2 application.
The examples/mayavi/user_mayavi.py example documents and shows how this can be done. To see it, copy the file to the ~/.mayavi2 directory. If you are unsure where ~ is on your platform, just run the example and it should print out the directory.
In the user_mayavi.py or site_mayavi.py, avoid Mayavi imports like from mayavi.modules.outline import Outline etc. This is because user_mayavi is imported at a time when many of the imports are not complete and this will cause hard-to-debug circular import problems. The registry is given only metadata mostly in the form of strings and this will cause no problem. Therefore to define new modules, we strongly recommend that the modules be defined in another module or be defined in a factory function as done in the example user_mayavi.py provided.
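The deferred-import pattern recommended above can be sketched in plain Python. This is an illustrative toy, not the Mayavi registry API: json stands in for a heavyweight Mayavi module, and the registry stores only a name (metadata) and a factory callable, so nothing expensive is imported at registration time:

```python
# Toy registry illustrating metadata-plus-factory registration.
REGISTRY = {}

def register(name, factory):
    # Only a string and a zero-argument callable are stored here;
    # no heavyweight imports happen at registration time.
    REGISTRY[name] = factory

def create(name):
    # The import cost is paid only when an instance is actually needed.
    return REGISTRY[name]()

def decoder_factory():
    import json  # deferred import: runs at first use, avoiding import cycles
    return json.JSONDecoder()

register('JSONDecoder', decoder_factory)
```

In a real user_mayavi.py the same idea applies: hand Mayavi’s registry only metadata strings, and do the actual module imports inside the factory function.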
The Standalone example demonstrates how one can use the core Mayavi API without using Envisage. This is useful when you want to minimize dependencies. Offscreen example demonstrates how to use Mayavi without the envisage UI or even a traits UI (i.e. with a pure TVTK window) and do off screen rendering.
The Compute in thread example demonstrates how to take a 2D numpy array and visualize it as image data using a few modules. It also shows how one can do a computation in another thread and update the Mayavi pipeline once the computation is done. This allows a user to interact with the user interface while the computation is performed in another thread.
Sometimes you have a separate computational process that generates data suitable for visualization. You’d like Mayavi to visualize the data but automatically update the data when the data file is updated by the computation. This is easily achieved by polling the data file and checking if it has been modified. The Poll file example demonstrates this. To see it in action will require that you edit the scalar data in the examples/data/heart.vtk data file.
Say you have a little visualization script and you’d like to run some kind of server where you can script the running Mayavi UI from a TCP/UDP connection. It turns out there is a simple way to do this if you have Twisted installed. Here is a trivial example:
from mayavi import mlab
from mayavi.tools import server
mlab.test_plot3d()
server.serve_tcp()
There is no need to call mlab.show() in the above. The TCP server will listen on port 8007 by default in the above (this can be changed with suitable arguments to serve_tcp()). Any data sent to the server is simply exec’d, meaning you can do pretty much anything you want. The names engine, scene, camera and mlab are all available and can be scripted with Python code. For example after running the above you can do this:
$ telnet localhost 8007
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
scene.camera.azimuth(45)
mlab.clf()
mlab.test_contour3d()
scene.camera.zoom(1.5)
The nice thing about this is that you do not lose any interactivity of the application and can continue to use its UI as before; any network commands will simply run on top of this. To serve on a UDP port use the serve_udp() function. For more details on the server module please look at the source code – it is thoroughly documented.
While this is very powerful it is also a huge security hole since the remote user can do pretty much anything they want once connected to the server.
Often users like to animate a visualization without affecting its interactive capabilities. For example, you may want to rotate the camera continuously or take snapshots while continuing to interact with the Mayavi UI. To do this one can use the very convenient animate() decorator provided with Mayavi. Here is a simple example:
from mayavi import mlab

@mlab.animate
def anim():
    f = mlab.gcf()
    while 1:
        f.scene.camera.azimuth(10)
        f.scene.render()
        yield

a = anim()  # Starts the animation.
Notice the use of yield in the above; it is crucial to making this work. This example will continuously rotate the camera without affecting the UI’s interactivity. It also pops up a little UI that lets you start and stop the animation and change the time interval between calls to your function. For more specialized use you can pass arguments to the decorator:
from mayavi import mlab

@mlab.animate(delay=500, ui=False)
def anim():
    # ...

a = anim()  # Starts the animation without a UI.
If you don’t want to import all of mlab, the animate decorator is available from:
from mayavi.tools.animator import animate
Note that to start the event loop, i.e. to get the animation running, you will need to call show() if you do not already have a GUI environment running.
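Stripped of the GUI, the pattern the decorator relies on is an ordinary Python generator: each yield hands control back to the event loop, which re-enters your function on the next timer tick. A plain-Python sketch of that control flow (no Mayavi needed; the names are illustrative):

```python
def anim(frames=5, step=10):
    # Each loop iteration is one animation tick; the yield marks the
    # point where control would return to the GUI event loop (and the
    # scene would re-render) before the next tick.
    azimuth = 0
    for _ in range(frames):
        azimuth = (azimuth + step) % 360
        yield azimuth

# Driving the generator by hand, the way the animator's timer does:
angles = list(anim())
```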
For more details check the documentation of the animate() decorator available in the mlab reference. For an example using it along with visual, which is handy for object-movement animation, see the Mlab visual example.
If you want to change the data of an object in an animation, see Animating the data.
Let’s say you have a stack of PNG or JPEG files that are numbered serially that you want to animate on a Mayavi scene. Here is a simple script (called img_movie.py):
# img_movie.py
from pyface.timer.api import Timer

def animate(src, N=10):
    for j in range(N):
        for i in range(len(src.file_list)):
            src.timestep = i
            yield

if __name__ == '__main__':
    src = mayavi.engine.scenes[0].children[0]
    animator = animate(src)
    t = Timer(250, animator.next)
The Timer class lets you call a function without blocking the running user interface. The first argument is the time after which the function is to be called again in milliseconds. The animate function is a generator and changes the timestep of the source. This script will animate the stack of images 10 times. The script animates the first data source by default. This may be changed easily.
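Because animate() is a plain generator, its logic can be exercised without the GUI or the Timer; driving it with a for loop shows exactly which timesteps it sets (FakeSource is a made-up stand-in for the Mayavi image file source, exposing just the two attributes the script touches):

```python
class FakeSource:
    # Stand-in for the Mayavi image file source: it exposes the same
    # two attributes that img_movie.py uses.
    def __init__(self, files):
        self.file_list = files
        self.timestep = 0

def animate(src, N=10):
    for j in range(N):
        for i in range(len(src.file_list)):
            src.timestep = i
            yield

src = FakeSource(['anim000.png', 'anim001.png', 'anim002.png'])
seen = []
for _ in animate(src, N=2):   # the pyface Timer would call next() instead
    seen.append(src.timestep)
```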
To use this script do this:
$ mayavi2 -d your_image000.png -m ImageActor -x img_movie.py
This isn’t really related to Mayavi but is a useful trick nonetheless. Let’s say you generate a stack of images using Mayavi, say of the form anim%03d.png (i.e. anim000.png, anim001.png and so on); you can make this into a movie. If you have mencoder installed try this:
$ mencoder "mf://anim%03d.png" -mf fps=10 -o anim.avi \
    -ovc lavc -lavcopts vcodec=msmpeg4v2:vbitrate=500
If you have ffmpeg installed you may try this:
$ ffmpeg -f image2 -r 10 -i anim%03d.png -sameq anim.mov -pass 2
The Mayavi application supports very powerful Command line arguments that let you build a complex visualization from your shell. What follows is a bunch of simple examples illustrating these.
The following example creates a ParametricSurface source and then visualizes glyphs on its surface colored red:
$ mayavi2 -d ParametricSurface -m Glyph \
    -s"glyph.glyph.scale_factor=0.1" \
    -s"glyph.color_mode='no_coloring'" \
    -s"actor.property.color = (1,0,0)"
Note that -s"string" applies the string on the last object (also available as last_obj), which is the glyph.
This example turns off coloring of the glyph and changes the glyph to display:
$ mayavi2 -d ParametricSurface -m Glyph \
    -s"glyph.glyph.scale_factor=0.1" \
    -s"glyph.color_mode='no_coloring'" \
    -s"glyph.glyph_source.glyph_source = last_obj.glyph.glyph_source.glyph_list[-1]"
Note the use of last_obj in the above.
Here is a simple example showing how to texture map an iso-surface with the data that ships with the Mayavi sources (the data files are in the examples directory):
$ mayavi2 -d examples/tvtk/images/masonry.jpg \
    -d examples/mayavi/data/heart.vti \
    -m IsoSurface \
    -s"actor.mapper.scalar_visibility=False" \
    -s"actor.enable_texture=True" \
    -s"actor.tcoord_generator_mode='cylinder'" \
    -s"actor.texture_source_object=script.engine.current_scene.children[0]"
It should be relatively straightforward to change this example to use a ParametricSurface instead and any other image of your choice. Notice how the texture image (masonry.jpg) is set in the last line of the above. The image reader is the first child of the current scene and we set it as the texture_source_object of the isosurface actor.
Sometimes you need to shift/transform your input data in space and visualize that in addition to the original data. This is useful when you’d like to do different things to the same data and see them on the same plot. This can be done with Mayavi using the TransformData filter for StructuredGrid, PolyData and UnstructuredGrid datasets. Here is an example using the ParametricSurface data source:
$ mayavi2 -d ParametricSurface \
    -m Outline -m Surface \
    -f TransformData -s "transform.translate(1,1,1)" \
    -s "widget.set_transform(last_obj.transform)" \
    -m Outline -m Surface
If you have an ImageData dataset then you can change the origin, spacing and extents alone by using the ImageChangeInformation filter. Here is a simple example with the standard Mayavi image data:
$ mayavi2 -d examples/mayavi/data/heart.vti -m Outline \
    -m ImagePlaneWidget \
    -f ImageChangeInformation \
    -s "filter.origin_translation=(20,20,20)" \
    -m Outline -m ImagePlaneWidget
The UserDefined filter in Mayavi lets you wrap around existing VTK filters easily. Here are a few examples:
$ mayavi2 -d ParametricSurface -s "function='dini'" \
    -f UserDefined:GeometryFilter \
    -s "filter.extent_clipping=True" \
    -s "filter.extent = [-1,1,-1,1,0,5]" \
    -f UserDefined:CleanPolyData \
    -m Surface \
    -s "actor.property.representation = 'p'" \
    -s "actor.property.point_size=2"
This one uses a tvtk.GeometryFilter to perform extent based clipping of the parametric surface generated. Note the specification of the -f UserDefined:GeometryFilter. This data is then cleaned using the tvtk.CleanPolyData filter.
Under mlab, the UserDefined filter can be used to wrap e.g. a GeometryFilter VTK filter with:
filtered_obj = mlab.pipeline.user_defined(obj, filter='GeometryFilter')
With mlab, the user_defined function can take as its filter argument either the name of the VTK filter to be used, or an already-instantiated instance of the filter.
With the UserDefined filter, as with most Mayavi filters, the raw TVTK object can be accessed as the filter attribute of the Mayavi filter object.
The Image cursor filter example gives a full example of using the UserDefined filter. The Tvtk segmentation example is a full example of building a complex VTK pipeline with a heavy use of the UserDefined filter.
The default 3D interaction with the scene (left click on the background rotates the scene, right click scales, middle click pans) is not suited for every visualization. For instance, it can be interesting to restrict the movement to 2D, e.g. when viewing an object along the ‘x’ direction. This is done by changing the interactor_style of a scene. Here is an example using Mayavi as a 2D image viewer:
from mayavi import mlab
mlab.test_imshow()
mlab.view(0, 0)
fig = mlab.gcf()
from tvtk.api import tvtk
fig.scene.interactor.interactor_style = tvtk.InteractorStyleImage()
mlab.show()
Another useful interactor is the ‘terrain’ interactor, handy to have natural movement in scenes where you want the ‘up’ vector to be always pointing in the ‘z’ direction:
from mayavi import mlab
mlab.test_surf()
fig = mlab.gcf()
from tvtk.api import tvtk
fig.scene.interactor.interactor_style = tvtk.InteractorStyleTerrain()
mlab.show()
VTK has many different interactors. An easy way to list them is to display the VTK class browser (via the help menu in the mayavi2 application) and to search for “Interactor”. Another option is to tab-complete in IPython on tvtk.InteractorStyle.
You’ve just created a nice Mayavi/mlab script and now want to generate an animation or a series of images. You realize that it is way too slow rendering the images and takes ages to finish. There are two simple ways to speed up the rendering. Let’s assume that obj is any Mayavi pipeline object that has a scene attribute:
obj.scene.disable_render = True
# Do all your scripting that takes ages.
# ...
# Once done, do the following:
obj.scene.disable_render = False
This will speed things up for complex visualizations sometimes by an order of magnitude.
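A convenient way to script this pattern is a small context manager. This is a hypothetical helper, not part of the Mayavi API, but it works with any object exposing a disable_render attribute, and it restores the previous setting even if your scripting raises an exception:

```python
from contextlib import contextmanager

@contextmanager
def rendering_disabled(scene):
    # Suspend rendering while building a complex pipeline; the finally
    # clause guarantees the old setting is restored on exit or error.
    old = scene.disable_render
    scene.disable_render = True
    try:
        yield scene
    finally:
        scene.disable_render = old
```

With it, the snippet above becomes with rendering_disabled(obj.scene): followed by the slow scripting.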
While saving the visualization to an image you can speed up image generation, at the cost of losing anti-aliasing, by doing the following:
obj.scene.anti_aliasing_frames = 0
The default value is typically 8 and the rendered image will be nicely anti-aliased. Setting it to zero will not produce too much difference in the rendered image but any smooth lines will now appear slightly jagged. However, the rendering will be much faster. So if this is acceptable (try it) this is a mechanism to speed up the generation of images.