ADL: Dejavu GPU
sargis at scripps.edu
Tue Jan 22 13:51:36 PST 2008
> As far as I know, ADT uses DejaVu for its animation work. I am wondering whether
> anyone is doing research on using the GPU to improve the performance of animation
> rendering in DejaVu. I would love to dig into this field, so I am asking for help. I
> would appreciate any suggestions.
If you have appropriate OpenGL drivers installed, then DejaVu already performs at its
best. The question is whether we can use the GPU to add other effects such as ambient
occlusion.
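For reference, ambient occlusion at a surface point is just the fraction of the hemisphere above it that is blocked by nearby geometry. Below is a small CPU sketch of that quantity (plain Python, hypothetical helper names, sphere occluders only), i.e. the estimate a fragment shader implementation would compute per pixel:

```python
import math
import random

def _rand_hemisphere_dir(normal, rng):
    # Uniform direction on the hemisphere around `normal`: draw an
    # isotropic Gaussian vector, normalize it, flip it to the normal's side.
    while True:
        v = [rng.gauss(0.0, 1.0) for _ in range(3)]
        n = math.sqrt(sum(x * x for x in v))
        if n > 1e-9:
            break
    d = [x / n for x in v]
    if sum(a * b for a, b in zip(d, normal)) < 0:
        d = [-x for x in d]
    return d

def _ray_hits_sphere(origin, direction, center, radius):
    # Standard quadratic ray/sphere test; a hit counts only if it is
    # in front of the ray origin.
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(o * d for o, d in zip(oc, direction))
    c0 = sum(o * o for o in oc) - radius * radius
    disc = b * b - c0
    if disc < 0:
        return False
    return -b + math.sqrt(disc) > 1e-9

def ambient_occlusion(point, normal, spheres, n_samples=256, seed=0):
    """Fraction of hemisphere directions blocked by sphere occluders.

    `spheres` is a list of (center, radius) pairs; returns 0.0 (fully
    open) .. 1.0 (fully occluded).
    """
    rng = random.Random(seed)
    # Offset the origin slightly along the normal to avoid self-intersection.
    origin = [p + 1e-4 * n for p, n in zip(point, normal)]
    blocked = 0
    for _ in range(n_samples):
        d = _rand_hemisphere_dir(normal, rng)
        if any(_ray_hits_sphere(origin, d, c, r) for c, r in spheres):
            blocked += 1
    return blocked / n_samples
```

A GPU version would evaluate the same integral per fragment, with the occluder test done against a depth buffer or precomputed geometry rather than analytic spheres.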
DejaVu already uses programmable shaders for volume rendering via the UTpackages,
which are Python extensions of C++ libraries developed in Chandrajit Bajaj's
group at UT Austin. The problem is accessing GPU functionality from Python. The
following packages provide direct access to the GPU from Python.
The latest version of PyOpenGL can access the GPU using ctypes.
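As a concrete starting point, here is a minimal sketch of compiling a GLSL shader pair with PyOpenGL's `OpenGL.GL.shaders` convenience module (PyOpenGL 3.x, the ctypes-based line). Compilation needs a live OpenGL context, e.g. one created by a DejaVu viewer, so the PyOpenGL import is deferred into the helper:

```python
# Sketch: a fixed-function-compatible GLSL shader pair plus a helper that
# compiles it with PyOpenGL 3.x.  build_program() must be called while an
# OpenGL context is current, so the PyOpenGL import is deferred into it.

VERT_SRC = """
void main() {
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
"""

FRAG_SRC = """
void main() {
    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);  /* flat orange */
}
"""

def build_program():
    """Compile and link the shader pair; returns a GL program id."""
    from OpenGL.GL import GL_VERTEX_SHADER, GL_FRAGMENT_SHADER
    from OpenGL.GL import shaders
    vert = shaders.compileShader(VERT_SRC, GL_VERTEX_SHADER)
    frag = shaders.compileShader(FRAG_SRC, GL_FRAGMENT_SHADER)
    return shaders.compileProgram(vert, frag)
```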
I would start with the PyOpenGL examples and add shaders there. Once you have this
working, it should be easy to add that code to the Draw function of the appropriate
DejaVu object and use these shaders.
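The hook itself can be as small as binding the program around the existing draw code. A sketch of such a wrapper follows; `use_program` is a hypothetical helper, not a DejaVu or PyOpenGL API, and the `gl` argument exists only so the binding logic can be exercised without a live GL context:

```python
from contextlib import contextmanager

@contextmanager
def use_program(program_id, gl=None):
    """Bind a shader program for the duration of a block of draw calls.

    `gl` is the OpenGL.GL module (injectable for testing); the shader
    applies to everything drawn inside the `with` block.
    """
    if gl is None:
        from OpenGL import GL as gl  # deferred: needs a current context
    gl.glUseProgram(program_id)
    try:
        yield
    finally:
        gl.glUseProgram(0)  # restore the fixed-function pipeline
```

Inside a DejaVu object's Draw method this would read `with use_program(prog): <existing drawing calls>`, so the shader affects only that object's geometry.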
> Moreover, has anyone run AutoDock4 on a GPU?
This is another interesting question. I saw NVIDIA demoing how VMD uses their
GPUs for MD at SIGGRAPH last year. New articles appear on gpgpu.org every week.
More information about AutoDock