(Thu, Aug 24, 2006 at 12:56:49PM -0400) Geoffrey Hutchison

On Aug 24, 2006, at 3:47 AM, Donald Ephraim Curtis wrote:

I think the next major step is building the actual “tools”. Once we
get the environment set up correctly, building more and more tools
becomes easier and easier.

Wow. You’ve moved through a lot of code, which is great. As you say,
the “hard” part is getting the basic environment designed and
developed right.

Right now, I’m getting errors looking for GLEngine.h.
SBSEngine.h:33:22: error: GLEngine.h: No such file or directory

Yeah, I had some errors and a lot of debugging to do. Here are a
few things I came up with.

I have made it so that each engine implements a ‘render’ function for
whatever parameter it wants, but it’s not done through dynamic casting.
Every time I tried to do render(Primitive *p) and then dynamic_cast p to
Atom or Bond, the plugin wouldn’t load at all. Something about
QPluginLoader and dynamic_casts? I don’t know.

So if an engine wants to implement a render for Atom and not Bond, it
simply implements render(Atom *atom).

This also avoids names like renderAtom and renderBond. Either way is OK
with me, though. It doesn’t change anything fundamental; it just means
that which overload gets called for a given primitive is what matters.
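A minimal sketch of the overload idea, assuming hypothetical class names (SketchEngine here stands in for a real engine; the actual Atom/Bond/plugin API may differ):

```cpp
#include <cassert>
#include <string>

// Placeholder primitive types; the real ones carry geometry, etc.
struct Atom {};
struct Bond {};

// An engine implements render() only for the primitives it cares about.
// No dynamic_cast from a common Primitive base is needed: the overload
// is chosen from the pointer's static type at the call site.
struct SketchEngine {
    std::string render(Atom *) { return "atom"; }  // draws an atom
    std::string render(Bond *) { return "bond"; }  // draws a bond
};
```

An engine that only handles atoms would simply omit the `render(Bond *)` overload, and callers holding a `Bond *` would get a compile error rather than a silent no-op.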

And here is the layout so far…

Each window has a Molecule. When it adds the GLWidget it also creates a
MoleculeView. A View is composed of an object and multiple other
subViews. The reason for this is expandability. Eventually we may want
an atom to also render say, electron orbits or something that is a
subcomponent of the Atom. Anyways, the MoleculeView has two slots, one
for when a bond is created and one for when an atom is created.
(Molecule overrides the virtual createAtom and createBond functions
from OBMol).

Anyways, the MainWindow initializes this MoleculeView and binds it to
GLWidget (the MoleculeView also knows who its QObject parent is, namely
the GLWidget it’s bound to). Then whenever GLWidget wants to render, it
calls the MoleculeView->render() function. This function first renders
all its subViews (namely bonds and atoms) and then renders itself
(later we could possibly add a transparent wrap or shell around the atom
or whatever, I don’t know all of what you want to render).
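The subViews-then-self traversal above could be sketched like this (a hypothetical stripped-down View with no Qt or GL calls; the real MoleculeView layout may differ):

```cpp
#include <memory>
#include <vector>

// A View owns subviews and renders them before rendering itself,
// mirroring the MoleculeView -> AtomView/BondView composition described.
struct View {
    std::vector<std::unique_ptr<View>> subViews;

    virtual ~View() = default;

    // Stand-in for this view's own GL drawing (e.g. a shell around an atom).
    virtual void renderSelf() {}

    // Render all subviews first, then this view itself.
    // Returns the number of views drawn, purely for illustration.
    int render() {
        int drawn = 0;
        for (auto &sub : subViews)
            drawn += sub->render();
        renderSelf();
        return drawn + 1;
    }
};
```

So a MoleculeView holding atom and bond subviews draws each of them recursively, then gets a chance to draw anything molecule-wide on top.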

I’ve been busy with other things this week, but will hopefully have
some time this weekend to get more of the “base” finished, including
moving the current mouse manipulation into a tool framework. One
slight catch is that I want to build in undo/redo functionality, and
there isn’t much example code.

I’m also going to bring in some of the Kalzium code including
Benoit’s spheres and cylinders.

I actually asked Benoit to do that kind of stuff and said I’d forward
him the API for the plugins. But you’re welcome to. I added the
StickEngine code, but it needs some work. The cylinder length isn’t
quite accurate, but it works for the most part.

I’ve been thinking about GL caching a lot too, but at this point I think
we are best off just doing all our rendering raw, forgetting display
lists for now. We can decide later where we want to put those. I mean,
we don’t have to ignore them, but I’m not going to worry about it for
now.
Does this make any sense? It took me a long time to figure out the
class decomposition we needed to get the dynamics we want out of it.
Eventually we need to make it so that you can edit the parameters of
the engines too. This will be easy, I think. That’s a ways off, but
the design can support it now.