XML Persistence for Torque Objects
by J. Donavan Stanley · 05/19/2003 (11:03 am) · 35 comments
Download Code File
Installation
1) Unzip the archive into your engine directory.
2) Add the files in engine/persistance to your project.
3) Recompile.
Info
This introduces two new classes to the Torque Engine:
Pickler - An abstract base class for serialization support.
XMLPickler - A concrete class that can serialize objects to/from XML.
This approach was taken so that others could implement custom picklers using XMLPickler as a guide. For example someone may want to implement a BinaryPickler that stores objects in a non-human-readable format.
If you unpickle an object that already exists within the simulation, its fields will be overwritten with the contents of the XML stream.
Only fields that are exposed to the scripting engine (via addField) are imported/exported.
You must specify a storage medium as part of the destination. For example, "file://test.xml" refers to the file "test.xml" in the root directory of the Torque executable. Currently only "file" storage mediums are supported.
You can see a sample output file by clicking here.
This uses a modified version of the TinyXML library.
Usage
In a nutshell:
1) Create an XMLPickler object.
2) Call either pickle (passing an object and a destination) or unpickle (passing in a source).
3) You're done.
Script Example:
new XMLPickler( "pickler" );
pickler.pickle( LightMaleHumanArmor, "file://test.xml" );
$obj = pickler.unpickle( "file://test.xml" );
#2
05/19/2003 (11:59 am)
Oww. Very cool. :)
#3
05/19/2003 (12:39 pm)
very Clever
#4
05/19/2003 (4:28 pm)
Very useful :)
#5
05/19/2003 (4:32 pm)
@Fredik - I realized yesterday that the copy I sent you by email was horribly broken. You'll want to use this version instead.
#6
05/19/2003 (10:18 pm)
Sweeeeeeeet! Works like a charm and is exactly what I wanted to do anyways for our SP game!! :D Thx a lot!!
#7
05/20/2003 (2:16 am)
wohoo!!!
#8
05/20/2003 (1:49 pm)
Here's a small update... If you look in pickle.cc near line 199 (in exportSimSet) you see the following:

for (SimSetIterator itr(set); *itr; ++itr)
{
    obj = static_cast<SimObject*>(*itr);

This should be changed to:

for(U32 i = 0; i < set->size(); i++)
{
    obj = (*set)[i];

This corrects a bug when exporting nested SimSets. Due to the way SimSet iterators work, the original code would list any objects contained in a nested SimSet as being part of the containing SimSet. This would lead to duplicate objects being exported and then attempts to recreate the same object again on import.
Also you might want to make the following changes (all line numbers assume you've made the preceding change):

// Near line 127
bool bExistingObject = true;

// Line 136 change the final else statement to read
{
    bExistingObject = false;
    newObject = (SimObject*)ConsoleObject::create( element->Attribute(CLASS_FIELD) );
}

// Line 180 change the if statement to read:
if( !bExistingObject && newObject->isProperlyAdded() == false && !newObject->registerObject())

This will prevent attempts to re-add an existing object to the simulation.
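The nested-SimSet bug above can be reproduced in miniature with a self-contained sketch. The Obj struct and collect functions below are hypothetical stand-ins, not Torque's real classes: a tree-walking traversal (like SimSetIterator) lists a nested set's members as if they belonged to the outer set, while index-based iteration visits only direct members.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Toy stand-in for SimObject/SimSet -- hypothetical, not Torque's API.
struct Obj {
    std::string name;
    std::vector<Obj*> children;   // non-empty => acts like a nested SimSet
};

// Mimics the SimSetIterator behavior: walks the whole tree, so members of
// a nested set also show up in the outer set's listing (the bug).
static void collectRecursive(const Obj& set, std::vector<std::string>& out) {
    for (const Obj* child : set.children) {
        out.push_back(child->name);
        collectRecursive(*child, out);
    }
}

// Mimics the index-based fix: only direct members are visited; a nested
// set is exported as a single object and recursed into separately.
static void collectDirect(const Obj& set, std::vector<std::string>& out) {
    for (size_t i = 0; i < set.children.size(); ++i)
        out.push_back(set.children[i]->name);
}
```

With a root set holding object A and a nested set N containing B, the recursive walk yields A, N, B (B is duplicated into the outer listing), while the direct walk yields only A and N.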
#9
05/21/2003 (3:03 am)
Good update. I had some problems, but hadn't the time to look into it more closely.
#10
05/21/2003 (9:28 am)
@J.DS - I found several bugs also. I have fixed one and wish some blessing on the needed second one. The following snippet shows my current mods:

In TiXmlDocument::LoadStream():

const int BUF_SIZE = 2048;
char buf[BUF_SIZE+1];
buf[BUF_SIZE] = 0; // ensure buffer is null-terminated
while( stream.read( BUF_SIZE, buf ) )
{
    data += buf;
}
if( stream.getStatus() == Stream::EOS ) // ensure we get the last bit of data
    data += buf;

First, notice buf[] has increased by +1 and has been null-terminated at the maximum size. This ensures any buffer read of 0x800 bytes is null-terminated. (I have a large file being processed. BTW: the write side, pickler.pickle(obj, file);, works great for large SimSets.)

Second, and here is where I am trying to find an elegant solution: a test for EOS needs to allow

data += buf;

to work. The only gotcha for me at this point is, if a buffer is partially filled (after being used before and having been completely filled with valid ASCII chars from a previous read), how does one ensure that the partially filled buffer contains a null terminator at the end? I thought including a for loop to clear buf[], as per

for( int i = 0; i < sizeof(buf); buf[i++] = 0 ) ;

would work, but I was looking for a more elegant way. It seems to me that the FileStream processor ought to null-terminate a read upon reaching EOS or EOF. Any ideas gang?

Edit: Corrected example for nulling buf.
#11
05/21/2003 (12:55 pm)
If you want to set the buffer to all NULLs, just use memset. However, that shouldn't be needed. The read call should always read up to the last byte of the file. The non-NULL-termination thing was a bonehead mistake on my part.
#12
05/21/2003 (2:25 pm)
Ok, here we go. When I converted this to use streams I forgot that they don't have a text mode. So what we need to do is change the loop slightly:

while( stream.read( BUF_SIZE, buf ) )
{
    data += buf;
    dMemset( buf, 0, BUF_SIZE );
}

That way the previous data gets wiped from the buffer each loop.
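The stale-buffer effect this loop guards against can be demonstrated with a self-contained sketch. chunkedCopy below is hypothetical, standing in for the LoadStream read loop: it pulls a string through a tiny fixed buffer, with a flag toggling whether the buffer is wiped between reads.

```cpp
#include <algorithm>
#include <cassert>
#include <cstring>
#include <string>

// Simulates pulling `src` through a small fixed buffer, the way
// TiXmlDocument::LoadStream pulls file data through buf[].
// `clearEachPass` toggles the dMemset fix discussed above.
static std::string chunkedCopy(const std::string& src, bool clearEachPass) {
    const size_t BUF_SIZE = 4;            // tiny buffer to force a partial final chunk
    char buf[BUF_SIZE + 1];
    std::memset(buf, 0, sizeof(buf));     // start clean, like the +1/NULL fix
    std::string data;
    size_t pos = 0;
    while (pos < src.size()) {
        size_t n = std::min(BUF_SIZE, src.size() - pos);
        std::memcpy(buf, src.data() + pos, n);  // raw read: no terminator appended
        pos += n;
        data += buf;   // relies on buf being NUL-terminated right after the payload
        if (clearEachPass)
            std::memset(buf, 0, BUF_SIZE);      // wipe stale bytes before next read
    }
    return data;
}
```

With a 4-byte buffer and the input "ABCDEF", the final read delivers only "EF"; without clearing, the "CD" left over from the previous pass leaks into the output, yielding "ABCDEFCD" instead of "ABCDEF".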
#13
05/21/2003 (3:06 pm)
@J.DS - I tried using dMemset and it was a no-go. My for loop works, though I'm not exactly sure why.
#14
05/21/2003 (6:30 pm)
@Wendell- What exactly is the problem you're seeing? Is it missing the last part of your file or what?
#15
05/21/2003 (6:34 pm)
Yep, the last part of the file doesn't get loaded because there is nothing setting the null terminator. So I ensure the whole buffer contains NULLs, which works. stream.read() returns a bool and not the byte count, so one can't depend on it. Here is my working code. My file, btw, is 881k huge.

bool TiXmlDocument::LoadStream( FileStream& stream )
{
    // Delete the existing data:
    Clear();

    U32 length = stream.getStreamSize();

    TIXML_STRING data;
    data.reserve( length );

    const int BUF_SIZE = 2048;
    char buf[BUF_SIZE+1];
    buf[BUF_SIZE] = 0; // ensure buffer is null-terminated

    while( stream.read( BUF_SIZE, buf ) )
    {
        data += buf;
        for( int i = 0; i < sizeof(buf); buf[i++] = 0 ) ;
    }
    if( stream.getStatus() == Stream::EOS ) // ensure we get the last bit of data
        data += buf;

    Parse( data.c_str() );

    if ( Error() )
        return false;
    else
        return true;
}

Nice work J. Can't wait to try your other edits to this.
#16
05/21/2003 (7:04 pm)
Something I noticed: my SimSet contains 519 objects that get loaded. Since all this appears server-side related, each client has to be updated for all the loaded objects. 519 of them takes some time, and one has to ensure the clients have received them all before calling GUI functions that attempt to display objects. For a small number of objects this may work well as-is, but I'll have to think about what to do for my large object count. (Maybe don't process keystrokes until all clients have gotten the update.)
#17
05/22/2003 (1:58 pm)
I've looked over the read function. It's stupid that it doesn't return the number of bytes read like standard file I/O, but oh well. Even so, it should never fail to read the last few bytes of the file unless there's a bug in it someplace. And if there is, I'm curious how loading large graphics and models works. So the question then is: what's special about your file?

Also note I've changed the mod for the loop up above. If your code matches it and it still doesn't work, send me your XML file by email and I'll try debugging the pickler with it.

Edited to add: The code you have will only reappend the last read data from the file, which isn't what you want...
#18
05/22/2003 (4:01 pm)
Ok. What happens is the buffer does indeed get filled, but since EOS is reached, the read function fails in FileStream. This causes the while to terminate before the data += buf gets executed. So all I am doing is testing for EOS and performing the last partial buffer transfer, after ensuring the buffer is full of nulls, to allow strlen() on the partially filled buffer to find the end. That is why I was hoping for a more elegant way: upon reaching the EOF, a null would be written to the buffer at that point. Without nulling the buffer, previous contents are there from the last full buffer read and strlen() returns a bogus count.

I know my edits look strange but it works. :) I just feel like I am missing a more elegant solution.
#19
05/22/2003 (4:43 pm)
Wendell you need to submit that as a bug to the GG staff then.
#20
06/20/2003 (5:51 pm)
I understand what's going on in the comments. This is what I think the code should look like (based on Head):

bool Done = false;
do {
    stream.readLine(buf, BUF_SIZE);
    Done = (stream.getStatus() == Stream::EOS);
    data += (char *) buf;
} while (!Done);

On Windows, stream.open calls through to CreateFile, and stream.read calls through to ReadFile. ReadFile won't append a NULL, because if you were calling it to load data into a structure, adding a NULL to the end would be bad. I'm not sure how the script compiler does it, but looking at your code and at winFileIo.cc, I'm fairly certain that it needs to be adding the NULL.

Edit: To clarify:

The problem comes up on a 2k+ file.

At 2k, you'll get a couple bytes of random junk and maybe an Access Violation or Segmentation Fault from the concat.

At >2k, you'll get the above symptom, plus if you manage to make it through the whole file, there will be random junk at the end of the file (that should manifest as a parse error from a compliant XML parser).

I haven't tested the code, but believe it would address Wendell's issue.

Note: I changed the code slightly from the previous edit, on review of ::read. (One final edit to get the code right -- still isn't tested.) (Blah, one more edit -- the function isn't exposed with a byte count...)
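For comparison, here is a sketch of the same chunked-read loop written against standard C++ streams rather than Torque's Stream class (loadStream is a hypothetical stand-in, not engine code). When the read call exposes a byte count, as std::istream::gcount does, terminating the chunk is trivial and no stale-buffer cleanup is needed.

```cpp
#include <cassert>
#include <istream>
#include <sstream>
#include <string>

// Chunked text load using std::istream, where gcount() reports how many
// bytes the last read actually delivered. Assumes text without embedded
// NULs, same as the original LoadStream code.
static std::string loadStream(std::istream& in) {
    const std::streamsize BUF_SIZE = 2048;
    char buf[BUF_SIZE + 1];
    std::string data;
    // read() fails on the final partial chunk, but gcount() still tells us
    // how much it delivered, so nothing at the end of the file is lost.
    while (in.read(buf, BUF_SIZE) || in.gcount() > 0) {
        buf[in.gcount()] = '\0';   // terminate exactly where the read stopped
        data += buf;
    }
    return data;
}
```

Because the terminator is placed at the reported byte count on every pass, a partial final read of, say, 904 bytes after two full 2048-byte reads concatenates cleanly with no junk from earlier passes.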
