Saving first characters
Over the last few days I’ve implemented persistence for the Artemis entity system. I heavily modified it in the process, since it was not really designed for that. So now my game server has a somewhat working world update loop running on Artemis. Saving world state can be done in two ways: pausing the update loop and just dumping every entity and component into the DB, or saving in the background. Pausing is the easy way; it takes a few seconds to save over 100000 simple entities on my laptop. The disadvantage is, of course, that every player is frozen during the process. Ultima Online emulators do it that way, but we can do better. Background saving is much trickier because the save process is not instantaneous: if the main loop is not paused, game entities can be modified during the save, leaving the saved state inconsistent. How do we avoid that?
Right now my save method looks like this:
```csharp
internal void SaveAll()
{
    // _DbState holds some global database state variables
    if (_DbState.Saving)
        throw new InvalidOperationException("Database is in inconsistent state, reset Saving property to continue");

    Console.WriteLine("SaveAll() started");

    // _Dirty holds entities that are modified during the save
    _Dirty = new Dictionary<long, Entity>(_Active.Count); // initial size to prevent resizing

    // we alternate between two DB collections to prevent data corruption
    // if the game server explodes during save
    _DbState.ToggleActiveCollection();
    _DbState.Saving = true;
    SaveDbState();

    // switch collection
    _EntityCollection = _Db.GetCollection<Entity>(_DbState.ActiveCollectionName);
    _EntityCollection.RemoveAll(); // clear

    Stopwatch sw = new Stopwatch();
    sw.Start();
    // _Active is a dictionary with active entities,
    // we take a snapshot of their IDs at the beginning of a save
    long[] copy;
    lock (_ActiveLock)
    {
        // this only takes a few milliseconds
        copy = new long[_Active.Count];
        _Active.Keys.CopyTo(copy, 0);
    }
    sw.Stop();
    Console.WriteLine("Active snapshot: {0}ms, {1} entities", sw.ElapsedMilliseconds, copy.Length);
    Console.Write("Save to {0}: ", _DbState.ActiveCollectionName);

    sw.Restart();
    // now the save proper, less efficient than a batch insert but what can you do
    foreach (var id in copy) // loop through all entity IDs in the snapshot
    {
        lock (_DirtyLock)
        {
            if (_Dirty.ContainsKey(id)) // this entity was modified, use the cloned value
            {
                var dirty = _Dirty[id];
                if (dirty != null)
                    SaveEntity(dirty);
                else
                    DeleteEntity(id);
            }
            else // not modified, save the original object
            {
                SaveEntity(id);
            }
        }
    }
    sw.Stop();
    Console.WriteLine("{0}ms, dirties: {1}, not modified: {2}, total: {3}",
        sw.ElapsedMilliseconds, _Dirty.Count, copy.Length - _Dirty.Count, copy.Length);

    // save entities the easy way: batch insert, this needs a paused update loop
    //_EntityCollection.InsertBatch(_Active.Values, SafeMode.True);

    // to make sure everything was written (if SafeMode==false) loop this check a few times
    if (_EntityCollection.Count() != copy.Length)
        Console.WriteLine("Not all entities were written!");

    // save finished
    _DbState.Saving = false;
    SaveDbState();
    Console.WriteLine("SaveAll() finished");
    _Dirty.Clear();
}
```
Of course this is a prototype and needs stress testing for possible locking and data loss issues, but it seems to work fine for now. Background saves do require some overhead when updating entities: if a save is in progress, a clone of the entity needs to be added to the _Dirty list before the entity is modified. Memory requirements also go up a bit, because we need to store those clones for the duration of the save. Testing shows it’s not too bad, but we’ll see how it works on a real server.
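The clone-before-modify step described above can be sketched as a small write barrier on the update path. This is a minimal standalone sketch, not the actual server code: the `Entity`, `World`, `MarkDirty` and `GetSnapshot` names are stand-ins I made up for illustration.

```csharp
using System;
using System.Collections.Generic;

// Simplified stand-ins for the real entity and world classes.
class Entity
{
    public long Id;
    public string Name;
    public Entity Clone() { return (Entity)MemberwiseClone(); } // shallow copy is enough for this sketch
}

class World
{
    public volatile bool Saving;                        // set by SaveAll() while a save is running
    readonly Dictionary<long, Entity> _Dirty = new Dictionary<long, Entity>();
    readonly object _DirtyLock = new object();

    // Call this before mutating an entity; the first modification during a save
    // captures the entity's pre-save state for the background writer.
    public void MarkDirty(Entity e)
    {
        if (!Saving) return;                            // no save in progress, nothing to do
        lock (_DirtyLock)
        {
            if (!_Dirty.ContainsKey(e.Id))              // only the first clone per save matters
                _Dirty[e.Id] = e.Clone();
        }
    }

    // What the save loop would read for a modified entity.
    public Entity GetSnapshot(long id)
    {
        lock (_DirtyLock)
        {
            Entity clone;
            return _Dirty.TryGetValue(id, out clone) ? clone : null;
        }
    }
}
```

A deletion during a save would be recorded the same way with a `null` value, which is why the save loop above treats a null dirty entry as a delete.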
In my test I have 10 types of components, each with a simple data field of a varying type. I create 100000 entities with 1-10 random components each. There are 5 systems, each processing 1-5 different component types. Every system alters every entity it processes (a pessimistic scenario). During every loop around 50 new entities are created, which amounts to a pretty unrealistic few thousand new entities per second. Here is a sample run on my laptop with background saves every 20 seconds:
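The component assignment in the test setup amounts to sampling 1-10 distinct component types per entity. A sketch of that piece, with made-up type names standing in for the 10 real component types:

```csharp
using System;
using System.Collections.Generic;

static class TestWorldBuilder
{
    static readonly Random Rng = new Random();

    // Hypothetical stand-ins for the 10 component types used in the benchmark.
    static readonly string[] ComponentTypes =
        { "C0", "C1", "C2", "C3", "C4", "C5", "C6", "C7", "C8", "C9" };

    // Pick 1-10 distinct random component types for one entity.
    public static List<string> RandomComponents()
    {
        int count = Rng.Next(1, 11);            // 1..10 inclusive
        var pool = new List<string>(ComponentTypes);
        var picked = new List<string>(count);
        for (int i = 0; i < count; i++)
        {
            int idx = Rng.Next(pool.Count);     // draw without replacement
            picked.Add(pool[idx]);
            pool.RemoveAt(idx);
        }
        return picked;
    }
}
```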
```
Init: 536ms
Loading from 0: 0 entities, 14ms, 0 e/s
Creating 100000 entities: 2213ms, 45187 e/s
Looping for 300s...
SaveAll() started
Active snapshot: 1ms, 112920 entities
Save to Entities1: 7259ms, dirties: 59332, not modified: 53588, total: 112920
SaveAll() finished
SaveAll() started
Active snapshot: 1ms, 124530 entities
Save to Entities0: 8085ms, dirties: 55214, not modified: 69316, total: 124530
SaveAll() finished
SaveAll() started
Active snapshot: 2ms, 135260 entities
Save to Entities1: 10312ms, dirties: 54271, not modified: 80989, total: 135260
SaveAll() finished
SaveAll() started
Active snapshot: 1ms, 145250 entities
Save to Entities0: 12097ms, dirties: 76273, not modified: 68977, total: 145250
SaveAll() finished
SaveAll() started
Active snapshot: 1ms, 154500 entities
Save to Entities1: 12325ms, dirties: 81132, not modified: 73368, total: 154500
SaveAll() finished
SaveAll() started
Active snapshot: 1ms, 163250 entities
Save to Entities0: 13825ms, dirties: 85726, not modified: 77524, total: 163250
SaveAll() finished
SaveAll() started
Active snapshot: 1ms, 171650 entities
Save to Entities1: 14030ms, dirties: 90208, not modified: 81442, total: 171650
SaveAll() finished
SaveAll() started
Active snapshot: 1ms, 179690 entities
Save to Entities0: 15005ms, dirties: 94482, not modified: 85208, total: 179690
SaveAll() finished
SaveAll() started
Active snapshot: 1ms, 187360 entities
Save to Entities1: 16442ms, dirties: 98475, not modified: 88885, total: 187360
SaveAll() finished
Loops: 3046, 10.15 loops/s, 191380 entities
Entities subscribed: s12: 30432, s340: 13091, s5: 74095, s6789: 5419, s2458: 5384, total: 128421
Saving 191380 entities to 0
SaveAll() started
Active snapshot: 2ms, 191380 entities
Save to Entities0: 17398ms, dirties: 0, not modified: 191380, total: 191380
SaveAll() finished
Save: 17421ms, 10986 e/s
```
I’m happy with the numbers. Of course the test systems and entities are very simple, but there are lots of them and the pressure on the GC is pretty high. Still, CPU time spent in the GC stays below 10% and memory usage peaks at around 300 MB. 10 loops per second seems enough for an MMORPG server; it’s not a shooter. I might be wrong, but we’ll see later. And of course my laptop is not exactly server-grade hardware 😉
Right now the only thing the game server actually stores is player characters, which are entities with just two components: name and location.
```
/* 0 */
{
  "_id" : NumberLong(1),
  "Components" : {
    "Name" : { "_t" : "Name", "Value" : "admin" },
    "Location" : { "_t" : "Location", "X" : 1.0, "Y" : 2.0, "Z" : 0.0 }
  }
}
/* 1 */
{
  "_id" : NumberLong(2),
  "Components" : {
    "Name" : { "_t" : "Name", "Value" : "Omeg" },
    "Location" : { "_t" : "Location", "X" : 1.0, "Y" : 2.0, "Z" : 0.0 }
  }
}
```
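For reference, documents with this shape map naturally onto classes like the ones below; the "_t" fields in the dump are the type discriminators the MongoDB C# driver emits when serializing a polymorphic class hierarchy. The class and property names here are assumptions reconstructed from the dump, not the actual server code.

```csharp
using System.Collections.Generic;

// Base class for components; when serializing this hierarchy the MongoDB
// C# driver records the concrete type name in the "_t" discriminator field.
abstract class Component { }

class Name : Component
{
    public string Value { get; set; }
}

class Location : Component
{
    public double X { get; set; }
    public double Y { get; set; }
    public double Z { get; set; }
}

class Entity
{
    public long Id { get; set; }  // mapped to "_id" by the driver's conventions
    public Dictionary<string, Component> Components { get; set; }
        = new Dictionary<string, Component>();
}
```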
It’s a start, but once the framework is there, extending it will be easy. Stay tuned.
C#, code, entity systems, game development, Heimdall
Hi, it is great to see people doing cool stuff with my library!
thelinuxlich said this on February 24, 2012 at 19:57 |
Also, you are improving exactly the weakest point in the current Artemis C# implementation: persistence, i.e. loading/saving world state. It has been on my wishlist forever to do something about it, and also to write a GUI like Flatredball’s to manage world data at a higher level. If you think it is possible, we could merge your work into Artemis.
thelinuxlich said this on February 24, 2012 at 20:09 |
I’ll definitely share my code once it’s in a working state. I’m not sure how to handle merging it, though, since my persistence engine depends on MongoDB, and that’s probably not that useful for most people. I decided not to spend time creating an abstract data access layer because of time and performance constraints.
omeg said this on February 25, 2012 at 15:56 |
What if you split the job between entity groups (assigned inside Artemis) asynchronously?
thelinuxlich said this on February 28, 2012 at 21:16 |
this is just awesome 🙂 ill be watching this for sure.
good luck!
anon said this on October 24, 2012 at 04:00 |
Generally, the saving of game entities doesn’t all happen at once, for the reasons you’ve outlined. Disk IO is where you’re going to experience the bottleneck with this process, and one alternative is to create a shadow copy of your entities and push that shadow copy to a background thread to perform the save operation without significantly impacting the simulation loop.
In most cases, persistence is a background operation that happens on a per-entity basis, not for all entities at the same time. Only during critical atomic operations, such as in-game trades, would persistence happen in real time. Other operations are often stored in memory, and a background or asynchronous operation handles updating the datastore from the in-memory entity to minimize the impact of disk IO.
crancran said this on October 26, 2012 at 20:55 |