DATA-ORIENTED DESIGN The hardware will thank you.

remember time when doing analysis of space 08/04/2012:12:12:31

Some elements of development have time and space tied to each other in such a literal way that it's hard to think of a reason not to worry about both at the same time.

Take asset compression.

For physical media, a certain amount of compression is required just to fit your game on a disc. Without any compression at all, many games would either take up multiple discs, or just not have much content. With compression, load times go down and processor usage goes up. But how much compression is wanted? There are some extremely high compression ratio algorithms around that would allow some multi-disc DVD titles to fit on one disc, but would it be worth investing the time and effort in them?
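A quick back-of-envelope model makes the trade-off concrete. The sketch below (C++, with purely illustrative throughput and ratio figures; none of these are measured numbers) assumes the read and the decompression are pipelined, so the slower of the two bounds total load time:

    #include <algorithm>
    #include <cstdio>

    // Pipelined model: the disc read and the decode overlap, so whichever
    // is slower dominates the total load time.
    double loadSeconds(double rawBytes, double ratio, // compressed = raw * ratio
                       double readBytesPerSec, double inflateBytesPerSec) {
        double readTime    = (rawBytes * ratio) / readBytesPerSec;
        double inflateTime = rawBytes / inflateBytesPerSec;
        return std::max(readTime, inflateTime);
    }

    int main() {
        const double GiB = 1024.0 * 1024.0 * 1024.0;
        const double MB  = 1024.0 * 1024.0;
        // Illustrative only: 1GiB of raw assets, ~20MB/s optical read.
        std::printf("uncompressed:        %.1fs\n",
                    loadSeconds(GiB, 1.0, 20 * MB, 1e18)); // nothing to decode
        std::printf("ratio 0.5 @ 100MB/s: %.1fs\n",
                    loadSeconds(GiB, 0.5, 20 * MB, 100 * MB));
    }

With those made-up numbers the compressed load is still disc-bound (roughly 26s versus 51s), which is exactly the situation where a slower, stronger codec is worth a look.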

The only way to find out is to look at the data: in this case, the time it takes to load an asset from disc versus the time it takes to decompress it. If the time to decompress is less than the time to read, it's normally safe to assume you can try a more complex compression algorithm, but only where you have the compute resources to do the decompression.
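As a rough illustration of gathering that data, here is a minimal probe, assuming zlib is available (link with -lz). The 20MB/s disc figure is a placeholder you would replace with a throughput measured on your target hardware, and the synthetic buffer stands in for a real asset:

    #include <zlib.h>
    #include <chrono>
    #include <cstdio>
    #include <vector>

    int main() {
        // Synthesise a compressible stand-in for an asset; real data varies.
        std::vector<unsigned char> raw(64u * 1024u * 1024u);
        for (size_t i = 0; i < raw.size(); ++i)
            raw[i] = (unsigned char)((i / 512) & 0xFF);

        // Pack it so we have something to time on the way back out.
        uLongf packedLen = compressBound((uLong)raw.size());
        std::vector<unsigned char> packed(packedLen);
        if (compress(packed.data(), &packedLen, raw.data(), (uLong)raw.size()) != Z_OK)
            return 1;

        // Time the decompression alone.
        std::vector<unsigned char> out(raw.size());
        uLongf outLen = (uLongf)out.size();
        auto t0 = std::chrono::steady_clock::now();
        uncompress(out.data(), &outLen, packed.data(), packedLen);
        auto t1 = std::chrono::steady_clock::now();
        double inflateSec = std::chrono::duration<double>(t1 - t0).count();

        // Placeholder figure: substitute a measured one for the target drive.
        const double discBytesPerSec = 20.0 * 1024.0 * 1024.0;
        double readSec = (double)packedLen / discBytesPerSec;

        std::printf("read %.2fs vs inflate %.2fs -> %s\n", readSec, inflateSec,
                    inflateSec < readSec
                        ? "disc-bound: a heavier codec may pay off"
                        : "CPU-bound: stay light or spend more cores");
    }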

Imagine a game where the level is loaded, and play commences. In this environment, the likelihood that you would have a lot of compute resources available during asset loading is very high indeed. Loading screens cover the fact that the CPUs/GPUs are being fully utilised to mitigate disc throughput limits.

Imagine a free-roaming game where, once you're in the world, there are no loading screens. In this environment, the likelihood of good compute resources going spare is low, so decompression algorithms have to be lightweight, and assets need to be built so that streaming content is simple and fast too.
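One way to keep the codec lightweight in that environment is to time-slice it: give decompression a fixed budget per frame and make it yield. A minimal sketch follows; note that decodeNextChunk is a hypothetical stand-in for one step of a real chunked codec (an LZ4 frame, for instance), not a real library call:

    #include <algorithm>
    #include <chrono>
    #include <cstring>
    #include <vector>

    struct StreamJob {
        const std::vector<unsigned char>* src; // "compressed" source
        std::vector<unsigned char> dst;        // decoded destination
        size_t cursor = 0;
    };

    // Hypothetical stand-in: a real version would inflate one compressed
    // chunk here; the memcpy just models a bounded unit of decode work.
    bool decodeNextChunk(StreamJob& job, size_t chunkSize = 64 * 1024) {
        size_t n = std::min(chunkSize, job.src->size() - job.cursor);
        std::memcpy(job.dst.data() + job.cursor, job.src->data() + job.cursor, n);
        job.cursor += n;
        return job.cursor == job.src->size(); // true once the asset is complete
    }

    // Called once per frame: spend at most `budget` decoding so the codec
    // can never steal the whole frame from simulation and rendering.
    void pumpStreaming(StreamJob& job, std::chrono::microseconds budget) {
        auto start = std::chrono::steady_clock::now();
        bool done = job.cursor == job.src->size();
        while (!done && std::chrono::steady_clock::now() - start < budget)
            done = decodeNextChunk(job);
    }

    int main() {
        std::vector<unsigned char> packed(8u * 1024u * 1024u, 0xAB);
        StreamJob job{&packed, std::vector<unsigned char>(packed.size())};
        while (job.cursor < job.src->size())
            pumpStreaming(job, std::chrono::microseconds(500)); // ~0.5ms/frame
    }

The same budgeting only works if the assets cooperate: chunked, independently decodable blocks are what make yielding like this possible, which is why the asset build matters as much as the codec.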

Always consider how the data is going to be used, and also what state the system is in when it is going to use it. Testing your technologies in isolation is a sure-fire way to land yourself a horrible crunch period where you try to stick it all together at the end.