After discussing the API a bit with my mentor, Sev, we determined that the game engine should initially request a bit depth/pixel format from the backend by means of an optional parameter passed to Engine::InitGraphics.
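To make the idea concrete, here is a minimal sketch of what that optional parameter might look like. This is my own illustration, not the final API: the PixelFormat fields, the default argument, and the method signature are all assumptions that are still up for discussion.

```cpp
// Hypothetical sketch only -- none of these names or fields are final.
// Assumes a pixel format described by bits and shifts per channel.

typedef unsigned char byte;

namespace Graphics {

struct PixelFormat {
	byte bytesPerPixel;                  // 1, 2, 3 or 4
	byte rBits, gBits, bBits, aBits;     // bits used per channel
	byte rShift, gShift, bShift, aShift; // bit position of each channel
};

} // namespace Graphics

class Engine {
public:
	// Passing no format would keep the current behaviour: the backend
	// stays in its default 256-colour paletted mode.
	void InitGraphics(int width, int height,
	                  const Graphics::PixelFormat *format = 0);
};
```

The nice property of a null default is backwards compatibility: existing 8-bit engines would not need to change at all.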
There are still a few details that haven't been settled, like:
- What happens when the engine requests an unsupported format;
- Should the parameter be a bit depth (8, 16, 24, 32), a generic specifier (8, 555, 565, 1555, 888, 8888), a fully formed Graphics::PixelFormat object, or some other format not yet defined (a few of these specifiers are unpacked in the sketch after this list);
- Should any pixel-format conversions be performed if the engine and backend cannot agree on a directly supported format;
- If so, should they be performed by the engine or by the backend? Or should it vary by circumstance?
- Probably many others that I haven't considered yet.
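For illustration, here is how a few of the generic specifiers above could be expressed as fully formed PixelFormat values, using the hypothetical struct from the earlier sketch (the constant names and field layout are mine, not anything agreed upon):

```cpp
// "565": 16-bit RGB -- 5 red, 6 green, 5 blue, no alpha.
const Graphics::PixelFormat kRGB565   = { 2, 5, 6, 5, 0, 11, 5, 0,  0 };

// "1555": 16-bit ARGB -- 1 alpha bit, 5 bits per colour channel.
const Graphics::PixelFormat kARGB1555 = { 2, 5, 5, 5, 1, 10, 5, 0, 15 };

// "8888": 32-bit ARGB -- 8 bits per channel.
const Graphics::PixelFormat kARGB8888 = { 4, 8, 8, 8, 8, 16, 8, 0, 24 };
```

A generic specifier like 565 is certainly more compact, but a full PixelFormat can also describe less common channel orderings (BGR layouts, for instance), which seems like one argument in its favour.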
But I am beginning to form a mental picture of how this thing will work, and I will, of course, discuss these remaining questions with Sev and others at the next opportunity.