Up to date as of December 2019.


First off, I’d like to thank everyone for what can only be described (for maximum awkwardness) as an “outpouring” of support in the wake of last year’s unemployment and general grumpiness. It didn’t exactly go viral, but half a dozen people inviting me to apply for jobs at various places and a noticeable uptick in Patreon donations does wonders for one’s self-esteem. Thank you all.

Second, as Olin Shivers once said, “I’m feeling much better now”. I’m gainfully employed at the first job I’ve ever actually really liked, I’ve taken a bit of time off from worrying about gamedev stuff, and I’ve worked on various interesting side-projects. Mental and physical health are generally much improved, and life is honestly stable and good. Depression still sucks, I’m still single and full of rage, and it turns out that helping run a hackerspace means that there’s all these heckin’ people who want to talk to you and it’s slowly driving me bonkers. But these are all things I can more or less handle now and I’m feeling ready to devote a bit more work to ggez again.

Self-care. It’s important, yo.

So since I don’t have much new to report on ggez itself, let’s talk about random things that have happened in the greater vaguely-Rust-gamedev ecosystem!


For one thing, have an official Rust community Gamedev Working Group! Huzzah! What does this mean? Well, basically that we get a monthly newsletter full of cool stuff, and people can talk about the stuff we want/need on the WG issue tracker. Which, given the size of the community (petite) and the amount of time people have to devote to it (not much), seems about right. It’s gotten people talking about what we want/need out of various foundational libraries and tools, and it’s slooooowly taking over administration of the beloved but aging Are We Game Yet, so I’m pretty happy with it all in all.

Alas, Rust has actually Gotten Popular, and the Rust subreddit has firmly entered the Eternal September where actually interesting and novel content proportionally diminishes. Not to say there’s not good stuff still there, because there is, but the signal-to-noise ratio is a bit bleaker. Well, that’s how the world goes. It’s actually a good thing for the language and community: Nobody seems to be asking “what’s the point” or “is this actually going to hit the big time” anymore the way they were in the bright young days of 2017, but rather “what can it do for me” or “how do I get onto the train”. Rust may peak and then decline in popularity like Ruby sorta has, or it may carry on and take over the world like Python seems to be, but its place in history is assured.

ggez itself has actually been surprisingly decent, at least as far as I can tell. We got version 0.5 out, eventually, and as far as I’m concerned its API is preeeeetty good for what it needs to do. There’s some things to improve that I’ll talk about later, but it basically hits all the goals I really wanted when I made it. However, the lack of care has definitely shown: there’s a lot of small bugs and misdesigns that have gone unfixed and unloved, and fundamental progress on infrastructure things like unit tests and CI has basically stopped. Plus, I haven’t even been able to contribute much to upstream things like gfx and winit. No big endeavors for me this year.

The tone of Rust in general the last year has been interesting. I don’t know how much this reflects the full community vs. my personal interests and bugaboos, but I’ve noticed a few weather changes: Compile times have become a Problem. Dependency analysis has become a Problem (not one exclusive to Rust itself, naturally). And between the two of these, the prospect of using large do-everything libraries like nalgebra or rand has become less appealing compared to lighter-weight crates that do one thing well or solve a specific niche. We seem in general to be a little sick of the downsides of making rustc do a lot of work, and are a little more willing to spend more effort to simplify its life, and in gamedev at least a newer generation of crates is starting to appear which reflects this. We’re definitely past the Paleolithic Era of Rust development and into the bright, enlightened and exciting Neolithic.


So I felt that if I was going to do any major work on ggez this year, it was going to be revamping the graphics engine to use gfx-hal and rendy. The graphics submodule is literally half of ggez, and it’s slowly gotten more and more in need of a refactoring and cleanup. A look at coffee and other game frameworks also provided some inspiration for how things could be improved. For one thing, the current ggez graphics module conflates a few things that really are separate steps on the graphics card. For example, just looking at the Image type, it has methods for setting the filter mode, which is how the GPU interpolates texels in a texture, and setting the blend mode, which is how the GPU mixes the end result of rendering with other things already rendered. These are vastly different things that happen at different stages in the graphics pipeline, and changing them has different performance implications, and so shoving them both onto an Image is really pretty wrong. APIs like Vulkan make it very clear where these things happen and in what order, but in OpenGL they look pretty much the same and you can do them whenever you want, so this relationship is obscured unless you already know what’s happening. And when I first wrote ggez, I didn’t know what was happening.
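To make the distinction concrete, here’s a minimal sketch in Rust of what separating these concerns might look like. These are hypothetical types for illustration, not the real ggez (or Vulkan) API: filtering lives on a sampler, blending lives on the pipeline, and the image itself carries neither.

```rust
// Hypothetical sketch, not real ggez: in a Vulkan-style design, texel
// filtering is *sampler* state (used when the shader reads a texture),
// while blending is *pipeline* state (applied after the fragment shader,
// when results are merged into the framebuffer).

#[derive(Debug, Clone, Copy, PartialEq)]
enum FilterMode {
    Nearest, // pick the single closest texel
    Linear,  // interpolate between neighboring texels
}

#[derive(Debug, Clone, Copy, PartialEq)]
enum BlendMode {
    Alpha, // src * a + dst * (1 - a)
    Add,   // src + dst
}

/// Sampler state: cheap to swap, consumed at texture-read time.
struct Sampler {
    filter: FilterMode,
}

/// Pipeline state: fixed when the pipeline is created, so changing the
/// blend mode means switching (or rebuilding) the whole pipeline.
struct Pipeline {
    blend: BlendMode,
}

/// An image is just texture data; it carries neither filter nor blend state.
struct Image {
    width: u32,
    height: u32,
}

fn main() {
    let image = Image { width: 64, height: 64 };
    let sampler = Sampler { filter: FilterMode::Nearest };
    let pipeline = Pipeline { blend: BlendMode::Alpha };
    // The two settings now live on types that mirror where they actually
    // take effect in the GPU pipeline, with different costs to change.
    assert_eq!(sampler.filter, FilterMode::Nearest);
    assert_eq!(pipeline.blend, BlendMode::Alpha);
    assert_eq!((image.width, image.height), (64, 64));
}
```

The point of the split is that the type system then tells you which changes are cheap (bind a different sampler) and which are expensive (build a new pipeline), instead of hiding both behind setters on an Image.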

There’s a few other warts like that, and they make things more complex than they need to be: colors may be set per-vertex or per-model, for example, and they’re two distinct parameters that get shoved into the vertex shader. Rendering sprites and rendering meshes use basically separate pipelines, so for instance you can’t use a SpriteBatch to draw many polygons at once, just images. Stuff like that. So, I’ve been wanting to rewrite things to be a little less wart-y for a while, and since gfx-rs is deprecated now in favor of gfx-hal, that seemed like the optimal path. Plus the Amethyst people were already working on rendy, which is basically a fairly low-level “batteries included” toolkit for gfx-hal, and it seemed like a good idea to be able to use that work for ggez as well. So I pulled up my britches, did some research and started playing around with rendy and gfx-hal.

It was, frankly, an uphill battle to start with. Docs were incomplete and implementations unstable, which I expected. However, the people behind them were incredibly supportive, the example code is pretty decent, and all the various breaking changes that got made as the libraries evolved were well managed and evolutionary rather than revolutionary in nature. So, once I figured out enough to start asking questions and experimenting it wasn’t really too difficult to keep up. I went through the long process of understanding how the heck to actually write a graphics engine in Vulkan, bounced off it a few times making a bit more progress each time, and finally settled on a basic API structure that seemed like it would let me do what I wanted. A couple weeks of implementation and I had something that looked right, but it was crash-y and full of memory leaks. Another basically solid week of frustrating debugging and an upstream bugfix I still don’t fully understand, and it worked!

…sort of. The Vulkan backend for gfx-hal worked well, as one would hope. Vulkan is a little bit of a pain in the ass since it’s hard to separate out the million little details that all touch each other, but once you make something work, it tends to work. And word on the street is that the Metal backend is nearly as good – I wouldn’t know, since I sorta ragequit on caring about Apple. But, well… The DirectX 12 backend still doesn’t work with rendy ’cause it doesn’t support secondary command buffers and rendy wants to use those. This isn’t actually a priority since Windows runs Vulkan just fine, so the DirectX 12 backend hasn’t gotten much attention, but it still kinda feels bad. More importantly, supporting WebGL is apparently “pretty close” but the non-web OpenGL backend doesn’t work with rendy on Windows either, and so I’m really not sure how much faith I have left. And this is after a year of hard work on these things from some extremely good people.

And even worse… even just using the Vulkan backend, my experimental little quad-throwing program still had a subtle bug that made it produce different results on different machines. On an NVidia card on Linux it worked fine, or at least drew shapes that I could believe would be correct once I got the math right, but on an AMD card on Windows it just showed nothing. Rendy didn’t give any errors, Vulkan validation layers didn’t show anything wrong, and I spent days going through the program with RenderDoc checking every single stage of it. They all looked correct. But it still… didn’t… work.

This was, to be frank, a little soul-crushing. One of the rendy devs eventually found the bug: I’d made a math mistake in a shader that made it divide by zero, which is undefined, so some graphics driver said “well that’s infinity then” and others said “well that’s zero then”. (Thank you, Omniviral!) It was my screw-up and nothing that wouldn’t have been just as broken with any other graphics API, but by then I was just burned out. The whole point of using gfx for ggez was to make this portability essentially transparent, but turns out I couldn’t escape bonkers graphics drivers no matter what. And after eagerly awaiting the Next Big Thing for literally three years, and doing my best to help out where I can, spending months on research and another month on implementation, I still can’t use something better than OpenGL on the web. My gut response to that is to say “then help make it happen”, but I’m just too tired to yak-shave that hard. So, for now, I’m giving up.
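As an aside, this class of bug is extra sneaky because the very same expression is perfectly well-defined on the CPU, where IEEE 754 pins down what division by zero means. A tiny Rust illustration (the guard at the end is a generic defensive pattern, not the actual fix from this story):

```rust
fn main() {
    // On the CPU, IEEE 754 gives division by zero a defined answer,
    // so prototyping your math in Rust never exposes the hazard:
    let x: f32 = 1.0 / 0.0; // positive infinity
    let y: f32 = 0.0 / 0.0; // NaN
    assert!(x.is_infinite());
    assert!(y.is_nan());

    // In GLSL, though, dividing by zero produces an *undefined* value:
    // one driver may say infinity, another zero, and the validation
    // layers have nothing to complain about. Any division that can hit
    // zero (normalization, projection divides, etc.) needs a guard:
    let len: f32 = 0.0;
    let inv = if len > 0.0 { 1.0 / len } else { 0.0 };
    assert_eq!(inv, 0.0);
}
```

Which is exactly why the bug only showed up as “one vendor draws, the other draws nothing” with no error anywhere in between.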



I do not blame gfx-hal or rendy for this in the slightest, to be clear. The task they are trying to do is ridiculous, and they are making it work. I fully expect the story to be different next year, and if that’s okay for you then those crates are probably still the solution you should go for. But dealing with this sort of low-level graphics stuff isn’t actually all that fun for me in practice, I have other things I want to do more, and I’m frankly sick of fighting it. So, let’s re-assess some priorities.

The thing is, ggez doesn’t actually need a Vulkan-style graphics system. While I kinda loathe OpenGL a little and using Vulkan is more fun, my highest remaining unfulfilled priority is actually just making games with Webassembly. And this whole experience has made it patently clear that if you want to support a particular platform well, especially a kinda weird one like a web browser, you have to treat it as a first class citizen right from the start and not something that you bolt on later at your leisure. I want to support Web, I want to do it this year, and no matter what else I do that means using WebGL at one level or another. And all else being equal, fewer layers is better.

“What about WebGPU?” I hear some say. Well, what about WebGPU? The actual standard doesn’t exist yet, some parts of the standardization process can only be described as morbidly hilarious, and there are by definition no standard implementations yet since there’s no standard. The wgpu crate, while cool, uses gfx-hal and rendy and so its ability to run on Web is still tied to theirs. As I said, that will probably happen someday, maybe even soon, but isn’t there yet. Really, if you want to use wide-spread technology that actually works for most people, you’re probably best off aiming five years behind the state of the art, at least. So, WebGL it is.

Now, support for WebGL 2 isn’t great, and through the amazing magic of Khronos’s version numbering, WebGL 1 is roughly equivalent to OpenGL ES 2.0, which is roughly equivalent to OpenGL 2.0 – which was released in heckin 2004. Aiming 5 years behind the technology curve is fine, maybe even expected if you want people who aren’t rich West Coast programmers to actually be able to use your stuff, but aiming 15 years behind the curve is asking a bit much. On the other hand, WebGL 2 is about the same as OpenGL ES 3 which is about the same as OpenGL 4.3 – these version numbers are sheer wizardry, I tell you. And the browsers that don’t support WebGL 2 seem to be mobile things (which are going to be kinda awful no matter what), MS Edge (which will support it once they finish giving up and switching to Chrome’s rendering engine), and Safari and IE (which are always the red-headed stepchildren that nobody sensible will ever want to use). So as far as I care, WebGL 2 support is pretty good among the browsers people actually use (Firefox and Chrome), and that’s okayish for now. I’m sure I will be vigorously proven wrong about this for the next few years, so WebGL 1 compatibility is not something to just entirely give up on, but I’m not going to sweat it too much just yet. There’s only a few WebGL 2 features that I really want to use anyway, and not having to worry about maintaining multiple shader versions and stuff like that will make life a lot simpler to start with.


I have to say though, after basically not touching OpenGL for literal years but instead digging as deep as I can into Vulkan and related technologies… using it again is kinda surreal. First off, after working with the Vulkan ecosystem I know a lot more about what is actually going on under the hood. What all the pieces are, how they fit together, and what interacts with what. Second off, I am way more ready and willing to dig into raw standards documents to see what the heck is actually going on, though good tutorials are still vital. And third off I have a much greater appreciation for how crap the standards writing and API design in a lot of OpenGL actually is. It’s… really just execrable. There should be year-long, senior-level undergrad design courses where entire classes of bright young students dig through the OpenGL standards finding all the bad designs, ambiguous parameters and undocumented assumptions, and contrast that to Vulkan which does its best to describe as much of these interactions as possible. Much like Lisp or assembly language, even if you never actually use Vulkan for anything other than fooling around, I heartily recommend you learn it because it changes how you think about problems.

But something interesting happens when you start to dig around in different versions of the OpenGL standards docs to find compatibility details… you start to realize it’s getting better. Sometimes significantly better. Not just in what the API does or how it does it, but how the standards are written and how the various technologies (OpenGL, OpenGL ES, WebGL, GLSL…) work together. Even just how things are named gets better, which is one of those sorts of details that is tiny and essentially meaningless but also absolutely drives certain types of people bonkers when it’s messed up. Like me.

And the picture this paints is that at some point, something magical happened: The OpenGL committee got its collective shit together. Finally. Somehow! From my limited experiences in such endeavors, this is probably due to specific people getting involved and possibly others getting un-involved. I don’t know when this happened, my guess would be sometime between OpenGL 2.0 in 2004 and OpenGL 3.3 in 2010, and I don’t know how it happened or who the people concerned were. I bet the debacle of Longs Peak and OpenGL 3.0 had something to do with it, but that’s just a guess. If anyone does know, please tell me, ’cause I bet it’s a great story.

It’s not super obvious that this slow harmonization is happening unless you get into the details of the tech, which frankly I mostly am not qualified to do. But you can see other symptoms. The GLSL version numbers jumped to match OpenGL’s version numbers. Changelogs in standards documents got better, and started explicitly calling out incompatibilities and breaking changes, or lack thereof. And the differences between OpenGL, ES and WebGL start getting explicitly smaller, until you realize… WebGL 2 is now an explicit, strict subset of OpenGL ES 3.0, which is now an explicit, strict subset of OpenGL 4.3. So if you write your code and shaders to run on WebGL 2, they can run anywhere that supports ES 3 or OpenGL 4.3, without modifications, for realsies, no takebacks. (At least before the inevitable driver bugs occur, that is.) This has never happened before in the history of OpenGL. To someone like me, who actually just wants to write cross-platform stuff without sweating the details too much, OpenGL has always been this semi-horrid tarpit of poorly-defined incompatibility and suffering from which there is no escape. It’s been this way since literally the first day I’ve been programming. To discover it’s gotten palpably better is like bumping into Cthulhu again after not seeing him for a decade and discovering that he’s mostly given up on tearing the stars down out of the sky and gotten his life together a bit, settled down with a partner, and now runs a coffee shop. Sure it’s a coffee shop from which frothing mad cultists emerge on a semi-regular basis, but still, the coffee’s actually pretty good.
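As a small illustration of what that subset relationship buys you, here’s a vertex shader written against GLSL ES 3.00, stored the usual Rust way as a string constant. If I have this right, a `#version 300 es` shader is accepted by WebGL 2, OpenGL ES 3.0, and desktop OpenGL 4.3 (which pulled ES3 compatibility into core), so one source can serve all three; the shader itself is just a hypothetical minimal example:

```rust
// One GLSL ES 3.00 vertex shader source. Targeting `#version 300 es`
// means the same string can be handed to a WebGL 2 context in the
// browser, an OpenGL ES 3.0 context on mobile, or a desktop OpenGL 4.3
// context, without per-platform shader variants.
const VERTEX_SHADER: &str = "\
#version 300 es
precision highp float;
in vec2 a_pos;
in vec4 a_color;
out vec4 v_color;
void main() {
    v_color = a_color;
    gl_Position = vec4(a_pos, 0.0, 1.0);
}
";

fn main() {
    // No GL context here; just confirm we really wrote against the one
    // profile that all three APIs share.
    assert!(VERTEX_SHADER.starts_with("#version 300 es"));
    assert!(VERTEX_SHADER.contains("out vec4 v_color;"));
}
```

That “write once, compile on three APIs” property is precisely the thing that was never true in older OpenGL, where desktop GLSL and ES GLSL versions drifted apart in small, maddening ways.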

(So of course this is also the point at which Apple decides it’s time to bulldoze Cthulhu’s entire neighborhood to build a new highway, because that’s totally going to make the world a better place. When there’s already a new train station being built right next to it. No I’m not bitter.)

Other stuff

I am honestly kinda frustrated with a lot of major dependencies.

  • rodio uses cpal for low-level sound output, which has some longstanding issues that suggest it needs some deep design work. It’s been this way for a while, just nobody has been interested and skilled enough to step up. It also doesn’t support wasm32-unknown-unknown. Really, cpal needs love, and rodio needs competition. Digging into cpal is on my to-do list, but so is everything else.
  • gilrs’s gamepad support is… actually pretty okay, it appears, but it feels weird ’cause the devs never seem to talk to anyone. Getting it to cooperate with winit’s event loop is something I have yet to work on but I bet it’s gonna be kinda wild, and gilrs in ggez always felt kinda tacked on. Part of that is ggez’s fault: it IS tacked on, and I just haven’t had the energy to smooth things out better. I feel like gilrs should just be part of winit, but it isn’t. Not sure what to do about it.
  • winit is a great project full of great people, and I really can’t stand it sometimes. Dealing with HiDPI support is kinda awful, writing non-event-driven game loops with the new event loop is kinda awful, how glutin wraps it all to add OpenGL context creation support is awful, and I know exactly why all these things are the way they are and winit is doing them right. It’s even getting better! It’s just that the right way is awful to actually use in many small and terrible ways. My gut feeling is that, since they are designing for both UI stuff and game stuff with maximum cross-platform power, they are aiming at too broad a user base and so their design is full of compromises. I’ve nearly forked winit twice over frankly trivial API design issues and I might still do it, but maybe that’s just ’cause I’m overly irritated by little things. I really just want something like GLFW that gives me a window, a single OpenGL context, and an event loop. I don’t need multiple OpenGL contexts, I don’t need buttery-smooth resizing ’cause how often do you actually resize a game’s window, I don’t need the most perfect close-to-the-metal event loop that has utterly zero overhead, but I do need an API that’s easy to get reasonable, consistent results out of. And so far getting it is like pulling teeth. :-(

So maybe I’ll devote myself to trying to fix some of these in the coming year. Writing foundational system compatibility libraries is a pretty thankless task, though. I’d much rather do unimportant things like make programming languages or play with network stuff.


So yeah, I’ve been rewriting ggez’s graphics backend in pure OpenGL, using the excellent glow crate to provide a modicum of abstraction between normal OpenGL and WebGL. It actually works pretty well, and also gives me a chance to experiment with a better drawing API that should make life simpler and more explicit. It will certainly undergo some iteration in design, so I’m not sure whether it will end up being ggez 0.6 or 1.0 or 2.0 or what yet. And it’s certainly not ready for prime time yet, though hopefully once I get some more fundamental capabilities ironed out it will progress pretty quickly.

Also, now that I’ve had my break and am getting back into the swing of things a little bit… I’d forgotten how awesome all the people who use ggez are. I talk to tons of random people on Discord doing cool stuff because of it, even when I’m not seeking them out. Despite having very little actual organized development effort on my part, 2019 saw nearly 100 pull requests submitted, almost all of them by someone new. Most of them were little things, but a few of the really interesting ones were things like a math optimization that is like a 100x increase in speed for all drawing in debug mode, a change that cuts the crate download size by 1/3 for those of us on slow internet, and an ocean of new projects, small improvements to docs and examples, and fixes to subtle (and not-so-subtle-but-not-well-exercised) bugs.

Thank you all, you guys make it worth it.

ggez roadmap

  • Release piles of bugfixes to version 0.5 – this may become version 0.6 if bugfixes need API breaking changes. I want to consider this the ultimate version with the current graphics engine.
  • Release the next version with the new graphics engine, based on glow. Currently this is the ggraphics branch in the ggez repo. This should function on Webassembly with minimum extra work. However, it will also completely change the graphics API, and I expect it to need a few breaking iterations before we settle on a setup that works well.
  • Carry on from there

Other goals I have for 2020

  • Make a heckin’ game for once
  • Work more with Cranelift and the Webassembly ecosystem
  • Work more with the SPIR-V ecosystem, maybe make Chrysanthemum be actually useful
  • Keep using crev
  • Maybe try a heavy refactor of cpal or winit to see what different designs would look like