What Remains Technical Breakdown
> What Remains is a narrative adventure game for the 8-bit NES video game console, and was released in March 2019 as a free ROM, playable in an emulator. It was created by a small team, Iodine Dynamics, over the course of two years of on-and-off development. It’s currently in the hardware phase as a limited batch of cartridges is being created from all recycled parts.
> The game plays out over 6 stages, wherein the player walks around multiple scenes with 4-way scrolling maps, speaking to NPCs, collecting clues, learning about their world, playing mini-games, and solving simple puzzles. As the primary engineer on this project, I faced a lot of challenges in bringing the team’s vision to reality. Given the significant constraints of the NES hardware, making any game is difficult enough, let alone one with as much content as What Remains. Only by creating useful subsystems to hide and manage this complexity were we able to work as a team to complete the game.
> Herein is a technical breakdown of some of the pieces that make up our game’s engine, in the hopes that others find it useful or at least interesting to read about.
> Nearly all retro game systems generate colors in some variant of RGB encoding. But the raw pixel colors are often designed for very different screens than those that emulators typically run on. In this article, I’ll walk through the importance of color emulation, and provide some example code and screenshots.
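The core idea can be sketched in a few lines (my own illustration, not code from the article): if a palette was designed for a CRT with a higher effective gamma than a modern display, each channel can be decoded to linear light and re-encoded. The gamma values below are illustrative assumptions, not measurements.

```python
# Hypothetical sketch: re-encode one 8-bit channel of a raw palette color
# designed for a CRT (gamma ~2.5 assumed here) so that a modern display
# (gamma ~2.2 assumed) shows roughly the intended brightness.

def adjust_channel(value: int, crt_gamma: float = 2.5, display_gamma: float = 2.2) -> int:
    linear = (value / 255) ** crt_gamma                 # light the CRT would emit
    return round(255 * linear ** (1 / display_gamma))   # re-encode for the display

def adjust_color(rgb: tuple) -> tuple:
    # Apply the per-channel correction to an (R, G, B) triple.
    return tuple(adjust_channel(c) for c in rgb)
```

Because the assumed CRT gamma is higher than the display's, mid-tones come out darker after correction, while black and white are unchanged.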
The history of Tetris randomizers
> In Tetris, a randomizer is a function which returns a randomly chosen piece. Over the years, the rules of how pieces are chosen have evolved, affecting gameplay and actual randomness.
> Several of them have been reverse engineered and documented. I’ve curated a list of ones that I believe to be important and show how the state of Tetris has changed over the years.
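As a rough illustration of how much the scheme matters (a sketch of two well-documented designs, not code from the article): early versions drew each piece independently, so long droughts of a needed piece were possible, while the modern guideline "Random Generator" deals a shuffled bag of all seven pieces at a time.

```python
import random

PIECES = "IJLOSTZ"

def pure_random(rng: random.Random):
    # Memoryless draw: each piece chosen independently, so a drought
    # of any given piece can (rarely) run arbitrarily long.
    while True:
        yield rng.choice(PIECES)

def seven_bag(rng: random.Random):
    # Guideline "Random Generator": shuffle one copy of each piece into
    # a bag, deal the whole bag, refill. Between two copies of the same
    # piece there can be at most 12 other pieces.
    while True:
        bag = list(PIECES)
        rng.shuffle(bag)
        yield from bag
```

Every consecutive group of seven pieces from `seven_bag` is a permutation of the full piece set, which is exactly the property that bounds droughts.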
Spacewar - Fanatic Life and Symbolic Death Among the Computer Bums
> 7 December 1972
An account of the first computer game tournament.
> The trend owes its health to an odd array of influences: The youthful fervor and firm dis-Establishmentarianism of the freaks who design computer science; an astonishingly enlightened research program from the very top of the Defense Department; an unexpected market-Banking movement by the manufacturers of small calculating machines, and an irrepressible midnight phenomenon known as Spacewar.
> Reliably, at any nighttime moment (i.e. non-business hours) in North America hundreds of computer technicians are effectively out of their bodies, locked in life-or-death space combat computer-projected onto cathode ray tube display screens, for hours at a time, ruining their eyes, numbing their fingers in frenzied mashing of control buttons, joyously slaying their friend and wasting their employers’ valuable computer time. Something basic is going on.
Plus the beginnings of Xerox PARC.
> “You get just a few more agates in that group and you’ll have all the marbles.”
> The chief marble collector is - well, well - Bob Taylor. When he left the newly restricted ARPA he spent a year at Utah decompressing from the Pentagon and then went to Xerox and there continued his practice of finding and rewarding good men for doing pretty much whatever they considered important work. Freedom to explore in the company of talent is an irresistible lure. In two years Xerox had twenty of the best men around working. Toward what? Well, whatever.
A followup from 2016: https://www.rollingstone.com/culture/culture-news/stewart-brand-recalls-first-spacewar-video-game-tournament-187669/
The 18-month fence hop, the six-day chair, and why video games are so hard to make
> Whether or not a player notices, appreciates, or is able to see these details, everything from a pen on a desk to a chair in a room has to be meticulously made, scrutinized, and tested. But at what cost? How does a developer decide how much time to allocate to set dressing a small room versus a game’s main character? How many polygons should an asset in the corner of a player’s eye get versus something directly in their face?
Banding in Games: A Noisy Rant
> If you use sRGB correctly, you’re doing pretty well - you will generally hardly notice banding (though dark areas remain)
> If you are not on a platform where it’s readily available, or you want to get rid of the last issues, the rest of this presentation is for you
Dithering. Lots of dithering.
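The core trick, sketched in Python (an illustration of the general idea, not the presentation's code): add a little noise to each value before quantizing, trading visible bands for unstructured grain that averages out to the right value.

```python
import random

def quantize(value: float, levels: int = 256) -> int:
    # Plain quantization of value in [0, 1]: nearby inputs collapse onto
    # the same step, which the eye sees as banding in smooth gradients.
    return round(value * (levels - 1))

def quantize_dithered(value: float, levels: int = 256, rng=random) -> int:
    # Add up to half a step of uniform noise before rounding; each output
    # is still within one step of the input, but on average the result
    # matches the input exactly, so bands dissolve into grain.
    # (Triangular noise, the sum of two uniforms, is the usual refinement.)
    noise = (rng.random() - 0.5) / (levels - 1)
    v = min(max(value + noise, 0.0), 1.0)
    return round(v * (levels - 1))
```

For an input that sits between two quantization steps, the plain version always returns the same step, while the dithered version alternates so the mean converges to the true value.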
The AI of GoldenEye 007
> GoldenEye 007: one of the most influential games of all time. A title that defined a generation of console gaming and paved the way forward for first-person shooters in the console market. In this article I’m winding the clock back over 20 years to learn the secrets of how one of the Nintendo 64’s most beloved titles built friendly and enemy AI that is still held in high regard today.
Zelda Screen Transitions are Undefined Behaviour
> The vertical scrolling effect in the original “The Legend of Zelda” relies on manipulating the NES graphics hardware in a manner that was likely unintended by its designers.
Adventures In Interactivity
> That book was Creating Adventure Games on Your Computer by Tim Hartnell. The book taught me how to make rudimentary text adventure games on my Apple ][ as a kid and prompted a recent adventure of revisiting the classic text adventures of the past. So grab a torch and get your map making tools ready because today’s Tedium is an exploration of text adventures through the years. Try not to get eaten by a grue along the way.
Games and Graphics in Popup URL bars
> When I animated the URL bar with emojis I mentioned that I’d like to take it to the next level by putting a teeny game inside the URL bar. Well... Some really fine folks beat me to that. But I still wanted to give it a go ! I just needed to come up with something FRESH to work into it...
> So while thinking about how I could expand beyond the 1-dimensional movement of a URL bar, it came to me... Popups ! Yes, the bane of early 2000s internet will help me in 2019 achieve my emoji-url-bar-gaming dreams. By just opening a series of popups and overlapping them in a column we create a 2-dimensional display of sorts:
Down Goes ‘Jeopardy!’ James!
> In Jeopardy James’s case, the dominance was pretty fascinating. Holzhauer was a disruptive force who had turned the game upside down with all-in, aggressive tactics. The quants were losing their minds. There was genuine rumination over whether or not Holzhauer “broke” the game. Would Holzhauer actually be defeated—or would he have to be dragged off the set by “Jeopardy!” producers?
Quake II gets free real-time raytracing updates on June 6
> Windows and Linux users will be able to download the first three levels of the graphically updated game as shareware starting at 6am Pacific Time on June 6. You can play the remaining levels and multiplayer if you point the installer to a legit copy of the full game on your hard drive. The source code for the Vulkan-based update will be posted on Github as well, though Quake II expansion packs will not be supported without extra effort from the community.
Playing with model trains and calling it graph theory
> You’ve probably played with model trains, for instance with something like the Brio set shown below. And if you’ve built a layout with a model train set, you may well have wondered: is it possible for my train to use all the parts of my track?
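That question is Euler's bridges-of-Königsberg problem in disguise: a closed tour using every piece of track exactly once exists if and only if the layout's graph is connected and every junction touches an even number of track pieces. A quick sketch of that check (my own illustration, not the article's code):

```python
from collections import defaultdict

def has_eulerian_circuit(edges):
    # edges: list of (u, v) pairs, one per track piece joining junctions u, v.
    degree = defaultdict(int)
    adj = defaultdict(set)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
        adj[u].add(v)
        adj[v].add(u)
    # Euler's condition 1: every junction has even degree.
    if any(d % 2 for d in degree.values()):
        return False
    # Euler's condition 2: the track is one connected component.
    start = next(iter(adj))
    seen = {start}
    stack = [start]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == len(adj)
```

A triangle of track passes; an open path fails (its endpoints have odd degree); two disconnected loops fail even though every degree is even.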
Making Sandspiel
> Sandspiel is a falling sand game I built in late 2018. I really enjoyed writing this game, and wanted to put into writing some of my goals, design decisions, and learnings from that process.
> A quine that plays snake over its own source!
Anti-Ghosting with Temporal Anti-Aliasing
> We decided on TAA for The Grand Tour Game because it tends to produce a softer, more photorealistic image in both static and moving scenes. FXAA (Fast Approximate Anti-Aliasing) and SMAA (Subpixel Morphological Anti-Aliasing) work well for static scenes, but still produce artifacts for moving scenes. Lumberyard’s deferred lighting pipeline does not support MSAA (Multisample Anti-Aliasing).
> Like MSAA, TAA uses multiple samples per pixel to provide anti-aliasing. The difference is that with temporal anti-aliasing, the samples are spread across multiple frames. It uses a frame history buffer and a per-pixel velocity buffer to reproject each pixel to gather the additional sample. For each pixel, we use the per-pixel velocity as an offset, as well as the previous frame’s view projection matrix, to determine where to query the frame history buffer. Modifying the camera’s projection matrix with a sub-pixel jitter each frame allows us to produce anti-aliased results even in scenes where there is no camera motion.
> With fast rotation or linear motion, the history pixel (the sample retrieved from the frame history buffer after pixel reprojection) may correspond to a location with vastly different lighting conditions or to an entirely separate object. This history mismatch, if unaddressed, causes severe ghosting, as shown below.
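One common way to handle that mismatch is neighborhood clamping: restrict the reprojected history sample to the min/max of the current frame's local neighborhood before blending. Here is a simplified single-channel sketch of the general technique (my own illustration, not The Grand Tour Game's actual shader):

```python
def resolve_taa(current: float, history: float,
                neighborhood: list, blend: float = 0.1) -> float:
    # current:      this frame's value at the pixel (e.g. luminance)
    # history:      reprojected sample from the frame history buffer
    # neighborhood: this frame's 3x3 block of values around the pixel
    # If the history sample falls outside the neighborhood's range, it
    # probably came from a different surface; clamping it to that range
    # suppresses ghosting at the cost of some temporal stability.
    lo, hi = min(neighborhood), max(neighborhood)
    clamped = min(max(history, lo), hi)
    # Exponential blend: keep mostly history for temporal smoothing.
    return blend * current + (1.0 - blend) * clamped
```

A stale history sample far outside the local range gets pulled to the nearest neighborhood bound, so the ghost fades within a few frames instead of smearing across the screen.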
The Sound Of Nostalgia
> The Sega Genesis, with its “Blast Processing” and blue mascot (who is getting a questionable movie makeover in the coming months), stood out for a lot of reasons, but one of the most subtle is something that it contained in at least one of its variants that not a lot of its competitors did—a headphone jack that could produce stereo sound. In a way, it was a nod to its sound chips, which were some of the best to be found on a video game console at the time and had more in common with the era’s sound synthesizers. Reliving those sounds in their best form hasn’t been easy in the modern day, however, due to challenges in emulating the console correctly. However, a challenger appears: the Analogue Mega Sg, a field programmable gate array (FPGA)-based console that aims to recreate the experience. Today’s Tedium is a review of that console—and a little backstory on the biggest problem it tries to solve.
Reverse emulating the NES!
To make 1997’s Blade Runner, Westwood first had to create the universe
> Castle’s team faced a considerable number of challenges in bringing the cinematic world of Blade Runner to life using the technologies of the day, most of which stemmed from having to invent, from whole cloth, a way to seamlessly mesh their pre-rendered world with animated voxel characters (it turned out to be vastly more complicated than simply sticking a sprite in front of the background). Tackling this issue introduced an entire interconnected tapestry of difficult problems to solve, very few of which are faced by modern developers who can pick from ready-made game engines to license and use.
Q2VKPT
> Q2VKPT is the first playable game that is entirely raytraced and efficiently simulates fully dynamic lighting in real-time, with the same modern techniques as used in the movie industry (see Disney’s practical guide to path tracing). The recent release of GPUs with raytracing capabilities has opened up entirely new possibilities for the future of game graphics, yet making good use of raytracing is non-trivial. While some games have started to explore improvements in shadow and reflection rendering, Q2VKPT is the first project to implement an efficient unified solution for all types of light transport: direct, scattered, and reflected light (see media). This kind of unification has led to a dramatic increase in both flexibility and productivity in the movie industry. The chance to have the same development in games promises a similar increase in visual fidelity and realism for game graphics in the coming years.
> This project is meant to serve as a proof-of-concept for computer graphics research and the game industry alike, and to give enthusiasts a glimpse into the potential future of game graphics. Besides the use of hardware-accelerated raytracing, Q2VKPT mainly gains its efficiency from an adaptive image filtering technique that intelligently tracks changes in the scene illumination to re-use as much information as possible from previous computations.