The Fastest GIF Does Not Exist
It seems the reason for pushing values like 10ms back up to 100ms originates in a requirement to emulate the slowness of Netscape. Comments in both the Qt and Firefox source cite reducing CPU usage, but since a value of 20ms is supported, I think they should simply clamp the value to 20ms instead. Or don’t clamp the value at all! Modern browsers already render 20ms GIF frames just fine, and I’m not sure the “computers are too slow” argument holds up 30 years later.
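The two policies are easy to contrast in code. A minimal sketch (function names are mine, and the exact thresholds vary by browser; 20ms is used here purely for illustration):

```python
def legacy_delay(delay_ms: int) -> int:
    # The behavior described above: "too fast" frame delays are
    # pushed all the way up to 100 ms, emulating old Netscape.
    return 100 if delay_ms < 20 else delay_ms

def proposed_delay(delay_ms: int) -> int:
    # The alternative suggested here: clamp to the supported
    # minimum instead of jumping to 100 ms.
    return max(delay_ms, 20)
```

A 10ms frame becomes 100ms under the legacy rule, but only 20ms under the clamp.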
Open To Conversion
Around this time 30 years ago, two separate working groups were putting the finishing touches on technical standards that would come to reshape the way people observed the world. One technical standard reshaped the way that people used an important piece of office equipment at the time: the fax machine. The other would basically reshape just about everything else, becoming the de facto way that high-quality images and low-quality memes alike are shared on the internet and in professional settings. They took two divergent paths, but they came from the same place: The world of compression standards. The average person has no idea what JBIG, the compression standard most fax machines use, is—but they’ve most assuredly heard about JPEG, which was first publicly released in 1992. The JPEG format is awesome and culture-defining, but this is Tedium, and I am of course more interested in the no-name formats of the world. Today’s Tedium discusses 10 image formats that time forgot. Hope you have the right conversion tool.
An Exploration of JSON Interoperability Vulnerabilities
The same JSON document can be parsed with different values across microservices, leading to a variety of potential security risks. If you prefer a hands-on approach, try the labs and when they scare you, come back and read on.
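One classic interoperability gap is duplicate keys: the spec leaves their handling undefined, so two services can legitimately disagree about the same bytes. A small sketch using Python's standard `json` module (the `"role"` document is a hypothetical example, not from the article):

```python
import json

doc = '{"role": "user", "role": "admin"}'

# CPython's json module keeps the LAST duplicate key...
assert json.loads(doc)["role"] == "admin"

def first_wins(pairs):
    # ...but a parser with first-key-wins semantics (as some
    # libraries use) sees a different document entirely.
    d = {}
    for k, v in pairs:
        d.setdefault(k, v)
    return d

assert json.loads(doc, object_pairs_hook=first_wins)["role"] == "user"
```

If one microservice authorizes on the first value and another acts on the last, an attacker controls the disagreement.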
How to Abuse and Fix Authenticated Encryption Without Key Commitment
Authenticated encryption (AE) is used in a wide variety of applications, potentially in settings for which it was not originally designed. Recent research tries to understand what happens when AE is not used as prescribed by its designers. A question given relatively little attention is whether an AE scheme guarantees “key commitment”: ciphertext should decrypt to a valid plaintext only under the key that was used to generate the ciphertext. As key commitment is not part of AE’s design goal, AE schemes in general do not satisfy it. Nevertheless, one would not expect this seemingly obscure property to have much impact on the security of actual products. In reality, however, products do rely on key commitment. We discuss three recent applications where missing key commitment is exploitable in practice. We provide proof-of-concept attacks via a tool that constructs AES-GCM ciphertext which can be decrypted to two plaintexts valid under a wide variety of file formats, such as PDF, Windows executables, and DICOM. Finally we discuss two solutions to add key commitment to AE schemes which have not been analyzed in the literature: one is a generic approach that adds an explicit key commitment scheme to the AE scheme, and the other is a simple fix which works for AE schemes like AES-GCM and ChaCha20Poly1305, but requires separate analysis for each scheme.
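The generic approach can be illustrated in a few lines: attach an explicit commitment to the key alongside the ciphertext, and refuse to decrypt unless it checks out. This is a toy sketch only — the XOR keystream below is a stand-in for a real AE scheme like AES-GCM, not something to deploy, and all names are mine:

```python
import hashlib
import hmac

def commit(key: bytes, nonce: bytes) -> bytes:
    # Explicit key commitment: binds the ciphertext to this key.
    return hmac.new(key, b"key-commit" + nonce, hashlib.sha256).digest()

def toy_encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # NOT real AE -- a toy XOR stream stands in for AES-GCM here.
    stream = hashlib.shake_256(key + nonce).digest(len(plaintext))
    ct = bytes(p ^ s for p, s in zip(plaintext, stream))
    return commit(key, nonce) + ct

def toy_decrypt(key: bytes, nonce: bytes, blob: bytes) -> bytes:
    tag, ct = blob[:32], blob[32:]
    if not hmac.compare_digest(tag, commit(key, nonce)):
        # A second key can no longer yield a "valid" plaintext.
        raise ValueError("ciphertext does not commit to this key")
    stream = hashlib.shake_256(key + nonce).digest(len(ct))
    return bytes(c ^ s for c, s in zip(ct, stream))
```

The point is structural: decryption under any key other than the one committed to fails before a plaintext is ever produced.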
Reconstruct Instead of Validating
What I want to focus on is (2), because it’s a lesson we learned the hard way in cryptography and didn’t transfer effectively to the rest of security engineering.
One of my favorite cryptographic attacks is the Bleichenbacher ’06 signature forgery. I wrote up how it works when I found it in python-rsa, so again go read that, but here’s a tl;dr. When you verify an RSA PKCS#1 v1.5 signature, you get an ASN.1 DER structure wrapping the message hash that you need to check. If you don’t parse it strictly, for example by allowing extra fields or trailing bytes, an attacker can fake the signature. This was exploited countless times.
The lesson we learned was that instead of parsing the ASN.1 DER to extract the message hash, we should reconstruct the ASN.1 DER we’d expect to see, and then simply compare it byte-by-byte.
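A minimal sketch of reconstruct-then-compare for PKCS#1 v1.5 with SHA-256 (the DigestInfo prefix is the fixed constant from the standard; function names are mine):

```python
import hashlib
import hmac

# Fixed DER-encoded DigestInfo prefix for SHA-256, straight from PKCS#1.
SHA256_PREFIX = bytes.fromhex("3031300d060960864801650304020105000420")

def expected_encoding(message: bytes, em_len: int) -> bytes:
    # Rebuild the ONE valid encoded message for this input...
    digest_info = SHA256_PREFIX + hashlib.sha256(message).digest()
    padding = b"\xff" * (em_len - len(digest_info) - 3)
    return b"\x00\x01" + padding + b"\x00" + digest_info

def check_encoding(em: bytes, message: bytes) -> bool:
    # ...and compare byte-by-byte. No parsing, so no parser
    # leniency (extra fields, trailing bytes) to exploit.
    return hmac.compare_digest(em, expected_encoding(message, len(em)))
```

There is exactly one acceptable byte string, so the whole class of lenient-parser forgeries disappears.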
The same technique would have saved Vault.
DVD+R and DVD-R; What was that about?
A format war within a format...
A Warm Welcome to ASN.1 and DER
This document provides a gentle introduction to the data structures and formats that define the certificates used in HTTPS. It should be accessible to anyone with a little bit of computer science experience and a bit of familiarity with certificates.
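To give a taste of what DER looks like in practice, here is a sketch of encoding one primitive type, the INTEGER (non-negative values and short-form lengths only; not from the linked document):

```python
def der_integer(n: int) -> bytes:
    """DER-encode a non-negative INTEGER: tag 0x02, length, big-endian value."""
    assert n >= 0
    body = n.to_bytes(max(1, (n.bit_length() + 7) // 8), "big")
    if body[0] & 0x80:
        # A set high bit would make the value read as negative,
        # so DER requires a leading zero byte.
        body = b"\x00" + body
    assert len(body) < 128  # short-form length fits in one byte
    return bytes([0x02, len(body)]) + body
```

So `der_integer(127)` is `02 01 7f`, while `der_integer(128)` needs the extra zero byte: `02 02 00 80`. DER's insistence on exactly one encoding per value is what makes it suitable for signatures.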
Unintuitive JSON Parsing
Based on the title, could be just about anything...
The parser will not complain about leading zeros because JSON has no concept of leading zeros.
Unexpectedly a parse error, not a lex error.
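Python's standard `json` module shows the effect: the lexer happily reads the `0` as a complete number, and the failure only surfaces when the parser expects a delimiter afterward.

```python
import json

assert json.loads("[0]") == [0]

try:
    json.loads("[01]")
    outcome = "accepted"
except json.JSONDecodeError as exc:
    # The number "0" was already lexed; the parser then chokes on
    # the stray "1" (CPython reports an "Expecting ',' delimiter"
    # style error), i.e. a parse error, not a lex error.
    outcome = "rejected"
```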
Two New Tools that Tame the Treachery of Files
Parsing is hard, even when a file format is well specified. But when the specification is ambiguous, it leads to unintended and strange parser and interpreter behaviors that make file formats susceptible to security vulnerabilities. What if we could automatically generate a “safe” subset of any file format, along with an associated, verified parser? That’s our collective goal in Dr. Sergey Bratus’s DARPA SafeDocs program.
We’ve developed two new tools that take the pain out of parsing and make file formats safer:
PolyFile: A polyglot-aware file identification utility with manually instrumented parsers that can semantically label the bytes of a file hierarchically; and
PolyTracker: An automated instrumentation framework that efficiently tracks input file taint through the execution of a program.
How a double-free bug in WhatsApp turns to RCE
In this blog post, I’m going to share a double-free vulnerability that I discovered in WhatsApp for Android, and how I turned it into an RCE.
Double-free vulnerability in DDGifSlurp in decoding.c in libpl_droidsonroids_gif
Position Independent Code (PIC) in shared libraries
This article explained what position independent code is, and how it helps create shared libraries with shareable read-only text sections. There are some tradeoffs when choosing between PIC and its alternative (load-time relocation), and the eventual outcome really depends on a lot of factors, like the CPU architecture on which the program is going to run.
How to make compressed file quines, step by step
Much of the credit goes to folks much smarter than myself (they will be introduced); this tutorial is meant to curate previous work and literature as much as it is for myself to educate you. The goal here is to allow for any curious, technically-minded newcomer to make sense of all the concepts involved in creating compression quines.
How (not) to sign a JSON object
This covers a lot of ground. I liked this quote, even though there’s much more to the post.
Canonicalization is a quagnet, which is a term of art in vulnerability research meaning quagmire and vulnerability magnet. You can tell it’s bad just by how hard it is to type ‘canonicalization’.
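The quagnet is easy to reproduce. A small sketch (the key and objects are hypothetical) of why signing JSON forces you into canonicalization in the first place:

```python
import hashlib
import hmac
import json

KEY = b"shared-secret"  # hypothetical signing key

def sign(serialized: str) -> str:
    return hmac.new(KEY, serialized.encode(), hashlib.sha256).hexdigest()

# Two semantically identical objects, serialized naively:
s1 = json.dumps({"b": 1, "a": 2})
s2 = json.dumps({"a": 2, "b": 1})
assert sign(s1) != sign(s2)  # same data, different bytes, different MACs

def canonical(obj) -> str:
    # One canonicalization choice: sorted keys, no whitespace.
    return json.dumps(obj, sort_keys=True, separators=(",", ":"))

assert sign(canonical({"b": 1, "a": 2})) == sign(canonical({"a": 2, "b": 1}))
```

And every corner case that `canonical` gets wrong (Unicode, float formatting, duplicate keys) is a place where signer and verifier can disagree.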
A better zip bomb
This article shows how to construct a non-recursive zip bomb that achieves a high compression ratio by overlapping files inside the zip container. “Non-recursive” means that it does not rely on a decompressor’s recursively unpacking zip files nested within zip files: it expands fully after a single round of decompression. The output size increases quadratically in the input size, reaching a compression ratio of over 28 million (10 MB → 281 TB) at the limits of the zip format. Even greater expansion is possible using 64-bit extensions. The construction uses only the most common compression algorithm, DEFLATE, and is compatible with most zip parsers.
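For context on why overlapping is needed at all: a single DEFLATE stream tops out around 1032:1, a ceiling you can approach from the standard library (this demonstrates the per-file limit the bomb circumvents, not the bomb's construction itself):

```python
import zlib

data = b"\x00" * 10_000_000          # 10 MB of zeros
compressed = zlib.compress(data, 9)  # DEFLATE plus a small zlib wrapper
ratio = len(data) / len(compressed)
# ratio lands near DEFLATE's ~1032:1 ceiling; overlapping files inside
# the container is how the zip bomb blows far past that per-file limit.
```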
Unraveling the JPEG
JPEG images are everywhere in our digital lives, but behind the veil of familiarity lie algorithms that remove details that are imperceptible to the human eye. This produces the highest visual quality with the smallest file size—but what does that look like? Let’s see what our eyes can’t see!
This article is about how to decode a JPEG image. In other words, it’s about what it takes to convert the compressed data stored on your computer to the image that appears on the screen. It’s worth learning about not just because it’s important to understand the technology we all use everyday, but also because, as we unravel the layers of compression, we learn a bit about perception and vision, and about what details our eyes are most sensitive to. It’s also just a lot of fun to play with images this way.
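A core layer in that unraveling is the 8×8 DCT, which concentrates a block's energy into a few low-frequency coefficients so the imperceptible rest can be quantized away. A self-contained sketch (pure Python, orthonormal DCT-II; not code from the article):

```python
import math

def dct_1d(vec):
    """Orthonormal DCT-II of an 8-point vector."""
    n = len(vec)
    out = []
    for k in range(n):
        s = sum(vec[i] * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for i in range(n))
        out.append(s * (math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)))
    return out

def dct_2d(block):
    """2-D DCT: transform rows, then columns."""
    rows = [dct_1d(r) for r in block]
    cols = [dct_1d(list(col)) for col in zip(*rows)]
    return [list(row) for row in zip(*cols)]

# A flat gray block: ALL the energy lands in the single DC coefficient.
coeffs = dct_2d([[128] * 8 for _ in range(8)])
```

For the constant block, `coeffs[0][0]` is 1024 and every other coefficient is (numerically) zero — exactly the energy compaction that quantization then exploits.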
ufs - Expand time_t support to 48 bits
Fix time overflow issues in the original 32-bit UFS code in two ways. First, treat the original 32-bit seconds fields as unsigned. Second, utilize the spare fields to expand these fields to 48 bits each. Retain the nanosecond-grain accuracy of the nsec fields.
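The arithmetic behind the change is simple to sketch (field names are mine, not the UFS on-disk names): reinterpreting the old 32-bit field as unsigned alone moves the rollover from 2038 to 2106, and 16 spare bits on top extend the range by millions of years.

```python
def pack_time48(seconds: int):
    """Split a 48-bit second count into a 32-bit low word and 16-bit extension."""
    assert 0 <= seconds < 1 << 48
    return seconds & 0xFFFFFFFF, seconds >> 32

def unpack_time48(low32: int, high16: int) -> int:
    # The low word is read as UNSIGNED, so values past the old signed
    # 2038 limit are valid even when the extension bits are zero.
    return (high16 << 32) | low32
```

2^48 seconds is roughly 8.9 million years, which comfortably outlives the filesystem.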
The Road to OCIv2 Images: What's Wrong with Tar?
My first instinct was to title this article “Tar Considered Harmful”, but I had a feeling that the peanut gallery would cringe at such a title. However, this article is very much a foundational discussion of tar and how it fundamentally fails in the use-case of container images (which will outline what we really want from a container image format). There have been some other articles that touch on the issues I go over here, but I hope I can provide a more cohesive insight into the issues with tar. Then again, some folks believe that tar is the best format for container images. I hope this first article will serve as a decent rebuttal.
Now, don’t misunderstand what I’m saying – my point here is not “it’s old, so it’s bad.” tar is the obvious choice for an archive format, due to its long history and ubiquity, and writing a custom format with no justification would be a borderline reckless thing to do. However, tar’s history is important to understanding how it got to the state it’s in today. This section will be quite long-winded (there’s forty-something years of history to distil into a single blog post), but you can skip to the end.
More than you really wanted to know about patch
Back in the 1980s somebody invented diff -u (“unified diff format”) as a more human-readable alternative to the <old >new lines format you get without the -u, and then Larry Wall whipped up a program to reverse the process and use saved diff -u output to modify a file (which was mind-blowing at the time). As far as I can tell the format wasn’t really meant for that, and was made to work with heuristics and hitting it with a rock, but Larry _did_ go on to invent Perl...
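For anyone who hasn't stared at one lately, the unified format is easy to generate from the standard library, which reproduces the hunk headers and context lines patch consumes:

```python
import difflib

old = ["hello\n", "world\n"]
new = ["hello\n", "there\n", "world\n"]

# Produces the familiar ---/+++ header and @@ hunk markers.
diff = list(difflib.unified_diff(old, new, fromfile="a.txt", tofile="b.txt"))
print("".join(diff))
# --- a.txt
# +++ b.txt
# @@ -1,2 +1,3 @@
#  hello
# +there
#  world
```

The `@@ -1,2 +1,3 @@` line plus the surrounding context lines are exactly the heuristic anchors patch uses to re-find the hunk in a drifted file.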
TAR versus Portability
There is no standard version of the tar program.