Mona Tweeta

Wednesday 20 May 2009

Mario Klingemann undertook a challenge: to somehow represent an image in a tweet. Not by pasting a URL to the image, but by actually cramming the data into 140 characters. The result is better than you think, and involved all sorts of technical craftiness.

First, he used Chinese characters to get 210 bytes of data into 140 characters. Then he used clever color and point-coordinate representations to make the most of the handful of bytes available.
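For the curious, here's a rough sketch (in Python, and not Klingemann's actual scheme) of how the character trick can pay off: if each character is drawn from a contiguous run of 4,096 CJK ideographs, it carries 12 bits, so 140 characters hold exactly 210 bytes.

    # A minimal sketch, assuming 12 bits per character mapped onto a
    # contiguous run of CJK ideographs: 140 chars x 12 bits = 210 bytes.
    CJK_BASE = 0x4E00  # start of the CJK Unified Ideographs block

    def bytes_to_tweet(data: bytes) -> str:
        bits = ''.join(f'{b:08b}' for b in data)
        bits += '0' * (-len(bits) % 12)          # pad to a multiple of 12 bits
        return ''.join(chr(CJK_BASE + int(bits[i:i+12], 2))
                       for i in range(0, len(bits), 12))

    def tweet_to_bytes(tweet: str, length: int) -> bytes:
        bits = ''.join(f'{ord(c) - CJK_BASE:012b}' for c in tweet)
        return bytes(int(bits[i:i+8], 2) for i in range(0, length * 8, 8))

    payload = bytes(range(210))                  # 210 bytes of sample data
    tweet = bytes_to_tweet(payload)
    assert len(tweet) == 140
    assert tweet_to_bytes(tweet, len(payload)) == payload

The real cleverness, of course, is in deciding how to spend those 210 bytes on an image worth looking at.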

One of my first jobs was at General Computer Corporation, which made games for the Atari 2600. The 2600 system had only 256 bytes of RAM, so the designers of games had to cram all of the mutable data, including the program stack, into 256 bytes. When it was time to plan a new feature for a game, the developers would bring out the memory map, a sheet of paper detailing what each bit of those 256 bytes did, and find a bit that was unused.

And now here we are on multi-gigabyte machines, trying to squeeze something interesting into 210 bytes!

It seems every time we think technology has moved beyond some irritating limit, new tech comes in and re-introduces it to us. No more 8-bit video, unless you're on a VNC connection. Screens are big, unless you're on a phone. The network is always connected, unless you're on a phone. Actually, most of the limitations come back once you're on a phone.

Or on Twitter!

Comments

The exercise of getting a picture into Twitter is academically interesting, even if the resulting "picture" is pretty limited.

Still, I like your point about how there are always the same old limitations that keep coming back to bite us. In our system, we've been tweaking bits and bytes as well; in this case it's because we need to support millions of records in client software, and a byte per record really starts to add up. It's always the same thing all over again.
Clayton Christensen has phrased a related idea: for any given technology, it advances at a pace faster than consumers can absorb and take advantage of it. I recently bought a used Game Boy from the late-80s era for like 2 dollars, which maybe has higher specs than an Atari 2600(?). I just read today that game graphics will become "perfectly graphically real" in our lifetimes: http://www.gamasutra.com/php-bin/news_index.php?story=23742
