Farcaster Frames was released to much fanfare yesterday, and on the surface the deceptively simple idea has already yielded incredible creativity, with everything from MUDs to polls to NFT mints. The number of folks ideating on its potential was even larger. Then along came this post:

Challenge accepted.

So naturally, people learned two things very quickly:
Frames can be animated
Q has a metaVM project
And boy did that generate some questions. So what I'm setting out to do in this article is circle back on my post from three days ago, where I said I'd show what the future of computing and crypto will look like very soon. For those who got to try the demo before the VPS host piping the command data over decided to nerf the vCPU and stop responding to tickets (big shout out to Vultr, you're a garbage service): congrats, you just got your first taste of the future I was describing.
Below, I'll describe how it works, and how I was able to build it in only two hours.
Farcaster Frames use meta tags in a style very similar to OpenGraph, enhancing them with custom button texts and submission URLs that can be signed by users on Farcaster, presenting a basic loop:

That frame data includes an image URL to display, which is rendered, as you'd expect, in an <img> tag. But browsers support a lot of image formats, and I remembered an old trick used by webcams back in the day: MJPEG. With MJPEG, the server can hold a request open indefinitely until the client closes it, sending back raw JPEG images one at a time. This meant I didn't need special video support; I could just use an endpoint that returned MJPEG as the response for the frame's image URL.
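To make the markup concrete, a minimal frame response looks something like this (the URLs are placeholders): fc:frame:image is the image the client renders, the button tags carry the labels, and post_url is where signed button presses land.

import type { NextApiRequest, NextApiResponse } from 'next';

// Serve the frame markup: an image to render, up to four buttons,
// and the post_url that signed button presses are sent back to.
export default function handler(req: NextApiRequest, res: NextApiResponse) {
  res.setHeader('Content-Type', 'text/html');
  res.send(`<!DOCTYPE html>
<html>
  <head>
    <meta property="og:image" content="https://example.com/api/stream" />
    <meta property="fc:frame" content="vNext" />
    <meta property="fc:frame:image" content="https://example.com/api/stream" />
    <meta property="fc:frame:button:1" content="Left" />
    <meta property="fc:frame:button:2" content="Right" />
    <meta property="fc:frame:post_url" content="https://example.com/api/input" />
  </head>
</html>`);
}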
Now I just needed to test it, so I built a simple Next.js API handler that kept an open connection and streamed the same frame to all active connections as it was rendered. I started simple: a setInterval loop that loaded six images from the local filesystem and cycled through them every 100ms:
import fs from 'fs';
import { join } from 'path';
import type { NextApiRequest, NextApiResponse } from 'next';

const BOUNDARY = 'mjpeg';
let frame = 0;

interface MjpegClient {
  mjpegwrite: (buffer: Buffer) => void;
  mjpegend: () => void;
}

const clients: MjpegClient[] = [];

// Load the six test frames from disk once, at module load.
export const images = [
  fs.readFileSync(join(process.cwd(), 'images/image1.jpg')),
  fs.readFileSync(join(process.cwd(), 'images/image2.jpg')),
  fs.readFileSync(join(process.cwd(), 'images/image3.jpg')),
  fs.readFileSync(join(process.cwd(), 'images/image4.jpg')),
  fs.readFileSync(join(process.cwd(), 'images/image5.jpg')),
  fs.readFileSync(join(process.cwd(), 'images/image6.jpg')),
];

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  // multipart/x-mixed-replace keeps the response open; each new part
  // replaces the previous one in the <img> tag. That's all MJPEG is.
  res.writeHead(200, {
    'Cache-Control': 'private, no-cache, no-store, max-age=0',
    'Content-Type': `multipart/x-mixed-replace; boundary="${BOUNDARY}"`,
    Connection: 'close',
    Pragma: 'no-cache',
  });

  const ref: MjpegClient = {
    // Write one JPEG as a multipart section on this client's response.
    mjpegwrite: (buffer: Buffer) => {
      res.write(`--${BOUNDARY}\r\n`);
      res.write('Content-Type: image/jpeg\r\n');
      res.write(`Content-Length: ${buffer.length}\r\n`);
      res.write('\r\n');
      res.write(buffer);
      res.write('\r\n');
    },
    mjpegend: () => {
      res.end();
    },
  };

  // Remove the client from the broadcast list when the connection drops.
  const close = () => {
    const index = clients.indexOf(ref);
    if (index !== -1) {
      clients.splice(index, 1);
    }
  };
  res.on('finish', close);
  res.on('close', close);
  res.on('error', close);
  clients.push(ref);
}

// Broadcast a JPEG frame to every open connection.
export const mjpegsend = (buffer: Buffer) => {
  for (const client of clients) {
    client.mjpegwrite(buffer);
  }
};

setInterval(() => {
  mjpegsend(images[frame]);
  frame = (frame + 1) % images.length;
}, 100);
Loading the API handler URL in the browser worked swimmingly. Ok, great. Then the next challenge: make it run Doom.
Luckily for me, I had already solved this problem in the research around Quilibrium. One of the advanced features that will launch later this year is the metaVM, which translates instruction set architectures into an executable format usable by the network, along with many other important components needed to support a fully functioning VM. The metaVM supports a basic framebuffer device that is IO-mapped to RAM at a specific location. The VM offers a choice of execution calls: durable, which lives on the hypergraph and is therefore somewhat slower, or ephemeral, which does not store execution state and is merely piped over. Keyboard and mouse inputs work similarly, via hardware interrupts. Finally, the file system is fulfilled by a virtio-9p compatible application, which translates R/W requests for inodes into hypergraph calls. Together, you get a fully distributed virtual machine with optional durability at multiple levels. Despite sounding rather complicated, this is quite simple to implement on Q, and it looks a lot like a traditional emulator when you dive into the code.
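As a loose sketch of that emulator-like shape: watch the IO-mapped framebuffer region and hand each snapshot to a callback. Every name here (MetaVMClient, streamMemory, raiseInterrupt, the framebuffer address and geometry) is hypothetical; the real metaVM RPC surface isn't public yet.

// Hypothetical shape of the metaVM interaction described above; none of
// these names are real Quilibrium APIs.
interface MetaVMClient {
  // Ephemeral execution: state streams out over RPC, nothing is persisted.
  streamMemory(base: number, length: number, onFrame: (raw: Buffer) => void): void;
  raiseInterrupt(irq: number, payload: Buffer): void;
}

// Assumed IO-mapped framebuffer location and geometry (illustrative values).
const FB_BASE = 0xfd000000;
const FB_WIDTH = 640;
const FB_HEIGHT = 400;
const FB_BYTES = FB_WIDTH * FB_HEIGHT * 4; // RGBA, 4 bytes per pixel

// Watch the framebuffer region and hand each snapshot to a callback,
// much like an emulator scanning out video memory.
export function watchFramebuffer(vm: MetaVMClient, onFrame: (raw: Buffer) => void) {
  vm.streamMemory(FB_BASE, FB_BYTES, onFrame);
}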
That left only the following tasks:
Handle inputs from the buttons and send them back as key down/key up events
Build the framebuffer worker and kick it off on start of the Frame server
Convert the framebuffer data into JPEGs
Deploy a filesystem map containing Linux and Doom, compatible with the metaVM virtio-9p driver, to the hypergraph
Execution state updates can be streamed directly from the metaVM with the RPC client, so I carved the stream down to the section of RAM containing the framebuffer, and invoked the keyboard interrupts directly for inputs. That's the first two down.
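For illustration, the input path might look roughly like this, reusing the hypothetical MetaVMClient from the sketch above. The untrustedData.buttonIndex field comes from the signed Frame POST body; the scancodes, interrupt line, and the './metavm' module are all assumptions.

import type { NextApiRequest, NextApiResponse } from 'next';
import { vm } from './metavm'; // hypothetical module wiring up the MetaVMClient

const KEY_IRQ = 1; // assumed keyboard interrupt line

// Map the four frame buttons onto set-1 style scancodes (illustrative choices).
const BUTTON_TO_SCANCODE: Record<number, number> = {
  1: 0x4b, // left arrow
  2: 0x4d, // right arrow
  3: 0x48, // up arrow / forward
  4: 0x1d, // ctrl (fire)
};

// Set-1 convention: the break (key up) code is the make code with bit 7 set.
const keyEvent = (scancode: number, down: boolean): Buffer =>
  Buffer.from([down ? scancode : scancode | 0x80]);

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  // The signed frame POST identifies which button was pressed.
  const buttonIndex: number = req.body?.untrustedData?.buttonIndex ?? 0;
  const scancode = BUTTON_TO_SCANCODE[buttonIndex];
  if (scancode !== undefined) {
    // Pulse key down, then key up.
    vm.raiseInterrupt(KEY_IRQ, keyEvent(scancode, true));
    vm.raiseInterrupt(KEY_IRQ, keyEvent(scancode, false));
  }
  // Return the frame markup again so the client keeps rendering the stream.
  res.setHeader('Content-Type', 'text/html');
  res.send('<!DOCTYPE html><html><head><!-- same fc:frame tags as before --></head></html>');
}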
Node doesn't have a clear-cut way to quickly convert raw buffers to JPEGs, and I wanted to hack this together quickly, so I used node-canvas as the render target for the raw image data, then called canvas.toBuffer('image/jpeg') to create the image. Once the worker publishes the buffer data, the message handler on the API side only needs to call the mjpegsend(buffer) method defined above. Next one down.
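Assuming the worker hands over raw RGBA pixels at a fixed resolution (the 640x400 geometry here is illustrative), that conversion is only a few lines of node-canvas:

import { createCanvas } from 'canvas';

const WIDTH = 640;
const HEIGHT = 400;
const canvas = createCanvas(WIDTH, HEIGHT);
const ctx = canvas.getContext('2d');

// Convert one raw RGBA framebuffer snapshot into a JPEG buffer.
export function framebufferToJpeg(raw: Buffer): Buffer {
  const imageData = ctx.createImageData(WIDTH, HEIGHT);
  imageData.data.set(raw); // raw is WIDTH * HEIGHT * 4 bytes of RGBA
  ctx.putImageData(imageData, 0, 0);
  return canvas.toBuffer('image/jpeg');
}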
For the last one, I had a bit of a cheat: I had already built this filesystem map a while ago to demo the metaVM (hi friends on Unlonely!) and the QConsole. That was all the work needed.
Architecturally, the frame integration then looks like this:

And there you have it: Doom on Frames.
IMPORTANT ANNOUNCEMENT: We can play @cassie's DOOM again! It seems MJPEG streams started working again - maybe when WC switched to CF for image proxying? What's next? ScummVM port? Remote controlled robots/webcam streams? Tx @samuellhuber.eth for making me doublecheck this
How it works: https://paragraph.xyz/@quilibrium.com/doom-on-frames
Turns out it only works on web, not on mobile 😭
hell yeah
👀
For quite a while, we were able to upload gifs up to 15mb. It seems within the last week this has changed back to 10mb. I've tested this with multiple gifs that previously worked.
ScummVM already works, it's just really limited because you only get four buttons
Yeah I figure we can only do the early games, right - 4x cursor keys and text entry (any text input + button == enter). I think that was the "AGI" engine.
got a game in mind?
ScummVM 👌
@remindbot 3 days
new with frames, what are some of the best ones I should try out?
building this onboarding one plz 88 $degen https://warpcast.com/jpfraneto/0x60320267
Find.farcaster.info
save real orcas from this frame: (tips work too but don't tip my frame here, tip the @wavewarriors one!). https://donate.framesframes.xyz/api?frameId=1
$BORED made a love letter to Oregon Trail with Boregon Trail! https://boredbored.framesframes.xyz/api
https://warpcast.com/onefootballclub/0xaf402131
Yo @blockheim, try out Base Name Service Frame App. https://frame.basename.app/api
If in the mood for degeneracy https://warpcast.com/tybb/0x9ee1b16a
We categorize all the popular Frames here: https://www.degen.game/frames/featured
oh wow, this is a great starting point. Thank you!
Love this 1 $degen
You could swap tokens directly on arbitrum https://warpcast.com/complexlity/0x06011240
Create an e2e verifiable Frame poll with Farcaster.vote!
We're back! https://warpcast.com/cassie/0xc329ea28 https://frame.quilibrium.com/polls/1
whoops, forgot to update the endpoint update matching, one sec
the improved framerate is revealing that we need an action button, @v pls let us have more buttons 🥺
Reminder for folks who missed it the first time around: this does not work on mobile, visit it on web
💪⚡️
yes! i finally got to see it in action (o.o) amazing!
ruh-roh
yeah sorry about that, fixed now
👌
It's wild to me that you can stream a live video feed to an <img> tag by using the MJPEG file format.
my favorite frame so far tbh
Of all the frame experiments that I've seen, this is the one I'm still thinking about days later. Would love to hear/read a full rundown of how you felt this experience went, as well as the edge cases you came across in the process. It's a huge learning experience for many & your insight has a lot of value Cassie
Already wrote about it 🫡 https://paragraph.xyz/@quilibrium.com/doom-on-frames
insightful as always🙏
@launch Gaming onframe
You scouted @cassie’s launch! https://www.launchcaster.xyz/p/65b74e5d2cccdd9d36ea826c
me: wow, I made the picture show a little flag! @cassie: https://paragraph.xyz/@quilibrium.com/doom-on-frames
me: guess the only way to solve this one is to pay $900 to Vercel @cassie:
Built different
NGL this workflow chart is way underrated alpha. 69 $degen
333 $DEGEN
How did I put Doom on Farcaster Frames with only two hours of work? I told you I'd show you what the future of crypto looks like very soon; this was just a teaser of what abandoning the blockchain looks like. https://paragraph.xyz/@quilibrium.com/doom-on-frames
This is incredible
I was today years old when I learned about the MJPEG image format. Pretty cool! https://en.wikipedia.org/wiki/Motion_JPEG
tip 69 $degen to @cassie heck ya
This is amazing! I just discussed this with my teammate and I think we can implement something like this. Any help with getting into Quilibrium or source code would be amazing. I am going to take a look at the docs. 10 $degen!
Great read 500 $degen
amazing, what are the limitations here, though? how many image frames can I send? or is there a timeout?
hey loved what you did there, thanks for sharing! dumb question, was playing with your code sample, what does the client side look like within nextjs? using the api route as img source within nextjs makes it interpret it as html and 404
nevermind fixed it, had nothing to do with client side, had to update route code
Amazing.
Oh so you were the dev who did this. Awesome ty for this write up!
TIP unlimited $DEGEN to Cassie