The Web Dev Challenge
What if you had 30 minutes to plan and 4 hours to build a web app? Mike and I were invited by Jason Lengstorf of CodeTV to participate in this challenge, hosted by Postman. The challenge was to create personal software – the kind of app that you want to build for your own life but just don’t have the time to create. We were tasked with using Postman’s new MCP tooling for developing MCP servers.
How did it go? Well, let’s just cut to the chase - 4 hours is not much time to build a web app - even with all of the AI tools we have today. I am pretty sure I could spend 4 hours choosing just the right font and colors for my next a16z funded side project that I haven’t started and won’t ship, but hey, I own the domain!
What did we build?
Mike and I set out to build a Spotify playlist game app using Hashbrown, Model Context Protocol (MCP), Angular, and Postman. The idea is pretty simple: imagine you’re on a roadtrip with friends and want to play a game to fill your Spotify queue. There are lots of roadtrip games you could pick from, like “A-Z” where each player picks a song title from the next letter of the alphabet, or “Word Link” where the next song title has to share a word in common with the previous song title. What we wanted to build was an app that let the users define the rules of the game, and then have the app reassemble itself based on those rules. With Hashbrown and generative user interfaces, this is entirely achievable.
How did we build it?
First, we built a local MCP server that provided a few tools to the model: list devices connected to Spotify, search for tracks, and add a song to the user’s queue. All of it was implemented with Spotify’s excellent Node SDK.
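To make that concrete, here’s a minimal sketch of those three tools as plain handler objects. The `spotify` client below is a stub standing in for Spotify’s Node SDK (its method names here are our stand-ins, not the SDK’s real surface), and the `Tool` shape is a simplification of what an MCP server would register:

```typescript
type Tool = {
  name: string;
  description: string;
  handler: (args: Record<string, string>) => Promise<unknown>;
};

// Stub in place of Spotify's Node SDK so the sketch runs on its own.
const spotify = {
  getDevices: async () => [{ id: 'device-1', name: 'Kitchen Speaker' }],
  searchTracks: async (query: string) => [
    { uri: 'spotify:track:123', title: query },
  ],
  addToQueue: async (uri: string, deviceId: string) => ({ queued: uri, deviceId }),
};

// The three tools our MCP server exposed to the model.
export const tools: Tool[] = [
  {
    name: 'list_devices',
    description: 'List devices connected to Spotify',
    handler: () => spotify.getDevices(),
  },
  {
    name: 'search_tracks',
    description: 'Search Spotify for tracks matching a query',
    handler: ({ query }) => spotify.searchTracks(query),
  },
  {
    name: 'add_to_queue',
    description: "Add a track to the user's queue on a device",
    handler: ({ uri, deviceId }) => spotify.addToQueue(uri, deviceId),
  },
];
```

In the real server, each handler calls through to the SDK and the tool metadata is what the model sees when deciding which tool to invoke.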
From there, we wanted to explore a new idea in generative user interfaces that we call prompts as inputs. It’s a strategy where you have one generative UI agent instantiate another generative UI agent by passing in a prompt to the child agent through a component’s inputs (or props in React land).
For our app, we created a top level agent that we called the Game Setup Agent. It’s responsible for collecting the rules of the Spotify game, connecting to a Spotify device, and determining who is playing. It collaborates with the user to achieve its task by displaying different views to the user. For example, if it can connect to a Spotify device, great! If not, it displays a Spotify Device List to the user and lets them select a device to play the game on.
Once it has accomplished its mission and collected all of the game information, it then renders the Game Loop view, which itself has another agent. This agent is responsible for running the game loop; it renders player turns, song pickers, and in the future, scoreboards. What’s unique about Game Loop is that it is prompted by the Game Setup Agent through an input:
```typescript
export class GameLoop {
  gameDescription = input.required<{
    players: string[];
    rules: string;
    spotifyDeviceId: string;
  }>();
}
```
The Game Setup Agent can render the GameLoop component via Hashbrown’s exposeComponent API:
```typescript
exposeComponent(GameLoopComponent, {
  description: `
    Once everything is configured, use this component
    to start the game loop
  `,
  input: {
    gameDescription: s.object('Description of the game', {
      players: s.array(
        'The players playing the game',
        s.string('The player name'),
      ),
      rules: s.string(`
        The public rules of the game.
      `),
      spotifyDeviceId: s.string('The Spotify device ID'),
    }),
  },
})
```
What this means in practice is that the top level agent and the child agent are connected through component inputs. This pattern allows developers to scope LLMs down into small, achievable tasks, while still affording collaboration. In our experience, using LLMs is a lot like writing a good function: the smaller, the better! By constraining instructions and inputs, the app worked surprisingly well for a variety of game rules that we tried out.
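On the child’s side, that input becomes the agent’s instructions. Here’s a minimal sketch of that step, assuming the same `gameDescription` shape as above; `buildSystemPrompt` is our own illustrative helper (not a Hashbrown API), and in the app the GameLoop component would feed a string like this to its own chat resource:

```typescript
interface GameDescription {
  players: string[];
  rules: string;
  spotifyDeviceId: string;
}

// Hypothetical helper: turns the parent-supplied input into the child
// agent's system prompt, scoping the LLM down to one small task.
export function buildSystemPrompt(game: GameDescription): string {
  return [
    'You are running a Spotify playlist game.',
    `Players, in turn order: ${game.players.join(', ')}.`,
    `Rules of the game: ${game.rules}`,
    `Queue every picked song on Spotify device ${game.spotifyDeviceId}.`,
  ].join('\n');
}
```

Because the prompt is derived entirely from the typed input, the child agent only ever knows what the parent chose to tell it.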
If that all sounds like mumbo jumbo - we get it! We’ve spent years listening and learning how to build with AI models, and we’ve poured those learnings into our new workshops.
Ok, grab your popcorn, a friend, and enjoy the show!
GPTy Dev Mode
This week OpenAI dropped developer mode for ChatGPT - or, as we like to say, “gippity”. It’s being rolled out to Pro and Plus accounts. The purpose is to help AI engineers build MCP connectors for gippity. It seems the OpenAI folks are starting to lean into MCP. Check out the docs.
AngularAI
Minko dropped a sweet video that shows off a browser-first AI sidekick that actually knows your Angular app: your code, your config, even your runtime state. You can ask it “what are the routes in my app?”, or even, “create a new page in my app”. This could be a big improvement in the DX for Angular devs. Right now, it’s just a cool demo from a cool dude. But it does give us some insight into what the future of dev tooling might look like with in-browser AI agents.
Sides
This guy reverse engineered GPT’s memory system. It’s a fun read, and it’s also super fun (and a bit creepy) to see just how well the team behind ChatGPT injects context, enriched from past conversations, into each new conversation you have with the model. There is a lot to learn here, and perhaps, a bit to learn about ourselves.
Taco Bell got pwned. We are hard at work on adding the audio modality to Hashbrown. Mike is deep into the architectural weeds on bringing text-to-speech and speech-to-text to web apps through the same adapter pattern you’ve come to expect from us. With that in mind, we couldn’t help but smile at the story of a Taco Bell customer ordering 1,000 cups of water through Taco Bell’s voice agent.
I think we can all agree we are in a hype cycle with AI. And, with any hype, we can forget to bring along some learnings from the past into the future. We might be doing that with MCP.
Workshops

Want to learn more about Hashbrown and building generative user interfaces? Join us for our online workshops kicking off in October. We cover AI fundamentals, using structured completions to simplify forms, building chatbots that stream user interfaces, and advanced generative UIs that leverage Hashbrown’s JavaScript Runtime.
We’ll be teaching a React version on October 13 and an Angular version on October 14. You can use the discount code THEGRAVY2025 to get 20% off your ticket. We’d love to have you there!