Good morning and welcome to issue #1 of The Gravy, a newsletter about all things generative user interfaces and Hashbrown. Oh my gravy!

The Bite

Hashbrown v0.3 was recently released, and with it comes a new Ollama adapter package, @hashbrownai/ollama. If you haven’t used Ollama before, it’s a completely open-source stack for running open-weight models. This unlocks the ability to use models like gpt-oss, Mistral, Meta’s Llama, and DeepSeek with Hashbrown, letting you go open-source the whole way down your generative UI stack.

To get started, first install Ollama and configure it to use your preferred open-weight model. Advanced models capable of powering Hashbrown’s UI chat, like gpt-oss:120b, require far more memory than most of us have in our dev machines. If you don’t have the required hardware, check out Ollama’s Turbo offering to run open-weight models on their infrastructure. It’s pretty affordable (only $20/mo) and provides everything you need to run the largest of the open LLMs. And for what it’s worth, I was able to run smaller models like gpt-oss:20b on my machine.
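
Once Ollama is installed and you’ve pulled a model, it’s worth confirming the local server is reachable and seeing which models it has before wiring up Hashbrown. Here’s a minimal sketch, assuming Ollama’s default local port (11434) and its /api/tags endpoint for listing pulled models:

// List the models your local Ollama server has available
const res = await fetch('http://localhost:11434/api/tags');
const { models } = await res.json();
console.log(models.map((m: { name: string }) => m.name));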

Once you’ve got Ollama configured, you’ll need to set up an API endpoint for Hashbrown. Here’s an example using Express:

import express from 'express';
import { HashbrownOllama } from '@hashbrownai/ollama';

const app = express();

// Hashbrown sends the chat request as JSON in the request body
app.use(express.json());

app.post('/chat', async (req, res) => {
  const stream = HashbrownOllama.stream.text({
    // Only needed if you are using Ollama Turbo
    // turbo: { apiKey: process.env.OLLAMA_API_KEY! },
    request: req.body,
  });

  // Hashbrown streams binary frames, so respond with an octet stream
  res.header('Content-Type', 'application/octet-stream');

  for await (const chunk of stream) {
    res.write(chunk);
  }

  res.end();
});

app.listen(3000);

Now you can use Hashbrown’s React hooks or Angular resources to consume your favorite open-weight model:

uiChatResource({
  model: 'gpt-oss:120b',
  system: '...',
  components: [...],
})

These models, while extremely powerful, pale in comparison to hosted models from the big vendors. For example, you can get really far using gpt-4.1 without providing any examples. Not so with these open-weight models! To make prompting easier, especially for UI chat, we’ve introduced a new prompt helper in @hashbrownai/core that lets you write UI examples in your system prompt. Hashbrown validates the examples against your provided component schema, then down-levels each example into the underlying JSON representation. To get reliable results out of models like gpt-oss:120b, you’ll need to provide examples using the new prompt helper:

prompt`
  You are a smart home assistant chatbot. You can answer
  questions about and control lights and scenes.

  # Examples
  <user>What lights are in the bedroom?</user>
  <assistant>
    <tool-call>getLights</tool-call>
  </assistant>
  <assistant>
    <ui>
      <app-card title="Bedroom Lights">
        <app-light-list-item lightId="id-bedroom" />
        <app-light-list-item lightId="living" />
      </app-card>
    </ui>
  </assistant>
`;
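
To tie it together, here’s a rough sketch of feeding the helper’s output into the resource from earlier. It assumes the helper is exported as prompt from @hashbrownai/core (as the snippet above implies) and keeps the placeholder components from the earlier snippet:

import { prompt } from '@hashbrownai/core';

// Hashbrown validates UI examples in the template against your
// component schema and down-levels them to the underlying JSON
const system = prompt`
  You are a smart home assistant chatbot. You can answer
  questions about and control lights and scenes.

  # Examples
  <user>What lights are in the bedroom?</user>
  <assistant>
    <tool-call>getLights</tool-call>
  </assistant>
`;

// Pass the result as your system prompt
uiChatResource({
  model: 'gpt-oss:120b',
  system,
  components: [...],
});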

We’d love your feedback on the new prompt helper! We have aspirations of extending this function to validate all parts of your system prompt, including tool calls, structured outputs, and assistant turns.

Sides

  1. Check out this article Mike shared from Geoffry on why UIs need a HUD, not just a copilot.

  2. Want to learn more about the Model Context Protocol (MCP)? Check out this great resource and tutorial, MCP for Beginners, from Microsoft.

  3. Apple is about to launch on-device large foundation language models. It’s probably going to be a game changer, and we’re all for it. We hope future Hashbrown developers get to target models installed locally on users’ devices.

  4. Hashbrown released v0.3 a few days ago. In addition to Ollama support, we also dropped preview APIs for integrating with MCP servers.

  5. Minko Gechev (of the Angular team) recently previewed an experiment called AngularAI, which uses generative AI to help you understand, introspect, and debug your Angular apps.

Want to learn more about Hashbrown and building generative user interfaces? Join us for our online workshops kicking off in October. We cover AI fundamentals, using structured completions to simplify forms, building chatbots that stream user interfaces, and advanced generative UIs that leverage Hashbrown’s JavaScript Runtime.

We’ll be teaching a React version on October 13 and an Angular version on October 14. You can use the discount code THEGRAVY2025 to get 20% off your ticket. We’d love to have you there!
