Building Web AI Apps With LLMs And Prompt API
Create web apps that interact with Large Language Models (LLMs) without backend servers or cloud costs, using the Prompt API and TypeScript type definitions such as `@types/node`.
In this blog post, I describe how to build a sentiment classification application that runs locally, using Chrome's built-in Prompt API and Angular. The Angular application calls the Prompt API to create a language model session and submits queries to Gemini Nano to classify text as positive or negative. The benefit of using Chrome's built-in AI is zero cost, since the application uses the local model shipped with Chrome Canary. This is the happy path for users on Chrome Dev or Chrome Canary. For users on non-Chrome or older Chrome browsers, a fallback implementation should be available, such as calling Gemma or Gemini...
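
At its core, the flow is small: feature-detect the built-in model, create a session with a system prompt that constrains the answer to "positive" or "negative", and send the user's text via `prompt()`. The sketch below illustrates this, assuming a global `LanguageModel` object with `availability()`, `create()`, and `prompt()` methods; the exact API surface has shifted between Chrome Canary releases, so the names, option shapes, and hand-rolled type declarations here are assumptions to adapt to your Chrome version.

```typescript
// Minimal sketch of sentiment classification with Chrome's Prompt API.
// The exact global (`LanguageModel` vs. older `window.ai.languageModel`) and
// option names vary across Chrome Dev/Canary versions, so treat these
// declarations as assumptions rather than the final API surface.
declare global {
  interface LanguageModelSession {
    prompt(input: string): Promise<string>;
    destroy(): void;
  }
  // Hand-rolled ambient declaration; in a real project this would come from
  // a types package or the browser's own definitions.
  const LanguageModel: {
    availability(): Promise<'unavailable' | 'downloadable' | 'downloading' | 'available'>;
    create(options?: {
      initialPrompts?: { role: 'system' | 'user' | 'assistant'; content: string }[];
    }): Promise<LanguageModelSession>;
  };
}

export async function classifySentiment(text: string): Promise<string> {
  // Feature-detect the built-in model; this is where a fallback to a hosted
  // model (e.g. Gemma or Gemini on a server) would be wired in.
  if (typeof LanguageModel === 'undefined' ||
      (await LanguageModel.availability()) === 'unavailable') {
    throw new Error('Prompt API unavailable - route the request to a hosted model instead.');
  }

  // Create a session whose system prompt constrains Gemini Nano's output.
  const session = await LanguageModel.create({
    initialPrompts: [
      {
        role: 'system',
        content: 'Classify the sentiment of the user text as "positive" or "negative". Reply with one word.',
      },
    ],
  });

  try {
    return (await session.prompt(text)).trim().toLowerCase();
  } finally {
    // Sessions hold model resources, so release them when done.
    session.destroy();
  }
}
```

In an Angular application, a function like this hypothetical `classifySentiment` would typically live in an injectable service so components can call it and handle the unavailable case by switching to the server-side fallback.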