Chat interfaces with large language models (LLMs) like ChatGPT, Gemini, and Claude have evolved significantly in the last three years, with increasingly complex interfaces and features.
The distinction between programming and prompt-based workflows is becoming less clear as new tools are developed.
Tools like n8n and Lindy have made it easy for non-coders to build LLM-powered app workflows.
Google Labs is a product-focused team responsible for innovations like NotebookLM, which started as a document-grounded question-answering tool and gained popularity with its podcast-style audio overview feature.
The latest product from Google Labs, currently in public preview and US-only (with potential VPN workarounds), is Opal — a tool designed to help users build their own LLM workflows without deep technical knowledge.
Opal enables users to chain prompts, leverage various AI models, and prototype or build quick mini-apps for personal tasks.
Opal lets users remix existing workflow templates or create new ones from scratch.
In the blog post generator example, a user enters a topic, then Opal performs web research, writes an outline and the post, and generates a banner image.
The workflow is visualized as a sequence of steps and uses models such as Gemini 2.5 Flash for research and Gemini 2.0 Flash for writing.
Users can access a console to see which models and prompts are used at each step.
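To make the chained-prompt idea concrete, here is a minimal sketch of what a similar blog-post workflow could look like if rebuilt in plain Python with the google-genai SDK. The per-step model choices mirror the walkthrough (Gemini 2.5 Flash for research, Gemini 2.0 Flash for writing), but the prompt wording and helper names are assumptions, and Opal's real research step performs live web research that this sketch only approximates with a plain prompt.

```python
# Minimal sketch of a "blog post generator" style workflow in plain Python.
# Assumes the google-genai SDK (`pip install google-genai`) and a GOOGLE_API_KEY
# environment variable; prompt wording is illustrative, not Opal's actual prompts.
from google import genai

client = genai.Client()  # picks up GOOGLE_API_KEY from the environment


def step(model: str, prompt: str) -> str:
    """One workflow node: a single prompt sent to a specific Gemini model."""
    return client.models.generate_content(model=model, contents=prompt).text


def blog_post_workflow(topic: str) -> str:
    # Step 1: research the topic (Gemini 2.5 Flash, as in the walkthrough).
    research = step("gemini-2.5-flash", f"Collect key facts and sources about: {topic}")

    # Step 2: outline the post from the research notes.
    outline = step("gemini-2.0-flash", f"Write a blog post outline based on:\n{research}")

    # Step 3: draft the full post from the outline (Gemini 2.0 Flash for writing).
    return step("gemini-2.0-flash", f"Write the full blog post following this outline:\n{outline}")


if __name__ == "__main__":
    print(blog_post_workflow("AI workflow builders for non-coders"))
```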
Outputs like the final blog post can be previewed, with writing quality varying from run to run.
Opal allows detailed editing of each workflow step, including changing which AI model is used for specific tasks like image generation.
Users can introduce new prompts, inputs, or models, such as switching to Imagen 4 for improved banner images.
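As a rough sketch of what that swap amounts to in code, the banner step from the example above could call an Imagen model directly through the same SDK; the model identifier used below is an assumption and should be checked against the current Gemini API model list.

```python
# Hypothetical banner-image step after swapping the image model to Imagen 4.
# The model ID string is an assumption; verify it against the current model list.
def banner_image_step(topic: str) -> bytes:
    result = client.models.generate_images(
        model="imagen-4.0-generate-001",
        prompt=f"Clean, modern blog banner illustration about {topic}",
    )
    return result.generated_images[0].image.image_bytes
```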
New user inputs (like intended reader persona) can be added and routed to relevant workflow steps for further customization.
The example demonstrates generating a blog post tailored for an IT worker interested in automation, with custom inputs flowing through the entire process.
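In code terms, routing an extra input is just another parameter threaded through the relevant prompts. The sketch below reuses the step helper from the earlier example; the parameter name and prompt wording are illustrative assumptions, not Opal's actual wiring.

```python
# Sketch: routing a new "reader persona" input through the writing steps.
def blog_post_workflow_for_persona(topic: str, persona: str) -> str:
    research = step("gemini-2.5-flash", f"Collect key facts and sources about: {topic}")
    outline = step(
        "gemini-2.0-flash",
        f"Write a blog post outline for a reader who is {persona}, based on:\n{research}",
    )
    return step(
        "gemini-2.0-flash",
        f"Write the full post for a reader who is {persona}, following this outline:\n{outline}",
    )


post = blog_post_workflow_for_persona(
    "automation with LLM workflow tools",
    "an IT worker interested in automation",
)
```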
Opal is positioned as an accessible platform for building LLM and generative AI workflows, similar to existing tools but with Google ecosystem integrations.
As noted, it remains in a US-only preview, though potentially accessible via VPN.
The tool opens up possibilities for users to rapidly prototype, automate, or build full-featured mini-apps powered by AI.
Workflows and prompts built in Opal can also serve as the starting point for further development using code.
Opal is expected to improve rapidly through user feedback, much as NotebookLM evolved.
The presenter suggests future content may cover triggered agents and automation, and highlights ongoing rapid adoption of such tools even among developer teams.