Training & how to use Local AI

Get yourself or your team confident with the stack: Open WebUI, Obsidian, and local RAG. Whether it's personal use or a team rollout, we cover the basics and best practices.

Personal and team training

Personal use

One person, one machine—your Mac or Linux box running Ollama and Open WebUI. Training focuses on: picking models, writing good prompts, using Obsidian as your knowledge base, and when to lean on RAG vs plain chat.

  • First-time setup and model selection
  • Chat best practices and prompt patterns
  • Obsidian + Smart Connections + Copilot
  • Adding docs to RAG and asking questions with sources
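The "asking questions with sources" workflow in that last bullet boils down to two steps: retrieve the most relevant snippets from your documents, then ground the prompt in them. Here is a toy sketch of the idea; the keyword-overlap scoring stands in for the embedding search a real stack such as Open WebUI performs, and the document names and contents are made-up examples.

```python
# Toy RAG sketch: retrieve the best-matching documents, then build a
# prompt that is grounded in them and cites them by name.
# Keyword overlap stands in for real embedding search; the documents
# below are invented examples.

def score(question: str, document: str) -> int:
    """Count how many words of the question appear in the document."""
    doc_words = set(document.lower().split())
    return sum(1 for w in question.lower().split() if w in doc_words)

def retrieve(question: str, docs: dict[str, str], k: int = 2) -> list[str]:
    """Return the names of the k best-matching documents."""
    ranked = sorted(docs, key=lambda name: score(question, docs[name]),
                    reverse=True)
    return ranked[:k]

def build_prompt(question: str, docs: dict[str, str]) -> str:
    """Assemble a grounded prompt that tells the model to cite sources."""
    context = "\n".join(f"[{name}] {docs[name]}"
                        for name in retrieve(question, docs))
    return ("Answer using only the sources below and cite them by name.\n"
            f"{context}\n\nQuestion: {question}")

docs = {
    "leave-policy.md": "Employees accrue 25 days of annual leave per year.",
    "vpn-setup.md": "Connect to the office VPN before accessing internal tools.",
}
print(build_prompt("How many days of annual leave do we get?", docs))
```

The point of the sketch is the shape of the prompt: the model only ever sees the retrieved snippets plus your question, which is why answers can come back with checkable citations.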

Team rollout

Several users sharing one Local AI host (e.g. Mac Mini or Linux server). Training covers: logging in to Open WebUI, roles and permissions, shared vs personal chats, and how to use the organisation’s RAG and Obsidian vaults.

  • Access, roles, and safe use on a shared system
  • Shared knowledge bases and document workflows
  • Consistent prompts and use-case templates
  • Who to contact for model or RAG updates

How to use Open WebUI

Open WebUI is your browser front-end for Local AI. You pick a model, type a message, and get a reply, with nothing leaving your own machine or network. Training includes:

  • Choosing the right model for the task (reasoning vs quick Q&A)
  • Starting a chat, continuing threads, and when to start a new one
  • Using RAG: selecting a knowledge base and asking questions that use your documents
  • Understanding citations and how to check sources
  • System prompts and persona-style instructions (if enabled by your admin)
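To make the last bullet concrete: under the hood, a persona-style instruction is just a "system" message that travels with every chat request. The sketch below builds such a request in the OpenAI-compatible shape that both Ollama and Open WebUI accept; the model name, persona text, and endpoint in the comment are examples, not your admin's actual configuration.

```python
import json

# A persona-style system prompt as it appears in a chat request.
# Model name and persona text are illustrative examples; use whatever
# your admin has configured.

def chat_payload(system_prompt: str, user_message: str,
                 model: str = "llama3.1:8b") -> dict:
    """Build an OpenAI-style chat payload: the system message sets the
    persona, the user message carries the actual question."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

payload = chat_payload(
    "You are a concise internal helpdesk assistant. Answer in plain "
    "English and say so when you are unsure.",
    "How do I reset my password?",
)
print(json.dumps(payload, indent=2))
# To send it, POST the JSON to your local server, e.g.
# http://localhost:11434/v1/chat/completions on a default Ollama install.
```

Because the system message is resent with every turn, the persona holds for the whole conversation; in Open WebUI your admin can pin one per model or workspace so users never have to type it.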

How to use Obsidian with Local AI

Obsidian holds your notes and knowledge; plugins connect them to your local model (e.g. a Llama model served by Ollama). Training covers:

  • Opening the AI sidebar (Copilot or similar) and when to use it while writing
  • Smart Connections: seeing related notes and asking the AI about links between them
  • Text generation: summaries, expansions, and rewrites from your own wording
  • Keeping sensitive content in your vault and never sending it to the cloud

Best practices we teach

  • Prompt clearly: One question or instruction per message; add context when the AI needs it.
  • Use RAG when it matters: For policy, docs, or project knowledge, point the chat at the right knowledge base so answers are grounded in your data.
  • Check citations: When the AI quotes a source, skim the original to confirm it’s correct.
  • Respect roles: On shared systems, stick to your permissions and don’t change model or RAG settings unless you’re allowed.

Training we offer

We run sessions tailored to your setup: personal (one user) or team (several users on one Local AI host). Sessions are typically a mix of live walkthrough and Q&A. We can also leave behind short written guides or videos for your team to reuse.

Request training