If you've ever run a workshop, you know the rhythm. The agenda says "we'll start with a hands-on exercise," everyone shuffles through prerequisites, and forty minutes later half the room is still wrestling with a Python version, a node_modules cache, or a corporate VPN that won't let docker pull finish. Some learners power through and get to the lesson anyway; others bail out before the workshop has really begun. Either way, you're spending a chunk of the session on environment friction instead of the thing the workshop is about.
Docker Labspaces are Docker's answer to that. A Labspace is a fully packaged, self-contained classroom: instructions on the left, a real VS Code in the browser on the right, and your application running underneath in real Docker containers. Learners launch it with one command, and the first thing they see is the lesson, not an install screen.
In this post we'll walk through what Labspaces are, what you get out of the box, the anatomy of a Labspace on disk, and the workflow I use to turn a normal docker compose project into a Labspace. Along the way I'll point at spurin/rest-labspace - a lab I built that teaches REST API fundamentals by letting learners click endpoints and watch a cartoon avatar react in real time.
Here's what one looks like in action - the rest-labspace running locally, with the tutorial pane on the left, the Character Lab controls in the centre, and the avatar reacting on the right:
[Screenshot: the rest-labspace running locally - tutorial pane on the left, Character Lab controls in the centre, avatar on the right]
What are Docker Labspaces?
In Docker's own words, Labspaces "provide fully packaged learning labs, workshops, and demos." But the more useful way to think about one is physically: a Labspace is a Compose file, published to a registry as an OCI artifact, that carries the entire classroom inside it - the runtime services (the interface, the IDE, the configurator, the port republisher, the socket proxy) plus the lesson content itself, packaged alongside. There's no separate registry, no special bundle format - it's just Compose, all the way down.
Because a Labspace is just a Compose file, it runs on any Docker installation that supports docker compose - nothing in the artifact is tied to Docker Desktop specifically. What Docker Desktop does add is a catalogue. The Labspaces extension on the Marketplace gives you a browseable directory of ready-to-run labs and a one-click launch path.
The catalogue itself is curated openly at dockersamples/awesome-labspaces - a growing list covering Docker fundamentals, Compose, networking, AI applications, agentic apps, and plenty more. You don't even have to use the extension to consume one; any Labspace can be started directly from the command line:
```bash
docker compose -f oci://spurin/rest-labspace up
```
That single line pulls down everything - the workshop UI, the IDE, the application stack, and the lesson content - and stands it all up locally. Open http://localhost:3030 in a browser and you're inside the lab.
The mental model I find useful: a Labspace is an OCI artifact that boots a self-contained classroom. Real VS Code on one side, real Docker underneath, lessons rendered next to it - all of it shipped together as a single versioned thing.
If "OCI artifact" is a new term: think of it as the same plumbing Docker Hub uses to store and distribute container images, but generalised to ship any bundle of files. OCI stands for the Open Container Initiative - the open governance body Docker co-founded in 2015 (alongside CoreOS, Google, Microsoft, Red Hat and others) and donated the original container image and runtime specifications to. So when you launch a Labspace with docker compose -f oci://... you're not crossing into someone else's ecosystem; you're using the standards Docker itself helped define, applied to a new kind of payload. A Labspace isn't a container image - it's a bundle of compose files, lesson markdown and metadata - but Docker Hub stores it, versions it, and pulls it down on demand using exactly the same machinery.
What you get out of the box
When a Labspace boots, your learner lands inside a window that looks like a small IDE-shaped operating system. They didn't have to install any of it. Here's what's running:
- A split-screen UI. The lesson markdown is rendered on the left as a tutorial pane with Next/Previous navigation. The right side is the IDE.
- VS Code in the browser. Built on top of `coder/code-server`, pre-configured with Docker tooling and the project repository already cloned in.
- A built-in terminal. View → Terminal (or ``Ctrl+` ``) drops you into a shell with the project on disk and Docker reachable.
- "Run" buttons on code fences. Any `bash`, `sh`, or `console` code block in the lesson markdown gets a small Run button attached. One click executes the command in the IDE terminal - so a learner stuck on a tricky `curl` invocation doesn't have to copy-paste, they just press a button.
- Service tabs. The `services:` array in your `labspace.yaml` defines extra browser tabs that appear next to the IDE - one per app surface you want learners to flip to (Swagger UI, an admin console, your frontend). Click a tab and you're looking at your application in an embedded browser frame.
- A host port republisher. A small helper that lives in the workspace's network namespace, watching containers labelled `labspace-resource: "true"` and surfacing their published ports on the IDE's `localhost`. The upshot: a learner running `curl http://localhost:8000` from the IDE terminal Just Works, without host networking. (That same label is doing double duty elsewhere - it's what scopes the sandboxed Docker socket to your lab's resources, and what tells the cleanup service which containers, volumes and networks to remove when the lab shuts down. Tabs are not driven by the label - they come from `labspace.yaml`.)
- A sandboxed Docker socket. The Labspace wraps the host's Docker socket in a Docker Socket Proxy that filters which containers, volumes and networks the lesson can see (using that `labspace-resource` label). Learners get a real Docker experience but can't escape into the host machine - which means you can confidently distribute a lab without worrying about it scribbling over a student's home directory.
- Single-command launch. No `git clone`, no virtualenv, no `npm install`, no "make sure you have Python 3.11 not 3.12." `docker compose -f oci://author/lab up` and you're in.
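The label-driven selection the republisher performs is simple to model. This isn't the actual helper (that ships with the runtime); it's a sketch of the rule it applies, with container metadata represented as plain dicts:

```python
def republished_ports(containers):
    """Return the published host ports a republisher-style helper would
    surface on the IDE's localhost: only containers carrying the
    labspace-resource: "true" label qualify.

    Each container is modelled as a dict with 'labels' and 'ports'
    (the real helper reads this from the Docker API).
    """
    ports = []
    for container in containers:
        if container.get("labels", {}).get("labspace-resource") == "true":
            ports.extend(container.get("ports", []))
    return sorted(set(ports))

containers = [
    {"name": "backend", "labels": {"labspace-resource": "true"}, "ports": [8000]},
    {"name": "helper", "labels": {}, "ports": [9090]},  # unlabelled: ignored
]
print(republished_ports(containers))  # [8000]
```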
Add it up and you have something that looks suspiciously like a free, self-hosted, version-controlled GitHub Codespace - except your learners don't need a GitHub account, the environment is yours, and the only dependency is Docker.
Here's the IDE tab in use - real VS Code in the browser, a terminal showing curl output, and the lessons pane on the left with one-click Run buttons attached to every command:
[Screenshot: the IDE tab - VS Code in the browser with a terminal showing curl output, and the lessons pane with Run buttons on the left]
Who benefits, and how
Three audiences get something distinct from this:
Learners get the obvious win: zero setup. They click a thing, the lab opens, they read the lesson and run the commands. There's no toy emulator pretending to be Docker - it really is Docker, running real containers, with a real terminal. When the lesson says "publish a port and curl it," the port is genuinely published and the curl genuinely hits it.
Authors and educators get a complete classroom UI for free. The deal is: you write Markdown lessons and a Compose file. In return you get the split-screen interface, the embedded VS Code, the terminal, the tutorial navigation, the "Run" buttons, the service tabs, and a one-line distribution mechanism. You don't need to build any of that. You're not running a hosted platform. You're not maintaining infrastructure. You publish an OCI artifact and your work is done.
Teams and organisations get reproducibility. A Labspace pinned to a tag delivers the exact same workshop today as it did six months ago. New hires onboard against the same environment senior engineers learned on. Internal certifications can be backed by a deterministic, version-controlled lab. CI publishes new versions automatically when the source repo changes.
Anatomy of a Labspace
Here's what a Labspace looks like on disk. The shape is small and the moving parts are few:
```
my-labspace/
├── compose.yaml            # the dev/maintainer entry point - just `include:` the runtime + your override
├── compose.override.yaml   # environment overrides for the Labspace (workspace image, AI models, your app services)
├── labspace/
│   ├── labspace.yaml       # manifest: metadata, title, sections, service tabs
│   ├── 00-welcome.md       # tutorial pages, ordered by sections[] in labspace.yaml
│   ├── 01-...md
│   └── ...
└── (your app sources)      # backend/, frontend/ - cloned into the workspace at runtime
```
The two interesting files are the manifest and the override. The manifest tells the Labspace UI what to render and which tabs to surface; the override is where you customise the Labspace environment - swapping the workspace's base image, adding AI models or extra services, and dropping in your application's containers.
Here's the canonical minimal manifest, taken from dockersamples/labspace-starter:
```yaml
metadata:
  id: ${REPO_OWNER}/${REPO_NAME}
  sourceRepo: github.com/${REPO_OWNER}/${REPO_NAME}
  contentVersion: abcd123 # Will be filled in during CI

title: Labspace starter
description: |
  A basic template for creating a Labspace, including all of the
  boilerplate needed to write and publish a Labspace.

sections:
  - title: Introduction
    contentPath: 01-introduction.md
  - title: Main Content 1
    contentPath: 02-main-content.md
  - title: Conclusion
    contentPath: 03-conclusion.md

services:
  - id: app
    url: http://localhost:3000
    title: App
    icon: anchor
```
Field by field:
- `metadata.id`, `metadata.sourceRepo`, `metadata.contentVersion` - identity. The `id` becomes the OCI artifact name when you publish; `contentVersion` is typically a short commit SHA filled in by CI.
- `title` and `description` - what shows up in the catalogue and the header of the lab.
- `sections` - an ordered list of `(title, contentPath)` pairs. Each `contentPath` points at a markdown file in the same directory. The Labspace UI renders these as the Next/Previous lesson flow.
- `services` - extra tabs surfaced after the IDE tab. Each entry has an `id`, a `title`, a `url` (always `http://localhost:<port>`), and an `icon` chosen from Google Material Symbols. The IDE is always tab one; everything else appends.
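Since the manifest is small, it's easy to sanity-check before publishing. This is a hypothetical helper, not part of the Labspace tooling - it just encodes the field requirements described above against a parsed `labspace.yaml` dict:

```python
def validate_manifest(manifest):
    """Sanity-check a parsed labspace.yaml (as a dict).

    Mirrors the fields described above; the real runtime may validate
    more, or differently - treat this as an authoring aid only.
    """
    problems = []
    if not manifest.get("metadata", {}).get("id"):
        problems.append("metadata.id is missing (it becomes the OCI artifact name)")
    for field in ("title", "description"):
        if not manifest.get(field):
            problems.append(f"{field} is missing")
    sections = manifest.get("sections", [])
    if not sections:
        problems.append("sections is empty - the lab would have no lesson pages")
    for i, section in enumerate(sections):
        for field in ("title", "contentPath"):
            if not section.get(field):
                problems.append(f"sections[{i}].{field} is missing")
    for i, service in enumerate(manifest.get("services", [])):
        for field in ("id", "title", "url"):
            if not service.get(field):
                problems.append(f"services[{i}].{field} is missing")
    return problems

manifest = {
    "metadata": {"id": "you/your-lab"},
    "title": "My lab",
    "description": "A demo",
    "sections": [{"title": "Intro", "contentPath": "01-introduction.md"}],
    "services": [{"id": "app", "title": "App", "url": "http://localhost:3000"}],
}
print(validate_manifest(manifest))  # []
```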
Under the hood there's a small constellation of services keeping the lab running. The interface renders the split-screen UI - lessons on the left, VS Code on the right. The configurator populates the project volume at startup, in one of two modes: when you're authoring locally it clones your repo from PROJECT_CLONE_URL; when a learner launches a published Labspace, it instead extracts the base64-encoded tarball that ships embedded inside the Compose file. Either way the workspace (the VS Code container) mounts that volume so the project tree shows up in the editor. The host port republisher sits inside the workspace's network namespace, watches for containers labelled labspace-resource: "true", and surfaces their published ports on the IDE's localhost. The Docker socket proxy sandboxes the rest, filtering the socket so the lab only sees its own resources. None of this is something you have to build or configure - it ships with the runtime.
Build with Compose first, then fit it into Labspaces
The single most useful thing to internalise about Labspaces is this: a Labspace is simply a specially crafted docker compose project. It is not a separate framework, not a special build system, not a different runtime - it's a Compose file that pulls in some extra services and ships its lesson content alongside. If your app already runs under Compose, you're already 80% of the way to a Labspace.
My recommendation: build and focus on a working application first, then transition it into a Labspace. Don't try to design for the Labspace from day one. Get the app right under plain Compose - the loop where you iterate on code, ports, environment variables, the bits that actually matter to the lesson - and only once that's solid, wrap it. The Labspace layer is the easy part; the application underneath is what your learners are really there for.
Step 1 - Build the application the normal way
Forget Labspaces for a moment. Get the app working in Docker. Use docker-compose.yaml, run docker compose up --watch, iterate the way you always would. Get to the point where a fresh checkout plus docker compose up produces a working stack. Don't reach for the Labspace until you're there.
Step 2 - Add the Labspace wrapper
Once the app stands up reliably, you wrap it. Two small files:
compose.yaml is your dev/maintainer entry point. It does just one thing: pull in the Labspace runtime via OCI and stack your override on top.
```yaml
include:
  - oci://dockersamples/labspace-content-dev
  - ./compose.override.yaml
```
The include: of oci://dockersamples/labspace-content-dev is the magic line - it pulls the entire Labspace runtime (interface, VS Code server, configurator, port republisher, socket proxy) down as a side dish to your stack. This is the file you'll run with docker compose up --watch while authoring.
compose.override.yaml is where the Labspace environment is customised. The minimum is a PROJECT_CLONE_URL so the configurator knows where your repo lives. Everything else - swapping the workspace image, adding AI models or extra services, dropping in your own application containers - goes here too. Add labspace-resource: "true" to any service whose published port should be reachable from the IDE terminal:
```yaml
services:
  configurator:
    environment:
      PROJECT_CLONE_URL: https://github.com/${REPO_OWNER}/${REPO_NAME}

  backend:
    image: you/your-app:latest
    ports:
      - "8000:8000"
    labels:
      labspace-resource: "true"
```
Without that label, the host port republisher won't surface the port on the IDE's localhost - meaning a learner's curl http://localhost:8000 from the terminal won't reach the service. With it, everything Just Works.
Step 3 - Author the lessons
Drop a labspace/labspace.yaml and your section markdown files into the labspace/ directory. The manifest lists your sections; each markdown file is a normal page with whatever H2/H3 structure you like. Code fences in bash, sh or console will automatically get Run buttons attached - so write your commands in those languages whenever you want one-click execution.
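If you want a quick authoring check that your lessons expose one-click commands where you expect them, a few lines of Python can list which fences would get Run buttons. This is a rough scan under the assumption of standard triple-backtick fences - the real UI's markdown parser may behave differently:

```python
import re

# Fences in these languages get a one-click Run button in the lesson pane.
RUNNABLE = {"bash", "sh", "console"}

def runnable_fences(markdown):
    """Return the info strings of code fences that would get a Run button."""
    langs = re.findall(r"^`{3}(\w+)", markdown, flags=re.MULTILINE)
    return [lang for lang in langs if lang in RUNNABLE]

TICKS = "`" * 3  # build fences without literal backtick runs in this example
lesson = f"{TICKS}bash\ncurl http://localhost:8000/api/character\n{TICKS}\n\n{TICKS}json\n{{}}\n{TICKS}\n"
print(runnable_fences(lesson))  # ['bash'] - the json fence is display-only
```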
A small but important style note from the Labspaces best-practices guide: address the learner as "you," not "we." A Labspace isn't a co-pilot; it's a lesson the reader is sitting down to. "You're going to send a GET request" reads better in this format than "we're going to send a GET request."
Step 4 - Run it locally
```bash
CONTENT_PATH=$PWD docker compose up --watch
```
Open http://localhost:3030 and you're inside your own lab. The --watch flag is doing more than you might expect: it syncs both your lesson markdown and the contents of the project directory into the running lab as you save. So you can iterate on the prose, the starter app code, or both at the same time, without restarting anything. And changes the lab itself makes during a run - files a learner edits, files a lesson rewrites - stay inside the running container without touching the source on disk, so your starting point is preserved.
Step 5 - Publish
When you're happy:
```bash
docker compose -f oci://dockersamples/labspace -f compose.override.yaml \
  publish you/your-lab --with-env -y
```
That command bundles your compose stack and your lesson content together as a single OCI artifact and pushes it to Docker Hub. Behind the scenes the publish flow generates a third Compose file - compose.pipeline.yaml - which declares a Compose config named labspace-content (a base64-encoded tarball of your labspace/ directory) and adds an override mounting that config into the configurator service. That's how the lesson content travels with the artifact: when a learner pulls the published Labspace, Compose hands the configurator the embedded tarball instead of cloning from a URL, and the lab is fully self-contained. From that point on, anyone in the world can run your lab with:
```bash
docker compose -f oci://you/your-lab up
```
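The embed-and-extract mechanism described above is easy to model. This sketch is illustrative only - the real publish flow lives inside Docker's tooling - but it shows how a directory of lesson files can round-trip through a base64-encoded tarball, the same shape of payload that compose.pipeline.yaml carries:

```python
import base64
import io
import tarfile

def embed_content(files):
    """Pack lesson files into a gzipped tar archive and base64-encode it,
    the way a publish step could embed content in a Compose config.
    `files` maps archive paths to text contents."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        for name, text in files.items():
            data = text.encode()
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))
    return base64.b64encode(buf.getvalue()).decode()

def extract_content(blob):
    """What a configurator-style service would do on the learner's side:
    decode the blob and unpack the lesson files from it."""
    buf = io.BytesIO(base64.b64decode(blob))
    with tarfile.open(fileobj=buf, mode="r:gz") as tar:
        return {m.name: tar.extractfile(m).read().decode() for m in tar.getmembers()}

blob = embed_content({"labspace/00-welcome.md": "# Welcome\n"})
assert extract_content(blob) == {"labspace/00-welcome.md": "# Welcome\n"}
```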
dockersamples/labspace-starter ships with a GitHub Actions workflow that does steps 4 and 5 for you on every push - so once it's set up, your lab republishes itself whenever you commit a change to the lesson markdown.
A working example - spurin/rest-labspace
To make the abstract concrete, here's the lab I built: spurin/rest-labspace. It teaches REST API fundamentals - GET, PUT, PATCH, DELETE and POST - by giving learners a customisable cartoon avatar and an API that controls it. The avatar itself is rendered with the Big Heads library, which provides the friendly, expressive characters you see in the lab. Every endpoint a learner calls visibly changes the avatar. Set the hair colour with a PATCH, and the hair colour changes. DELETE an accessory and the avatar takes off its glasses. POST to /animate and the character winks.
The teaching device matters: the avatar gives instant, visual feedback, which means a learner who just sent their first ever HTTP request sees something happen in the world. That's a much stickier first lesson than "you should now see a 200 response in the terminal."
The PATCH lesson is a good example - learners pick a preset like "Pop star", "Hipster" or "Surfer", fire a single multi-field PATCH, and see the avatar transform in one go:
[Screenshot: the PATCH lesson - preset buttons and the avatar transforming after a single multi-field PATCH]
The lab has six sections - a welcome tour and five verb-focused lessons - and runs two services:
- `backend` - Python 3.12 + FastAPI + Uvicorn. Holds the character's state in memory, exposes it as JSON, and auto-generates a Swagger UI at `/docs`. Listens on port `8000`.
- `frontend` - React + TypeScript + Vite, served by nginx in production. Renders the avatar by polling the backend. Listens on port `3031` (port `3030` is reserved by the Labspace interface, which is why the frontend goes one port up).
Here's the actual labspace/labspace.yaml from the repo:
```yaml
metadata:
  id: spurin/rest-labspace
  sourceRepo: github.com/spurin/rest-labspace
  contentVersion: 0.2.0
  author: James Spurin

title: REST API Fundamentals
description: |
  Hands-on playground for learning REST APIs. Click controls in the
  Character Lab and watch a character react in real time, or run the
  equivalent curl command yourself in the IDE terminal - both drive the
  same API. Covers GET, PUT, PATCH, DELETE, and POST. By James Spurin.

sections:
  - title: Welcome
    contentPath: 00-welcome.md
  - title: GET - read state
    contentPath: 01-get.md
  - title: PUT - replace a sub-resource
    contentPath: 02-put-hair.md
  - title: PUT / DELETE on collection items
    contentPath: 03-accessories.md
  - title: PATCH - multi-field update
    contentPath: 04-patch.md
  - title: POST - trigger an action
    contentPath: 05-interactions.md

services:
  - id: app
    title: Character Lab
    url: http://localhost:3031
    icon: face
  - id: api
    title: Swagger UI
    url: http://localhost:8000/docs
    icon: api
```
Two service tabs surface alongside the IDE: Character Lab for the avatar UI, and Swagger UI for the auto-generated API documentation. The icons (face and api) come straight from Google's Material Symbols set.
The Swagger tab is one of the nice side-effects of using FastAPI - every endpoint, method, parameter and response schema gets documented automatically, and learners can explore everything beyond the lessons without leaving the lab:
[Screenshot: the Swagger UI tab showing the backend's auto-generated API documentation]
And here's the compose.override.yaml - the cleanest example I have of labspace-resource labels in the wild:
```yaml
services:
  backend:
    image: spurin/rest-labspace-backend:latest
    container_name: rest-labspace-backend
    ports:
      - "8000:8000"
    environment:
      CORS_ORIGINS: "*"
    labels:
      # Required for the Labspace host-port-republisher to expose this port
      # on the workspace's localhost (so curls in the IDE terminal hit
      # http://localhost:8000 successfully).
      labspace-resource: "true"

  frontend:
    image: spurin/rest-labspace-frontend:latest
    container_name: rest-labspace-frontend
    ports:
      # 3030 is reserved by the Labspace interface, so the Character Lab
      # publishes on 3031 and is surfaced as a tab via labspace.yaml.
      - "3031:80"
    depends_on:
      - backend
    labels:
      labspace-resource: "true"
```
That's effectively the entire integration surface area: published images, port mappings, labels.
What does it feel like to run? A learner clicks Open in Docker Desktop, lands on the welcome page, opens the IDE tab, opens a terminal and runs:
```bash
curl http://localhost:8000/api/character
```
A blob of JSON comes back describing the character's hair, face, accessories and clothing. They flip to the Character Lab tab and see the avatar that JSON describes. They run the next lesson's PATCH to change the hair colour. The avatar's hair changes. They keep going. By the end of the lab they've sent every kind of REST request that matters.
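Under the hood, a multi-field PATCH like that boils down to merging the request body into the state the backend holds. Here's a plain-Python model of that behaviour - the field names and merge rules are assumptions for illustration, not the actual FastAPI implementation from the repo:

```python
# In-memory character state, modelled on the lab's description; the real
# backend's schema may differ.
character = {
    "hair": {"style": "short", "color": "brown"},
    "accessories": ["glasses"],
}

def get_character():
    """GET /api/character - return the whole state as-is."""
    return character

def patch_character(changes):
    """PATCH - apply several fields in one request: nested dicts are
    merged shallowly, everything else is replaced outright."""
    for key, value in changes.items():
        if isinstance(value, dict) and isinstance(character.get(key), dict):
            character[key].update(value)
        else:
            character[key] = value
    return character

# One PATCH changes the hair colour while the style is left untouched.
patch_character({"hair": {"color": "blue"}})
print(character["hair"])  # {'style': 'short', 'color': 'blue'}
```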
The whole thing was small to build. The bulk of the work was the FastAPI backend and the React frontend - the parts that have nothing to do with Labspaces. The Labspace integration itself is two compose files, one manifest, and six markdown files.
Try it yourself
If you want to see it in action, the easiest path is the extension. Install Docker Desktop 4.10 or later, install the Labspaces extension from the Marketplace, browse dockersamples/awesome-labspaces, and pick one that interests you. Click in and you're learning.
If you want to skip the extension entirely, run rest-labspace directly from your terminal:
```bash
docker compose -f oci://spurin/rest-labspace up
```
Open http://localhost:3030 and the lab is yours.
If you want to author your own lab, start from dockersamples/labspace-starter. The README walks through the template repo flow, the local dev command, and the publish pipeline. The starter even ships with a Claude Code slash command (/labspace-author) that scaffolds an entire labspace from a one-line description - a nice on-ramp if you've got a workshop topic in mind but haven't sat down to write the structure yet.
Closing thoughts
The thing that excites me about Labspaces is what they imply about the future of technical learning. For years, the gap between "I read about this" and "I tried this" was the install friction - the Python version, the Docker version, the apt-get update, the corporate proxy. Labspaces close that gap by packaging the environment alongside the lesson and shipping both as a single thing.
If you teach, mentor, run workshops, or maintain an open source project that people want to learn, this format is worth your time. The skill investment is small (Compose plus Markdown) and the leverage is high (every learner gets the same environment, every time, with no setup). I built rest-labspace in an afternoon, and it gives new starters a friendly, hands-on introduction to the fundamentals of web API development.
Now is a great time to dive in. Pick a topic you'd love to teach, fork labspace-starter, and write your first lesson. The world needs more hands-on, no-setup, run-it-now learning material - and you're better placed than most to build it.
Good luck!