🏛️ Open Metaverse Hackathon · March 7–8, 2026 · Frontier Tower, SF
🏆 PLACES Track Entry

ROOM

A semantic memory layer for the spatial internet.
Places hold memory. Multiple knowledge graphs coexist without collapsing.

Gaussian splats give you the Visual. ROOM gives you the Meaning.

🚀 Enter Frontier Tower ✏️ Open Editor

Where ROOM lives

Spatial environments have four layers of information. ROOM operates at the foundational Semantic layer — giving meaning to coordinates.

🎨 Visual
Pixels, textures, photorealistic rendering
📐 Geometric
3D shape, volume, spatial relationships
⚙️ Physical
Rules, dynamics, collision, forces
🧠 Semantic ← ROOM
Meaning, memory, knowledge graphs
ROOM adds the layer that Gaussian splats are missing — what a place means, not just what it looks like.

From ghost world to semantic world

Beautiful splats have no memory. ROOM gives them one.

👻 The Ghost World

Splats without meaning

Photorealistic Gaussian splats create stunning visual twins. But underneath:

  • No real surface topology
  • No collision or physics
  • No memory of events
  • No way to annotate or search
  • No knowledge graph
🧠 ROOM's Answer

Semantic memory layer

Instead of fighting the ghost world with collision meshes, give it memory:

  • Coordinates become Place nodes
  • Happenings become Event nodes
  • Multiple viewpoints coexist as Perspectives
  • Outputs crystallize as Artifacts
  • All connected in a flat JSON graph
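As a hedged sketch of what that flat JSON graph could look like (field names here are illustrative, not a published ROOM schema), built in Python so the structure is explicit:

```python
import json

# A minimal, illustrative world.json: every node lives in one flat list
# and links to other nodes by id. All field names are hypothetical.
world = {
    "nodes": [
        {"id": "place-1", "type": "Place", "position": [0.0, 0.0, 0.0],
         "label": "Main Lobby"},
        {"id": "event-1", "type": "Event", "place": "place-1",
         "time": "2026-03-07T10:00:00Z", "label": "Hackathon kickoff"},
        {"id": "persp-1", "type": "Perspective", "event": "event-1",
         "ontology": "social", "content": "Builders gathering at the threshold."},
        {"id": "artifact-1", "type": "Artifact", "from": ["persp-1"],
         "kind": "document", "uri": "notes/kickoff.md"},
    ]
}

print(json.dumps(world, indent=2))
```

Because the graph is one flat list, any node can be looked up, linked, or shipped to another ROOM instance without walking a nested hierarchy.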

Memory has structure

Four atomic types mirroring how the hippocampus encodes spatial memory.

📍
Place
A spatial coordinate or region. The irreducible ground. Like hippocampal place cells — firing when you occupy a specific location.
Event
A happening anchored to a Place with a timestamp. The basic unit of spatial memory. Events are what places remember.
👁️
Perspective
A viewpoint on an Event. Multiple Perspectives coexist — architectural, social, experiential. The same place through different eyes.
📄
Artifact
A crystallized output — document, media, insight. Produced from Perspectives. Portable across ROOM instances.
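The four types above chain together: Places anchor Events, Events carry Perspectives, Perspectives crystallize into Artifacts. A minimal sketch of that shape (hypothetical definitions, not ROOM's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class Place:
    id: str
    position: tuple  # (x, y, z) — the irreducible ground

@dataclass
class Event:
    id: str
    place_id: str    # every Event is anchored to exactly one Place
    time: str        # ISO-8601 timestamp

@dataclass
class Perspective:
    id: str
    event_id: str    # a viewpoint on one Event; many may coexist
    ontology: str    # e.g. "architectural", "social", "mythological"
    content: str = ""

@dataclass
class Artifact:
    id: str
    perspective_ids: list = field(default_factory=list)  # produced from Perspectives
    uri: str = ""    # portable output: document, media, insight
```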

From capture to memory

Four steps to give a place its first memory.

1

📸 Capture

Scan a space with Polycam. Export as .ply Gaussian splat. The building gets a digital twin.

2

📌 Anchor

Click a coordinate in the viewer. A new Event node is written to world.json with position and timestamp. The place begins to remember.
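Behind the click, the anchor step might look like this sketch (file layout and field names are assumptions, not ROOM's actual code):

```python
import json
import pathlib
from datetime import datetime, timezone

def anchor_event(world_path: str, place_id: str, position: list) -> dict:
    """Append a new Event node to world.json at the clicked position."""
    path = pathlib.Path(world_path)
    world = json.loads(path.read_text()) if path.exists() else {"nodes": []}
    event = {
        "id": f"event-{len(world['nodes']) + 1}",
        "type": "Event",
        "place": place_id,
        "position": position,
        "time": datetime.now(timezone.utc).isoformat(),
    }
    world["nodes"].append(event)
    path.write_text(json.dumps(world, indent=2))
    return event
```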

3

🏷️ Annotate

Add Perspectives. Set an ontology tag, visibility, content. Multiple viewpoints on the same event — architectural, social, mythological — all coexist.
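The annotate step could attach Perspective nodes to an existing Event the same way (a sketch with illustrative field names):

```python
# A tiny world with one anchored Event, as produced by the previous step.
world = {"nodes": [{"id": "event-1", "type": "Event", "place": "place-1",
                    "time": "2026-03-07T10:00:00Z"}]}

def add_perspective(world: dict, event_id: str, ontology: str,
                    content: str, visibility: str = "public") -> dict:
    """Attach a Perspective to an Event; multiple viewpoints coexist."""
    persp = {
        "id": f"persp-{len(world['nodes']) + 1}",
        "type": "Perspective",
        "event": event_id,
        "ontology": ontology,      # e.g. "architectural", "social"
        "visibility": visibility,  # who can see this viewpoint
        "content": content,
    }
    world["nodes"].append(persp)
    return persp

add_perspective(world, "event-1", "architectural",
                "The threshold frames the street beyond.")
add_perspective(world, "event-1", "social",
                "Where strangers become collaborators.")
```

Nothing is overwritten: each call adds another viewpoint on the same event, so the architectural and social readings live side by side in the graph.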

4

🚶 Tour

Walk the guided narrative. Nodes surface as waypoints. Toggle perspectives to see the same space through different eyes.

Four spatial memories

Polycam captures of the hackathon venue. Each becomes a Place node — a root for events, perspectives, and artifacts.

🏛️
Exterior
The street-level approach. Where arrival begins.
🚪
Main Lobby
The threshold between the street and the spatial internet.
2nd Floor
The hackathon space. Where builders gather and creation hums.
🛸
??? Secret Room
Find it to unlock it. 👽

Interoperable by design

ROOM connects to the open metaverse ecosystem. No proprietary lock-in.

📡
arrival.space
Gaussian splat hosting + streaming
🧠
Claudesidian
Obsidian → ROOM perspective bridge
📷
Polycam
Spatial capture → .ply export
🌐
RP1 / MSF
Open metaverse standards