Space is back with SpaceKit
Resuming DevLog updates!
Hey folks, it’s been a while!
We paused after the February PoC to explore a few different ways forward, but we’ve come full circle. We are working on the first demo you’ll be able to try out, targeting September 🚀
You might be interested in my internal review of 12 months of R&D, which covers some of that gap.
What to Expect
As 3D creatures, we need a 3D interface. Two metaphors for that interface:
- A “theater” where humans (or machines) cue machines to act using motion and voice, as well as traditional digital events.
- A “spatial desktop” replacing hover-and-click with pointing and speaking a command (or gesturing).
To drive this, we are focused on a plug-and-play kit that transforms any room into a “theater” in minutes. You can then subscribe to events in your downstream apps and automations (a rough sketch of what that could look like follows the lists below).

Anticipated event types:
- collisions (with virtual zones, objects, or devices)
- proximity (same as above)
- human body poses
- body-scale gestures (point, wave, etc.)
- high-level events like “focus”
SpaceKit has three parts:
- a lightweight SDK
- a visual editor
- a depth camera
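To make the subscription model concrete, here is a minimal sketch of what listening for these events from the SDK could look like. The SDK is not released yet, so everything below is an assumption: the `spacekit` package name, the `connect` call, the event names, and the payload fields are illustrative placeholders for the event types listed above, not a real API.

```ts
// Hypothetical sketch only: the SpaceKit SDK has not shipped, so the package
// name, connect() signature, event names, and payload fields are assumptions.
import { connect } from "spacekit"; // assumed package name

async function main() {
  // Connect to a room that was set up as a "theater" in the visual editor.
  const room = await connect({ room: "living-room" });

  // Collision with a virtual zone, object, or device.
  room.on("collision", (e: { bodyId: string; targetId: string }) => {
    console.log(`body ${e.bodyId} collided with ${e.targetId}`);
  });

  // Proximity to the same kinds of targets.
  room.on("proximity", (e: { bodyId: string; targetId: string; distanceMeters: number }) => {
    console.log(`body ${e.bodyId} is ${e.distanceMeters.toFixed(2)} m from ${e.targetId}`);
  });

  // Body-scale gestures such as point or wave.
  room.on("gesture", (e: { kind: "point" | "wave"; targetId?: string }) => {
    if (e.kind === "point") {
      console.log(`pointing at ${e.targetId ?? "open space"}`);
    }
  });

  // Higher-level events like "focus".
  room.on("focus", (e: { bodyId: string; targetId: string }) => {
    console.log(`body ${e.bodyId} is focused on ${e.targetId}`);
  });
}

main().catch(console.error);
```

The intent, as described above, is that downstream apps and automations only ever deal with these high-level events, while the depth camera and pose tracking stay behind the SDK.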
We will be posting our devlog daily on social and doing weekly rollups and announcements as needed. Please feel free to drop any thoughts or reactions you have!
Weekly Progress Rollup
This week we kicked off development of SpaceKit and made significant progress.
| Software | Hardware |
| --- | --- |
| We are starting with a visual editor for now (a few basic wireframes) and expect to move to direct AR views later. | We worked on having the depth camera “look around”, but will downgrade it to a static peripheral and focus purely on the software SDK. |
We are looking for folks interested in early access to help shape the product as it evolves. Drop a reply if that sounds like you!
Till next week!
— Michael