Designs spatial UI layouts, prototypes hand gesture interactions, creates 3D menu systems, and plans gaze-based navigation for AR/VR/XR applications. Provides ergonomic placement guidelines, comfort zone thresholds, input latency budgets, and accessibility fallback patterns for immersive interfaces across platforms including Meta Quest, Apple Vision Pro, and WebXR. Use when the user asks about spatial computing interfaces, VR/AR UI design, mixed reality interactions, hand tracking UX, HUD design, holographic displays, headset UI, immersive environment layouts, 3D interface design, cockpit dashboards, or comfort-zone-compliant panel placement.
A spatial UI/UX design skill for AR/VR/XR environments. Covers ergonomic thresholds, input models, layout templates, and validated workflows for immersive interface design.

| Zone | Vertical Offset | Depth (from user) | Use Case |
|---|---|---|---|
| Primary UI | 0°–15° below eye level | 0.5 m – 2.0 m | HUDs, menus, dashboards |
| Secondary UI | 15°–35° below eye level | 1.0 m – 3.0 m | Contextual panels, tooltips |
| Peripheral | ±35°–60° horizontal | 1.5 m+ | Alerts, ambient indicators |
| Out-of-comfort | >35° below or >30° above | <0.4 m or >5.0 m | Avoid for sustained interaction |
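The comfort-zone table above can be expressed as a placement check. A minimal sketch in TypeScript (the function name and the `Zone` type are illustrative): it classifies a proposed vertical offset (positive degrees = below eye level) and depth against the table's primary and secondary bands, and conservatively flags everything else as out-of-comfort. Horizontal/peripheral arc checks are omitted for brevity.

```typescript
type Zone = "primary" | "secondary" | "out-of-comfort";

// Thresholds taken from the comfort-zone table:
// primary: 0-15 deg below eye level, 0.5-2.0 m; secondary: 15-35 deg, 1.0-3.0 m.
function classifyPlacement(verticalOffsetDeg: number, depthM: number): Zone {
  // Hard depth limits: closer than 0.4 m or farther than 5.0 m is never comfortable.
  if (depthM < 0.4 || depthM > 5.0) return "out-of-comfort";
  if (verticalOffsetDeg >= 0 && verticalOffsetDeg <= 15 && depthM >= 0.5 && depthM <= 2.0) {
    return "primary";
  }
  if (verticalOffsetDeg > 15 && verticalOffsetDeg <= 35 && depthM >= 1.0 && depthM <= 3.0) {
    return "secondary";
  }
  return "out-of-comfort";
}
```

Placements that fall outside both bands (for example, primary-zone angles at secondary-zone depths) are flagged out-of-comfort here; a production validator might warn rather than reject.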

| Input Type | Max Acceptable Latency | Notes |
|---|---|---|
| Direct hand touch | < 20 ms | Mismatch causes discomfort |
| Gaze + pinch select | < 50 ms | Visual feedback must be immediate |
| Controller raycast | < 80 ms | Pointer drift compounds at high latency |
| Voice command | < 300 ms | Confirm with visual acknowledgement |
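The latency budgets above can be enforced with a simple lookup. A hedged sketch in TypeScript (the key names and function are illustrative, not a real API):

```typescript
// Budgets (ms) from the latency table above; keys are illustrative names.
const LATENCY_BUDGET_MS: Record<string, number> = {
  "direct-touch": 20,
  "gaze-pinch": 50,
  "controller-raycast": 80,
  "voice": 300,
};

// Returns true when a measured motion-to-photon latency fits the budget.
function withinBudget(inputType: string, measuredMs: number): boolean {
  const budget = LATENCY_BUDGET_MS[inputType];
  if (budget === undefined) throw new Error(`unknown input type: ${inputType}`);
  return measuredMs < budget;
}
```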
Input fallback chain (primary → secondary → accessibility fallback):

hand gesture → gaze + pinch → controller raycast → voice command

Example world-anchored panel configuration:

```json
{
  "panel": {
    "anchorType": "world",
    "position": { "x": 0, "y": -0.15, "z": 1.2 },
    "rotation": { "pitch": 10, "yaw": 0, "roll": 0 },
    "size": { "width": 0.6, "height": 0.4, "unit": "meters" },
    "billboard": "yaw-only",
    "depthClamp": { "min": 0.5, "max": 3.0 },
    "inputSurface": "front-only",
    "backplateOpacity": 0.85
  }
}
```

Example cockpit-style dashboard layout:

```json
{
  "cockpit": {
    "anchorType": "head-locked",
    "primaryZone": { "yOffset": -0.12, "depth": 1.0, "arcWidth": 60 },
    "secondaryZone": { "yOffset": -0.28, "depth": 1.5, "arcWidth": 100 },
    "peripheralAlerts": { "yOffset": 0.05, "arcWidth": 120, "opacity": 0.6 }
  }
}
```

Unity (C#) comfort-zone panel with smooth follow and a yaw-only billboard:

```csharp
using UnityEngine;

public class ComfortZonePanel : MonoBehaviour
{
    [Header("Placement Settings")]
    public float depth = 1.2f;              // meters in front of camera
    public float verticalOffsetDeg = -10f;  // negative = below eye level
    public float smoothSpeed = 5f;

    private Transform _cam;

    void Start() => _cam = Camera.main.transform;

    void Update()
    {
        if (_cam == null) return;

        // Flatten the camera's forward vector so head pitch doesn't drag the panel.
        Vector3 forward = _cam.forward;
        forward.y = 0f;
        forward.Normalize();

        // Target position: depth ahead, vertically offset from eye level.
        // Vector3.up * tan(negative angle) places the panel below the gaze line.
        Vector3 targetPos = _cam.position
            + forward * depth
            + Vector3.up * Mathf.Tan(verticalOffsetDeg * Mathf.Deg2Rad) * depth;

        // Smooth follow (yaw-only billboard).
        transform.position = Vector3.Lerp(transform.position, targetPos, Time.deltaTime * smoothSpeed);

        Vector3 lookDir = transform.position - _cam.position;
        lookDir.y = 0f;
        if (lookDir.sqrMagnitude > 0.0001f)
            transform.rotation = Quaternion.LookRotation(lookDir);
    }
}
```

WebXR (three.js) equivalent; assumes `THREE` is imported and `panel` is an `Object3D`:

```javascript
function placeComfortPanel(panel, camera, depthM = 1.2, vertOffsetDeg = -10) {
  // Negative angle yields a negative offset, placing the panel below eye level.
  const offset = Math.tan((vertOffsetDeg * Math.PI) / 180) * depthM;
  panel.position.copy(camera.position)
    .addScaledVector(camera.getWorldDirection(new THREE.Vector3()), depthM);
  panel.position.y += offset;
  panel.lookAt(camera.position); // billboard toward the user
}
```

| Constraint | Recommended Adaptation |
|---|---|
| No hand tracking | Gaze dwell + controller confirm |
| Limited head mobility | Expand interaction arc; use voice trigger |
| Low vision | Increase angular font size to 0.7°+; high-contrast backplate |
| Single-hand use | All controls reachable one-handed; no pinch-hold combos |
| Motion sensitivity | Reduce UI animation; prefer instant transitions over smooth follow |
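The "0.7°+ angular font size" guideline for low vision converts to a world-space text height via h = 2·d·tan(θ/2). A small sketch in TypeScript (function name is illustrative):

```typescript
// Minimum world-space text height (meters) for a target angular size
// at a given viewing distance: h = 2 * d * tan(theta / 2).
function minTextHeightM(angularDeg: number, distanceM: number): number {
  const theta = (angularDeg * Math.PI) / 180;
  return 2 * distanceM * Math.tan(theta / 2);
}
```

For example, at the 1.2 m primary-zone depth used in the panel configs above, a 0.7° glyph needs to be roughly 15 mm tall in world space.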