A downloadable project for Android

SAM Guide is an experimental Android application exploring continuous spatial awareness, context-sensitive AI behavior, and human-like conversational restraint using on-device sensors, camera input, and a multimodal generative model.

SAM behaves like a quiet but perceptive companion:

  • It notices the environment through the camera and sensors
  • Builds a rolling spatial and behavioral model
  • Speaks only when there’s a real reason
  • Adapts how often it initiates based on how the user responds
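
The adaptation in the last bullet can be pictured as a simple cooldown that grows when SAM is ignored and shrinks when the user engages. The class and thresholds below are hypothetical, not taken from the app's code:

```java
// Hypothetical sketch of an adaptive initiation policy: back off when the
// user ignores SAM, re-engage when they respond. Names and constants are
// illustrative only.
class InitiationPolicy {
    private static final int MIN_INTERVAL_SEC = 30;
    private static final int MAX_INTERVAL_SEC = 600;
    private int intervalSec = 120;

    // Called after each proactive utterance with whether the user replied.
    void onUserResponse(boolean responded) {
        intervalSec = responded
                ? Math.max(MIN_INTERVAL_SEC, intervalSec / 2)  // engage more often
                : Math.min(MAX_INTERVAL_SEC, intervalSec * 2); // back off
    }

    // Speak proactively only once the cooldown has elapsed.
    boolean shouldInitiate(int secondsSinceLast) {
        return secondsSinceLast >= intervalSec;
    }

    int intervalSec() { return intervalSec; }
}
```

Exponential back-off with a hard ceiling keeps a persistently ignored assistant quiet without silencing it permanently.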

This project is a research/demo prototype intended for competitions and evaluation, not a consumer release.

Key Features

  • 📷 Camera-based spatial awareness (CameraX + Camera2 interop)
  • 📐 Camera intrinsics provided to the AI for better relative distance reasoning
  • 🧭 Sensor fusion (accelerometer, gyroscope, rotation, light, proximity, etc.)
  • 📍 Location-aware snapshots with urgency detection
  • 🧠 Rolling context window + state summarization to stay responsive
  • 🗣️ Speech with restraint (TTS + continuous STT, silence by default)
  • 🧩 Human-like conversational rules (no nagging, no follow-ups, no filler)
  • 🎛️ Live HUD for sensors and AI context (useful for demos and recordings)
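
One common way to implement a rolling context window with state summarization (the app's actual mechanism is not documented here) is to keep the most recent snapshots verbatim and fold evicted ones into a compact summary, so the context handed to the model stays bounded:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.StringJoiner;

// Illustrative sketch, not the app's actual code: recent snapshots are kept
// verbatim; older ones are folded into a summary as they age out.
class RollingContext {
    private final int capacity;
    private final Deque<String> window = new ArrayDeque<>();
    private final StringBuilder summary = new StringBuilder();

    RollingContext(int capacity) { this.capacity = capacity; }

    void add(String snapshot) {
        if (window.size() == capacity) {
            // In a real system this fold would be a summarization step; here
            // we simply keep a truncated form of the evicted snapshot.
            String evicted = window.removeFirst();
            summary.append(evicted, 0, Math.min(40, evicted.length())).append("; ");
        }
        window.addLast(snapshot);
    }

    // Context handed to the model: old-state summary plus recent detail.
    String prompt() {
        StringJoiner recent = new StringJoiner(" | ");
        window.forEach(recent::add);
        return "Summary: " + summary + "\nRecent: " + recent;
    }

    int windowSize() { return window.size(); }
}
```

The point of the split is latency: the model always sees a fixed-size prompt, no matter how long the session runs.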

Published: 7 days ago
Status: Prototype
Category: Other
Platforms: Android
Author: Uralstech

Download

SAM-Guide_Demo.apk (14 MB)
GitHub Repository (external link)

Install instructions

The app is distributed as an APK for installation on Android devices. On first launch:

1. Add your Gemini API key

The app will prompt you to enter:

  • Speech recognition language
  • Snapshot interval
  • Gemini API key

The key is encrypted on-device using the Android Keystore (AES-GCM) and never stored in plaintext.
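
For reference, the AES-GCM round trip described above looks roughly like the sketch below. On Android the SecretKey would be generated in and retrieved from the "AndroidKeyStore" provider so the key material never leaves secure hardware; this self-contained example substitutes an ordinary in-memory JVM key:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

// AES-GCM sketch. On a device, replace generateKey() with a key created via
// KeyGenParameterSpec against the AndroidKeyStore provider.
class KeyCrypto {
    static SecretKey generateKey() throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        return kg.generateKey();
    }

    // Returns {iv, ciphertext}. The cipher generates a fresh IV for every
    // encryption; it must be stored alongside the ciphertext.
    static byte[][] encrypt(SecretKey key, byte[] plaintext) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key);
        return new byte[][] { cipher.getIV(), cipher.doFinal(plaintext) };
    }

    static byte[] decrypt(SecretKey key, byte[] iv, byte[] ciphertext) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        return cipher.doFinal(ciphertext);
    }
}
```

GCM authenticates as well as encrypts, so a tampered ciphertext fails decryption instead of silently yielding a corrupted key.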

2. Grant permissions on first run

SAM requires:

  • Camera
  • Microphone
  • Location

The camera drives spatial awareness, the microphone enables continuous speech recognition, and location powers the location-aware snapshots.
