See Through Walls with a $9 Microcontroller



Deploying WiFi DensePose on Kubernetes.

How I deployed a real-time WiFi-based human sensing system on a homelab K3s cluster with a $9 ESP32-S3, live pose estimation, and RTSP camera fusion.

WiFi signals pass through walls. When a person moves — or even breathes — those signals scatter differently. What if you could read that scattering pattern and reconstruct what happened on the other side?

That's exactly what RuView does. Built on research from Carnegie Mellon's DensePose From WiFi paper, RuView is an open-source edge AI system that turns commodity WiFi signals into real-time human pose estimation, vital sign monitoring, and presence detection — all without a single pixel of video.

I took it a step further: deployed it on Kubernetes, wired up a live ESP32-S3 sensor, and fused the WiFi signal data with an RTSP camera feed for dual-modal pose estimation. Here's how.


The Hardware: $9 and a WiFi Router

The entire sensing hardware cost me $9:

  • 1x ESP32-S3 ($9) — a dual-core microcontroller with WiFi that exposes Channel State Information (CSI). CSI gives you per-subcarrier amplitude and phase data — 56+ data points per WiFi frame, 20 times per second. That's the raw material for sensing.

Standard consumer WiFi only gives you RSSI (a single signal strength number). CSI is like going from a thermometer to a thermal camera — instead of one number, you get a detailed map of how the signal is being affected by everything in the room.
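To make that concrete, here's a minimal sketch of turning a raw CSI buffer into per-subcarrier amplitude and phase. It assumes the common ESP-IDF convention of interleaved signed-byte [imaginary, real] pairs, one pair per subcarrier — check your firmware's actual frame layout before relying on it:

```python
import numpy as np

def csi_to_amp_phase(raw: bytes) -> tuple[np.ndarray, np.ndarray]:
    """Convert a raw CSI buffer into per-subcarrier amplitude and phase.

    Assumes interleaved signed-byte [imaginary, real] pairs (the usual
    ESP-IDF layout); adjust if your firmware packs frames differently.
    """
    vals = np.frombuffer(raw, dtype=np.int8).astype(np.float64)
    imag, real = vals[0::2], vals[1::2]
    csi = real + 1j * imag
    return np.abs(csi), np.angle(csi)

# Example: 64 subcarriers -> 128 bytes of I/Q data
amp, phase = csi_to_amp_phase(bytes(range(128)))
print(amp.shape)  # (64,)
```

Each 20 Hz frame then yields a 64-point amplitude vector and a 64-point phase vector — the "thermal camera" view of the room.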


I also had three ESP32-C3s sitting around, but those are single-core RISC-V chips that can't handle the DSP pipeline. The S3's dual-core Xtensa is required — one core captures CSI interrupts while the other runs signal processing.


The Software Stack

RuView's Rust sensing server processes the signal chain:

terminal
ESP32 CSI (UDP) → Hampel outlier rejection → SpotFi phase correction
    → Fresnel zone modeling → FFT vital sign extraction
    → AI backbone (RuVector attention networks)
    → 17 body keypoints + breathing rate + heart rate + presence

At 54,000 frames/sec throughput in Rust, this is fast enough to process live data from multiple sensors with headroom to spare. The server exposes a REST API, WebSocket stream, and a full browser UI.
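The first stage, Hampel outlier rejection, is simple enough to sketch. This is a generic Python illustration of the filter (not RuView's Rust implementation): a sample is replaced by its local median when it deviates by more than a few scaled MADs:

```python
import numpy as np

def hampel(x: np.ndarray, window: int = 5, n_sigmas: float = 3.0) -> np.ndarray:
    """Replace outliers with the local median (Hampel identifier).

    A point is flagged when it deviates from the median of its sliding
    window by more than n_sigmas scaled median absolute deviations.
    """
    k = 1.4826  # relates MAD to standard deviation for Gaussian data
    y = x.copy()
    for i in range(len(x)):
        lo, hi = max(0, i - window), min(len(x), i + window + 1)
        med = np.median(x[lo:hi])
        mad = k * np.median(np.abs(x[lo:hi] - med))
        if mad > 0 and abs(x[i] - med) > n_sigmas * mad:
            y[i] = med
    return y

clean = hampel(np.array([1.0, 1.1, 0.9, 50.0, 1.0, 1.05, 0.95]))
print(clean)  # the 50.0 spike is replaced by the local median
```

On noisy CSI amplitudes, this suppresses the single-frame glitches that would otherwise dominate the FFT stages downstream.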


Flashing the ESP32-S3

The firmware ships as pre-built binaries in GitHub Releases. Flashing takes 30 seconds:

Bash
pip install esptool
python -m esptool --chip esp32s3 --port COM7 --baud 460800 \
  write_flash --flash_mode dio --flash_size 8MB \
  0x0 bootloader.bin \
  0x8000 partition-table.bin \
  0xd000 ota_data_initial.bin \
  0x10000 esp32-csi-node.bin

Then provision it with your WiFi credentials and the IP of your server:

Bash
python provision.py --port COM7 \
  --ssid "MyWiFi" --password "secret" \
  --target-ip 192.168.1.9

The ESP32 connects to WiFi and starts streaming CSI frames over UDP to port 5005. No internet needed after provisioning — everything stays local.
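To sanity-check the stream before involving the full server, a few lines of Python can bind the same UDP port and pull frames — a hypothetical debugging helper, not part of RuView:

```python
import socket

def open_csi_socket(host: str = "0.0.0.0", port: int = 5005,
                    timeout: float = 5.0) -> socket.socket:
    """Bind a UDP socket on the port the ESP32 streams CSI frames to."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    sock.settimeout(timeout)
    return sock

def next_frame(sock: socket.socket, bufsize: int = 2048) -> tuple[bytes, str]:
    """Block until one CSI frame (one UDP datagram) arrives."""
    frame, addr = sock.recvfrom(bufsize)
    return frame, addr[0]
```

If `next_frame(open_csi_socket())` times out, the ESP32 is likely provisioned with the wrong `--target-ip`; each datagram carries one CSI frame, and parsing the payload is firmware-specific.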


Containerizing for Kubernetes

The project includes a multi-stage Dockerfile that compiles the Rust server and bundles the UI into a minimal Debian image:

Dockerfile
FROM rust:1.85-bookworm AS builder
WORKDIR /build
COPY rust-port/wifi-densepose-rs/ ./
RUN cargo build --release -p wifi-densepose-sensing-server \
    && strip target/release/sensing-server

FROM debian:bookworm-slim
COPY --from=builder /build/target/release/sensing-server /app/
COPY ui/ /app/ui/
EXPOSE 3000/tcp 3001/tcp 5005/udp
CMD ["/app/sensing-server", "--source", "auto", "--ui-path", "/app/ui", "--bind-addr", "0.0.0.0"]

Built and pushed to GHCR:

Bash
docker build -f docker/Dockerfile.rust -t ghcr.io/zektopic/ruview:k8s-poc .
docker push ghcr.io/zektopic/ruview:k8s-poc

The K8s Deployment

The deployment has one unusual requirement: UDP hostPort. The ESP32 sends raw CSI frames to a specific IP:port, so the pod needs to receive those packets directly on the host's network interface without kube-proxy NAT:

YAML
ports:
- containerPort: 5005
  protocol: UDP
  hostPort: 5005    # ESP32 sends directly to host

This means the pod must be pinned to a specific node (nodeName) — if it moves, the ESP32 would be sending to the wrong IP. For a homelab this is fine; for production you'd use a DaemonSet or a LoadBalancer with UDP support.
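In pod-spec terms, the pinning plus hostPort looks roughly like this (the node name and container name are illustrative, not taken from the project's manifests):

```yaml
spec:
  nodeName: k3s-node-1        # illustrative; pin to the node whose IP the ESP32 targets
  containers:
  - name: sensing-server
    image: ghcr.io/zektopic/ruview:k8s-poc
    ports:
    - containerPort: 5005
      protocol: UDP
      hostPort: 5005          # ESP32 sends directly to this host
```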

The HTTP API and WebSocket get standard NodePort services:

YAML
type: NodePort
ports:
- name: http
  port: 3000
  nodePort: 30900

An nginx reverse proxy ties it all together, handling WebSocket upgrades with proper timeout settings so the live data stream doesn't drop.
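The relevant nginx directives are the standard WebSocket-upgrade pair plus long read/send timeouts — sketched here with an illustrative upstream port, not the project's actual config:

```nginx
location /ws/ {
    proxy_pass http://127.0.0.1:30901;       # WebSocket NodePort (illustrative)
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;  # required for the WS handshake
    proxy_set_header Connection "upgrade";
    proxy_read_timeout 3600s;                # keep the live CSI stream open
    proxy_send_timeout 3600s;
}
```

Without the `Upgrade`/`Connection` headers the handshake fails outright; without the long timeouts nginx silently drops idle streams after its default 60 seconds.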


What It Looks Like

The Observatory UI is the star — a cinematic Three.js dashboard with five holographic panels:

  • Subcarrier Manifold — Live heatmap of all 56+ WiFi subcarriers, showing frequency-selective effects.

  • Vital Signs Oracle — Breathing rate (6-30 BPM) and heart rate (40-120 BPM) from phase variations.

  • Presence Heatmap — Room-level signal field showing where people are located.

  • Phase Constellation — Complex-plane plot of CSI phase, revealing movement patterns.

  • Convergence Engine — Signal processing pipeline metrics and overall health.

The Pose Fusion view goes further — it overlays WiFi-derived pose estimation onto a live camera feed. I connected my RTSP camera through Frigate's go2rtc, which already handles RTSP-to-HLS transcoding. The browser loads the HLS stream alongside the CSI data, and the fusion engine cross-correlates video motion with WiFi signal changes.
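As a toy illustration of the cross-correlation idea (RuView's actual fusion engine is more involved), here's how a lag between a video motion-energy series and a CSI motion-energy series could be estimated, assuming both are resampled onto a common clock:

```python
import numpy as np

def fusion_lag(video_motion: np.ndarray, csi_motion: np.ndarray) -> int:
    """Estimate the offset (in samples) between camera motion energy
    and CSI motion energy via normalized cross-correlation.

    A stand-in sketch, not RuView's fusion engine; assumes both
    signals share a sampling clock.
    """
    v = (video_motion - video_motion.mean()) / (video_motion.std() + 1e-9)
    c = (csi_motion - csi_motion.mean()) / (csi_motion.std() + 1e-9)
    corr = np.correlate(v, c, mode="full")
    return int(np.argmax(corr)) - (len(c) - 1)

# A video burst that leads the CSI burst by 3 samples:
t = np.zeros(100)
t[40] = 1.0
lag = fusion_lag(np.roll(t, 3), t)
print(lag)  # 3
```

A consistent, stable lag between the two modalities is good evidence the WiFi pipeline is tracking the same motion the camera sees.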


Real Data, Real Results

With the ESP32-S3 powered on and placed in my office, the system immediately detected:

  • Presence: true with confidence ~0.78
  • Motion level: transitions between present_moving, present_still, and active
  • Person count: 1 (estimated from CSI subcarrier patterns)
  • 64 subcarriers streaming at 20 Hz

All through the wall, with no camera in the room.

JSON Response
{
  "classification": {
    "confidence": 0.78,
    "motion_level": "present_moving",
    "presence": true
  },
  "estimated_persons": 1,
  "features": {
    "breathing_band_power": 34.27,
    "motion_band_power": 61.17,
    "spectral_power": 158.92
  }
}
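Features like breathing_band_power can be understood as band-limited spectral power over a window of CSI samples. A minimal sketch, assuming the 20 Hz frame rate mentioned above and a 0.1-0.5 Hz breathing band (6-30 BPM) — RuView's actual feature extraction may differ:

```python
import numpy as np

def band_power(signal: np.ndarray, fs: float, f_lo: float, f_hi: float) -> float:
    """Total spectral power of `signal` between f_lo and f_hi (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return float(spectrum[mask].sum())

fs = 20.0                              # CSI frames arrive at 20 Hz
t = np.arange(0, 30, 1 / fs)           # a 30-second window
breath = np.sin(2 * np.pi * 0.25 * t)  # 15 BPM "breathing" tone
print(band_power(breath, fs, 0.1, 0.5) > band_power(breath, fs, 0.7, 2.0))  # True
```

The same windowed FFT with different band edges yields motion and heart-rate features; comparing the bands is what separates a still, breathing person from an empty room.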

Privacy by Design

This is the compelling part. There is no camera in the sensing loop. The ESP32 captures WiFi signal disturbances — amplitude and phase changes caused by human bodies scattering radio waves.

There are no images, no video frames, no biometric data stored. The "sensing" is fundamentally different from surveillance.

For applications like elderly care monitoring, hospital patient tracking, or smart building occupancy — where cameras raise serious privacy and regulatory concerns — WiFi sensing sidesteps the problem entirely.


What's Next

  • Multi-node mesh — Adding 3-6 ESP32-S3 nodes for full 360-degree room coverage with multistatic fusion.

  • ESP32-C6 + mmWave — Pairing the C6 with a Seeed MR60BHA2 60 GHz sensor for clinical-grade vital signs.

  • Edge WASM modules — Running the 65 implemented edge intelligence modules (fall detection, sleep monitoring) directly on the ESP32 as tiny WASM binaries with zero cloud dependency.

  • Training pipeline — Recording labeled CSI sessions to train the adaptive classifier for room-specific signal characteristics.


Try It Yourself

The fastest path to a working system:

Bash
# 1. Docker (simulated data, no hardware)
docker run -p 3000:3000 ghcr.io/zektopic/ruview:k8s-poc
# Open http://localhost:3000/ui/

# 2. With ESP32-S3 hardware (~$9)
# Flash firmware, provision WiFi, run server with --source auto

# 3. Full K8s deployment
# See the deployment guide for complete instructions

The entire system — firmware, server, UI, signal processing, neural networks — is open source under MIT. One $9 microcontroller and some WiFi signals. That's all it takes to give a room spatial awareness.
