
π RuView

RuView - WiFi DensePose

Beta Software — Under active development. APIs and firmware may change. Known limitations:

  • ESP32-C3 and original ESP32 are not supported (single-core, insufficient for CSI DSP)
  • Single ESP32 deployments have limited spatial resolution — use 2+ nodes or add a Cognitum Seed for best results
  • Camera-free pose accuracy is limited — use camera ground-truth training for 92.9% PCK@20

Contributions and bug reports welcome at Issues.

See through walls with WiFi

Turn ordinary WiFi into a sensing system. Detect people, measure breathing and heart rate, track movement, and monitor rooms — through walls, in the dark, with no cameras or wearables. Just physics.

π RuView is a WiFi sensing platform that turns radio signals into spatial intelligence.

Every WiFi router already fills your space with radio waves. When people move, breathe, or even sit still, they disturb those waves in measurable ways. RuView captures these disturbances using Channel State Information (CSI) from low-cost ESP32 sensors and turns them into actionable data: who's there, what they're doing, and whether they're okay.

What it senses:

  • Presence and occupancy — detect people through walls, count them, track entries and exits
  • Vital signs — breathing rate and heart rate, contactless, while sleeping or sitting
  • Activity recognition — walking, sitting, gestures, falls — from temporal CSI patterns
  • Environment mapping — RF fingerprinting identifies rooms, detects moved furniture, spots new objects
  • Sleep quality — overnight monitoring with sleep stage classification and apnea screening

Built on RuVector and Cognitum Seed, RuView runs entirely on edge hardware — an ESP32 mesh (as low as $9 per node) paired with a Cognitum Seed for persistent memory, cryptographic attestation, and AI integration. No cloud, no cameras, no internet required.

The system learns each environment locally using spiking neural networks that adapt in under 30 seconds, with multi-frequency mesh scanning across 6 WiFi channels that uses your neighbors' routers as free radar illuminators. Every measurement is cryptographically attested via an Ed25519 witness chain.
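The witness chain can be pictured as a hash-linked, append-only log: each entry commits to the hash of the previous one, so any later tampering breaks verification. A minimal sketch in Python — the real system also signs each entry with Ed25519, which is omitted here; the field names are illustrative:

```python
import hashlib

def witness_append(chain, measurement: bytes):
    """Append-only witness chain: each entry commits to the previous
    entry's hash, so any later tampering is detectable.
    (The real chain also Ed25519-signs each entry; omitted here.)"""
    prev = chain[-1]["hash"] if chain else b"\x00" * 32
    h = hashlib.sha256(prev + measurement).digest()
    chain.append({"data": measurement, "hash": h})
    return chain

chain = []
witness_append(chain, b"csi-frame-1")
witness_append(chain, b"csi-frame-2")

# Verify: recompute the chain from the start and compare hashes
prev = b"\x00" * 32
ok = True
for entry in chain:
    h = hashlib.sha256(prev + entry["data"]).digest()
    ok = ok and h == entry["hash"]
    prev = h
print(ok)  # → True
```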

RuView also supports pose estimation (17 COCO keypoints via the WiFlow architecture), trained entirely without cameras using 10 sensor signals — a technique pioneered from the original DensePose From WiFi research at Carnegie Mellon University.

Built for low-power edge applications

Edge modules are small programs that run directly on the ESP32 sensor — no internet needed, no cloud fees, instant response.

Rust 1.85+ · License: MIT · Tests: 1463 · Docker: multi-arch · Vital Signs · ESP32 Ready · crates.io

| What | How | Speed |
|---|---|---|
| 🦴 Pose estimation | CSI subcarrier amplitude/phase → 17 COCO keypoints | 171K emb/s (M4 Pro) |
| 🫁 Breathing detection | Bandpass 0.1-0.5 Hz → zero-crossing BPM | 6-30 BPM |
| 💓 Heart rate | Bandpass 0.8-2.0 Hz → zero-crossing BPM | 40-120 BPM |
| 👤 Presence sensing | Trained model + PIR fusion — 100% accuracy | 0.012 ms latency |
| 🧱 Through-wall | Fresnel zone geometry + multipath modeling | Up to 5m depth |
| 🧠 Edge intelligence | 8-dim feature vectors + RVF store on Cognitum Seed | $140 total BOM |
| 🎯 Camera-free training | 10 sensor signals, no labels needed | 84s on M4 Pro |
| 📷 Camera-supervised training | MediaPipe + ESP32 CSI → 92.9% PCK@20 | 19 min on laptop |
| 📡 Multi-frequency mesh | Channel hopping across 6 bands, neighbor APs as illuminators | 3x sensing bandwidth |
| 🌐 3D point cloud (optional fusion) | Camera depth (MiDaS) + WiFi CSI + mmWave radar → unified spatial model | 22 ms pipeline · 19K+ points/frame |
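The bandpass → zero-crossing rows above reduce to a simple idea: filter the signal to the band of interest, then count sign changes — each full cycle crosses zero twice. A minimal sketch with a synthetic sine standing in for the bandpassed CSI amplitude (the real pipeline filters live CSI first):

```python
import math

def zero_crossing_bpm(samples, fs):
    """Estimate rate (breaths or beats per minute) from a bandpassed
    signal by counting sign changes: each full cycle has 2 crossings."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if a < 0 <= b or b < 0 <= a
    )
    duration_min = len(samples) / fs / 60.0
    return crossings / 2.0 / duration_min

# Synthetic breathing waveform: 0.25 Hz (15 breaths/min), 60 s at 20 Hz
fs = 20.0
sig = [math.sin(2 * math.pi * 0.25 * n / fs + 0.1) for n in range(int(60 * fs))]
print(zero_crossing_bpm(sig, fs))  # → 15.0
```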
```bash
# Option 1: Docker (simulated data, no hardware needed)
docker pull ruvnet/wifi-densepose:latest
docker run -p 3000:3000 ruvnet/wifi-densepose:latest
# Open http://localhost:3000

# Option 2: Live sensing with ESP32-S3 hardware ($9)
# Flash firmware, provision WiFi, and start sensing:
python -m esptool --chip esp32s3 --port COM9 --baud 460800 \
  write_flash 0x0 bootloader.bin 0x8000 partition-table.bin \
  0xf000 ota_data_initial.bin 0x20000 esp32-csi-node.bin
python firmware/esp32-csi-node/provision.py --port COM9 \
  --ssid "YourWiFi" --password "secret" --target-ip 192.168.1.20

# Option 3: Full system with Cognitum Seed ($140)
# ESP32 streams CSI → bridge forwards to Seed for persistent storage + kNN + witness chain
node scripts/rf-scan.js --port 5006                 # Live RF room scan
node scripts/snn-csi-processor.js --port 5006       # SNN real-time learning
node scripts/mincut-person-counter.js --port 5006   # Min-cut person counting
```

[!NOTE] CSI-capable hardware recommended. Presence, vital signs, through-wall sensing, and all advanced capabilities require Channel State Information (CSI) from an ESP32-S3 ($9) or research NIC. The Docker image runs with simulated data for evaluation. Consumer WiFi laptops provide RSSI-only presence detection.

Hardware options for live CSI capture:

| Option | Hardware | Cost | Full CSI | Capabilities |
|---|---|---|---|---|
| ESP32 + Cognitum Seed (recommended) | ESP32-S3 + Cognitum Seed | ~$140 | Yes | Pose, breathing, heartbeat, motion, presence + persistent vector store, kNN search, witness chain, MCP proxy |
| ESP32 Mesh | 3-6x ESP32-S3 + WiFi router | ~$54 | Yes | Pose, breathing, heartbeat, motion, presence |
| Research NIC | Intel 5300 / Atheros AR9580 | ~$50-100 | Yes | Full CSI with 3x3 MIMO |
| Any WiFi | Windows, macOS, or Linux laptop | $0 | No | RSSI-only: coarse presence and motion |

No hardware? Verify the signal processing pipeline with the deterministic reference signal: `python archive/v1/data/proof/verify.py`


WiFi DensePose — Live pose detection with setup guide
Real-time pose skeleton from WiFi CSI signals — no cameras, no wearables

▶ Live Observatory Demo  |  ▶ Dual-Modal Pose Fusion Demo  |  ▶ Live 3D Point Cloud

The server is optional for visualization and aggregation — the ESP32 runs independently for presence detection, vital signs, and fall alerts.

Live ESP32 pipeline: Connect an ESP32-S3 node → run the sensing server → open the pose fusion demo for real-time dual-modal pose estimation (webcam + WiFi CSI). See ADR-059.

🔬 How It Works

WiFi routers flood every room with radio waves. When a person moves — or even breathes — those waves scatter differently. WiFi DensePose reads that scattering pattern and reconstructs what happened:

WiFi Router → radio waves pass through room → hit human body → scatter
    ↓
ESP32 mesh (4-6 nodes) captures CSI on channels 1/6/11 via TDM protocol
    ↓
Multi-Band Fusion: 3 channels × 56 subcarriers = 168 virtual subcarriers per link
    ↓
Multistatic Fusion: N×(N-1) links → attention-weighted cross-viewpoint embedding
    ↓
Coherence Gate: accept/reject measurements → stable for days without tuning
    ↓
Signal Processing: Hampel, SpotFi, Fresnel, BVP, spectrogram → clean features
    ↓
AI Backbone (RuVector): attention, graph algorithms, compression, field model
    ↓
Signal-Line Protocol (CRV): 6-stage gestalt → sensory → topology → coherence → search → model
    ↓
Neural Network: processed signals → 17 body keypoints + vital signs + room model
    ↓
Output: real-time pose, breathing, heart rate, room fingerprint, drift alerts

No training cameras required — the Self-Learning system (ADR-024) bootstraps from raw WiFi data alone. MERIDIAN (ADR-027) ensures the model works in any room, not just the one it trained in.
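Of the signal-processing stages above, the Hampel step is the simplest to illustrate: replace any sample that sits more than a few scaled MADs from its local median. A minimal sketch — window size and threshold here are illustrative, not the firmware's actual values:

```python
def hampel(samples, k=3, t=3.0):
    """Hampel outlier filter: replace points more than t scaled-MADs
    from the local median (sliding window of 2k+1 samples)."""
    out = list(samples)
    for i in range(k, len(samples) - k):
        window = sorted(samples[i - k : i + k + 1])
        med = window[k]  # median of the window
        mad = sorted(abs(x - med) for x in window)[k]  # median absolute deviation
        if abs(samples[i] - med) > t * 1.4826 * mad:
            out[i] = med  # outlier → replaced by local median
    return out

noisy = [1.0, 1.1, 0.9, 9.0, 1.0, 1.05, 0.95]  # spike at index 3
print(hampel(noisy)[3])  # → 1.0
```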


🏢 Use Cases & Applications

WiFi sensing works anywhere WiFi exists. No new hardware in most cases — just software on existing access points or an $8 ESP32 add-on. Because there are no cameras, deployments avoid privacy regulations (GDPR video, HIPAA imaging) by design.

Scaling: Each AP distinguishes ~3-5 people (56 subcarriers). Multi-AP multiplies linearly — a 4-AP retail mesh covers ~15-20 occupants. No hard software limit; the practical ceiling is signal physics.

| Why WiFi sensing wins | Traditional alternative |
|---|---|
| 🔒 No video, no GDPR/HIPAA imaging rules | Cameras require consent, signage, data retention policies |
| 🧱 Works through walls, shelving, debris | Cameras need line-of-sight per room |
| 🌙 Works in total darkness | Cameras need IR or visible light |
| 💰 $0-$8 per zone (existing WiFi or ESP32) | Camera systems: $200-$2,000 per zone |
| 🔌 WiFi already deployed everywhere | PIR/radar sensors require new wiring per room |
🏥 Everyday — Healthcare, retail, office, hospitality (commodity WiFi)

| Use Case | What It Does | Hardware | Key Metric | Edge Module |
|---|---|---|---|---|
| Elderly care / assisted living | Fall detection, nighttime activity monitoring, breathing rate during sleep — no wearable compliance needed | 1 ESP32-S3 per room ($8) | Fall alert <2s | Sleep Apnea, Gait Analysis |
| Hospital patient monitoring | Continuous breathing + heart rate for non-critical beds without wired sensors; nurse alert on anomaly | 1-2 APs per ward | Breathing: 6-30 BPM | Respiratory Distress, Cardiac Arrhythmia |
| Emergency room triage | Automated occupancy count + wait-time estimation; detect patient distress (abnormal breathing) in waiting areas | Existing hospital WiFi | Occupancy accuracy >95% | Queue Length, Panic Motion |
| Retail occupancy & flow | Real-time foot traffic, dwell time by zone, queue length — no cameras, no opt-in, GDPR-friendly | Existing store WiFi + 1 ESP32 | Dwell resolution ~1m | Customer Flow, Dwell Heatmap |
| Office space utilization | Which desks/rooms are actually occupied, meeting room no-shows, HVAC optimization based on real presence | Existing enterprise WiFi | Presence latency <1s | Meeting Room, HVAC Presence |
| Hotel & hospitality | Room occupancy without door sensors, minibar/bathroom usage patterns, energy savings on empty rooms | Existing hotel WiFi | 15-30% HVAC savings | Energy Audit, Lighting Zones |
| Restaurants & food service | Table turnover tracking, kitchen staff presence, restroom occupancy displays — no cameras in dining areas | Existing WiFi | Queue wait ±30s | Table Turnover, Queue Length |
| Parking garages | Pedestrian presence in stairwells and elevators where cameras have blind spots; security alert if someone lingers | Existing WiFi | Through-concrete walls | Loitering, Elevator Count |
🏟️ Specialized — Events, fitness, education, civic (CSI-capable hardware)

| Use Case | What It Does | Hardware | Key Metric | Edge Module |
|---|---|---|---|---|
| Smart home automation | Room-level presence triggers (lights, HVAC, music) that work through walls — no dead zones, no motion-sensor timeouts | 2-3 ESP32-S3 nodes ($24) | Through-wall range ~5m | HVAC Presence, Lighting Zones |
| Fitness & sports | Rep counting, posture correction, breathing cadence during exercise — no wearable, no camera in locker rooms | 3+ ESP32-S3 mesh | Pose: 17 keypoints | Breathing Sync, Gait Analysis |
| Childcare & schools | Naptime breathing monitoring, playground headcount, restricted-area alerts — privacy-safe for minors | 2-4 ESP32-S3 per zone | Breathing: ±1 BPM | Sleep Apnea, Perimeter Breach |
| Event venues & concerts | Crowd density mapping, crush-risk detection via breathing compression, emergency evacuation flow tracking | Multi-AP mesh (4-8 APs) | Density per m² | Customer Flow, Panic Motion |
| Stadiums & arenas | Section-level occupancy for dynamic pricing, concession staffing, emergency egress flow modeling | Enterprise AP grid | 15-20 per AP mesh | Dwell Heatmap, Queue Length |
| Houses of worship | Attendance counting without facial recognition — privacy-sensitive congregations, multi-room campus tracking | Existing WiFi | Zone-level accuracy | Elevator Count, Energy Audit |
| Warehouse & logistics | Worker safety zones, forklift proximity alerts, occupancy in hazardous areas — works through shelving and pallets | Industrial AP mesh | Alert latency <500ms | Forklift Proximity, Confined Space |
| Civic infrastructure | Public restroom occupancy (no cameras possible), subway platform crowding, shelter headcount during emergencies | Municipal WiFi + ESP32 | Real-time headcount | Customer Flow, Loitering |
| Museums & galleries | Visitor flow heatmaps, exhibit dwell time, crowd bottleneck alerts — no cameras near artwork (flash/theft risk) | Existing WiFi | Zone dwell ±5s | Dwell Heatmap, Shelf Engagement |
🤖 Robotics & Industrial — Autonomous systems, manufacturing, android spatial awareness

WiFi sensing gives robots and autonomous systems a spatial awareness layer that works where LIDAR and cameras fail — through dust, smoke, fog, and around corners. The CSI signal field acts as a "sixth sense" for detecting humans in the environment without requiring line-of-sight.

| Use Case | What It Does | Hardware | Key Metric | Edge Module |
|---|---|---|---|---|
| Cobot safety zones | Detect human presence near collaborative robots — auto-slow or stop before contact, even behind obstructions | 2-3 ESP32-S3 per cell | Presence latency <100ms | Forklift Proximity, Perimeter Breach |
| Warehouse AMR navigation | Autonomous mobile robots sense humans around blind corners, through shelving racks — no LIDAR occlusion | ESP32 mesh along aisles | Through-shelf detection | Forklift Proximity, Loitering |
| Android / humanoid spatial awareness | Ambient human pose sensing for social robots — detect gestures, approach direction, and personal space without cameras always on | Onboard ESP32-S3 module | 17-keypoint pose | Gesture Language, Emotion Detection |
| Manufacturing line monitoring | Worker presence at each station, ergonomic posture alerts, headcount for shift compliance — works through equipment | Industrial AP per zone | Pose + breathing | Confined Space, Gait Analysis |
| Construction site safety | Exclusion zone enforcement around heavy machinery, fall detection from scaffolding, personnel headcount | Ruggedized ESP32 mesh | Alert <2s, through-dust | Panic Motion, Structural Vibration |
| Agricultural robotics | Detect farm workers near autonomous harvesters in dusty/foggy field conditions where cameras are unreliable | Weatherproof ESP32 nodes | Range ~10m open field | Forklift Proximity, Rain Detection |
| Drone landing zones | Verify landing area is clear of humans — WiFi sensing works in rain, dust, and low light where downward cameras fail | Ground ESP32 nodes | Presence: >95% accuracy | Perimeter Breach, Tailgating |
| Clean room monitoring | Personnel tracking without cameras (particle contamination risk from camera fans) — gown compliance via pose | Existing cleanroom WiFi | No particulate emission | Clean Room, Livestock Monitor |
🔥 Extreme — Through-wall, disaster, defense, underground

These scenarios exploit WiFi's ability to penetrate solid materials — concrete, rubble, earth — where no optical or infrared sensor can reach. The WiFi-Mat disaster module (ADR-001) is specifically designed for this tier.

| Use Case | What It Does | Hardware | Key Metric | Edge Module |
|---|---|---|---|---|
| Search & rescue (WiFi-Mat) | Detect survivors through rubble/debris via breathing signature, START triage color classification, 3D localization | Portable ESP32 mesh + laptop | Through 30cm concrete | Respiratory Distress, Seizure Detection |
| Firefighting | Locate occupants through smoke and walls before entry; breathing detection confirms life signs remotely | Portable mesh on truck | Works in zero visibility | Sleep Apnea, Panic Motion |
| Prison & secure facilities | Cell occupancy verification, distress detection (abnormal vitals), perimeter sensing — no camera blind spots | Dedicated AP infrastructure | 24/7 vital signs | Cardiac Arrhythmia, Loitering |
| Military / tactical | Through-wall personnel detection, room clearing confirmation, hostage vital signs at standoff distance | Directional WiFi + custom FW | Range: 5m through wall | Perimeter Breach, Weapon Detection |
| Border & perimeter security | Detect human presence in tunnels, behind fences, in vehicles — passive sensing, no active illumination to reveal position | Concealed ESP32 mesh | Passive / covert | Perimeter Breach, Tailgating |
| Mining & underground | Worker presence in tunnels where GPS/cameras fail, breathing detection after collapse, headcount at safety points | Ruggedized ESP32 mesh | Through rock/earth | Confined Space, Respiratory Distress |
| Maritime & naval | Below-deck personnel tracking through steel bulkheads (limited range, requires tuning), man-overboard detection | Ship WiFi + ESP32 | Through 1-2 bulkheads | Structural Vibration, Panic Motion |
| Wildlife research | Non-invasive animal activity monitoring in enclosures or dens — no light pollution, no visual disturbance | Weatherproof ESP32 nodes | Zero light emission | Livestock Monitor, Dream Stage |
🧩 Edge Intelligence (ADR-041) — 60 WASM modules across 13 categories, all implemented (609 tests)

Small programs that run directly on the ESP32 sensor — no internet needed, no cloud fees, instant response. Each module is a tiny WASM file (5-30 KB) that you upload to the device over-the-air. It reads WiFi signal data and makes decisions locally in under 10 ms. ADR-041 defines 60 modules across 13 categories — all 60 are implemented with 609 tests passing.

| Category | Examples |
|---|---|
| 🏥 Medical & Health | Sleep apnea detection, cardiac arrhythmia, gait analysis, seizure detection |
| 🔐 Security & Safety | Intrusion detection, perimeter breach, loitering, panic motion |
| 🏢 Smart Building | Zone occupancy, HVAC control, elevator counting, meeting room tracking |
| 🛒 Retail & Hospitality | Queue length, dwell heatmaps, customer flow, table turnover |
| 🏭 Industrial | Forklift proximity, confined space monitoring, structural vibration |
| 🔮 Exotic & Research | Sleep staging, emotion detection, sign language, breathing sync |
| 📡 Signal Intelligence | Cleans and sharpens raw WiFi signals — focuses on important regions, filters noise, fills in missing data, and tracks which person is which |
| 🧠 Adaptive Learning | The sensor learns new gestures and patterns on its own over time — no cloud needed; remembers what it learned even after updates |
| 🗺️ Spatial Reasoning | Figures out where people are in a room, which zones matter most, and tracks movement across areas using graph-based spatial logic |
| ⏱️ Temporal Analysis | Learns daily routines, detects when patterns break (someone didn't get up), and verifies safety rules are being followed over time |
| 🛡️ AI Security | Detects signal replay attacks, WiFi jamming, injection attempts, and flags abnormal behavior that could indicate tampering |
| ⚛️ Quantum-Inspired | Uses quantum-inspired math to map room-wide signal coherence and search for optimal sensor configurations |
| 🤖 Autonomous & Exotic | Self-managing sensor mesh — auto-heals dropped nodes, plans its own actions, and explores experimental signal representations |

All implemented modules are no_std Rust, share a common utility library, and talk to the host through a 12-function API. Full documentation: Edge Modules Guide. See the complete implemented module list below.

🧩 Edge Intelligence — All 60 Modules Implemented (ADR-041 complete)

All 60 modules are implemented, tested (609 tests passing), and ready to deploy. They compile to wasm32-unknown-unknown, run on ESP32-S3 via WASM3, and share a common utility library. Source: crates/wifi-densepose-wasm-edge/src/

Core modules (ADR-040 flagship + early implementations):

| Module | File | What It Does |
|---|---|---|
| Gesture Classifier | `gesture.rs` | DTW template matching for hand gestures |
| Coherence Filter | `coherence.rs` | Phase coherence gating for signal quality |
| Adversarial Detector | `adversarial.rs` | Detects physically impossible signal patterns |
| Intrusion Detector | `intrusion.rs` | Human vs non-human motion classification |
| Occupancy Counter | `occupancy.rs` | Zone-level person counting |
| Vital Trend | `vital_trend.rs` | Long-term breathing and heart rate trending |
| RVF Parser | `rvf.rs` | RVF container format parsing |

Vendor-integrated modules (24 modules, ADR-041 Category 7):

📡 Signal Intelligence — Real-time CSI analysis and feature extraction

| Module | File | What It Does | Budget |
|---|---|---|---|
| Flash Attention | `sig_flash_attention.rs` | Tiled attention over 8 subcarrier groups — finds spatial focus regions and entropy | S (<5ms) |
| Coherence Gate | `sig_coherence_gate.rs` | Z-score phasor gating with hysteresis: Accept / PredictOnly / Reject / Recalibrate | L (<2ms) |
| Temporal Compress | `sig_temporal_compress.rs` | 3-tier adaptive quantization (8-bit hot / 5-bit warm / 3-bit cold) | L (<2ms) |
| Sparse Recovery | `sig_sparse_recovery.rs` | ISTA L1 reconstruction for dropped subcarriers | H (<10ms) |
| Person Match | `sig_mincut_person_match.rs` | Hungarian-lite bipartite assignment for multi-person tracking | S (<5ms) |
| Optimal Transport | `sig_optimal_transport.rs` | Sliced Wasserstein-1 distance with 4 projections | L (<2ms) |
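The Coherence Gate row describes z-score gating with hysteresis: once a measurement is rejected, it must fall well below the entry threshold before being accepted again, which prevents flapping at the boundary. A simplified two-state sketch in Python — the actual module gates on phasor coherence and adds PredictOnly and Recalibrate states; thresholds here are invented:

```python
class CoherenceGate:
    """Two-threshold hysteresis gate on |z-score| of a scalar feature.
    (Illustrative; the real module gates phasor coherence and has
    PredictOnly / Recalibrate states as well.)"""
    def __init__(self, mean, std, enter=3.0, exit=1.5):
        self.mean, self.std = mean, std
        self.enter, self.exit = enter, exit  # enter > exit → hysteresis band
        self.state = "Accept"

    def update(self, x):
        z = abs(x - self.mean) / self.std
        if self.state == "Accept" and z > self.enter:
            self.state = "Reject"
        elif self.state == "Reject" and z < self.exit:
            self.state = "Accept"
        return self.state

gate = CoherenceGate(mean=0.0, std=1.0)
print([gate.update(x) for x in (0.2, 4.0, 2.0, 1.0)])
# → ['Accept', 'Reject', 'Reject', 'Accept']
# z=2.0 stays Rejected (inside the hysteresis band); z=1.0 re-Accepts
```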

🧠 Adaptive Learning — On-device learning without cloud connectivity

| Module | File | What It Does | Budget |
|---|---|---|---|
| DTW Gesture Learn | `lrn_dtw_gesture_learn.rs` | User-teachable gesture recognition — 3-rehearsal protocol, 16 templates | S (<5ms) |
| Anomaly Attractor | `lrn_anomaly_attractor.rs` | 4D dynamical system attractor classification with Lyapunov exponents | H (<10ms) |
| Meta Adapt | `lrn_meta_adapt.rs` | Hill-climbing self-optimization with safety rollback | L (<2ms) |
| EWC Lifelong | `lrn_ewc_lifelong.rs` | Elastic Weight Consolidation — remembers past tasks while learning new ones | S (<5ms) |

🗺️ Spatial Reasoning — Location, proximity, and influence mapping

| Module | File | What It Does | Budget |
|---|---|---|---|
| PageRank Influence | `spt_pagerank_influence.rs` | 4x4 cross-correlation graph with power iteration PageRank | L (<2ms) |
| Micro HNSW | `spt_micro_hnsw.rs` | 64-vector navigable small-world graph for nearest-neighbor search | S (<5ms) |
| Spiking Tracker | `spt_spiking_tracker.rs` | 32 LIF neurons + 4 output zone neurons with STDP learning | S (<5ms) |

⏱️ Temporal Analysis — Activity patterns, logic verification, autonomous planning

| Module | File | What It Does | Budget |
|---|---|---|---|
| Pattern Sequence | `tmp_pattern_sequence.rs` | Activity routine detection and deviation alerts | S (<5ms) |
| Temporal Logic Guard | `tmp_temporal_logic_guard.rs` | LTL formula verification on CSI event streams | S (<5ms) |
| GOAP Autonomy | `tmp_goap_autonomy.rs` | Goal-Oriented Action Planning for autonomous module management | S (<5ms) |

🛡️ AI Security — Tamper detection and behavioral anomaly profiling

| Module | File | What It Does | Budget |
|---|---|---|---|
| Prompt Shield | `ais_prompt_shield.rs` | FNV-1a replay detection, injection detection (10x amplitude), jamming (SNR) | L (<2ms) |
| Behavioral Profiler | `ais_behavioral_profiler.rs` | 6D behavioral profile with Mahalanobis anomaly scoring | S (<5ms) |

⚛️ Quantum-Inspired — Quantum computing metaphors applied to CSI analysis

| Module | File | What It Does | Budget |
|---|---|---|---|
| Quantum Coherence | `qnt_quantum_coherence.rs` | Bloch sphere mapping, Von Neumann entropy, decoherence detection | S (<5ms) |
| Interference Search | `qnt_interference_search.rs` | 16 room-state hypotheses with Grover-inspired oracle + diffusion | S (<5ms) |

🤖 Autonomous Systems — Self-governing and self-healing behaviors

| Module | File | What It Does | Budget |
|---|---|---|---|
| Psycho-Symbolic | `aut_psycho_symbolic.rs` | 16-rule forward-chaining knowledge base with contradiction detection | S (<5ms) |
| Self-Healing Mesh | `aut_self_healing_mesh.rs` | 8-node mesh with health tracking, degradation/recovery, coverage healing | S (<5ms) |

🔮 Exotic (Vendor) — Novel mathematical models for CSI interpretation

| Module | File | What It Does | Budget |
|---|---|---|---|
| Time Crystal | `exo_time_crystal.rs` | Autocorrelation subharmonic detection in 256-frame history | S (<5ms) |
| Hyperbolic Space | `exo_hyperbolic_space.rs` | Poincaré ball embedding with 32 reference locations, hyperbolic distance | S (<5ms) |

🏥 Medical & Health (Category 1) — Contactless health monitoring

| Module | File | What It Does | Budget |
|---|---|---|---|
| Sleep Apnea | `med_sleep_apnea.rs` | Detects breathing pauses during sleep | S (<5ms) |
| Cardiac Arrhythmia | `med_cardiac_arrhythmia.rs` | Monitors heart rate for irregular rhythms | S (<5ms) |
| Respiratory Distress | `med_respiratory_distress.rs` | Alerts on abnormal breathing patterns | S (<5ms) |
| Gait Analysis | `med_gait_analysis.rs` | Tracks walking patterns and detects changes | S (<5ms) |
| Seizure Detection | `med_seizure_detect.rs` | 6-state machine for tonic-clonic seizure recognition | S (<5ms) |

🔐 Security & Safety (Category 2) — Perimeter and threat detection

| Module | File | What It Does | Budget |
|---|---|---|---|
| Perimeter Breach | `sec_perimeter_breach.rs` | Detects boundary crossings with approach/departure | S (<5ms) |
| Weapon Detection | `sec_weapon_detect.rs` | Metal anomaly detection via CSI amplitude shifts | S (<5ms) |
| Tailgating | `sec_tailgating.rs` | Detects unauthorized follow-through at access points | S (<5ms) |
| Loitering | `sec_loitering.rs` | Alerts when someone lingers too long in a zone | S (<5ms) |
| Panic Motion | `sec_panic_motion.rs` | Detects fleeing, struggling, or panic movement | S (<5ms) |

🏢 Smart Building (Category 3) — Automation and energy efficiency

| Module | File | What It Does | Budget |
|---|---|---|---|
| HVAC Presence | `bld_hvac_presence.rs` | Occupancy-driven HVAC control with departure countdown | S (<5ms) |
| Lighting Zones | `bld_lighting_zones.rs` | Auto-dim/off lighting based on zone activity | S (<5ms) |
| Elevator Count | `bld_elevator_count.rs` | Counts people entering/leaving with overload warning | S (<5ms) |
| Meeting Room | `bld_meeting_room.rs` | Tracks meeting lifecycle: start, headcount, end, availability | S (<5ms) |
| Energy Audit | `bld_energy_audit.rs` | Tracks after-hours usage and room utilization rates | S (<5ms) |

🛒 Retail & Hospitality (Category 4) — Customer insights without cameras

| Module | File | What It Does | Budget |
|---|---|---|---|
| Queue Length | `ret_queue_length.rs` | Estimates queue size and wait times | S (<5ms) |
| Dwell Heatmap | `ret_dwell_heatmap.rs` | Shows where people spend time (hot/cold zones) | S (<5ms) |
| Customer Flow | `ret_customer_flow.rs` | Counts ins/outs and tracks net occupancy | S (<5ms) |
| Table Turnover | `ret_table_turnover.rs` | Restaurant table lifecycle: seated, dining, vacated | S (<5ms) |
| Shelf Engagement | `ret_shelf_engagement.rs` | Detects browsing, considering, and reaching for products | S (<5ms) |

🏭 Industrial & Specialized (Category 5) — Safety and compliance

| Module | File | What It Does | Budget |
|---|---|---|---|
| Forklift Proximity | `ind_forklift_proximity.rs` | Warns when people get too close to vehicles | S (<5ms) |
| Confined Space | `ind_confined_space.rs` | OSHA-compliant worker monitoring with extraction alerts | S (<5ms) |
| Clean Room | `ind_clean_room.rs` | Occupancy limits and turbulent motion detection | S (<5ms) |
| Livestock Monitor | `ind_livestock_monitor.rs` | Animal presence, stillness, and escape alerts | S (<5ms) |
| Structural Vibration | `ind_structural_vibration.rs` | Seismic events, mechanical resonance, structural drift | S (<5ms) |

🔮 Exotic & Research (Category 6) — Experimental sensing applications

| Module | File | What It Does | Budget |
|---|---|---|---|
| Dream Stage | `exo_dream_stage.rs` | Contactless sleep stage classification (wake/light/deep/REM) | S (<5ms) |
| Emotion Detection | `exo_emotion_detect.rs` | Arousal, stress, and calm detection from micro-movements | S (<5ms) |
| Gesture Language | `exo_gesture_language.rs` | Sign language letter recognition via WiFi | S (<5ms) |
| Music Conductor | `exo_music_conductor.rs` | Tempo and dynamic tracking from conducting gestures | S (<5ms) |
| Plant Growth | `exo_plant_growth.rs` | Monitors plant growth, circadian rhythms, wilt detection | S (<5ms) |
| Ghost Hunter | `exo_ghost_hunter.rs` | Environmental anomaly classification (draft/insect/wind/unknown) | S (<5ms) |
| Rain Detection | `exo_rain_detect.rs` | Detects rain onset, intensity, and cessation via signal scatter | S (<5ms) |
| Breathing Sync | `exo_breathing_sync.rs` | Detects synchronized breathing between multiple people | S (<5ms) |

🧠 Self-Learning WiFi AI (ADR-024) — Adaptive recognition, self-optimization, and intelligent anomaly detection

Every WiFi signal that passes through a room creates a unique fingerprint of that space. WiFi-DensePose already reads these fingerprints to track people, but until now it threw away the internal "understanding" after each reading. The Self-Learning WiFi AI captures and preserves that understanding as compact, reusable vectors — and continuously optimizes itself for each new environment.

What it does in plain terms:

  • Turns any WiFi signal into a 128-number "fingerprint" that uniquely describes what's happening in a room
  • Learns entirely on its own from raw WiFi data — no cameras, no labeling, no human supervision needed
  • Recognizes rooms, detects intruders, identifies people, and classifies activities using only WiFi
  • Runs on an $8 ESP32 chip (the entire model fits in 55 KB of memory)
  • Produces both body pose tracking AND environment fingerprints in a single computation

Key Capabilities

| What | How it works | Why it matters |
|---|---|---|
| Self-supervised learning | The model watches WiFi signals and teaches itself what "similar" and "different" look like, without any human-labeled data | Deploy anywhere — just plug in a WiFi sensor and wait 10 minutes |
| Room identification | Each room produces a distinct WiFi fingerprint pattern | Know which room someone is in without GPS or beacons |
| Anomaly detection | An unexpected person or event creates a fingerprint that doesn't match anything seen before | Automatic intrusion and fall detection as a free byproduct |
| Person re-identification | Each person disturbs WiFi in a slightly different way, creating a personal signature | Track individuals across sessions without cameras |
| Environment adaptation | MicroLoRA adapters (1,792 parameters per room) fine-tune the model for each new space | Adapts to a new room with minimal data — 93% less than retraining from scratch |
| Memory preservation | EWC++ regularization remembers what was learned during pretraining | Switching to a new task doesn't erase prior knowledge |
| Hard-negative mining | Training focuses on the most confusing examples to learn faster | Better accuracy with the same amount of training data |
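The memory-preservation row refers to Elastic Weight Consolidation. In its standard diagonal form, EWC adds a quadratic penalty that pulls new weights back toward the pretrained ones, scaled by each weight's (Fisher) importance — important weights resist change, unimportant ones stay free to adapt. A sketch with made-up numbers (EWC++ adds refinements not shown here):

```python
def ewc_penalty(params, anchor, fisher, lam=1.0):
    """Elastic Weight Consolidation: quadratic pull toward the old
    task's weights, scaled by diagonal Fisher importance."""
    return 0.5 * lam * sum(
        f * (p - a) ** 2 for p, a, f in zip(params, anchor, fisher)
    )

old = [1.0, -2.0, 0.5]     # weights after pretraining (anchor)
fisher = [10.0, 0.1, 1.0]  # importance: the first weight matters most
new = [1.2, 0.0, 0.5]      # weights drifting during fine-tuning
print(round(ewc_penalty(new, old, fisher), 6))  # → 0.4
```

Note that the large drift in the second weight (importance 0.1) is barely penalized, while the small drift in the first (importance 10) dominates the penalty.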

Architecture

WiFi Signal [56 channels] → Transformer + Graph Neural Network
                                  ├→ 128-dim environment fingerprint (for search + identification)
                                  └→ 17-joint body pose (for human tracking)
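The dual-head split above can be sketched purely in terms of shapes. The dimensions (56 subcarriers in, 128-dim fingerprint and 17 × 2 pose coordinates out) come from the diagram; the random linear maps below are placeholders for the Transformer + GNN, not the actual model:

```python
import random

random.seed(0)
N_SUB, D_EMB, N_JOINTS = 56, 128, 17  # dims from the diagram above

def matvec(w, x):
    """Dense layer stand-in: matrix-vector product."""
    return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

# Placeholder weights for the two heads sharing one input
W_embed = [[random.gauss(0, 1) for _ in range(N_SUB)] for _ in range(D_EMB)]
W_pose = [[random.gauss(0, 1) for _ in range(N_SUB)] for _ in range(N_JOINTS * 2)]

csi = [random.random() for _ in range(N_SUB)]  # one CSI frame (amplitudes)
fingerprint = matvec(W_embed, csi)             # 128-dim environment vector
pose_xy = matvec(W_pose, csi)                  # 17 joints × (x, y)
print(len(fingerprint), len(pose_xy))  # → 128 34
```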

Quick Start

```bash
# Step 1: Learn from raw WiFi data (no labels needed)
cargo run -p wifi-densepose-sensing-server -- --pretrain --dataset data/csi/ --pretrain-epochs 50

# Step 2: Fine-tune with pose labels for full capability
cargo run -p wifi-densepose-sensing-server -- --train --dataset data/mmfi/ --epochs 100 --save-rvf model.rvf

# Step 3: Use the model — extract fingerprints from live WiFi
cargo run -p wifi-densepose-sensing-server -- --model model.rvf --embed

# Step 4: Search — find similar environments or detect anomalies
cargo run -p wifi-densepose-sensing-server -- --model model.rvf --build-index env
```

Training Modes

| Mode | What you need | What you get |
|---|---|---|
| Self-Supervised | Just raw WiFi data | A model that understands WiFi signal structure |
| Supervised | WiFi data + body pose labels | Full pose tracking + environment fingerprints |
| Cross-Modal | WiFi data + camera footage | Fingerprints aligned with visual understanding |

Fingerprint Index Types

| Index | What it stores | Real-world use |
|---|---|---|
| `env_fingerprint` | Average room fingerprint | "Is this the kitchen or the bedroom?" |
| `activity_pattern` | Activity boundaries | "Is someone cooking, sleeping, or exercising?" |
| `temporal_baseline` | Normal conditions | "Something unusual just happened in this room" |
| `person_track` | Individual movement signatures | "Person A just entered the living room" |
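An `env_fingerprint` lookup is just nearest-neighbor search over stored room embeddings. A brute-force sketch using cosine similarity, with 3-dim stand-ins for the 128-dim fingerprints (the room names and vectors are invented):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def nearest_room(query, index):
    """Return the stored room whose average fingerprint is most
    similar to the query embedding (brute-force kNN, k=1)."""
    return max(index, key=lambda name: cosine(query, index[name]))

index = {
    "kitchen": [0.9, 0.1, 0.0],  # invented 3-dim stand-in fingerprints
    "bedroom": [0.0, 0.2, 0.9],
}
print(nearest_room([0.8, 0.2, 0.1], index))  # → kitchen
```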

Model Size

| Component | Parameters | Memory (on ESP32) |
|---|---|---|
| Transformer backbone | ~28,000 | 28 KB |
| Embedding projection head | ~25,000 | 25 KB |
| Per-room MicroLoRA adapter | ~1,800 | 2 KB |
| Total | ~55,000 | 55 KB (of 520 KB available) |

The self-learning system builds on the AI Backbone (RuVector) signal-processing layer — attention, graph algorithms, and compression — adding contrastive learning on top.

See docs/adr/ADR-024-contrastive-csi-embedding-model.md for full architectural details.


📖 Documentation

| Document | Description |
|---|---|
| User Guide | Step-by-step guide: installation, first run, API usage, hardware setup, training |
| Build Guide | Building from source (Rust and Python) |
| Architecture Decisions | 79 ADRs — why each technical choice was made, organized by domain (hardware, signal processing, ML, platform, infrastructure) |
| Domain Models | 7 DDD models (RuvSense, Signal Processing, Training Pipeline, Hardware Platform, Sensing Server, WiFi-Mat, CHCI) — bounded contexts, aggregates, domain events, and ubiquitous language |
| Desktop App | WIP — Tauri v2 desktop app for node management, OTA updates, WASM deployment, and mesh visualization |
| Medical Examples | Contactless blood pressure, heart rate, breathing rate via 60 GHz mmWave radar — $15 hardware, no wearable |
| Extended Documentation | Latest additions, key features, installation, quick start, signal processing, training, CLI, testing, deployment, and changelog |

📄 License

MIT License — see LICENSE for details.

📞 Support

GitHub Issues | Discussions | PyPI


WiFi DensePose — Privacy-preserving human pose estimation through WiFi signals.
