dandelionskyy/Embedded-Systems-Project-Game

ELEC2645 — Embedded Multi-Game Platform with CNN Gesture Recognition

Group 3 — XJEL2645

A collaborative embedded gaming platform on STM32L476RG, featuring three independently developed games, a unified menu system, and real-time CNN-based gesture recognition running entirely on-device. The entire pipeline — from gyroscope data collection, to CNN model training, to on-chip inference — is built from scratch with self-collected training data and a custom neural network deployed on the Cortex-M4 at 80 MHz.


Two Operation Modes

The firmware supports two compile-time modes (selected via #define in Core/Src/main.c):

1. Game Mode (MODE_GAME)

The full-featured mode: boot animation → CNN self-test → gesture test playground → three-game menu.

BOOT → Pixel-Art Intro → CNN Self-Test → Gesture Test Mode → [BTN3] → Game Menu
                                                                    ├── Game 1: Tennis Rally
                                                                    ├── Game 2: Temple Quest
                                                                    └── Game 3: Gold Miner

2. Data Collection Mode (MODE_DATA_COLLECTION)

A dedicated firmware mode for collecting training data from the MPU6050. Press PC8 to capture 150 gyroscope + accelerometer samples (3.0 seconds @ 50 Hz), then the MCU streams the data over UART2 at 921600 baud in a structured format:

IMU
ax1 ay1 az1 gx1 gy1 gz1
ax2 ay2 az2 gx2 gy2 gz2
... (150 lines, 6 columns each)

A companion Python script (Serial_Read.py) runs on the PC side — it connects via serial, auto-detects the IMU header, and saves each recording as a labeled .txt file. The user selects the gesture class in the script (e.g. "Horn", "Wave"), performs the gesture while pressing PC8, and the script handles file naming, incremental indexing, and graceful Ctrl+C exit with a session summary.

Data collection workflow:

PC side (Serial_Read.py)          MCU side (MODE_DATA_COLLECTION)
┌─────────────────────┐           ┌──────────────────────────┐
│ 1. Select COM port  │──UART──▶  │                          │
│ 2. Choose gesture   │           │  PC8 → start sampling    │
│    class (1-13)     │           │  150 samples @ 20ms      │
│ 3. Set count (N)    │           │  6-axis: AccXYZ + GyroXYZ│
│ 4. "Press PC8!"     │◀──UART──  │  UART output: IMU\n+data │
│ 5. Auto-detect IMU  │           │                          │
│ 6. Confirm & save   │           │                          │
│ 7. Repeat until done│           │                          │
└─────────────────────┘           └──────────────────────────┘
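The PC-side parsing step above can be sketched in a few lines of Python. This is a minimal illustration, not the actual Serial_Read.py; the frame format (an `IMU` header followed by 150 lines of 6 space-separated floats) is taken from the description above, and the function name `parse_imu_frame` is hypothetical.

```python
def parse_imu_frame(lines):
    """Extract one 150x6 recording from a stream of UART lines.

    Skips any printf debug output before the 'IMU' header, then reads
    150 lines of 6 space-separated floats (AccXYZ + GyroXYZ).
    """
    it = iter(lines)
    # Skip debug noise until the header appears
    for line in it:
        if line.strip() == "IMU":
            break
    else:
        raise ValueError("IMU header not found")
    samples = []
    for line in it:
        values = [float(v) for v in line.split()]
        if len(values) != 6:
            raise ValueError("expected 6 columns, got %d" % len(values))
        samples.append(values)
        if len(samples) == 150:
            return samples
    raise ValueError("incomplete frame: %d/150 samples" % len(samples))
```

Scanning past debug output before the header is what makes the capture robust to printf traffic sharing the same serial line.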

Self-Collected Training Dataset

All gesture training data was collected by our team using the hardware we built — not downloaded from any public dataset. This ensures the CNN model is specifically tuned to our MPU6050 sensor characteristics, sampling rate, and the physical way we perform gestures.

Property Value
Collection tool Serial_Read.py + MCU in MODE_DATA_COLLECTION
Sensor MPU6050 on our custom hardware (SoftI2C on PB4/PB5)
Data per gesture 150 time-steps × 6 axes (AccX, AccY, AccZ, GyroX, GyroY, GyroZ)
Duration per sample 3.0 seconds (150 samples @ 20ms = 50 Hz)
Classes 13 gestures
Samples per class 50 recordings (650 recordings total across 13 classes)
Total raw data 13 × 50 × 150 × 6 = 585,000 float values
Storage TraningData_5_11/ — 650 .txt files
Training columns GyroX, GyroY, GyroZ only (columns 3,4,5)
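Dropping the accelerometer columns before training is a simple slice; a sketch (assuming the 150×6 row layout described above, with GyroX/Y/Z in columns 3, 4, 5; the function name is hypothetical):

```python
def gyro_only(recording):
    """Keep GyroX/Y/Z (columns 3, 4, 5) of a 150x6 recording,
    yielding the 150x3 input the CNN trains on."""
    return [[row[3], row[4], row[5]] for row in recording]
```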

The 13 gesture classes collected:

# Gesture Physical Motion Shape
0 RightAngle Sharp 90° turn
1 SharpAngle Acute angle turn
2 Lightning Zigzag bolt
3 Triangle Three-sided shape
4 Letter_H H-shaped path H
5 Letter_R R-shaped path R
6 Letter_W W-shaped path W
7 Letter_Phi Φ-shaped path Φ
8 Circle Full circular motion
9 UpAndDown Vertical oscillation
10 Horn Horn-shaped curve 🎺
11 Wave Horizontal wave ~
12 NoMotion Sensor stationary

CNN Training & Inference Pipeline

The complete pipeline — data collection → model training → weight export → on-device inference — runs end-to-end with no cloud dependency.

Stage 1: Data Collection (MCU + Python)

The MCU streams raw IMU data over UART. Serial_Read.py captures it and saves labeled .txt files organized by gesture class. The script handles auto-detection of the IMU header (robust to printf debug output on the same serial line), user confirmation before saving, incremental file naming, and graceful Ctrl+C exit.

Stage 2: Model Training (Python + TensorFlow/Keras + NNoM)

CNNTrainRaw.py loads all .txt files from the training data directory, builds a 1D convolutional neural network, trains it, and exports both a Keras .h5 model and an NNoM-compatible weights.h header.

Model architecture (Magic Wand — 1D CNN):

Input: float[150][3] gyroscope (X, Y, Z in rad/s)
        ↓  Int8 Quantize (×32 scale)
Conv1D: 3 → 30 channels, kernel=3, stride=3, ReLU
        ↓
Conv1D: 30 → 15 channels, kernel=3, stride=3, ReLU
        ↓
MaxPool1D: pool_size=3, stride=3
        ↓
Flatten: 75 features
        ↓  Dropout(0.5)
Dense: 75 → 13, Softmax
        ↓
Output: 13-class probability distribution
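The 75-feature flatten size follows from the stride-3 layers above; the arithmetic can be checked directly, assuming the usual "valid" output-length formula `floor((L - k) / s) + 1`:

```python
def out_len(L, k, s):
    """Output length of a 'valid' 1D conv/pool layer: floor((L - k) / s) + 1."""
    return (L - k) // s + 1

L = out_len(150, 3, 3)   # Conv1D #1: 150 -> 50 time-steps
L = out_len(L, 3, 3)     # Conv1D #2: 50 -> 16 time-steps
L = out_len(L, 3, 3)     # MaxPool1D: 16 -> 5 time-steps
features = L * 15        # 5 time-steps x 15 channels = 75 features
```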

Training configuration:

Parameter Value
Framework TensorFlow 2 / Keras
Train/Test split 80% / 20%
Batch size 80
Max epochs 200 (early stopping, patience=10)
Optimizer Adam
Loss Categorical Crossentropy
Weight export NNoM generate_model() → weights.h (int8 C header, ~12 KB)

Stage 3: On-Device Inference (STM32L476 + NNoM)

The trained weights (weights.h) are compiled directly into the firmware. At runtime, nnom_infer.c wraps the NNoM v0.4.3 library to run inference entirely on the Cortex-M4:

Gesture Capture (3s)       Int8 Quantization        Forward Pass          Result
┌──────────────────┐      ┌──────────────┐      ┌─────────────┐      ┌──────────┐
│ MPU6050 @ 50Hz   │ ───▶ │ float→int8    │ ───▶ │ Conv1D ×2   │ ───▶ │ Argmax   │
│ 150 samples ×3   │      │ scale = ×32   │      │ Dense + SM  │      │ + Conf%  │
│ gyro rad/s       │      │ 8KB buffer    │      │ 15-27ms     │      │ 13 class  │
└──────────────────┘      └──────────────┘      └─────────────┘      └──────────┘
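The ×32 quantization step maps gyro readings in rad/s to int8 before the forward pass. A sketch of the idea — the exact scaling lives in nnom_infer.c; this round-and-saturate version is an assumption:

```python
def quantize_int8(x, scale=32):
    """Quantize one float gyro reading to int8 (x32 scale, saturating)."""
    q = int(round(x * scale))
    return max(-128, min(127, q))  # clamp to the int8 range
```

With a ±2000 dps (≈ ±35 rad/s) full scale, fast swings saturate at ±127, which is why the clamp matters.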

Inference performance (on-chip, no external compute):

Metric Value
Inference time 15–27 ms (pure C, no CMSIS-NN)
Static RAM 8 KB
Model weights (flash) ~12 KB
Stack per inference ~400 B
Confidence threshold 81% (103/127)
Flash utilization ~165 KB / 1 MB (16%)
RAM utilization ~8.4 KB / 128 KB (6.5%)
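The 81 % confidence threshold corresponds to an int8 softmax output of 103 out of 127. The accept/reject decision can be sketched as follows (a behavioral illustration, not the firmware code; the function name is hypothetical):

```python
def classify(softmax_q7, threshold=103):
    """Argmax over the 13 int8 softmax outputs; reject below 103/127 (~81%).

    Returns (gesture_id, confidence_percent), with gesture_id None on reject.
    """
    best = max(range(len(softmax_q7)), key=lambda i: softmax_q7[i])
    conf_percent = softmax_q7[best] * 100 // 127
    if softmax_q7[best] < threshold:
        return None, conf_percent
    return best, conf_percent
```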

GestureApp — High-Level API for Games

Built on top of the CNN inference layer, GestureApp provides a PC8-button-driven state machine that any game can drop in with three function calls:

// In your game loop:
if (current_input.pc8_pressed) Gesture_OnPC8Press();  // Start/stop capture
Gesture_Tick(HAL_GetTick());                            // Drive 20ms sampling
if (Gesture_GetStatus() == GESTURE_STATUS_INFER_DONE) {
    const MW_InferenceResult_t* r = Gesture_GetResult();
    // Use r->gesture_id (0–12), r->confidence_percent
    Gesture_Reset();
}

State machine: IDLE →(PC8)→ RECORDING →(150 samples)→ INFER_DONE →(PC8)→ IDLE
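The state machine above can be mirrored in a few lines of Python for clarity — a behavioral sketch of the C API, not the firmware itself; state names follow the diagram, and the 150-sample count and 20 ms tick come from the capture description:

```python
IDLE, RECORDING, INFER_DONE = range(3)

class GestureApp:
    """PC8-driven capture: IDLE -> RECORDING -> INFER_DONE -> IDLE."""

    def __init__(self, samples_needed=150):
        self.state = IDLE
        self.samples_needed = samples_needed
        self.count = 0

    def on_pc8(self):
        """Mirrors Gesture_OnPC8Press: start capture, or reset after a result."""
        if self.state == IDLE:
            self.state, self.count = RECORDING, 0
        elif self.state == INFER_DONE:
            self.state = IDLE  # Gesture_Reset equivalent

    def tick(self):
        """Mirrors Gesture_Tick: one IMU sample per 20 ms tick while recording."""
        if self.state == RECORDING:
            self.count += 1
            if self.count == self.samples_needed:
                self.state = INFER_DONE  # inference would run here
```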


Game Gesture Mappings

Each game uses gesture recognition differently:

Game 1 — Tennis Rally Showdown: Gestures are your swing. The auto-pre-trigger detects when the ball is approaching and starts capture automatically. Three gestures map to the three shot types:

Gesture Shot Behavior
Horn SMASH Powerful, fast, low trajectory
Wave TOPSPIN Fast with heavy forward spin, dips after net
Letter W SLICE Slower with backspin, stays low

Game 2 — Temple Quest: Press PC8 to cast a gesture skill. Each gesture activates a different ability (Shield, Dash, Heal, Shockwave, Magnet, Bonus Score).

Game 3 — Gold Miner Deluxe: Gestures used on the mode selection screen (Circle = move left, Horn = move right).


Highlights

  • Self-Collected Dataset: 650 gesture recordings across 13 classes, collected by our team using our own hardware and Python tooling — not downloaded, not pre-packaged
  • Dual-Mode Firmware: Game Mode (3 games + CNN gesture test) and Data Collection Mode (streams MPU6050 data to PC for labeling), switchable by #define
  • On-Device CNN Inference: 13-class gesture recognition using NNoM on STM32L476 (Cortex-M4, 80 MHz) — 15-27 ms per inference, ~8 KB RAM, zero cloud dependency
  • Complete Training Pipeline: Serial_Read.py (capture) → CNNTrainRaw.py (train + export weights.h) → nnom_infer.c (on-device inference) — full reproducibility
  • Three Distinct Games: Tennis (sports) → Temple Quest (roguelike) → Gold Miner (arcade) — each developed independently with shared peripheral drivers
  • Rich Hardware Stack: 240×240 SPI LCD, dual analog joysticks, 9 EXTI buttons, PWM buzzer with musical notes, PWM LED, hardware RNG, MPU6050 6-axis IMU
  • No-Merge-Conflict Architecture: Shared runtime in shared/, each game in its own folder — three developers work simultaneously

System Architecture

┌─────────────────────────────────────────────────────┐
│                    main.c                            │
│  ┌──────────┐  ┌──────────┐  ┌──────────────────┐  │
│  │  Menu    │  │ Gesture  │  │  Game Dispatcher  │  │
│  │ System   │  │ Test Mode│  │  (State Machine)  │  │
│  └──────────┘  └──────────┘  └──────────────────┘  │
│         │                           │               │
│    ┌────┴────┬─────────────┬───────┴───────┐       │
│    │ Game 1  │   Game 2    │    Game 3     │       │
│    │ Tennis  │ Temple Quest│  Gold Miner   │       │
│    └─────────┴─────────────┴───────────────┘       │
│                                                      │
│  Shared Layer: Menu, InputHandler, SoftI2C           │
│  Driver Layer: LCD, Joystick, Buzzer, PWM, MPU6050   │
│  CNN Layer: NNoM inference engine + GestureApp API   │
└─────────────────────────────────────────────────────┘

Hardware Platform

Component Detail
MCU STM32L476RGTx — ARM Cortex-M4 @ 80 MHz, 1 MB Flash, 128 KB SRAM
Board NUCLEO-L476RG
Display ST7789V2 1.54" SPI LCD — 240×240 px, 16-color indexed palette, DMA-accelerated
Joysticks 2× analog (12-bit ADC) — P1: PC0/PC1 (A5/A4), P2: PA0/PA1 (A0/A1) — 8-direction + polar/cartesian output
Buttons 9× with hardware EXTI debouncing (200 ms) — BTN2-BTN9, PC8 (gesture trigger)
Audio PWM buzzer on TIM2 CH3 (PB10) — 20 Hz–20 kHz, musical notes C4–C8 with sharps
Lighting PWM LED on TIM4 CH1 (PB6) — 0–100% brightness; onboard LD2 on PA5
IMU MPU6050 6-axis — accelerometer ±2/4/8/16 g, gyroscope ±250/500/1000/2000 dps, SoftI2C on PB4/PB5
RNG Hardware random number generator for procedural content
Debug USART2 (PA2/PA3) @ 115200 baud (game mode) / 921600 baud (data collection) with printf redirect

Project Structure

├── Core/                    STM32CubeMX auto-generated HAL init
├── Drivers/                 CMSIS + STM32L4 HAL library
├── shared/                  Shared runtime
│   ├── Menu.h/c             Menu state machine + star-field UI
│   ├── InputHandler.h/c     Button debouncing + input state
│   └── SoftI2C.h/c          Bit-bang I2C for MPU6050
├── game_1/                  Tennis Rally Showdown — gesture-driven tennis
├── game_2/                  Temple Quest — top-down survival maze runner
├── game_3/                  Gold Miner Deluxe — dual-player arcade miner
├── gesture_app/             GestureApp API — PC8-driven capture state machine
├── CNN/                     NNoM inference engine
│   ├── nnom_infer.h/c       CNN API wrapper
│   ├── weights.h            Pre-trained model weights (~12 KB)
│   └── nnom_lib/            NNoM v0.4.3 library
├── Joystick/                Dual-axis ADC joystick driver
├── Buzzer/                  PWM buzzer driver with musical note API
├── PWM/                     General-purpose PWM driver
├── MPU6050/                 6-axis IMU driver (SoftI2C)
├── ST7789V2_Driver_STM32L4/ LCD driver (SPI + DMA)
├── TraningData_5_11/        Self-collected gesture dataset (650 .txt files)
├── Serial_Read.py           Data collection script (PC ← MCU over UART)
├── CNNTrainRaw.py           CNN training + weights.h export script
└── CNNTestSerialRaw.py      Serial inference test script

The Three Games

1. Tennis Rally Showdown — Sports. Key innovation: MPU6050 swing-as-racket — the CNN classifies your gesture to determine shot type. Developer: Yaotian Zhang (see its README).
2. Temple Quest — Roguelike Runner. Key innovation: procedural room generation + curse chase + gesture-cast abilities. Developer: Shuo Li (see its README).
3. Gold Miner Deluxe — Arcade. Key innovation: dual-joystick co-op/PvP, 3-level campaign, shop economy. Developer: Jiahao Ding (see its README).

See each game's README for full gameplay details, controls, items, enemies, and tips.


Button Map

Button GPIO EXTI Typical Use
BTN2 PC2 EXTI2 Back to menu / Exit
BTN3 PC3 EXTI3 Menu select / Confirm
PC8 PC8 EXTI8 Gesture capture trigger
BTN4 PA8 EXTI8 Custom (Game 3)
BTN5 PA10 EXTI10 Custom
BTN6 PA7 EXTI7 Custom
BTN7 PA6 EXTI6 Custom
BTN8 PC4 EXTI4 Custom
BTN9 PC5 EXTI5 P2 fire button (Game 3)

Available Timers for Game Use

Timer Default Role Free?
TIM2 Buzzer PWM Occupied
TIM4 LED PWM Occupied
TIM6 100 Hz ISR (10 ms tick) Auto-started, g_tim6_ticks available
TIM7 1 Hz ISR (1 s tick) Initialized, not started — call HAL_TIM_Base_Start_IT(&htim7)

Build & Flash

cd build/Debug
ninja                          # Compile
# Flash Unit_4_1_Menu_Template.bin to STM32L476 via ST-Link

To switch modes, edit Core/Src/main.c line 71:

#define MODE_GAME              // Normal game mode
// #define MODE_DATA_COLLECTION  // Data collection mode

Boot Sequence (Game Mode)

  1. Pixel-art startup animation with team member credits (~7.5 s)
  2. CNN inference self-test using 3 pre-recorded gestures (Circle, Horn, Letter_H)
  3. Gesture Test Mode: Press PC8 to try gesture recognition — results printed to UART
  4. Press BTN3 → Game Menu Mode: Joystick UP/DOWN to browse, BTN3 to launch

About

XJEL2645 Embedded Systems Project — Group Project Game