A comprehensive guide to the hardware, firmware, software, and algorithms used in the project.
- 1. Project Overview
- 2. Hardware Setup & Components
- 3. Arduino Firmware Explained
- 4. Unity Project Architecture
- 5. Key Algorithms & Techniques
- 6. User Interface (UI) System
- 7. Signal Flow: How It All Works Together
This project is an interactive simulation of a robotic arm controlled in 3D space by a custom-built, motion-sensing "air mouse." The user physically moves the air mouse, and the on-screen robotic arm mimics the motion to perform tasks in a training game. The system is designed to be highly modular, allowing for easy customization of the arm's structure, controls, and game mechanics.
- Real-time Motion Control: The arm's target is controlled by the pitch, yaw, and roll of a physical MPU-6050 sensor.
- Modular Robotic Arm: The arm is built using an Inverse Kinematics (IK) system that can support any number of joints and bone lengths.
- Interactive Training Game: A game loop where the player picks up and assembles parts at target locations.
- Haptic Feedback: A vibration motor provides physical feedback for key game events like picking up and placing objects.
- Procedural Audio: Joint movement sounds are generated in real-time, creating a dynamic and realistic effect that syncs perfectly with motion speed.
- Advanced UI Shaders: Custom raymarching shaders create a dynamic, glowing "holographic" cube and background that react to the air mouse's rotation.
- Comprehensive UI Control: In-game menus allow the user to connect, configure all air mouse and haptic feedback parameters, and view a live data stream.
- Arduino Nano (or compatible board)
- MPU-6050 6-Axis Accelerometer & Gyroscope Module
- Vibration Motor Module (with built-in driver)
- Pushbutton (for optional click input)
- Breadboard and Jumper Wires
The components are connected to the Arduino Nano as follows. Power is supplied via the USB connection to the computer.
- MPU-6050 Sensor:
  - `VCC` → `5V` on Arduino
  - `GND` → `GND` on Arduino
  - `SCL` → `A5` on Arduino (I2C Clock)
  - `SDA` → `A4` on Arduino (I2C Data)
- Vibration Motor Module:
  - `GND` → `GND` on Arduino
  - `VCC` → `5V` on Arduino
  - `IN` → Pin `D9` on Arduino (must be a PWM pin, marked with `~`)
- Pushbutton (Optional):
  - One leg → `GND` on Arduino
  - Other leg → Pin `D3` on Arduino
- Arduino Nano: The "brain" of the physical controller. Its sole purpose is to read raw data from the MPU-6050, listen for commands from Unity, and send all data to the computer over the USB serial port.
- MPU-6050: The motion sensor. It contains a 3-axis accelerometer and a 3-axis gyroscope, providing the rotational data (pitch, roll, yaw) that drives the air mouse.
- Vibration Motor Module: The haptic feedback device. The module includes a driver transistor, allowing the Arduino to safely turn the motor on and off with a simple digital signal, and control its intensity with PWM.
The Arduino code is designed to be a "dumb" data forwarder. It performs no calculations or game logic. This is a deliberate design choice that makes the hardware universal; all complex logic is handled in Unity, allowing for rapid iteration without ever needing to re-upload code to the Arduino.
Communication happens over the serial port at a 115200 baud rate.
- Arduino to Unity (Sensor Data): The Arduino constantly sends a single line of comma-separated values (CSV) representing the full sensor state.
  - Format: `ax,ay,az,gx,gy,gz,clickState`
    - `ax, ay, az`: Raw accelerometer data.
    - `gx, gy, gz`: Raw gyroscope data (used for pitch, yaw, roll).
    - `clickState`: `1` if the button is pressed, `0` otherwise.
- Unity to Arduino (Haptic Commands): Unity sends simple string commands to the Arduino to control the vibration motor.
  - Format: `V,intensity,duration\n`
    - `V`: The command prefix for Vibration.
    - `intensity`: A number from `0` (off) to `255` (full power).
    - `duration`: The time in milliseconds the motor should stay on.
    - `\n`: A newline character that signals the end of the command.
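As a concrete illustration, both message formats are simple enough to parse with plain `sscanf`. The following is a hedged C++ sketch, not the project's actual Unity or Arduino code; the struct and function names are invented for the example.

```cpp
#include <cstdio>
#include <string>

// Illustrative structs; field names mirror the CSV order described above.
struct SensorPacket { int ax, ay, az, gx, gy, gz, click; };
struct HapticCommand { int intensity, duration; };

// Parse "ax,ay,az,gx,gy,gz,clickState" into a SensorPacket.
bool parseSensorLine(const std::string& line, SensorPacket& out) {
    return std::sscanf(line.c_str(), "%d,%d,%d,%d,%d,%d,%d",
                       &out.ax, &out.ay, &out.az,
                       &out.gx, &out.gy, &out.gz, &out.click) == 7;
}

// Parse "V,intensity,duration" (the trailing '\n' is assumed already stripped).
bool parseHapticCommand(const std::string& line, HapticCommand& out) {
    return std::sscanf(line.c_str(), "V,%d,%d",
                       &out.intensity, &out.duration) == 2;
}
```

Returning a boolean on a failed parse lets either side skip malformed lines, which matters on a serial link where a partial line can arrive at connection time.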
- `setup()`: Initializes serial communication, connects to the MPU-6050 sensor, and configures the motor pin as an output.
- `loop()`:
  - Listen for Commands: It first checks `Serial.available()` to see if a command has arrived from Unity. If so, it reads the string and parses the intensity and duration.
  - Manage Vibration (Non-Blocking): The code uses the `millis()` function for vibration control. When a command is received, it turns the motor on and calculates a future timestamp (`vibrationStopTime`). It then checks on every loop whether the current time has passed this timestamp; if it has, it turns the motor off. This avoids using `delay()`, which would halt the program and disrupt the sensor data stream.
  - Read and Send Sensor Data: It reads the latest data from the MPU-6050 and sends it to the computer in the specified CSV format.
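The non-blocking timer pattern can be sketched in a few lines. In this C++ sketch, `millis()` and the motor pin are simulated so the logic can run off-device; on real hardware these would be Arduino's `millis()` and `analogWrite()`, and the helper names here are illustrative.

```cpp
#include <cstdint>

// Simulated clock and motor state (stand-ins for Arduino hardware).
static uint32_t fakeClockMs = 0;
static bool motorOn = false;

uint32_t millisNow() { return fakeClockMs; }
void setMotor(bool on) { motorOn = on; }

static uint32_t vibrationStopTime = 0;

// Called when a "V,intensity,duration" command arrives:
// turn the motor on and remember when it should stop.
void startVibration(uint32_t durationMs) {
    setMotor(true);
    vibrationStopTime = millisNow() + durationMs;  // future timestamp
}

// Called on every loop() iteration; never blocks, unlike delay(),
// so sensor reads continue uninterrupted while the motor runs.
void updateVibration() {
    if (motorOn && millisNow() >= vibrationStopTime)
        setMotor(false);
}
```

Because `updateVibration()` only compares timestamps, the main loop keeps streaming sensor data at full rate while a vibration is in progress.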
The Unity project is built on several independent, modular systems that communicate with each other.
- Input System (`AirMouseInput`): Handles all communication with the Arduino.
- Robotic Arm System (`IKController`): Manages the arm's physical simulation.
- Game Logic (`AssemblyGameManager`): Controls the training game rules and state.
- UI System (Various Scripts): Manages all menus, displays, and visual feedback.
- Sound System (Various Scripts): Manages procedural joint audio and event-based sound effects.
- `IK_System`: An empty GameObject containing the hierarchy of arm joints. The `IKController` script is attached here.
- `GameManager`: An empty GameObject with the `AssemblyGameManager` script.
- `UI Canvas`: Contains all UI panels, buttons, and text displays.
- Prefabs: `AssemblyPart` and `TargetLocation` prefabs are used to spawn the game objects for each round.
- `AirMouseInput.cs`:
  - Purpose: Manages the serial port connection on a separate thread to prevent the game from freezing. Reads incoming sensor data, queues outgoing vibration commands, and provides smoothed input values to other scripts.
  - Key Logic: Uses `System.IO.Ports.SerialPort` for communication. A `Thread` reads data in a loop. A `ConcurrentQueue` safely handles commands sent from the main game thread. `Vector2.SmoothDamp` provides frame-rate-independent smoothing for buttery-smooth controls in both the editor and final builds.
- `IKController.cs`:
  - Purpose: Implements the Inverse Kinematics algorithm.
  - Key Logic: Takes a list of joint `Transforms` and a `Target` transform. In `LateUpdate`, it iteratively adjusts each joint's rotation to make the `EndEffector` reach the `Target`. See the Algorithms section for more detail.
- `TargetController.cs`:
  - Purpose: Moves the IK target in the scene based on user input.
  - Key Logic: Has two modes. In `Mouse` mode, it uses screen-to-world raycasting. In `AirMouse` mode, it reads the smoothed data from `AirMouseInput` and translates its position accordingly. It also implements the movement bounding box.
- `AssemblyGameManager.cs`:
  - Purpose: The "brain" of the training game. Manages game state, spawning, and win conditions.
  - Key Logic: Uses a state machine (`AwaitingPickup`, `AssemblingPart`). Spawns parts and locations at random. The "pickup" mechanic is achieved by parenting the assembly part to the player's target. It uses C# `Action` events to announce key moments (`OnPartPickedUp`, etc.) to other scripts in a decoupled way.
- `JointSoundController.cs`:
  - Purpose: Generates realistic, procedural motor sounds for a single joint.
  - Key Logic: Attached to each joint pivot. It calculates angular speed in `Update()`. In `OnAudioFilterRead()`, it generates a sound wave from scratch, mixing a base waveform (like Sawtooth) with percussive clicks and a low-frequency grind. The pitch and volume are directly controlled by the joint's movement speed.
- UI Controller Scripts (`MainMenuController`, `AirMouseUIController`, etc.):
  - Purpose: These scripts act as bridges between the UI elements (sliders, buttons) and the public variables of the core system scripts.
  - Key Logic: They read initial values from scripts like `AirMouseInput` to populate the UI. They use `onClick` and `onValueChanged` listeners to call functions that update the variables on the target scripts.
The `IKController` uses Cyclic Coordinate Descent (CCD), an intuitive iterative algorithm.
- The loop starts from the joint closest to the end effector and moves backward toward the base.
- For each joint, it creates two vectors: `VectorA` (from the current joint to the end effector) and `VectorB` (from the current joint to the target).
- It calculates the rotation needed to align `VectorA` with `VectorB` using `Quaternion.FromToRotation`.
- It applies this small rotation to the current joint.
- This process repeats for all joints. By running this entire cycle several times per frame, the arm quickly converges on a solution and points at the target.
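The steps above can be sketched in 2D, where each joint is a single angle and the `Quaternion.FromToRotation` step reduces to an angle difference. This C++ sketch is illustrative (the `Arm` and `solveCCD` names are invented), not the project's Unity implementation.

```cpp
#include <cmath>
#include <vector>

struct Vec2 { double x, y; };

// Minimal 2D CCD arm: each joint stores a relative angle; forward
// kinematics accumulates angles and bone lengths from the base out.
struct Arm {
    std::vector<double> angles;   // joint angles in radians
    std::vector<double> lengths;  // bone lengths

    // Position of joint i (i == angles.size() gives the end effector).
    Vec2 jointPos(size_t i) const {
        Vec2 p{0.0, 0.0};
        double a = 0.0;
        for (size_t j = 0; j < i; ++j) {
            a += angles[j];
            p.x += lengths[j] * std::cos(a);
            p.y += lengths[j] * std::sin(a);
        }
        return p;
    }

    void solveCCD(Vec2 target, int iterations) {
        for (int it = 0; it < iterations; ++it) {
            // Closest joint to the end effector first, then back to the base.
            for (int j = (int)angles.size() - 1; j >= 0; --j) {
                Vec2 joint = jointPos(j);
                Vec2 effector = jointPos(angles.size());
                // VectorA: joint -> effector, VectorB: joint -> target.
                double aAngle = std::atan2(effector.y - joint.y, effector.x - joint.x);
                double bAngle = std::atan2(target.y - joint.y, target.x - joint.x);
                angles[j] += bAngle - aAngle;  // rotate VectorA onto VectorB
            }
        }
    }
};
```

Running the whole backward sweep several times per frame is what makes the arm converge smoothly instead of snapping, exactly as the steps above describe.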
To create a realistic sound that syncs perfectly with movement, the `JointSoundController` generates audio from code instead of playing a recording.
- `OnAudioFilterRead(float[] data, int channels)`: This special Unity function runs on a separate audio thread. It gives the script direct access to the audio buffer (`data`) before it's sent to the speakers.
- Waveform Generation: The script generates a base tone by calculating the values of a mathematical function (like `Sine` or `Sawtooth`) over time. The `frequency` of this function determines the pitch.
- Layer Mixing: A convincing mechanical sound is created by mixing three layers:
  - Whine: The base `Sawtooth` waveform mixed with a small amount of random noise.
  - Clicks: A percussive layer made by generating a burst of loud noise that quickly decays, triggered at a frequency proportional to movement speed.
  - Grind: A low-frequency sine wave that modulates the amplitude of the whine layer, creating a "wobble" effect.
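Two of these layers can be sketched directly: a sawtooth whine whose pitch rises with joint speed, amplitude-modulated by a slow sine grind. The noise and click layers are omitted for brevity, and all constants here (base pitch, speed-to-pitch mapping, gain) are invented for illustration; the project's actual values live in the Unity script.

```cpp
#include <cmath>
#include <vector>

// Render `samples` mono samples of a speed-driven motor sound.
std::vector<float> renderMotorSound(double jointSpeed, int samples,
                                    double sampleRate = 48000.0) {
    const double kTwoPi = 6.283185307179586;
    double frequency = 80.0 + jointSpeed * 4.0;  // pitch follows movement speed
    double grindFreq = 6.0;                      // low-frequency "wobble"
    std::vector<float> out(samples);
    double phase = 0.0, grindPhase = 0.0;
    for (int i = 0; i < samples; ++i) {
        double saw = 2.0 * phase - 1.0;                     // sawtooth in [-1, 1]
        double grind = 0.5 * (1.0 + std::sin(grindPhase));  // modulator in [0, 1]
        out[i] = (float)(saw * grind * 0.3);                // whine * grind, scaled down
        phase += frequency / sampleRate;
        if (phase >= 1.0) phase -= 1.0;                     // wrap the sawtooth
        grindPhase += kTwoPi * grindFreq / sampleRate;
    }
    return out;
}
```

In Unity this body would run inside `OnAudioFilterRead`, writing into the provided `data` buffer each audio callback rather than returning a vector.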
The holographic cube and background effects are created using raymarching, a rendering technique different from standard polygons.
- Signed Distance Field (SDF): The core of the shader is a function `D(p)` that, for any point in space `p`, returns the shortest distance to any object in the scene.
- Raymarching Loop: The shader casts a ray from the camera. Instead of checking for triangle intersections, it evaluates the SDF to find the largest "safe" step it can take along the ray. It takes this step and repeats the process. This is much more efficient for rendering complex mathematical shapes.
- Shader Logic: The shader calculates the cube's SDF, which is rotated based on the air mouse input. When a ray hits the surface, it calculates reflections of a procedural sky and floor to determine the final color. The background-only version uses the same technique but makes the cube itself invisible, showing only the reflections.
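The SDF and marching loop can be shown for a single box on the CPU. This C++ sketch is illustrative: the real effect runs per pixel in a shader, and the step limits and epsilon here are typical values, not the project's.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }

// Signed distance from p to an axis-aligned box with half-extents b,
// centered at the origin (negative inside, positive outside).
double sdBox(Vec3 p, Vec3 b) {
    double qx = std::fabs(p.x) - b.x;
    double qy = std::fabs(p.y) - b.y;
    double qz = std::fabs(p.z) - b.z;
    double outside = std::sqrt(std::max(qx, 0.0) * std::max(qx, 0.0) +
                               std::max(qy, 0.0) * std::max(qy, 0.0) +
                               std::max(qz, 0.0) * std::max(qz, 0.0));
    double inside = std::min(std::max(qx, std::max(qy, qz)), 0.0);
    return outside + inside;
}

// March along origin + t*dir, stepping by the SDF's "safe" distance.
// Returns the hit distance, or -1.0 if the ray misses.
double raymarch(Vec3 origin, Vec3 dir) {
    double t = 0.0;
    for (int i = 0; i < 128; ++i) {
        Vec3 p = add(origin, scale(dir, t));
        double d = sdBox(p, {1.0, 1.0, 1.0});
        if (d < 1e-4) return t;  // close enough: surface hit
        t += d;                  // largest step guaranteed not to overshoot
        if (t > 100.0) break;    // ray escaped the scene
    }
    return -1.0;
}
```

Rotating the cube with the air mouse amounts to rotating `p` by the inverse of the cube's rotation before evaluating `sdBox`, which is why an SDF scene reacts to input so cheaply.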
The `AssemblyGameManager` uses `public static event Action` to announce game events. Other scripts, like `GameLogUI` and `GameSoundEffects`, "subscribe" to these events.

- Benefit: This is a powerful design pattern. The `GameManager` doesn't need to know that a UI or sound script exists. It just shouts "A part was picked up!" into the void, and any script interested in that event can listen for it. This makes the code highly modular and easy to expand: you can add new feedback systems without ever touching the `GameManager` code again.
- Functionality: Allows the user to input the COM port and baud rate. A "Connect" button attempts to initialize the `AirMouseInput` script. A toggle allows the user to bypass this and use the standard mouse.
- Feedback: A status text field provides real-time updates ("Connecting...", "Connected!", "Failed"). Upon successful connection, a raw data display panel appears, showing the live data stream from the Arduino.
- Functionality: This panel, controlled by `AirMouseUIController`, provides sliders, toggles, and dropdowns to configure every public variable on the `AirMouseInput` and `AssemblyGameManager` scripts.
- Controls: Sensitivity, roll sensitivity, deadzone, smoothing, axis mapping, axis inversion, and all haptic feedback intensity/duration values can be tweaked in real time.
- Targeting Line: A `LineRenderer` object controlled by `TargetingLineUI`. It draws a line between the player's controller and the current objective (either the part to be picked up or the assembly location). A `TextMeshPro` object at the line's midpoint displays the distance.
- Game Log: A `TextMeshPro` text element controlled by `GameLogUI`. It listens for game events and displays status messages ("New Round!", "Part Picked Up!") with a smooth fade-out effect.
This is the end-to-end data flow for a single user action:
- The user physically moves the air mouse hardware.
- The MPU-6050 sensor detects the change in orientation.
- The Arduino's `loop()` reads the new sensor values.
- The Arduino formats the data into a CSV string and sends it over the USB serial port.
- In Unity, the `AirMouseInput` script's dedicated thread is constantly listening. It receives the CSV string.
- The thread parses the string into numerical values for pitch and yaw.
- In the main game thread, the `Update()` method in `AirMouseInput` reads these latest values and applies `SmoothDamp` to create a smoothed input vector.
- The `TargetController` script reads this final, smoothed vector and updates the 3D position of the `Target` GameObject in the scene.
- The `IKController`'s `LateUpdate()` method detects that its `Target` has moved. It runs the CCD algorithm to calculate the new rotations for all arm joints so the end effector follows the target.
- As the joints rotate, the `JointSoundController` script on each joint detects the angular speed and generates a procedural motor sound with a corresponding pitch and volume.
- If the `Target` moves close to an `AssemblyPart`, the `AssemblyGameManager` script detects this, parents the part to the target, and fires the `OnPartPickedUp` event.
- The `GameSoundEffects` and `GameLogUI` scripts hear this event and play the pickup sound and display the "Part Picked Up!" message, respectively.
- The `AssemblyGameManager` also calls `SendVibrationCommand` on the `AirMouseInput` script. `AirMouseInput` queues the command, and on its next processing cycle the serial thread sends the `V,200,150\n` command back to the Arduino.
- The Arduino's `loop()` receives the command, turns on the vibration motor, and sets a timer to turn it off, providing haptic feedback to the user.