# UniBots — How the system works

This overview matches the integrated match stack in **`FinalVersion/mainMK.py`** (target: **Raspberry Pi 5**). Other folders (`MKImplementation/`, `KrishivBallDetection/`, `GeneralTesting_*`, etc.) are parallel variants and experiments.

---

## My role

**Hardware and software:** I worked on **Raspberry Pi** bring-up, **circuitry** (wiring, plus sensors and actuators together with the rest of the team), and the **embedded Python** that ties vision, motors, and servos together. The competition stack relied on **reactive vision + motor control** (artificial potential field–style navigation in the ball pipeline), not a separate global planner.

**Path planning:** I also explored implementing a **path planning** algorithm for the robot, but that approach **was not used in the competition** — match code stayed with the integrated camera-driven navigation and atomic pick/dock sequences below.

---

## Purpose

The robot **searches for balls** with a camera, **drives toward them** using motor commands from vision, **picks them up** with a servo-driven claw, and when storage is full or time runs out, uses **AprilTags** to find a **basket / wall** and **unload**.

---

## Main pieces

| Piece | Role |
|--------|------|
| **Camera + OpenCV** | Captures frames and passes the latest image to detector processes. |
| **Ball detector process** | Runs the “K” vision pipeline (`ball_detector_runtime_K`: detect balls, obstacles, navigation vector). |
| **AprilTag detector process** | Runs `AprilTagNavigator_M` (pupil_apriltags) to detect the configured tag IDs and compute steering toward the dock. |
| **Motors thread** | Executes movement from `moveTargetBall` / `moveTargetTag`, and **atomic sequences** (approach → stop → grab or dock). |
| **Servo thread** | Drives the claw and the unload handle; coordinated via `clawBusy`, `pickingInProcess`, etc. |
| **Button (gpiozero)** | Start / pause workers / long-hold reset. |
| **Main timer process** | Tracks match time; after ~160 s of runtime can set **finalising** state for end-game unload. |

---

## Concurrency model

- **Processes:** `MainTimer`, `BallDetector`, `AprilTagDetector` (multiprocessing).
- **Threads:** `MotorsHandler`, `ServoHandler`, `ButtonHandler`, `ImageFrameCapture`.

**Pause / stop:** `worker_pause_event` (workers), `timer_pause_event` (timer), `stop_event` (shutdown).
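The pause/stop convention above can be sketched as a generic worker skeleton: a set `worker_pause_event` means "running", a cleared one blocks the loop, and `stop_event` ends it. The `worker_loop` helper and its `work` callable are illustrative, not the exact functions in `mainMK.py`:

```python
import threading
import time

def worker_loop(worker_pause_event, stop_event, work, poll_s=0.02):
    """Generic worker skeleton: blocks while paused, exits on stop.

    `work` is called once per iteration; names here are illustrative,
    not the actual helpers in mainMK.py."""
    while not stop_event.is_set():
        # Block here while paused (set == "running").
        worker_pause_event.wait()
        work()
        time.sleep(poll_s)

# Usage: pause by clearing the event, resume by setting it,
# shut down by setting stop_event.
pause = threading.Event(); pause.set()
stop = threading.Event()
ticks = []
t = threading.Thread(target=worker_loop, args=(pause, stop, lambda: ticks.append(1)))
t.start()
time.sleep(0.1)
stop.set()
t.join()
```

The multiprocessing workers follow the same shape, just with `multiprocessing.Event` instead of `threading.Event`.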

---

## Shared state (mental model)

- **`moveTargetBall` / `moveTargetTag`** — `{"angle": degrees, "magnitude": 0..1}` (angle relative to heading; 0° = forward).
- **`lockedBall`**, **`pickingInProcess`**, **`dockingInProcess`** — flags that gate the **atomic** capture and dock sequences; frame capture can freeze while they are set.
- **`engageClaw` / `engageHandle`**, **`clawAdjusted`**, **`clawBusy`**, **`storageFull`**, ball counts — servos and when to unload.
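Given the `{"angle": degrees, "magnitude": 0..1}` convention (0° = forward), a differential-drive mapping into left/right wheel speeds could look like the sketch below. `target_to_wheel_speeds` is a hypothetical helper; the real `MotorsHandler` does its own mapping:

```python
import math

def target_to_wheel_speeds(move_target, max_speed=1.0):
    """Map a {"angle": deg, "magnitude": 0..1} target to (left, right)
    wheel speeds. Hypothetical helper, not mainMK.py's actual mapping."""
    if move_target is None:
        return 0.0, 0.0              # no target: stop (SEARCH mode scans instead)
    angle = math.radians(move_target["angle"])
    mag = move_target["magnitude"]
    turn = math.sin(angle)           # positive angle -> speed up the left wheel
    left = max_speed * mag * (1.0 + turn)
    right = max_speed * mag * (1.0 - turn)
    clamp = lambda v: max(-max_speed, min(max_speed, v))
    return clamp(left), clamp(right)
```

A dead-ahead target (`angle=0`) drives both wheels equally; a target off to one side slows the inner wheel and speeds the outer one.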

---

## Ball vision: three modes

1. **SEARCH** — no balls: motors may scan (e.g. pivot).  
2. **TRACK** — `moveTargetBall` set; `compute_navigation_vector()` adds obstacle repulsion in the K runtime.  
3. **CAPTURE** — ball low in frame and aligned: **`pickingInProcess`** triggers the atomic grab.
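The potential-field idea behind TRACK can be sketched as attraction toward the ball plus repulsion from nearby obstacles. This is a generic illustration, not the K runtime's actual `compute_navigation_vector`; the gains, radius, and coordinate convention (pixels relative to the robot at bottom-centre, +y forward) are assumptions:

```python
import math

def navigation_vector(ball_xy, obstacles_xy, repulse_gain=0.5, repulse_radius=150.0):
    """Potential-field steering sketch: unit attraction toward the ball,
    repulsion from obstacles within repulse_radius. Illustrative only."""
    ax, ay = ball_xy
    d = math.hypot(ax, ay) or 1.0
    vx, vy = ax / d, ay / d                  # attraction: unit vector to ball
    for ox, oy in obstacles_xy:
        od = math.hypot(ox, oy)
        if 0 < od < repulse_radius:
            # repulsion grows linearly as the obstacle gets closer
            w = repulse_gain * (1.0 - od / repulse_radius)
            vx -= w * ox / od
            vy -= w * oy / od
    angle_deg = math.degrees(math.atan2(vx, vy))  # 0 deg = straight ahead (+y)
    magnitude = min(1.0, math.hypot(vx, vy))
    return {"angle": angle_deg, "magnitude": magnitude}
```

With no obstacles this degenerates to pure pursuit of the ball; an obstacle on one side bends the heading away from it.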

---

## AprilTags

`AprilTagNavigator` keeps a set of allowed tag IDs, selects the closest valid tag, and supplies steering toward the basket. Docking mirrors picking: an **atomic** sequence runs while **`dockingInProcess`** is set.
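The "closest valid tag" step can be sketched as a filter-then-min over detections. The dict shape below (`tag_id`, `pose_t_z` as an approximate range from the pose translation) is a stand-in for pupil_apriltags' detection objects, and the function is illustrative rather than `AprilTagNavigator_M`'s exact logic:

```python
def pick_closest_tag(detections, allowed_ids):
    """Choose the nearest allowed tag, or None if no valid tag is seen.
    Detections are stand-in dicts with a tag_id and an approximate
    range (pose_t_z); sketch only."""
    valid = [d for d in detections if d["tag_id"] in allowed_ids]
    if not valid:
        return None
    return min(valid, key=lambda d: d["pose_t_z"])
```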

---

## Dependencies

See the repo's **`dependencies.txt`** (gpiozero, opencv-contrib-python, numpy). The code also uses **pupil_apriltags** for AprilTag detection, which may not be listed there — install it as well, along with anything else your entry script imports.

---

## Repo layout note

**`FinalVersion/`** is the best starting point for the integrated competition behavior; other directories are hardware variants or tests.

---

## Code excerpts

Below are **real snippets** from `FinalVersion/` showing how the main integration is structured.

### Entry imports, GPIO/camera constants, and shared state

`FinalVersion/mainMK.py`

```python
import multiprocessing
import threading
import time
import logging

import cv2
from gpiozero import Button

import ServoController_MK as SC
import MotorsController_M_bugFix as MC

from ball_detector_runtime_K import (
    detect_balls,
    detect_obstacles,
    compute_navigation_vector,
)

import configTag_M as configTag
from AprilTagNavigator_M import AprilTagNavigator

log = logging.getLogger(__name__)  # module logger used by the worker excerpts below

BUTTON_GPIO = 4           # GPIO pin for the physical button
CAMERA_INDEX = 0
CAPTURE_Y_THRESHOLD = 0.88  # ball bottom at this fraction of frame height → Capture mode

SHARED_INITIAL_STATE = {
    "algHasBeenStarted":  False,
    "timeStarted":        0.0,
    "finilisingState":    False,
    "startingPosReached": False,
    "lockedBall":         "",
    "moveTargetBall":     None,
    "pickingInProcess":   False,
    "captureAlignPending": False,
    "captureAlignAngleDeg": 0.0,
    "captureAlignRetryCount": 0,
    "captureIgnoreUntilS": 0.0,
    "lastCaptureSignature": "",
    "moveTargetTag":      None,
    "dockingInProcess":   False,
    "dockingInfo":       None,
    "engageClaw":         False,
    "engageHandle":       False,
    "clawAdjusted":       False,
    "clawBusy":           False,
    "heldPingPong":        0,
    "heldSteel":           0,
    "storageFull":        False,
    "btnHeld":            False,
}
```
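Since this state dict is read and written across processes as well as threads, a plain `dict` would not propagate writes between processes; one plausible wiring (the actual construction in `mainMK.py` may differ) is a `multiprocessing.Manager` dict seeded from `SHARED_INITIAL_STATE`:

```python
import multiprocessing

def make_shared_state(initial):
    """Wrap the initial-state dict in a Manager dict so writes made in one
    process are visible in the others. One plausible wiring; mainMK.py may
    construct its shared state differently."""
    manager = multiprocessing.Manager()
    return manager.dict(initial)

if __name__ == "__main__":
    shared = make_shared_state({"algHasBeenStarted": False, "heldPingPong": 0})
    # Note: read-modify-write like this is not atomic across processes;
    # counters touched by several workers need a lock.
    shared["heldPingPong"] += 1
```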

### Main timer — match timing and finalising state

`FinalVersion/mainMK.py`

```python
def main_timer_process(shared, timer_pause_event, stop_event):
    log.info("MainTimer started.")

    while not stop_event.is_set():
        timer_pause_event.wait()

        if shared["algHasBeenStarted"]:
            elapsed = time.time() - shared["timeStarted"]
            if elapsed > 160 and not shared["finilisingState"]:
                log.info("160 s elapsed – entering finalising state.")
                shared["finilisingState"] = True
        else:
            shared["algHasBeenStarted"] = True
            shared["timeStarted"]       = time.time()
            log.info("Algorithm timer started.")

        time.sleep(0.02)

    log.info("MainTimer stopped.")
```

### Ball detector — vision loop, guard conditions, SEARCH, pipeline call

`FinalVersion/mainMK.py`

```python
def ball_detector_process(frame_queue, shared, worker_pause_event, stop_event):
    log.info("BallDetector started.")

    while not stop_event.is_set():
        worker_pause_event.wait()

        if (not shared["storageFull"]
                and not shared["finilisingState"]
                and not shared["pickingInProcess"]
                and not shared["dockingInProcess"]):

            frame = None
            try:
                frame = frame_queue.get_nowait()
            except Exception:
                pass

            if frame is None:
                time.sleep(0.02)
                continue

            try:
                # PING_PONG_PROFILE is defined elsewhere in mainMK.py
                detection = detect_balls(frame, ping_pong_profile=PING_PONG_PROFILE)
            except Exception as exc:
                log.warning(f"detect_balls raised: {exc}")
                time.sleep(0.02)
                continue

            detected_balls = detection["detected_balls"]
            frame_height   = detection["frame"].shape[0]

            if not detected_balls:
                shared["lockedBall"]     = ""
                shared["moveTargetBall"] = None
                time.sleep(0.02)
                continue

            target = detected_balls[0]
            # ... CAPTURE / TRACK branches continue below in source ...
```
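The elided CAPTURE/TRACK branches presumably compare the leading ball's position against `CAPTURE_Y_THRESHOLD` and an alignment tolerance. Below is a hedged sketch of that decision only — the field names (`bottom_y`, `angle_deg`) and the tolerance value are illustrative, not the K pipeline's actual detection structure:

```python
def classify_ball(target, frame_height, capture_y_threshold=0.88,
                  align_tolerance_deg=8.0):
    """Decide CAPTURE vs TRACK for the leading detection. Field names and
    the alignment tolerance are illustrative; the real branches in
    mainMK.py operate on the K pipeline's detection structure."""
    low_enough = target["bottom_y"] / frame_height >= capture_y_threshold
    aligned = abs(target["angle_deg"]) <= align_tolerance_deg
    if low_enough and aligned:
        return "CAPTURE"   # would set pickingInProcess -> atomic grab
    return "TRACK"         # keep steering via moveTargetBall
```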

### AprilTag navigator — detector setup and responsibilities

`FinalVersion/AprilTagNavigator_M.py`

```python
import cv2
import math
from pupil_apriltags import Detector


class AprilTagNavigator:
    """
    This class handles AprilTag detection for the robot's basket docking task.

    Main responsibilities:
    1. Detect AprilTags in a frame
    2. Keep only the basket tag IDs we care about
    3. Choose the closest valid tag
    4. Compute steering info so the robot can move toward it
    """

    def __init__(
        self,
        target_tag_ids,
        camera_params,
        tag_size_m,
        frame_size=(640, 480),
        tag_families="tagStandard41h12",
    ):
        self.target_tag_ids = set(target_tag_ids)
        self.camera_params = camera_params
        self.tag_size_m = tag_size_m
        self.frame_width, self.frame_height = frame_size

        self.detector = Detector(
            families=tag_families,
            nthreads=2,
            quad_decimate=1.0,
            quad_sigma=0.0,
            refine_edges=1,
            decode_sharpening=0.25,
            debug=0
        )
```
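The "compute steering info" responsibility can be illustrated with a minimal pinhole-style mapping from a tag's pixel centre to a bearing. The horizontal FOV value and the linear approximation are assumptions for the sketch; the real navigator has `camera_params` and full pose estimates to work with:

```python
def steering_from_tag(center_x, frame_width, hfov_deg=62.0):
    """Convert a tag's pixel centre into an approximate steering angle.
    Assumes a pinhole camera with horizontal FOV hfov_deg (illustrative
    value); 0 deg = tag dead ahead, positive = tag to the right."""
    # normalised offset in [-1, 1] from the optical axis
    offset = (center_x - frame_width / 2.0) / (frame_width / 2.0)
    return offset * (hfov_deg / 2.0)
```

In practice the calibrated intrinsics (and the tag's estimated pose) give a more accurate bearing than this linear pixel mapping.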

---

*Reference notes for the UniBots codebase; excerpts mirror `FinalVersion` at time of writing.*
