
Automated Turret (WORK IN PROGRESS)

January 2025 - April 2025

GitHub Repository

Introduction

My sixth major course at USNA was EW309 (System Modeling and Simulation), in which we carried a project from problem statement to final prototype and report.

The final project was to design and build a turret that could automatically track and fire at a target.


Final Report

Abstract — The EW309 automated turret design project aimed to develop a fully autonomous control system for the Sentry Turret, integrating motor control, computer vision, and ballistics analysis to detect, aim at, and engage static circular targets with a minimum accuracy of 95%. The project was motivated by applications in defense, law enforcement, and industrial automation, and it highlighted the engineering challenges of autonomous targeting, precise motor control, and feedback systems. The final turret design employed a dual-axis actuation system powered by DC motors and controlled via a custom-built circuit using a Raspberry Pi Pico microcontroller. Target detection used a Luxonis OAK-1 Lite camera paired with a YOLOv11-based neural network, achieving classification and localization of colored targets within a given board space under varying conditions. Performance evaluations demonstrated that the turret can successfully target specified circles and discriminate between valid targets and decoys, with steady-state errors within ±0.5°. Ballistics testing at distances between 10 and 20 feet quantified system biases and precision, confirming the turret's capability to reliably meet the specified performance criteria. These results reinforce the effectiveness of classical control methods combined with advanced computer vision in an autonomous turret system.

Introduction

Automated turrets have advanced considerably in recent years, integrating computer vision and custom motor control to improve targeting accuracy and responsiveness. This project focuses on designing an electrical control system for a prefabricated Sentry Turret, incorporating concepts from the United States Naval Academy's Robotics and Control Engineering major. By combining computer vision, motor control, and feedback loops, the turret will autonomously identify and engage targets within its field of view with at least 95% overall accuracy. This section details the motivation behind the project, the problem statement, and the background research that informs the design and implementation of the turret system.

Motivation

Autonomous turrets have significant applications across various industries, including military, law enforcement, and consumer products. In the military, autonomous targeting systems enhance security by providing rapid threat assessment and response with minimal human input or casualties. The ability to identify and neutralize targets while avoiding unintended harm to noncombatants is crucial both ethically and operationally. Beyond defense and law enforcement, autonomous turrets are increasingly used in commercial and research settings, such as industrial automation, wildlife monitoring and management, and robotics competitions. The growing sophistication of computer vision and automated control systems has enabled more precise and accurate targeting, reducing reliance on human operators and increasing overall efficiency. This project addresses key challenges in target identification, motor control, and feedback loops, reinforcing fundamental concepts in robotics, control theory, and computer vision. By engaging with these technical challenges, students gain a deeper understanding of how autonomous systems operate, fostering both technical proficiency and critical thinking.

Problem Statement

Design, implement, and test an autonomous method of controlling a prefabricated Sentry Turret. The method should enable the turret to autonomously detect and hit two circular static targets of a specified size and color with 95% accuracy. The turret should be able to distinguish between its target and decoys of many sizes and colors, including the same size or color as the target. The gun itself will operate in a stationary position with an unobstructed view. The turret should be able to detect two targets within the full field of view of the gun mounted camera, aim itself, and fire accurately at those targets within 15 seconds. The system must also provide diagnostics at the end of each trial including the final steady-state error for pitch and yaw rotations and how many shots the turret intended to take.

Background Research

Modern automated turret designs rely on a camera for target detection and motor actuators to orient the gun. For example, [1] demonstrated a system that uses a webcam and two stepper motors to aim a toy gun, as illustrated in Fig. 1. The turret implements a motion detection algorithm to detect movement within the camera's field of view; when motion is detected, the system calculates the rotation angle and motor steps required to aim at the motion's general location. This approach offers a cheap and easily replicable design, but the target acquisition software remains rudimentary.


Figure 1: Motion detecting Nerf gun turret [1]

Advancements in computer vision include facial recognition and color-filter-based targeting [2], as illustrated in Fig. 2. The proposed method can detect and target multiple colors or faces using Hue, Saturation, and Value (HSV) color filtering. Alternative methods incorporate human detection using histogram of oriented gradients (HOG) and support vector machine (SVM) classifiers [3]. Yet even with these advances in computer vision algorithms, such systems lack a target discrimination capability, shooting anything that moves.


Figure 2: Face detecting laser pointer turret [2]

To overcome the limitations of non-discriminatory target acquisition, recent work focuses on targeting the largest object within the camera's field of view [4]. Should a blue object enter the camera's field of view, the turret drives its servos to the angles needed to aim a laser at it; if multiple blue objects are detected, the turret aims at the largest. The main advantage of this design is its ability to discriminate between targets based on size. However, the target acquisition program is still quite limited, as it only locates blue objects. Giving the user greater control over target selection would enhance the automated turret's ability to distinguish and prioritize firing at specified targets.

Several turret actuation methods rely on servo motors [2], [3], [4], as illustrated in Fig. 3. Servo motors include a built-in position controller for motor actuation. While this works for small turret designs, servo motors are rarely available at larger sizes. To scale this approach for actuating larger guns, a custom motor position control loop is necessary [5]: an additional sensor (such as an encoder) is required to measure the motor's position, and a controller must be developed to compute the input voltage.


Figure 3: Servo motor actuated turret [4]

To address the limitations of previous works [1], [2], [3], [4], this report describes an automated turret system that lets the user specify the characteristics of the targets to fire at and that uses an easily scalable motor controller. By leveraging advances in computer vision, the turret can distinguish between pre-specified targets and decoys, and by taking a classical approach to motor position control, the method can be reproduced at scale.

Turret Actuation

The turret's actuation system comprises three main subsystems: a laptop for user/programmable control, a turret board that interprets command signals and handles power, and the turret body. These processes are depicted in Fig. 4, and all code can be found in Appendix A.


Figure 4: Operational block diagram for actuation phase

The turret body is a repurposed Proaim Sr. Pan Tilt Head [6] operating on two axes, yaw and pitch. A custom motor control board replaces the default control box, enabling direct interfacing with the motors. The tilt head is modified to operate over a 180° range with a physical stop, while the pan head can rotate without limitation, as captured in Fig. 5. The head carries two brushed, high-torque DC motors that operate between 6°/s at 4 VDC and 51°/s at 12 VDC, according to Proaim [6]; both motors carry far less than their maximum rated load of 16.5 lbs. The custom circuit board, shown in Fig. 6, connects two systems: a Raspberry Pi PICO microcontroller, which manages control logic and interprets commands from the laptop, and two motor drivers that power the motors and control their speed and direction. The two ROHM BD62120AEFJ motor drivers operate between 8 VDC and 24 VDC [7]; powered from a 12 VDC supply, they drive the yaw and pitch motors.


Figure 5: Turret motor locations


Figure 6: Board layout and motor wiring

The main advantage of this motor driver is its ability to control high-power (12 VDC) motors with low-power (3.3 VDC) pulse width modulation (PWM) logic from the PICO microcontroller [8]. The driver is a standard H-bridge, which allows control over the polarity of the load and, in turn, the direction of the motors. Motor direction and speed are set via two low-power PWM input pins, IN1 and IN2: turning IN1 on and IN2 off causes clockwise rotation, and turning IN1 off and IN2 on causes counterclockwise rotation. Speed is set by the PWM duty cycle on each pin; a 100% duty cycle delivers full power (12 VDC) and a 0% duty cycle delivers none (0 VDC). A PWM frequency of 1 kHz was selected to balance audible motor noise against efficient operation. This motor driving process is managed by the motor class, motor.py, in Appendix A.
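As a concrete illustration, the following is a minimal MicroPython sketch of this PWM interface. The class is a simplified stand-in for the actual motor.py, with duty cycles normalized to the 0 to 1 range; the pin numbers follow the yaw wiring described below.

```python
from machine import Pin, PWM

class Motor:
    """Simplified H-bridge motor interface (stand-in for motor.py)."""
    def __init__(self, in1_pin, in2_pin, freq=1000):
        # IN1/IN2 drive the BD62120AEFJ H-bridge; 1 kHz balances
        # audible noise against efficient operation.
        self.in1 = PWM(Pin(in1_pin))
        self.in2 = PWM(Pin(in2_pin))
        self.in1.freq(freq)
        self.in2.freq(freq)
        self.stop()

    def cw(self, duty):
        # Clockwise: IN1 carries the duty cycle, IN2 is held low.
        self.in1.duty_u16(int(duty * 65535))
        self.in2.duty_u16(0)

    def ccw(self, duty):
        # Counterclockwise: polarity across the load is reversed.
        self.in1.duty_u16(0)
        self.in2.duty_u16(int(duty * 65535))

    def stop(self):
        # Both inputs at 100% duty brake the motor (the SPACE behavior).
        self.in1.duty_u16(65535)
        self.in2.duty_u16(65535)
```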

In order to control the turret, the PC communicates movement commands to the PICO over a serial port. The PICO then decides which motor to actuate and the polarity of voltage to achieve the desired motor rotation. Finally, the microcontroller sends PWM signals to the motor controllers to power the motors for yaw and pitch actuation.

The user controls the turret's orientation with the arrow keys and stops the motors with the space key. On a key press, the laptop sends the corresponding movement command over serial to the PICO (manual_pc.py). When the microcontroller receives "RIGHT" or "LEFT," it interprets them as clockwise or counterclockwise rotation, respectively, and instructs the motor driver via pins GP9 (IN1) and GP10 (IN2) to apply 12 VDC at a 60% duty cycle to the yaw motor. The same holds for "UP" and "DOWN," but for the pitch motor on GP12 (IN1) and GP13 (IN2). A 60% duty cycle was chosen for a controlled yet responsive motor speed. The "SPACE" command locks both motors by setting the IN1 and IN2 duty cycles to 100% (manual_pico.py).
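A hedged sketch of the dispatch loop in manual_pico.py, building on the Motor class above; the exact command parsing and the up/down polarity mapping are assumptions.

```python
import sys
from motor import Motor   # the motor class from Appendix A (sketched above)

yaw = Motor(9, 10)        # GP9 = IN1, GP10 = IN2
pitch = Motor(12, 13)     # GP12 = IN1, GP13 = IN2
DUTY = 0.6                # 60% duty for controlled yet responsive speed

while True:
    cmd = sys.stdin.readline().strip()   # blocking read over USB serial
    if cmd == "RIGHT":
        yaw.cw(DUTY)
    elif cmd == "LEFT":
        yaw.ccw(DUTY)
    elif cmd == "UP":
        pitch.cw(DUTY)    # which polarity tilts up is an assumption here
    elif cmd == "DOWN":
        pitch.ccw(DUTY)
    elif cmd == "SPACE":
        yaw.stop()        # brake both motors
        pitch.stop()
```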

The greatest difficulty was getting the laptop to detect keyboard presses and send them to the PICO over serial. The initial attempt used MATLAB for keyboard control, to allow easy integration with data collection and analysis. Unfortunately, MATLAB does not offer a straightforward solution for this, so an alternative was implemented using the Python keyboard library [9], which detects key presses with the is_pressed() method inside a while loop. While this grants easier control of the turret, it may require Python data analysis tools instead of MATLAB. Looking forward, it is important to stay flexible during problem solving and avoid relying too heavily on familiar approaches.
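The PC-side loop might look like the sketch below; the serial port name and baud rate are placeholders, not necessarily the values used in manual_pc.py.

```python
import keyboard            # Python keyboard library [9]
import serial              # pyserial

pico = serial.Serial("COM5", 115200)   # port and baud rate are assumptions

key_to_cmd = {"left": "LEFT", "right": "RIGHT",
              "up": "UP", "down": "DOWN", "space": "SPACE"}

while not keyboard.is_pressed("q"):    # "q" ends the session
    for key, cmd in key_to_cmd.items():
        if keyboard.is_pressed(key):   # poll each mapped key
            pico.write((cmd + "\n").encode())
            break
```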

Position Sensor

The turret has a BNO055 9-axis inertial measurement unit (IMU) [10], which consists of a triaxial accelerometer to measure linear acceleration, a triaxial gyroscope to measure angular velocity, and a triaxial magnetometer to measure the sensor's orientation relative to magnetic north. Since the turret needs a motor position controller to aim at the target, orientation and angular velocity measurements are required. Orientation measurements are used to calculate the error between the current and desired motor positions, and angular velocity data is used for system identification (plant transfer function estimation).

Although only orientation and angular velocity are required, relying solely on the gyroscope is insufficient. Gyroscope measurements are inherently noisy, sensitive to temperature variations, and do not directly provide orientation. Instead, orientation is derived by integrating angular velocity, which accumulates sensor bias over time, leading to drift. To counteract this, sensor fusion combines data from the accelerometer, gyroscope, and magnetometer, yielding a more accurate state estimation. The accelerometer and magnetometer improve orientation measurements by referencing constant external vectors: the gravity vector from the accelerometer and magnetic north from the magnetometer.

The BNO055 IMU communicates with the PICO via Inter-Integrated Circuit (I2C) on pins GP2 and GP3. This is pictured in Fig. 7, with the additional two wires being the sensor's ground and 5V power.


Figure 7: BNO-055 IMU Wiring Diagram and Mounting

GP2 (orange wire) is the serial clock line (SCL), which synchronizes communication between the PICO and IMU by sending pulses at regular intervals; each pulse indicates when a bit of data is transmitted on GP3 (brown wire), the serial data line (SDA). Using the Adafruit BNO055 library [11], the PICO reads the yaw, roll, and pitch position in degrees and the angular velocity in degrees per second via the euler() and gyro() methods, respectively. The yaw and pitch data (roll is discarded) are then streamed over serial to the PC using a formatted print statement. The PC decodes the stream into yaw and pitch position data and velocity data, four lists in total storing data as it is recorded.

The streamed data is published at a constant rate of 10 Hz using a while loop and time.sleep(0.1). Collecting measurements at 10 Hz should be fast enough for most motor control designs, and reading at equal time intervals is crucial for accurate control calculations. A PID controller, for example, requires integrating and differentiating the error; inconsistent measurement intervals distort these calculations, potentially accumulating error and hindering the controller's ability to reach the desired position.
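A hedged sketch of this read-and-stream loop; the driver import, constructor, and tuple ordering follow the library's documented interface but may differ in detail from the code in Appendix A. The SoftI2C pin assignment mirrors Fig. 7.

```python
import time
from machine import Pin, SoftI2C
from bno055 import BNO055               # BNO055 driver [11]

i2c = SoftI2C(scl=Pin(2), sda=Pin(3))   # GP2 = SCL, GP3 = SDA per Fig. 7
imu = BNO055(i2c)

while True:
    heading, roll, pitch = imu.euler()  # degrees; roll is discarded
    gx, gy, gz = imu.gyro()             # degrees per second
    # Formatted print streams the sample to the PC over USB serial.
    # Which gyro axes correspond to yaw and pitch is an assumption here.
    print("{:.2f},{:.2f},{:.2f},{:.2f}".format(heading, pitch, gz, gy))
    time.sleep(0.1)                     # constant 10 Hz publish rate
```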

To enable concurrent reading and writing on the serial port, the PICO leverages its dual-core Arm Cortex-M0+ processor [8], assigning each process to a separate thread. This is done via the _thread MicroPython library [12] and allows the turret to receive keyboard movement commands from the laptop while simultaneously publishing IMU data back to it. Fig. 8 illustrates the process from start to finish, noting that the data collection loop runs continuously until "q" is pressed. The code for this process can be found in manual_IMU_pico.py in Appendix A.
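A hedged sketch of that split, with the command handling elided; the full version is manual_IMU_pico.py in Appendix A.

```python
import _thread
import sys
import time

running = True

def command_listener():
    # Second thread: block on serial reads and dispatch movement commands.
    global running
    while running:
        cmd = sys.stdin.readline().strip()
        if cmd == "q":
            running = False   # "q" ends data collection
        # ... otherwise dispatch RIGHT/LEFT/UP/DOWN/SPACE as before ...

_thread.start_new_thread(command_listener, ())

while running:
    # Main thread: read the IMU and publish a sample every 0.1 s (10 Hz),
    # as in the streaming loop above.
    time.sleep(0.1)
```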


Figure 8: Component Operations Diagram for serial communication

Fig. 9 displays the turret's position and velocity over a five-second test, plotted with the Matplotlib library [13] in the manual_IMU_pc.py script. The test panned the turret clockwise then counterclockwise (yaw), stopped, then tilted the turret down and back up (pitch), which agrees with the Position vs. Time plot. Each time the turret changes direction, the velocity quickly settles to a constant speed of about ±40 deg/s, as seen in the Angular Velocity vs. Time plot. This consistency across directions simplifies system identification: instead of calculating four different transfer functions for movement in all four directions, only two, or perhaps even one, would be required to model the DC motor turret system.


Figure 9: Plot of Position vs. Time and Angular Velocity vs Time for turret motor actuation

Initially, getting two-way serial communication between the PICO and laptop working proved difficult. The original approach used the Python asyncio library for concurrent processing, but the _thread library was more straightforward to implement and better documented. It allowed the PICO to run two independent processes, serial reading and writing, simultaneously on separate threads.

Additional issues occurred when rotating across the turret's centerline, where the PC recorded a dramatic jump between 0° and 360°. Such a shift would make it difficult to implement a position controller, which requires smooth transitions between angles: a sudden jump representing the same physical position can confuse the controller and lead to erratic behavior. To mitigate this issue, a "wrap to pi" function was implemented to map the data onto a continuous range from -180° to 180°. This function can be seen in manual_IMU_pico.py in Appendix A.
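A minimal version of this mapping (the helper name here is illustrative; the turret's implementation is in manual_IMU_pico.py):

```python
def wrap_to_180(angle_deg):
    # Map any angle onto the continuous range [-180, 180) degrees.
    return (angle_deg + 180.0) % 360.0 - 180.0

assert wrap_to_180(350.0) == -10.0   # just past the centerline
assert wrap_to_180(10.0) == 10.0     # just before it
```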

During testing, a frequent problem arose when the PICO script would time out if no keyboard command was received. If the user did not send commands quickly enough, the script was likely to break. Initially, a simple sleep command allowed enough time to press a key after code execution, but a more robust solution was implemented using a while loop: the program now loops continuously until a serial command triggers its exit, allowing the remaining code to execute.

Controller Design

This section of the report outlines the process of data collection, transfer function derivation, system response analysis, and controller development for the yaw and pitch motors. The System Modeling section covers data collection and transfer function derivation, while the Deadzone Compensation section explains how stiction in the DC motors was addressed. The Yaw and Pitch Controller Design section details the approach used to develop a position controller. To enhance system performance, a Proportional-Integral (PI) controller was implemented. The PI controller adjusts the control input based on the current error (proportional) and the accumulation of past errors (integral), ensuring immediate error correction while eliminating steady-state error. Finally, the Performance Evaluation section assesses the controller's effectiveness, comparing its performance to design expectations and outlining any modifications made to improve the system.
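For reference, a discrete-time PI position controller of the kind described here can be sketched as follows; the gains, 0.1 s sample time, and output clamping are illustrative assumptions rather than the tuned values used on the turret.

```python
class PIController:
    """Minimal discrete PI position controller (illustrative gains)."""
    def __init__(self, kp, ki, dt=0.1):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, desired_deg, measured_deg):
        # Wrap the error to [-180, 180) so crossing the centerline is smooth.
        error = (desired_deg - measured_deg + 180.0) % 360.0 - 180.0
        self.integral += error * self.dt        # accumulate past error
        u = self.kp * error + self.ki * self.integral
        return max(-1.0, min(1.0, u))           # clamp to the duty-cycle range
```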

System Modeling

The objective of system modeling is to develop a mathematical model of the Sentry Turret motors, which is crucial for evaluating its dynamics before designing a controller. The system model provides transfer functions to analyze and predict the turret’s behavior based on a given input. For this setup, the velocity transfer function was derived using a first-order system approximation of the DC motor turret system. While more accurate models exist for representing the system, a first-order approximation is sufficient for determining the appropriate controller type and calculating initial control gains based on a design point.

Data collection was carried out using controlled, repeatable, time-based trials to model the turret's dynamics. First-order system analysis was performed by examining the step response of each motor. A step input at a 30% duty cycle was selected to provide enough voltage to initiate motor movement while keeping the speed low enough for accurate measurements from the 10 Hz IMU. Each motor received a one-second step input, timed with the time.sleep() method, to ensure the response reached its final value. Throughout each trial, the PICO recorded angular velocity and angular position from the IMU using the gyro() and euler() methods, respectively. Measurements were taken at equal time intervals to reduce the risk of missing critical changes in the system's behavior and to simplify transfer function estimation. Each motor was powered individually in both directions, clockwise (CW) and counterclockwise (CCW) for yaw, and up and down for pitch, as shown in Fig. 10, via the calc_TF_pico.py and calc_TF_pc.py scripts in Appendix A.


Figure 10: Position and angular velocity step response for yaw and pitch motors at 30% duty cycle

Using this data, first-order system analysis can be performed to estimate the motor’s transfer function. The velocity transfer function represents how the turret's angular velocity in deg/s responds to a given motor input in volts and is given by,

G_{vel}(s) = \frac{b}{s+a}.
(1)

The position transfer function represents how the turret’s position in degrees responds to a given motor input in volts and can be derived by taking the integral of the velocity transfer function,

G_{pos}(s) = \frac{b}{s(s+a)}.
(2)

The coefficients a and b are found via the calc_TF_data_anlaysis.py script by analyzing the velocity step response data, as illustrated in Fig. 11. The blue line represents the step response and the black line denotes the start of the motor step input. The final value, F_v (red line), was calculated by averaging the last four values of the step response to compensate for noise in the velocity measurement. The 63% final value (magenta line) was used to determine the system's time constant, τ (green line): the time the motor takes to reach 63% of its final velocity.
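Under the first-order model of Eq. (1), these quantities map directly onto the coefficients: a = 1/τ, and the DC gain b/a equals F_v divided by the step amplitude. A hedged sketch of that calculation follows; the function and array names are illustrative, not those in the script above.

```python
import numpy as np

def fit_first_order(t, vel, V, t_step):
    """Estimate a and b in G_vel(s) = b/(s+a) from a velocity step response.

    t, vel : time stamps (s) and angular velocity samples (deg/s)
    V      : step amplitude (volts); t_step : time the step was applied (s)
    """
    vel = np.asarray(vel)
    Fv = np.mean(vel[-4:])             # final value: average last 4 samples
    idx = np.argmax(vel >= 0.63 * Fv)  # first crossing of the 63% line
    tau = t[idx] - t_step              # time constant
    a = 1.0 / tau                      # pole location
    b = a * Fv / V                     # steady state of a V step is V*b/a = Fv
    return a, b
```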
