What is Project Silent Willow?
Project Silent Willow started off as a side project where I tested my basic C++ knowledge and hardware design skills; however, it quickly developed into the project that earned me my first award in the "real world" (as in, not a school award).
In short, it's a custom-made mecanum-wheeled vehicle built around my own microcontroller, with AI provided by an Nvidia Jetson Nano. The vehicle was initially designed to use a LiDAR unit mounted on top; however, I've yet to figure out how to extract data from the points it plots, so for now it just has image recognition capabilities. Alongside the hardware/firmware, I've also developed a custom controller app, which, although it was a pain to make, helped me build up my Android development skills (since they lacked a LOT before I started this).
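The nice thing about mecanum wheels is that the drive maths is simple: each wheel's speed is just a signed mix of the desired forward, strafe, and rotation components. This is a minimal sketch of the standard mecanum mixing formula, not PSW's actual firmware; the function name and normalisation are my own choices for illustration.

```python
def mecanum_mix(vx, vy, wz):
    """Return (front_left, front_right, rear_left, rear_right) wheel speeds
    for forward speed vx, strafe speed vy and rotation wz (each -1.0..1.0)."""
    fl = vx + vy + wz
    fr = vx - vy - wz
    rl = vx - vy + wz
    rr = vx + vy - wz
    # Normalise so no wheel is asked to exceed full speed (1.0)
    m = max(1.0, abs(fl), abs(fr), abs(rl), abs(rr))
    return tuple(w / m for w in (fl, fr, rl, rr))

# Pure forward motion drives all four wheels equally:
print(mecanum_mix(1.0, 0.0, 0.0))  # (1.0, 1.0, 1.0, 1.0)
# Pure strafe spins them in the +/- pattern that produces sideways motion:
print(mecanum_mix(0.0, 1.0, 0.0))  # (1.0, -1.0, -1.0, 1.0)
```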
Technology used
Before I made my custom microcontroller, I used an Arduino MEGA 2560 with a few DRV8825 stepper motor drivers. As it was my first ever "physical" prototype of the concept, the wiring was all over the place, and the code I developed was also a mess (I'd give myself some credit here, as I was only 12 when I started making it). However, as I continued working on it, I developed my own microcontroller, which was a lot more efficient and easier to work with. From there, I tried integrating LiDAR (unsuccessfully, for now), image recognition, and quieter drivers. For the new drivers, I opted for the more efficient and quieter TMC2209s (kind of ironic that the project is called "Silent" despite the original tests being VERY loud). In the AI department, I opted for an Nvidia Jetson Nano (2GB model), which had - at the time - plenty of compute power. Recent developments, however, demand a better system, so I'm currently negotiating to get one of Nvidia's newer Jetson models!
Firmware/software
Onboard firmware
- Written in C++ using VSCode and the Arduino IDE (chosen for the well-known framework and its accessibility on all development platforms)
Jetson Nano firmware
- Written in Python, using Jupyter Notebook as the GUI (built on several Python packages developed by the Nvidia DLI)
Android app
- Written in Kotlin (I mainly found the syntax easier to understand than Java's), using Android Studio
Communication protocol
- I chose BLE as it was the most lightweight option and the most common protocol for BT communications (although I had many issues getting permissions working when developing the Android app)
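One reason BLE suits this kind of project is that writes to a characteristic are tiny (20 bytes of payload by default), so a drive command has to fit in a compact binary frame. This is a hypothetical sketch of what such a packet could look like; the opcode value and field layout are made up for illustration, not PSW's real wire format.

```python
import struct

OP_DRIVE = 0x01  # illustrative opcode, not PSW's actual protocol

def encode_drive(vx, vy, wz):
    """Pack a drive command as <opcode, 3 x signed 16-bit speeds>,
    little-endian, so the whole command fits in one BLE write."""
    to_i16 = lambda v: max(-32767, min(32767, int(v * 32767)))
    return struct.pack("<Bhhh", OP_DRIVE, to_i16(vx), to_i16(vy), to_i16(wz))

pkt = encode_drive(0.5, 0.0, -0.25)
print(len(pkt))  # 7 bytes - well under the 20-byte default payload limit
```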
Links/main explainer video
A short section of my submission video for the certification I got (muted intentionally)
Final project evaluation
Overall, this project was quite fun to make. It tested my pre-existing knowledge of basic C++, but also introduced me to new technologies and hardware (such as the Jetson Nano), as well as Android app development. As with any big project, there were issues, and below are some of the big ones I came across while making PSW!
Wrong baud rate
- When I first started implementing BLE, I set the baud rate to 115200, which was the wrong rate for the BLE module I used - this set me back an embarrassing 2 months
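A sanity check I wish I'd written two months earlier: UART has no clock line, so if the two ends disagree on the baud rate by more than a few percent, every byte is sampled at the wrong times and comes out as garbage. The rates below are illustrative (many cheap BLE modules default to something far slower than 115200; the datasheet has the real value).

```python
def baud_error_pct(tx_baud, rx_baud):
    """Relative timing error between two UART baud rates, in percent.
    UARTs typically only tolerate a total error of roughly 2-3%."""
    return abs(tx_baud - rx_baud) / rx_baud * 100

# A 115200 vs 9600 mismatch is a 12x difference - hopeless:
print(baud_error_pct(115200, 9600))   # 1100.0
# Whereas a clock that's slightly off still works fine:
print(baud_error_pct(115200, 113636)) # ~1.4
```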
False positives/negatives
- This was mainly down to how I trained the image recognition model, but it would make the vehicle move forward even when there wasn't a "forward" QR reference in view
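A common way to paper over this kind of false positive (separate from retraining the model) is to only act on a prediction when its confidence clears a threshold and the same label persists across a few consecutive frames. This is a generic sketch of that debouncing idea, not PSW's actual code; the threshold and window size are illustrative values.

```python
from collections import deque

class StableDetector:
    """Only report a label once it has been seen, with high confidence,
    for `window` consecutive frames."""

    def __init__(self, threshold=0.8, window=3):
        self.threshold = threshold
        self.recent = deque(maxlen=window)

    def update(self, label, confidence):
        # Low-confidence frames break the streak by inserting None
        self.recent.append(label if confidence >= self.threshold else None)
        if len(self.recent) == self.recent.maxlen and len(set(self.recent)) == 1:
            return self.recent[0]
        return None

d = StableDetector()
print(d.update("forward", 0.9))  # None - not enough frames yet
print(d.update("forward", 0.9))  # None
print(d.update("forward", 0.9))  # forward - stable across the whole window
```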
LiDAR
- As stated above, I was unable to (and still am unable to) get the raw data out of the LiDAR unit I added, which is why it just sat on top with no function in the system other than displaying the map it created
UART cross-communications
- When initially connecting the Jetson to the main control microcontroller, I couldn't get the data to transmit quickly enough, which caused a slight delay between the vehicle stopping and starting (this has been fixed in newer versions of PSW, though!)
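One general trick for this kind of UART lag (I'm not claiming it's the exact fix used in PSW) is to stop waiting for whole lines of text and instead buffer incoming bytes, pulling out complete fixed-size binary frames as soon as they arrive. The 0xAA start byte and 4-byte frame size below are illustrative.

```python
START, FRAME_LEN = 0xAA, 4  # illustrative framing, not PSW's real protocol

def extract_frames(buf):
    """Split a byte buffer into complete frames plus the leftover bytes
    (a partial frame waiting for the rest of its data)."""
    frames = []
    while True:
        i = buf.find(bytes([START]))
        if i < 0:
            return frames, b""            # no start byte: discard noise
        if len(buf) - i < FRAME_LEN:
            return frames, buf[i:]        # partial frame: keep for later
        frames.append(buf[i:i + FRAME_LEN])
        buf = buf[i + FRAME_LEN:]

# Two complete frames arrive plus the start of a third:
frames, rest = extract_frames(b"\xaa\x01\x02\x03\xaa\x04\x05\x06\xaa\x07")
print(len(frames), rest)  # 2 b'\xaa\x07'
```

Because the parser never blocks waiting for a terminator, the motor loop can react the moment a full command is in the buffer, which is what removes the stop/start delay.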