Connected to the robot for the first time. For whatever reason, the Arduino IDE could only connect and run the test code when using Windows.
Created a flowchart for the overall logic of the robot. This flowchart details all of the algorithms required for the robot to win.
Connected the robot to Bluetooth. Test cases were sent and received successfully.
Day 2:
Made measurements for future distance calculations.
Successfully sent test signals through the phone's Bluetooth; these signals will be used to ping the robot about danger/gold in the actual game.
Edited the flowchart further to generalize the movement algorithm, to account for more types of maps, and to shoot the Wumpus upon detection.
Started basic algorithm code and framework for how the grid is stored.
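The log doesn't show the actual framework, but a minimal sketch of how the grid might be stored is below. The 4x4 board size, the flag names, and the one-byte-per-tile layout are all our assumptions, chosen to keep memory use low on an Arduino:

```cpp
#include <cstdint>

// Hypothetical sketch: one byte of bit flags per tile, stored in a flat
// array. Board size and flag names are assumptions, not the real code.
const int GRID_SIZE = 4;

enum TileFlag : uint8_t {
    VISITED         = 1 << 0,
    POSSIBLE_PIT    = 1 << 1,
    POSSIBLE_WUMPUS = 1 << 2,
    POSSIBLE_GOLD   = 1 << 3,
    SAFE            = 1 << 4,
};

uint8_t grid[GRID_SIZE * GRID_SIZE] = {0};

// Map (x, y) coordinates onto the flat array.
int tileIndex(int x, int y) { return y * GRID_SIZE + x; }

void setFlag(int x, int y, uint8_t flag)   { grid[tileIndex(x, y)] |= flag; }
void clearFlag(int x, int y, uint8_t flag) { grid[tileIndex(x, y)] &= ~flag; }
bool hasFlag(int x, int y, uint8_t flag)   { return grid[tileIndex(x, y)] & flag; }
```

Storing flags this way lets one tile carry several facts at once (e.g. visited and possible gold) without extra arrays.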
Day 3:
Started movement code:
It turned out that all of the distance measurements were irrelevant: it is easier to have the robot align itself with the gridlines than to center it manually.
First we made sure the robot could detect the gridlines. Here is our first attempt:
And here is our second attempt, with a little less loopiness:
Continued framework for robot code.
Added the algorithmic pathfinding between two tiles.
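The log doesn't name the pathfinding method we used; as a sketch, a breadth-first search between two tiles would look roughly like the following. The 4x4 board size and the `safe` map (tiles believed free of pits/Wumpus) are assumptions for illustration:

```cpp
#include <queue>
#include <cstring>

// Hedged sketch: BFS shortest-path length between two tiles on a 4x4
// grid, moving only through tiles marked safe. Returns -1 if no path.
const int N = 4;

int shortestPath(bool safe[N][N], int sx, int sy, int tx, int ty) {
    bool seen[N][N];
    std::memset(seen, 0, sizeof(seen));
    // Encode (x, y, dist) in one int as x + y*N + dist*N*N.
    std::queue<int> q;
    q.push(sx + sy * N);
    seen[sy][sx] = true;
    const int dx[] = {1, -1, 0, 0};
    const int dy[] = {0, 0, 1, -1};
    while (!q.empty()) {
        int v = q.front(); q.pop();
        int dist = v / (N * N), y = (v / N) % N, x = v % N;
        if (x == tx && y == ty) return dist;
        for (int d = 0; d < 4; d++) {
            int nx = x + dx[d], ny = y + dy[d];
            if (nx < 0 || nx >= N || ny < 0 || ny >= N) continue;
            if (!safe[ny][nx] || seen[ny][nx]) continue;
            seen[ny][nx] = true;
            q.push(nx + ny * N + (dist + 1) * N * N);
        }
    }
    return -1;   // target unreachable through safe tiles
}
```

BFS on a board this small is cheap and always returns a shortest route, which matters when every extra tile risks a pit.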
Tested our general algorithm with Bram once again: there were still some edge cases and specific circumstances to figure out.
Day 4:
Expanded algorithm to:
Identify potential gold spaces.
Navigate back to the start when the gold is found.
Generally figure out how to successfully get to the gold; the only unimplemented feature was shooting the Wumpus.
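The log doesn't spell out the rule for the first step; one plausible sketch of "identify potential gold spaces" (the rule, board size, and names here are our assumptions) is to treat every tile that hasn't been visited and hasn't been ruled dangerous as still a candidate:

```cpp
#include <vector>
#include <utility>

// Hedged sketch: a tile is still a gold candidate if we haven't visited
// it and haven't flagged it as dangerous. Board size is assumed 4x4.
const int W = 4;

std::vector<std::pair<int, int>> goldCandidates(bool visited[W][W],
                                                bool dangerous[W][W]) {
    std::vector<std::pair<int, int>> out;
    for (int y = 0; y < W; y++)
        for (int x = 0; x < W; x++)
            if (!visited[y][x] && !dangerous[y][x])
                out.push_back({x, y});   // (x, y) still worth exploring
    return out;
}
```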
Started code for turning and moving to an adjacent square.
Unfortunately, this was imprecise and buggy, since one of the sensors occasionally turned itself permanently off for no apparent reason.
In the hopes of making the sensors as accurate as possible, we used a screwdriver to calibrate them. This did not fix the sensor issue.
Day 5:
Tried to fix a sensor using lines of code listed on the documentation website.
Finished the algorithm and implemented it into the rest of the code. Using a virtual version of the grid, the algorithm runs very well!
Wrote simple code to receive Bluetooth signals and move to an adjacent square.
Unfortunately, none of these additions could be tested on the robot itself: when the code was put on the robot, nothing happened, even though each function ran fine on its own.
There was not even a way for us to figure out what the error was: checking over the wired connection only showed a memory allocation error.
Another group tried to run their (verifiably working) code on our robot, and that also did not work.
At one point, one of the wheels spontaneously stopped being able to turn at the same time as the other wheel, even though each wheel could still be run on its own.
Day 6:
Improved algorithm to work in certain edge cases.
After a combination of miscellaneous actions, including deleting old files, reducing the code size, and replacing the battery, the robot spontaneously started to work again.
Fixed how the robot receives the Bluetooth signal:
Before, the robot didn't wait for a signal to be received. If we didn't send a signal, the robot would look for a signal, see nothing, and continue driving.
To fix this, the robot now sends a number to itself, over and over again, until a different number (sent by us) is received. This tells the robot to stop waiting and to use our number to determine what was detected.
Attempted to improve movement precision:
Even when both wheels are powered by the same amount, subtle differences in motor speed and movement keep the robot from driving in a perfectly straight line or turning exactly 90 degrees, and these environmental factors make the movement slightly different every time.
As a result, we needed to add a way for the robot to readjust itself every time. Unfortunately, we couldn't figure out how to fix all possible misalignments by the end of this last class.
It didn't help that occasionally the robot would fail to move the wheels enough to turn.
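One readjustment scheme we were working toward can be sketched as a proportional correction off the line sensors. The constants, the sensor interface, and the sign convention here are all our assumptions, not the final code:

```cpp
#include <utility>

// Hedged sketch: compare the two line sensors and nudge each wheel's
// power proportionally to steer the robot back over the gridline.
const int BASE_POWER = 150;   // assumed nominal motor power
const int GAIN = 2;           // assumed proportional gain

// Higher reading = that sensor sees more of the line. If the left sensor
// sees more line, the robot has drifted right, so slow the left wheel and
// speed up the right wheel to steer back left (and vice versa).
std::pair<int, int> correctedPowers(int leftReading, int rightReading) {
    int error = leftReading - rightReading;
    int leftPower  = BASE_POWER - GAIN * error;
    int rightPower = BASE_POWER + GAIN * error;
    return {leftPower, rightPower};
}
```

A correction like this runs every loop iteration, so each small drift is cancelled before it accumulates into a missed turn.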