We need to be able to stay in our lane and go straight in the drag race portion of the competition. @ValRat and I have talked about it a bit, and one good way might be to mount the lidar pointing directly sideways and use the cones as a reference to make sure the robot is going straight.
After some discussion with @RobynCastro today, it looks like our Hokuyo has a 240-degree FOV, so we should be able to use it for basically everything.
Current Plan:
Use the Hokuyo lidar to detect the cones on our side of the drag strip, and write a node that fits a straight line to those cones and follows it, maintaining a set distance from the cones (there's a rough sketch of the line-fitting math just below this plan).
When we reach the end of the course, the Hokuyo will detect the final wall and we will stop at a preset distance from it.
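Here's a minimal sketch of the cone-following math, assuming we've already turned the scan into cone centre points in the robot frame (x forward, y left, metres), e.g. from the ObstacleManager output. The `ConePoint` struct, the function name, and the desired-distance parameter are placeholders for illustration, not the actual node interface:

```cpp
#include <cmath>
#include <vector>

// Hypothetical cone detection in the robot frame (x forward, y left), metres.
struct ConePoint {
  double x;
  double y;
};

struct LaneEstimate {
  double heading_error;  // rad; angle of the cone line relative to our forward axis
  double lateral_error;  // m; (perpendicular distance to the cone line) - (desired distance)
  bool valid;            // false if there weren't enough cones to fit a line
};

// Fit a straight line y = m*x + b to the visible cones by least squares, then
// report how far we are from "parallel to the cones at the desired distance".
LaneEstimate estimateLane(const std::vector<ConePoint>& cones,
                          double desired_distance) {
  LaneEstimate est{0.0, 0.0, false};
  if (cones.size() < 2) return est;

  double sx = 0, sy = 0, sxx = 0, sxy = 0;
  for (const ConePoint& c : cones) {
    sx += c.x;
    sy += c.y;
    sxx += c.x * c.x;
    sxy += c.x * c.y;
  }
  const double n = static_cast<double>(cones.size());
  const double denom = n * sxx - sx * sx;
  if (std::fabs(denom) < 1e-9) return est;  // cones stacked at one x, can't fit

  const double m = (n * sxy - sx * sy) / denom;
  const double b = (sy - m * sx) / n;

  // Zero when we're driving parallel to the cone line.
  est.heading_error = std::atan(m);
  // Perpendicular distance from the robot (at the origin) to y = m*x + b.
  const double distance_to_line = std::fabs(b) / std::sqrt(m * m + 1.0);
  est.lateral_error = distance_to_line - desired_distance;
  est.valid = true;
  return est;
}
```

The node would then just steer to drive both errors to zero, and the same discrete-obstacle output could handle the end of the course: stop once the wall obstacle's range drops below the preset distance.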
TODO
Work out the math to make sure that we'll be able to see enough cones at all times in the worst case (cones can be as much as 1 m apart). We basically need to determine the relationship between how far we are from the cones, what our current heading is, and how many cones we can see. (@RobynCastro - could you take a shot at this? the math shouldn't be too hard, I think. There's a rough visibility sketch after this list.)
Port over the ObstacleManager from the IGVC repo. This will allow us to find discrete obstacles that we can use to detect both the wall and the individual cones (I'll take care of this).
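In case it helps with the visibility question, here's a toy model of it, assuming the cones sit on a straight line and that we can treat the FOV as a simple wedge. The numbers in main() (4 m usable range, 90-degree sideways mount, 10-degree heading error) are placeholders, not measured values:

```cpp
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

// Wrap an angle to (-pi, pi].
static double wrapAngle(double a) {
  while (a > kPi) a -= 2.0 * kPi;
  while (a <= -kPi) a += 2.0 * kPi;
  return a;
}

// Count the cones the lidar could see, under simplified assumptions:
//  - cones lie on a straight line with `spacing` metres between them
//  - the robot is `dist` metres from that line, with heading `heading`
//    radians relative to the line (0 = driving parallel to it)
//  - the lidar's FOV is `fov` radians wide, centred `fov_center` radians
//    from the robot's forward axis (pi/2 ~= pointed sideways at the cones),
//    with a maximum usable range of `max_range` metres
int countVisibleCones(double dist, double heading, double spacing,
                      double fov, double fov_center, double max_range) {
  int visible = 0;
  // Cone line is the X axis of this little frame; robot projects onto u = 0.
  for (double u = -max_range; u <= max_range; u += spacing) {
    const double range = std::hypot(u, dist);
    if (range > max_range) continue;
    // Bearing of the cone relative to the robot's forward axis.
    const double bearing = wrapAngle(std::atan2(dist, u) - heading);
    if (std::fabs(wrapAngle(bearing - fov_center)) <= fov / 2.0) ++visible;
  }
  return visible;
}

int main() {
  const double deg = kPi / 180.0;
  // Worst-case-ish example: 1 m from the cones, 1 m spacing, 240-degree FOV
  // pointed sideways, 4 m assumed usable range, 10-degree heading error.
  std::printf("visible cones: %d\n",
              countVisibleCones(1.0, 10 * deg, 1.0, 240 * deg, 90 * deg, 4.0));
  return 0;
}
```

Sweeping `dist` and `heading` over their plausible ranges with something like this should tell us whether we ever drop below the minimum number of cones we need for a line fit.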
See here for an example video.