The team was founded in September 2010 after Boston University was approached by a BU alumnus from Aurora Flight Sciences interested in putting together a UAV-oriented student group at the university. The group was intended to excite students about the field of unmanned aerial robotics, and it focused its early efforts on creating an entry for the 2011 AUVSI IARC challenge. To facilitate the completion of this project, the team has been broken down into three groups: logic, flight platform, and systems. Logic is tasked with developing the code that will make the UAV autonomous; flight platform deals with the structure and motion of the vehicle; and the systems group is responsible for integrating the components of the UAV and for creating the device that will interact with the flash drives. A four-member executive board, composed of a President, Vice President, Treasurer, and Secretary (three of whom head the major groups), oversees the entire team. In addition to this executive board, an advisory board made up of graduate students from a variety of backgrounds relevant to the competition was formed to aid with the more advanced technical details of the challenge. Each subgroup has been assigned at least one graduate student as a mentor.
Since its founding, the team has evolved to deal with constraints on funding and time. Until now, most of the team's funding has come from the University. However, the group has partnered with the Boston University Intelligent Mechatronics Lab on the recently awarded AIRFOILS MURI grant. Under this grant we will be researching methodologies for incorporating biologically inspired inner- and outer-loop control strategies into flight vehicles, and the grant is funding the development of the first vehicle we will use for early tests. We identified early on that this competition is heavily weighted toward logic development. With that in mind, the logic group has been seeking innovative solutions to deal with the unknowns of the competition environment while staying within the weight and power budget.
The team has begun designing a complete airframe solution from the ground up to push the limits of the state of the art. Currently, our flight platform is stabilized with an Arduino-based autopilot called ArduPilot Mega. We have been experimenting with the Microsoft Xbox Kinect sensor as a replacement for Hokuyo's heavier 30 m LIDAR. The Kinect's lighter weight, combined with the new 3D perspective it offers of our environment, will give us greater versatility as we develop high-level stabilization and mapping algorithms. To handle the data from the Kinect, we have stripped down an ASUS 1215N Eee PC outfitted with a 3×3 MIMO 802.11n Wi-Fi card. The Kinect will be utilized for a variety of tasks, such as velocity stabilization, mapping, and 3D obstacle avoidance. To complement the Kinect (and compensate for its narrow field of view), we are using a Hokuyo URG-04LX to increase local topological awareness.
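As a rough illustration of how depth data can feed velocity stabilization, the sketch below estimates the lateral image shift between two consecutive Kinect-style depth frames by a brute-force search over integer column shifts. This is a minimal stand-in we wrote for illustration, assuming rectified frames and purely lateral motion; it is not the team's actual estimation pipeline, which would more likely use optical flow on the full 3D data.

```python
import numpy as np

def estimate_lateral_velocity(depth_prev, depth_curr, dt, max_shift=4):
    """Estimate lateral scene velocity (pixels/second) between two depth
    frames by finding the integer column shift that minimizes the mean
    squared difference between the overlapping regions."""
    h, w = depth_prev.shape
    best_shift, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = depth_prev[:, :w - s], depth_curr[:, s:]
        else:
            a, b = depth_prev[:, -s:], depth_curr[:, :w + s]
        err = np.mean((a - b) ** 2)
        if err < best_err:
            best_err, best_shift = err, s
    return best_shift / dt

# Synthetic check: shift a textured depth frame right by 2 columns.
base = np.tile(np.arange(32, dtype=float) ** 2, (8, 1))
shifted = np.roll(base, 2, axis=1)
print(estimate_lateral_velocity(base, shifted, dt=0.5))  # prints 4.0
```

A velocity controller would then close the loop on this estimate, commanding attitude corrections to drive the measured drift toward zero.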
Our plan for the competition is to implement a motion description language to abstract the environment into a set of motion primitives. Doing so will greatly reduce the complexity of the environment and simplify path planning. Sign recognition is carried out using a CUDA-based SIFT algorithm to provide positive sign identification and location. In addition to software developed at the IML, we are using the Robot Operating System (ROS) and leveraging its code repositories.
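The core idea of a motion description language is that a plan is a string of primitives, each pairing a low-level controller with an interrupt condition that triggers the switch to the next primitive. The sketch below is a toy 1-D version under our own naming, meant only to show the structure, not the team's actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Primitive:
    """One motion primitive: a controller plus the interrupt that ends it."""
    name: str
    control: Callable[[float], float]    # state -> control input
    interrupt: Callable[[float], bool]   # state -> switch to next primitive?

def run_plan(plan: List[Primitive], state: float,
             step: float = 1.0, max_steps: int = 100) -> List[Tuple[str, float]]:
    """Execute primitives in order, integrating the controller until each
    primitive's interrupt fires, and log the state at every switch."""
    log = []
    for p in plan:
        steps = 0
        while not p.interrupt(state) and steps < max_steps:
            state += step * p.control(state)
            steps += 1
        log.append((p.name, state))
    return log

# Toy plan: fly forward until x >= 10, then hold position.
forward = Primitive("forward", lambda x: 1.0, lambda x: x >= 10)
hold = Primitive("hold", lambda x: 0.0, lambda x: True)
print(run_plan([forward, hold], 0.0))  # prints [('forward', 10.0), ('hold', 10.0)]
```

The payoff is that the planner reasons over short symbolic strings of primitives rather than raw trajectories, which is what collapses the complexity of the environment.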
Current Team Structure