This talk will survey some of the exciting work happening in Applied Algebraic Topology with a focus on applications to robotics and sensing. The talk will include a gentle introduction to the tools (homology/cohomology/persistence/sheaves), alongside relevant applications to motion planning, sensor coverage, mapping, optimization, and sensing.
For robust, life-long autonomous operation in dynamic unstructured environments, mobile robots must contend with vast amounts of continually evolving data. The exploring robot must adapt to its environment and refine its workspace representation with new observations. The key competency we seek is “introspection”: the ability to determine what is perplexing, which can in turn drive active information acquisition or requests for human disambiguation. The talk explores this in the context of place recognition and semantic mapping.
Reformulating planning problems as probabilistic inference problems is interesting, but does not necessarily solve fundamental problems. In this talk I will review three variations on this theme where the reformulation has led to novel theoretical insights and efficient algorithms. These are in the context of stochastic optimal control and model-free Reinforcement Learning, multi-agent POMDPs, and relational MDPs. I will conclude with some questions and first steps on a problem we are currently working on: how to plan efficiently under uncertainty about the existence of objects.
I will talk about the research I have done with my collaborators on teaching robots to perform manipulation tasks based on human demonstrations. Of particular interest are tasks involving deformable objects, where it is hard to perceive the full state of the system and model its dynamics. I will also talk about some techniques we have developed to solve the associated perception and motion planning problems.
As dynamic robot behaviors become more capable and better understood, there arises a need for a wide variety of equally capable and well-understood transitions between these behaviors. For legged robots, we introduce a new formalism for understanding behavioral components as self-manipulation, and then build up a hybrid system that topologically defines the space of dynamic transitions as a cellular complex. Our primary motivation is not to facilitate numerical simulation but rather to promote design insight -- behavior design, controller design, and platform design.
Next-generation wearable robots will use soft materials such as textiles and elastomers to provide a more conformal, unobtrusive, and compliant means of interfacing with the human body. These robots will augment the capabilities of healthy individuals (e.g. improved walking efficiency, increased grip strength) in addition to assisting patients who suffer from physical or neurological disorders. This talk will focus on two different projects that demonstrate the design, fabrication, and control principles required to realize these systems.