Achieved Deliverables
2002-2003:
-
Guided Training via a Modular Software System for Learning from Interaction
with the Environment and People:
-
Cog learns simple arm and end effector tasks via a combination
of self-exploration and explicit training. With tactile reinforcement
signals, Cog is taught by a human trainer to perform simple postural
arm and hand actions. Subsequently, the trainer teaches the robot
to perform such learned actions in response to tactile (touch
to particular fingers) and visual (objects of particular colors)
stimuli.
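As a rough illustration, the association step can be pictured as a
small reinforcement-driven lookup from stimulus to action. The sketch
below is hypothetical Python, not Cog's software; the stimulus labels,
action names, and scalar reward are assumptions made for the example.

    import random
    from collections import defaultdict

    class StimulusActionLearner:
        """Toy associative learner standing in for the trainer-guided
        stage that links tactile/visual stimuli to learned actions."""

        def __init__(self, actions, epsilon=0.2, lr=0.5):
            self.actions = actions            # learned action primitives
            self.values = defaultdict(float)  # value of (stimulus, action)
            self.epsilon = epsilon            # exploration rate
            self.lr = lr                      # learning rate

        def choose(self, stimulus):
            # Explore occasionally, otherwise pick the best-valued action.
            if random.random() < self.epsilon:
                return random.choice(self.actions)
            return max(self.actions, key=lambda a: self.values[(stimulus, a)])

        def reinforce(self, stimulus, action, reward):
            # Tactile reinforcement from the trainer (+1 approval, -1 correction).
            key = (stimulus, action)
            self.values[key] += self.lr * (reward - self.values[key])

    # Hypothetical usage: touching one finger should elicit "close_hand".
    learner = StimulusActionLearner(["raise_arm", "close_hand", "wave"])
    for _ in range(20):
        a = learner.choose("touch_index_finger")
        learner.reinforce("touch_index_finger", a, 1.0 if a == "close_hand" else -1.0)
    learner.epsilon = 0.0                     # exploit only when tested
    print(learner.choose("touch_index_finger"))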
-
Exploiting a Model of Muscle Fatigue for Human-like Movement.
-
Learning How Joints Move in Relation to Virtual Muscle Groups
-
Starting simply, from an inclination to randomly move its virtual
muscles, Cog learns to activate its muscle model so it can move
to particular points in joint angle space. Cog acquires an unsupervised
linear dependency model between joint velocities and controller
modules that supervise multiple muscles in combination.
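A minimal numerical sketch of that dependency model, assuming (purely
for illustration) a linear plant, random motor babbling, and a
least-squares fit; none of the dimensions or names reflect Cog's
actual virtual muscle set.

    import numpy as np

    rng = np.random.default_rng(0)
    n_muscles, n_joints, n_samples = 6, 3, 500

    # Simulated, unknown relation between virtual muscle activations and
    # joint velocities, included only so the sketch is self-contained.
    true_map = rng.normal(size=(n_joints, n_muscles))

    # Motor babbling: random activations and the joint velocities observed.
    U = rng.uniform(-1.0, 1.0, size=(n_samples, n_muscles))
    dq = U @ true_map.T + 0.01 * rng.normal(size=(n_samples, n_joints))

    # Unsupervised linear dependency model: least-squares fit dq ~= U @ W.T
    W, *_ = np.linalg.lstsq(U, dq, rcond=None)
    W = W.T                                   # shape (n_joints, n_muscles)

    # To move toward a point in joint-angle space, command the activation
    # whose predicted joint velocity points at the target.
    def muscles_for(desired_dq):
        return np.linalg.pinv(W) @ desired_dq

    target = np.array([0.1, -0.2, 0.05])
    print(np.allclose(W @ muscles_for(target), target, atol=1e-2))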
-
Active segmentation
-
Cog uses active exploration to resolve visual ambiguity in its
workspace. Objects can sometimes be difficult to locate if their
visual appearance is similar to the general background. Cog solves
this problem by sweeping its arm through regions of interest.
If no object is there, the arm passes unimpeded. If an object
is present, the impact between it and the robot's arm causes the
object to move, revealing its boundary.
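A minimal sketch of the underlying image operation, assuming simple
frame differencing around the moment of contact; the function name and
threshold below are illustrative, not Cog's vision code.

    import numpy as np

    def segment_by_poking(frame_before, frame_after, arm_mask, threshold=25):
        """Boolean mask of pixels that moved when the arm swept through.
        arm_mask marks where the arm itself is, so its own motion is not
        mistaken for the object's boundary."""
        diff = np.abs(frame_after.astype(np.int16) - frame_before.astype(np.int16))
        return (diff > threshold) & ~arm_mask

    # Hypothetical usage with synthetic 8-bit grayscale frames:
    before = np.zeros((120, 160), dtype=np.uint8)
    after = before.copy()
    after[40:60, 70:100] = 200            # object displaced by the tap
    arm = np.zeros_like(before, dtype=bool)
    mask = segment_by_poking(before, after, arm)
    print(mask.sum(), "pixels attributed to the poked object")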
-
Cog uses a mirror neuron model to learn how different objects respond
to the actions it can perform.
-
If the robot taps an object and it slips and rolls, it learns
to predict the direction of slip from visual evidence, and can
then use that information to deliberately trigger or avoid
rolling an object while tapping it. The mirror neuron model allows
the robot to mimic an action demonstrated by a human in terms of
the natural behavior of the object, rather than its pure geometry.
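One way to picture the prediction step is as a nearest-neighbour model
over past tapping episodes; the feature names and labels below are
hypothetical stand-ins for whatever visual evidence the real system
uses.

    import numpy as np

    class RollAffordance:
        """Toy affordance model: predict how a tapped object rolls from a
        visual feature vector (e.g. hypothetical elongation features)."""

        def __init__(self):
            self.episodes = []       # (feature_vector, observed_roll_direction)

        def observe(self, features, roll_direction):
            self.episodes.append((np.asarray(features, float), roll_direction))

        def predict(self, features):
            # Nearest-neighbour prediction over past tapping episodes.
            f = np.asarray(features, float)
            best = min(self.episodes, key=lambda ep: np.linalg.norm(ep[0] - f))
            return best[1]

    # Hypothetical usage: elongated objects roll across their long axis.
    model = RollAffordance()
    model.observe([0.9, 0.1], "across_axis")   # e.g. a bottle
    model.observe([0.1, 0.8], "any")           # e.g. a ball
    print(model.predict([0.85, 0.2]))          # -> "across_axis"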
-
Open object recognition
-
With open object recognition, the set of objects Cog can recognize
grows over time, as it accumulates experience through active segmentation
and other experimental methods. The robot clusters episodes of
object interaction to learn the properties of novel, unfamiliar
objects. An operator can introduce names for objects to facilitate
further task-related communication.
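The open-ended flavour of this can be sketched as online clustering
with a novelty threshold, where an operator later names a cluster; the
threshold and feature vectors below are assumptions for illustration.

    import numpy as np

    class OpenObjectRecognizer:
        """Toy open-ended recognizer: clusters episode feature vectors
        and lets an operator attach names to clusters afterwards."""

        def __init__(self, novelty_threshold=1.0):
            self.prototypes = []     # one mean feature vector per object
            self.counts = []
            self.names = {}          # cluster index -> operator-given name
            self.threshold = novelty_threshold

        def observe(self, features):
            f = np.asarray(features, float)
            if self.prototypes:
                dists = [np.linalg.norm(p - f) for p in self.prototypes]
                i = int(np.argmin(dists))
                if dists[i] < self.threshold:
                    # Familiar object: refine its prototype.
                    self.counts[i] += 1
                    self.prototypes[i] += (f - self.prototypes[i]) / self.counts[i]
                    return i
            # Novel object: the recognizable set grows by one cluster.
            self.prototypes.append(f.copy())
            self.counts.append(1)
            return len(self.prototypes) - 1

        def name(self, cluster, label):
            self.names[cluster] = label

    # Hypothetical usage: two cube episodes, then a novel ball.
    rec = OpenObjectRecognizer()
    c = rec.observe([1.0, 0.1])
    rec.observe([1.1, 0.0])
    rec.name(c, "cube")
    print(rec.names.get(rec.observe([0.0, 2.0]), "unnamed"))   # -> "unnamed"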
-
Perceptual cycle
-
Cog uses the constraints of known activities to learn about the
objects used within those activities -- for example, during manipulation.
Cog can track known objects to learn about activities they occur
in, such as a sorting task or object search. By combining the
ability to learn about objects through activity constraints and
activities through tracking objects, the robot can achieve a virtuous
cycle of perception.
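Very roughly, the cycle alternates two inferences; the toy sketch
below uses invented activity and object names only to show the shape
of that bootstrapping, not anything from Cog's implementation.

    # Activities inferred through tracked objects, and objects interpreted
    # through activity constraints, feeding each other. Names illustrative.
    activity_models = {
        "sorting": {"expected_objects": {"cube", "ball"}},
        "search":  {"expected_objects": {"toy_car"}},
    }

    def infer_activity(tracked_objects):
        # Pick the activity whose expected objects best match what is seen.
        return max(activity_models,
                   key=lambda a: len(activity_models[a]["expected_objects"]
                                     & set(tracked_objects)))

    def interpret_unknown(tracked_objects, activity):
        # The unknown item is assumed to be an object the current activity
        # expects but that has not been recognized yet.
        missing = (activity_models[activity]["expected_objects"]
                   - set(tracked_objects))
        return missing.pop() if missing else None

    known = ["cube"]                          # plus one unrecognized blob
    activity = infer_activity(known)
    print(activity, "->", interpret_unknown(known, activity))   # sorting -> ball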
-
Adaptive Control of Cog’s Arm Using a Nonlinear Sliding-Modes
Controller
-
Two degrees of freedom of Cog’s arm operate under non-parametric
adaptive control using a nonlinear sliding-modes controller. This
sufficiently mitigates the low signal-to-noise ratio in Cog’s arm
(due to a small strain gauge signal that experiences capacitive
coupling with other signals) and allows semi-autonomous,
task-adequate control.
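For reference, a single-joint sliding-mode control law has the general
shape sketched below; the gains, the tanh boundary layer, and the toy
plant are illustrative assumptions, not Cog's arm parameters.

    import numpy as np

    def sliding_mode_torque(q, dq, q_des, dq_des, ddq_des,
                            lam=8.0, k=4.0, phi=0.05, m_hat=0.6):
        """s = (dq - dq_des) + lam*(q - q_des) defines the sliding surface;
        tanh(s/phi) is a boundary-layer substitute for sign(s) that keeps
        robustness to model error while limiting chattering."""
        e, de = q - q_des, dq - dq_des
        s = de + lam * e
        # Nominal feedforward term plus robust switching term.
        return m_hat * (ddq_des - lam * de) - k * np.tanh(s / phi)

    # Hypothetical usage: drive a noisy 1-DOF joint toward 0.5 rad.
    m_true, dt, q, dq = 0.8, 0.002, 0.0, 0.0
    for _ in range(3000):
        tau = sliding_mode_torque(q, dq, 0.5, 0.0, 0.0)
        ddq = (tau + np.random.normal(0.0, 0.02)) / m_true   # noisy plant
        dq += ddq * dt
        q += dq * dt
    print(round(q, 3))                        # settles near 0.5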
-
Learning Actions and Objects from Observed Use
-
While Cog watches an event involving someone’s arm handling
an object (e.g. filing a surface, swinging a pendulum), its vision
system both extracts the nature of the arm movement and derives
a predictive dynamical model of the object.
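As a toy version of that second step, one can fit a linear
second-order model to a tracked trajectory by least squares; the
synthetic pendulum data and coefficients below are assumptions made
only for illustration.

    import numpy as np

    # Synthetic pendulum-bob track, standing in for the observed trajectory.
    dt = 0.02
    t = np.arange(0.0, 6.0, dt)
    omega, zeta = 3.0, 0.05
    x = np.exp(-zeta * omega * t) * np.cos(omega * np.sqrt(1 - zeta**2) * t)

    # Finite-difference estimates of velocity and acceleration from the track.
    v = np.gradient(x, dt)
    a = np.gradient(v, dt)

    # Least-squares fit of a predictive model x'' ~= c1*x + c2*x'.
    coeffs, *_ = np.linalg.lstsq(np.column_stack([x, v]), a, rcond=None)
    print("x'' ~= %.2f x + %.2f x'" % tuple(coeffs))   # roughly -9.00, -0.30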
-
A Compact Linear Series Elastic Actuator Design For Human-like Neck
Joint
-
For a new robotic head, two new coupled neck axes were designed
and built using linear series elastic actuators aligned in parallel.
The design is compact: the two axes have intersecting centers
of rotation. Force control in combination with elastic actuation
provides safe, human-like compliance.
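The force-control idea behind a series elastic actuator can be
sketched as follows; the spring constant, gains, and ideal
velocity-controlled motor are illustrative assumptions, not the new
neck's real values.

    # Output force equals spring constant times the deflection between motor
    # and joint, so commanding a force reduces to tracking a deflection.
    k_spring = 3000.0        # N/m, series elastic element (illustrative)
    kp, ki = 40.0, 200.0     # force-loop gains (illustrative)
    dt = 0.001

    def force_controller(f_desired, x_motor, x_joint, integ):
        f_measured = k_spring * (x_motor - x_joint)   # force from deflection
        err = f_desired - f_measured
        integ += err * dt
        v_motor = (kp * err + ki * integ) / k_spring  # motor velocity command
        return v_motor, f_measured, integ

    # Hypothetical usage: hold 10 N against a joint that does not move.
    x_motor, x_joint, integ = 0.0, 0.0, 0.0
    for _ in range(5000):
        v, f, integ = force_controller(10.0, x_motor, x_joint, integ)
        x_motor += v * dt            # ideal velocity-controlled motor
    print(round(f, 2))               # approaches 10.0 N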
-
The ALIVE architecture
-
The ALIVE architecture, consisting of a stack and the CreaL software
development environment, controls the new robotic head. The stack
is a special-purpose, extensible, real-time, small form-factor
hardware architecture of controller boards, sensor boards, a network
board, and an off-the-shelf processor. CreaL, which is retargetable,
extracts efficient computation from the relatively cheap off-the-shelf
processor, supporting many lightweight threads through efficient
software scheduling, compilation, and language abstraction. The
ALIVE architecture gives the designer complete control over startup
and failure sequences, which is essential for continuous, safe
robot operation.
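CreaL itself is a dedicated language and runtime; purely as an
illustration of the many-lightweight-threads idea it supports, the
Python sketch below round-robins cooperative tasks, including an
explicit shutdown path of the kind the designer would control.

    from collections import deque

    def blink(period_ticks):
        tick = 0
        while True:
            if tick % period_ticks == 0:
                print("blink task, period", period_ticks)
            tick += 1
            yield                      # hand control back to the scheduler

    def watchdog(limit):
        for _ in range(limit):
            yield
        print("watchdog fired: entering safe shutdown sequence")

    def run(threads, ticks):
        queue = deque(threads)
        for _ in range(ticks):
            if not queue:
                break
            thread = queue.popleft()
            try:
                next(thread)           # give the thread one time slice
                queue.append(thread)
            except StopIteration:
                pass                   # thread finished; drop it

    run([blink(7), blink(11), watchdog(20)], ticks=80)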