Four authors with University of Maryland ties have written the chapter “Vision During Action: Extracting Contact and Motion from Manipulation Videos—Toward Parsing Human Activity,” in the book Modelling Human Motion: From Human Perception to Robot Design, published by Springer in July 2020.

ISR-affiliated Professor Yiannis Aloimonos (CS/UMIACS) and Associate Research Scientist Cornelia Fermüller (UMIACS), along with alumnus Konstantinos Zampogiannis (CS Ph.D. 2019) and current Ph.D. student Kanishka Ganguly, present an active, bottom-up method for the detection of actor-object contacts and the extraction of moved objects and their motions in RGBD videos of manipulation actions.

When we physically interact with our environment using our hands, we touch objects and force them to move: contact and motion are defining properties of manipulation. The core of the approach described in the chapter lies in non-rigid registration: continuously warping a point cloud model of the observed scene to the current video frame, generating a set of dense 3D point trajectories.
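As a rough illustration only (not the chapter's implementation), the trajectory-building idea can be sketched by composing per-frame warps of an initial point cloud model; here the hypothetical `warp` functions are stubbed as small rigid translations, whereas the chapter uses non-rigid registration against each RGBD frame:

```python
import numpy as np

def track_points(points, warps):
    """Accumulate dense 3D point trajectories by applying a
    per-frame warp function to every point of the initial model.

    points: (N, 3) initial point cloud
    warps:  list of T functions mapping (N, 3) -> (N, 3)
    Returns an array of shape (T + 1, N, 3): one position per
    point per frame, i.e. N dense trajectories of length T + 1.
    """
    traj = [points]
    for warp in warps:
        traj.append(warp(traj[-1]))  # warp the latest positions forward
    return np.stack(traj)

# Toy example: 4 points, 5 frames, each "warp" shifts 0.1 along x.
pts = np.zeros((4, 3))
warps = [lambda p: p + np.array([0.1, 0.0, 0.0])] * 5
trajectories = track_points(pts, warps)
print(trajectories.shape)  # (6, 4, 3)
```

Per-trajectory motion statistics computed from such an array are what downstream steps (actor segmentation, contact detection, motion clustering) operate on.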

Under loose assumptions, the authors employ simple point cloud segmentation techniques to extract the actor and subsequently detect actor–environment contacts based on the estimated trajectories. For each such interaction, using the detected contact as an attention mechanism, they obtain an initial motion segment for the manipulated object by clustering trajectories in the contact area vicinity and then jointly refine the object segment and estimate its 6DOF pose in all observed frames.
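The final pose-estimation step can be illustrated with a standard building block. Given corresponding 3D positions of the object segment's points in two frames (available directly from the trajectories), a rigid 6DoF transform can be recovered with the Kabsch/Procrustes algorithm. This is a minimal sketch of that one sub-step, chosen here for illustration; the chapter's joint segment refinement and multi-frame pose estimation is more involved:

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t such that
    dst ≈ src @ R.T + t, via SVD (Kabsch algorithm).

    src, dst: (N, 3) corresponding 3D points of the object
    segment in two frames (e.g. sampled from its trajectories).
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)         # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Toy check: rotate a point cluster 90 degrees about z and shift it.
rng = np.random.default_rng(0)
src = rng.normal(size=(10, 3))
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([0.5, 0.0, 0.2])
dst = src @ R_true.T + t_true
R, t = rigid_transform(src, dst)
print(np.allclose(R, R_true))  # True
```

Chaining such per-frame transforms yields the object's 6DoF pose in every observed frame relative to the first.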

Because of its generality and the fundamental, yet highly informative, nature of its outputs, this approach is applicable to a wide range of perception and planning tasks. The authors qualitatively evaluate the method on a number of input sequences and present a comprehensive robot imitation learning example that demonstrates the crucial role of the method's outputs in developing action representations and plans from observation.




July 17, 2020

