[robotics-worldwide] [meetings] Invitation: IROS Workshop on Latest Advances in Big Activity Data Sources for Robotics and New Challenges

Asil Kaan Bozcuoğlu
Call for Participation: IROS Workshop on Latest Advances in Big Activity
Data Sources for Robotics & New Challenges
-------------------------------------
Date:        Oct 1st, 2018
Time:        09:00 am – 06:30 pm
Location:  Madrid Municipal Conference Centre
                 Madrid, Spain
Room:      Monaco (1L1)
Website:   http://www.open-ease.org/activity-data-workshop/
-----------------------------------

Dear fellow roboticists,

On behalf of my colleagues, I would like to cordially invite you to the
IROS Workshop on Latest Advances in Big Activity Data Sources for
Robotics & New Challenges. Below you can find information about the
workshop.

Kind regards,
On behalf of BDSR Organizers,
Asil Kaan Bozcuoğlu

--------------------------------------
Invited talks: http://www.open-ease.org/activity-data-workshop/

- Prof. Dr. Tamim Asfour    (Title: “The KIT Whole-Body Human Motion
Database and the KIT Motion-Language Dataset”)

Abstract: We first present the KIT Whole-Body Human Motion Database, a
large-scale database of whole-body human motion in which the motion data
captures not only the motions of the human subject but also the position
and motion of the objects with which the subject interacts. We present
the methods and tools used for a unifying representation of captured
human motion, such as the reference kinematics and dynamics model of the
human body, the Master Motor Map (MMM), together with procedures and
techniques for the systematic recording, annotation, and classification
of human motion data and for the contact-based segmentation of
whole-body human motion. Second, we will present the KIT Motion-Language
Dataset, a large, open, and extensible dataset for linking human motion
and natural language, where annotations of whole-body human motions in
natural language are obtained using a crowd-sourcing approach. Finally,
we show how the database and dataset are used for the generation of
multi-contact whole-body motion for humanoid robots, as well as for
generating text from motion and motion from text by learning a
bidirectional mapping between human whole-body motion and natural
language using deep recurrent neural networks.


- Prof. Michael Beetz, PhD   (Title: “Virtual Reality-based Mental
Simulations for Robots and Hybrid Reasoning”)


- Dr. Tetsunari Inamura (Title: “Integration of Robot Simulator and
Behavior data collection system for Human-Robot Interaction” )

Abstract: Recently, human behavior/motion databases have attracted
great attention for the analysis and learning of human activity, the
extraction of human skill knowledge, and applications in human-robot
interaction systems. One important target is the collection of user
behavior in human-robot interaction scenarios; however, this requires
huge time and human costs for maintaining the robot and experimenting
with many test subjects. Since the robot has to behave under identical
conditions across all behavior collection sessions, it is also
difficult to keep the quality and performance of the physical robot
constant. To deal with these difficulties, we have developed a data
collection system that integrates a robot simulator and an immersive
human-robot interface in virtual reality. The crowd-sourcing function
of the system enables any user to take part in human-robot interaction
experiments anytime, anywhere. This means that the system can collect
social and embodied human/robot behavior without spending a lot of
time on robot maintenance and experiments. Through a demonstration
experiment at the RoboCup@Home competition, I will show the
feasibility and potential of the system for evaluating the social
intelligence of human-robot interaction systems.


- Prof. Dr. Katja Mombaur (Title: “Benchmarking schemes and data bases for
humanoid performance evaluation in the Eurobench project”)

Abstract: Standardized benchmarking schemes will play an increasingly
important role in evaluating the performance of robots and predicting
their suitability for real-world applications. The Eurobench project,
led by CSIC Madrid, aims at defining benchmarking standards for
humanoid and wearable robots with a special focus on locomotion,
setting up two benchmarking facilities for general use, and
implementing a database of collected benchmarking data for future
reference. The project builds, among others, on expertise in
benchmarking and performance evaluation gathered in five previous
European projects (KoroiBot, H2R, BioMot, WalkMan and Balance). The
European robotics community will be able to participate in the
Eurobench project in the context of two sub-calls: (a) contributing to
the benchmarking setups and measurements, and (b) performing
benchmarking experiments in the established facilities. In this talk,
I will in particular highlight the planned work on benchmarking
humanoid locomotion against the background of previous research and
discuss the most important application scenarios as well as the
suitability of different key performance indicators.


- Dr.-Ing Karinne Ramirez-Amaro (Title: “Crowd-Sourcing Human Activity Data
from Virtual Reality”)


-  Prof. Dr. Wataru Takano (Title: “On Human Activity Dataset for Humanoid
Robots”)

Abstract: Human behaviors come in a variety of forms and styles. We
handle this variety in the real world by breaking behaviors down or
putting them together in linguistic form. Symbolic representations of
behaviors underlie our intelligence, and their technical implementation
is required for a humanoid robot that is integrated into our daily
life. This talk presents contributions on encoding human demonstrations
into stochastic models. These motion models can be connected to their
relevant action descriptions in a stochastic manner. The connection
allows the humanoid robot to understand human activities as descriptive
sentences, and to synthesize human-like actions from sentence commands.
Additionally, the motion models are extended to encode physical
properties, more specifically, profiles of joint angles and joint
torques. The motion models can compute a pair of joint angles and
torques that satisfies physical consistency. This computation leads to
the design of a force controller from human demonstrations. Experiments
show its validity on a robotic arm drawing on a board in response to
the reaction force.
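
For readers unfamiliar with the term: the physical consistency between
joint angles and torques presumably refers to the standard rigid-body
dynamics relation

    \tau = M(q) \ddot{q} + C(q, \dot{q}) \dot{q} + g(q),

where q are the joint angles, M(q) the inertia matrix, C(q, \dot{q})
the Coriolis and centrifugal terms, g(q) the gravity torques, and \tau
the joint torques; a motion model that jointly encodes angle and torque
profiles has to respect this relation.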

---------------------------------
Schedule (Tentative):

Time             Event              Comment

9:00 – 9:20      Opening            Opening remarks by the organizers
9:30 – 10:15     Invited talk 1     Prof. Asfour
10:15 – 11:00    Invited talk 2     Prof. Beetz
11:00 – 11:30    Coffee break
11:30 – 12:15    Invited talk 3     Prof. Inamura
12:20 – 13:20    Lightning talks    Accepted papers (6 papers)
13:30 – 14:30    Lunch break
14:30 – 15:00    Poster session
15:00 – 15:45    Invited talk 4     Prof. Mombaur
15:45 – 16:30    Invited talk 5     Dr. Ramirez-Amaro
16:30 – 17:00    Coffee break
17:00 – 17:45    Invited talk 6     Prof. Takano
17:45 – 18:15    Discussion
18:15 – 18:30    Final remarks
18:30            End



------------------------------------

Workshop Objectives

Recently, we have witnessed robots starting to execute human-level
complex tasks such as making popcorn, baking pizza, and carrying out
chemical experiments. Although these executions are milestones in
themselves, robots still have limitations in terms of flexibility,
speed, and adaptability. To attack these limitations, we believe that
big data sources, which contain activity data from robots, human
tracking, and virtual reality, play an important role. Having a big
activity data source on site can help robots in many ways, such as
learning motion parameterizations, adapting to different conditions,
and generalizing their existing knowledge. Although we see many
applications that are starting to make use of big data, the research
community is still in the phase of “re-inventing the wheel” by
designing new data structures, collecting similar data, and
implementing interfaces between data and learning/control routines.
Our main goal in this workshop is to gather interested researchers
from among the IROS attendees and take a step towards the
standardization of research tools and data formats to strengthen joint
research efforts.

We believe that the data coming from different agents should reside in
a similar format so that it can be combined and used together. On the
other hand, each source surely has unique aspects. For instance,
robotic activity data usually carries annotations from the control and
perception routines, but we do not have access to such “brain dumps”
in human tracking data or in virtual reality (VR). Similarly, we can
detect force-dynamic events and the ground truth in simulation and
virtual reality environments, in contrast to real-world executions.
Thus, one of our objectives in this workshop is to discuss and seek an
answer to the questions “Is it possible to come up with a generic data
format to cover all these aspects? If so, how can we design such a
format?”
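
To make the question concrete, here is a minimal, purely hypothetical
sketch (in Python; all field names are invented for illustration and
are not a proposal) of what such a generic record could look like,
reserving optional slots for the source-specific aspects above:

# Hypothetical sketch of a generic activity record -- all names are
# illustrative, not an agreed standard. Source-specific information
# lives in optional fields so that one schema can hold robot,
# human-tracking, and VR/simulation data alike.
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional

@dataclass
class ActivityRecord:
    agent_type: str                    # "robot", "human", or "vr_avatar"
    timestamp: float                   # seconds since episode start
    poses: Dict[str, List[float]]      # body part/link -> [x, y, z, qx, qy, qz, qw]
    objects: Dict[str, List[float]] = field(default_factory=dict)  # object poses
    # Only robots can expose their internal "brain dump":
    control_annotations: Optional[Dict[str, Any]] = None     # e.g. active controller, goal
    perception_annotations: Optional[Dict[str, Any]] = None  # e.g. detections, confidences
    # Only simulation/VR can provide these reliably:
    force_events: Optional[List[Dict[str, Any]]] = None      # e.g. contact made/lost
    ground_truth: Optional[Dict[str, Any]] = None            # exact world state

# Example: a VR record carries force events and ground truth, but no
# perception annotations (there is no robot "brain" to dump).
record = ActivityRecord(
    agent_type="vr_avatar",
    timestamp=12.5,
    poses={"right_hand": [0.4, 0.1, 1.0, 0.0, 0.0, 0.0, 1.0]},
    force_events=[{"type": "contact_made", "objects": ["right_hand", "cup"]}],
    ground_truth={"cup_state": "grasped"},
)

The point of the optional fields is that a robot log, a motion-capture
log, and a VR log can then share one schema while each fills only the
slots its source can actually provide.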

A more specific sub-problem is that a variety of virtual reality
engines is used by roboticists. The available VR engines present
different input/output devices and are able to capture position and
orientation using tracking technologies. The development of VR
software must therefore comply with demanding quality standards and
timing constraints that meet the needs of sensory simulation and
direct 3D interaction. Working applications combine different packages
and implement specific solutions. In every VR setting, activity data
is represented and stored in a different format, and therefore virtual
scenarios cannot easily be integrated and interpreted uniformly. To
this end, we plan to analyze the existing virtual reality set-ups from
the accepted papers and assess the needs of the research community. An
important question to be asked is “Can we agree on a pseudo-standard
VR system for robotics, like Gazebo for 3D simulations?”

Overall, the participants will gain insights into the state of the art
through the presentations of the invited speakers and of the authors
of the papers accepted at this workshop. We foresee that the keynotes
from the invited speakers, who have known expertise in the field, will
lead to valuable discussions. Since we plan to assess the community’s
needs, we will encourage every participant to actively communicate and
discuss possible ways towards a collaborative research effort in the
dedicated discussion slot. In addition, we will offer the authors of
every accepted paper a place for their posters, so that they can
explain their work in detail.



--
Asil Kaan Bozcuoğlu, M.Sc.
Institute for Artificial Intelligence,
University of Bremen

Visitor Address:
Am Fallturm 1, Entrance E
Office 1.56
Bremen, Germany D-28359

email: [hidden email]
web: http://ai.uni-bremen.de/team/asil_kaan_bozcuoglu
phone (office): +49 421 218 64044
fax: +49 421 218 64047
--

Open-EASE - Web Analysis Tool for Robot Experience Data:
http://www.open-ease.org/
_______________________________________________
robotics-worldwide mailing list
[hidden email]
http://duerer.usc.edu/mailman/listinfo.cgi/robotics-worldwide