[robotics-worldwide] [IROS][meetings] CFP Workshop on Crossmodal Learning for Intelligent Robotics in conjunction with IEEE/RSJ IROS 2018


Pablo Barros
**Apologies for cross-posting**

1st CALL FOR PAPERS for the international workshop:

* Crossmodal Learning for Intelligent Robotics * in conjunction with
IEEE/RSJ IROS 2018

* Madrid, Spain - Friday 5 October 2018 *

* Website: http://www.informatik.uni-hamburg.de/wtm/WorkshopCLIR18/index.php *

I. Aim and Scope

The ability to efficiently process crossmodal information is a key
feature of the human brain, providing a robust perceptual experience
and reliable behavioural responses. Consequently, the processing and
integration of multisensory information streams such as vision, audio,
haptics and proprioception play a crucial role in the development of
autonomous agents and cognitive robots, enabling efficient interaction
with the environment even under conditions of sensory uncertainty.

Multisensory representations have been shown to improve performance in
human-robot interaction and sensory-driven motor behaviour. The
perception, integration, and segregation of multisensory cues improve
the capability to physically interact with objects and persons with
higher levels of autonomy. However, multisensory input must be
represented and integrated appropriately so that it results in a
reliable perceptual experience that triggers adequate behavioural
responses. The interplay of multisensory representations can be used to
resolve stimulus-driven conflicts for executive control. Embodied agents
can develop complex sensorimotor behaviour through interaction with a
crossmodal environment, motivating the development and evaluation of
scenarios that better reflect the challenges faced by robots operating
in the real world.

This half-day workshop focuses on presenting and discussing new
findings, theories, systems, and trends in crossmodal learning applied
to neurocognitive robotics. The workshop will feature invited speakers
with outstanding expertise in crossmodal learning.

II. Target Audience

This workshop is open to doctoral students and senior researchers
working in computer and cognitive science, psychology, neuroscience,
and related areas with a focus on crossmodal learning.

III. Confirmed Speakers
  
   1.   * Yulia Sandamirskaya *
           Institute of Neuroinformatics (INI), University and ETH Zurich
   2.   * Angelo Cangelosi *
           Plymouth University and University of Manchester, UK
   3.   * Stefan Wermter *
           University of Hamburg, Germany
  
IV. Submission

    1. Topics of interest:

   - New methods and applications for crossmodal processing
     (e.g., integrating vision, audio, haptics, proprioception)
   - Machine learning and neural networks for multisensory robot perception
   - Computational models of crossmodal attention and perception
   - Bio-inspired approaches for crossmodal learning
   - Crossmodal conflict resolution and executive control
   - Sensorimotor learning for autonomous agents and robots
   - Crossmodal learning for embodied and cognitive robots

   2. For paper submission, use the following IEEE template:
   <http://ras.papercept.net/conferences/support/support.php>
  
   3. Submitted papers should be limited to *2 pages (extended
abstract)* or *4 pages (short paper)*.

   4. Send your pdf file to [hidden email] AND
[hidden email]
  
Selected contributions will be presented during the workshop as
spotlight talks and in a poster session.

Contributors to the workshop will be invited to submit extended versions
of their manuscripts to a special issue (to be arranged). Submissions
will be peer-reviewed in accordance with the journal's practices.

V. Important Dates

   * Paper submission deadline: August 15, 2018
   * Notification of acceptance: September 5, 2018
   * Camera-ready version: September 15, 2018
   * Workshop: Friday 5 October 2018
  
VI. Organizers

   * German I. Parisi * University of Hamburg, Germany
   * Pablo Barros * University of Hamburg, Germany
   * Doreen Jirak * University of Hamburg, Germany
   * Jun Tani * Okinawa Institute of Science and Technology, Japan
   * Yoonsuck Choe * Samsung Research & Texas A&M University, TX, USA

--
Dr.rer.nat. Pablo Barros
Postdoctoral Research Associate - Crossmodal Learning Project (CML)
Knowledge Technology
Department of Informatics
University of Hamburg
Vogt-Koelln-Str. 30
22527 Hamburg, Germany
Phone: +49 40 42883 2535
Fax: +49 40 42883 2515
barros at informatik.uni-hamburg.de
https://www.inf.uni-hamburg.de/en/inst/ab/wtm/people/barros.html
https://www.inf.uni-hamburg.de/en/inst/ab/wtm/

_______________________________________________
robotics-worldwide mailing list
[hidden email]
http://duerer.usc.edu/mailman/listinfo.cgi/robotics-worldwide