
[robotics-worldwide] [meetings] EPIC@ICCV 2nd Int Workshop on Egocentric Perception Interaction and Computing

Walterio Mayol-Cuevas
*EPIC@ICCV 2017*

Following on the success of EPIC@ECCV last year, we are thrilled to invite
participants and contributors to the second edition of the International
Workshop on Egocentric Perception Interaction and Computing (EPIC) at
ICCV2017:

http://www.eyewear-computing.org/EPIC_ICCV17

*Full Paper Submission: June 15th, 2017* (or August 5th, 2017 if not
accepted into the main ICCV conference)

*Submission of Extended Abstracts: September 12th, 2017*

Notification of Acceptance: July 31st, 2017

Camera-Ready Paper Due: August 25th, 2017

*Workshop: October 29th, 2017*

*Aims and Scope*
Egocentric camera devices are becoming increasingly popular, both as
research prototypes and off-the-shelf products. They can acquire images,
videos and can collect multimodal data such as gaze information, GPS
position, IMU data, etc. When connected with head-mounted displays, they
can also provide new forms of interaction and visualization. Given this
rapid progress, we believe we are only at the beginning, and that these
technologies and their applications can have a great impact on our lives.

In fact, these egocentric camera devices will be able to automatically
understand what the wearer or robot is doing, acting on, and manipulating,
and where their attention is directed. On the scientific side, for example,
researchers are already taking advantage of these kinds of wearable cameras
by monitoring and analyzing the visual experiences of infants' early life.
This work has already produced important findings on how young children
actively explore the world to create experiences and collect visual data
that facilitate their development and early learning of words and visual
objects.

Egocentric perception introduces a series of challenging questions for
computer vision since motion, real-time responsiveness and generally
uncontrolled interactions in the wild are more frequently required or
encountered. Questions such as what to interpret as well as what to ignore,
how to efficiently represent egocentric actions, and how captured information
can be turned into useful data for guidance or log summaries become central.

This new EPIC@X series of workshops aims to bring together the various
communities that are relevant to egocentric perception including Computer
Vision, Multimedia, HCI, and the Visual Sciences, and is planned to be held
at the major conferences in these fields. EPIC@ICCV will accept Full Papers
for novel work, and Extended Abstracts for ongoing or already published
work. Both research and application works related to Egocentric Perception,
Interaction and Computing are encouraged, including those that can be
demonstrated or are in the prototype stages.

Submissions are expected to deal with human-centric perception including,
but not limited to:

   - Eyewear devices for egocentric perception and computation
   - Eyewear devices for acquisition and visualization
   - Egocentric vision for object/event recognition
   - Egocentric vision for summarization
   - Egocentric vision for social interaction and human behavior
   understanding
   - Egocentric vision for children and education
   - Egocentric vision for health
   - Head-mounted eye tracking and gaze estimation
   - Computational visual behaviour analysis
   - Attention modelling and next fixation prediction
   - Eye-based human-computer interaction
   - Human and wearable devices interaction
   - Symbiotic human-machine vision systems
   - Affective computing with respect to wearable devices
   - Interactive AR/VR and Egocentric perception
   - Augmented human performance
   - (Eye-based) daily life and activity monitoring
   - Benchmarking and quantitative evaluation with human subject experiments


*Submission Details:*

http://www.eyewear-computing.org/EPIC_ICCV17

*Organisers:*

Giuseppe Serra, University of Udine, Italy
David Crandall, Indiana University, USA
Dima Damen, University of Bristol, UK
Walterio Mayol-Cuevas, University of Bristol, UK
Rita Cucchiara, University of Modena and Reggio Emilia, Italy
Andreas Bulling, Max Planck Institute for Informatics, Germany
Chen Yu, Indiana University, USA

*EPIC Community*

If you are interested in learning about Egocentric Perception, Interaction
and Computing, including future calls for papers, code, datasets, and jobs,
subscribe to the newly introduced mailing list: [hidden email]

Instructions to subscribe:

   - send an email to: [hidden email]
   - with the subject: subscribe epic-community
   - and blank message content


--
Prof. Walterio Mayol-Cuevas
Professor of Robotics, Computer Vision and Mobile Systems
Department of Computer Science
Merchant Venturers Building
University of Bristol,
Woodland Road BS8 1UB
United Kingdom
_______________________________________________
robotics-worldwide mailing list
[hidden email]
http://duerer.usc.edu/mailman/listinfo.cgi/robotics-worldwide