You can download a print version or a
smartphone-friendly version of the
program.
If you want to go all in, the print version also doubles as a conference badge, and you can even print your own conference sign!
Accepted submissions will be published in the ACM Digital Library as part of the International Conference Proceeding Series (ICPS).
Monday, 16th of March 2020
Time | Talk | Recording |
---|---|---|
09:25 - 09:40 | Conference Opening | Watch |
09:40 - 10:40 | Keynote Stelarc - Contemporary Chimeras: Creepy, Uncanny and Contestable Bodies | Watch |
Session 1: Perception I - Light | Watch | |
11:00 - 11:20 | Eye-based Interaction Using Embedded Optical Sensors on an Eyewear Device for Facial Expression Recognition (Katsutoshi Masai, Kai Kunze and Maki Sugimoto) | Watch |
11:20 - 11:30 | Altering the Speed of Reality? Exploring Visual Slow-Motion to Amplify Human Perception using Augmented Reality (Pascal Knierim, Thomas Kosch, Gabrielle LaBorwit and Albrecht Schmidt) | Watch |
11:30 - 11:50 | DehazeGlasses: Optical Dehazing with an Occlusion Capable See-Through Display (Yuichi Hiroi, Takumi Kaminokado, Atsushi Mori and Yuta Itoh) | Watch |
11:50 - 12:00 | Vision Extension for a Ball Camera by Using Image Completion (Tsubasa Kitayama, Shio Miyafuji and Hideki Koike) | Watch |
12:00 - 12:20 | OmniView: An Exploratory Study of 360 Degree Vision using Dynamic Distortion based on Direction-of-Interest (Feng Liang, Kevin Stevanus, Holger Baldauf, Kai Kunze and Yun Suen Pai) | Watch |
12:20 - 13:40 | Lunch | |
Session 2: Perception II - Vibration | Watch | |
13:40 - 14:00 | The Lateral Line: Augmenting Spatiotemporal Perception with a Tactile Interface (Matti Krüger, Christiane B. Wiebel-Herboth and Heiko Wersing) | Watch |
14:00 - 14:20 | HapticPointer: A Neck-worn Device that Presents Direction by Vibrotactile Feedback for Remote Collaboration Tasks (Akira Matsuda, Kazunori Nozawa, Kazuki Takata, Atsushi Izumihara and Jun Rekimoto) | Watch |
14:20 - 14:40 | GenVibe: Exploration of Interactive Generation of Personal Vibrotactile Patterns (Erik Pescara, Florian Dreschner, Karola Marky, Kai Kunze and Michael Beigl) | Watch |
14:40 - 15:00 | Manipulatable Auditory Perception in Wearable Computing (Hiroki Watanabe and Tsutomu Terada) | Watch |
15:00 - 15:10 | Novel Input and Output Opportunities using an Implanted Magnet (Paul Strohmeier and Jess McIntosh) | Watch |
15:10 - 15:30 | Poster and Demo Madness: PDMSkin: On-Skin Gestures with Printable Ultra-Stretchable Soft Electronic Second Skin (Tobias Röddiger, Michael Beigl, Daniel Wolffram, Matthias Budde and Hongye Sun); Investigation of Effective Parts for Rotation and Translation of the Legs Using Hanger Reflex (Hanamichi Sanada, Masato Kobayashi, Kon Yuki and Hiroyuki Kajimoto); GymSoles++: Using Smart Wearables to Improve Body Posture when Performing Squats and Dead-Lifts (Don Samitha Elvitigala, Denys J.C. Matthies, Chamod Weerasinghe and Suranga Nanayakkara); EgoSpace: Augmenting Egocentric Space by Wearable Projector (Yuya Adachi, Haoran Xie, Takuma Torii, Haopeng Zhang and Ryo Sagisaka); Towards a Wearable for Deep Water Blackout Prevention (Frederik Wiehr, Andreas Höh and Antonio Krueger); High-speed Projection Method of Swing Plane for Golf Training (Tomohiro Sueishi, Chikara Miyaji, Masataka Narumiya, Yuji Yamakawa and Masatoshi Ishikawa); Augmented Workplace: Human-Sensor Interaction for Improving the Work Environment (Yutaka Arakawa); e2-MaskZ: a Mask-type Display with Facial Expression Identification using Embedded Photo Reflective Sensors (Akino Umezawa, Yoshinari Takegawa, Katsuhiro Suzuki, Katsutoshi Masai, Yuta Sugiura, Maki Sugimoto, Yutaka Tokuda, Diego Martinez Plasencia, Sriram Subramanian, Masafumi Takahashi, Hiroaki Taka and Keiji Hirata) | Watch |
Session 3: Moving and Experiencing the Body | Watch | |
16:00 - 16:10 | Sensor Glove Implemented With Artificial Muscle Set For Hand Rehabilitation (Biyuan Wang, Nobuhiro Takahashi and Hideki Koike) | Watch |
16:10 - 16:30 | Accelerating Skill Acquisition of Two-Handed Drumming using Pneumatic Artificial Muscles (Takashi Goto, Swagata Das, Katrin Wolf, Pedro Lopes, Yuichi Kurita and Kai Kunze) | Watch |
16:30 - 16:50 | PoseAsQuery: Full-Body Interface for Repeated Observation of a Person in a Video with Ambiguous Pose Indexes and Performed Poses (Natsuki Hamanishi and Jun Rekimoto) | Watch |
16:50 - 17:00 | Investigation of Effective Parts for Rotation and Translation of the Legs Using Hanger Reflex (Hanamichi Sanada, Masato Kobayashi, Kon Yuki and Hiroyuki Kajimoto) | Watch |
17:00 - 17:10 | Go-Through: Disabling Collision to Access Obstructed Paths and Open Occluded Views in Social VR (Jens Reinhardt and Katrin Wolf) | Watch |
17:20 - 17:40 | Remote Treatment System of Phantom Limb Pain by Displaying Body Movement in Shared VR Space (Kenta Saito, Atsushi Okada, Yu Matsumura and Jun Rekimoto) | Watch |
Tuesday, 17th of March 2020
Time | Talk | Recording |
---|---|---|
09:00 - 10:10 | Keynote Kasper Hornbæk - Research Problems in Body-based User Interfaces | Watch |
Session 4: Sports and Gestures | Watch | |
10:40 - 10:50 | KissGlass: Greeting Gesture Recognition using Smart Glasses (Richard Li, Juyoung Lee, Woontack Woo and Thad Starner) | Watch |
10:50 - 11:10 | ExemPoser: Predicting Poses of Experts as Examples for Beginners in Climbing Using a Neural Network (Katsuhito Sasaki, Keisuke Shiro and Jun Rekimoto) | Watch |
11:10 - 11:30 | waveSense: Low Power Voxel-tracking Technique for Resource Limited Devices (Anusha Withana, Tharindu Kaluarachchi, Chanaka Singhabahu, Shanaka Ransiri, Yilei Shi and Suranga Nanayakkara) | Watch |
11:30 - 11:50 | The Jungle Warm-Up Run: Augmenting Athletes with Coach-Guided Dynamic Game Elements (Frederik Wiehr, Marko Vujic, Antonio Krueger and Florian Daiber) | Watch |
11:50 - 12:10 | Archery shots visualization by clustering and comparing from angular velocities of bows (Midori Kawaguchi, Hironori Mitake and Shoichi Hasegawa) | Watch |
12:10 - 13:30 | Lunch | |
Session 5: Cognition | Watch | |
13:30 - 13:50 | Design of Altered Cognition with Reshaped Bodies (Kenichiro Shirota, Makoto Uju, Yurike Chandra, Elaine Czech, Roshan L. Peiris and Kouta Minamizawa) | Watch |
13:50 - 14:10 | Wearable Reasoner: Enhanced Human Rationality Through A Wearable Audio Device With Explainable AI Assistant (Valdemar Danry, Pat Pataranutaporn, Yaoli Mao and Pattie Maes) | Watch |
14:10 - 14:20 | SpotlessMind - A Design Probe for Eliciting Attitudes towards Sharing Neurofeedback (Passant El.Agroudy, Xiyue Wang, Evgeny Stemasov, Teresa Hirzle, Svetlana Shishkovets, Siddharth Mehrotra and Albrecht Schmidt) | Watch |
14:20 - 14:40 | Facilitating Experiential Knowledge Sharing through Situated Conversations (Ryo Fujikura and Yasuyuki Sumi) | Watch |
Session 6: HCI Futures, from Skin to Cells | Watch | |
15:00 - 15:20 | VersaTouch: A Versatile Plug-and-Play System that Enables Touch Interactions on Everyday Passive Surfaces (Yilei Shi, Haimo Zhang, Jiashuo Cao and Suranga Nanayakkara) | Watch |
15:20 - 15:40 | WristLens: Enabling Single-Handed Surface Gesture Interaction for Smartwatch using Optical Motion Sensor (Hui-Shyong Yeo, Juyoung Lee, Andrea Bianchi, Alejandro Samboy, Hideki Koike, Woontack Woo and Aaron Quigley) | Watch |
15:40 - 16:00 | PDMSkin: On-Skin Gestures with Printable Ultra-Stretchable Soft Electronic Second Skin (Tobias Röddiger, Michael Beigl, Daniel Wolffram, Matthias Budde and Hongye Sun) | Watch |
16:00 - 16:10 | Sketching On-Body Interactions using Piezo-Resistive Kinesiology Tape (Paul Strohmeier, Narges Pourjafarian, Marion Koelle, Cedric Honnet, Bruno Fruchard and Jürgen Steimle) | Watch |
16:10 - 16:30 | Living Bits: Opportunities and Challenges for Integrating Living Microorganisms in Human-Computer Interaction (Pat Pataranutaporn*, Angela Vujic*, David Kong, Pattie Maes and Misha Sra; *joint first authors) | Watch |
16:50 - 17:50 | Keynote Enkelejda Kasneci: It’s in Your Eyes – How Eye Tracking will Shape our Future | Watch |
17:50 - 18:10 | Awards | Watch |
18:10 - 18:40 | Conference Closing | Watch |
Stelarc
>> Contemporary Chimeras: Creepy, Uncanny and Contestable Bodies <<
Stelarc’s projects and performances explore alternate anatomies. His first projects, in 1968, were helmets that split binocular vision. His early actions from 1970 involved amplifying body signals and sounds as control signals for real-time interaction. Between 1973 and 1975 he made three films of the inside of his body. Between 1976 and 1988 he realised 27 body suspensions with insertions into his skin, in different positions, in remote locations and in varying situations. In 1980 the Third Hand, an EMG-controlled mechanism with pinch, grasp and wrist rotation, was engineered. For the Fifth Australian Sculpture Triennial in 1993 he designed a sculpture for the inside of his stomach. In 1997, in Hamburg, Exoskeleton, a six-legged walking robot, was engineered. Fractal Flesh, Ping Body and Parasite were internet performances that explored remote and involuntary choreography via a muscle stimulation system. In 2006 an ear was surgically constructed and cell-grown on his arm; the intent is still to electronically augment and internet-enable it. In 2017, with Propel, he was attached to the end of an industrial robot arm so that his position/orientation, trajectory and velocity could be precisely programmed. In the Re-Wired / Re-Mixed performance (2018), for five days, six hours a day in Perth, he could see only with the eyes of someone in London and hear only with the ears of someone in New York, whilst anyone, anywhere could choreograph his exoskeleton arm. StickMan is an interactive installation and performance in which the body was actuated by a minimal but full-body exoskeleton.
In 1996 he was made an Honorary Professor of Art and Robotics at Carnegie Mellon University, Pittsburgh, and in 2002 he was awarded an Honorary Doctorate of Laws by Monash University, Melbourne. In 2010 he was awarded the Ars Electronica Hybrid Arts Prize. In 2015 he received the Australia Council’s Emerging and Experimental Arts Award. In 2016 he was awarded an Honorary Doctorate from the Ionian University, Corfu. Between 2013 and 2018 Stelarc was a Distinguished Research Fellow, School of Media, Creative Arts and Social Inquiry (MCASI), Curtin University, Perth. His artwork is represented by Scott Livesey Galleries, Melbourne.
Kasper Hornbæk
>> Research Problems in Body-based User Interfaces <<
Kasper Hornbæk is a Professor in Computer Science at the University of Copenhagen. He received his PhD from the University of Copenhagen in 2002. Hornbæk’s work has contributed to HCI in two important ways. First, he and his coauthors have created new models of usability and user experience. They have shown that we can measure how computer tools extend our bodies; that different aspects of usability are orthogonal (and therefore should be measured independently); and that meaning is an overlooked component of user experience. Second, Hornbæk and his collaborators have sought to develop fundamental concepts and methods in the HCI field. They have discussed the field-defining concept of interaction, analysed what it means that a user interface is subtle, and outlined the key types of problems addressed in HCI. Currently, Hornbæk works on body-based user interfaces and human-computer integration.
Enkelejda Kasneci
>> It’s in Your Eyes – How Eye Tracking will Shape our Future <<
Enkelejda Kasneci is a Full Professor of Computer Science at the University of Tübingen, Germany, where she leads the Human-Computer Interaction Group. As a BOSCH scholar, she received her M.Sc. degree in Computer Science from the University of Stuttgart in 2007. In 2013, she received her PhD in Computer Science from the University of Tübingen. For her PhD research, she was awarded the research prize of the Federation Südwestmetall in 2014. From 2013 to 2015, she was a Margarete-von-Wrangell Fellow. Her main research interests are applied machine learning and eye-tracking technology and its applications. She serves as a reviewer and PC member for several journals and major conferences. In 2016, she founded LeadersLikeHer, the world’s first open career network for women from industrial, research and public organizations.