All Stories

  1. An Immersive and Interactive VR Dataset to Elicit Emotions
  2. Searching Across Realities: Investigating ERPs and Eye-Tracking Correlates of Visual Search in Mixed Reality
  3. Less Typing, More Tagging: Investigating Tag-based Interfaces in Online Accommodation Review Creation and Perception
  4. Exploring Users' Mental Models and Privacy Concerns During Interconnected Interactions
  5. Usability and Adoption of Graphical Tools for Data-Driven Development
  6. Multimodal Detection of External and Internal Attention in Virtual Reality using EEG and Eye Tracking Features
  7. Hands-free Selection in Scroll Lists for AR Devices
  8. Crossing Mixed Realities: A Review for Transitional Interfaces Design
  9. Second Workshop on Engineering Interactive Systems Embedding AI Technologies
  10. Learning in the wild – exploring interactive notifications to foster organic retention of everyday media content
  11. Significant Productivity Gains through Programming with Large Language Models
  12. Extending Jupyter with Multi-Paradigm Editors
  13. Investigating the Effects of Eye-Tracking Interpolation Methods on Model Performance of LSTM
  14. Optimizing Visual Complexity for Physiologically-Adaptive VR Systems: Evaluating a Multimodal Dataset using EDA, ECG and EEG Features
  15. Uncovering and Addressing Blink-Related Challenges in Using Eye Tracking for Interactive Systems
  16. Perceived Empathy of Technology Scale (PETS): Measuring Empathy of Systems Toward the User
  17. Sitting Posture Recognition and Feedback: A Literature Review
  18. PhysioCHI: Towards Best Practices for Integrating Physiological Signals in HCI
  19. Investigating Opportunities for Active Smart Assistants to Initiate Interactions With Users
  20. Developing an Emoji-based User Experience Questionnaire: UEQ-Emoji
  21. SeatmateVR: Proxemic Cues for Close Bystander-Awareness in Virtual Reality
  22. The Actuality-Time Continuum: Visualizing Interactions and Transitions Taking Place in Cross-Reality Systems
  23. SensCon: Embedding Physiological Sensing into Virtual Reality Controllers
  24. A Mixed-Method Exploration into the Mobile Phone Rabbit Hole
  25. Adapting Visual Complexity Based on Electrodermal Activity Improves Working Memory Performance in Virtual Reality
  26. A Scoping Survey on Cross-Reality Systems
  27. Supporting Software Developers Through a Gaze-Based Adaptive IDE
  28. Engineering Interactive Systems Embedding AI Technologies
  29. Exploring Smart Standing Desks to Foster a Healthier Workplace
  30. Highlighting the Challenges of Blinks in Eye Tracking for Interactive Systems
  31. Deep Learning Super-Resolution Network Facilitating Fiducial Tangibles on Capacitive Touchscreens
  32. Understanding and Mitigating Technology-Facilitated Privacy Violations in the Physical World
  33. Using Pseudo-Stiffness to Enrich the Haptic Experience in Virtual Reality
  34. Exploring Physiological Correlates of Visual Complexity Adaptation: Insights from EDA, ECG, and EEG Data for Adaptation Evaluation in VR Adaptive Systems
  35. A Database for Kitchen Objects: Investigating Danger Perception in the Context of Human-Robot Interaction
  36. The Impact of Expertise in the Loop for Exploring Machine Rationality
  37. Walk This Beam: Impact of Different Balance Assistance Strategies and Height Exposure on Performance and Physiological Arousal in VR
  38. ThumbPitch: Enriching Thumb Interaction on Mobile Touchscreens using Deep Learning
  39. Extended Mid-air Ultrasound Haptics for Virtual Reality
  40. A survey of digitally augmented piano prototypes across the decades
  41. The Influence of Transparency and Control on the Willingness of Data Sharing in Adaptive Mobile Apps
  42. Conductive Fiducial Tangibles for Everyone
  43. The Skewed Privacy Concerns of Bystanders in Smart Environments
  44. A Touch of Realities: Car-Interior-Based Haptic Interaction Supports In-Car VR Recovery from Interruptions
  45. A Survey of Natural Design for Interaction
  46. The Human in the Infinite Loop: A Case Study on Revealing and Explaining Human-AI Interaction Loop Failures
  47. Cobity: A Plug-And-Play Toolbox to Deliver Haptics in Virtual Reality
  48. Tangible Interfaces Support Young Children’s Goal Interdependence
  49. Adapting visualizations and interfaces to the user
  50. Complementary interfaces for visual computing
  51. Designing Tangible Tools to Engage Silent Students in Group Discussion
  52. Designing a Tangible Interface to “Force” Children Collaboration
  53. Virtual Reality Adaptation Using Electrodermal Activity to Support the User Experience
  54. VRception: Rapid Prototyping of Cross-Reality Systems in Virtual Reality
  55. User Perceptions of Extraversion in Chatbots after Repeated Use
  56. Designing Tangible as an Orchestration Tool for Collaborative Activities
  57. Designing a Physiological Loop for the Adaptation of Virtual Human Characters in a Social VR Scenario
  58. A Meta-Analysis of Tangible Learning Studies from the TEI Conference
  59. Flyables: Haptic Input Devices for Virtual Reality using Quadcopters
  60. ConAn: A Usable Tool for Multimodal Conversation Analysis
  61. Neural Photofit: Gaze-based Mental Image Reconstruction
  62. 3D Hand Pose Estimation on Conventional Capacitive Touchscreens
  63. A Design Space for User Interface Elements using Finger Orientation Input
  64. Introduction to Intelligent User Interfaces
  65. Super-Resolution Capacitive Touchscreens
  66. Pose-on-the-Go: Approximating User Pose with Smartphone Sensor Fusion and Inverse Kinematics
  67. Vibrosight++: City-Scale Sensing Using Existing Retroreflective Signs and Markers
  68. Revisited: Comparison of Empirical Methods to Evaluate Visualizations Supporting Crafting and Assembly Purposes
  69. Deep learning for human-computer interaction
  70. Investigating User Perceptions Towards Wearable Mobile Electromyography
  71. Imprint-Based Input Techniques for Touch-Based Mobile Devices
  72. Shortcut Gestures for Mobile Text Editing on Fully Touch Sensitive Smartphones
  73. VibroComm: Using Commodity Gyroscopes for Vibroacoustic Data Reception
  74. Enhancing Mobile Voice Assistants with WorldGaze
  75. Improving Humans' Ability to Interpret Deictic Gestures in Virtual Reality
  76. Audio VR
  77. POST
  78. Finding the Sweet Spot
  79. Notification in VR
  80. Force Touch Detection on Capacitive Sensors using Deep Neural Networks
  81. Investigating Unintended Inputs for One-Handed Touch Interaction Beyond the Touchscreen
  82. EyePointing
  83. KnuckleTouch
  84. Text Analysis Using Large High-Resolution Displays
  85. Effect of Orientation on Unistroke Touch Gestures
  86. Improving the Input Accuracy of Touchscreens using Deep Learning
  87. Investigating the Effect of Orientation and Visual Style on Touchscreen Slider Performance
  88. Online, VR, AR, Lab, and In-Situ
  89. Investigating the feasibility of finger identification on capacitive touchscreens using deep learning
  90. Up to the Finger Tip
  91. How to communicate new input techniques
  92. The Effect of Road Bumps on Touch Interaction in Cars
  93. Demonstrating palm touch
  94. Designing finger orientation input for mobile touchscreens
  95. Evaluating the Disruptiveness of Mobile Interactions
  96. Pac-Many
  97. PalmTouch
  98. The Effect of Offset Correction and Cursor on Mid-Air Pointing in Real and Virtual Environments
  99. Understanding Large Display Environments
  100. Fingers' Range and Comfortable Area for One-Handed Smartphone Interaction Beyond the Touchscreen
  101. InfiniTouch
  102. Interaction techniques for window management on large high-resolution displays
  103. Machine learning with tensorflow for mobile and ubiquitous interaction
  104. Teach Me How! Interactive Assembly Instructions Using Demonstration and In-Situ Projection
  105. Estimating the Finger Orientation on Capacitive Touchscreens Using Convolutional Neural Networks
  106. Using Variable Movement Resistance Sliders for Remote Discrete Input
  107. A smartphone prototype for touch interaction on the whole device surface
  108. Feasibility analysis of detecting the finger orientation with depth cameras
  109. Improving software-reduced touchscreen latency
  110. Machine learning for intelligent mobile user interfaces using TensorFlow
  111. Understanding the ergonomic constraints in designing for touch surfaces
  112. RunMerge
  113. Interaction Methods and Use Cases for a Full-Touch Sensing Smartphone
  114. Towards Supporting Remote Cheering during Running Races with Drone Technology
  115. Understanding Work in Public Transport Management Control Rooms
  116. Mobile Interactions Augmented by Wearable Computing
  117. Microgesture detection for remote interaction with mobile devices
  118. Mobile In-Situ Pick-by-Vision
  119. Automatic projection positioning based on surface suitability
  120. Screen arrangements and interaction areas for large display work places
  121. Design Guidelines for Notifications on Smart TVs
  122. Finger Placement and Hand Grasp during Smartphone Interaction
  123. RAMPARTS
  124. From Mobile to Wearable
  125. Modeling Distant Pointing for Compensating Systematic Displacements
  126. Subjective and Objective Effects of Tablet's Pixel Density
  127. TUIs in the Large
  128. Using Space
  129. Pick from here!
  130. Using In-Situ Projection to Support Cognitively Impaired Workers at the Workplace