[To Appear] We will give 2 presentations at AHs 2023 and 5 presentations at IEEE VR 2023

Mr. Yokoro and Mr. Oshimi (both M2) will give the following two oral presentations at Augmented Humans 2023 (AHs 2023), a conference on human augmentation to be held in Glasgow, Scotland, from March 12 to 14.

(Conference Paper)

  • “DecluttAR: An Interactive Visual Clutter Dimming System to Help Focus on Work”,
    Kaito Yokoro, Monica Perusquía-Hernández, Naoya Isoyama, Hideaki Uchiyama, and Kiyoshi Kiyokawa
    (An AR system that enhances work concentration by enabling users to adjust the appearance of individual objects on their desk)
  • “LocatAR: An AR Object Search Assistance System for a Shared Space”,
    Hiroto Oshimi, Monica Perusquía-Hernández, Naoya Isoyama, Hideaki Uchiyama, and Kiyoshi Kiyokawa
    (An AR system that tracks the movement of objects in a collaborative workspace and supports object-finding while respecting privacy)

We will also give five presentations at the upcoming IEEE VR 2023 conference, which will be held both in person in Shanghai and online from March 25 to 29. IEEE VR is the top conference in the field of virtual reality; only the top 10.0% of submitted papers are published in the top journal, IEEE TVCG, and the overall acceptance rate, including papers published in the conference proceedings, is 21.4%. We will give two oral presentations: one as a TVCG paper (by Mr. Yan, an alumnus, and Ms. Hu (D2)) and one as a conference paper (by Mr. Otono (M2)). In addition, we will present two posters (by Mr. Yang, an alumnus, and Ms. Otsuka (M2)) and a research demo (by Mr. Yan and Ms. Hu).

(TVCG Paper)

  • “Add-on Occlusion: Turning Off-the-Shelf Optical See-through Head-mounted Displays Occlusion-capable”,
    Yan Zhang, Xiaodan Hu, Kiyoshi Kiyokawa, and Xubo Yang (collaboration with Shanghai Jiao Tong University)
    (Research on an attachment module for supporting per-pixel occlusion on existing optical see-through AR glasses, such as HoloLens)

(Conference Paper)

  • “I’m Transforming! Effects of Visual Transitions to Change of Avatar on the Sense of Embodiment in AR”,
    Riku Otono, Adélaïde Genay, Monica Perusquía-Hernández, Naoya Isoyama, Hideaki Uchiyama, Martin Hachet, Anatole Lécuyer, and Kiyoshi Kiyokawa (collaboration with INRIA, France)
    (A study investigating how gradually transforming the user’s own body into an avatar through voluntary actions enhances the sense of embodiment)

(Poster)

  • “A Palm-Through Interaction Technique for Controlling IoT Devices”,
    Zhengchang Yang, Naoya Isoyama, Nobuchika Sakata, and Kiyoshi Kiyokawa
    (A user interface that treats a palm held up in the air as a touch panel for controlling distant IoT devices)
  • “An AR Visualization System for 3D Carbon Dioxide Concentration Measurement Using Fixed and Mobile Sensors”,
    Maho Otsuka, Monica Perusquía-Hernández, Naoya Isoyama, Hideaki Uchiyama, and Kiyoshi Kiyokawa
    (An AR system that automatically recognizes the placement of CO2 sensors installed indoors and visualizes the distribution of CO2 concentration)

(Research Demo)

  • “Add-on Occlusion: Building an Occlusion-capable Optical See-through Head-mounted Display with HoloLens 1”,
    Yan Zhang, Xiaodan Hu, Kiyoshi Kiyokawa, and Xubo Yang (collaboration with Shanghai Jiao Tong University)
    (Research demo of an attachment module for supporting per-pixel occlusion on HoloLens 1)