Source: PORTO — Publications Open Repository TOrino
- Lamberti, Fabrizio; Paravati, Gianluca; Gatteschi, Valentina; Cannavò, Alberto; Montuschi, Paolo, Virtual character animation based on affordable motion capture and reconfigurable tangible interfaces. IEEE Transactions On Visualization And Computer Graphics, IEEE, pag. 1-14, in press, ISSN: 1077-2626, doi: 10.1109/TVCG.2017.2690433. Software for computer animation is generally characterized by a steep learning curve, due to the entanglement of the sophisticated techniques and interaction methods required to control 3D geometries. This paper proposes a tool designed to support computer animation production processes by leveraging the affordances offered by articulated tangible user interfaces and motion capture retargeting solutions. To this aim, the orientations of an instrumented prop are recorded together with the animator's motion in 3D space and used to quickly pose characters in the virtual environment. High-level functionalities of the animation software are made accessible via a speech interface, letting the user control the animation pipeline with voice commands while focusing on his or her hand and body motion. The proposed solution exploits both off-the-shelf hardware components (like the Lego Mindstorms EV3 bricks and the Microsoft Kinect, used for building the tangible device and tracking the animator's skeleton) and free open-source software (like the Blender animation tool), thus representing an interesting solution also for beginners approaching the world of digital animation for the first time. Experimental results in different usage scenarios show the benefits offered by the designed interaction strategy with respect to a mouse & keyboard-based interface for both expert and non-expert users.
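The retargeting step described in the abstract above — mapping a tracked prop or joint orientation onto a virtual character bone — can be sketched in a few lines of quaternion algebra. This is a minimal illustration, not the paper's actual Kinect/Blender pipeline; all function names are assumptions.

```python
# Minimal sketch of orientation retargeting: express a tracked orientation
# relative to a bone's rest pose, yielding the local rotation to apply to
# the virtual bone. Quaternions are (w, x, y, z) tuples.

def q_mul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_conj(q):
    """Conjugate (inverse for unit quaternions)."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def retarget(tracked_q, rest_q):
    """Local bone rotation = inverse(rest pose) * tracked orientation."""
    return q_mul(q_conj(rest_q), tracked_q)

identity = (1.0, 0.0, 0.0, 0.0)
# A bone whose rest pose equals the tracked pose needs no rotation:
print(retarget(identity, identity))  # → (1.0, 0.0, 0.0, 0.0)
```

In a real pipeline the result would be written into the animation tool's pose channels (in Blender, a pose bone's `rotation_quaternion`) once per captured frame.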
- Lamberti, Fabrizio; Paravati, Gianluca; Gatteschi, Valentina; Cannavò, Alberto, Supporting Web analytics by aggregating user interaction data from heterogeneous devices using viewport-DOM based heat maps. IEEE Transactions On Industrial Informatics, IEEE, pag. 1989-1999, 2017, ISSN: 1551-3203, doi: 10.1109/TII.2017.2658663. The players of the digital industry look at network Big Data as an incredible source of revenues, which can allow them to design products, services and market strategies ever more tailored to users' interests and needs. This is the case of data collected by Web analytics tools, which describe the way users interact with Web contents and where their attention focuses during navigation. Given the complexity of the information to analyze, existing tools often make use of visualization strategies to represent data aggregated across separate sessions and multiple users. In particular, heat maps are often adopted to study the distribution of mouse activity and identify page regions that are more frequently reached during interaction. Unfortunately, since Web contents are accessed via ever more heterogeneous devices, region-based heat maps can no longer be exploited to aggregate data concerning the user's attention, since the same Web content may move to another page location or exhibit a different aspect depending on the access device used or the user agent setup. This paper presents the design of a visual analytics framework capable of dealing with the above limitation by adopting a data collection approach that combines information about the regions displayed with information about the page structure. This way, the well-known heat map-based visualization can be produced, where interactions are aggregated on a per-element basis independently of the specific access configuration. Experimental results showed that the framework succeeds in accurately quantifying the user's attention and replicating results obtained by manual processing.
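The core idea of the paper above — aggregating interactions per DOM element rather than per screen region, so that records from different viewports land on the same element — can be sketched as follows. The event shape and selectors are illustrative assumptions, not the framework's actual data model.

```python
# Hedged sketch of per-element interaction aggregation. Each record is
# keyed by a DOM selector instead of page coordinates, so the same
# element accumulates hits even when its on-screen position differs
# across devices and viewport sizes.
from collections import Counter

def aggregate(events):
    """events: iterable of (device, dom_selector) interaction records.
    Returns a heat value (interaction count) per DOM element."""
    return dict(Counter(selector for _device, selector in events))

events = [
    ("desktop", "#nav .search"),
    ("mobile",  "#nav .search"),   # same element, different layout
    ("tablet",  "article p:nth-of-type(1)"),
]
print(aggregate(events))  # → {'#nav .search': 2, 'article p:nth-of-type(1)': 1}
```

A region-based heat map would have scattered the first two events across two different coordinate bins; keying on the element unifies them.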
- Bonaiuto, Stefano; Cannavò, Alberto; Piumatti, Giovanni; Paravati, Gianluca; Lamberti, Fabrizio, Tele-operation of robot teams: a comparison of gamepad-, mobile device- and hand tracking-based user interfaces. In: Proc. 4th IEEE International COMPSAC Workshop on User Centered Design and Adaptive Systems (UCDAS 2017), IEEE, pag. 1-6, in press.
- Cannavò, Alberto; Lamberti, Fabrizio, User interaction feedback in a hand-controlled interface for robot team tele-operation using wearable augmented reality. In: Proc. Smart Tools and Applications in Graphics 2017 (STAG 2017), Eurographics Association, pag. 1-10, in press. Continuous advancements in the field of robotics and its increasing spread across heterogeneous application scenarios make the development of ever more effective user interfaces for human-robot interaction (HRI) an extremely relevant research topic. In particular, Natural User Interfaces, e.g., based on hand and body gestures, proved to be an interesting technology to exploit for designing intuitive interaction paradigms in the field of HRI. However, the more sophisticated HRI interfaces become, the more important it is to provide users with accurate feedback about the state of the robot as well as of the interface itself. In this work, an Augmented Reality (AR)-based interface is deployed on a head-mounted display to enable tele-operation of a remote robot team using hand movements and gestures. A user study is performed to assess the advantages of wearable AR compared to desktop-based AR in the execution of specific tasks.
- Cannavò, Alberto; Pratticò, Filippo Gabriele; Ministeri, Giuseppe; Lamberti, Fabrizio, A movement analysis system based on immersive virtual reality and wearable technology for sport training. In: Proc. 4th International Conference on Virtual Reality (ICVR2018), ACM, pag. 1-6, in press. The use of virtual reality (VR) is widespread in a growing number of application domains. Continuous technological advancements in the field of computer graphics have made VR an interesting tool for learning purposes, especially in sport. Examples can be found in different sports such as rugby, baseball, soccer, golf, etc. This paper presents a VR-based training system that can be used as a self-learning tool to improve the execution of a given technical gesture. In particular, the basketball free throw gesture is considered. To assess the usefulness of the proposed system, experimental tests were carried out in a small-scale setup involving 18 non-skilled volunteers. Results demonstrated that the designed system can improve the execution of the considered gesture in terms of both timing and spatial positioning compared to an alternative technique based on video projection.
- Attanasio, Giuseppe; Cannavò, Alberto; Cibrario, Francesca; Lamberti, Fabrizio; Montuschi, Paolo; Paravati, Gianluca, HOT: Hold your own tools for AR-based constructive art. In: Proc. IEEE Symposium on 3D User Interfaces 2017 (IEEE 3DUI 2017), IEEE, pag. 256-257, 2017, ISBN: 978-150906716-9, doi: 10.1109/3DUI.2017.7893369. Using digital instruments to support artistic expression and creativity is a hot topic. In this work, we focused on the design of a suitable interface for Augmented Reality-based constructive art on handheld devices. Issues to be faced encompassed how to give artists a sense of spatial dimensions, how to provide them with different tools for realizing artworks, and how far to move away from "the real" towards "the virtual". Through a touch-capable device, such as a smartphone or a tablet, we offer artists a clean workspace, where they can decide when to introduce artworks and tools. In fact, besides exploiting the multi-touch functionality and the gyroscopes/accelerometers to manipulate artworks in six degrees of freedom (6DOF), the proposed solution exploits a set of printed markers that can be brought into the camera's field of view to make specific virtual tools appear in the augmented scene. With such tools, artists can control, e.g., manipulation speed, scale factor, scene parameters, etc., thus complementing the functionalities that can be accessed via the device's screen.
- Cancedda, Laura; Cannavò, Alberto; Garofalo, Giuseppe; Lamberti, Fabrizio; Montuschi, Paolo; Paravati, Gianluca, Mixed reality-based user interaction feedback for a hand-controlled interface targeted to robot teleoperation. In: Proc. 4th International Conference on Augmented Reality, Virtual Reality, and Computer Graphics (AVR 2017), Springer, vol. 10325, pag. 447-463, 2017, ISBN: 978-331960927-0, doi: 10.1007/978-3-319-60928-7_38. The continuous progress in the field of robotics and the diffusion of its application scenarios in today's world make human interaction and communication with robots an aspect of fundamental importance. The development of interfaces based on natural interaction paradigms is an increasingly captivating topic in Human-Robot Interaction (HRI), due to their intrinsic capabilities in providing ever more intuitive and effective control modalities. Teleoperation systems require handling a non-negligible amount of information coming from on-board sensors as well as input devices, thus increasing the workload of remote users. This paper presents the design of a 3D User Interface (3DUI) for the control of teleoperated robotic platforms aimed at increasing interaction efficiency. A hand gesture-driven controller is used as the input modality to naturally map the position and gestures of the user's hand to suitable commands for controlling the platform components. The designed interface leverages mixed reality to provide visual feedback on the control commands issued by the user. The visualization of the 3DUI is superimposed on the video stream provided by an on-board camera. A user study confirmed that the proposed solution is able to improve interaction efficiency by significantly reducing the completion time for tasks assigned in a remote reach-and-pick scenario.
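The hand-to-command mapping described in the entry above can be illustrated with a toy dispatcher that turns a tracked hand offset and a grab gesture into discrete platform commands. The function name, command vocabulary, and thresholds are all assumptions for illustration, not the paper's actual values.

```python
# Illustrative mapping from a tracked hand to reach-and-pick commands,
# in the spirit of the gesture-driven controller described above.

def hand_to_command(x, y, grab, dead_zone=0.1):
    """Map a normalized hand offset (x, y in [-1, 1]) and a boolean
    grab gesture to a discrete command string."""
    if grab:
        return "pick"                       # closed hand: grasp target
    if abs(x) < dead_zone and abs(y) < dead_zone:
        return "hold"                       # hand near rest: stand still
    if abs(x) > abs(y):
        return "right" if x > 0 else "left"
    return "forward" if y > 0 else "backward"

print(hand_to_command(0.0, 0.0, False))   # → hold
print(hand_to_command(0.6, 0.1, False))   # → right
print(hand_to_command(0.2, -0.5, True))   # → pick
```

A mixed-reality feedback layer, as in the paper, would render the currently active command next to the hand cursor so the operator can confirm the interface's interpretation before the robot moves.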
- Cannavò, Alberto; Cermelli, Fabio; Chiaramida, Vincenzo; Ciccone, Giovanni; Lamberti, Fabrizio; Montuschi, Paolo; Paravati, Gianluca, T4T: Tangible Interface for Tuning 3D object manipulation tools. In: Proc. IEEE Symposium on 3D User Interfaces 2017 (IEEE 3DUI 2017), IEEE, pag. 266-267, 2017, ISBN: 978-150906716-9, doi: 10.1109/3DUI.2017.7893374. A 3D User Interface for manipulating virtual objects in Augmented Reality scenarios on handheld devices is presented. The proposed solution takes advantage of two interaction techniques. The former (named "cursor mode") exploits a cursor whose position and movement are bound to the view of the device; the cursor allows the user to select objects and to perform coarse-grain manipulations by moving the device. The latter (referred to as "tuning mode") uses the physical affordances of a tangible interface to provide the user with the possibility to refine objects in all their aspects (position, rotation, scale, color, and so forth) with fine-grained control.