SINLab: XR - human motion from a robot arm?

This comprises at least 3 distinct thesis topics.

What is Extended Reality?

Extended Reality (XR) has no single interpretation. It can mean increasing realism by bringing the real world into the Metaverse (or a Metaverse, depending on your point of view). This benefits everybody who wants to share content - the media industry, cultural historians, archaeologists, the tourism industry - and everybody who wants to explore data in a more appealing form.

But XR can also mean that people are enabled to interact physically with a remote space and (important for this thesis) with other people in remote places. This interpretation of XR places their consciousness in a remote space in the real world, where actions have real consequences. By acting on the remote side through a robot as your "avatar", you can achieve actual physical change. You can actually break things.

This view extends to all forms of remote interaction with an unmodelled remote environment. It is important because even today, people cannot be replaced by autonomous entities for everything we do. Human knowledge and understanding allow people to act creatively and flexibly based on a real-time assessment of the remote situation. When actions affect humans on the remote side, people's emotional involvement matters. And sometimes people simply don't want to be replaced by autonomous entities.

Scenarios for this include search-and-rescue operations, risk assessment (earthquake sites, ...), remote exploration (subsea, caves, ...), education including physical feedback (artisanal skills involving accurate kinesthetics or other senses like touch, smell and hearing), remote healthcare that requires physical interaction between patient and carer (stroke rehabilitation and various forms of physiotherapy), and so on.

This thesis is concerned with the second interpretation of XR.

We want to explore some very basic questions about the interaction between people using VR equipment and a robot to allow people to interact with each other across long distances.

Environment

The research for this thesis is conducted in the SINLab at IFI. Our robot uses the UR10e arm with a 5-finger Shadow Hand. Our controls are (for now) the Vive Tracker for arm motion, the Leap Motion (version 1) for hand motion and the UltraHaptics emitter for touch feedback.
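To give a concrete impression of what controlling the arm from the tracker involves, here is a minimal sketch of one core step in such a pipeline: expressing a wrist pose reported by the tracker in the robot's base frame via a fixed calibration transform. The frame names and calibration values are hypothetical illustrations, not the actual SINLab setup.

```python
def quat_to_rot(w, x, y, z):
    # Convert a unit quaternion to a 3x3 rotation matrix.
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]

def make_pose(rot, pos):
    # Build a 4x4 homogeneous transform from rotation and translation.
    return [rot[i] + [pos[i]] for i in range(3)] + [[0.0, 0.0, 0.0, 1.0]]

def compose(a, b):
    # 4x4 matrix product a @ b (transform b expressed in a's frame).
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Hypothetical calibration: the tracking frame sits 1 m in front of the
# robot base, rotated 180 degrees about the vertical axis.
T_base_track = make_pose(quat_to_rot(0.0, 0.0, 0.0, 1.0), [1.0, 0.0, 0.0])

# A wrist pose reported by the tracker (identity orientation, 20 cm up).
T_track_wrist = make_pose(quat_to_rot(1.0, 0.0, 0.0, 0.0), [0.0, 0.0, 0.2])

# Target pose for the robot's tool centre point, in the base frame.
T_base_wrist = compose(T_base_track, T_track_wrist)
print([round(T_base_wrist[i][3], 3) for i in range(3)])
```

In a real system, the resulting base-frame pose would be streamed to the arm's real-time control interface, and the calibration transform would come from a measurement procedure rather than fixed values.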

 

Specific question for this thesis

In our scenario, one person (the controller) moves their arm and hand to control the motion of a robot arm and hand on the remote side. We dream of a robot arm that mirrors the natural motion of the controller in an entirely human-like fashion.

The other person (the partner) would, in an ideal world, react to this robot arm motion in the same way as to a human motion. However, that will not happen any time soon, and we want to explore how the situation can be improved without building an android fit for a science-fiction movie.

  • The robot arm is (today) disembodied. The controller's body, apart from the arm, is at best represented on a video screen. Of course, this will not remain so in the future, but we need to understand whether it matters for the partner's natural reaction.
  • The robot arm is made mostly of metal. The hand alone weighs 4.5 kg, which is quite a bit more than a human hand. It is an inherently dangerous object that has no subconscious controls and very limited awareness of its surroundings. How do its movement pattern and movement speed influence the visible reactions and conscious reflection of the partner?
  • The joints of the robot arm are huge and not in the same places as a human's joints. How should the arm move to be perceived (by the partner) as appropriate for the physical avatar of the controller? Should the motion follow the optimal robotic motion path? Or should it move in curves that emulate the controller's upper arm, lower arm and hand pose in a more natural manner? Can you observe that the partner reacts in different ways to these alternative motion paths?
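One candidate for the "more natural" motion in the last question is to retime the arm's movement along a minimum-jerk profile, a classic model of human reaching motion (Flash and Hogan, 1985). The sketch below interpolates between two wrist positions with that profile instead of a constant-velocity ramp; the function names are our own illustration, not part of any robot API.

```python
def min_jerk(t, T):
    # Normalised minimum-jerk position profile: s(0) = 0, s(T) = 1,
    # with zero velocity and zero acceleration at both endpoints.
    s = t / T
    return 10*s**3 - 15*s**4 + 6*s**5

def retime(start, goal, t, T):
    # Interpolate each coordinate along the minimum-jerk profile
    # rather than the constant-velocity ramp of a typical planner.
    a = min_jerk(t, T)
    return [p + a * (q - p) for p, q in zip(start, goal)]

# Move the wrist 40 cm in x over 2 seconds. The profile is symmetric,
# so the midpoint in time lands at half the distance, but velocity
# peaks there instead of being constant.
print(retime([0.0, 0.0, 0.0], [0.4, 0.0, 0.0], 1.0, 2.0))
```

Whether such a profile actually changes how the partner perceives and reacts to the arm is exactly the kind of question a user study in this thesis could answer.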

Learning outcome

Experience in

  • formulating, investigating and answering research questions
  • controlling an advanced robot setup (30 joints) in real-time
  • conducting, evaluating and interpreting user studies

Conditions

We expect that you:

  • have been admitted to a master's program in MatNat@UiO - primarily PROSA
  • take this as a long thesis
  • will participate actively in the weekly SINLab meetings
  • are present in the lab and collaborate with other students and staff
  • are interested in and have some knowledge of C/C++ programming
  • are interested in conducting user studies
  • are willing to share your results on GitHub
  • include the course IN5060 in the study plan, unless you have already completed a course on classical (non-ML) data analysis


Published 29 Aug. 2023 10:44 - Last modified 3 Oct. 2023 09:13

Supervisor(s)

Scope (ECTS credits)

60