What you see is what you touch: scientists bring “active touch” to the metaverse

A team led by Professor Jiang Hanqing of Westlake University has proposed and developed what it describes as the world’s first “high-fidelity active mechanical haptic interaction system,” bringing a new dimension of tactile perception to the metaverse. The research results were recently published in Nature Machine Intelligence.

Curved-origami-based interactive devices at multiple scales, enabling two-way interaction between the virtual and physical worlds. Photo courtesy of the research group

To make the metaverse tangible, researchers have tried a variety of solutions, most of which create “passive touch” through vibration or pressure feedback: familiar examples include game controllers with vibration functions, vibration motors that attach to the skin, and hand exoskeletons driven by rigid linkages or cable structures. This kind of “touch” is a passive experience that originates from the device and is delivered to the user.

Jiang Hanqing’s team instead proposed the concept of “active touch.” Unlike the shoulders, chest, waist, and back, which usually receive passive touch, the hands and feet typically perceive the physical world by actively touching it. The team chose to start with “mechanical touch” (that is, stiffness, the soft or hard feel of objects) to simulate the sensation of hands and feet actively touching an object.

They developed a “high-fidelity active mechanical haptic interaction system” that uses origami modules of different materials and sizes to build interactive devices at two scales: a handheld device that produces localized touch sensations, and a foot-operated device that produces whole-body sensations. With the handheld device, users experience the softness or hardness of different virtual objects through active grasping; with the foot-operated device, users experience the ground characteristics of a virtual environment through active stepping during whole-body movement.

This active mechanical touch is realized through the deformation of curved origami structures inside the hardware, triggered by the user’s own actions during interaction. With the assistance of a motor, the curved origami can be bent to different angles, each producing a reaction force of a different magnitude and thereby giving the user different “elastic” feedback.
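The core control idea described above, that a motor sets the fold angle of a curved-origami module and the angle determines the reaction-force (stiffness) feedback the user feels, can be sketched in a few lines of Python. This is an illustrative model only, not the authors’ implementation: the calibration table, function names, and the angle-to-stiffness values are all hypothetical placeholders standing in for a real device calibration.

```python
import bisect

# Hypothetical calibration table for one origami module:
# (motor fold angle in degrees, effective stiffness in N/mm).
# A real system would measure this curve on the physical device.
CALIBRATION = [(10, 0.2), (30, 0.8), (60, 2.5), (90, 6.0), (120, 12.0)]

def angle_for_stiffness(target_k: float) -> float:
    """Choose the motor fold angle whose calibrated stiffness best
    matches the target virtual stiffness, interpolating linearly
    between calibration points and clamping at the table's ends."""
    angles = [a for a, _ in CALIBRATION]
    ks = [k for _, k in CALIBRATION]
    if target_k <= ks[0]:
        return angles[0]
    if target_k >= ks[-1]:
        return angles[-1]
    i = bisect.bisect_left(ks, target_k)
    a0, k0 = CALIBRATION[i - 1]
    a1, k1 = CALIBRATION[i]
    t = (target_k - k0) / (k1 - k0)  # linear interpolation fraction
    return a0 + t * (a1 - a0)

# A soft virtual object (e.g. cotton) maps to a small fold angle and a
# weak rebound; a stiff one (e.g. a wooden board) to a large angle.
print(angle_for_stiffness(0.5))  # 20.0
print(angle_for_stiffness(6.0))  # 90.0
```

The design choice worth noting is that the controller only sets geometry (the fold angle); the elastic rebound itself comes from the origami structure’s passive mechanics, which is what makes the touch “active” from the user’s perspective.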

Jiang Hanqing offered an analogy: fold two thin plastic sheets in half and interleave them into an “X” shape. When you press the assembly vertically, different forces and angles produce different “rebound” feedback in the hand. This change in touch is transmitted to the brain, which judges the softness or hardness: is it cotton, a wooden board, or a steel ball? Replace the hand with a foot, and the brain can likewise determine from the stiffness feedback whether a person is walking on a road, on grass, or stepping on ice. As a result, people in the virtual world of the metaverse can truly experience “what you see is what you touch, and where you go is where you tread.” (Source: Wen Caifei, China Science News)
