Computer engineers at Duke University have developed a program called EyeSyn. This program will help metaverse platforms train the applications being developed for them.
The human eye is a complex organ and one of our most important tools for interacting with the world. It is therefore imperative that metaverse applications mimic it as accurately as possible, covering not just the movement of the eyes but also the dilation of the pupils in response to changes in our surroundings.
Previously, however, training such applications required people to physically wear headsets. "But training that kind of algorithm requires data from hundreds of people wearing headsets for hours at a time," said Maria Gorlatova, Nortel Networks Assistant Professor of Electrical and Computer Engineering at Duke. "We wanted to develop software that not only reduces the privacy concerns that come with gathering this sort of data, but also allows smaller companies who don't have those levels of resources to get into the metaverse game."
Instead of collecting data from real eyes, the engineers at Duke have developed a set of virtual eyes that simulate how human eyes move and respond. The simulation is accurate enough for companies to train their applications on, and the results were accepted by the International Conference on Information Processing in Sensor Networks (IPSN), held May 4-6, 2022, the first annual Network Sensing and Control Forum.
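To give a rough sense of the idea, the sketch below generates a toy synthetic gaze trace: fixations (small jitter around a point) interrupted by saccades (rapid jumps to a new target). This is purely a hypothetical illustration of what synthetic eye-movement data might look like; it is not EyeSyn's actual model, which is far more sophisticated.

```python
import random

def synthetic_gaze_trace(n_samples=100, seed=0):
    """Generate a toy gaze trace in normalized screen coordinates.

    Fixations: small Gaussian jitter around the current target.
    Saccades: occasional large jumps to a new random target.
    Illustrative only -- not the method used by EyeSyn.
    """
    rng = random.Random(seed)
    x, y = 0.5, 0.5  # start at screen center
    trace = []
    for _ in range(n_samples):
        if rng.random() < 0.05:
            # saccade: jump to a new target on the screen
            x, y = rng.random(), rng.random()
        else:
            # fixation: small drift around the current target
            x += rng.gauss(0, 0.005)
            y += rng.gauss(0, 0.005)
        # clamp to screen bounds
        x = min(max(x, 0.0), 1.0)
        y = min(max(y, 0.0), 1.0)
        trace.append((x, y))
    return trace

trace = synthetic_gaze_trace()
print(len(trace))
```

A pipeline built on this idea would feed many such traces, generated under varied simulated conditions, to an eye-tracking algorithm in place of data recorded from human volunteers.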
Eyes reveal a great deal of information that companies can use to tailor their content. In the physical world, when we interact with other people, the eyes give indications of what the other person is thinking. For example, a bad liar's or a poor poker player's eyes can give away what they are thinking. Other emotions, such as pain, excitement, and sadness, also affect our eyes. It is clear, then, that companies would want to use this information in the metaverse for purposes such as marketing.