Monday, November 1, 2021

This Might Be Important: Facebook Wants To Be The AI of the Metaverse (FB)

On December 1 the new stock symbol will be MVRS.

When Google Glass was released, the privacy peeps took center stage, arguing against people filming everything they looked at. They were right, but that is just a surface issue. And we are seeing it again with the introduction of Facebook's (now Meta's) Ray-Ban® Stories "smart" glasses last month.

Ireland's Data Protection Commission issued a statement voicing some of its concerns, which is a good thing considering Ireland's position in the big-tech ecosphere. But the Commission is limited in the scope of what it can object to, and there are far more objectionable ramifications of the Ray-Ban® Stories than whether the LED light used to notify people they are being filmed is large enough.

The glasses are being pitched to the retail user as a hands-free way to record the moments of your life, la-di-da, while what Facebook sees is a way to enlist millions of people to collect data for Zuckerberg.

Think of it as the Google Street View cars collecting geospatial information combined with mobile facial recognition cameras combined with machine-learning training material combined with.....

In the meantime the metaverse is seen by some as electronic bread-and-circuses: the real world sucks, so here, try our digital crack. Which is probably true but barely scratches the surface of what's going on.

Here is Tiernan Ray (no relation to Ray-Ban®) with more:

Facebook has gathered thousands of hours of first-person video in order to develop neural networks that operate more capably with data seen from a first-person point of view.

[Image: a before-and-after comparison from a Detectron model. Source: Facebook AI]

To operate in augmented and virtual reality, Facebook believes artificial intelligence will need to develop an "egocentric perspective." 

To that end, the company on Thursday announced Ego4D, a data set of 2,792 hours of first-person video, and a set of benchmark tests for neural nets, designed to encourage the development of AI that is savvier about what it's like to move through virtual worlds from a first-person perspective. 
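
To make "a data set of 2,792 hours of first-person video" a bit more concrete, here is a minimal sketch of how such a corpus might be consumed for training. The file layout, the annotations.json schema, and the EgocentricClips name are hypothetical illustrations, not Ego4D's actual format; only torchvision's read_video is a real API.

```python
# A sketch of consuming a first-person video corpus for training.
# NOTE: the directory layout and annotation fields below are assumptions
# made for illustration, not Ego4D's published format.
import json
from pathlib import Path

from torch.utils.data import Dataset
from torchvision.io import read_video


class EgocentricClips(Dataset):
    """Yields fixed-length first-person video clips with a text narration."""

    def __init__(self, root: str, clip_seconds: float = 2.0):
        self.root = Path(root)
        # Hypothetical annotation file: one record per narrated moment, e.g.
        # {"video": "a.mp4", "t": 12.5, "narration": "C opens the fridge"}
        self.records = json.loads((self.root / "annotations.json").read_text())
        self.clip_seconds = clip_seconds

    def __len__(self) -> int:
        return len(self.records)

    def __getitem__(self, i: int):
        rec = self.records[i]
        # Cut a short clip centered on the narrated moment.
        start = max(0.0, rec["t"] - self.clip_seconds / 2)
        frames, _, _ = read_video(
            str(self.root / rec["video"]),
            start_pts=start,
            end_pts=start + self.clip_seconds,
            pts_unit="sec",
        )  # frames: (T, H, W, C) uint8
        # Reorder to (C, T, H, W) float in [0, 1], the layout most video
        # models (e.g. torchvision's r3d_18) expect.
        clip = frames.permute(3, 0, 1, 2).float() / 255.0
        return clip, rec["narration"]
```

The benchmark tasks then amount to training and scoring models on slices of exactly this kind of (clip, annotation) data.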

The project is a collaboration between Facebook Reality Labs and scholars from 13 institutions, including universities and research labs. The details are laid out in a paper lead-authored by Facebook's Kristen Grauman, "Ego4D: Around the World in 2.8K Hours of Egocentric Video."

Grauman is a scientist with the company's Facebook AI Research unit. Her work as a professor at UT Austin has focused on computer vision, machine learning, and related topics.

The idea is that the data set will propel researchers to develop neural nets that excel at performing tasks from a first-person perspective -- in the same way that big datasets such as ImageNet propelled existing AI programs from a "spectator" perspective.

The point of egocentric perception is to try to fix the problems a neural network has with basic tasks, such as image recognition when the point of view of an image shifts from third-person to first-person, said Facebook. 


Most image recognition systems that can detect objects seen from the sidelines have high failure rates if the object is presented from the point of view of a person encountering the object.
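
That gap is easy to see for yourself. Below is a minimal sketch, using torchvision's off-the-shelf Faster R-CNN (my choice of detector; the article doesn't name one), that scores the same object photographed from across the room and from the wearer's point of view. The two image filenames are placeholders.

```python
# Compare a detector's confidence on "spectator" vs. egocentric views of
# the same object. Uses torchvision's pretrained Faster R-CNN.
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    FasterRCNN_ResNet50_FPN_Weights,
    fasterrcnn_resnet50_fpn,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
preprocess = weights.transforms()          # resize/normalize preset
labels = weights.meta["categories"]        # COCO class names


def top_detection(path: str):
    """Return the detector's best (label, score) for one image."""
    img = preprocess(read_image(path))
    with torch.no_grad():
        out = model([img])[0]              # detections sorted by score
    if len(out["scores"]) == 0:
        return None, 0.0
    return labels[out["labels"][0].item()], out["scores"][0].item()


# Placeholder pair: the same cup shot from across the room ("spectator")
# and from the point of view of the person picking it up (egocentric).
for path in ["cup_third_person.jpg", "cup_first_person.jpg"]:
    label, score = top_detection(path)
    print(f"{path}: {label} ({score:.2f})")
```

The first-person shot, with its odd angle, motion blur, and hands in the frame, is where the score tends to collapse; that is the failure mode Ego4D's benchmarks are meant to measure.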

The Ego4D initiative specifically targets the Metaverse, the coming world of immersive social networking that Facebook CEO Mark Zuckerberg discussed at the company's last earnings report.

"These benchmarks will catalyze research on the building blocks necessary to develop smarter AI assistants that can understand and interact not just in the real world but also in the metaverse, where physical reality, AR, and VR all come together in a single space," said Facebook....

....MUCH MORE

We'll have more next week.