This week, the Oculus Quest hand tracking SDK was released, Nreal is standing up to Magic Leap, Oculus talks about a festive tracking challenge and more! Merry Christmas everyone!
Unity’s XR Interaction Toolkit helps devs avoid reinventing the wheel for simple VR and AR interactions.
A great list for those getting their headsets (or just a bit of much-needed spare time) over the holidays to experience some VR! VRScout has a holiday gift guide too.
The Oculus Quest now has a dynamic fixed foveated rendering (FFR) feature, which devs can use instead of manually setting the FFR level.
It turns out Christmas lights can be a bit tough for Oculus’ controller tracking. Here’s how they sorted it out!
Devs can start integrating the new hand tracking into their apps.
SkarredGhost has a look at how to enable that very hand tracking so you can try it out. Oddly enough, it’s not on my headset yet…
“In the spirit of Mozilla’s Privacy Not Included guidelines, you might be wondering: what personal information is Oculus collecting while you use your device?”
This 16-foot fiber-optic cable is supposed to provide the ideal PC VR experience on Quest.
The foot controller also has a developer SDK on the way.
This fantastic VR experience is now up on the Oculus Quest!
Chi Xu, the founder of Nreal, is formally and firmly calling for Magic Leap’s lawsuit against his company to be dismissed.
Here’s what some in the industry are thinking right now.
It turns out the HoloLens 2 is having display issues… hopefully Microsoft sorts them out soon!
To be more accurate, over half contain AR features (they’re not fully AR apps). It’s still a great sign for the industry!
“Machine learning has a privacy problem, but techniques like differential privacy, federated learning, and homomorphic encryption might offer a solution”.
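The differential privacy technique mentioned in that piece can be sketched in a few lines: clip each record to a known range, then add Laplace noise calibrated to how much one record can move the result. This is a minimal toy illustration of the Laplace mechanism, not code from the article or any particular library:

```python
import math
import random


def private_mean(values, lower, upper, epsilon, rng):
    """Epsilon-differentially-private mean via the Laplace mechanism.

    Each value is clipped to [lower, upper], so one record can shift the
    mean of n values by at most (upper - lower) / n. Adding Laplace noise
    with scale sensitivity / epsilon then satisfies epsilon-DP.
    """
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clipped) / n
    sensitivity = (upper - lower) / n
    scale = sensitivity / epsilon
    # Inverse-CDF sample from Laplace(0, scale).
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_mean + noise


# Example: a private average age over a (made-up) small survey.
ages = [34, 29, 41, 52, 38, 27, 45, 31, 60, 36]
print(private_mean(ages, lower=0, upper=100, epsilon=1.0, rng=random.Random(42)))
```

Smaller epsilon means more noise and stronger privacy; the other techniques the quote mentions (federated learning, homomorphic encryption) attack the problem at different layers entirely.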
“Researchers said it appears to be the first use of artificial intelligence to support an inauthentic social media campaign”.
“Researchers at MIT and IBM developed a sophisticated machine learning model that recommends documents based on their topics”.
“The machine learning community looks poised to tackle climate change, both through research and through responsible practices”.
The Cafe X Robot Coffee Bar is pretty neat! They’ve opened a new location in the San Jose airport.
The gig economy, chatbots, AI and more are on the horizon.
“Researchers hailing from Amazon and Cambridge propose a system that synthesizes singing from a data set of vocal performances”.
Uber AI Labs’ Plug and Play Language Model lets you plug one or more simple attribute models into a large, unconditional language model.
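The flavour of that "plug in an attribute model" idea can be shown with a toy: a tiny base model proposes next-token probabilities, and a bag-of-words attribute model reweights them toward a topic. Note this is only a caricature — the real PPLM steers generation by gradient-perturbing the transformer's hidden activations, not by reweighting logits — and the vocabulary and "space" word list here are invented for illustration:

```python
import random

# Toy "unconditional" base model: fixed next-token probabilities per token.
BASE = {
    "the": {"cat": 0.4, "dog": 0.4, "rocket": 0.2},
    "cat": {"sat": 0.6, "flew": 0.4},
    "dog": {"sat": 0.6, "flew": 0.4},
    "rocket": {"sat": 0.1, "flew": 0.9},
}

# Hypothetical bag-of-words "attribute model" for a space topic.
SPACE_WORDS = {"rocket", "flew"}


def steer(probs, attribute_words, strength=5.0):
    """Boost on-topic tokens by a multiplicative factor, then renormalize."""
    weighted = {w: p * (strength if w in attribute_words else 1.0)
                for w, p in probs.items()}
    total = sum(weighted.values())
    return {w: p / total for w, p in weighted.items()}


def generate(start, steps, rng, attribute_words=None):
    """Sample a short sequence, optionally steered by an attribute model."""
    token, out = start, [start]
    for _ in range(steps):
        probs = BASE.get(token)
        if probs is None:  # terminal token, no continuation defined
            break
        if attribute_words:
            probs = steer(probs, attribute_words)
        words = list(probs)
        token = rng.choices(words, weights=[probs[w] for w in words])[0]
        out.append(token)
    return out


print(generate("the", 2, random.Random(0)))
print(generate("the", 2, random.Random(0), attribute_words=SPACE_WORDS))
```

The appeal of the PPLM approach is the same as in this toy: the big base model stays frozen, and swapping topics only means swapping the small, cheap attribute model.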
There are a whole lot of smart home products out there with Alexa in them!
The revenue that Amazon earns from in-skill purchases hasn’t been quite as high as hoped just yet.
The firm’s first high-end speaker gets the thumbs up from The Inquirer.
“You can use Bleno, a Node.js module, to mock out a BLE peripheral and get started on the mobile software implementation right away”.
Learn to build for the Amazon Echo, Google Home, Facebook Messenger, Slack & more!