This week, we have a robot that swims like a jellyfish, robots helping nurses (it sounds like they’re doing quite well!), smart glasses that might one day be better than prescription ones, FaceTime using AR to make it look like you’re looking at the camera, and more!
They are “seeking additional information pertaining to VR technology that could be used to train US forces for potential nuclear combat”.
The Oculus Rift accounts for 44.12% of VR headset use on Steam, while the HTC Vive has 42.42%.
This group is using the Quest for a local multiplayer house-scale VR experience.
They “have actually been conducting practical research on these technologies from the very beginning”.
Then they added them back again, so they’re there if you want to make accessories and mods for the headset!
When your caller is looking at their screen rather than their camera, AR will adjust their eyes so it looks like they’re looking directly at the camera!
The three trends VRScout sees are “getting kids and families moving”, “crafting new worlds” and “augmented board games”.
The Looking Glass Factory is making some pretty fascinating strides in the land of AR.
These “autofocals” can restore proper vision in people who would otherwise need traditional glasses.
“DeepNude” was an awful app idea that I guess someone was bound to build eventually. It’s offline now, and even Vice’s original reporting of it was questionable. The big question — how do we prevent malicious AI like this? Irresponsible developers will always fall through the cracks.
The drones drew a collection of 100 crowd-sourced line drawings onto this big mural.
In a paper, a team of researchers propose an approach that enables an AI model to classify music in genres it’s never encountered before.
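The usual zero-shot trick is to map both audio clips and genre labels into a shared embedding space, then give each clip the label whose embedding is nearest — even for genres with no training examples. Here’s a toy sketch of that idea; the vectors, genre names and `cosine` helper are all illustrative, not the paper’s actual model:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Illustrative label embeddings. In a real system these would come from a
# learned encoder, which can embed genre labels never seen during training.
genre_embeddings = {
    "rock":      [0.9, 0.1, 0.0],
    "classical": [0.0, 0.2, 0.9],
    "jazz":      [0.3, 0.8, 0.2],  # pretend "jazz" had no training examples
}

def classify(clip_embedding, labels=genre_embeddings):
    """Pick the genre whose embedding is closest to the clip's embedding."""
    return max(labels, key=lambda g: cosine(clip_embedding, labels[g]))

# A clip whose (made-up) audio embedding sits near the "jazz" vector gets
# labelled "jazz" even though that genre was "unseen".
print(classify([0.25, 0.85, 0.15]))  # → jazz
```

The point is that nothing in `classify` depends on having seen a genre before — add a new label embedding and it becomes a valid prediction immediately.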
Great to see emerging tech used to try to help with our society’s health.
“When tested against Beatles songs where the author is known, the system achieved an accuracy of 76 per cent.”
Moxi is programmed to run errands around the hospital and has done well with taking some things off nurses’ plates, and patients have really loved it too.
“Researchers have developed a tool that predicts with 83% accuracy the likelihood you’ll return something in your shopping cart.”
“Russia wants to use small drones as offensive tools, not just for reconnaissance”.
It mimics the movement of jellyfish through water and looks pretty cool.
Using 300,000 images of historic and modern Stockholm, it combines them into a really neatly transitioned video — definitely worth watching!
In summary — it’s faster!
Adafruit ported TensorFlow Lite for Microcontrollers to the Arduino IDE!
“IBM’s Hypertaste project seeks to build an artificial tongue that’s capable of analysing liquids using a combination of electrodes and machine learning”.
Researchers at Udacity propose “a machine learning system that generates lecture videos from audio narration alone”.
“To researchers’ surprise, deep learning vision algorithms often fail at classifying images because they mostly take cues from textures, not shapes.”
This is definitely on the stranger end of robots we’ve seen in this newsletter.
Amazon’s answers to a US senator were pretty upfront and honest, but the answers themselves aren’t ideal.
Social robots in pediatric units at hospitals “can lead to more positive emotions in sick children” — that’s a good use of tech!
VentureBeat spoke with Amazon’s head of customer experience, Nathan Smith, about how they’ve been working to improve the Alexa experience.
The micro:bit breakout board is a pretty neat concept, made even neater when placed into a Wii-inspired nunchuk!
This July marks the fifth anniversary of the launch of the Raspberry Pi Model B+ form factor, which we’re all super used to by this point!
Here’s the latest for building the visuals to go with your Alexa skill.
According to Hackster, you might indeed want to add a fan to the new Pi to keep the temperature down.
You can “just wave your hand above the central area, and this shield can sense your movements”.
Learn to build for the Amazon Echo, Google Home, Facebook Messenger, Slack & more!