Google is working on radar technology that can automatically pause Netflix

Google has developed technology that can read human body movements to allow devices to “understand the social context around them” and make decisions.

Developed by Google’s Advanced Technology and Projects (ATAP) division in San Francisco, the technology takes the form of chips that could be embedded in TVs, phones and computers.

But instead of using cameras, the technology uses radar – radio waves that bounce off nearby objects to determine their distance or angle.

If built into future devices, the technology could turn off the TV when you nod off, or automatically pause Netflix when you leave the sofa.

Assisted by machine learning algorithms, the technology will allow devices to know when someone is approaching or entering their “personal space”.

Instead of using cameras, the technology uses radar – radio waves that bounce off nearby objects to determine their distance or angle

Google has introduced technology that can read people’s body movements to allow devices to “understand the social context around them” and make decisions, such as flashing up information as you pass or lowering the volume.

RADAR: HIGH FREQUENCY RADIO WAVES

Radar is an acronym for Radio Detection and Ranging.

It uses high-frequency radio waves and was first developed during World War II to assist fighter pilots.

It works in a simple way: a machine sends out a wave, and a separate sensor detects it when it bounces back.

This is almost the same way that vision works: light is reflected off an object and into the eye, where it is detected and processed.

But instead of visible light, which has a short wavelength, radar uses radio waves with a much longer wavelength.

By timing the waves that bounce back, a computer can build up a picture of what lies ahead, even if it is invisible to the human eye.

This can be used to see through different materials, in darkness, fog and different weather conditions.

Scientists often use this method to survey terrain, as well as to study archaeological sites and valuable finds.

As a non-invasive technique, it can be used to obtain an image without disturbing or damaging precious finds and monuments.
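The timing principle described above can be made concrete with a short sketch (illustrative only; the function and numbers are not from Google): a radar pulse travels out to the object and back, so the distance is half the round-trip time multiplied by the speed of light.

```python
# Radar ranging in a nutshell: a pulse covers the distance twice
# (out and back), so halve the round trip to get the range.
# Illustrative sketch only, not any real radar implementation.

SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance to a reflecting object, given the echo's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# An echo returning after 20 nanoseconds implies an object roughly 3 m away.
print(round(distance_from_echo(20e-9), 2))
```

The same arithmetic explains why radar electronics must be fast: at living-room distances the echo returns within tens of nanoseconds.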

The technology is described in a new video published by ATAP, part of a documentary series that presents the latest research and development.

The technology giant wants to create “socially intelligent devices” that are controlled by a wave of the hand or a turn of the head.

“As human beings, we understand each other intuitively – without saying a word,” said Leonardo Giusti, head of design at ATAP.

“We pick up on social cues, subtle gestures, that we innately understand and react to. What if computers understood us this way?”

Such devices would be powered by Soli, a small chip that emits radar waves to detect human movements, from a heartbeat to body motions.

Soli is already featured in Google products such as the second-generation Nest Hub smart display to detect movement, including the depth of a person’s breathing.

Soli was first featured in the 2019 Google Pixel 4 smartphone, enabling gesture controls such as skipping songs, snoozing alarms and silencing phone calls with a wave of the hand, although it was dropped from the following year’s Pixel 5.

The difference with the new technology is that Soli would work without consumers necessarily being aware of it, rather than requiring them to actively do something to trigger it.

If built into a smart TV, it could be used to make decisions such as lowering the volume when it detects we have fallen asleep – inferred from a tilted head position that suggests it is resting against the side of a chair or sofa.

At some point in the future, the technology may become advanced enough to capture “sub-millimetre movement”, allowing it to detect whether a person’s eyes are open or closed.

Other examples include a wall thermostat that automatically flashes up the weather forecast when users walk past, or a computer that silences a notification when it sees no one sitting at the desk, according to Wired.

Assisted by machine learning algorithms, the technology will allow devices to know when someone is approaching or entering their “personal space”

The technology could mean that a wall thermostat automatically flashes up the weather forecast when users walk past

In addition, when someone in the kitchen is following a video recipe, the device could pause the video when they step away to fetch ingredients and resume it when they return.

The technology, which is still in development, has some drawbacks – in a crowded room, radar waves can struggle to distinguish one person from another, instead seeing just one large mass.

In addition, taking control away from the user and handing it to devices could usher in a whole new era of technology doing things that users do not want it to do.

“Humans are primed to really understand human behaviour, and when computers break that, it leads to these kinds of [frustrating] situations,” Chris Harrison, of Carnegie Mellon University’s Human-Computer Interaction Institute, told Wired.

This image shows what the device defines as the overlap between two personal spaces – that of the person and that of the device

“Bringing people in as social scientists and behavioral scientists in the field of computers makes these experiences much more enjoyable and much more humanistic.”

Radar has a clear privacy advantage over cameras – it removes customers’ fears that Google employees might be watching live footage of them asleep in front of the TV, for example.

But some users may still worry about how their movement data is used and stored, even if it is anonymised.

“There’s no such thing as privacy-invasive and not privacy-invasive,” Harrison said. “Everything is on a spectrum.”

WHAT IS PROJECT SOLI?

Project Soli uses invisible radar emitted by a microchip to detect finger movements.

Specifically, it uses broad-beam radar to measure movement, speed and distance.

It operates in the 60GHz radar spectrum at up to 10,000 frames per second.

These movements are then translated into commands that mimic touches on the screen.
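The “movements translated into commands” step can be illustrated with a toy sketch. Everything here – the thresholds, the gesture names, the idea of averaging a velocity trace – is invented for illustration; Soli’s real pipeline uses machine learning, not a hand-written rule.

```python
# Toy sketch: map a radar-derived hand-velocity trace to a command.
# Thresholds and gesture names are invented for illustration only.

def classify_swipe(velocities: list[float]) -> str:
    """Classify a signed hand-velocity trace (m/s) as a swipe gesture."""
    mean_v = sum(velocities) / len(velocities)
    if mean_v > 0.5:
        return "swipe_right"   # e.g. skip to the next song
    if mean_v < -0.5:
        return "swipe_left"    # e.g. go back a track
    return "no_gesture"        # hand essentially still

print(classify_swipe([0.8, 1.1, 0.9]))    # a brisk rightward wave
print(classify_swipe([0.1, -0.05, 0.0]))  # hand held still
```

A threshold rule like this breaks down quickly in practice – which is exactly why the article notes that the system leans on machine learning to interpret subtle, ambiguous motions.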

The chips, developed with the German manufacturer Infineon, are small enough to be built into wearables and other devices.

The biggest challenge is said to have been shrinking a shoebox-sized radar – the kind typically used by police in speed traps – into something small enough to fit on a microchip.

Inspired by advances in chips being prepared for next-generation Wi-Fi, known as WiGig, the team led by researcher Ivan Poupyrev shrank the radar components down to millimetre scale in just 10 months.