New Scientist article

A new study suggests that the sensors already built into smartphones can be used to track people's movements, and even where they are looking.
The research, published in the journal Scientific Reports, looked at a number of smartphone sensors including accelerometers, gyroscopes, cameras and touch sensors.
The researchers found that smartphone sensors can be used to detect the direction of a person's gaze by measuring how quickly the phone's sensors respond as the person changes where they are looking.
The results show that smartphone cameras can act as "virtual cameras" attached to a person, monitoring their behaviour.
This could be useful for detecting people at a distance, or for tracking their movements, such as when they go to and from a restaurant, says lead researcher Jules De Boer.
The sensors were able to detect a person turning their head through 90 degrees in front of a 3D camera.
The phone can then track the person’s movements in the room.
This can be done in real time by a smartphone app that uses data collected by the sensors to create a digital 3D model of the user's surroundings.
It could be used, for example, to track people moving from one location to another, or even to determine the direction they are looking in.
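The study does not spell out how orientation is estimated from the sensor data. One common approach, sketched below purely as an illustration (the function name and sampling rate are assumptions, not from the paper), is to integrate a gyroscope's angular-velocity samples over time to estimate how far a phone, and so its user, has turned:

```python
# Hypothetical sketch, not the authors' published method.
# Integrate gyroscope angular-velocity samples to estimate heading.

def integrate_heading(angular_velocities, dt, initial_heading=0.0):
    """Estimate heading in degrees from angular-velocity samples
    (degrees per second) taken every `dt` seconds."""
    heading = initial_heading
    for omega in angular_velocities:
        heading += omega * dt
    return heading

# A steady 45 deg/s turn sampled at 100 Hz for 2 seconds
# accumulates to roughly a 90-degree head turn.
samples = [45.0] * 200
print(round(integrate_heading(samples, dt=0.01), 3))
```

In practice such dead-reckoning drifts, which is one reason a camera or other absolute reference is usually combined with the gyroscope.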
The findings have important implications for how we use smartphones, says De Boer's team leader, Professor Adrian Smith, also at the University of Cambridge.
“What is really exciting is that these sensors can capture motion in real time,” he says.
“It’s really interesting that you can use sensors to track, for instance, the movements of people.”
The researchers used three different sensor types.
The first is a camera, which captures motion from a distance and is usually used to record a person walking or running.
The second is a gyroscope, which is built into the phone and measures how the handset rotates and the angle at which it is pointing.
And the third is a touch sensor, which detects hand movements on the screen and is built into virtually every smartphone.
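The article does not describe how readings from the three sensors are merged, so the sketch below is only an assumption about what a simple fusion step might look like: each sensor contributes one piece of the tracked state, and a single snapshot is produced (the class and field names are illustrative, not from the paper):

```python
# Illustrative sketch only: the study's actual fusion method is not described.
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    camera_position: tuple   # (x, y) position seen by the camera, in metres
    gyro_heading_deg: float  # phone orientation from the gyroscope
    touch_active: bool       # whether the touch sensor reports contact

def fuse(snapshot):
    """Merge one reading from each sensor into a single tracking record."""
    return {
        "position": snapshot.camera_position,
        "facing": snapshot.gyro_heading_deg % 360,  # wrap heading to 0-360
        "holding_phone": snapshot.touch_active,
    }

state = fuse(SensorSnapshot((1.2, 0.8), 370.0, True))
print(state["facing"])  # heading wrapped into the 0-360 range
```

The point of such a step is that each sensor covers a weakness of the others: the camera gives absolute position, the gyroscope gives orientation, and the touch sensor confirms the phone is actually in the user's hand.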
The team used a series of measurements to track people walking, cycling and running across the room using a smartphone.
This was the only part of the study that used smartphone accelerometers.
The phones' accelerometers measured the motion of the user moving their hand from the bottom of the phone's screen to the top, and also picked up the acceleration as the user turned their hand.
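The exact gesture-detection details are not given, but the bottom-to-top hand movement described here can be illustrated with a simple check on a touch trace (the function and threshold are assumptions for the sake of the example): on a phone screen the y coordinate usually decreases towards the top, so an upward swipe is a trace whose y values fall.

```python
# Illustrative only: the study's gesture detection is not published here.

def is_upward_swipe(touch_trace, min_travel=100):
    """Return True if a trace of (x, y) touch points moves from the
    bottom of the screen towards the top by at least `min_travel` pixels."""
    if len(touch_trace) < 2:
        return False
    start_y = touch_trace[0][1]
    end_y = touch_trace[-1][1]
    return (start_y - end_y) >= min_travel

# A trace rising from y=1800 to y=900 is a clear bottom-to-top swipe.
trace = [(500, 1800), (505, 1400), (510, 900)]
print(is_upward_swipe(trace))
```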
These three sensors all tracked the same position, and when combined they were able to detect head movements, which are usually much shorter than the movements that occur when a person is walking, De Boer's team says.
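The observation that head turns are much shorter than walking bouts suggests one very simple way to separate the two, sketched below as an assumption rather than the team's actual classifier: threshold on the duration of each detected movement episode.

```python
# Hedged sketch: the study's classifier is not described in the article.
# Head turns are brief; whole-body movements like walking last longer.

def classify_movement(duration_s, threshold_s=2.0):
    """Label a detected movement episode by its duration: quick episodes
    are treated as head turns, sustained ones as whole-body motion."""
    return "head turn" if duration_s < threshold_s else "walking"

print(classify_movement(0.4))  # a brief episode
print(classify_movement(8.0))  # a sustained episode
```

A real system would also use the magnitude and source of the motion, not just its duration, but the duration cue alone captures the distinction the researchers describe.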
The new findings are exciting because they demonstrate that smartphone sensors can potentially be used to record people's movements.
“Our findings suggest that there’s a potential application for sensor data to be used as a user interface to track physical movement,” says De Boer's team member David Smith.
The technology could also be used in other applications, he says, such as helping people with Parkinson's disease or related movement disorders to control their body movements.
De Boer's team has already used the sensor technology in conjunction with a virtual reality system, combining smartphone tracking with motion tracking.
They have also developed an app that can automatically track your movements.
This type of tracking could be useful for people who cannot speak, and for those who have trouble with reading or writing.
The latest research suggests that this technology could be integrated into a number of existing smartphone apps.
The app could then automatically track the movement and position of a particular person.
This would be useful to people who use smartphones for other activities, such as communicating, studying or watching films.
“A lot of these things we’ve been trying to do, we’ve done with physical movements, but now we can do it with sensors,” says Smith.