The use of semi-autonomous cars could lead to an increase in drunk driving, according to one expert. Some smart ovens have been accidentally preheated overnight by their owners. Biometric tracking data from VR and AR games could be used against players.
Could the use of semi-autonomous cars lead to an increase in drunk driving?
The current batch of Level 3 semi-autonomous cars almost seems capable of driving itself, but a human being is still considered the “true driver” and thus retains responsibility for the vehicle’s actions. What happens if a drunk driver decides to let the automation act as a designated driver?
In a recent Forbes opinion piece, autonomous vehicle expert Lance Eliot argued that current automation will not be enough to deter people already prone to drinking and driving in conventional cars. Worse, he believes Level 3 automation may even encourage people who would not normally consider driving drunk to attempt it.
Eliot wonders whether breath analyzers or other technologies will need to be added to Level 3 cars and how drivers might react if their cars begin to challenge their ability to drive.
Smart home appliance maker June is planning an immediate app update after some owners accidentally turned on and preheated their ovens overnight.
The company says the June app for iOS will no longer open to the “Oven” tab by default, where people can adjust temperature and cook settings. To start a preheat session, users will now need to select a temperature and a cook mode, and two presets, one for roasting and one for baking, will be removed.
Users will also be given the ability to disable the remote preheat option entirely. The other update will allow the June Oven to turn off its heating elements after 30 minutes of inactivity: the oven will detect when there’s no food inside, notify the user that it will turn off if no action is taken, and then power down.
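The auto-shutoff behavior described above can be sketched as a small state machine. This is an illustrative sketch, not June's actual firmware; the class name, the per-minute `tick` interface, and the five-minute warning lead time are all assumptions (the article does not say when the notification is sent).

```python
class OvenMonitor:
    """Hypothetical sketch of the described auto-off behavior:
    if the oven is heating with no food detected, warn the user,
    then power down after 30 minutes of inactivity."""

    def __init__(self, idle_limit=30, warn_lead=5):
        self.idle_limit = idle_limit    # minutes of inactivity before shutoff
        self.warn_lead = warn_lead      # assumed warning lead time (minutes)
        self.idle_minutes = 0
        self.heating = True
        self.notified = False

    def tick(self, food_detected):
        """Called once per simulated minute while the oven is on."""
        if not self.heating:
            return
        if food_detected:
            # Food present counts as activity: reset the idle clock.
            self.idle_minutes = 0
            self.notified = False
            return
        self.idle_minutes += 1
        if not self.notified and self.idle_minutes >= self.idle_limit - self.warn_lead:
            self.notified = True    # a real app would push a notification here
        if self.idle_minutes >= self.idle_limit:
            self.heating = False    # power down the heating elements
```

Detecting food would reset the idle timer, so only a genuinely unattended, empty oven shuts off.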
Can playing a virtual reality game impact your ability to attain life insurance?
It can if the game’s creators decide to sell its users’ tracking data.
Biometric tracking data from virtual and augmented reality (VR and AR)—micro-movements of head, torso, hands and eyes—can be used to diagnose or predict anxiety, depression, schizophrenia, addiction, ADHD, autism spectrum disorder and more about a person’s cognitive and physical function.
What happens when this data is fed into users’ psychometric profiles? Such profiles may start out as relatively harmless, merely predicting when someone might be getting ready to buy a new car. However, sprawling psychometric profiles with medical inputs could leave people vulnerable.
Individuals have unique patterns of movement and can sometimes be identified using gaze, head direction, hand position, height, and other behavioral and biological characteristics collected in VR headsets. VR and AR tracking data should be considered potential “personally identifiable information” (PII) because it can be used to distinguish or trace an individual’s identity, either alone or when combined with other personal or identifying information.
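To see why such tracking data counts as PII, consider how little is needed to re-identify someone: a handful of averaged movement features can act like a fingerprint. The sketch below is purely illustrative; the feature choices, the enrolled values, and a nearest-neighbor match are assumptions, not a description of any real system.

```python
import math

# Hypothetical enrolled profiles: per-player feature vectors of
# (mean headset height in meters, mean gaze pitch in degrees,
#  dominant-hand rest height in meters).
enrolled = {
    "player_a": (1.71, -4.2, 0.92),
    "player_b": (1.58, -1.0, 0.88),
    "player_c": (1.80, -6.5, 1.01),
}

def identify(sample, profiles):
    """Return the enrolled player whose stored feature vector is
    closest (Euclidean distance) to the anonymous sample."""
    return min(profiles, key=lambda name: math.dist(profiles[name], sample))
```

Even an "anonymous" session that merely resembles an enrolled profile gets matched, which is why combining such data with other records can trace an individual's identity.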
Data collected by VR technologies is currently unregulated, and how it is collected, used and shared is not monitored by any external entity. Areas of concern include loss of freedom, harm to reputation, and decrease in access and opportunity due to online identities becoming inseparable from offline ones.