- Meta is developing its Aria Gen 2 smart glasses, packed with sensors and AI features.
- The glasses can track your eyes, your movements, and even your heart rate, gauging what is happening around you and how you feel about it.
- For now, the glasses are being used to help researchers train robots and build better AI systems that could eventually make their way into consumer smart glasses.
Ray-Ban Meta smart glasses are still relatively new, but Meta is already moving ahead with its Aria Gen 2 smart glasses. Unlike the Ray-Bans, these glasses are for research purposes only, at least for now, but they are so full of sensors, cameras, and processing power that it seems inevitable that what Meta learns from them will end up in future wearables.
The new glasses are part of Project Aria, Meta's initiative to put research-grade tools in the hands of people working on computer vision, robotics, and related context-aware AI. The idea is for researchers and developers to use these glasses to build systems that can perceive, learn from, and interact with the world more effectively.
The first Aria smart glasses came out in 2020, and the Aria Gen 2 is far more advanced in both hardware and software. The new model is lighter, more accurate, and more powerful, and it looks much more like the glasses people wear in everyday life, even if you still wouldn't mistake it for a standard pair of spectacles.
Four computer vision cameras can see across roughly an 80° arc around you and measure depth and relative distance, so the glasses can tell how far your coffee cup is from your keyboard, or where a drone's landing gear is headed. And that is just the start of the sensory kit built into the glasses, which also includes an ambient light sensor with an ultraviolet mode, a contact microphone that can pick up your voice even in noisy environments, and a pulse detector embedded in the nose pad that can estimate your heart rate.
The future face of AI
There is also a suite of eye-tracking technology worth mentioning: it can tell where you are looking, when you blink, how your pupils change, and what you are focusing on. It can even track your hands, measuring joint movement in a way that could help with robot training or gesture controls. Taken together, the glasses can know what you are looking at, how you are holding something, and whether what you are seeing is raising your heart rate through an emotional reaction. If you are holding an egg and staring at your sworn enemy, the AI might be able to work out that you want to throw the egg at them, and could even help you aim.
As noted, these are research tools. They are not for sale to consumers, and Meta has not said whether they ever will be. Researchers have to apply for access, and the company is expected to start taking those applications later this year.
But the implications are much bigger than that. Meta's plans for smart glasses go far beyond checking messages. The company wants to connect human interaction with machines and with the real world, and to teach machines to do the same. In theory, they could see, hear, and interpret the world around them much as humans do.
That is not going to happen tomorrow, but the Aria Gen 2 smart glasses prove it is far closer than you might think. And it is probably only a matter of time before some version of Aria Gen 2 ends up on sale to the average person. Then there will be a powerful AI mind sitting on your face, remembering where you left your keys and sending a robot to fetch them for you.