Evolution has arrived in eyeglasses: Intel made smart glasses that look normal

    Abdulaziz Sobh




    The most important parts of Intel's new Vaunt smart glasses are the pieces that were left out.

    There is no camera to creep people out, no button to push, no gesture area to swipe, no glowing LCD screen, no weird arm floating in front of the lens, no speaker, and no microphone (for now).

    From the outside, the Vaunt glasses look just like glasses. When you wear them, you see a stream of information on what looks like a screen, but which is actually projected onto your retina.

    The prototypes I tried in December also looked nearly indistinguishable from regular glasses. They come in several styles, work with prescriptions, and can be worn comfortably all day. Apart from a small red glimmer that is sometimes visible on the right lens, the people around you might never guess you are wearing smart glasses.

    Like Google Glass five years ago, Vaunt will launch an "early access program" for developers later this year. But Intel's goals are different from Google's. Instead of trying to convince us to change our lives for a head-worn display, Intel is trying to change the head-worn display to fit our lives.

    Google Glass, and the Glassholes who came with it, gave head-worn displays a bad reputation. HoloLens aims for a full, high-end AR experience that literally puts a Windows PC on your head. Magic Leap puts a whole computer on your hip, plus a headset with goggles that look like they belong in a Vin Diesel movie.

    We live in a world where our watches have LTE and our phones can turn our faces into cartoon characters that bounce around in real time. You might expect a successful pair of smart glasses to deliver similar wonders. Every gadget these days has more, more, more.

    With Vaunt, Intel is betting less.

    Strip away the stickers and part numbers from the Vaunt prototypes I tried last December, and they would look like slightly thick, plastic-framed glasses. In the right light, you could see me wearing them all the time, even if they didn't have a display in them. Although I only saw two versions at the San Francisco offices of Intel's New Devices Group (NDG), Intel expects to have many different styles available when the product formally launches.

    "When we look at what new device categories exist, [we are] very excited about head-worn [products]," says Itai Vonshak, head of products at NDG. "Head-worn products are hard because people assign a lot of attributes to putting something on their head. It says something about their personality." That's Vonshak's diplomatic way of saying that other smart glasses look terrible, so his goal was to create something that has, as he puts it again and again, "zero social cost."

    "We wanted to make sure somebody puts this on and gets value from it, without any of the negative impact of technology on their head," he says. "Everything is designed from the ground up to make the technology disappear."

    One of the Vaunt team's main design goals was to create a pair of smart glasses you could wear all day. Vaunt's code name inside Intel was "Superlite" for a reason: they needed to weigh under 50 grams. That's still more than most eyeglasses by a noticeable margin, but Google Glass added an extra 33 grams on top of whatever pair of glasses you were already wearing. Any more than that and they would feel uncomfortable. The electronics and batteries had to be placed so they didn't put too much weight on either the nose or the ears. They couldn't just look like normal glasses; they had to feel like them.

    That is why all of Vaunt's electronics sit inside two small modules built into the stems of the glasses. Crucially, though, the electronics are located entirely up near the face of the frames, so that the rest of the stems, and even the frame itself, can flex a little, just like any other pair of glasses. Other smart glasses have batteries built along the entire stem, "so they become very rigid and don't deform to fit the size of your head," says Mark Eastwood, industrial design director at NDG. "It's very important when you look at glasses that they deform along their whole length to fit your head."

    Okay, but what do you actually get by carefully paring back the technology and extra features so you can have normal-looking glasses?

    In essence, Vaunt is simply a system for projecting a small, heads-up-style display in your peripheral vision. It can show simple messages like directions or notifications. It works over Bluetooth with an Android phone or an iPhone, much the same way your smartwatch does, taking commands from an app that runs in the background on the phone.
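
    Since Intel hasn't published an SDK, here is a purely illustrative JavaScript sketch of that companion-app model: the phone app trims a notification down to something a tiny, glanceable display could usefully show before handing it to the Bluetooth link. Every function name, the payload format, and the character budget are assumptions, not Intel's API.

```javascript
// Hypothetical sketch of the companion-app model: the phone app runs
// in the background and pushes tiny, glanceable payloads to the
// glasses over Bluetooth. All names and formats here are invented.

// A notification is trimmed to what a small monochrome display can
// usefully show: a type, one short line of text, and an optional glyph.
function toGlancePayload(notification) {
  const MAX_CHARS = 40; // assumed budget for a peripheral display
  const text = notification.text.length > MAX_CHARS
    ? notification.text.slice(0, MAX_CHARS - 1) + "…"
    : notification.text;
  return {
    kind: notification.kind,        // e.g. "call", "directions"
    text,                           // a single short line of red text
    icon: notification.icon ?? null // optional monochrome glyph id
  };
}

// Example: an incoming-call notification, like the demo loop showed.
const payload = toGlancePayload({
  kind: "call",
  text: "Incoming call: Itai Vonshak",
  icon: "phone"
});
```

    A real companion app would then write this payload out over the Bluetooth connection; the point of the sketch is only that all the heavy lifting (fetching, deciding, trimming) happens on the phone, not on the glasses.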

    You could say this amounts to little more than a Pebble smartwatch on your face, especially because Vonshak designed Pebble's excellent timeline interface before that company was acquired and shut down. But Intel has big plans for Vaunt's little display.

    Before getting into all that, let's establish the hardware basics. On the right side of the glasses is a suite of electronics designed to power a very low-powered laser (technically, a VCSEL). That laser shines a red, monochrome image somewhere in the neighborhood of 400 x 150 pixels onto a holographic reflector on the glasses' right lens. The image is reflected into the back of your eyeball, directly onto the retina. The left stem houses electronics too, so the glasses are equally weighted on both sides.

    So, yes: lasers in your eye. But don't worry, says Eastwood. "It's a class one laser. It's such low power that we don't [need to have it certified]," he says, "and in the case of [Vaunt], it's so low-power that it's at the very bottom end of a class one laser."

    The hardware here is all custom, down to the silicon that powers Vaunt, which, of course, is designed by Intel. "We had to integrate very, very power-efficient light sources, MEMS devices for painting an image," says Jerry Bautista, the lead of the team building wearables at Intel's NDG. "We use a holographic grating embedded into the lens to reflect the correct wavelengths back to your eye. The image is called retinal projection, so the image is actually 'painted' onto the back of your retina."

    Because it shines directly onto the back of your retina, the image it creates is always in focus. It also means the display works equally well on prescription and non-prescription glasses.

    In addition to the VCSEL and all the associated chips needed to power it, the Vaunt includes Bluetooth for communicating with your phone. It also has an app processor (more on apps in a bit) and some other sensors. Notably, it includes an accelerometer and a compass, so it can detect some basic head gestures and knows which direction you're looking. The prototypes I used didn't have a microphone, but future models may include one for use with a smart assistant like Alexa.
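
    To make the head-gesture idea concrete, here is a guess at how a deliberate nod might be picked out of the accelerometer stream: look for the head's pitch to swing down past a threshold and come back up within the sample window. This is an illustrative technique, not Intel's algorithm, and the numbers are invented.

```javascript
// Illustrative nod detector over a stream of head-pitch samples
// (degrees, one reading per tick; positive means tilted down
// relative to the resting pose). Threshold values are assumptions.
function detectNod(pitchSamples, threshold = 15) {
  let wentDown = false;
  for (const pitch of pitchSamples) {
    if (!wentDown && pitch > threshold) {
      wentDown = true;                 // head dipped past the threshold
    } else if (wentDown && pitch < threshold / 3) {
      return true;                     // ...and came back up: a nod
    }
  }
  return false;
}

// A deliberate nod dips and recovers within the window:
detectNod([0, 5, 18, 22, 9, 2]);  // → true
// A slow drift downward (say, reading a book) never comes back up,
// so it isn't mistaken for a gesture:
detectNod([0, 4, 8, 12, 16, 20]); // → false
```

    The compass would serve the other role mentioned above: knowing which direction you're looking, rather than detecting gestures.
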

    For the system to work properly, Vaunt has to be fitted to your face. That involved a fairly quick and simple procedure: measuring my pupillary distance. It's a standard process anyone with glasses will be familiar with, and it's essential for making the display appear in the right spot in your field of vision. Once that was measured, a software engineer programmed my measurements into a pair of prototype glasses, and I put them on.

    Using the Vaunt display is unlike anything else you've tried. It projects a rectangle of red text and icons in the lower-right portion of your visual field. But when I wasn't looking down toward it, the display simply wasn't there. My first thought was that the frames were misaligned.

    It turns out this is a feature, not a bug. Vaunt's display is nonintrusive. It's there when you want it, and it disappears entirely when you don't. Without a speaker or a vibration mode to notify you, I couldn't help but wonder whether that would mean a lot of missed information.

    Not so, according to Intel's engineers. Your eyes rarely stay still. They dart around, taking in things in your peripheral vision all the time; your brain simply doesn't bother to process all of that information and bring it into focus. But if there's new information there, you're likely to notice it.

    The unit I saw was simply running a demo loop of the kinds of notifications and information I might see: walking directions, an incoming call notification. There are no beeps or vibrations when the display turns on or a notification appears, but you do notice when it happens, because you perceive the motion in your peripheral vision. It's a bit like the T. rex in Jurassic Park: it's easy to miss things that hold still, but your eye catches movement.

    Of course, genuinely new and interesting display hardware isn't enough without software, and Intel isn't ready to share many details about the software yet.

    But NDG executives are happy to talk about the obvious stuff: it will offload most of the work to your phone, just like a smartwatch or even a Fitbit. It will support some apps, it will work with both iPhones and Android phones, and there will be some kind of voice assistant integration at some point.

    Vonshak was also especially clear on another point: the goal is to do more than simply push notifications to your eyeball. Instead, Intel aims to provide contextual, ambient information when you need it. But since they couldn't yet go into specifics, all the examples were very hypothetical. "You're in the kitchen, you're cooking, you can go 'Alexa, I need that cookie recipe,' and bam, it appears in your glasses," says Vonshak.

    How will you interact with Vaunt? That's also a bit unclear. Sometimes the hypotheticals implied voice. Other times, it seemed that very subtle head gestures, tracked by the accelerometer, would be the key. And at other times, it seemed you wouldn't interact with it at all, but simply trust the AI to show you what you need to know in the moment. One example I heard was surfacing relevant information about a caller (a birthday, or a reminder) while you're on the phone with them.

    Whatever the final interaction model, it will be subtle; don't expect tapping, swiping, and touching. "We really believe it can't have any social cost," Vonshak insists again. "So if it's weird, if you look geeky, if you're fiddling and fidgeting, then we've lost."

    One notable possibility: since Vaunt just uses Bluetooth and Bluetooth Low Energy, there's no technical reason you couldn't create a simple remote control for it, say, on your smartwatch, or maybe even on your clothes. I can't help but note that one of the places where Levi's and Google developed Project Jacquard is literally next door to the NDG offices.

    Vonshak also describes more complex scenarios, like walking down the street, glancing to the left or right, and seeing Yelp information appear when you look at a restaurant. Your phone knows your location, your glasses know which way you're looking, so the data is there to build that kind of feature. Somebody just has to put it all together.

    But whether Intel can be that somebody is a different question entirely, one Intel will have to answer with clarity and confidence when it's ready to talk more about the software. The kind of contextual, ambient, useful information on display here is quite similar to what Google promised with Google Now a few years ago... and then couldn't really deliver. If Google, with its integrated ecosystem and trove of personal information, couldn't fulfill that vision, how could a third-party product from Intel?

    I didn't get an answer to that question, except, well, don't assume Intel is trying to do exactly that. "Listen, sometimes a better way to succeed is to reduce the problem," says Soffer. Intel's AI for figuring out what to show is "focused on certain kinds of moments, and we've been developing this technology for five or six years to focus on those tangible, everyday moments."

    He suggests Vaunt would do a better job of, say, showing you your flight information when your hands are full carrying luggage through the airport, or showing your shopping list while you're pushing the cart. "You're not going to get addicted to this, because you're also looking at your PC. You're not going to get addicted to this, because you're on the train checking your Facebook [on your phone]," says Soffer.

    Vaunt isn't meant to replace other screens, but to become a new kind of screen, used in ways other screens aren't. "These will hook you on the value they bring when they're the optimal display for what they give you... how they overcome the limitations that other, 'heavier' displays can't, [because] those ask too much of you."

    Soffer's vision makes Vaunt a sort of "in-between" screen, which is an interesting idea. It's also one nobody is really asking for. But Vonshak believes in its potential, even if articulating precisely what that potential is right now isn't easy. "When I saw the first smartphone, I didn't go, 'Wow, ride sharing, that's going to happen,'" he argues. "But the fact is, ride sharing would never have happened without smartphones. We're excited about this because it enables new use cases for developers. [...] What will happen when we bring a new kind of display, with new capabilities and new sensors, to your head? I think new use cases are going to happen."

    To that end, Intel will launch an "early access program" this year so developers can start experimenting with these emergent behaviors. That sounds similar to Google Glass's "Explorer Program," but Intel obviously hopes these glasses won't provoke the same kind of backlash Glass did.

    What will those developers actually create? Well, apps, of course. Presumably, they'll live mostly on the phone but also run in some way on the Vaunt itself. (Disclosure: my wife works on the VR app store program at Oculus.)

    Although Intel wasn't ready to fully detail how the software works or what the developer SDK will look like, over a day of conversations I picked up enough to make what I think are some educated guesses. Programming for Vaunt will involve JavaScript. One of Vaunt's lead engineers is Brian Hernacki, who was deeply involved in the architecture of webOS. Vonshak also had a stint at Palm before moving on to LG (and later Pebble).

    Working on LG's webOS TVs, I suspect, taught Vonshak the importance of another trick: streaming internet content directly to a device. In the same way a Google Cast-enabled TV is just an endpoint for any video stream, perhaps Vaunt is just an endpoint to which cloud apps can stream information.
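
    That endpoint model can be sketched in a few lines of JavaScript. This is my speculation about the architecture, with invented names and example data: the cloud app never runs on the glasses; it only pushes small "cards" to whatever display endpoint is registered, the way a Cast sender pushes media to a TV.

```javascript
// Sketch of a cast-style display endpoint (all names invented).
// The endpoint knows nothing about flights, restaurants, or calls;
// it just shows whatever card was last streamed to it.
class DisplayEndpoint {
  constructor(name) {
    this.name = name;
    this.current = null; // the card being shown right now
  }
  show(card) {
    this.current = card; // a real device would render this remotely
  }
}

// A cloud app only knows the endpoint interface, not the hardware,
// so the same app could target a TV, a watch, or a pair of glasses.
function streamFlightInfo(endpoint, flight) {
  endpoint.show({ title: flight.number, body: "Gate " + flight.gate });
}

const glasses = new DisplayEndpoint("vaunt");
streamFlightInfo(glasses, { number: "UA 443", gate: "B12" });
// glasses.current is now { title: "UA 443", body: "Gate B12" }
```

    The appeal of this design is that apps wouldn't need to be installed or updated on the glasses at all, which fits both the "open platform" framing and the tiny app processor described earlier.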

    I asked him about all of this, and since Vaunt isn't ready for launch, Vonshak would only say that "we'll talk about all of that at a later date," but that "it's really built as an open platform." He adds: "It was built from the ground up to be a platform that accesses the internet, and a wearable becomes really powerful when the way it accesses the internet changes."

    News of Vaunt first broke last week with a Bloomberg scoop saying that "Intel is planning to sell a majority stake in its augmented reality business." Intel wouldn't discuss the Bloomberg story with me, but I think the key line is this: "Intel is looking for investors who can contribute strong sales channels, industry, or design expertise to the business, rather than just financial backing."

    That line squares with what sources tell me: Intel isn't so much eager to sell off the entire NDG unit as it is looking for a partner to help bring this to retail. It also matches what Bautista told me in December. "It's very unlikely Intel would take it to market ourselves, because we typically don't do that. Our primary business [is] that we work with partners, we work with others to do it," says Bautista. "With these glasses, we're working with key ecosystem hardware suppliers, whether that's frames or lenses and the like, because we believe there's a whole channel for people who wear glasses that's already out there."

    Intel has a reputation for showing off ideas that never become real products. It puts together a great concept, demos the technology, and then hopes to convince others to pick up the idea and turn it into a real product. CEO Brian Krzanich gets on a CES stage, talks up a wireless charging bowl (or, hey, smart glasses!), and then we wait to see whether it ever comes to market. Often (maybe even usually), it doesn't.

    I think the intention with Vaunt is a bit different from Intel's usual playbook, though. For one thing, the Bloomberg report confirms Intel is looking for partners with "strong sales channels... rather than financial backers." For another, Bautista and I talked a bit in December about how eyeglass sales channels work.

    "There's something on the order of 2.5 billion people who require corrective lenses," he says. "They get their glasses from somewhere. Sixty percent of them come from eye care providers... We would say these glasses belong in those kinds of channels; people will buy them the way they buy their glasses today."

    Selling glasses in eyeglass stores makes sense, not only because it's a preexisting sales channel, but also because you'll need to have your Vaunt glasses fitted to your pupillary distance. Intel, despite its close relationship with Oakley, certainly doesn't have direct experience in those channels.

    I don't know whether there's a done-deal partnership to bring these things to market. I certainly don't know whether Intel has a plan to challenge or partner with Luxottica, which holds a massive and powerful monopoly on eyeglasses of all kinds in many regions of the world. Sources say the most likely scenario is a startup company bringing Vaunt to market, backed by both Intel and whomever it partners with.

    Whoever eventually tries to sell Vaunt to actual consumers will face another challenge at least as formidable as Luxottica's monopoly: ecosystems. In his time at Pebble, Vonshak saw firsthand what happens to a wearable that doesn't have the deep operating system access it needs to work really well. (Reminder: sold for parts.) Since it isn't made by Apple or Google, Vaunt will need to find a way to succeed where other third-party wearables couldn't.

    That's on top of convincing people that it's normal to wear smart glasses at all, and that Vaunt provides enough value to justify its price. Unlike Magic Leap or HoloLens, Vaunt looks and feels normal. But it also does much less than those devices. "Less is more" is a lovely theory. Whether it's also a great business model is something we won't be able to see for a while.

    Until then, what I can tell you is that trying Vaunt was far more compelling to me than the hardcore AR cyborg glasses I tested at CES last month. Wearables have to fit into our lives before they can change them.

    Vaunt is the first pair of smart glasses I've tried that doesn't seem ridiculous. They've shown me it's possible to make a kind of AR device I'd actually want to wear every day. Now we just have to see what Intel actually does with all that possibility.