
Meta Ray-Ban Display smart glasses hands-on: Discreet and intuitive

Published by qimuai · First-hand compilation



Source: https://www.engadget.com/wearables/meta-ray-ban-display-hands-on-discreet-and-intuitive-002334346.html?src=rss

Summary:

At its 2025 Connect conference, Meta unveiled three new pairs of smart glasses. The most closely watched of them, the Meta Ray-Ban Display, comes closest to the form factor people have long imagined when they hear "smart glasses," thanks to its discreet design and intuitive interaction. The glasses do not offer immersive AR, but a display embedded in the right lens with a 20-degree field of view can surface incoming texts, navigation previews and calendar information in real time. With automatic brightness adjustment the display looks even sharper outdoors, and its light source is virtually invisible to anyone looking at the wearer.

The bundled neural wristband detects muscle movements in the hand and wrist, letting users switch apps, adjust volume and perform other actions without voice commands. In hands-on testing, the gesture system took some getting used to but proved intuitive enough. The glasses also support video calls and a "Conversational Focus" feature that provides live captions for conversations in noisy environments, a genuinely useful capability for people with hearing difficulties.

Although the $799 price is higher than previous models, the glasses are far better suited to everyday wear than bulky AR headsets, and the lightweight design and additional color options (such as a light "sand" finish) make them more flattering on the face. While the product has not yet reached Meta's goal of replacing the phone, its display meaningfully reduces how often the wearer needs to reach for one, marking an important step toward practical smart glasses.


Original English text:

Meta Ray-Ban Display hands-on: Discreet and intuitive
They won't replace your phone, but you still won't want to take them off.
I've been testing smart glasses for almost a decade. And in that time, one of the questions I've been asked the most is "oh, but can you see anything in them?" For years, I had to explain that no, glasses like that don't really exist yet.
That's no longer the case. And while I've seen a bunch of glasses over the last year that have some kind of display, the Meta Ray-Ban Display glasses feel the closest to fulfilling what so many people envision when they hear the words "smart glasses."
To be clear, they don't offer the kind of immersive AR that's possible with Meta's Orion prototype. In fact Meta considers "display AI glasses" to be a totally separate category from AR. The display is only on one lens — the right — and its 20-degree field of view is much smaller than the 70 degrees on Orion. That may sound like a big compromise, but it doesn't feel like one.
The single display feels much more practical for a pair of glasses you'll want to wear every day. It's meant to be something you can glance at when you need it, not an always-on overlay. The smaller size also means that the display is much sharper, at 42 pixels per degree. This was especially noticeable when I walked outside with the glasses on; images on the display looked even sharper than in indoor light, thanks to automatic brightness features.
I also appreciated that you can't see any light from the display when you're looking at someone wearing the glasses. In fact, the display is barely noticeable at all even when you look at them up close.
Having a smaller display also means that the glasses are cheaper, at $799, and that they don't look like the chunky AR glasses we've seen so many times. At 69 grams, they are a bit heavier and thicker than the second-gen Meta Ray-Bans, but not much. As someone who has tried on way too many pairs of thick black smart glasses, I'm glad Meta is offering these in a color besides black. All Wayfarer-style frames look wide on my face but the lighter "sand" color feels a lot more flattering.
The Meta Neural Band wristband that comes with the display glasses functions pretty much the same as the band I used on the Orion prototype. It uses sensors to detect the subtle muscle movements on your hand and wrist and can translate that into actions within the glasses' interface.
It's hard to describe, but the gestures for navigating the glasses' interfaces work surprisingly well. I can see how it could take a little time to get used to the various gestures for navigating between apps, bringing up Meta AI, adjusting the volume and other actions, but they are all fairly intuitive. For example, you use your thumb to swipe along the top of your index finger, sort of like a D-pad, to move up and down and side to side. And you can raise and lower the speaker volume by holding your thumb and index finger together and rotating your wrist right or left like it's a volume knob.
It's no secret that Meta's ultimate goal for its smart glasses is to replace, or almost replace, your phone. That's not possible yet, but having an actual display means you can look at your phone a whole lot less.
The display can surface incoming texts, navigation with map previews (for walking directions), and info from your calendar. I was also able to take a video call from the glasses — unlike Mark Zuckerberg's attempted live demo during his keynote — and it was way better than I expected. I could not only clearly see the person I was talking to and their surroundings, I could turn on my glasses' camera and see a smaller version of the video from my side.
I also got a chance to try the Conversational Focus feature, which allows you to get live captions of the person you're speaking with even in a loud environment that may be hard to hear. There was something very surreal about getting real-time subtitles to a conversation with a person standing directly in front of me. As someone who tries really hard to not look at screens when I'm speaking to people, it almost felt a little wrong. But I can also see how this would be incredibly helpful to people who have trouble hearing or processing conversations. It would also be great for translations, something Meta AI already does very well.
I also appreciated that the wristband allows you to invoke Meta AI with a gesture so you don't always have to say "Hey Meta." It's a small change, but I've always felt weird about talking to Meta AI in public. The display also addresses another one of my longtime gripes with the Ray-Ban Meta and Oakley glasses: framing a photo is really difficult. But with a display, you can see a preview of your shot, as well as the photo after the fact, so you no longer have to just snap a bunch and hope for the best.
I've only had about 30 minutes with the glasses, so I don't really know how having a display could fit into my daily routine. But even after a short time with them, they really do feel like the beginning of the kind of smart glasses a lot of people have been waiting for.
The Meta Connect 2025 event kicked off with the company announcing three new sets of smart glasses: the second-gen Ray-Ban Meta glasses, the Oakley Meta Vanguard sports glasses and the $799 Meta Ray-Ban Display glasses, the last of which include a video overlay. You can read our earlier Meta Connect liveblog to see how it played out in real time.

Engadget
