
Neon, the No. 2 social app on the Apple App Store, pays users to record their phone calls and sells the data to AI firms

Published by qimuai · First-hand compilation



Source: https://techcrunch.com/2025/09/24/neon-the-no-2-social-app-on-the-apple-app-store-pays-users-to-record-their-phone-calls-and-sells-data-to-ai-firms/

Summary:

A mobile app called Neon Mobile has recently shot up the U.S. Apple App Store's Social Networking chart, reaching the No. 2 spot. Its pitch is "get paid to talk": it claims users can earn "hundreds or even thousands of dollars per year" simply by making and receiving phone calls, and its business model is to sell those call recordings to AI companies for use in training AI models.

According to its terms of service, the app can capture all of a user's inbound and outbound calls, although its marketing claims it records only the user's side of the conversation (except on calls with another Neon user). Users earn $0.30 per minute for calls with other Neon users and up to $30 per day for calls to non-users. App-intelligence data shows Neon ranked only No. 476 on September 18 before surging toward the top of the chart in recent days.
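For scale, here is a minimal back-of-envelope sketch of those advertised payouts. The article gives a per-minute rate only for Neon-to-Neon calls and a bare daily cap for calls to non-users, so applying the same $0.30/minute rate to the cap is an assumption made purely for illustration:

```python
# Rough arithmetic on Neon's advertised payouts, using the figures reported above.
# Assumption: the $0.30/min rate is stated only for Neon-to-Neon calls; the $30/day
# figure for calls to non-users is a cap, not a rate.

RATE_PER_MIN = 0.30         # dollars per minute, Neon-to-Neon calls
DAILY_CAP_NON_USERS = 30.0  # dollars per day, calls to anyone else

# Minutes of talk needed to reach $30 at the stated per-minute rate (illustrative).
minutes_to_cap = DAILY_CAP_NON_USERS / RATE_PER_MIN  # 100 minutes

# Theoretical ceiling if the daily cap were somehow hit every day of the year.
yearly_ceiling = DAILY_CAP_NON_USERS * 365  # $10,950

print(f"Minutes per day to earn $30 at $0.30/min: {minutes_to_cap:.0f}")
print(f"Yearly ceiling at the daily cap: ${yearly_ceiling:,.0f}")
```

That ceiling is broadly consistent with the "hundreds or even thousands of dollars per year" pitch, but only under heavy, sustained calling.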

Although Neon says it strips names, email addresses, and phone numbers before selling the data, legal experts point to a major catch in its user agreement: the terms grant the company a "worldwide, exclusive, irrevocable, transferable" license, with multi-tier sublicensing rights, to do essentially whatever it wants with the recordings. Cybersecurity attorney Peter Jackson warns that voice data could be used to place scam calls that imitate a user's voice: once your voice is out there, it can be used to impersonate you and commit all sorts of fraud.

More troubling still, in testing the app gave the other party no indication that the call was being recorded, and the caller ID looked no different from any ordinary internet call. Neon founder Alex Kiam has not responded to questions. Business filings indicate the company is actually run out of a New York apartment.

The episode reflects how blurred privacy boundaries have become in the age of AI. Attorney Jennifer Daniels notes that the one-sided-recording design is likely meant to skirt wiretap laws, but this privacy-for-pennies model pushes at society's ethical limits. As AI tools spread, the public is becoming less sensitive about privacy, and users may not yet realize that their behavior endangers not only their own security but also the privacy of everyone they routinely talk to.

Full translation:

A new app that records your phone calls and pays you for the audio so it can sell the data to AI companies is, unbelievably, the No. 2 app in the Social Networking section of Apple's U.S. App Store.

The app, Neon Mobile, pitches itself as a moneymaking tool, promising users "hundreds or even thousands of dollars per year" in exchange for access to their call audio. According to its website, it pays $0.30 per minute for calls with other Neon users, up to $30 per day for calls to anyone else, and also pays for referrals. Data from app-intelligence firm Appfigures shows the app ranked only No. 476 in the U.S. App Store's Social Networking category on September 18, but by the end of yesterday it had jumped to No. 10.

By Wednesday, Neon had climbed to No. 2 on the iPhone's free chart for social apps. Earlier that morning it also reached No. 7 among all apps and games overall, and later rose to No. 6.

Under its terms of service, the Neon mobile app can capture users' inbound and outbound phone calls. The company's marketing, however, claims it records only the user's side unless both parties are Neon users. The terms state plainly that the data is sold to "AI companies" "for the purpose of developing, training, testing, and improving machine learning models, artificial intelligence tools and systems, and related technologies."

That such an app can exist and be allowed on the app stores shows how deeply AI has encroached into users' lives and into areas once considered private. Its high ranking, meanwhile, is proof that part of the market is now willing to trade privacy for pennies, whatever the larger cost to themselves or to society.

Despite what its privacy policy promises, Neon's terms grant the company an extremely broad license to user data: a "worldwide, exclusive, irrevocable, transferable, royalty-free, fully paid right and license (with the right to sublicense through multiple tiers) to sell, use, host, store, transfer, publicly display, publicly perform (including by means of a digital audio transmission), communicate to the public, reproduce, modify for the purpose of formatting for display, create derivative works as authorized in these Terms, and distribute your Recordings, in whole or in part, in any media formats and through any media channels, whether now known or hereafter developed." That leaves plenty of room for Neon to do more with users' data than it claims.

The terms also include extensive disclaimers about beta features, which come with no warranty and may have all sorts of issues and bugs. And although Neon's app raises plenty of red flags, it may be technically legal.

"仅录制单方通话旨在规避窃听法规,"Blank Rome律师事务所隐私与数据保护团队合伙人詹妮弗·丹尼尔斯向TechCrunch指出,"多数州法律要求录音需经通话双方同意...这种操作方式值得玩味。"格林伯格格鲁斯克尔律师事务所网络安全律师彼得·杰克逊赞同此观点,并表示"单方转录"的表述可能暗指实际录制完整通话,仅在最终文本中删除对方内容。

The legal experts also questioned how anonymized the data really is. Neon says it removes names, emails, and phone numbers before selling data to AI companies, but it does not say how its AI partners or other buyers may use that data. Voice data could be used to make fake calls that sound like they come from you, or to train AI voice models.

"一旦语音数据泄露,就可能被用于诈骗,"杰克逊警告,"该公司掌握你的电话号码和足够信息——他们拥有你的语音录音,可被用来冒充身份实施各种欺诈。"即使公司本身可信,Neon也未曾披露合作方名单及后续数据使用权限。与其他拥有珍贵数据的企业一样,该公司同样面临潜在的数据泄露风险。

In a brief test by TechCrunch, Neon gave no indication that it was recording the call, nor did it warn the recipient. The app worked like any other voice-over-IP app, and the caller ID displayed a normal phone number. (Whether its other claims hold up is left to security researchers to verify.)

Neon founder Alex Kiam did not respond to a request for comment. A business filing shows that Kiam, identified only as "Alex" on the company website, runs Neon from a New York apartment. A LinkedIn post indicates he raised money from Upfront Ventures a few months ago for his startup, but the investor had not responded to an inquiry as of publication.

Has AI desensitized users to privacy concerns? There was a time when companies looking to profit from data collected through mobile apps had to do so on the sly. When it was revealed in 2019 that Facebook was paying teens to install an app that spied on them, it was a scandal. The following year, headlines buzzed again when app-store analytics providers were found to be collecting usage data about the mobile ecosystem through dozens of seemingly innocuous apps. There are regular warnings that VPN apps often are not as private as they claim, and government reports have detailed how agencies routinely buy personal data that is "commercially available" on the market.

Now AI assistants routinely join meetings to take notes, and always-on AI devices are on the market. But at least in those cases, Daniels notes, everyone is consenting to being recorded. Given how widely personal data is already used and sold, some users may be cynical enough to reason that if their data is going to be sold anyway, they might as well profit from it themselves.

Sadly, they may be sharing far more information than they realize, and putting other people's privacy at risk in the process. "There is a tremendous desire on the part of knowledge workers, and frankly everybody, to make it as easy as possible to do your job," Jackson said. "Some of these productivity tools do that at the expense of your privacy, but also, increasingly, the privacy of those with whom you interact on a day-to-day basis."

Original English source:

A new app offering to record your phone calls and pay you for the audio so it can sell the data to AI companies is, unbelievably, the No. 2 app in Apple’s U.S. App Store’s Social Networking section.
The app, Neon Mobile, pitches itself as a moneymaking tool offering “hundreds or even thousands of dollars per year” for access to your audio conversations.
Neon’s website says the company pays 30¢ per minute when you call other Neon users and up to $30 per day maximum for making calls to anyone else. The app also pays for referrals. The app first ranked No. 476 in the Social Networking category of the U.S. App Store on September 18 but jumped to No. 10 at the end of yesterday, according to data from app intelligence firm Appfigures.
On Wednesday, Neon was spotted in the No. 2 position on the iPhone’s top free charts for social apps.
Neon also became the No. 7 top overall app or game earlier on Wednesday morning and became the No. 6 top app.
According to Neon’s terms of service, the company’s mobile app can capture users’ inbound and outbound phone calls. However, Neon’s marketing claims to only record your side of the call unless it’s with another Neon user.
That data is being sold to “AI companies,” Neon’s terms of service state, “for the purpose of developing, training, testing, and improving machine learning models, artificial intelligence tools and systems, and related technologies.”
The fact that such an app exists and is permitted on the app stores is an indication of how far AI has encroached into users’ lives and areas once thought of as private. Its high ranking within the Apple App Store, meanwhile, is proof that there is now some subsection of the market seemingly willing to exchange their privacy for pennies, regardless of the larger cost to themselves or society.
Despite what Neon’s privacy policy says, its terms include a very broad license to its user data, where Neon grants itself a:
…worldwide, exclusive, irrevocable, transferable, royalty-free, fully paid right and license (with the right to sublicense through multiple tiers) to sell, use, host, store, transfer, publicly display, publicly perform (including by means of a digital audio transmission), communicate to the public, reproduce, modify for the purpose of formatting for display, create derivative works as authorized in these Terms, and distribute your Recordings, in whole or in part, in any media formats and through any media channels, in each instance whether now known or hereafter developed.
That leaves plenty of wiggle room for Neon to do more with users’ data than it claims.
The terms also include an extensive section on beta features, which have no warranty and may have all sorts of issues and bugs.
Though Neon’s app raises many red flags, it may be technically legal.
“Recording only one side of the phone call is aimed at avoiding wiretap laws,” Jennifer Daniels, a partner with the law firm Blank Rome‘s Privacy, Security & Data Protection Group, tells TechCrunch.
“Under [the] laws of many states, you have to have consent from both parties to a conversation in order to record it … It’s an interesting approach,” says Daniels.
Peter Jackson, cybersecurity and privacy attorney at Greenberg Glusker, agreed — and tells TechCrunch that the language around “one-sided transcripts” sounds like it could be a backdoor way of saying that Neon records users’ calls in their entirety but may just remove what the other party said from the final transcript.
In addition, the legal experts pointed to concerns about how anonymized the data may really be.
Neon claims it removes users’ names, emails, and phone numbers before selling data to AI companies. But the company doesn’t say how AI partners or others it sells to could use that data. Voice data could be used to make fake calls that sound like they’re coming from you, or AI companies could use your voice to make their own AI voices.
“Once your voice is over there, it can be used for fraud,” says Jackson. “Now this company has your phone number and essentially enough information — they have recordings of your voice, which could be used to create an impersonation of you and do all sorts of fraud.”
Even if the company itself is trustworthy, Neon doesn’t disclose who its trusted partners are or what those entities are allowed to do with users’ data further down the road. Neon is also subject to potential data breaches, as any company with valuable data may be.
In a brief test by TechCrunch, Neon did not offer any indication that it was recording the user’s call, nor did it warn the call recipient. The app worked like any other voice-over-IP app, and the caller ID displayed the inbound phone number, as usual. (We’ll leave it to security researchers to attempt to verify the app’s other claims.)
Neon founder Alex Kiam didn’t return a request for comment.
Kiam, who is identified only as “Alex” on the company website, operates Neon from a New York apartment, a business filing shows.
A LinkedIn post indicates Kiam raised money from Upfront Ventures a few months ago for his startup, but the investor didn’t respond to an inquiry from TechCrunch as of the time of writing.
Has AI desensitized users to privacy concerns?
There was a time when companies looking to profit from data collection through mobile apps handled this type of thing on the sly.
When it was revealed in 2019 that Facebook was paying teens to install an app that spies on them, it was a scandal. The following year, headlines buzzed again when it was discovered that app store analytics providers operated dozens of seemingly innocuous apps to collect usage data about the mobile app ecosystem. There are regular warnings to be wary of VPN apps, which often aren’t as private as they claim. There are even government reports detailing how agencies regularly purchase personal data that’s “commercially available” on the market.
Now AI agents regularly join meetings to take notes, and always-on AI devices are on the market. But at least in those cases, everyone is consenting to a recording, Daniels tells TechCrunch.
In light of this widespread usage and sale of personal data, there are likely now those cynical enough to think that if their data is being sold anyway, they may as well profit from it.
Unfortunately, they may be sharing more information than they realize and putting others’ privacy at risk when they do.
“There is a tremendous desire on the part of, certainly, knowledge workers — and frankly, everybody — to make it as easy as possible to do your job,” says Jackson. “And some of these productivity tools do that at the expense of, obviously, your privacy, but also, increasingly, the privacy of those with whom you are interacting on a day-to-day basis.”

