Flock Uses Overseas Gig Workers to Build Its Surveillance AI

Source: https://www.wired.com/story/flock-uses-overseas-gig-workers-to-build-its-surveillance-ai/
Summary:
Flock, a US AI surveillance company, has been revealed to use overseas contract labor to train its artificial intelligence algorithms, raising public concern about the security and privacy of surveillance data.
According to an investigation by the tech outlet 404 Media, Flock hires overseas workers through the freelance platform Upwork to manually annotate surveillance footage from US communities, in order to train its license plate recognition and behavior analysis algorithms. Exposed internal material shows that these contractors are asked to categorize vehicle makes and colors, transcribe license plates, and even label audio events such as "gunshot" and "car wreck." Data from the work dashboard indicates that some annotators are located in the Philippines.
As a supplier of automatic license plate reader systems deployed widely across the US, Flock has installed cameras in thousands of communities; its system continuously scans vehicle characteristics and enables nationwide tracking of a vehicle's movements. Law enforcement uses the system daily to investigate cases and shares data with Immigration and Customs Enforcement (ICE). Because authorities often pull this data without a warrant, the American Civil Liberties Union and the Electronic Frontier Foundation have sued a city blanketed with nearly 500 Flock cameras.
The company's patent filings show its technology can identify clothing characteristics and even "race," and Flock has recently advertised an upcoming "screaming" detection feature, further alarming privacy advocates. Although the exposed worker guides contain vehicle images from New York, Michigan, and other states, as well as an advertisement for an Atlanta law firm, Flock has not addressed where the footage used to train its AI comes from.
After media inquiries, the previously public AI training dashboard was taken down, and Flock ultimately declined to comment. The incident renews questions about the sensitivity of US residents' surveillance data being processed by overseas third parties.
Original English article:
Flock, the automatic license plate reader and AI-powered camera company, uses overseas workers hired through Upwork to train its machine learning algorithms, with training material telling workers how to review and categorize footage that includes images of people and vehicles in the United States, according to material reviewed by 404 Media that was accidentally exposed by the company.
The findings bring up questions about who exactly has access to footage collected by Flock surveillance cameras and where the people reviewing that footage may be based. Flock has become a pervasive technology in the US, with cameras present in thousands of communities; cops use the system every day to investigate things like carjackings. Local police have also performed numerous lookups for ICE in the system.
Companies that use AI or machine learning regularly turn to overseas workers to train their algorithms, often because the labor is cheaper than hiring domestically. But the nature of Flock’s business—creating a surveillance system that constantly monitors US residents’ movements—means that footage might be more sensitive than other AI training jobs.
Flock’s cameras continuously scan the license plate, color, brand, and model of all vehicles that drive by. Law enforcement agencies are then able to search cameras nationwide to see where else a vehicle has driven. Authorities typically dig through this data without a warrant, leading the American Civil Liberties Union and Electronic Frontier Foundation to recently sue a city blanketed in nearly 500 Flock cameras.
Broadly, Flock uses AI or machine learning to automatically detect license plates, vehicles, and people, including what clothes they are wearing, from camera footage. A Flock patent also mentions cameras detecting “race.”
Multiple tipsters pointed 404 Media to an exposed online panel which showed various metrics associated with Flock’s AI training.
It included figures on “annotations completed” and “annotator tasks remaining in queue,” with annotations being the notes workers add to reviewed footage to help train AI algorithms. Tasks include categorizing vehicle makes, colors, and types, transcribing license plates, and “audio tasks.” Flock recently started advertising a feature that will detect “screaming.” The panel showed workers sometimes completed thousands upon thousands of annotations over two-day periods.
The exposed panel included a list of people tasked with annotating Flock’s footage. Taking those names, 404 Media found some were located in the Philippines, according to their LinkedIn and other online profiles.
Many of these people were employed through Upwork, according to the exposed material. Upwork is a gig and freelance work platform where companies can hire designers and writers or pay for “AI services,” according to Upwork’s website.
The tipsters also pointed to several publicly available Flock presentations which explained in more detail how workers were to categorize the footage. It is not clear what specific camera footage Flock’s AI workers are reviewing. But screenshots included in the worker guides show numerous images from vehicles with US plates, including in New York, Michigan, Florida, New Jersey, and California. Other images include road signs clearly showing the footage is taken from inside the US, and one image contains an advertisement for a specific law firm in Atlanta.
One slide about audio told workers to “listen to the audio all the way through,” then select from a drop-down menu including “car wreck,” “gunshot,” and “reckless driving.” Another slide says tire screeching might be associated with someone “doing donuts,” and another says that because it can be hard to distinguish between an adult and a child screaming, workers should use a second drop-down menu explaining their confidence in what they heard, with options like “certain” and “uncertain.”
Another slide deck explains that workers should not label people inside cars but should label those riding motorcycles or walking.
After 404 Media contacted Flock for comment, the exposed panel was taken offline. Flock then declined to comment.