
Signal’s Meredith Whittaker: AI is fundamentally ‘a surveillance technology’

by WeeklyAINews

Why is it that so many companies that depend on monetizing their users’ data seem to be so hot on AI? If you ask Signal president Meredith Whittaker (and I did), she’ll tell you it’s simply because “AI is a surveillance technology.”

Onstage at TechCrunch Disrupt 2023, Whittaker explained her view that AI is largely inseparable from the big data and targeting industry perpetuated by the likes of Google and Meta, as well as less consumer-focused but equally prominent enterprise and defense companies. (Her remarks lightly edited for clarity.)

“It requires the surveillance business model; it’s an exacerbation of what we’ve seen since the late ’90s and the development of surveillance advertising. AI is a way, I think, to entrench and expand the surveillance business model,” she said. “The Venn diagram is a circle.”

“And the use of AI is also surveillant, right?” she continued. “You know, you walk past a facial recognition camera that’s instrumented with pseudo-scientific emotion recognition, and it produces data about you, right or wrong, that says ‘you’re happy, you’re sad, you have a bad character, you’re a liar, whatever.’ These are ultimately surveillance systems that are being marketed to those who have power over us generally: our employers, governments, border control, etc., to make determinations and predictions that will shape our access to resources and opportunities.”

Ironically, she pointed out, the data underlying these systems is frequently organized and annotated (a necessary step in assembling AI datasets) by the very workers at whom the systems may be aimed.


“There’s no way to make these systems without human labor at the level of informing the ground truth of the data — reinforcement learning with human feedback, which again is just kind of tech-washing precarious human labor. It’s thousands and thousands of workers paid very little, though en masse it’s very expensive, and there’s no other way to create these systems, full stop,” she explained. “In some ways what we’re seeing is a kind of Wizard of Oz phenomenon: when we pull back the curtain, there’s not that much that’s intelligent.”

Not all AI and machine learning systems are equally exploitative, though. When I asked whether Signal uses any AI tools or processes in its app or development work, she confirmed that the app has a “small on-device model that we didn’t develop, we use it off the shelf, as part of the face blur feature in our media editing toolset. It’s not actually that good… but it helps detect faces in crowd photos and blur them, so that when you share them on social media you’re not revealing people’s intimate biometric data to, say, Clearview.”

“But here’s the thing. Like… yeah, that’s a great use of AI, and doesn’t that just disabuse us of all this negativity I’ve been throwing out onstage,” she added. “Sure, if that were the only market for facial recognition… but let’s be clear. The economic incentives that drive the very expensive process of developing and deploying facial recognition technology would never let that be the only use.”
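The face blur feature Whittaker describes follows a familiar pattern: run an on-device face detector over the photo, then blur each detected region before the image ever leaves the phone. The snippet below is a minimal sketch of that general technique using OpenCV’s bundled Haar cascade in Python; it is not Signal’s model or code, and the file names are placeholders.

```python
import cv2

# Load OpenCV's bundled Haar cascade for frontal-face detection (runs entirely on-device).
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

def blur_faces(input_path: str, output_path: str) -> None:
    """Detect faces in an image and blur each detected region before sharing."""
    image = cv2.imread(input_path)  # hypothetical local file, e.g. a crowd photo
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = image[y:y + h, x:x + w]
        # Heavy Gaussian blur over the face region; kernel size must be odd.
        image[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    cv2.imwrite(output_path, image)

blur_faces("crowd.jpg", "crowd_blurred.jpg")
```

As with Signal’s feature, the point of doing this locally is that the unblurred biometric data never needs to reach a server at all.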


