
"If free AI assistants wore shoes, this would be the other one dropping: Meta could soon use almost anything you say to its chatbot to try to sell you stuff more effectively on its platforms, the company announced yesterday. For example, asking the purple and blue circle in your Instagram DMs to recommend nearby nature trails could lead to more ads for hiking boots, as well as creator content about mountains."
"They could appear in your Facebook feed, too, if your accounts are linked. Voice recordings by Ray-Ban Meta glasses will also factor into ad targeting. There's no opt-out option, other than ignoring the chatbot, but some things remain sacred. Meta said it won't use conversations about your religion, political views, sexual orientation, health, or race and ethnicity to suggest ads. And chats from before Dec. 16, when the policy takes effect, will also be excluded."
Meta will use nearly anything users type or say to its chatbot to tailor ads and creator recommendations on Instagram and, if accounts are linked, Facebook. Voice recordings captured by Ray-Ban Meta glasses will also feed into targeting. There is no opt-out beyond avoiding the chatbot, though conversations about religion, political views, sexual orientation, health, race, and ethnicity will be excluded, as will chats from before Dec. 16, when the policy takes effect. Other AI companies are similarly moving to monetize free chatbots through integrated shopping and ad placements.
Read at Morning Brew