US startup advertises 'AI bully' role to test patience of leading chatbots
"You'll spend a full eight-hour day interacting with leading AI chatbots and your only job is to be brutally honest about how frustrating they are. The job requires no computer science degree or specialised AI skills. The only prerequisite is having an extensive personal history of being let down by technology and the patience to ask the same question over and over again."
"People constantly have to repeat themselves to chatbots. We wanted to turn that every day frustration into something visible. The role reads almost like a stress test for human temperament as much as machine intelligence: candidates are expected to keep the conversation going, revisit earlier topics and gently force the AI to admit when it has lost track."
"All the AI lives and breathes on memory. It's the holy grail. But the AI memory solutions that were in the market in 2024, when we started our business, were unreliable—meaning they would lose context and start hallucinating."
Memvid, a California startup, created an unusual job position called 'AI bully' that pays $800 for an eight-hour shift testing artificial intelligence chatbots. Workers interact with leading AI systems with the sole objective of being brutally honest about their frustrations and limitations. The role requires no specialized technical skills, only patience and experience with technology failures. Workers repeatedly ask questions, revisit topics, and push AI systems to acknowledge when they lose context. This conversation-driven approach serves as a stress test for human temperament as much as machine intelligence. The company designed the role to highlight a critical, persistent problem: AI chatbots losing context over time and hallucinating information. According to CEO Mohamed Omar, memory is the foundation of AI functionality, yet the memory solutions on the market in 2024, when the company was founded, proved unreliable, causing systems to lose context and generate false information.
Read at www.theguardian.com