
"I woke up in a cold sweat. In my nightmare, I was being chased by tens of thousands of people, all of whom were enraged because I destroyed their privacy. They were all holding laptops over their heads, swinging them like clubs intended for my head. They say nightmares reflect whatever your subconscious is trying to tell you. Given the work I was planning to start in the morning, I knew exactly what my dark night brain was trying to say."
"Also: I retested GPT-5's coding skills using OpenAI's guidance - and now I trust it even less It was saying, "Stop!" Don't do it." My inner knowing was screaming at the top of its lungs, "Don't let the AI code for you." This, believe it or not, is not hyperbole. I was getting ready to start a coding project where I was planning on using an AI for help."
A developer planned a deep architectural change to mission-critical code that provides access security and privacy for over 20,000 sites. The planned change raised severe privacy and security risks if flawed code shipped. AI assistance for coding core infrastructure carries the potential to break protected sites or expose private content to the public internet. Users rely on the code to create restricted-access sites for family, schools, clients, and developers locking down in-progress projects. AI may be useful for noncore features, but core infrastructure and privacy-sensitive systems require human-led design, review, and validation.