Many colleges and universities have made cuts in these programs, often bolstering STEM programs at their expense. It's a situation that has sparked no small number of impassioned editorials. The headline of a recent Guardian article by Alice Speri referenced an "existential crisis at U.S. universities," and Speri's reporting features numerous examples of undergraduate and graduate programs facing cuts or outright elimination.
The term "conspiracy theory" calls to mind a variety of dubious claims and controversies, like rumors about Area 51, claims that the Earth is flat, and the movement known as QAnon. At first blush, these phenomena would seem to have little in common with bogus word origins. But there are a variety of false etymologies that spread virally and refuse to go away, in much the same way that stories about chemtrails, black helicopters, and UFOs refuse to die.
Take the surprise some have expressed in recent years upon finding out that the expression to "picture" something in one's head isn't just a figure of speech. You mean that people "picturing an apple," say, haven't been just thinking about an apple, but actually seeing one in their heads? The inability to do that has a name: aphantasia, from the Greek word phantasia, "image," and the prefix a-, "without."
Peter Drucker saw this symbiosis first. He realized that the new industrial order would depend on a worker who produced ideas instead of widgets. The knowledge worker became the engine of prosperity, and management became the social technology that synchronized millions of minds. The modern firm was as much an invention as the transistor it depended on. Three decades later, Tom Peters caught the next wave.
You know that sinking feeling when you realize you've been using a phrase that makes you sound less intelligent than you actually are? I had one of those moments a few years back during a pitch meeting for my startup. I was presenting to potential investors, and I kept saying "I think" before every point I made. "I think our user acquisition strategy will work."
Anticolonialism, Ontology, and Semiotics draws upon Africana anticolonial philosophy, especially the work of Frantz Fanon and two of his most influential interpreters, Eldridge Cleaver and Sylvia Wynter, to develop a basic analytical model for doing anticolonial political theory. I wanted to show that there is something distinctive, something special, to be found in this tradition of thought that has not been fully appreciated by philosophers and theorists in other fields.
In 2024, I made a vow never to base my art criticism on wall labels. My decision came after reading reactions to that year's Whitney Biennial. "If every label in 'Even Better Than the Real Thing,' the 81st installment of the Whitney Biennial, were peeled off the walls and tossed into the Hudson, what would happen?" asked Jackson Arn in The New Yorker. (He went on to suggest that the overall show would have been much better.)
For the first time, speech has been decoupled from consequence. We now live alongside AI systems that converse knowledgeably and persuasively (deploying claims about the world, explanations, advice, encouragement, apologies, and promises) while bearing no vulnerability for what they say. Millions of people already rely on chatbots powered by large language models, and have integrated these synthetic interlocutors into their personal and professional lives. An LLM's words shape our beliefs, decisions, and actions, yet no speaker stands behind them.
I've interviewed over 200 people for articles, from startup founders to burned-out middle managers, and I've discovered something fascinating: intellectual depth isn't about fancy degrees or knowing obscure facts. It shows up in how we communicate. When certain habits dominate someone's style, they reveal a concerning lack of curiosity and critical thinking that goes beyond mere annoyance: it fundamentally limits their ability to engage with the world meaningfully.