
"As generative AI seeps into virtually every aspect of our daily lives through jobs, entertainment, and even food, you gotta wonder: is anyone not on board with the AI takeover? Apparently not. Former McKinsey analyst turned Dartmouth University professor Scott Anthony told Fortune that one of the feelings he's seeing more and more among college students isn't excitement for the AI future, but utter terror."
"The Dartmouth prof contrasted his student's anxieties to those of his fellow tenured professors, who are typically eager to try out the latest LLM software. It's not hard to see why this is the case - with a cushy gig at one of the nation's elite universities, Dartmouth faculty are free from the economic horror story that is the AI boom. For students entering today's job market, the future looks far less secure."
"One headline-grabbing study from MIT earlier this summer split participants into three groups to complete tasks like writing essays: one that used LLMs, one that used common search engines, and one "brain-only" group. Compared to the other groups, the researchers found that the LLM group had an easier time writing their essays, though this ease "came at a cognitive cost, diminishing users' inclination to critically evaluate the LLM's output or 'opinions.'""
College students increasingly fear using large language models, worrying they will lose critical thinking skills and aspects of their humanity. Tenured professors at elite universities tend to embrace the latest LLM software, often insulated from economic insecurity by secure employment. Many students face a far less secure job market and fear AI-driven displacement. A recent MIT study split participants into groups using LLMs, search engines, or no tools, and found LLM users completed essays more easily. That ease came with a cognitive cost: diminished inclination to critically evaluate the model's outputs or opinions. Those trends amplify student anxiety during a messy technological transition.
Read at Futurism