The Artificial Intelligence (AI) boom began in the 1980s and has kept growing since. Today, AI is more integrated into people's daily lives than ever before. Be it for efficiency, optimisation, or convenience, people use AI for a wide range of tasks, from making grocery lists to writing theses. AI almost functions as a secondary brain.
AI is progressing from basic task automation to advanced, independent systems that manage a wide variety of functions. It marks a major turning point for the tech world. What seemed like a technology confined to science fiction films a few decades ago is now an accessible tool for much of the world's population.
Economic reports project that the AI market will grow from $150.2 billion in 2023 to $1,345.2 billion by 2030. AI's ability to mimic human behaviour and perform autonomous tasks in an increasingly calibrated manner is lifting productivity and reducing hiring needs across software engineering and customer service.
What began as a tool is now revolutionary infrastructure. However, can the positive side of AI outweigh its negatives? Most likely not. AI has caused, or is bound to cause, significant damage to the socio-political climate, labour market, environment, psychology, education, art, and literature.
As people become increasingly reliant on prompt-based responses, the habit of questioning, analysing, or seeking out nuanced information weakens. AI delivers simplified, jargon-free answers, often without context or critical engagement. This feeds a larger problem: the rise of anti-intellectualism, where expert knowledge is dismissed in favour of quick, emotionally charged content. When AI replaces human experts, authority shifts from evidence to convenience. The UK's Guardian warns that students relying heavily on AI are sacrificing the deep reasoning and academic rigour traditionally fostered in higher education.
AI-based platforms prioritise engagement over accuracy, making users more susceptible to disinformation and ideological manipulation. In such an environment, critical thinking is not only devalued but actively discouraged. This vulnerability creates fertile ground for fascist ideals; historically, anti-intellectualism has paved the way to fascism by prioritising myth and ideology over expert insight. Without rigorous education and research and an emphasis on critical thinking, AI may become a tool of mass indoctrination and cultural regression.
The growing trend of using AI as a therapist poses serious psychological risks. Many users turn to chatbots for "therapy" even though AI is not trained to deliver structured, evidence-based care. Instead, it provides emotional validation, reinforcing biases and unhealthy thought patterns rather than challenging them. This offers short-term emotional gratification while neglecting the complex work required for real psychological healing.
Human emotions and cognition are deeply complex, each uniquely shaped by trauma, cultural context, body language, subconscious cues, and nonverbal interactions that no AI, so far, can fully grasp or respond to. A trained human therapist brings not only expertise but also empathy, moral judgement, and the ability to adapt to emotional changes in real time, unlike AI models.
Furthermore, studies show users becoming emotionally attached to AI to an alarming degree, even naming it and treating it as a friend, which can delay or replace essential human connection. Experts warn that such parasocial engagement can erode genuine social skills and increase isolation.
Meanwhile, a recent MIT Media Lab study tracked 54 participants, split into three groups: AI-assisted, search engine–assisted, and brain-only. They wrote SAT-style essays across multiple sessions. EEG data showed that the AI group exhibited the lowest brain activity, with an estimated 55 per cent decrease in neural connectivity compared to the brain-only group. Eighty per cent of AI users also could not quote from their earlier essays, signalling poor memory retention.
These participants also demonstrated significantly lower alpha- and beta-wave engagement (brain rhythms associated with creativity and focus), indicating deep cognitive under-engagement. Even when AI users returned to writing without support, their brain connectivity remained suppressed. Researchers termed this phenomenon "cognitive debt": a lasting mental dulling from over-reliance on AI.
When it comes to art, literature, and creation, it has taken humans centuries to reach where we are today. It is a lovely process of trial and error, self-expression, and vocalising ideas in creative ways. To quote Dead Poets Society, "We read and write poetry because we are members of the human race, and the human race is filled with passion." That passion is something AI cannot replicate, at least not yet.
AI's rise in art and literature has troubling effects on creativity and authenticity. Studies show that AI-generated art often leads to creative fixation, where artists rely on existing patterns, reducing originality. In the literary world, critics like Emily Bender dismiss LLMs as "plagiarism machines" and "stochastic parrots", arguing that these systems lack genuine understanding and threaten to marginalise true literary craft. It is also worth noting that students have complained about having to "dumb down" their writing so that it is not accused of being AI-generated. Artists and authors report real-world consequences: traditional forms are undervalued, job opportunities shrink, and low-quality "AI slop" floods platforms.
The toll AI takes on the environment, particularly through water consumption, is substantial. According to a study by the University of California, generating a single 100-word AI response can consume over 500 millilitres of water because of the immense cooling demands of data centres. More broadly, the data centres powering AI models can consume up to 18,927,000 litres of water daily, as much as an entire town of 10,000–50,000 people. In 2022, Microsoft's water usage jumped by 34 per cent and Google's by 22 per cent, mainly due to AI infrastructure. This strains local ecosystems: many AI data centres are located in regions with limited water resources, such as the U.S. Southwest, where they compete with agriculture and residential needs.
(Thapaliya is an intern at The Rising Nepal.)