By Birgitta Dresp-Langley

Digital technology and Artificial Intelligence (AI) pose threats to the human right to freedom of thought [1], identified in terms of threats to the right not to reveal one's thoughts, not to be penalized for one's thoughts, and not to have one's thoughts manipulated. To lose freedom of thought is to lose dignity and psychological integrity at the individual level, and social fairness and trust at the level of society. The rights to freedom of thought and dignity [2] are protected by international human rights law, with the clear understanding that what the latter aims to secure is mental autonomy [1, 2]. In the digital society at the current technological scale, an individual's mental autonomy, in terms of agency over their thoughts, actions, and decisions, is compromised by factors relating to access to information, domain knowledge and expertise, and the individual's understanding of and access to the different types of digital technology and AI currently deployed. As a consequence, digital policymakers have to entirely reconsider what individual citizens can be aware of and act upon. In 2021 [3], the World Health Organization Mental Health Atlas formally recognized new forms of addiction to digital technology (connected devices) as a worldwide problem, where excessive online activity and internet use lead to an inability to manage time, energy, and attention during the daytime and produce disturbed sleep patterns or insomnia at night. This problem was subsequently reported to have increased in magnitude worldwide during the COVID-19 pandemic [4]. Both the Digital Services Act [5] and the AI Act [6] of the European Commission aim at a unified regulatory framework to protect individuals, in particular youth and other vulnerable populations, from mental manipulation threats and mental health risks posed by digital technology and AI. However, clear definitions of mental manipulation, or of the mental health risks that would have to be mitigated by effective regulation, are not stipulated. This quicksand of conceptual vagueness in the different legal texts pertaining to digital technology and mental health is due to 1) uncertainty about a general definition of what is to be understood by mental health, and 2) a complete lack of understanding of the psychological mechanisms at play. This narrative review follows the Stanford University guidelines [7] and is aimed at providing a conceptual background for further investigation. It reviews phenomena and symptoms that translate the impact of digital technology on mental health, the psychological mechanisms at play, and a set of appropriate testing tools for further quantification in clinical and experimental research.

In the history of mass media communication, digitalization and AI have brought about technology for mental manipulation [8-11] well beyond what was previously possible. In the 20th century, communication in human society changed quantitatively and qualitatively with the invention of radio [12-14] and its use for advertising and political propaganda beyond mere public information. Our belief systems are intrinsically biased by our personal history and experience, and on the basis of such bias we tend to believe in the validity of a certain type of information rather than another [15]. The strongest reinforcement of false beliefs is, purely and simply, the mere repetition of false facts and figures at the largest possible scale [16].
Repeating a false claim over and over again makes it believable, through psychological mechanisms also at play in the incantations of tribal rituals [17]. Television [18-20] and, ultimately, the internet and the mobile phone, with the explosion of social networks [21] and chatbots driven by various forms of AI, have progressively given birth to an entirely new breed of technology, ideally suited for behavioural monitoring and manipulation at the largest possible scale [22]. Intrinsically healthy real-world stimuli and sensations through physical contact with others and with material reality, which have a measurably positive effect on our psychological well-being [23, 24], have been replaced by a whole universe of remote virtual social reality [25] and deceptive online interactions.

Dark patterns in mobile phone applications

Dark patterns were first identified in connection with persuasive technology tools aimed at influencing user behaviour in mobile phone applications. A first attempt at categorization was proposed as early as 2010 [26] to shed light on evil practices used in the design of mobile phone applications, where dark patterns have been predominant for some time. The presence of dark patterns in the UX design of digital platforms, websites, and mobile applications has been identified in 95% of all platforms and applications linked to commercial ventures [27]. A dark pattern is defined as "a user interface that has been carefully crafted to trick users into doing things that they usually might not do" [26]. Dark patterns rely on a solid understanding of unconscious mechanisms of psychological conditioning, and never have the user's interest in mind. For example, companies display long lists of privacy policies in mobile applications, so that consumers involuntarily accept to share more data than they intended to. Aggressive commercial dark patterns involve practices that force the consumer to proceed to certain steps, make decisions, or perform certain tasks without freedom of choice, while mild dark patterns leave the option to opt out of the process. Surprisingly, it has been found that aggressive dark patterns are much better accepted by users than mild ones [28, 29], presumably because aggressive ones more strongly display what may be called digital authority. Digital authority, combined with the novelty effect of a given technology application or AI product, paradoxically helps mask the intrinsically evil intent behind them [30, 31], which is akin to a psychological phenomenon known as the Milgram Effect [32]. Dark patterns enable AI to deceptively lead consumers and users to "do what they are told to" willingly and uncritically. Mild dark patterns are present in almost all commercially linked internet and mobile applications. Individuals of all ages worldwide have already become used to being manipulated by them, a phenomenon referred to as dark pattern blindness or blindness to malicious internet design [27], akin to the well-known phenomenon of change blindness in psychology [33]. Normative perspectives for analyzing dark patterns in terms of their potentially detrimental effects on consumers and society have been proposed by the OECD [34, 35], without any deeper scrutiny of their consequences for the psychological well-being of individuals and society. Clear working hypotheses are urgently needed here for carving out appropriate regulatory measures.
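The consent-flow mechanism described above can be made concrete with a short sketch. The TypeScript fragment below is a hypothetical illustration only: the option labels, the pre-checked defaults, and the notion of a "buried" opt-out are assumptions chosen to mirror the mild dark pattern discussed in the text, not taken from any cited study or real application.

```typescript
// Minimal sketch of a "mild" dark pattern in a consent dialog (hypothetical example).
interface ConsentOption {
  id: string;
  label: string;
  preChecked: boolean; // pre-selection nudges acceptance by inaction
  buried: boolean;     // opting out requires extra, hard-to-find steps
}

// Dark-patterned dialog: data sharing is on by default and its opt-out is buried,
// so a hurried user who clicks "Accept" consents without an active choice.
const darkPatternedDialog: ConsentOption[] = [
  { id: "essential", label: "Essential cookies", preChecked: true, buried: false },
  { id: "profiling", label: "Share my behaviour with ad partners", preChecked: true, buried: true },
];

// Neutral design: nothing is pre-selected and all choices are equally visible.
const neutralDialog: ConsentOption[] = darkPatternedDialog.map((o) => ({
  ...o,
  preChecked: false,
  buried: false,
}));

// What a user who accepts the defaults without reading ends up consenting to.
function acceptedByDefault(dialog: ConsentOption[]): string[] {
  return dialog.filter((o) => o.preChecked).map((o) => o.label);
}

console.log(acceptedByDefault(darkPatternedDialog)); // both options, including profiling
console.log(acceptedByDefault(neutralDialog));       // [] — consent requires an active choice
```

The contrast makes the manipulative mechanism explicit: in the dark-patterned variant, user inaction counts as consent, whereas in the neutral variant only an active choice does.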
Also, there is no universally agreed definition of the dark pattern concept, apart from the understanding that dark patterns lead users into making decisions they would not have made if fully informed and capable of selecting alternatives, or given the choice of opting out [36]. In short, by their manipulative nature, dark patterns online limit individual freedom of choice and the conscious awareness that is necessary "to say no" to a process, a procedure, or a product. The General Data Protection Regulation (GDPR) of the EU [37], which encompasses the demand for clear information, transparency, and control of all data, is violated by dark patterns in digital technology and AI. The motivations and intentions detectable behind a digital design pattern or AI determine whether it contains dark patterns or not. Mobile phone applications use dark patterns to achieve business goals in a virtual space where people spend most of their time. With the wider use of online games and AI-driven interactive platforms and social media [30, 31], we have entered a new universe of human-computer interaction that is manipulative by nature, because the aim of online platforms and social media is to keep people engaged. Dark patterns therein serve the purpose of manipulating users to act in a certain way [34, 35, 36] that is often contrary to their interests, in time alter their psychological ability to perceive and decide cogently and, ultimately, may engender behavioural changes that negatively affect individuals [36] and society as a whole. The design of online gaming and interactive platforms unilaterally benefits the interests of their creators. Deceptive interaction as a general dark pattern, omnipresent in social media platform design and impossible to single out at the technical level, has potentiated mental manipulation at the largest possible scale. The Digital Services Act (DSA) of the European Commission [38] recognizes the dangers of dark patterns in digital technology and AI to the safety and well-being of human individuals and society. The necessity for their uniform legal ban across Europe is recognized further in the Artificial Intelligence Act [39]. Therein, subliminal behavioural priming by dark patterns in AI is identified as "high risk" technology to be banned legally in the future as soon as scientific insight into the mechanisms at play is available.

Deceptive design in AI-driven social media

The convergence of human intelligence and AI has given rise to a deceptive space of relational interactions between humans and AI. The humanwashing of AI-enabled machines [40], as a specific form of anthropomorphism, represents a deceptive way of using AI-enabled systems and machines, misleading organizational stakeholders and the broader public about the true capabilities such machines possess. This has enabled the deterioration of social structures and communication processes with regard to consequences, accountability, and liability [41]. Widespread cybercrime and cyber-insecurity were listed among the top ten global risks to society in the next two years and beyond by the World Economic Forum in 2024 [42]. AI threatens democratic aspects of trust formation, and it biases and destroys critical social processes relating to distrust and its consequences for individuals and society [41, 42, 43, 44, 45].
Beyond the spread of false information, the rise of AI-manipulated multimedia, and a massive presence of AI-powered automated accounts and various forms of harmful content, the perils we encounter when exposed to such an online ecosystem have been allowed to proliferate beyond controllable levels [45]. This has produced a novel, particularly evil, breed of dark pattern: deceptive interaction [46, 47]. We may define deceptive interactions enabled by AI-powered platforms in terms of online contacts and interactive feedback that the user cannot control, either because the contact per se is fake and therefore deceptive [46], or because the interaction deceives by being coercive, deceitful, diminishing, hostile, discriminating, and manipulative [47]. Deceptive interaction (Figure 1) makes humans believe they are interacting with another human when they are not, or allows other human platform members to create user profiles anonymously, permitting them to lie to and/or abuse others without penalty. Repeated exposure to algorithm-driven deceptive interactions engenders mental and behavioural modifications, especially in the young [48]. This may have irreversible consequences on an individual's perception, understanding, reasoning, thoughts, emotions, trust, and overall psychological well-being at the individual level and at the level of their social interactions with others in the real world (Figure 1).

Figure 1. Deceptive interaction in AI-powered social media as the most destructive form of dark pattern, directly manipulating human mental autonomy. As a consequence, the psychological well-being of individuals and society is jeopardized.

Moreover, public health is threatened by deceptive AI, given the large amount of inaccurate information that circulates online within and beyond the formal health care system [49]. AI and the internet have changed people's engagement with health information, be it through search, user-generated content, mobile apps, or social media platforms, in a way that could negatively affect health outcomes with regard to quality of life and risk of mortality. This, ultimately, threatens public trust in health care institutions [49, 50]. AI-driven platforms that enable fake news campaigns and online hate speech have fuelled the social phenomenon of polarization, characterized by the fragmentation of society into antagonistic factions with strictly opposed values that impede cooperative processes in the pursuit of the common good [51]. The threats represented by polarization have already materialized, and democracy is globally under siege [52, 53]. Polarization has impeded pandemic response [54] and consensus on critical global issues such as climate change [55], and has weakened the resilience of our society to adversity. From a positive perspective, social media have enabled activist groups and minorities to become more visible and get their messages across to the wider public, highlight inequalities, denounce injustice, and advance discourse on social justice at a larger scale. The negative side, however, is that social media are being used by extremists, political parties, and politically motivated players to demonize specific groups and discriminate against already marginalized populations to achieve specific, non-noble ends that destroy cohesion in society [56, 57].
Deceptive interaction via AI-driven digital platforms increases violence in society by promoting hate speech, loss of empathy for others, criminality, domestic violence, and terrorism, with a measurable impact on national statistics of domestic terrorist attacks [58, 59].

Figure 2. Digital authority as a source of phenomena of unwarranted obedience (Milgram Effect), de-individuation, online conformity, fear of missing out, and polarization.

AI-driven manipulative disinformation via social media also effectively conveys false claims of endowment with technological or financial resources to national and foreign governments and political parties [60]. The large-scale nature of online deception is one of the conditions of its success in a new social universe where digital authority engenders a clearly identifiable set of negative consequences for human individual and collective mindsets and behaviour (Figure 2).

Impact of digital technology on mental health

The impact of digital technology on mental health and well-being encompasses a wide range of adverse effects, symptoms, and conditions, from social withdrawal in problematic online use to full-blown digital addiction [61, 62, 63, 64], chronic stress [65, 66], anxiety [67, 68], depression [69, 70], anhedonia [71, 72, 73], and suicidal ideation [74, 75, 76]. Adolescents and young adults in particular are targeted by manipulative online content [77], and online stereotypes and image models imposed by digital technology cause severe psychological harm, especially in youth [78]. Fear of missing out [79] and an overall lack of awareness [80] are factors that contribute to the victimization of youth by digital technology and social media. Thus, digital technology has produced a novel form of adversity we may call digital adversity (Figure 3).

Figure 3. Digital adversity as a source of altered mental health in terms of reward deficiency syndrome linked to internet addiction, sleep disorders, chronic stress, and depression, leading in extreme cases to anhedonia and suicidal ideation.

Exposure to a certain type of image material in social media can have detrimental effects on the self-esteem and body image of adolescents, in particular girls [81]. According to a study by the World Health Organization [82], girls' psychological and mental health are particularly affected by the consequences of online deception, with adolescent girls twice as likely as boys to report feeling lonely, and more likely to suffer from eating disorders. Overexposure to algorithm-driven online content also affects the ability to learn. While digital technology has the potential to improve access to educational content and enhance the learning experience, children's social lives increasingly focus on social media, which affects their concentration and learning ability in school. Studies on young people across countries [83, 84] highlight the association between the use of social media and body image concerns, eating disorders, and generally poor mental health. Addictive design features [85] and excessive screen time and blue light exposure for long hours day and night pose a threat to general health, especially that of children [86]. Excessive exposure to digital content distracts children, and the young in general, from academic as well as relaxing extracurricular activities. In 2024, the WHO proposed an extended universal definition of mental health [87] as:
“a state of mental well-being that enables people to cope with the stresses of life, realize their abilities, learn well and work well, and contribute to their community. It has intrinsic and instrumental value and is integral to our well-being. Mental health conditions include mental disorders and psychosocial disabilities as well as other mental states associated with significant distress, impairment in functioning or risk of self-harm.” The document further states that, in 2019, 970 million people globally were reported to be living with a mental disorder, anxiety and depression being the most common. Mental health conditions affect all aspects of life, and account for one in six years lived with disability. People with severe mental health conditions die ten to twenty years earlier than the general population. Having a mental health condition increases the risk of suicide and of experiencing violations of human rights. In addition, the economic consequences of mental health conditions are incommensurable, with productivity losses that outstrip the cost of healthcare. In 2024, the World Health Organization [88] formally recognized that addiction to digital technology has become a worldwide threat to the mental health and well-being of individuals and groups, in particular youth. Excessive online activity and internet use lead to an inability to manage time, energy, and attention during the daytime, and to disturbed sleep patterns and insomnia at night. The psychological mechanisms underlying the many ways in which human mental autonomy has been hijacked and mental health put into jeopardy or compromised by digital and AI-driven technology are not known, but it is possible, in light of what has been reviewed above, to suggest likely candidate mechanisms.

Psychological phenomena and mechanisms

Mental manipulation by digital technology relies on universal mechanisms that are mostly non-conscious and grounded in life-long psychological conditioning to submit to authority, including digital authority.

Obedience to digital authority and fear of missing out

Individuals or groups believed to have authority can effectively assert it over individuals or crowds. Milgram's [89] famous experiments on blind obedience to fake authority clearly showed that belief is what matters, not whether the authority is authentic in terms of competence (such as the authority of a certified expert), in terms of eligibility (such as the authority of the individual with the highest qualifications in a group), or in terms of law (such as the authority of a police officer or a director in a clearly established corporate hierarchy). Individuals, groups, and media perceived by others to have authority acquire the power to influence these others positively or negatively. When the perceived authority is in close proximity, its ability to exert influence is maximized [90, 91]. As millions of individuals worldwide spend long hours day and night tethered to their digital devices (smartphones, tablets), digital authority is close by at all times, and has replaced the real-world authority of parents, peers, or teachers. The positive side of obedience to authority is that it keeps children safe from danger, and permits society to function under law and civic order. However, when a perceived authority is fake and groups and individuals obey such authority uncritically, without questioning, the alleged "authority" has the power to exploit and abuse them to specific ends.
In an article on the impact of the digital revolution on the human brain and behaviour [92], the author opens his introduction with a comment on E. M. Forster's short story "The Machine Stops" [93], published in 1909 in The Oxford and Cambridge Review. The piece describes a futuristic scenario in which a mysterious machine controls society and people's lives and habits in a way that evokes the internet and digital media of today. In this dystopia, all communication is remote, and face-to-face meetings no longer happen. The machine controls the individual and collective mindset and, ultimately, everybody depends on it. When the machine stops working, society collapses. Submitting to perceived authority is something that we have learnt to do from early childhood on, every day, without realising it, and it primes our desire to conform to digital technology. This has produced Fear of Missing Out (FoMO), defined as "the tendency to experience anxiety over missing out on the rewarding experiences of others" [94] in studies that have provided clear evidence for FoMO's central role in mental disorders related to digital technology use [94, 95].

De-individuation and online conformity

Obedience to authority may produce de-individuation [96, 97], a process that occurs when individuals abandon their individual motivations and mindset to take on those of a group they aspire to belong to. In extreme forms, individuals conform to the behaviour of the rest of that group to an extent where all sense of individual responsibility for decision and action is relinquished to the group, a phenomenon frequently observed in sectarian movements [98]. The human inclination to conform to group consensus may stem from an initially healthy desire to gain more information about the group [99]; however, specific negative and positive feedback processes [100], especially in an online and social media context with instant feedback, can produce and fuel an increasingly pathological need for group approval. The de-individuated desire to conform to the thoughts and actions of a group explains why individuals may be prompted by groups towards "nasty", violent, or unlawful online activities [101] that are completely out of character with how they behave in the real world outside the online group. Obedience to digital authority on the one hand, and de-individuation on the other, are mechanisms at play in the phenomenon of polarization [102] produced by social media. Current analyses identify two factors behind such polarization: the social desire to comply with a group's expectations, and concerns about the acceptance of one's own behaviour by the group.

Pathological adaptation to chronic digital stress

The ill-adapted psychological subordination to digital environments or social media and the resulting progressive de-individuation are akin to some of the consequences of post-traumatic stress disorder [103], and likely to involve a novel form of pathological adaptive learning [104, 105]. One of the most compelling characteristics of the brain is the plasticity of neural circuits and functions [106]. It underlies the ability of the brain to continuously adapt its functional and structural organization to relevant changes in stimulation and environment. However, although brain plasticity favours healthy processes of functional adaptation and adaptive learning, its mechanisms also govern maladaptive, i.e. pathological, changes in the brain. This may be illustrated by the example of stressful events.
The brain can learn from stressful events to respond in an adaptive manner in the future. While an optimal stress level leads to enhancement of memory performance, exposure to extreme, traumatic, or chronic stressors is a risk factor for the development of psychopathologies, which are associated with memory impairment and cognitive deficits [107]. The accumbal and prefrontal dopamine pathways, together with inflammatory responses in the brain and body, may play a crucial role here as the direct adaptive and maladaptive consequences of stress related to digital overexposure or addiction [108, 109]. In behavioural addictions [110], which include digital addiction [64, 109], pathological brain adaptation is known to involve complex interactions between the reward system and other functional areas. These interactions form the neurobiological correlates of the transition from the initial pleasures associated with a specific activity to the compulsive behaviour [111] and pathological craving [112] characteristic of all forms of addiction [110].

Reward Deficiency Syndrome (RDS) and digital addiction

In all addictions, including digital addiction [109, 113], the initially positive reward response of the brain ultimately reverses into a negative reward response [113] through psychological and physiological mechanisms of pathological adaptation. The initial pleasure associated with the activity or drug is gone, but the need to resort to the drug persists as a consequence of a neurobiological condition called Reward Deficiency Syndrome (RDS) [114]. RDS refers to abnormal behaviours and brain mechanisms resulting from a breakdown of a cascade of reward loops in neurotransmission, most likely due to genetic and epigenetic influences [114]. This involves a whole so-called anti-reward brain circuitry [115, 116, 117]. The important role of neurotransmitter deregulation, i.e. progressive dopamine depletion as the anti-reward networks reinforce and consolidate in the brain [114], as well as important links to elevated corticosteroid levels and chronic stress [117], have been discussed in several major review articles in association with digital addiction [64, 73, 86]. Dopamine depletion and stress-related mechanisms form the neurobiological basis of craving and relapse in addiction [73, 114, 117]. As far as environmental trigger cues are concerned, essentially three cue types involved in craving and relapse have been identified: (a) re-exposure to addictive drugs or activities, (b) stress caused by external stressors and/or withdrawal-related internal stress, and (c) re-exposure to relevant environmental contexts (situations, people, places) associated with the drug or activity [117]. Although some individuals can stop compulsive behaviour on their own before it leads to chronic addiction, genetic and non-genetic factors readily explain why most of us cannot [115, 116, 118]. In digital addiction, we keep consuming technology that is not good for us and has adverse effects on our brain health, because our brains keep craving it even when it no longer rewards us. Compulsion combined with digital adversity (negative feedback, cyber bullying, online harassment, and the like) explains the clinical symptoms of anxiety, depression, anhedonia, and suicidal ideation [61, 64, 65, 73] associated with digital addiction [64].

Learned helplessness

In his book Beyond Freedom and Dignity, written in 1971 [119], the American psychologist B. F. Skinner discussed the basic mechanisms of classical and psychological conditioning.
The repeated association of positive or negative (aversive) stimuli with situations and events will influence how a person feels, behaves, and acts [120]; there is no way out of such conditioning. The underlying psychobiological mechanisms are non-conscious, and deeply rooted in the brain's neural networks [121, 122]. The concept of learned helplessness refers to the condition of a living individual, human or animal, whose mindset and behaviour are affected by the learnt belief that the individual's actions do not matter, insofar as they are perceived never to impact outcomes positively. In humans, this typically manifests in the perception of events, and life in general, as uncontrollable and unpredictable. At the behavioural level, the individual does not proactively respond to adversity in order to avoid or counter a negative response or outcome, not even when opportunities for avoidance or change exist. The concept harks back to experiments on dogs described by Seligman and collaborators [123, 124]. Learned helplessness theory posits that repeated exposure to uncontrollable and aversive environmental stimuli gradually leads to the firm belief that the aversive situation is inescapable. This feeling of "no way out" produces a sense of helplessness that can result in anxiety, depression, and suicide [125, 126]. Learned helplessness is likely to involve the same psychological and physiological mechanisms of conditioning as those that occur in the transition from pleasurable consumption to anti-reward and RDS, which has been linked to brain evolution by some authors [114]. These mechanisms of brain conditioning are exploited cogently by some AI-driven internet technology and social media, thereby creating hostile online environments and a whole new universe of digital adversity. The association between dark patterns in digital technology and learned helplessness in consumer populations was first brought forward in 2022, in the context of an expert group meeting on digital technology and mental health of the Consumer Safety Network of the European Commission [36]. The author therein stipulates that the feeling that there is no way out of manipulative technology can evolve into a generalized state of learned helplessness and psychological subordination, especially in youth and vulnerable populations, where the individual becomes a victim, and thereby an ideal target for further exploitation and abuse in the online world, and beyond.

Conclusions

Since the end of the COVID-19 pandemic, the impact of digital technology on the mental health and well-being of individuals, in particular the young, has received considerable attention from public health organizations worldwide. Adding to concerns about the increase in suicidal behaviour among the young, in a societal context producing ontological fears and insecurities worldwide, WHO experts have witnessed a significant rise in mental health issues, antidepressant prescriptions, and suicide statistics for young individuals. In France, for example, public health barometer survey statistics revealed a significant increase in medical consultations for suicidal behaviours and thoughts associated with anxiety and depression among 18-24-year-olds in the national population, with a peak in 2021 [127]. The Health Behaviour in School-aged Children (HBSC) study [128] surveyed approximately 280 000 young people aged between eleven and fifteen across 44 countries in Europe, central Asia, and Canada in 2022.
The report concludes that a significant proportion of youth show problematic social media use, a pattern of behaviour characterized by addiction-like symptoms such as an inability to control social media usage, withdrawal when not using it, neglect of other activities in favour of social media, and negative consequences in school and daily life due to excessive use. There is a worldwide consensus that the impact of digital technology on mental health urgently warrants further epidemiological and clinical research.

About the author:

Birgitta Dresp-Langley was born in Berlin, Germany, and is a Research Director with the Centre National de la Recherche Scientifique (CNRS) in Paris, France. She obtained her PhD in Cognitive Psychology at Paris Descartes University and joined the CNRS in 1993. Birgitta has authored more than 100 articles in peer-reviewed multidisciplinary science journals and several science book chapters, and is an independent scientific expert to the European Commission. A member of the editorial boards of international science journals released by Elsevier, Frontiers, and MDPI, her favoured science topics are brain-behaviour function, "natural" human "intelligence", and reliable AI.

The article presents the stance of the author and does not necessarily reflect the stance of IFIMES.

References:

1. McCarthy-Jones, S. (2019). The Autonomous Mind: The Right to Freedom of Thought in the Twenty-First Century. Frontiers in Artificial Intelligence, 2. https://doi.org/10.3389/frai.2019.00019
2. Al-Rodhan, N. (2021, March 27). Artificial Intelligence: Implications for human dignity and governance. Oxford Political Review. https://oxfordpoliticalreview.com/2021/03/27/artificial-intelligence/
3. World Health Organization. (2021, October 8). Mental Health Atlas 2020. https://www.who.int/publications/i/item/9789240036703
4. World Health Organization. (2022, March 2). Mental health and COVID-19: Early evidence of the pandemic's impact: Scientific brief. https://www.who.int/publications/i/item/WHO-2019-nCoV-Sci_Brief-Mental_health-2022.1
5. European Union. (2022). Regulation 2022/2065 (Digital Services Act). EUR-Lex. https://eur-lex.europa.eu/eli/reg/2022/2065
6. European Commission. (2021, February 10). The Artificial Intelligence Act. https://artificialintelligenceact.eu/the-act/
7. Woodward, A. (n.d.). Research Guides: Types of Reviews: Narrative Reviews. Lane Medical Library, Stanford University. https://laneguides.stanford.edu/types-of-reviews/narrative
8. Clark, D. A., & Purdon, C. L. (1995). The assessment of unwanted intrusive thoughts: A review and critique of the literature. Behaviour Research and Therapy, 33(8), 967–976. https://doi.org/10.1016/0005-7967(95)00030-2
9. Lifton, R. J. (1961). Thought reform and the psychology of totalism: A study of "brainwashing" in China. Penguin Books.
10. European Union. (2023). Involuntary placement and involuntary treatment of persons with mental health problems. Publications Office of the EU. https://op.europa.eu/en/publication-detail/-/publication/220b40a2-6ae8-4c57-aa86-16df21823e2b/language-en
11. Holmes, M. (2017). Brainwashing the cybernetic spectator: The Ipcress File, 1960s cinematic spectacle and the sciences of mind. History of the Human Sciences, 30(3), 3–24. https://doi.org/10.1177/0952695117703295
12. Schwartz, A. B. (2015). Broadcast Hysteria. Hill and Wang.
13. Birdsall, C. (2012). Nazi Soundscapes: Sound, Technology and Urban Space in Germany, 1933-1945. Open Research Library. https://doi.org/10.26530/oapen_424532
14. United States Holocaust Memorial Museum. (n.d.). State of Deception: The Power of Nazi Propaganda. https://exhibitions.ushmm.org/propaganda/home/state-of-deception-the-power-of-nazi-propaganda
15. Maffei, R., Convertini, L. S., Quatraro, S., Ressa, S., & Velasco, A. (2015). Contributions to a neurophysiology of meaning: the interpretation of written messages could be an automatic stimulus-reaction mechanism before becoming conscious processing of information. PeerJ, 3, e1361. https://doi.org/10.7717/peerj.1361
16. Greifeneder, R., Jaffé, M. E., Newman, E. J., & Schwarz, N. (Eds.). (2020). The Psychology of Fake News. Routledge. https://doi.org/10.4324/9780429295379
17. Boyer, P., & Liénard, P. (2020). Ingredients of "rituals" and their cognitive underpinnings. Philosophical Transactions of the Royal Society B: Biological Sciences, 375(1805), 20190439. https://doi.org/10.1098/rstb.2019.0439
18. Killen, A. (2011). Homo pavlovius: Cinema, conditioning, and the Cold War subject. Grey Room, 45, 42–59. https://doi.org/10.1162/grey_a_00049
19. Packard, V. O. (2007). The Hidden Persuaders. Ig Publishing. (Original work published 1957)
20. Packard, V. O. (1964). The Naked Society. McKay.
21. Pantic, I. (2014). Online social networking and mental health. Cyberpsychology, Behavior, and Social Networking, 17(10), 652–657. https://doi.org/10.1089/cyber.2014.0070
22. Pérez-Sales, P. (2022). The future is here: Mind control and torture in the digital era. Torture Journal, 32(1-2), 280–290. https://doi.org/10.7146/torture.v32i1-2.132846
23. Heatley Tejada, A., Dunbar, R. I. M., & Montero, M. (2020). Physical contact and loneliness: Being touched reduces perceptions of loneliness. Adaptive Human Behavior and Physiology, 6(3). https://doi.org/10.1007/s40750-020-00138-0
24. Schneider, E., Hopf, D., Aguilar-Raab, C., Scheele, D., Neubauer, A. B., Sailer, U., Hurlemann, R., Eckstein, M., & Ditzen, B. (2023). Affectionate touch and diurnal oxytocin levels: An ecological momentary assessment study. eLife, 12. https://doi.org/10.7554/elife.81241
25. Weizenbaum, J., & Wendt, G. (2015). Islands in the Cyberstream: Seeking Havens of Reason in a Programmed Society. Litwin Books.
26. Dark Patterns - Types of Dark Pattern. (2019). Darkpatterns.org. https://www.darkpatterns.org/types-of-dark-pattern
27. Di Geronimo, L., Braz, L., Fregnan, E., Palomba, F., & Bacchelli, A. (2020). UI dark patterns and where to find them. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1–14). Association for Computing Machinery. https://dl.acm.org/doi/proceedings/10.1145/3313831
28. Luguri, J., & Strahilevitz, L. J. (2020). Shining a light on dark patterns. Journal of Legal Analysis, 13(1). https://doi.org/10.1093/jla/laaa006
29. Mathur, A., Mayer, J., & Kshirsagar, M. (2021). What makes a dark pattern... dark? Design attributes, normative considerations, and measurement methods. arXiv:2101.04843. https://doi.org/10.1145/3411764.3445610
30. Fogg, B. J. (2003). Persuasive Technology: Using Computers to Change What We Think and Do. Morgan Kaufmann.
31. Nodder, C. (2013). Evil by Design: Interaction Design to Lead Us into Temptation. Wiley.
32. Russell, N. J. C. (2011). Milgram's obedience to authority experiments: Origins and early evolution. British Journal of Social Psychology, 50(1), 140–162. https://doi.org/10.1348/014466610x492205
33. Simons, D. J., & Rensink, R. A. (2005). Change blindness: past, present, and future. Trends in Cognitive Sciences, 9(1), 16–20. https://doi.org/10.1016/j.tics.2004.11.006
34. OECD. (2024). Dark commercial patterns. https://www.oecd.org/en/publications/dark-commercial-patterns_44f5e846-en.html
35. OECD. (2021). Roundtable on Dark Commercial Patterns Online: Summary of discussion. https://one.oecd.org/document/DSTI/CP/CPS(2020)23/FINAL/en/pdf
36. Dresp, B. (2022). Learned helplessness as a result of dark patterns in Artificial Intelligence. HAL. https://hal.science/hal-03871991
37. European Union. (2016, April 27). Regulation (EU) 2016/679 of the European Parliament and of the Council (General Data Protection Regulation). EUR-Lex. https://eur-lex.europa.eu/eli/reg/2016/679/oj
38. Novovic, M. (2024). The EU Digital Services Act (DSA). Kluwer Law International.
39. European Union. (2021, September 7). The EU Artificial Intelligence Act. https://artificialintelligenceact.eu/
40. Scorici, G., Schultz, M. D., & Seele, P. (2022). Anthropomorphization and beyond: conceptualizing humanwashing of AI-enabled machines. AI & Society, 39(2), 789–795. https://doi.org/10.1007/s00146-022-01492-1
41. Freiman, O. (2022). Making sense of the conceptual nonsense 'trustworthy AI'. AI and Ethics, 3(4), 1351–1360. https://doi.org/10.1007/s43681-022-00241-w
42. World Economic Forum. (n.d.). Global Risks Report 2023. https://www.weforum.org/reports/global-risks-report-2023/digest/
43. Hunter, L. Y., Biglaiser, G., McGauvran, R. J., & Collins, L. (2023). The effects of social media on domestic terrorism. Behavioral Sciences of Terrorism and Political Aggression, 16(4), 556–580. https://doi.org/10.1080/19434472.2022.2160001
44. Ferrara, E., Cresci, S., & Luceri, L. (2020). Misinformation, manipulation, and abuse on social media in the era of COVID-19. Journal of Computational Social Science, 3(2), 271–277. https://doi.org/10.1007/s42001-020-00094-5
45. Swire-Thompson, B., & Lazer, D. (2019). Public health and online misinformation: challenges and recommendations. Annual Review of Public Health, 41(1), 433–451. https://doi.org/10.1146/annurev-publhealth-040119-094127
46. Uyheng, J., & Carley, K. M. (2020). Bots and online hate during the COVID-19 pandemic: case studies in the United States and the Philippines. Journal of Computational Social Science, 3(2), 445–468. https://doi.org/10.1007/s42001-020-00087-4
47. Obermaier, M., & Schmuck, D. (2022). Youths as targets: factors of online hate speech victimization among adolescents and young adults. Journal of Computer-Mediated Communication, 27(4). https://doi.org/10.1093/jcmc/zmac012
48. Pew Research Center. (2024, December 12). Teens, Social Media, and Technology. https://www.pewresearch.org/
49. Suarez-Lledo, V., & Alvarez-Galvez, J. (2020). Prevalence of health misinformation on social media: Systematic review. Journal of Medical Internet Research, 23(1), e17187. https://doi.org/10.2196/17187
50. Bizzotto, N., Schulz, P. J., & De Bruijn, G. (2023). The "Loci" of misinformation and its correction in peer- and expert-led online communities for mental health: Content analysis. Journal of Medical Internet Research, 25, e44656. https://doi.org/10.2196/44656
51. Van Bavel, J. J., Rathje, S., Harris, E., Robertson, C., & Sternisko, A. (2021). How social media shapes polarization. Trends in Cognitive Sciences, 25(11), 913–916. https://doi.org/10.1016/j.tics.2021.07.013
52. De Mello, V. O., Cheung, F., & Inzlicht, M. (2024). Twitter (X) use predicts substantial changes in well-being, polarization, sense of belonging, and outrage. Communications Psychology, 2(1). https://doi.org/10.1038/s44271-024-00062-z
53. Vasist, P. N., Chatterjee, D., & Krishnan, S. (2023). The polarizing impact of political disinformation and hate speech: A cross-country configural narrative. Information Systems Frontiers. https://doi.org/10.1007/s10796-023-10390-w
54. De Nicola, G., Mambou, V. H. T., & Kauermann, G. (2023). COVID-19 and social media: Beyond polarization. PNAS Nexus, 2(8). https://doi.org/10.1093/pnasnexus/pgad246
55. Falkenberg, M., Galeazzi, A., Torricelli, M., Di Marco, N., Larosa, F., Sas, M., Mekacher, A., Pearce, W., Zollo, F., Quattrociocchi, W., & Baronchelli, A. (2022). Growing polarization around climate change on social media. Nature Climate Change, 12(12), 1114–1121. https://doi.org/10.1038/s41558-022-01527-x
56. Smiley, C., & Fakunle, D. (2016). From "brute" to "thug": The demonization and criminalization of unarmed Black male victims in America. Journal of Human Behavior in the Social Environment, 26(3-4), 350–366. https://doi.org/10.1080/10911359.2015.1129256
57. Ahmed, S., Jaidka, K., Chen, V. H. H., Cai, M., Chen, A., Emes, C. S., Yu, V., & Chib, A. (2024). Social media and anti-immigrant prejudice: a multi-method analysis of the role of social media use, threat perceptions, and cognitive ability. Frontiers in Psychology, 15. https://doi.org/10.3389/fpsyg.2024.1280366
58. Rulis, M. (2024). The influences of misinformation on incidences of politically motivated violence in Europe. The International Journal of Press/Politics. Online first. https://doi.org/10.1177/19401612241257873
59. Piazza, J. A. (2022). Fake news: The effects of social media disinformation on domestic terrorism. Dynamics of Asymmetric Conflict, 15(1), 55–77. https://doi.org/10.1080/17467586.2021.1895263
60. Reisach, U. (2020). The responsibility of social media in times of societal and political manipulation. European Journal of Operational Research, 291(3), 906–917. https://doi.org/10.1016/j.ejor.2020.09.020
61. Pereira, F. S., Bevilacqua, G. G., Coimbra, D. R., & Andrade, A. (2020). Impact of problematic smartphone use on mental health of adolescent students: Association with mood, symptoms of depression, and physical activity. Cyberpsychology, Behavior, and Social Networking, 23(9), 619–626. https://doi.org/10.1089/cyber.2019.0257
62. Fineberg, N. A., Demetrovics, Z., Potenza, M. N., Mestre-Bach, G., Ekhtiari, H., Roman-Urrestarazu, A., Achab, S., Kattau, T., Bowden-Jones, H., Thomas, S. A., Babor, T. F., Kidron, B., & Stein, D. J. (2024). Global action on problematic usage of the internet: announcing a Lancet Psychiatry Commission. The Lancet Psychiatry. https://doi.org/10.1016/s2215-0366(24)00323-7
63. Meng, S., Cheng, J., Li, Y., Yang, X., Zheng, J., Chang, X., Shi, Y., Chen, Y., Lu, L., Sun, Y., Bao, Y., & Shi, J. (2022). Global prevalence of digital addiction in general population: A systematic review and meta-analysis. Clinical Psychology Review, 92, 102128. https://doi.org/10.1016/j.cpr.2022.102128
64. Dresp-Langley, B., & Hutt, A. (2022). Digital addiction and sleep. International Journal of Environmental Research and Public Health, 19(11), 6910. https://doi.org/10.3390/ijerph19116910
65. Haidt, J. (2024). The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness. Penguin Random House.
66. Maftei, A., & Pătrăușanu, A. (2023). Digital reflections: narcissism, stress, social media addiction, and nomophobia. The Journal of Psychology, 158(2), 147–160. https://doi.org/10.1080/00223980.2023.2256453
67. Kerr, B., Garimella, A., Pillarisetti, L., Charlly, N., Sullivan, K., & Moreno, M. A. (2024). Associations between social media use and anxiety among adolescents: A systematic review study. Journal of Adolescent Health. https://doi.org/10.1016/j.jadohealth.2024.09.003
68. Du, M., Zhao, C., Hu, H., Ding, N., He, J., Tian, W., Zhao, W., Lin, X., Liu, G., Chen, W., Wang, S., Wang, P., Xu, D., Shen, X., & Zhang, G. (2024). Association between problematic social networking use and anxiety symptoms: a systematic review and meta-analysis. BMC Psychology, 12(1). https://doi.org/10.1186/s40359-024-01705-w
69. Keles, B., McCrae, N., & Grealish, A. (2019). A systematic review: the influence of social media on depression, anxiety and psychological distress in adolescents. International Journal of Adolescence and Youth, 25(1), 79–93. https://doi.org/10.1080/02673843.2019.1590851
70. Hu, J. M., Balow, S., Meng, J., Ellithorpe, M., & Meshi, D. (2024). Social network density mediates the association between problematic social media use and depressive symptoms. International Journal of Human-Computer Interaction, 1–6. https://doi.org/10.1080/10447318.2024.2404262
71. Guillot, C. R., Bello, M. S., Tsai, J. Y., Huh, J., Leventhal, A. M., & Sussman, S. (2016). Longitudinal associations between anhedonia and internet-related addictive behaviors in emerging adults. Computers in Human Behavior, 62, 475–479. https://doi.org/10.1016/j.chb.2016.04.019
72. Cangelosi, G., Biondini, F., Sguanci, M. E., Nguyen, C. T. T., Palomares, S. M., Mancin, S., & Petrelli, F. (2024). Systematic review protocol: Anhedonia in youth and the role of internet-related behavior. Psychiatry International, 5(3), 447–457. https://doi.org/10.3390/psychiatryint5030031
73. Dresp-Langley, B. (2023). From reward to anhedonia: dopamine function in the global mental health context. Biomedicines, 11(9), 2469. https://doi.org/10.3390/biomedicines11092469
74. Cheng, Y., Tseng, P., Lin, P., Chen, T., Stubbs, B., Carvalho, A. F., Wu, C., Chen, Y., & Wu, M. (2018). Internet addiction and its relationship with suicidal behaviors. The Journal of Clinical Psychiatry, 79(4). https://doi.org/10.4088/jcp.17r11761
75. Chamarro, A., Díaz-Moreno, A., Bonilla, I., Cladellas, R., Griffiths, M. D., Gómez-Romero, M. J., & Limonero, J. T. (2024). Stress and suicide risk among adolescents: the role of problematic internet use, gaming disorder and emotional regulation. BMC Public Health, 24(1). https://doi.org/10.1186/s12889-024-17860-z
76. Morese, R., Gruebner, O., Sykora, M., Elayan, S., Fadda, M., & Albanese, E. (2022). Detecting suicide ideation in the era of social media: The population neuroscience perspective. Frontiers in Psychiatry, 13. https://doi.org/10.3389/fpsyt.2022.652167
77. Dadi, A. F., Dachew, B. A., & Tessema, G. A. (2024). Problematic internet use: A growing concern for adolescent health and well-being in a digital era. Journal of Global Health, 14. https://doi.org/10.7189/jogh.14.03034
78. Sumner, S. A., Ferguson, B., Bason, B., Dink, J., Yard, E., Hertz, M., Hilkert, B., Holland, K., Mercado-Crespo, M., Tang, S., & Jones, C. M. (2021). Association of online risk factors with subsequent youth suicide-related behaviors in the US. JAMA Network Open, 4(9), e2125860. https://doi.org/10.1001/jamanetworkopen.2021.25860
79. Groenestein, E., Willemsen, L., Van Koningsbruggen, G. M., Ket, H., & Kerkhof, P. (2024). The relationship between fear of missing out, digital technology use, and psychological well-being: A scoping review of conceptual and empirical issues. PLoS ONE, 19(10), e0308643. https://doi.org/10.1371/journal.pone.0308643
80. Young, E., McCain, J. L., Mercado, M. C., Ballesteros, M. F., Moore, S., Licitis, L., Stinson, J., Jones, S. E., & Wilkins, N. J. (2024). Frequent social media use and experiences with bullying victimization, persistent feelings of sadness or hopelessness, and suicide risk among high school students - Youth Risk Behavior Survey, United States, 2023. MMWR Supplements, 73(4), 23–30. https://doi.org/10.15585/mmwr.su7304a3
81. Jacob, N., Evans, R., & Scourfield, J. (2017). The influence of online images on self-harm: A qualitative study of young people aged 16–24. Journal of Adolescence, 60(1), 140–147. https://doi.org/10.1016/j.adolescence.2017.08.001
82. World Health Organization. (2024, September 25). Health Behaviour in School-aged Children (HBSC) study. https://www.who.int/europe/initiatives/health-behaviour-in-school-aged-children-(hbsc)-study
83. Jiotsa, B., Naccache, B., Duval, M., Rocher, B., & Grall-Bronnec, M. (2021). Social media use and body image disorders: Association between frequency of comparing one's own physical appearance to that of people being followed on social media and body dissatisfaction and drive for thinness. International Journal of Environmental Research and Public Health, 18(6), 2880. https://doi.org/10.3390/ijerph18062880
84. Shannon, H., Bush, K., Villeneuve, P. J., Hellemans, K. G., & Guimond, S. (2022). Problematic social media use in adolescents and young adults: Systematic review and meta-analysis. JMIR Mental Health, 9(4), e33450. https://doi.org/10.2196/33450
85. Montag, C., Lachmann, B., Herrlich, M., & Zweig, K. (2019). Addictive features of social media/messenger platforms and freemium games against the background of psychological and economic theories. International Journal of Environmental Research and Public Health, 16(14), 2612. https://doi.org/10.3390/ijerph16142612
86. Dresp-Langley, B. (2020). Children's health in the digital age. International Journal of Environmental Research and Public Health, 17(9), 3240. https://doi.org/10.3390/ijerph17093240
87. World Health Organization. (2019, December 19). Mental health. https://www.who.int/health-topics/mental-health#tab=tab_2
88. World Health Organization. (2024). Teens, screens and mental health. WHO Europe Newsroom. https://www.who.int/europe/news-room/25-09-2024-teens--screens-and-mental-health
89. Russell, N. J. C. (2011). Milgram's obedience to authority experiments: Origins and early evolution. British Journal of Social Psychology, 50(1), 140–162. https://doi.org/10.1348/014466610x492205
90. Gibson, S. (2018). Obedience without orders: Expanding social psychology's conception of 'obedience'. British Journal of Social Psychology, 58(1), 241–259. https://doi.org/10.1111/bjso.12272
91. Cialdini, R. B., & Goldstein, N. J. (2004). Social influence: compliance and conformity. Annual Review of Psychology, 55(1), 591–621. https://doi.org/10.1146/annurev.psych.55.090902.142015
92. Korte, M. (2020). The impact of the digital revolution on human brain and behavior: where do we stand? Dialogues in Clinical Neuroscience, 22(2), 101–111. https://doi.org/10.31887/dcns.2020.22.2/mkorte
93. Forster, E. M. (1909). The machine stops. The Oxford and Cambridge Review, November 1909.
94. Rozgonjuk, D., Sindermann, C., Elhai, J. D., & Montag, C. (2020). Fear of Missing Out (FoMO) and social media's impact on daily-life and productivity at work: Do WhatsApp, Facebook, Instagram, and Snapchat use disorders mediate that association? Addictive Behaviors, 110, 106487. https://doi.org/10.1016/j.addbeh.2020.106487
95. Beyens, I., Frison, E., & Eggermont, S. (2016). "I don't want to miss a thing": Adolescents' fear of missing out and its relationship to adolescents' social needs, Facebook use, and Facebook related stress. Computers in Human Behavior, 64, 1–8. https://doi.org/10.1016/j.chb.2016.05.083
96. Reicher, S. D., Spears, R., & Postmes, T. (1995). A social identity model of deindividuation phenomena. European Review of Social Psychology, 6(1), 161–198. https://doi.org/10.1080/14792779443000049
97. Liao, C., Squicciarini, A., Griffin, C., & Rajtmajer, S. (2016). A hybrid epidemic model for deindividuation and antinormative behavior in online social networks. Social Network Analysis and Mining, 6(1). https://doi.org/10.1007/s13278-016-0321-5
98. Merrilees, C. E., Taylor, L. K., Goeke-Morey, M. C., Shirlow, P., Cummings, E. M., & Cairns, E. (2013). The protective role of group identity: Sectarian antisocial behavior and adolescent emotion problems. Child Development, 85(2), 412–420. https://doi.org/10.1111/cdev.12125
99. Chung, J. E. (2018). Peer influence of online comments in newspapers: Applying social norms and the social identification model of deindividuation effects (SIDE). Social Science Computer Review, 37(4), 551–567. https://doi.org/10.1177/0894439318779000
100. Weiß, M., Gollwitzer, M., & Hewig, J. (2024). Social influence and external feedback control in humans. F1000Research, 12, 438. https://doi.org/10.12688/f1000research.133295.3
101. Anderson, A. A., Brossard, D., Scheufele, D. A., Xenos, M. A., & Ladwig, P. (2013). The "nasty effect": Online incivility and risk perceptions of emerging technologies. Journal of Computer-Mediated Communication, 19(3), 373–387. https://doi.org/10.1111/jcc4.12009
102. Panizza, F., Vostroknutov, A., & Coricelli, G. (2021). How conformity can lead to polarised social behaviour. PLoS Computational Biology, 17(10), e1009530. https://doi.org/10.1371/journal.pcbi.1009530
103. Smith, R. T., & True, G. (2014). Warring identities. Society and Mental Health, 4(2), 147–161. https://doi.org/10.1177/2156869313512212
104. Bonne, O., Grillon, C., Vythilingam, M., Neumeister, A., & Charney, D. S. (2004). Adaptive and maladaptive psychobiological responses to severe psychological stress: implications for the discovery of novel pharmacotherapy. Neuroscience & Biobehavioral Reviews, 28(1), 65–94. https://doi.org/10.1016/j.neubiorev.2003.12.001
105. Flor, H. (2008). Maladaptive plasticity, memory for pain and phantom limb pain: review and suggestions for new therapies. Expert Review of Neurotherapeutics, 8(5), 809–818. https://doi.org/10.1586/14737175.8.5.809
106. Von Bernhardi, R., Bernhardi, L. E., & Eugenín, J. (2017). What is neural plasticity? Advances in Experimental Medicine and Biology, 1–15. https://doi.org/10.1007/978-3-319-62817-2_1
107. Deppermann, S., Storchak, H., Fallgatter, A., & Ehlis, A. (2014). Stress-induced neuroplasticity: (Mal)adaptation to adverse life events in patients with PTSD - A critical overview. Neuroscience, 283, 166–177. https://doi.org/10.1016/j.neuroscience.2014.08.037
108. Ali, Z., Janarthanan, J., & Mohan, P. (2024). Understanding digital dementia and cognitive impact in the current era of the internet: a review. Cureus. https://doi.org/10.7759/cureus.70029
109. Ding, K., Shen, Y., Liu, Q., & Li, H. (2023). The effects of digital addiction on brain function and structure of children and adolescents: A scoping review. Healthcare, 12(1), 15. https://doi.org/10.3390/healthcare12010015
110. Hyman, S. E., Malenka, R. C., & Nestler, E. J. (2006). Neural mechanisms of addiction: The role of reward-related learning and memory. Annual Review of Neuroscience, 29(1), 565–598. https://doi.org/10.1146/annurev.neuro.29.051605.113009
111. el-Guebaly, N., Mudry, T., Zohar, J., Tavares, H., & Potenza, M. N. (2011). Compulsive features in behavioural addictions: the case of pathological gambling. Addiction, 107(10), 1726–1734. https://doi.org/10.1111/j.1360-0443.2011.03546.x
112. Trotzke, P., Müller, A., Brand, M., Starcke, K., & Steins-Loeber, S. (2020). Buying despite negative consequences: Interaction of craving, implicit cognitive processes, and inhibitory control in the context of buying-shopping disorder. Addictive Behaviors, 110, 106523. https://doi.org/10.1016/j.addbeh.2020.106523
113. Volkow, N. D., Michaelides, M., & Baler, R. (2019). The neuroscience of drug reward and addiction. Physiological Reviews, 99(4), 2115–2140. https://doi.org/10.1152/physrev.00014.2018
114. Blum, K., McLaughlin, T., Bowirrat, A., Modestino, E. J., Baron, D., Gomez, L. L., Ceccanti, M., Braverman, E. R., Thanos, P. K., Cadet, J. L., Elman, I., Badgaiyan, R. D., Jalali, R., Green, R., Simpatico, T. A., Gupta, A., & Gold, M. S. (2022). Reward Deficiency Syndrome (RDS) surprisingly is evolutionary and found everywhere: Is it "blowin' in the wind"? Journal of Personalized Medicine, 12(2), 321. https://doi.org/10.3390/jpm12020321
115. Koob, G. F., & Le Moal, M. (2008). Addiction and the brain antireward system. Annual Review of Psychology, 59, 29–53. https://doi.org/10.1146/annurev.psych.59.103006.093548
116. Blum, K., Gardner, E., Oscar-Berman, M., & Gold, M. (2012). "Liking" and "wanting" linked to Reward Deficiency Syndrome (RDS): hypothesizing differential responsivity in brain reward circuitry. Current Pharmaceutical Design, 18(1), 113–118. https://doi.org/10.2174/138161212798919110
117. Gardner, E. L. (2011). Addiction and brain reward and antireward pathways. Advances in Psychosomatic Medicine, 30, 22–60. https://doi.org/10.1159/000324065
118. Luigjes, J., Lorenzetti, V., De Haan, S., Youssef, G. J., Murawski, C., Sjoerds, Z., Van Den Brink, W., Denys, D., Fontenelle, L. F., & Yücel, M. (2019). Defining compulsive behavior. Neuropsychology Review, 29(1), 4–13. https://doi.org/10.1007/s11065-019-09404-9
119. Skinner, B. F. (1971). Beyond Freedom and Dignity. Knopf.
120. Lee, D. S., Clement, A., Grégoire, L., & Anderson, B. A. (2024). Aversive conditioning, anxiety, and the strategic control of attention. Cognition & Emotion, 1–9. https://doi.org/10.1080/02699931.2024.2413360
121. Hebb, D. O. (1949). The Organization of Behavior. Wiley.
122. Brown, R. E., & Milner, P. M. (2003). The legacy of Donald O. Hebb: more than the Hebb synapse. Nature Reviews Neuroscience, 4(12), 1013–1019. https://doi.org/10.1038/nrn1257
123. Seligman, M. E. (1972). Learned helplessness. Annual Review of Medicine, 23, 407–412. https://doi.org/10.1146/annurev.me.23.020172.002203
124. Peterson, C., & Seligman, M. E. P. (1983). Learned helplessness and victimization. Journal of Social Issues, 39(2), 103–116. https://doi.org/10.1111/j.1540-4560.1983.tb00143.x
125. Morgan, T. A. (2020). Learned helplessness. In Gellman, M. D. (Ed.), Encyclopedia of Behavioral Medicine. Springer. https://doi.org/10.1007/978-3-030-39903-0_1201
126. Ye, Y., Li, Y., Wu, X., & Zhou, X. (2024). Longitudinal associations between depression, suicidal ideation, and lack of certainty in control among adolescents: Disaggregation of within-person and between-person effects. Journal of Adolescent Health, 75(2), 288–297. https://doi.org/10.1016/j.jadohealth.2024.03.008
127. Santé Publique France. (2024). Bulletin épidémiologique hebdomadaire. https://beh.santepubliquefrance.fr/beh/2024/3/2024_3_1.html
128. Boniel-Nissim, M., Marino, C., Galeotti, T., Blinka, L., Ozoliņa, K., Craig, W., Lahti, H., Wong, S. L., Brown, J., Wilson, M., Inchley, J., & van den Eijnden, R. (2024, September 25). A focus on adolescent social media use and gaming in Europe, central Asia and Canada: Health Behaviour in School-aged Children international report from the 2021/2022 survey. World Health Organization. https://iris.who.int/handle/10665/378982