The UK Addiction Treatment Group has reported a rise in parents expressing concern about their children’s dependency on social media apps.
The group has described TikTok as “especially addictive”, having received an uptick in phone enquiries about the adverse effects of “TikTok brain” on children around the country.
“TikTok brain” refers to the negative cognitive effects of using the platform. In particular, concerns have been raised about the dangers of the endless stream of personalised short videos it serves.
A recent study found that personalised TikTok videos activate reward systems in the brain and produce higher levels of addiction than non-personalised feeds. Reported symptoms include attention deficit, irregular sleep patterns and lower self-control, with 5.9% of respondents showing significantly problematic use.
Short-form social media videos have undeniably changed how we consume media. The landscape is now dominated by quick-hitting clips with flashy editing, highly stimulating dramatic content and catchy choruses designed especially for TikTok.
Increasingly, even YouTube videos are coming to be seen as long-form content, while apps such as Facebook and Instagram redesign their platforms to cash in on the craze for short-form clips.
While many enjoy gathering around trends and catchy TikTok songs and dances, such constant overstimulation has presumably affected us all in one way or another. When most, if not all, of the Western world has access to free procrastination devices offering unlimited personalised distractions, it is little wonder that attention spans are shrinking.
This societal shift towards quick-hitting, personalised algorithms endangers not only our attention spans but also our perceptions of ourselves and the world.
For example, ‘doom-scrolling’ exposes whole generations to immediate, constant news cycles and unlimited information. 24/7 reports of war, famine, disease and other societal ills are fed directly into our feeds, filling us with a continual sense of dread and anxiety.
Personalised algorithms arguably reflect and reinforce our own negative perceptions, creating a cyclical process of polarisation and self-confirmation. If an insecure individual uses TikTok, they will presumably interact with content about insecurity, or content that plays on it. TikTok will then, of course, fill their ‘For You’ page with an endless supply of similar content, creating a cycle of self-confirmation. At the same time, the algorithm will serve up countless clips of famous, wealthy and attractive individuals to compare themselves to, continually worsening their insecurity and warping their view of reality.
This algorithmic ‘bubble’ threatens more than just our mental and emotional states, however.
Self-fulfilling thoughts of insecurity harm only the individual; political videos, by contrast, fuel misinformation and radicalisation, putting pressure on our political institutions.
When anyone can be led down infinite rabbit holes of apparent truth without any regard for fact-checking, misinformed narratives take hold of thousands of personalised feeds, as seen during the previous US presidential election, the COVID-19 lockdowns and, more recently, the Russia-Ukraine war.
TikTok now has over 1.5 billion total downloads, and various measures have been introduced to minimise its negative effects. TikTok’s Chinese sister app, Douyin, for example, caps daily use at 40 minutes and blocks access between 10pm and 6am to prevent sleep disorders.
On content, Douyin bans personalised algorithms to prevent the issues discussed above. Revealing clothing is banned outright, and the trend of ad-sponsored influencers is non-existent. To replace such over-stimulating content, Douyin has invested in tens of thousands of educational videos for young people, covering themes across science and the humanities.
While such measures may well reduce harms such as “TikTok brain”, the West does not operate as China does, and it prizes its freedoms.
Herein lies the cultural question presented to us: where should we draw the line between freedom and regulation?
We should not completely crush free expression and individual agency. Not everyone wants to sit in bed after work or school to watch educational videos.
However, social media apps such as TikTok should be pressured to combat the addictive nature of their algorithms and to clamp down on the harms personalised feeds can cause.
Image credit: Nik via Unsplash