Three months ago I decided that it was no longer morally responsible to use Twitter and deleted my account. Missing the ‘conversations’ on social media and lacking a platform to tell the wider world about the latest Bournemouth University Research Blog posts, I joined Bluesky a week or two later. The first thing to note is that Bluesky is: (1) much quieter; (2) much more civilised; and (3) much more North American. At a personal level, I have not managed to accumulate many followers: yesterday the total stood at just ten!
There are growing calls for young people under the age of 16 to be banned from having smartphones or access to social media. The Smartphone Free Childhood WhatsApp group aims to normalise young people not having smartphones until “at least” 14 years old. Esther Ghey, mother of the murdered teenager Brianna Ghey, is campaigning for a ban on social media apps for under-16s.
The concerns centre on the sort of content that young people can access (which can be harmful and illegal) and how interactions on these devices could lead to upsetting experiences.
However, as an expert in young people’s use of digital media, I am not convinced that bans at an arbitrary age will make young people safer or happier – or that they are supported by evidence around young people’s use of digital technology.
In general, most young people have a positive relationship with digital technology. I worked with South West Grid for Learning, a charity specialising in education around online harm, to produce a report in 2018 based upon a survey of over 8,000 young people. The results showed that just over two thirds of the respondents had never experienced anything upsetting online.
Large-scale research on the relationship between social media and emotional wellbeing concluded there is little evidence that social media leads to psychological harm.
Sadly, there are times when young people do experience upsetting digital content or harm as a result of interactions online. However, they may also experience upsetting or harmful experiences on the football pitch, at a birthday party or playing Pokémon card games with their peers.
It would be more unusual (although not entirely unheard of) for adults to be making calls to ban children from activities like these. Instead, our default position is “if you are upset by something that has happened, talk to an adult”. Yet when it comes to digital technology, there seems to be a constant return to calls for bans.
We know from attempts to prevent other social harms, such as underage sex or access to drugs and alcohol, that bans do not eliminate these behaviours. We do know, however, that bans mean young people will not trust adults’ reactions if they are upset by something and want to seek help.
I recall delivering an assembly to a group of year six children (aged ten and 11) one Safer Internet Day a few years ago. A boy in the audience told me he had a YouTube channel where he shared video game walkthroughs with his friends.
I asked if he’d ever received nasty comments on his platform and if he’d talked to any staff about it at his school. He said he had, but he would never tell a teacher because “they’ll tell me off for having a YouTube channel”.
This was confirmed after the assembly by the headteacher, who said they told young people not to do things on YouTube because it was dangerous. I suggested that empowering what was generally a positive experience might result in the young man being more confident to talk about negative comments – but was met with confusion and repetition of “they shouldn’t be on there”.
Need for trust
Young people tell us that two particularly important things they need in tackling upsetting experiences online are effective education and adults they can trust to talk to and be confident of receiving support from. A 15-year-old experiencing abuse as a result of social media interactions would be unlikely to disclose it if they knew the first response would be, “You shouldn’t be on there, it’s your own fault.”
There is sufficient research to suggest that banning under-16s from having mobile phones and using social media would not be successful. Research into widespread youth access to pornography from the Children’s Commissioner for England, for instance, illustrates the failures of years of attempts to stop children accessing this content, despite the legal age to view pornography being 18.
The prevalence of hand-me-down phones and the second-hand market makes it extremely difficult to be confident that every mobile phone contract accurately reflects the age of the user. It is a significant enough challenge for retailers selling alcohol to verify age face to face.
The Online Safety Act is bringing in online age verification systems for access to adult content. But it would seem, from the guidance by communications regulator Ofcom, that the goal is to show that platforms have demonstrated a duty of care, rather than being a perfect solution. And we know that age assurance (using algorithms to estimate someone’s age) is less accurate for under-13s than older ages.
By putting up barriers and bans, we erode trust between those who could be harmed and those who can help them. While these suggestions come with the best of intentions, sadly they are doomed to fail. What we should be calling for is better understanding from adults, and better education for young people instead.
Among the bill’s key aims is to ensure it is more difficult for young people (under the age of 18) to access content that is considered harmful – such as pornography and content that promotes suicide or eating disorders. It places a “duty of care” on tech companies to ensure their users, especially children, are safe online. And it aims to provide adults with greater control over the content they interact with, for example if they wish to avoid seeing sexual content.
The legislation puts the onus on service providers (such as social media companies and search engines) to enforce minimum age requirements, publish risk assessments, ensure young people cannot access harmful content (while still granting adults access) and remove illegal content such as self-harm and deepfake intimate images.
The government has said the new law will make the UK the “safest place to be online”, but this isn’t something that can happen overnight. Ofcom, the UK’s communications regulator, is in charge of turning the legislation into something they can actually regulate. By the regulator’s own calculations, this process will take months.
There are many who view the bill as poorly thought out, with potential overreach that could conflict with fundamental human rights. The Open Rights Group has raised serious concerns around privacy and freedom of expression.
The challenges of regulating the internet
There are also aspects of the bill that are, currently, technically impossible. For example, the expectation that platforms will inspect the content of private, end-to-end encrypted messages to ensure that there is no criminal activity (for example, sexual communication with children) on their platforms – this cannot be done without violating the privacy afforded by these technologies.
If platforms are expected to provide “back doors” to technology designed to ensure that communications are private, they may contradict privacy and human rights law. At present, there is no way to grant some people access to encrypted communications without weakening the security of the communications for everyone. Some platforms have said they will leave the UK if such erosions in encryption are enacted.
There is a rich history of governments wrongly assuming that encrypted communications can be accessed on demand, and this history is not being reflected upon in current debates.
Furthermore, age verification and estimation technology is not yet foolproof, or indeed accurate enough to determine someone’s exact age. Yoti, a leading provider of age verification and estimation technology, has stated that its technology can correctly identify a user aged 13-17 as being “under 25” 99.9% of the time. It’s entirely possible that many young adults would be falsely identified as minors – which might prevent them from accessing legal content. There have been previous attempts to legislate for age verification for pornography providers (such as in the 2017 Digital Economy Act), which the UK abandoned due to the complexities of implementation.
While technology continues to develop, it seems unlikely there will be perfect implementations anytime soon for these issues.
What is ‘harmful’ content?
The other major argument against the bill is that, even with the best of intentions, the protections designed to keep children safe could have a chilling impact on freedom of speech and freedom of expression.
Previous versions of the bill placed expectations on platforms to explicitly tackle “legal but harmful” content for adults. This was defined at the time as content that would be viewed as offensive by a “reasonable person of ordinary sensibilities”. While these provisions are now removed, there is still a great deal of intangibility around what it means to protect children from “harmful” content.
Outside of illegal content, who decides what is harmful?
Platforms will be expected to make rules around content they deem might be harmful to certain users, and censor it before it can be published. As a result, this might also prevent children from accessing information related to gender and sexuality that could be caught up in the filtering and monitoring systems platforms will put in place. Without a clear definition of what harmful content is, it will be down to platforms to guess – and with moving goalposts, depending on the government of the day.
Young people want adult support in dealing with what they see online – not regulation banning them from seeing it. Prostock-studio/Shutterstock
What would actually make the internet safe?
As someone who researches the ethics of technology and the habits of young people online, my concern is that this bill will be viewed as the solution to online harms – it clearly is not.
These measures, if effectively implemented, will make it more difficult for young people to stumble across content meant for adults, but they will not prevent the determined teenager. Furthermore, a lot of intimate content shared by young people is exchanged between peers rather than accessed via platforms, so this legislation will do nothing to tackle that.
I often speak to young people about what help they would like to be safer online. They rarely ask for risk assessments and age verification technologies – they want better education and more informed adults to help them when things go wrong. Far better, young people tell me, to provide people with the knowledge to understand the risks, and how to mitigate them, rather than demanding they are stopped by the platforms.
I am reminded of a quote from the American cybersecurity researcher Marcus Ranum: “You can’t solve social problems with software.”
When I told my family and friends I intended to pursue a PhD researching HIV awareness among married women in Libya, my home country, the reaction was not encouraging: “You’d be lucky to even get members of your family to respond,” said one.
They weren’t being unnecessarily pessimistic but rather managing my expectations, considering I was not only researching HIV awareness in a conservative country often perceived as oppressive, but was also looking to recruit women.
Historically, Libyan women have been placed under severe social and cultural constraints that render them difficult to reach. Libya is a patriarchal society, and simply approaching women on a taboo topic such as HIV/Aids – which in Libya is often associated with practices deemed immoral, such as pre- or extra-marital sex, substance abuse and homosexuality – made the research even more complex.
I knew that the lack of confidentiality and the fear of being stigmatised were going to be a problem. So I needed a method that would give women a platform to respond to the survey without prying eyes.
This is where the power of online surveys comes in. Using an anonymous, self-completed questionnaire reduces the effect of the topic’s sensitivity and helps reduce people’s fear of the possible social stigma attached to those self-disclosures.
But online surveys have their limitations. In Libya, these include poor telecommunication infrastructure, especially away from the large cities, as well as the high cost of internet access and the relatively poor service there. But the fast-growing smartphone market is encouraging and facilitating internet use in the country. According to the most recent available figures there were 3.14 million internet users in Libya in 2023 – approximately 45.9% of the population.
My questionnaire included five main sections. I asked for some limited demographic information (age, city, educational level, employment status). There were also sections on HIV/Aids-related knowledge, respondents’ perceptions of HIV risk, their attitude toward HIV and where they sourced healthcare information. I took particular care to ensure that I was gathering the maximum amount of information while remaining sensitive to Libya’s religious and social contexts.
Armed with approval from the university’s research ethics committee, I sent out a recruitment post with the questionnaire, mainly to family and friends in the Libyan diaspora in the UK and the US. The principal aim of this pilot study was to ensure that the wording, language and questions were understandable and that the mechanics of the survey functioned correctly. Within a month I’d received more than 168 complete questionnaires, which reassured me that sharing the survey with family and friends and asking them to forward the link to their various social and family networks would be the ideal approach for my main research on Libyan women in Libya.
What is ‘wasta’?
Libya has a population of around 7.1 million which is heavily skewed towards large networked tribes and well-established families, meaning the degree of separation across the whole of society is quite small. This has traditionally meant that the best way to get things done is by using these big family or tribal networks. This is known as “wasta”.
Wasta is a common practice of calling on personal connections for assistance. It’s a social norm in most Arab countries, defined by one academic as “a personal exchange system between members of society that is entrenched in the tribal structure of the country”. The concept has been tied to a tribal tradition which obliges those within the group to provide assistance in the same network.
I have a large family in Libya which straddles two different tribes, as well as family friends, so I was confident that wasta was the best approach to take. I sent the link to all the members of my wasta network through WhatsApp and asked them to forward it on to their friends and extended family. I also posted on Twitter and reached out to various Facebook pages. I only needed 323 complete questionnaires and I was confident that this method would yield the best response.
Days went by and I only had a handful of responses. Much of the feedback I received from family members was worrying. People said they had exhausted their networks without much success. Clearly, recruitment using wasta wasn’t working. So I decided to fall back on my experience of working in marketing and created a targeted post, aimed at “women, ages 18-65+ living in Libya, married, divorced, separated and widowed”. In direct contrast to wasta, this didn’t rely on whom I knew.
Social media has grown massively in popularity as a research tool in recent years. So, bearing in mind that Facebook is the most popular social media platform in Libya, with more than 6 million users, I created a Facebook page with the title, in Arabic: دراسة النساء الليبيات المتزوجات (Research on Libyan married women). I linked to papers I had published in the past (also in Arabic) and the recruitment poster below.
The recruitment poster used by the author in her Facebook recruitment campaign. Abier Hamidi, Author provided
I launched the post and the response was immediate, with replies and completed questionnaires and supportive comments coming in fairly rapidly to start with. But within a few days the response rate slowed down and still I wasn’t anywhere near my response target. Then I realised my mistake. The initial post targeting women who are married, divorced, separated or widowed hadn’t taken into account that the majority of women didn’t tend to include their marital status on Facebook. This meant I was only reaching a small percentage of my target audience.
I removed the status and the reach shot up. In six months, my post reached 446,906 women in Libya. The stats were impressive: 59,422 engagements, 1,549 reactions and 703 comments. I received more than 1,000 completed questionnaires.
In the end, this showed me that while for certain things, wasta can yield results, for an issue such as this, Libyan women wanted to ensure their anonymity and the confidentiality of their responses. Social media, which doesn’t mandate use of real names or photographs, was able to offer this in a way that extended family and friends, naturally, never could.
“I don’t listen to adults when it comes to this sort of thing”, a 17-year-old told me.
We were discussing how digital technology affects his life, as part of a long-term project in the west of England that I carried out with colleagues to explore young people’s mental health – including the impact of digital technology on their emotional wellbeing.
There is a widespread perception that being online is bad for young people’s mental health. But when we began the project, we quickly realised that there was very little evidence to back this up. The few in-depth studies around social media use and children’s mental health state that impacts are small and it is difficult to draw clear conclusions.
We wanted to find out if and how young people’s wellbeing was actually being affected in order to produce resources to help them. We talked to around 1,000 young people as part of our project. What we found was that there was a disconnect between what young people were worried about when it came to their online lives, and the worries their parents and other adults had.
One of the things young people told us was that adults tended to talk down to them about online harms, and had a tendency to “freak out” about these issues. Young people told us that adults’ views about online harms rarely reflected their own. They felt frustrated that they were being told what was harmful, rather than being asked what their experiences were.
Common concerns
The concerns the young people told us they had included bullying and other forms of online conflict. They were afraid of missing out on both online group interactions and real-life experiences others were showing in their social media posts. They worried that their posts were not getting as many likes as someone else’s.
But these concerns are rarely reflected in the media presentation of the harsher side of online harms. This has a tendency to explore the criminal side of online abuse, such as grooming and the prevalence of online pornography. It also tends to describe social media use in similar language to that used to talk about addiction.
It is no surprise, therefore, that parents might approach conversations with young people with excessive concern and an assumption their children are being approached by predators or are accessing harmful or illegal content.
Young people and their parents’ concerns about online safety may not match up. George Rudy/Shutterstock
We have run a survey with young people for several years on their online experiences. Our latest analysis was based on 8,223 responses. One of the questions we ask is: “Have you ever been upset by something that has happened online?”. While there are differences between age groups, we found the percentage of those young people who say “yes” is around 30%. Or, to put it another way, more than two-thirds of the young people surveyed had never had an upsetting experience online.
Meanwhile, the online experiences reported by the 30% who said they had been upset often didn’t tally with the extreme cases reported in the media. Our analysis of responses showed that this upset is far more likely to come from abusive comments by peers and news stories about current affairs.
This disconnect means that young people are reluctant to talk to adults about their concerns. They are afraid of being told off, that the adult will overreact, or that talking to an adult might make the issue worse. The adults they might turn to need to make it clear this won’t happen and that they can help.
How to help
There are three things that young people have consistently told us over the duration of the project, and in our previous work, that adults can do to help. They are: listen and understand – don’t judge.
Conversations are important, as is showing an interest in young people’s online lives. However, those conversations do not have to be confrontational. If a media story about young people and online harms causes parents concern or alarm, the conversation does not have to start with: “Do you do this?” This can result in a defensive response and the conversation being shut down. It would be far better to introduce the topic with: “Have you seen this story? What do you think of this?”
Working in partnership with others, such as schools, is also important. If a parent has concerns, having a conversation with tutors can be a useful way of supporting the young person. The tutor might also be aware that the young person is not acting like themselves, or might have noticed changes in group dynamics among their peer group.
But, even if they are not aware of anything, raising concerns with them – and discussing from where those concerns arise – will mean both parents and school are focused in the same direction. It is important that young people receive both consistent messages and support. And schools will also be able to link up with other support services if they are needed.
Ultimately, we want young people to feel confident that they can ask for help and receive it. This is particularly important, because if they do not feel they can ask for help, it is far less likely the issue they are facing will be resolved – and there is a chance things might become worse without support.
Dr Sarah Hodge writes for The Conversation about research asking teachers about their experiences of how young people use technology and the effect it has on them…
What teachers think of children and young people’s technology use
Mobile phones, computers, social media and the internet are part of the daily lives of children and young people, including at school. Concerns over the risks of too much screen time or online activity for children and young people have been tempered by the reality of technology use in education and leisure.
The experience of life during the pandemic, when much schooling and socialising went online, has also changed attitudes to technology use. UK communications regulator Ofcom reported that in 2020 only a minority of children and young people did not go online or have internet access.
Teachers are in a unique position when it comes to assessing how children and young people use technology such as mobile phones and the effect it has on them. They see how children and young people use technology to learn, socialise, and how it affects their relationships with their peers.
Together with colleagues, I carried out in-depth research with eight teachers from different backgrounds, ages, years of professional experience, and type of educational institution from across the UK. We asked the teachers about their experiences of children and young people’s use of technology: how they thought it affected their emotions, behaviour and learning both before and during the pandemic.
The teachers talked about the importance of technology as a tool in the classroom and learning and the opportunities it provides for creativity. As one teacher put it:
It is what the children are used to, and it engages them more – it is a useful tool that can add to our teaching.
Empowered through tech
We also found that teachers were optimistic about the role technology could play in empowering children and young people. One said:
They use social networking sites to learn from one another and to express their beliefs – even children who are quiet in the classroom, they find it easier to express themselves online.
They thought that children and young people could learn to understand and recognise the signs of unhealthy technology use from their own emotions and behaviour when using technology. This included showing empathy and care through noticing how they and others feel. One teacher said children and young people were becoming more compassionate and offering their help to friends who were showing signs of distress through their online posts.
However, some teachers did express concern about how interacting online affected children and young people’s social skills. One teacher said:
They don’t know how to have proper conversations with their friends. They don’t know how to resolve anything because it’s easy to be mean behind a screen and not have to resolve it.
Another questioned how technology use was affecting play. They said:
They don’t know how to play and actually you will see groups of them surrounding a phone.
Teachers also pointed to the problems of disengaging from technology use. One teacher stated:
The parents have ongoing battles trying to pull their children away from screens and the next day they are exhausted, and they find it difficult to get them into school because the children are so tired.
Teachers discussed how they encouraged their pupils to take part in team sports as a way to encourage face-to-face communication and conflict resolution. However, while some online safety and internet use is covered at school, guidance on how to live with technology, be resilient towards challenges and use technology in a balanced way could be more explicitly taught.
The PSHE Association – a national body for personal, social, health and economic education – offers guidance on online safety and skills for the curriculum, such as the potential harms of pornography, but there is much scope to develop a broader approach to supporting healthy technology use.
Teachers felt that there should be more discussion of online behaviour in the classroom. Daisy Daisy/Shutterstock
In class, this could be as simple as working on how to make informed decisions about technology use – such as being more cautious if online activity involves talking with strangers, or recognising when spending time online has become a large time commitment. It could include using social media posts as real-world examples to encourage children and young people to be informed, critical and resilient towards content they are likely to see and interact with.
Teachers felt that adding online safety to the curriculum would be valuable, as would providing opportunities for children and young people to talk about their experiences of technology and the content they encounter. One teacher said:
There are predators out there and we do discuss online safety issues with my students, but some stuff should be part of the curriculum as well, and parents should access it too.
The teachers highlighted that they, too, needed support in their knowledge about technology and suggested this should be more incorporated into teacher training. One teacher said:
We need to keep up with the times and if there is something this pandemic taught us, is that not all of us are keeping up… one-off training is not adequate, schools need to invest in continuous professional development activities related to technology.
Children and young people can get significant benefits from technology, but it has risks, too. More attention to how teachers can address this in school can be an invaluable way to help children and young people understand and balance their time online.
Social media offers the opportunity to get your research seen by millions, gain valuable insights and facilitate real involvement. However, there are also challenges and dangers.
Sign up for our new training session for BU researchers, part of the RKEDF, ‘Using social media to engage the public with your research’, to learn how to navigate the rapidly changing online world of social media for the best results.
Workshop: Using social media to engage the public with your research
Date: Tuesday, 21st June 2022
Time: 14:00 – 15:00
Location: Online
In this session, you’ll learn:
The key social media platforms you should know about, and how to use them
What social media could do for you and your research
Best practice – and common pitfalls – when communicating online
How to find your ‘voice’ and to portray the right image on social media
How social media can enable genuine two-way engagement and collaboration with the public
How to define, measure and evaluate the success of your social media engagement.
This session is aimed at academics at any level.
How to sign up
In the workshop booking form, select the session ‘Using social media to engage the public with your research’ from the dropdown list, enter your details and you’ll get an invitation by email.
Anders is one of the world’s leading researchers of online political communication. He is also a truly interdisciplinary scholar and has mastered the methods of extracting and analysing social media data from all of the major platforms. His talk will reflect on his 10+ years of working on social media platforms, and how the rules and methods of collecting data have changed.
The subject of Anders’ session is: Social media analysis – possibilities and pitfalls
What types of data can we get from different social media platforms, such as Facebook, Instagram or Twitter? I will present the possibilities we currently have, and I will also touch upon some interesting opportunities for analysis of social media data.
Two resources are now available on the NIHR Learn website for researchers –
Patient and Public Involvement: Inspiring New Researchers – an online course developed by the Department of Health and the NIHR. It is intended to help researchers understand the benefits of good patient and public involvement in their research.
Social Media Toolkit – a combination of practical resources on how to get started and real case studies of how colleagues across the NIHR Clinical Research Network are currently using social media to support their work.
To access the above resources you will need to have access to the NIHR Learn website. Once you have an account select the tab ‘Health Research Innovations’ and then click on ‘NIHR Endorsed Learning’. Both courses are free and do not require an enrolment key.
Remember – support and guidance is on offer at BU if you are thinking of conducting clinical research, whether in the NHS, private healthcare or social care – get in touch with Research Ethics. You can also take a look at the Clinical Governance blog for resources and updates.
Dr Emma Kavanagh and Dr Lorraine Brown (FoM) have just published a paper entitled ‘Towards a research agenda for examining online gender-based violence against women academics’. Work on this topic was inspired by Emma’s research on the online violence experienced by female athletes and further influenced by work on sexual harassment by the Women’s Academic Network (WAN), which ran a symposium on the topic in June this year. The writing of the paper was supported through writing retreats organised by WAN. The focus of this paper builds upon the critical mass of research being conducted exploring inter-personal violence and gender-based violence in sporting spaces by members of the Department of Sport and Event Management, and the work of the Bournemouth University Gender Research Group.
There is an increasing call for academics to promote their research and enhance their impact through engaging in digital scholarship on social media platforms. While there are numerous benefits to increasing the reach of academic work using virtual platforms, it has been widely noted that social media sites, such as Twitter, are spaces where hostility towards women and hate speech are increasingly normalised. In their paper, Emma and Lorraine provide a review of the current literature concerning violence toward women academics online and offer suggestions for a research agenda which aims to understand the phenomenon of gender-based violence more clearly and work toward safeguarding (female) academics engaging in digital scholarship. As they rightly state: “institutions such as universities that are increasingly placing pressure on women academics to engage in virtual platforms to disseminate their work have a responsibility in the prevention and protection of harm”.
Dr Elvira Bolat, Dr Parisa Gilani, Samreen Ashraf and Dr Nasiru Taura from the Faculty of Management will be hosting an event entitled ‘Influencers for Good’ as part of the ESRC Festival of Social Science on 8 November, 6-8pm, at South Coast Roast, Bournemouth. We’ll be drawing upon our research to discuss responsibility, how to understand your own behaviour and identity, and the power dynamics between influencers and followers. Importantly – we’ll be discussing how influencers can make the world a better place. The event is free to attend but registration is required via
In December 2018 the BBC World Service broadcast its new documentary, “When you tire of tech”. The documentary is presented by Ana Matronic, who explores the dangers associated with tech addiction and what is currently being done to minimise our over-reliance on tech.
The World Health Organisation is to include “gaming disorder”, the inability to stop gaming, into the International Classification of Diseases. By doing so, the WHO is recognising the serious and growing problem of digital addiction. The problem has also been acknowledged by Google, which recently announced that it will begin focusing on “Digital Well-being”.
Although there is a growing recognition of the problem, users are still not aware of exactly how digital technology is designed to facilitate addiction. We’re part of a research team that focuses on digital addiction and here are some of the techniques and mechanisms that digital media use to keep you hooked.
Compulsive checking
Digital technologies, such as social networks, online shopping, and games, use a set of persuasive and motivational techniques to keep users returning. These include “scarcity” (a snap or status is only temporarily available, encouraging you to get online quickly); “social proof” (20,000 users retweeted an article so you should go online and read it); “personalisation” (your news feed is designed to filter and display news based on your interest); and “reciprocity” (invite more friends to get extra points, and once your friends are part of the network it becomes much more difficult for you or them to leave).
Some digital platforms use features normally associated with slot machines. Antoine Taveneaux/Wikimedia, CC BY
Technology is designed to utilise the basic human need to feel a sense of belonging and connection with others. So, a fear of missing out, commonly known as FoMO, is at the heart of many features of social media design.
Groups and forums in social media promote active participation. Notifications and “presence features” keep people informed of each other’s availability and activities in real time, so that some start to become compulsive checkers. This includes the “two ticks” on instant messaging tools such as WhatsApp: users can see whether their message has been delivered and read. This creates pressure on each person to respond quickly to the other.
The concepts of reward and infotainment, material which is both entertaining and informative, are also crucial for “addictive” designs. In social networks, it is said that “no news is not good news”. So, their design strives always to provide content and prevent disappointment. The seconds of anticipation for the “pull to refresh” mechanism on smartphone apps, such as Twitter, is similar to pulling the lever of a slot machine and waiting for the win.
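The slot-machine analogy can be made concrete with a toy simulation. The sketch below is purely illustrative (the 30% chance of new content is an assumed parameter, not a figure from any platform): each “pull” only sometimes delivers new posts, an intermittent reward schedule of the kind described above.

```python
import random

def pull_to_refresh(p_new_content=0.3):
    """Simulate a 'pull to refresh' gesture: new content arrives only
    some of the time, like a slot machine's intermittent payout."""
    if random.random() < p_new_content:
        return "new posts loaded"   # the 'win'
    return "no new posts"           # nothing this time - pull again?

# Over many pulls, rewards arrive unpredictably; it is this
# unpredictability that makes the gesture compelling to repeat.
results = [pull_to_refresh() for _ in range(1000)]
wins = results.count("new posts loaded")
```

Behavioural research on variable-ratio reinforcement suggests that it is precisely the unpredictability of the payout, not its size, that sustains the habit.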
Most of the features mentioned above have roots in our non-tech world. Social networking sites have not created any new or fundamentally different styles of interaction between humans. Instead, they have vastly amplified the speed, ease and scale with which these interactions can occur.
Addiction and awareness
People using digital media do exhibit symptoms of behavioural addiction. These include salience, conflict, and mood modification when they check their online profiles regularly. Often people feel the need to engage with digital devices even if it is inappropriate or dangerous for them to do so. If disconnected or unable to interact as desired, they become preoccupied with missing opportunities to engage with their online social networks.
According to the UK’s communications regulator Ofcom, 15m UK internet users (around 34% of all internet users) have tried a “digital detox”. After being offline, 33% of participants reported feeling an increase in productivity, 27% felt a sense of liberation, and 25% enjoyed life more. But the report also highlighted that 16% of participants experienced the fear of missing out, 15% felt lost and 14% “cut-off”. These figures suggest that people want to spend less time online, but they may need help to do so.
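As a quick sanity check, the two headline figures together imply a UK online population of roughly 44 million; this is my own back-of-the-envelope inference, not a number taken from the Ofcom report:

```python
detoxers = 15_000_000   # UK internet users who tried a "digital detox"
share = 0.34            # ...as a share of all UK internet users

total_users = detoxers / share                 # implied online population, ~44m
felt_more_productive = round(detoxers * 0.33)  # ~4.95m reported higher productivity
```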
Gaming disorder is to be recognised by the WHO.
At the moment, tools that enable people to be in control of their online experience, presence and online interaction remain very primitive. There seem to be unwritten expectations for users to adhere to social norms of cyberspace once they accept participation.
But unlike other mediums for addiction, such as alcohol, technology can play a role in making its own usage more informed and conscious. It is possible to detect whether someone is using a phone or social network in an anxious, uncontrolled manner. As with online gambling, help should be available to those who want it, such as a self-exclusion and lock-out scheme, or software that alerts users when their usage pattern indicates risk.
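A risk-alerting tool of this kind could be very simple. The sketch below is a minimal illustration, assuming a toy session model and made-up thresholds (two hours a day, fifteen checks an hour); it is not a description of any existing product:

```python
from dataclasses import dataclass

@dataclass
class Session:
    minutes: float   # length of one period of app use
    checks: int      # number of times the app was opened in that period

def usage_alert(sessions, max_daily_minutes=120, max_checks_per_hour=15):
    """Flag usage patterns that may indicate anxious, uncontrolled use:
    too much total screen time, or very frequent compulsive checking."""
    total = sum(s.minutes for s in sessions)
    compulsive = any(s.checks > max_checks_per_hour for s in sessions)
    if total > max_daily_minutes or compulsive:
        return "alert: usage pattern indicates risk"
    return "ok"

# One day of use: the second, heavily checked session trips the alert.
day = [Session(minutes=40, checks=5), Session(minutes=95, checks=20)]
```

A real tool would of course need consented data collection and user-set thresholds; the point is only that the detection logic itself is not exotic.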
The borderline between software which is legitimately immersive and software which can be seen as “exploitation-ware” remains an open question. Transparency of digital persuasion design and education about critical digital literacy could be potential solutions.
Are you interested in developing the real-world impact of your research? How about increasing your public profile? If you’ve answered yes to either of those questions, then the Research Communication Day is for you.
This day event on Wednesday 23 May 2018, 10.30 – 16.00, will provide you with a better understanding of the benefits of communicating your research, how you can go about it and who can help you with this within BU.
Subjects covered will include: planning and promoting your Festival of Learning event, practising in our radio studio and in front of a camera, writing for the research website and sharing your research via social media.
You’ll also have the chance to pitch your work to Stephen Harris, Commissioning and Science Editor for The Conversation. Stephen covers science, technology and health for The Conversation. He previously spent five years as senior reporter and special projects editor at The Engineer, the world’s oldest technology magazine and professional journal.
Workshops include:
Media Training
Creating & Marketing your Public Engagement Event
Sharing your Research via Social Media
Developing the Impact of your Research
Pitching to The Conversation
Developing your Digital Profile
Broadcast Training
Influencing Policy Makers
More information about the day schedule and a booking link can be found here.
#TalkBU is a monthly lunchtime seminar on Talbot Campus, open to all students and staff at Bournemouth University and free to attend. Come along to learn, discuss and engage in a 20-30 minute presentation by an academic or guest speaker talking about their research and findings, with a Q&A to finish.
Social media has created a different dimension of consumers for luxury products in particular. That is, the aspirational consumer’s desire for luxury derives from content produced on social media. Often, despite their strong yearning for luxury goods, aspirational consumers are unable, for economic reasons, to purchase luxury frequently. Social media provides an avenue for them to consume conspicuously without the need to purchase, enabling them to use luxury brands to create value amongst themselves.
In this #TalkBU session, Dr Elvira Bolat will examine the influence that social media has on the consumption of luxury products by introducing the Henry family: Hailey, Harriet, Hollie, Hannah, and Hilary.
When: Thursday 16 November at 1pm – 2pm
Where: Room FG04, Ground Floor in the Fusion Building
Welcome to this week’s political scene within research. Here is a summary of the week’s generic policy reports and releases, alongside new niche consultations and inquiries.
The role of EU funding in UK research and innovation
This week a report on the role of EU funding in UK research and innovation hit the headlines. It is an analysis, at a granular level, of the academic disciplines most reliant on EU research and innovation funding.
Produced by Technopolis and jointly commissioned by the UK’s four national academies (the Academy of Medical Sciences, the British Academy, the Royal Academy of Engineering and the Royal Society), it highlights that, of the 15 disciplines most dependent on EU funding, 13 are within the arts, humanities and social sciences.
Most reliant on EU funding as a proportion of their total research funding are Archaeology (38%), Classics (33%) and IT (30%).
The full report dissects the information further considering the funding across disciplines, institutions, industrial sectors, company sizes and UK regions. It differentiates between the absolute value of the research grant income from EU government bodies, and the relative value of research grant income from EU government bodies with respect to research grant income from all sources, including how the EU funding interacts with other funding sources.
There are also 11 focal case studies, including archaeology and ICT. Here’s an excerpt from the archaeology case study considering the risks associated with Brexit and the UK’s industrial strategy:
“As archaeologists are heavily dependent on EU funding, a break away from EU funding sources puts the discipline in a vulnerable position. This is exacerbated by the fact that the UK is short of archaeologists and/or skilled workers active in the field of Archaeology because of the surge in large scale infrastructure projects (e.g. HS2, Crossrail, and the A14), which drives away many archaeologists from research positions.” Source
See the full report page 25 for particular detail on ICT and digital sector, and page 39 for archaeology. For press coverage see the Financial Times article.
Bathing Water Quality
The European Environment Agency published European Bathing Water Quality in 2016. It places the UK second to bottom in the league table for bathing water quality. While 96.4% of British beaches were found safe to swim in last year, 20 sites failed the annual assessment. Only Ireland had a higher percentage of poor-quality bathing waters, at 4%.
This week Research Professional ran a succinct article encouraging researchers to think more about when and how they submit evidence to policy makers. Timing is key: policy makers often want information instantaneously, and the article urges researchers to be responsive but pragmatic, including proactively keeping key policy makers gently informed of new developments.
Responding to a select committee call for evidence is a great way for academics to influence UK policy. If you respond to a consultation or inquiry as a BU member of staff please let us know of your interest by emailing policy@bournemouth.ac.uk at least one week before you submit your response.
This week there are three new inquiries and consultations that may be of interest to BU academics.
Where has a community-based approach been successful in removing barriers to participation in sport and physical activity?
Which approaches were particularly successful in increasing participation among certain social groups, such as women, ethnic minorities or particular age groups?
What are the barriers facing volunteers, and how can they be overcome? The aim is to inform how Scotland might increase participation rates across all groups and sectors of society; respondents can choose to answer only the most relevant questions.
The call for evidence closes on 30 June.
Body Image
The British Youth Council has opened an inquiry into body image and how the growth of social media and communications platforms has encouraged attitudes that entrench poor body image. Included among the inquiry questions are:
Has the growing use of social media and communications platforms amongst young people encouraged practices and attitudes that entrench poor body image? What is the link between “sexting” and body dissatisfaction?
Do internet companies, social media platforms or other platforms have a responsibility to tackle trends which entrench poor body image? What are they already doing in this area? What more should they be doing?
Are particular groups of young people particularly prone to poor body image, or less likely to seek help? What causes these trends?
In relation to young men and boys, minority ethnic groups, and those who self-identify as transgender: what are the specific challenges facing young people in these groups? How effective is existing support?
To what extent is dissatisfaction with body image contributing to the increase in mental health problems amongst children and young people?
Forming part of a media package to support innovation funding at BU, a new Instagram account is now live. Oliver Cooke, a third-year student on the BA (Hons) Media Production course, is developing a number of different media channels to showcase the range of Higher Education Innovation Funding (HEIF) projects at BU.
Ollie is also working on a short video documentary and website as part of this project.
Ollie’s experience with HEIF came from his time on work placement last year. He worked within the Research and Knowledge Exchange Office (RKEO) as the Student Engagement Co-Ordinator and had the chance to be involved in a number of initiatives, including HEIF. Whilst reflecting on his time in RKEO and on ideas for his Graduate Project, he realised that there are many interesting projects at BU.
Commenting on his chosen topic, Ollie said: “It also struck me that here was an ideal opportunity to create some really engaging media content in order to showcase the innovation journeys and provide more information about innovation and knowledge exchange at BU. This will aim to highlight the people involved with HEIF at BU, as well as the research.”