
Conversation article: Children have been interacting in the metaverse for years – what parents need to know about keeping them safe

Professor Andy Phippen writes for The Conversation about the virtual worlds children access, and how parents can support using them safely.

Children have been interacting in the metaverse for years – what parents need to know about keeping them safe

Frame Stock Footage/Shutterstock

Andy Phippen, Bournemouth University

The metaverse sounds like it could be a scary place. Recent headlines have highlighted the dangers to children of the metaverse – a generic term for the range of online virtual worlds, developed by different tech companies, in which users can interact. Children’s charities have raised concerns about its potential for harm.

Recently, Meta – Facebook’s parent company – announced that teenagers would be able to use its VR Horizon Worlds app in North America. In this online environment, users are represented by avatars and spend time in virtual worlds, making use of virtual reality (VR) headsets. Some politicians in the US have already voiced their unease. It is certainly possible that Meta could extend this access to teens elsewhere in the world.

It would be no surprise if parents were concerned about this technology and how it might affect their children. In fact, children are already online in the metaverse – and there are steps parents can take to understand this technology, the risks it may pose, and what they can do.

Avatars and online games

Perhaps the most famous current interactive world aimed at children is Roblox, an online platform that allows users to create avatars, play games, make their own games, and interact with others. Young people play games developed by other users – the most popular is currently Adopt Me!, in which players adopt animals and live with them in a virtual world.

This mix of gameplay, interaction with others and opportunities for creativity helps explain why Roblox is so popular. While it can be played using VR headsets, the vast majority of interaction takes place on more traditional devices such as phones, tablets and laptops.

Another emerging platform, Zepeto, has a similar model of allowing users to create environments, access “worlds” developed by others, and chat with others within these environments. Some young people will interact solely with their own group of friends in a specific world; other worlds will allow interaction with people they don’t know.

However, there is a rich history of platforms that could be considered, in modern terminology, to be “metaverses”. One is Minecraft, perhaps the most popular platform before Roblox. Launched in 2011, Minecraft is a block-building game which also allows for interaction with other users.

Before Minecraft, there were other platforms such as multiplayer online games Club Penguin (launched 2005) and Moshi Monsters (launched 2008) which, while smaller in scope, still allowed young people to engage with others on online platforms with avatars they created. These games also attracted moral panics at the time.

New terms such as “the metaverse” and unfamiliar technology like VR headsets might make these environments seem entirely new but, as with most things in the digital world, they are simply progressions of what has come before.

And on the whole, the risks remain similar. Headsets in VR-based worlds do present new challenges in terms of how immersive the experience is, and how we might monitor what a young person is doing. But otherwise, there is little new in the risks associated with these platforms, which are still based around interactions with others. Children may be exposed to upsetting or harmful language, or they may find themselves interacting with someone who is not who they claim to be.

Parental knowledge

In my work with colleagues on online harms, we often talk about mitigating risk through knowledge. It is important for parents to have conversations with their children, understand the platforms they are using, and research the tools these platforms provide to help reduce the potential risks.

Most platforms provide parental controls and tools to block and report abusive users. Roblox offers a range of tools for parents, from restricting who their children can play with to monitoring a child’s interactions in a game. Zepeto has similar services.

As a parent, understanding these tools, how to set them up and how to use them is one of the best ways of reducing the risk of upset or harm to your child in these environments.

However, perhaps the most important thing is for parents to make sure their children are comfortable telling them about issues they may have online. If your child is worried or upset by what has happened on one of these platforms, they need to know they can tell you about it without fear of being told off, and that you can help.

It is also best to have regular conversations rather than confrontations. Ask your child’s opinion or thoughts on news stories about the metaverse. If they know you are approachable and understanding about their online lives, they are more likely to talk about them.

Andy Phippen, Professor of IT Ethics and Digital Rights, Bournemouth University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Shaping the metaverse into reality: multidisciplinary perspectives on opportunities, challenges, and future research. 

New metaverse paper co-authored by Professor Dimitrios Buhalis

Koohang, A., Nord, J., Ooi, K., Tan, G., Al-Emran, M., Aw, E., Baabdullah, A., Buhalis, D., Cham, T., Dennis, C., Dutot, V., Dwivedi, Y., Hughes, L., Mogaji, E., Pandey, N., Phau, I., Raman, R., Sharma, A., Sigala, M., Ueno, A. and Wong, L. (2023), “Shaping the metaverse into reality: multidisciplinary perspectives on opportunities, challenges, and future research”, Journal of Computer Information Systems, ISSN 0887-4417, https://doi.org/10.1080/08874417.2023.2165197

The metaverse has been described as the next iteration of the internet: a virtual platform that uses extended reality technologies – augmented reality, virtual reality, mixed reality, 3D graphics and other emerging technologies – to allow real-time interactions and experiences in ways that are not possible in the physical world. Companies have begun to notice the impact of the metaverse and how it may help maximize profits. The purpose of this paper is to offer perspectives on several important areas – marketing, tourism, manufacturing, operations management, education, the retailing industry, banking services, healthcare and human resource management – that are likely to be impacted by the adoption and use of the metaverse. Each area is covered with an overview, opportunities, challenges and a potential research agenda.

 

Professor Dimitrios Buhalis’ research on the metaverse

 

Research in the metaverse for value co-creation and technology-enhanced experience.


 

Professor Dimitrios Buhalis’ latest research focuses on the metaverse and has been published in two recent high-ranking journals.

The metaverse blends the physical and virtual worlds, revolutionizing how hospitality customers and hospitality organizations co-create transformational experiences and value. The research explores the opportunities and challenges that the metaverse introduces to the customer experience and value co-creation process, and how it may transform both. Further research is planned to exploit the full potential of the metaverse for value co-creation and technology-enhanced experiences.

 

Buhalis, D., Lin, M.S. and Leung, D. (2023), “Metaverse as a driver for customer experience and value co-creation: implications for hospitality and tourism management and marketing”, International Journal of Contemporary Hospitality Management, Vol. 35, https://doi.org/10.1108/IJCHM-05-2022-0631

Dwivedi, Y., Hughes, L., Baabdullah, A., Ribeiro-Navarrete, S., Giannakis, M., Al-Debei, M., Dennehy, D., Metri, B., Buhalis, D., Cheung, C., Conboy, K., Doyle, R., Goyal, D.P., Gustafsson, A., Jebabli, I., Kim, Y.-G., Kim, J., Koos, S., Kreps, D., Kshetri, N., Kumar, V., Ooi, K., Papagiannidis, S., Pappas, I., Polyviou, A., Park, S., Pandey, N., Queiroz, M., Raman, R., Rauschnabel, P., Shirish, A., Sigala, M., Spanaki, K., Tan, G. W.-H., Tiwari, M., Viglia, G. and Fosso Wamba, S. (2022), “Metaverse beyond the hype: Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy”, International Journal of Information Management, Vol. 66, October 2022, 102542, https://doi.org/10.1016/j.ijinfomgt.2022.102542

 

Conversation article: Protecting children in the metaverse – it’s easy to blame big tech but we all have a role to play

Professor Andy Phippen writes for The Conversation about child safety in virtual spaces…

Protecting children in the metaverse: it’s easy to blame big tech, but we all have a role to play

Newman Studio/Shutterstock

Andy Phippen, Bournemouth University

In a recent BBC news investigation, a reporter posing as a 13-year-old girl in a virtual reality (VR) app was exposed to sexual content, racist insults and a rape threat. The app in question, VRChat, is an interactive platform where users can create “rooms” within which people interact (in the form of avatars). The reporter saw avatars simulating sex, and was propositioned by numerous men.

The results of this investigation have led to warnings from child safety charities including the National Society for the Prevention of Cruelty to Children (NSPCC) about the dangers children face in the metaverse. The metaverse refers to a network of VR worlds which Meta (formerly Facebook) has positioned as a future version of the internet, eventually allowing us to engage across education, work and social contexts.

The NSPCC appears to put the blame and responsibility on technology companies, arguing they need to do more to safeguard children in these online spaces. While I agree platforms could be doing more, they can’t tackle this problem alone.

Reading about the BBC investigation, I felt a sense of déjà vu. I was surprised that anyone working in online safeguarding would be – to use the NSPCC’s words – “shocked” by the reporter’s experiences. Ten years ago, well before we’d heard the word “metaverse”, similar stories emerged around platforms including Club Penguin and Habbo Hotel.

These avatar-based platforms, where users interact in virtual spaces via a text-based chat function, were actually designed for children. In both cases, adults posing as children in order to investigate were exposed to sexually explicit interactions.

The demands that companies do more to prevent these incidents have been around for a long time. We are locked in a cycle of new technology, emerging risks and moral panic. Yet nothing changes.

It’s a tricky area

We’ve seen demands for companies to put age verification measures in place to prevent young people accessing inappropriate services. This has included proposals for social platforms to require verification that the user is aged 13 or above, or for pornography websites to require proof that the user is over 18.

If age verification was easy, it would have been widely adopted by now. If anyone can think of a way that all 13-year-olds can prove their age online reliably, without data privacy concerns, and in a way that’s easy for platforms to implement, there are many tech companies that would like to talk to them.

Similarly, policing the communication that occurs on these platforms won’t be achieved through an algorithm alone. Artificial intelligence is nowhere near clever enough to intercept real-time audio streams and determine, with accuracy, whether someone is being offensive. And while there might be some scope for human moderation, monitoring all real-time online spaces would be impossibly resource-intensive.
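To see why, consider the simplest automated approach: a keyword filter. The short Python sketch below is purely illustrative (it is not drawn from the article, and it is not any platform’s actual moderation system); even for text chat it is trivially evaded and prone to false positives, and applying anything like it to live speech would first require accurate real-time transcription.

# Toy illustration only: a naive keyword-based chat filter of the kind
# automated moderation often starts from. It is not any platform's real
# system; its weaknesses show why policing live interaction is hard.
import re

BLOCKLIST = {"idiot", "loser"}  # hypothetical example terms

def flag_message(message: str) -> bool:
    """Return True if the message contains a blocklisted word."""
    words = re.findall(r"[a-z]+", message.lower())
    return any(word in BLOCKLIST for word in words)

print(flag_message("you absolute idiot"))    # True: caught
print(flag_message("you absolute 1d10t"))    # False: trivially evaded
print(flag_message("please don't call people idiot, it's unkind"))  # True: a false positive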

The reality is that platforms already provide a lot of tools to tackle harassment and abuse. The trouble is few people are aware of them, believe they will work, or want to use them. VRChat, for example, provides tools for blocking abusive users, and the means to report them, which might ultimately result in the user having their account removed.

People will access the metaverse through technology like VR headsets.
wavebreakmedia/Shutterstock

We cannot all sit back and shout, “my child has been upset by something online, who is going to stop this from happening?”. We need to shift our focus from the notion of “evil big tech”, which really isn’t helpful, to looking at the role other stakeholders could play too.

If parents are going to buy their children VR headsets, they need to have a look at safety features. It’s often possible to monitor activity by having the young person cast what is on their headset onto the family TV or another screen. Parents could also check out the apps and games young people are interacting with prior to allowing their children to use them.

What young people think

I’ve spent the last two decades researching online safeguarding – discussing concerns around online harms with young people, and working with a variety of stakeholders on how we might better help them. I rarely hear young people themselves demand that the government bring big tech companies to heel.

They do, however, regularly call for better education and support from adults in tackling the potential online harms they might face. For example, young people tell us they want discussion in the classroom with informed teachers who can manage the debates that arise, and of whom they can ask questions without being told “don’t ask questions like that”.

However, without national coordination, I can sympathise with any teacher not wishing to risk complaint from, for example, outraged parents, as a result of holding a discussion on such sensitive topics.

I note the UK government’s Online Safety Bill, the legislation that policymakers claim will prevent online harms, contains just two mentions of the word “education” in 145 pages.

We all have a role to play in supporting young people as they navigate online spaces. Prevention has been the key message for 15 years, but this approach isn’t working. Young people are calling for education, delivered by people who understand the issues. This is not something that can be achieved by the platforms alone.

Andy Phippen, Professor of IT Ethics and Digital Rights, Bournemouth University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Facebook goes ‘Meta’: what does it mean in practice?

Dr Carina E I Westling and Dr Hongchuan Yu write about Facebook’s recent announcement and what the ‘Metaverse’ will entail… 

Facebook’s recent announcement about its new focus creates more questions than clarity. The new brand name suggests a desire for greater confluence which, in line with the legacy business model inherited from Facebook, means closer meshing with the totality of our personal and professional lives. How this will manifest is yet to be seen but the devil will be in the detail, as Meta will create both challenges and opportunities for innovation in design and policy. With close attention to emerging technologies and the policy frameworks that support their implementation, researchers and educators at the Faculty of Media and Communication at Bournemouth University are collaborating with industry advisors to make our new programmes BA Immersive Media and BSc Virtual and Augmented Reality crucibles for responsible creative development.

The ‘embodied internet’ is an oxymoron, but virtual reality (VR) and, to some extent, augmented reality (AR) and mixed reality (XR) technologies seek to produce an approximation of physical experience. With 5G, many scenarios of real-time interaction based on cloud computing can be fulfilled. This offers new possibilities to the creative industries through VR/AR/XR technologies, ostensibly to realise the ‘metaverse’: the convergence of our physical and digital lives. However, you cannot accelerate connectivity without proportionate risks of exposure.

Effective storytelling will need to be ethical 

Our research and teaching programmes are geared towards developing the human skills that drive excellent storytelling in and beyond games and experience design, and we are keenly aware of the changing policy landscape that is sure to follow in the wake of interactive VR/AR/XR. Since 2015-2016, managing the risks associated with the type of personal data that is the bread and butter of all free-at-the-point-of-use, audience-facing digital platforms has been a top priority, and the opening up of VR/AR/XR technologies to real-time interactivity will raise the stakes further.

Meta’s vision, or rather proposition, for a technologically convergent interactive and social space is broad, meaning that audiences will comprise naïve users in everyday situations, seasoned users in professional situations, and every type of audience in between. At scale, services, social spaces and interactive storytelling designed for this virtual milieu will present new challenges to research and development, including known and ‘unknown unknown’ problems with data management and security. Delivery of complex interactive media environments with default open web connectivity will create a host of new attack surfaces for cybercriminals and digital mavericks. Public appetite for more exposure – particularly of children and vulnerable adult populations – to malign actors is about as great as their trust in the brand that Meta seeks to leave behind.


Broad adoption, sustainable development and effective storytelling in this domain will require that research, design and production are framed by a clear commitment to ethical principles and the mitigation of risks to privacy and data security. Early publicity materials indicate Meta’s awareness of this, but Zuckerberg & co. still have to regain the trust of peers and public. That is not to say Meta is doomed to join Second Life – its reception in the industry press may have been on the chilly side, but the rebrand presents an opportunity to be more than a clean slate. We will need to see unflinching recognition of past errors and genuine steps taken to integrate data security with appropriate risk modelling and attention to scaling effects. But if Meta walks the walk, it may come to play a part in, and perhaps even lead, the ‘coming of age’ of social media.

High stakes 

As Meta, Facebook plans to spend at least $10 billion on metaverse-related projects this year. Bloomberg Intelligence further predicts that “the global metaverse revenue opportunity could approach $800 billion in 2024”. Whether we greet such developments with enthusiasm or trepidation, it is clear that social media will see a step change, even if we cannot be certain of its nature.

Early VR technology was derived from computer graphics and relied on specialist hardware to deliver expert applications such as surgical training and planning, high-end games and flight simulators. In addition to 5G, recent advances in computer vision and machine learning (sometimes called AI), applied to VR/AR/XR, may help realise their broad adoption: the Meta vision of a 3D, virtual, social space where you might share, in real time, experiences that aren’t feasible in the physical world.

Technology marketing has not always delivered on its promises, but innovation has created real change, and content producers will need to be aware of developments in this domain. As Cathy Hackl says: “If the internet and social media changed your business or changed the way you interact with people, then you should be paying attention to what 3.0 and the metaverse will do, because it will change those things as well.” We might speculate about effects on how we tell stories and socialise remotely, but we will almost certainly see this type of platform used as a productivity tool, made more relevant by imperatives to reduce travel and carbon footprints.

As with most predictions, the actuality is likely to be more prosaic than any utopias or dystopias we conjure up, but probably not unimportant. In the past decade and a half, social media have become a critical concern with real-world impacts. It will be interesting to see if Meta can finally shed Facebook’s unfortunate association with FaceMash, Zuckerberg’s jockish student experiment. Growing up is overdue.

By Dr Carina E I Westling and Dr Hongchuan Yu, Bournemouth University

This piece was originally published on BU’s LinkedIn page