Date: Tuesday 8 November 2016
Location: Executive Business Centre (EBC), Bournemouth University, 89 Holdenhurst Road, Bournemouth, BH8 8EB
As part of the ESRC Festival of Social Science, BU's Dr Nava Tintarev and Dr Paolo Palmieri will host an information session and focus group to talk about their research into digital privacy and to discuss people's opinions on the use of their personal data.
As many of us may be aware, our personal data is used to filter our Facebook timelines and to personalise our Amazon suggestions. With the exponential growth of information, many online systems filter and adapt the information we are exposed to. However, as users we have not always agreed to this personalisation, and we're often left unaware of whether our personal data is being used to benefit us. There is also a risk that if personalisation gets 'too good', it can narrow our exposure to new things. We're left to wonder how our personal information is being used.
Imagine a tourist visiting a city and looking for new places to explore. A recommender system would suggest several places to visit and a certain sequence to visit them in. To give good recommendations, the system needs to consider the relationship between suggested items. For example, the tourist may not want to visit more than two museums in a day, and they would not want to go from one museum straight to another one, so there should be something else to visit in-between.
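The kind of sequence constraints described above can be sketched in a few lines of code. This is a minimal, illustrative example only: the place names, category labels, and the two rules (at most two museums per day, and never two museums back to back) are hypothetical stand-ins for whatever constraints a real recommender system would model.

```python
# Illustrative sketch of checking a recommended day itinerary against
# simple sequence constraints. All names and rules here are hypothetical.

def satisfies_constraints(sequence, max_museums=2):
    """Return True if the itinerary has at most `max_museums` museums
    and never places two museums consecutively."""
    museum_count = sum(1 for _, category in sequence if category == "museum")
    if museum_count > max_museums:
        return False
    # Check that no museum is immediately followed by another museum.
    for (_, first), (_, second) in zip(sequence, sequence[1:]):
        if first == "museum" and second == "museum":
            return False
    return True

itinerary = [
    ("Natural History Museum", "museum"),
    ("Botanical Gardens", "park"),
    ("City Art Museum", "museum"),
]
print(satisfies_constraints(itinerary))  # True: two museums, separated by a park
```

A real system would score many candidate sequences against preferences as well as constraints like these; the point of the sketch is simply that "good" recommendations depend on relationships between items, not on each item in isolation.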
Working with adults of different backgrounds and degrees of technological experience, Dr Nava Tintarev hopes to use the focus group to understand what factors make a good explanation for these sequences, and to better understand how people feel about this personalisation.
One scenario is when there is no single best option and the sequence reflects trade-offs. For example, someone travelling in a group may not see their top preference in the sequence, because the system took into account the preferences of others in the group.
The second scenario in which explanations can really help is when the recommendations contain unexpected but riskier items. Recommender systems often suggest safe items: for example, suggesting the latest Star Wars movie to a Star Wars fan, or inferring from a user's consumption habits that they are similar to users who like the newly released Jurassic World movie. The problem is that even though these systems suggest new items, the recommendations are predictable. They miss out on what is potentially the greatest strength of recommender systems: helping users discover new items and new interests they didn't realise they had.
It is vital to understand people's concerns. Dr Nava Tintarev's research looks specifically at the use of explanations to help users make good decisions about recommended sequences of items. It is important that users are confident they can trust these systems. Through her research, she hopes to help users gain a better sense of how their personal data is being used. This event will help you gain a better understanding of how big data companies use personal data and the challenges they face.
The 14th annual Festival of Social Science takes place from 5-12 November 2016, with more than 250 free events nationwide. Run by the Economic and Social Research Council, the Festival provides an opportunity for anyone to meet some of the country's leading social scientists and discover, discuss and debate the role that research plays in everyday life. With a whole range of creative and engaging events, there's something for everyone, including businesses, charities, schools and government agencies. A full programme is available at www.esrc.ac.uk/festival. You can also join the discussion on Twitter using #esrcfestival.
This event is running twice on Tuesday 8 November to allow as many people as possible to attend. If you would like to attend the 14:30-17:00 session, please book your free place here. If you would prefer to attend in the evening, there is a session running 18:00-20:30, for which you can book your free place here.