Ethical Issues

This is an area I haven't thought about as much as the appropriateness issues, so please do feel free to add to and dispute anything I say here! :)

There are of course many ethical issues to consider when making an emotionally adaptive game. For instance, people may object to having their emotional states and/or changes monitored in the first place, and many may not like the idea of the game storing this information for future use (in TEDDI the information was stored for future use for only a minute, though as I had to do analysis afterwards all data was written to a log file). The information storage could be short and temporary (e.g. a 60-second buffer), or it could be longer term, allowing the game to learn more about the player and thus adapt the game in more meaningful ways. Many people are wary of any information being stored about them, and though it is not obvious to me how this sort of information could be misused, it is an important issue to consider. The way I see it, however, is that people who were worried about their data being stored and misused would likely not choose to play the game at all, and so the issue is avoided. Designers should still be aware of it when determining their target group.
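As a minimal sketch of the "short and temporary" option (hypothetical names, not TEDDI's actual implementation), the 60-second buffer could simply be a fixed-size queue that silently discards readings older than the window:

```python
from collections import deque
import time

class BiometricBuffer:
    """Keep only the most recent biometric samples; older readings
    fall off the end automatically and are never persisted to disk."""

    def __init__(self, max_samples=60):
        # At one sample per second, 60 samples is roughly a 60-second window.
        self.samples = deque(maxlen=max_samples)

    def add(self, reading):
        self.samples.append((time.time(), reading))

    def recent(self):
        """Return just the retained reading values, oldest first."""
        return [r for _, r in self.samples]

buf = BiometricBuffer(max_samples=60)
for i in range(100):
    buf.add(i)               # only the last 60 readings survive
print(len(buf.samples))      # → 60
print(buf.recent()[0])       # → 40 (oldest retained reading)
```

The longer-term alternative would instead append every sample to persistent storage, which is exactly where the data-misuse concerns above come in.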

To keep the post fairly short I'll only mention one other ethical issue at this time, namely that of adapting a computer game played in a player vs. player mode. In a battle between two players, the game can't adapt the players' skill levels or luck, as this would make the game unfair to the better player. These are exactly the sorts of adaptations used in TEDDI, which was a one-player game only. So what can be done in the versus situation? I would suggest that the environment be adapted instead: the colours of the scenery (even subtle changes could have an impact), the tone and speed of the music, and other environmental aspects specific to the game. This could make a player feel better or worse about the situation without actually helping or obstructing the gameplay in favour of one player, and without interfering with the battle.
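To illustrate what such cosmetic-only adaptation might look like (hypothetical function and values, not drawn from TEDDI), the key design constraint is that gameplay variables are never touched:

```python
def adapt_environment(arousal):
    """Map a normalised arousal estimate (0.0 = calm, 1.0 = stressed)
    to purely cosmetic parameters: music tempo and a scene colour tint.
    Gameplay values (damage, luck, AI skill) are deliberately untouched,
    so neither player gains a mechanical advantage."""
    arousal = max(0.0, min(1.0, arousal))     # clamp to the valid range
    tempo_bpm = 90 + 60 * arousal             # music speeds up with arousal
    # warm (red-leaning) tint when calm, cooler (blue-leaning) when stressed
    tint_rgb = (int(255 * (1 - arousal)), 128, int(255 * arousal))
    return tempo_bpm, tint_rgb

tempo, tint = adapt_environment(0.5)
print(tempo)   # → 120.0
print(tint)    # → (127, 128, 127)
```

Whether a given mapping actually makes a player "feel better or worse" is of course an empirical question; the point of the sketch is only that the adaptation channel stays outside the game rules.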

What do you guys think of these ethical issues, and possible solutions? And more importantly, what sort of ethical issues do you envisage with emotionally adaptive computer games?
   

9 Comments

Posted by Emilio

What is TEDDI?

Posted by Termite
@Emilio said:
" What is TEDDI? "
Well you see, he's this bear, but he's like...empty inside....
Posted by KaosAngel
@Termite said:
" @Emilio said:
" What is TEDDI? "
Well you see, he's this bear, but he's like...empty inside.... "
...and he loves to be kissed.
Posted by Butchio
@KaosAngel said:
" @Termite said:
" @Emilio said:
" What is TEDDI? "
Well you see, he's this bear, but he's like...empty inside.... "
...and he loves to be kissed. "
hahaha. i havent got a clue what this thread is about
Posted by Phewsie

TEDDI is an emotionally adaptive computer game which I developed for my PhD project. It uses biometrics and game context to determine emotional changes in the player. For more details, see earlier blog posts. :)

Posted by torus
@Phewsie:    

I haven't read much beyond a linked article and this blog post, but I'm not clear on where exactly ethical concerns enter the picture here. You only really mention it in one context, when you point out that people might be uncomfortable with psychological data being... stored? Used in the future? If we're talking about the confidentiality of a person's psychological profile, then of course it's up to the game designer to make sure that personal data is kept secure, and that the user knows that data is being collected. Am I missing something?  
 
PS. I would love to have someone like you who thinks about these things in game design on my team. Too bad I'm a poor OSS dev :P 
 
PSS. My hands don't sweat much, I wonder if this would work on me...
Posted by crystalskull2

In which language is this thread written? I can't understand it.

Posted by Phewsie
@torus: 
 
I try to keep my blog posts as short as possible in hopes that people will actually read them, and perhaps comment (like you did!).
It is a problem for new people to my blog though, as they don't already know what it is I'm up to..
 
The post is actually about two ethical issues, that of storing personal data (which is an ethical issue outside games as well), and in developing an adaptive game in a player versus player situation. The ethical issue here is about fairness to the player. It's an obvious issue, to which I have suggested a solution, but there could maybe be others, perhaps more interesting ones?
 
What I'm searching for is a list, if you like, of ethical issues which would arise when attempting to develop an emotionally adaptive computer game. I am of course attempting to create this list myself, but am searching for inspiration and other people's input. As always, anything I use in my PhD thesis will be referenced; I'm not out to steal anyone's ideas!
 
There would for instance be an issue in an online pve game if the monsters encountered were easier to kill for one person than another. Taking WoW as an example, if one person experienced fear, or too much stress and the environment adapted to make it easier for this player, it would require a lot less of this person to get to level 80 (this is still highest level, right?) than someone who may be more experienced with games and as such handled everything with 'emotional ease'.
 
Responding to your point on the psychological data: we're not really talking about psychological data here; there is no profile as such based on the player's psychology. We're collecting physiology only, and the question is, is there a way to abuse this data? What could be the result of a 'data leak'? It is an ethical issue because people are wary of any data being stored about them, especially if they don't really understand how it could be used..
 
@ur ps.: When I'm done with my PhD I might be able to help.. :) For now I need to focus on this stuff!
@ur pps.: I have not met anyone yet on whom this hasn't worked. The tiniest changes are detected, and it does go up and down in all the people I've tested it on (about 100 in total). So, yes, I think it would work on you :)
Posted by RagingLion

Ethical issues arising from the storing of physiological data would only be of consequence if there are nefarious applications that can be created using the data - you're a better judge than me if that's even possible, but I understand that thinking of specific applications could be hard to tie down.
 
In terms of storing data about users I think Silent Hill: Shattered Memories is an excellent example of that actually being used in a serious way for perhaps the first time (as far as I know).  Don't know if you've followed the GB coverage of the game but the QuickLooks give a lot of info about it.  It's not physiological data but it is psychological data - quite how deep it searches for information and then how it applies it is not completely known and arguable but it's certainly in the vein of building up a picture of the person playing the game.  As far as I'm aware it's not a feature against which there's been any adverse reaction, rather it seems players have just found it interesting and engaging because of that.  I don't know, but SH:SM could be an interesting game in general to look at because it seems to have some similarities to the work you're doing though only from information gained in game.