Due to the potentially sensitive nature of the data and the overall purpose of TrustPal, there are several serious social and ethical points which must be considered carefully.
– Application name and purpose: we chose the name TrustPal for the application because it signifies a means of talking to someone informally about problems, issues or concerns, in a safe environment. We feel it promotes a positive outlook, since both "trust" and "pal" carry positive connotations, and it is a simple, specific description of what the service offers. Since the application gives individuals a chance to talk about personal troubles or other potentially sensitive information, establishing that it is a safe and trustworthy application is very important. We chose "trust" over "safe" because trust also refers to being able to trust the volunteers and individuals behind the application, not just the technical aspects such as the anonymity protocols. Furthermore, "trust" has a more reassuring quality than "safe", which can sound more severe. We avoided words such as "mental" or "counseling" so as not to give the impression of a formal service (such as the counseling service at the university or NHS-provided services), which would be misleading, since TrustPal would be operated by volunteers rather than paid experts. The application is intended as a form of informal support, not a formalized service. We have also avoided references to mental health or mental problems, since these are generally socially stigmatized. Such stigma could deter individuals from using the service and seeking help. It could also limit the scope of the application: since the service is meant to help people with any type of difficulty, references to mental health might cause it to be interpreted or advertised too narrowly.
– Privacy:
- Personal and sensitive data: TrustPal is an application where people can chat anonymously. Because of the potential exchange of sensitive or personal information, both the identity of users and the safety of the data must be guaranteed. Anonymization protocols are considered in a separate post. This main feature protects the user's identity, so people can feel safe communicating about things they struggle with, without the concern of being exposed or stigmatized. By keeping users' identities secret from the public, the use of disclosed information for harassment or bullying is minimized. Nevertheless, the design has to allow for moderating or banning users who attempt to cause disturbance or harm to other users or the volunteers, or who act in any inappropriate way. We propose that the service has one or two admins who can identify individual users only in exceptional circumstances, either through a system of unique identifiers or by being able to see which student account is in use. This not only allows them to manage inappropriate behaviour, but also means that in cases where an individual's health and safety, or that of the people around them, is deemed to be in danger (instances of self-harm, for example), there is a way of identifying the user and delivering appropriate help. This is a policy maintained by all similar services, both university-based and NHS-based. Another point of concern is the disclosure of information from group chats by other users. However, the anonymity of users should help minimize the damage, and can also prevent malicious use of information from group chats: because of this anonymity, the personal gain from disclosing such information is rather low, so the likelihood of it happening is correspondingly low. Nevertheless, users casually mentioning things from the group chat in personal conversations with friends, for example, cannot be prevented.
In order to improve on this point, a clause can be added to the user agreement under which users agree not to share information from group chats for a malicious purpose, or in any way that might harm or affect the safety and integrity of other users, volunteers, or the application itself.
- Employees/Volunteers: another point of concern is the reliability and professionalism of the volunteers operating the service. One way of tackling this is to select volunteers based on their previous experience and to train them before they are allowed to work in the service and interact with users. This provides assurance that they have the appropriate knowledge and skills to be effective and not cause any damage. Additionally, volunteers must not disclose or share in any way information from within the service, such as what users talk about. They can be asked to sign a form of non-disclosure agreement specifically tailored to their position, which will also further reassure users that their information is treated appropriately. Volunteers' identities should also be kept anonymous, as is the protocol with other similar services such as Nightline. This protects volunteers from any social stigma which might arise, but more importantly it avoids discouraging individuals from using the service for fear of their friends or acquaintances finding out about their problems.
- Scenarios involving suicidal issues bring with them social and ethical questions regarding TrustPal's responsibilities. Such scenarios could require TrustPal volunteers to remove anonymity and escalate to relevant university welfare personnel and/or national charities. This is not something we have agreed upon for the TrustPal application, but it should be noted as a social and ethical consideration.
– Trust: the issue of trust arises at every step for TrustPal, from the very name, to the infrastructure, to the volunteers and the user community. Due to the nature and purpose of the application, trust is even more important than in other circumstances, since users would expose sensitive information which could cause damage if misused or compromised. The use of 'trust' as part of the application name is aimed at establishing the trustworthiness of the service from the start. The user agreement, which would include the clause forbidding users from sharing information from group chats, and the volunteer non-disclosure agreement are further steps to ensure the integrity of the service and thus improve users' trust. Furthermore, the application itself must be built to a high standard, to ensure that there are no technical dangers or risks on this front (for example, that the appropriate protocols are implemented correctly). Regarding trust between users and volunteers, a rating system for volunteers can be proposed, where users who have interacted with a volunteer can rate them, either overall or by category of problem (for example, a volunteer may have a high rating for emotional issues, a medium one for educational problems, and a fairly low one for personal relations). This will allow other users to choose the volunteers best suited to their needs, and it will also provide feedback on how well a volunteer is performing, which will in turn help improve the quality of the service. Since TrustPal is initially aimed only at students of the University of Southampton, having the approval of the University or the Students' Union will further help validate the service and increase users' trust in it.