Project Background
You are invited to take part in the following project:

Trust of Conversational A.I. for Interpreting Complex Rule-Based Frameworks in Decision-Critical Environments

Before agreeing to take part, please read this information sheet carefully and let us know if anything is unclear or if you would like further information.

Project Supervisor (UoY): Michael O'Dea, michael.odea@york.ac.uk

What is the purpose of the project?
This project is being conducted by Alex Gidman (ag983@york.ac.uk) and Michael O'Dea (michael.odea@york.ac.uk). The aim of this project is to understand to what extent conversational A.I. systems designed to interpret complex rule-based frameworks can be trusted by users to make authoritative decisions in decision-critical environments. Your participation in this project is voluntary.
Do I have to take part?
No, participation is optional. If you do decide to take part, you will be asked to provide your consent. If you change your mind at any point during the research activity, you will be able to withdraw your participation without having to provide a reason. To withdraw your participation, contact either Alex Gidman or Michael O'Dea within one week (see contact details above) and request that your data be withdrawn; this will be actioned as soon as possible.
How will you use my data?
We will collect responses to questions about the outputs of a novel A.I. application intended to deliver game rulings and judgements for the Magic: The Gathering trading card game. The data collected from you will be anonymised, analysed and used to produce reports. Anonymised data may be reused by the research team or other third parties for secondary research purposes.
What will I be asked to do?
We are recruiting participants for the study. Participating would involve taking part in an anonymous survey, answering questions about the outputs of a novel A.I. application intended to deliver game rulings and judgements for the Magic: The Gathering trading card game. The aim is to learn more about techniques available to improve trust in conversational A.I. for decision-making. We expect that participating will take approximately 10 minutes.
How will you keep my data secure?
Information will be treated confidentially and shared on a need-to-know basis only. We are committed to the principle of data protection by design and default and will collect the minimum amount of data necessary for the project. In addition, we will anonymise or pseudonymise data wherever possible. In compliance with the General Data Protection Regulation (GDPR) and the Data Protection Act 2018, the "data controller" is the University of York. Contact the Data Protection Officer (dataprotection@york.ac.uk) with any data protection questions, comments or complaints. The legal basis for processing personal data for research under the GDPR is Article 6(1)(e).

P.S.: This survey contains Karma, which can be used to get free survey responses at SurveySwap.io.