AI tools and trust
People generally trust AI tools, though they have concerns about certain aspects of them. By giving users deeper insight into the nuances and limitations of AI tools, we can foster even greater trust and confidence in their use.
Trust in AI tools in general
In general, more than half of participants trust the AI tools they have tried.
Humanity
The majority of participants thought that AI tools fall short when it comes to human social skills, understanding context, or thinking creatively.
Transparency
When it comes to transparency, people would be open to learning exactly how these AI tools work.
Capability
People believe AI tools can help them work faster, which is in line with previous findings that people use these tools to save time.
However, views on how much AI tools know, or whether they are biased, are inconclusive, and the majority of people still think they need to edit the output they get.
Reliability
Regarding the reliability of AI tools, more than half of our participants fact-check the output they get from the tools; however, only one third would check it for plagiarism.
Moreover, almost half of them still think that reading articles and searching on Google would give better results than using an AI tool.