In the age of AI, trust has never been more important – here’s why

ZDNET’s key takeaways

  • AI is transforming the way humans interact with the world. 
  • Synthetic media is eroding human trust. 
  • Humans need to trust themselves in order to coexist with AI.

This month marks exactly three years since ChatGPT was launched, and in that time, AI tools have grown increasingly capable. The result is a reality in which synthetic media is increasingly realistic and prevalent, eroding human trust. While this creates a ton of problems, it is also an opportunity for humans to shine.

Also: Is that an AI video? 6 telltale signs it’s a fake

At SpiceWorks’ annual IT conference, SpiceWorld, AI and AR/VR expert Helen Papagiannis, Ph.D., led a keynote session focused on how AI is challenging the concept of reality as we know it, and what people can do to preserve humanity and coexist with emerging augmentative AI experiences. The key pillar is trust.

“How [these technologies] augment rather than replace you, how do we coexist? Trust is central to that,” said Papagiannis.

The AI reality shift

Augmented and virtual reality (AR/VR), an area Papagiannis has been studying for decades, initially blurred the lines between reality and fiction. However, AI is paving the way for a new type of synthetic reality in which it becomes increasingly difficult to distinguish what is real and what is not. 

Also: I’ve been testing AI content detectors for years – these are your best options in 2025

The beginning of the keynote featured a game called “Is it real or AI?” in which Papagiannis showed the crowd images and videos and had them guess whether each one was real. The audience, despite being made up of tech professionals, guessed wrong a majority of the time.

In the keynote, Papagiannis highlighted that this change is occurring just as frontier technologies augment human capabilities to create a new type of workforce. As we use AI in and outside of work, the tech reshapes our expectations of humanity.

“It’s in these two shifts colliding – the disappearance of the real and becoming augmented humans – where trust now becomes the new axis,” Papagiannis said. “This is rewriting human experience itself.”

Now that what we see can be fabricated, she added, trust is what will define the future. 

“With AR/VR, it kind of whets our palates a little bit about buying into something virtual or that didn’t exist,” Papagiannis told ZDNET. “Whereas now, AI just feels all pervasive, and whether it’s social media or having a conversation with a chatbot, elements of trust are playing such a big role.”

Also: I tried the Samsung Galaxy XR headset, and I wasn’t worried for smart glasses at all

Papagiannis coined the term “AfterNext,” which refers to what comes after the frontier technology that is already present. This involves considering the present change occurring in the moment and arming yourself for the next wave of technological advances. Preparing for the AfterNext, Papagiannis said, is all about foresight – which, amongst all the confusion and unease AI has brought, includes acting with confidence and purpose. She told the audience and ZDNET a few ways to do just that. 

‘Trust is the KPI’

In a world where trust is eroding, building it is more important than ever. This is especially true for working professionals, as trust will be a key component of every industry, according to Papagiannis. 

Also: Is your business ready for a deepfake attack? 4 steps to take before it’s too late

“Leadership can start now by drafting clear ethical guardrails as to what is acceptable and what is not before it becomes commonplace, because trust is the KPI, and we have this incredible opportunity to use contextual AI to support human flourishing,” said Papagiannis.

Beyond how to interact with others, preparing for this wave also involves becoming aware of your own practices and making a conscious effort to reexamine who and what you can trust. This involves not only assessing whether what you come across online is real or fake, but also trusting your own judgment. 

For example, if you have a question or a difficult task, it may come naturally to ask a chatbot for the answer. However, Papagiannis encourages users to trust their own skills and take on the task themselves, reclaiming agency, keeping a human in the loop, and prioritizing curiosity and connection.

Rely on yourself before AI tools

“I think the biggest step people can take right now is trusting themselves, rather than their first instinct is to open ChatGPT,” said Papagiannis. “How can you continue to empower yourself where you’re putting the human first, but then you can still enter a scenario where you are coexisting?”

Also: Don’t fall for AI-powered disinformation attacks online – here’s how to stay sharp

Beyond eroding human agency, overreliance on chatbots also eliminates friction, which, Papagiannis says, can be “vital” in slowing people down and encouraging practices such as reflection. That pause can enable personal growth and even lead to fulfillment, she says.

“Boredom becomes even more important in the era of AI because it protects against overstimulation and even helps to restore creativity and focus, and it’s in this momentary pause that can even help us to rediscover the human in the machine we’re in,” said Papagiannis. 
