Generative AI is flooding the social web with content that looks authentic but isn’t — and the only structural defence is a shift from global popularity metrics to verifiable local trust.

What this piece is

A reflection on the AI-driven credibility collapse of the social internet, and the emerging model that replaces it: trust circles, federated infrastructure, and feeds built around verified human relationships rather than engagement algorithms. Originally published on Medium, March 2026.

What is the core idea?

The internet is entering a “digital uncanny valley” — content that passes credibility tests on surface quality but registers as hollow to human readers. As generative AI makes high-quality writing cheap and ubiquitous, the craft of content has been severed from the identity of its author. We can no longer infer trustworthiness from quality. The response is structural: move from viral reach to verifiable source, from follower counts to trust circles, from centralised platforms to federated infrastructure.

What are the key themes?

The digital uncanny valley. Perfectly formatted posts, surgically precise professional updates — technically high-quality but missing lived experience. Our brains register the absence even when we cannot name it. As AI mimics personal anecdotes and niche expertise at scale, content quality alone becomes a less reliable indicator of a trustworthy source.

Trust circles and transitive trust. The emerging model uses layered real-world relationships rather than raw follower counts. Peer attestations create signed edges in a social graph. Transitive trust means that if Alice verifies Bob’s identity, and Carol trusts Alice, Carol can infer a degree of trust in Bob. Graph algorithms (SybilRank, PageRank variants) filter automated clusters and surface genuine human signals.

Decentralised social infrastructure. Federation and data portability move the power to define “truth” away from a central corporation and into communities. Individual instances set their own moderation rules. If one server becomes a bot farm, other communities can block it by consensus — without censoring the whole network. Users own their social graph rather than renting access to a platform’s database.

The emerging protocols. Three decentralised models are currently competing:
  • Mastodon — federation via ActivityPub/Fediverse; community-led servers that interoperate
  • Bluesky — AT Protocol with hard decentralisation; users choose their own algorithms and can migrate their full profile
  • Nostr — fully ownerless; every post is a cryptographically signed message; authenticity verified directly through the signature
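The transitive-trust idea described above can be sketched in a few lines. This is a minimal illustration, not any protocol's actual algorithm: the edge weights, the per-hop decay factor, and the `inferred_trust` helper are all assumptions made for the example, and real systems would verify the cryptographic signature on each attestation edge (omitted here).

```python
# Transitive trust sketch: peer attestations form weighted edges in a
# social graph, and inferred trust decays with each extra hop.
# In a real system each edge would be a signed attestation; signature
# checking is omitted from this illustration.

# Direct attestations: truster -> {trustee: trust weight in [0, 1]}
attestations = {
    "carol": {"alice": 0.9},
    "alice": {"bob": 0.8},
}

DECAY = 0.5  # illustrative policy: each extra hop halves inferred trust


def inferred_trust(graph, source, target, max_hops=3, path=None):
    """Best trust score from source to target within max_hops,
    multiplying edge weights and applying per-hop decay.
    Tracks the current path to avoid cycles."""
    if path is None:
        path = {source}
    if max_hops == 0:
        return 0.0
    best = 0.0
    for neighbour, weight in graph.get(source, {}).items():
        if neighbour in path:
            continue  # never revisit a node on the current path
        if neighbour == target:
            best = max(best, weight)  # direct attestation
        # Indirect route: trust in neighbour, decayed, times whatever
        # the neighbour's own attestations say about the target.
        onward = inferred_trust(graph, neighbour, target,
                                max_hops - 1, path | {neighbour})
        best = max(best, weight * DECAY * onward)
    return best


# Carol has never met Bob, but Alice (whom Carol trusts at 0.9)
# attests to Bob at 0.8, so Carol infers 0.9 * 0.5 * 0.8 = 0.36.
print(inferred_trust(attestations, "carol", "bob"))
```

Sybil-resistant rankers such as SybilRank work on the same underlying graph but propagate trust globally rather than per-pair; the per-pair search above is just the simplest way to show the Alice–Bob–Carol example from the text.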
Trust-first feeds. The successor to the algorithmic “For You” page is a feed that prioritises posts from within one to three degrees of personal trust separation. This creates a human filter that naturally excludes generic AI noise — not through content moderation, but through relationship architecture.
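A trust-first feed of this kind can be sketched as a breadth-first filter over the trust graph: compute each author's degree of separation, drop anyone beyond the cutoff, and rank nearer authors first. The data shapes (`follows`, `posts`) and the recency tiebreak are illustrative assumptions, not taken from any existing platform.

```python
from collections import deque


def degrees_of_separation(follows, me, max_degree=3):
    """BFS over the trust graph: map each reachable user to their hop
    distance from `me`, exploring no further than max_degree hops."""
    dist = {me: 0}
    queue = deque([me])
    while queue:
        user = queue.popleft()
        if dist[user] == max_degree:
            continue  # cutoff reached; do not expand further
        for other in follows.get(user, ()):
            if other not in dist:
                dist[other] = dist[user] + 1
                queue.append(other)
    return dist


def trust_first_feed(posts, follows, me, max_degree=3):
    """Keep only posts whose author sits within max_degree hops of `me`;
    rank nearer authors first, most recent first within a tier."""
    dist = degrees_of_separation(follows, me, max_degree)
    visible = [p for p in posts if p["author"] in dist and p["author"] != me]
    return sorted(visible, key=lambda p: (dist[p["author"]], -p["timestamp"]))


# A bot four hops out never enters the feed, with no content
# moderation involved: it is simply outside the trust radius.
follows = {"me": ["ana"], "ana": ["ben"], "ben": ["cleo"], "cleo": ["bot"]}
posts = [
    {"author": "bot", "timestamp": 4},
    {"author": "ben", "timestamp": 3},
    {"author": "ana", "timestamp": 2},
]
feed = trust_first_feed(posts, follows, "me")
print([p["author"] for p in feed])
```

The exclusion happens in the graph traversal, not in a content classifier, which is exactly the "relationship architecture" point: the bot's posts are never scored or scanned, they are simply unreachable within three degrees of trust.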