RealityFoundation

Research for a verifiable world

We study the boundaries between authentic and synthetic media — and build the academic infrastructure to defend them. Our work underpins the tools, policies, and standards needed to restore trust in visual evidence.

2025

Founded

4

Research themes

3

Founding members

EU

Based in Europe

Why this work matters

The proliferation of generative AI has made it trivially easy to fabricate images, documents, and video. Legal systems, journalism, and democratic processes depend on the integrity of evidence — a dependency that is now under attack.

The Problem

Visual truth is eroding. Deepfakes and other AI-generated imagery are increasingly indistinguishable from authentic records, and courts, journalists, and ordinary people have no reliable mechanism to verify what they see.

The Gap

No academic anchor. Industry-led detection tools exist, but they lack the independent verification, peer review, and permanence required for legal or archival use. There is no sustained academic research program to fill that gap.

Our Response

Blockchain-anchored provenance. We research cryptographic timestamping, NFT-based provenance records, and immutable ledger architectures as tools to establish unambiguous chains of authenticity.
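The core idea behind cryptographic timestamping can be sketched in a few lines: hash the asset and record the digest alongside a timestamp, so that any later copy can be verified by recomputing the hash. This is a minimal illustration only; the record format is hypothetical, and a real deployment would anchor the digest on a public ledger rather than keep it locally.

```python
# Minimal sketch of hash-based provenance (illustrative, not a
# RealityFoundation standard). A real system would anchor the
# digest on an immutable ledger instead of returning a local dict.
import hashlib
from datetime import datetime, timezone

def provenance_record(data: bytes) -> dict:
    """Return an illustrative provenance record for raw asset bytes."""
    return {
        # SHA-256 digest uniquely fingerprints the asset's exact bytes.
        "sha256": hashlib.sha256(data).hexdigest(),
        # UTC capture time; on a ledger this would be the block timestamp.
        "timestamped_at": datetime.now(timezone.utc).isoformat(),
    }

record = provenance_record(b"example asset bytes")

# Verification: any later copy matches only if its digest is identical.
assert hashlib.sha256(b"example asset bytes").hexdigest() == record["sha256"]
```

Because the digest changes with any single-bit modification, a match between a stored record and a recomputed hash establishes that the asset is byte-for-byte unchanged since the timestamp was made.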

Long term

Academic validation layer. RealityFoundation will serve as the independent scientific counterpart to RealityCheck — providing peer-reviewed methodology, published datasets, and policy recommendations.

Research themes

Four interlocking areas that together address the challenge of maintaining verifiable reality.

Theme 01: Proof of Reality

Defining which cryptographic and physical evidence constitutes proof that a digital asset was captured rather than generated.

Cryptography · Standards

Theme 02: NFT Timestamping

Evaluating blockchain-based timestamp mechanisms for evidentiary integrity across legal and journalistic contexts.

Blockchain · Legal

Theme 03: Insurance Fraud Detection

Studying how manipulated imagery enters insurance claims processes and how provenance records can prevent it.

Applied · Insurance

Theme 04: Deepfake Forensics

Developing open methodologies for detecting and attributing synthetic media at a forensic standard of rigor.

Forensics · AI

Built for the European regulatory moment

The EU AI Act, the Digital Services Act, and the forthcoming Media Authenticity Regulation are creating binding obligations around synthetic media transparency. RealityFoundation is positioned to provide the independent academic grounding these frameworks require — from methodology to evidence standards.

EU AI Act

Requires AI-generated content to be marked in a machine-readable form. Our research informs verification methodology.

Digital Services Act

Mandates risk assessments covering synthetic media on very large platforms. We provide independent benchmarks.

C2PA & Content Provenance

We study interoperability between blockchain records and emerging open standards like C2PA and IPTC.

Get Involved

Researchers, policymakers, journalists

We are building a multidisciplinary research community. There is a role for every background.

See how to contribute →