Data Anchoring Tokens (DAT) and ERC-8028: The Native AI Asset Standard for the Decentralized AI (dAI) Era on Ethereum

By: ForesightNews, 2025/12/05 13:33

If Ethereum is to become the settlement and coordination layer for AI agents, it will need a way to represent native AI assets: something as universal as ERC-20, yet tailored to the economic models specific to AI.

In this article, we will trace how "AI tokens" have evolved from a narrative concept to infrastructure, and the position of Data Anchoring Tokens (DAT) within Ethereum's emerging decentralized AI (dAI) tech stack. The discussion will focus on the following topics:


  1. Unveiling the façade of "decentralized AI"
  2. Exploring Ethereum's decentralized AI (dAI) mission
  3. Introducing DAT and ERC-8028 as native AI asset standards on Ethereum
  4. Explaining the core differences between DAT and existing AI tokens
  5. Detailing the new possibilities that the DAT Ethereum Improvement Proposal (EIP, i.e., ERC-8028) brings to developers
  6. Demonstrating the practical applications of DAT through Lazbubu, CreateAI, and LazAI Alpha mainnet


I. The Façade of "Decentralized AI"


Since the end of 2022, AI tokens have become one of the hottest narratives in the crypto space, delivering outsized returns, with almost every cycle spawning a new wave of "decentralized AI" projects. However, most so-called AI tokens are not native AI assets. The first generation of related projects can be roughly divided into three categories:


· Computing networks (such as Render or Akash Network): tokens are used to pay for GPU computing resources;

· Agents or intelligent networks (such as Bittensor or Fetch.ai): tokens are used to reward model developers or autonomous agents;

· Data marketplaces (such as Ocean Protocol): tokens represent access rights to datasets, rather than the AI outputs generated from using those datasets.


These systems are important experiments. They promise, at least in theory, to realize "decentralized AI," and they demonstrate that tokens can coordinate computing resources, data, and agents at scale. However, they generally suffer from the following structural issues:


· AI workloads are almost entirely off-chain, with the blockchain serving only as a payment and registry layer;

· Tokens do not represent the AI assets themselves;

· Revenue distribution, usage statistics, and provenance mechanisms are often added as afterthoughts rather than encoded as reusable standards.


These limitations also explain why recent academic reviews describe most current AI token ecosystems as "the illusion of decentralized AI": their architectures and economic models are often strikingly similar to centralized AI services, with a token layer bolted on top.


II. Ethereum's Decentralized AI (dAI) Shift: Agents Need Assets


Against this backdrop, Ethereum has begun to clarify its more specific role in the AI economy. In recent articles and speeches, Vitalik Buterin has outlined a cautious yet clear direction for "crypto + AI": "Blockchains do not exist to let AI govern protocols; rather, they should provide verification, provenance, and trusted neutrality for AI-driven systems — an open foundational architecture that allows agents to settle, verify, and share value under transparent rules."


The Ethereum Foundation's newly launched decentralized AI (dAI) initiative is based on this philosophy: Ethereum should become the settlement and coordination layer for AI agents and the machine economy. Led by Davide Crapis (@DavideCrapis), it explicitly positions itself as "the infrastructure for the AI economy," with its announcement on X (formerly Twitter) stating its mission: "Make Ethereum the preferred settlement and coordination layer for artificial intelligence (AI)."


To pursue this vision seriously, two core requirements immediately emerge:


1. Agent and payment standards (e.g., ERC-8004/x402 solutions for agent-to-agent settlement);

2. Standards for AI assets themselves — including data, models, inference results, and interaction histories consumed and generated by agents.


If Ethereum is to become the settlement and coordination layer for AI agents, it needs a way to represent native AI assets: something as universal as ERC-20, yet tailored to the economic models specific to AI.


DAT and ERC-8028 were created to solve this problem, proposing a standardized solution that allows AI assets to exist on Ethereum as programmable, verifiable rights, rather than mere metadata behind a symbol.


III. DAT and ERC-8028: Positioning AI as On-Chain Assets


Data Anchoring Token (DAT) is a semi-fungible token (SFT) standard developed by LazAI Network, "designed for native AI assets." Each DAT is defined as a dynamic combination of three elements: proof of ownership, usage rights, and a share of revenue associated with a specific AI asset.


LazAI has submitted DAT for standardization as ERC-8028 through the Ethereum Improvement Proposal (EIP) process, aiming to upgrade DAT from a protocol-specific mechanism into a reusable, audited, ecosystem-wide standard for representing AI assets on Ethereum.

See the ERC-8028 discussion thread on the Fellowship of Ethereum Magicians forum.


This proposal adds a set of AI-specific semantics to existing ERC standards:


· Class concept: represents abstract AI assets (such as datasets, models, agent profiles, or inference pools), with metadata and integrity references;

· Quota concept: associated with each token, quantifying the consumable quota of the underlying asset;

· Share concept: specifies how the revenue generated by the asset should be distributed among token holders;

· Standard events and methods: used to record class usage and settle revenue in ETH or ERC-20 tokens.


This is exactly the kind of concrete standard needed for Ethereum's decentralized AI (dAI) roadmap: a verifiable, composable standard that treats AI products as first-class on-chain assets.
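
To make these semantics concrete, here is a minimal TypeScript sketch of the class/quota/share model described above. Apart from quota and shareRatio, which the proposal itself names, every identifier and shape below is an illustrative assumption, not the published ERC-8028 interface:

```typescript
// Illustrative data model for the class / quota / share semantics above.
// Only "quota" and "shareRatio" are terms from the proposal itself; all
// other names and shapes are assumptions, not the real ERC-8028 interface.

interface AssetClass {
  classId: bigint;       // identifies the abstract AI asset (dataset, model, agent, inference pool)
  metadataURI: string;   // reference to the asset's descriptive metadata
  integrityHash: string; // content hash anchoring the asset's integrity
}

interface DatToken {
  tokenId: bigint;
  classId: bigint;    // the class this token grants rights to
  quota: bigint;      // remaining consumable usage of the underlying asset
  shareRatio: bigint; // holder's share of class revenue (e.g., in basis points)
}

// Events of roughly this shape would let any indexer reconstruct usage and
// revenue flows; the exact signatures are defined by the EIP itself:
//   UsageRecorded(classId, tokenId, amount)
//   RevenueSettled(classId, paymentToken, amount)
```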


IV. Differences Between DAT and Existing AI Tokens


4.1 Asset-Centric Design


Existing AI tokens are often a layer bolted onto a protocol, with little intrinsic utility. In DAT (ERC-8028), the core organizational unit is the AI asset itself. The class concept allows datasets, models, and agents to be referenced individually, and tokens represent specific rights to a class, which aligns more closely with how AI systems are actually built.


4.2 Usage as the Core Economic Variable


DAT's design inherently binds "allowed usage" to token rights. The specific measurement unit is defined by the class's rules (e.g., token amount, number of calls, steps, or composite metrics), but the quota mechanism at the standard level ensures consistency in usage statistics and pricing.


This is the prerequisite for building sustainable economic models: contributors' revenue is linked to actual workload, not just overall protocol metrics.
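
As a minimal sketch of how quota accounting behaves, assuming a class metered in calls (the unit is defined per class, as noted above; all names here are illustrative):

```typescript
// Quota accounting sketch: the measurement unit is class-defined, but the
// standard-level mechanism keeps usage statistics consistent. Illustrative.

type Quota = { remaining: bigint; unit: "calls" | "tokens" | "steps" };

// Deduct recorded usage from a token's quota; fails once the holder's
// consumable allowance for the class is exhausted.
function consumeQuota(q: Quota, used: bigint): Quota {
  if (used > q.remaining) {
    throw new Error("quota exceeded for this DAT");
  }
  return { ...q, remaining: q.remaining - used };
}

// e.g., an inference-pool class metered in calls:
let q: Quota = { remaining: 1000n, unit: "calls" };
q = consumeQuota(q, 3n); // 997 calls left
```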


4.3 Standardized Revenue Distribution at the Asset Layer


The revenue sharing model in ERC-8028 is part of the standard interface. This allows the value generated by AI assets to be distributed in a universal format to data contributors, model builders, fine-tuners, evaluators, and infrastructure providers. This is crucial for on-chain analysis and risk assessment: revenue streams become auditable and composable, rather than hidden in custom contracts.
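
A sketch of the pro-rata distribution this implies, assuming shareRatio is expressed in basis points (the actual encoding is specified by the EIP; names and addresses are illustrative):

```typescript
// Pro-rata revenue split by shareRatio, assuming basis points (1/10,000).

const BPS = 10_000n;

function splitRevenue(
  amountWei: bigint,
  holders: { address: string; shareRatioBps: bigint }[],
): Map<string, bigint> {
  const payouts = new Map<string, bigint>();
  for (const h of holders) {
    // Integer division; a real implementation must also handle rounding dust.
    payouts.set(h.address, (amountWei * h.shareRatioBps) / BPS);
  }
  return payouts;
}

// e.g., 1 ETH of class revenue split 60/30/10 between a data contributor,
// a model builder, and an evaluator:
splitRevenue(10n ** 18n, [
  { address: "0xDataContributor", shareRatioBps: 6000n },
  { address: "0xModelBuilder", shareRatioBps: 3000n },
  { address: "0xEvaluator", shareRatioBps: 1000n },
]);
```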


In summary, these features mean that DAT is not a traditional governance or payment token. It is closer to a tokenized right: bound to a specific asset, verifiable against actual AI activity, with standardized semantics describing how that activity consumes quota and distributes value.


V. The Significance of the DAT Ethereum Improvement Proposal (ERC-8028) for Developers


Advancing DAT as an Ethereum standard (rather than a protocol-specific mechanism) has tangible implications for developers. The DAT Ethereum Improvement Proposal (ERC-8028) codifies it as a reusable, audited, and ecosystem-wide standard:


· Defines interfaces for class creation, token minting, quota statistics, usage recording, revenue settlement, and rights claims;

· Specifies on-chain representations for AI-specific semantics such as quota and shareRatio;

· Sets specifications for metadata, integrity, and rule references, enabling wallets, block explorers, indexers, and analytics tools to understand and visualize AI assets without custom integration.


5.1 Data and Model Providers


For data and model providers, ERC-8028 offers a standardized way to publish AI products as on-chain assets, featuring:


· Verifiable metadata and integrity references;

· Clear usage and authorization rules;

· A standard interface for revenue sharing among multiple contributors.


Instead of re-implementing authorization or royalty logic for every project, providers can rely on a unified interface understood by all downstream protocols, as sketched below.
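
As a hedged illustration of what publishing could look like through such an interface (using ethers v6; createClass and mint are hypothetical stand-ins for whatever methods the final standard specifies, and all addresses and URIs are placeholders):

```typescript
// Hypothetical provider flow: register a model as a class, then mint a DAT.
// createClass/mint and all addresses are illustrative placeholders.
import { Contract, JsonRpcProvider, Wallet } from "ethers";

const ABI = [
  "function createClass(string metadataURI, bytes32 integrityHash) returns (uint256)",
  "function mint(address to, uint256 classId, uint256 quota, uint256 shareRatioBps) returns (uint256)",
];

async function publishModel() {
  const provider = new JsonRpcProvider("https://rpc.example.org"); // placeholder RPC
  const signer = new Wallet(process.env.PROVIDER_KEY!, provider);
  const dat = new Contract("0x0000000000000000000000000000000000000001", ABI, signer);

  // 1. Register the model as a class: verifiable metadata + integrity hash.
  const tx1 = await dat.createClass("ipfs://model-card-cid", "0x" + "ab".repeat(32));
  await tx1.wait();

  // 2. Mint a DAT granting a buyer 10,000 calls of quota and 5% of class
  //    revenue (classId hard-coded here; a real flow reads it from the receipt).
  const tx2 = await dat.mint("0x0000000000000000000000000000000000000002", 1n, 10_000n, 500n);
  await tx2.wait();
}
```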


5.2 Agent and Application Developers


DAT provides a unified abstraction layer for external assets relied upon by agents. An agent that needs to consume multiple datasets and models from different ecosystems only needs to hold DAT rights for each relevant class, and can handle usage permissions and economic settlement through a single, coherent interface, rather than relying on fragmented integration solutions.
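
A sketch of that interface from the agent's side, again with ethers v6; the ABI fragments (quotaOf, classOf) are hypothetical stand-ins for the standard's real read methods, and all addresses are placeholders:

```typescript
// An agent checking its rights on two classes through one shared interface.
// quotaOf/classOf and all addresses are illustrative, not the real ABI.
import { Contract, JsonRpcProvider } from "ethers";

const ABI = [
  "function quotaOf(uint256 tokenId) view returns (uint256)",
  "function classOf(uint256 tokenId) view returns (uint256)",
];

async function canConsume(dat: Contract, tokenId: bigint, needed: bigint): Promise<boolean> {
  const remaining: bigint = await dat.quotaOf(tokenId);
  return remaining >= needed;
}

async function main() {
  const provider = new JsonRpcProvider("https://rpc.example.org"); // placeholder RPC
  // Assets from different ecosystems, handled through the same interface:
  const datasetDat = new Contract("0x0000000000000000000000000000000000000003", ABI, provider);
  const modelDat = new Contract("0x0000000000000000000000000000000000000004", ABI, provider);

  if ((await canConsume(datasetDat, 1n, 10n)) && (await canConsume(modelDat, 7n, 1n))) {
    // run the off-chain job, then record usage so quota and revenue settle on-chain
  }
}
```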


5.3 Infrastructure, DeFi, and Analytics Projects


This proposal formally defines a new class of on-chain objects for indexing, collateralization, or hedging. Because DAT exposes usage and value flows, it can support new financial instruments: revenue-backed notes for specific models, dataset exposure portfolios, or structured products reflecting future AI workload for specific classes.
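
For instance, an indexer or risk model could reconstruct a class's usage and settlement history from standard events alone; the event shapes below are illustrative stand-ins for those the EIP defines:

```typescript
// Reconstructing a class's workload and revenue series from standard events.
// Event names/shapes are illustrative stand-ins, not the final EIP signatures.
import { Contract, JsonRpcProvider } from "ethers";

const EVENTS_ABI = [
  "event UsageRecorded(uint256 indexed classId, uint256 indexed tokenId, uint256 amount)",
  "event RevenueSettled(uint256 indexed classId, address paymentToken, uint256 amount)",
];

async function indexClass(datAddress: string, classId: bigint) {
  const provider = new JsonRpcProvider("https://rpc.example.org"); // placeholder RPC
  const dat = new Contract(datAddress, EVENTS_ABI, provider);

  // The raw series behind "AI workload" metrics that revenue-backed notes,
  // dataset exposure baskets, or structured products would price.
  const usage = await dat.queryFilter(dat.filters.UsageRecorded(classId));
  const settlements = await dat.queryFilter(dat.filters.RevenueSettled(classId));
  return { usageEvents: usage.length, settlementEvents: settlements.length };
}
```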


5.4 The Broader Ethereum Ecosystem


At the ecosystem level, ERC-8028 clarifies the concept of "on-chain AI."


It does not attempt to migrate heavy training or inference processes to the base layer, but rather standardizes the economic and provenance layers of AI — that is, "which asset is being used, under what rules, and who benefits" — and represents them natively in the Ethereum Virtual Machine (EVM), enabling interoperability in the following scenarios:


· Rollups (Layer 2 scaling solutions);

· Sidechains;

· Dedicated off-chain computing networks.


This is highly consistent with the Ethereum Foundation's vision of "decentralized AI systems with verifiable security guarantees," and aligns with the dAI team's goal of building a decentralized AI tech stack on Ethereum's consensus and cryptographic foundations.


VI. Practical Applications of DAT: Lazbubu, CreateAI, and Alpha Mainnet


Standards only matter when applied in real systems.


DAT is currently being piloted on LazAI, a Web3-native AI infrastructure protocol focused on verifiable data, the agent economy, and programmable AI revenue streams.


Specific application cases include:


  1. Lazbubu: Data-anchored companion agent. Lazbubu is an AI companion agent whose behavior is shaped by the user's ongoing interaction history. LazAI uses DAT to anchor these interaction data, turning users' chat logs, task progress, and choices into structured assets. To date, over 14,307 Lazbubu DATs have been minted;
  2. CreateAI: AI agent marketplace. This platform uses the GMPayer cross-chain payment hub (based on the x402 protocol), treating each agent as an independent economic entity and assigning it a dedicated DAT class, making downstream monetization transparent and programmable;
  3. SoulTarot: Tarot divination AI agent. This application extends DAT into more narrative, emotionally driven scenarios, with each divination result settled on-chain and cross-chain payments handled via GMPayer;
  4. Alpha Mainnet: Turning AI interactions into on-chain value. The upcoming LazAI Alpha mainnet will take this further: all interactions with AI agents such as Lazbubu and CreateAI will be anchored as DATs, with METIS used for underlying gas fee settlement, on a network secured by Proof of Stake (PoS) combined with QBFT Byzantine fault-tolerant consensus.


These early deployments are less about achieving perfect economic efficiency and more about validating this mental model: AI can exist as an asset, with rights, usage rules, and reward mechanisms that are machine-readable.


Conclusion


The first wave of AI tokens proved the market's strong demand for "AI + crypto," but also showed that simply wrapping off-chain infrastructure with a token shell is far from enough. The real leverage comes from AI itself — data, models, agents, interaction histories — becoming on-chain assets with clear rights, usage rules, and value flows.


DAT, through the formal standardization of ERC-8028, is designed to clarify these core rules at the asset layer. It does not compete with computing networks, AI Layer 1s (base chains), or model marketplaces, but instead provides them with a shared "syntax" for describing what asset is being sold, how it is used, and how revenue is distributed among all participants in the supply chain.


If decentralized AI is to move beyond the fleeting hype and price charts to maturity, it needs such concrete standards. DAT is the first serious attempt to define such a standard for native AI assets, which is precisely where its core value lies.

