
When Data Has Value, AI Gets Fair: Evaluating Your Contribution Without Ever Seeing Your Data


Pay More, Get More AI? Time to Rethink That

As Web3 and AI increasingly blend into decentralized digital ecosystems, a practical question is becoming harder to avoid: who should AI services prioritize?

Many systems still follow a pay-first rule. The more you pay, the more AI resources you receive. Meanwhile, the people who contribute data and genuinely help AI improve may not receive benefits that match their contribution. At the same time, data value and privacy risk often rise together. The more useful the data is, the more users worry that contributing it could reveal personal details, daily habits, or other sensitive information.

This creates two major bottlenecks that slow down sustainable AI adoption:

  • A fairness bottleneck: Resource allocation ignores contribution and rewards payment, which can sideline data contributors and lead to either resource monopolies or wasted service capacity.
  • A privacy bottleneck: To evaluate data value, systems traditionally need to "see" the data, but the more they see, the greater the risk of privacy leakage.

Our work, accepted at the IEEE International Conference on Multimedia and Expo (ICME 2026), aims to provide a more user-friendly answer. Users should not need to upload raw data to prove contribution. Instead, the system should be able to assess the value of a user's data contribution while keeping privacy protected, then convert that value into fairer and more stable AI usage quotas.

In short: data contribution becomes a passport to AI usage rights — without forcing users to trade away privacy.

Knowing What Your Data Is Worth — Without Ever Seeing It

A User-Centric "Cloud–Edge–User" Three-Layer Architecture

To make AI services both fast and fair, the system adopts a cloud, edge, and user collaboration model that stays close to how modern AI services are actually delivered; a minimal sketch of the request flow follows the list below.

  • Cloud layer — the AI toolbox: Provides general capabilities and support;
  • Edge layer — nearby service stations: Deployed closer to users, enabling low-latency responses and handling key computation tasks;
  • User layer — personal AI agents: Runs agents that submit requests on behalf of the user, such as email summarization and image description.
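To make the division of labor concrete, here is a minimal sketch of how a user agent's request might flow through the three layers. All class names, task names, and routing rules are hypothetical illustrations, not the paper's actual interfaces.

```python
from dataclasses import dataclass

# Hypothetical sketch of the cloud-edge-user flow; names are illustrative.

@dataclass
class Request:
    user_id: str
    task: str     # e.g. "email_summarization"
    payload: str  # a processed representation, never raw data

class CloudLayer:
    """Cloud layer: the general-purpose AI toolbox."""
    def run(self, req: Request) -> str:
        return f"[cloud result for {req.task}]"

class EdgeLayer:
    """Edge layer: nearby service station handling key computation."""
    def __init__(self, cloud: CloudLayer):
        self.cloud = cloud
        self.local_tasks = {"email_summarization"}  # tasks served at the edge

    def handle(self, req: Request) -> str:
        if req.task in self.local_tasks:
            return f"[edge result for {req.task}]"  # low-latency path
        return self.cloud.run(req)                  # fall back to the cloud

class UserAgent:
    """User layer: a personal agent submitting requests on the user's behalf."""
    def __init__(self, user_id: str, edge: EdgeLayer):
        self.user_id, self.edge = user_id, edge

    def ask(self, task: str, payload: str) -> str:
        return self.edge.handle(Request(self.user_id, task, payload))

agent = UserAgent("alice", EdgeLayer(CloudLayer()))
print(agent.ask("email_summarization", "<embedding>"))
```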

Two Key Innovations: Turning Contribution into Access — Without Leaking Privacy

The technical core is a closed loop that links three elements together: data contribution, a value signal, and usage quota allocation. Two mechanisms make this loop practical and explainable to general users.

(A) Data Contribution Receipts: Recording Value Without Exposing Raw Data

The system records contributions through a digital credential called a Data Anchoring Token (DAT), acting as a contribution receipt. When a user contributes data, the system produces a verifiable contribution record that can later be used to adjust the user's quota. Importantly, this credential is designed to reflect contribution value rather than exposing the user's raw data.
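As a rough illustration, a DAT-style receipt might commit to a hash of the processed representation plus a value score, so the record is verifiable without containing the content itself. The field names and the commitment scheme below are assumptions for illustration, not the actual DAT standard.

```python
import hashlib, json, time

# Hypothetical sketch of a contribution receipt in the spirit of a DAT.
def make_contribution_receipt(user_id: str,
                              processed_repr: bytes,
                              value_score: float) -> dict:
    # Commit to the processed representation, not the raw data: the
    # receipt proves WHICH contribution was valued without exposing it.
    commitment = hashlib.sha256(processed_repr).hexdigest()
    return {
        "user_id": user_id,
        "commitment": commitment,    # binds the receipt to the contribution
        "value_score": value_score,  # privacy-preserving value signal
        "issued_at": int(time.time()),
    }

receipt = make_contribution_receipt("alice", b"<noised embedding bytes>", 0.73)
print(json.dumps(receipt, indent=2))
```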

(B) Topic-Based Data Directories: Comparing Apples to Apples

The system organizes contributions into semantics-based data clusters through Individualistic Decentralized Organization (iDAO) groups: topic-driven catalogs that keep evaluation in context. By moving away from a one-size-fits-all pool, the system compares like with like within each topic: pet data against pet data, traffic data against traffic data. This structure makes data valuations fairer and more meaningful within their specific context.
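One plausible way to implement this like-for-like comparison is to route each contribution's embedding to its nearest topic catalog and measure novelty only against that catalog. The sketch below is illustrative; the paper's actual clustering and routing are not specified here.

```python
import numpy as np

# Toy sketch of topic routing and within-topic novelty (iDAO-style catalogs).
def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def assign_topic(embedding, topic_centroids):
    """Route a contribution to its nearest topic catalog."""
    return int(np.argmax([cosine(embedding, c) for c in topic_centroids]))

def novelty_within_topic(embedding, topic_pool):
    """Pets against pets, traffic against traffic: novelty is measured
    only against items in the SAME topic catalog."""
    if not topic_pool:
        return 1.0  # the first contribution to a topic is maximally novel
    return 1.0 - max(cosine(embedding, e) for e in topic_pool)

rng = np.random.default_rng(0)
centroids = [rng.normal(size=8) for _ in range(2)]  # e.g. pets, traffic
pools = {0: [], 1: []}
x = rng.normal(size=8)
t = assign_topic(x, centroids)
print("topic:", t, "novelty:", round(novelty_within_topic(x, pools[t]), 3))
pools[t].append(x)
```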

Contribute More, Access More — The "AI Quota Voucher" Mechanism

Resource allocation is implemented through an intuitive quota model: AI quotas work like redeemable service coupons, as the sketch after the list illustrates.

  • The system periodically issues usage credits that correspond to AI service capacity;
  • Each user agent has a quota limit — picture it as a basket that holds coupons;
  • When the basket is full, any additional coupons overflow automatically — preventing long-term hoarding by a small number of users and ensuring resources remain broadly available.
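This behaves much like a token bucket. The sketch below is a minimal illustration; the issuance amounts, capacities, and the contribution-driven sizing rule are assumptions, not values from the paper.

```python
# Minimal sketch of the coupon-basket quota model.
class QuotaBasket:
    def __init__(self, capacity: int):
        self.capacity = capacity  # basket size
        self.coupons = 0

    def issue(self, amount: int) -> None:
        """Periodic issuance: coupons past capacity simply overflow,
        which prevents long-term hoarding by a few users."""
        self.coupons = min(self.capacity, self.coupons + amount)

    def spend(self, cost: int) -> bool:
        """Redeem coupons for an AI service call."""
        if self.coupons >= cost:
            self.coupons -= cost
            return True
        return False

def capacity_from_value(value_score: float, base: int = 5, scale: int = 20) -> int:
    """Contribution-driven sizing: everyone keeps a baseline basket,
    and higher-valued contributors get a larger one."""
    return base + int(scale * value_score)

basket = QuotaBasket(capacity=capacity_from_value(0.25))  # capacity = 10
basket.issue(15)                        # only 10 fit; the rest overflow
print(basket.coupons)                   # -> 10
print(basket.spend(4), basket.coupons)  # -> True 6
```

The overflow rule is what keeps the quota circulating: hoarded coupons stop accumulating, so unused capacity returns to the pool.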

The key difference from traditional subscription tiers is the rule that allocates quotas. In common pay-first systems, quotas are set by payment level. In this research, quota allocation is driven by privacy-preserving data valuation. Users who contribute higher-quality, more unique, and more timely data receive a larger quota and can access more AI services. Users who contribute less still receive a baseline level of access, supporting a core principle of universal usability.

In effect, fairness becomes an executable rule in the system. It avoids both monopoly and waste, and it ensures contributors receive real, measurable benefits.

How Data Value Is Assessed Without Exposing Raw Data

A central challenge is measuring data value while protecting privacy. The valuation logic considers multiple dimensions: not just how much data a user provides, but also how useful and timely it is (a toy scoring sketch follows the list).

  • Volume is evaluated with diminishing returns in mind. Early unique contributions tend to deliver more value, while later similar contributions add less marginal benefit.
  • Novelty is assessed through semantic similarity checks to discourage copy-paste behavior and prevent low-value duplicates from being over-rewarded.
  • Freshness reflects the fact that some data is most valuable when it is recent. Timely information can be highly valuable at the moment of contribution, then gradually loses value as time passes.
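A toy scoring function can make these three dimensions concrete. The functional forms below (inverse-count diminishing returns, an exponential freshness half-life) and all constants are illustrative assumptions, not the paper's actual formula.

```python
import math

def contribution_value(n_prior: int, novelty: float, age_hours: float,
                       half_life_hours: float = 24.0) -> float:
    # Volume with diminishing returns: each additional similar
    # contribution adds less marginal value.
    volume_term = 1.0 / (1.0 + n_prior)
    # Novelty in [0, 1]: e.g. 1 - max semantic similarity to the pool.
    novelty_term = max(0.0, min(1.0, novelty))
    # Freshness: exponential decay with a chosen half-life.
    freshness_term = 0.5 ** (age_hours / half_life_hours)
    return volume_term * novelty_term * freshness_term

print(round(contribution_value(n_prior=0, novelty=0.9, age_hours=1), 3))   # early, unique, fresh
print(round(contribution_value(n_prior=50, novelty=0.9, age_hours=1), 3))  # 51st similar item: diminished
print(round(contribution_value(n_prior=0, novelty=0.9, age_hours=72), 3))  # stale: decayed
```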

Most importantly, the system is built so that users do not need to upload raw data. Instead, user data is transformed into a high-dimensional numeric representation that captures general semantic characteristics. The system then applies privacy protection so that the edge server can evaluate value from the processed representation while making it significantly harder to infer the original content or personal identifiers.
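A common way to realize "representation plus privacy protection" is to clip and noise the embedding on the user's device before anything is sent, in the style of differential-privacy mechanisms. The sketch below illustrates that general pattern; the paper's concrete protection scheme may differ.

```python
import numpy as np

# Sketch: ship a protected embedding, never raw content. The clipping
# and Gaussian noise here are a generic illustration, not the paper's method.
def privatize_embedding(embedding: np.ndarray,
                        clip_norm: float = 1.0,
                        noise_std: float = 0.1,
                        rng=np.random.default_rng()) -> np.ndarray:
    # Clip the norm so any single contribution's influence is bounded...
    norm = np.linalg.norm(embedding)
    clipped = embedding * min(1.0, clip_norm / (norm + 1e-9))
    # ...then add calibrated noise before anything leaves the device.
    return clipped + rng.normal(0.0, noise_std, size=embedding.shape)

raw_embedding = np.random.default_rng(1).normal(size=16)  # stand-in for a model embedding
safe = privatize_embedding(raw_embedding)
# The edge server evaluates value from `safe`; inverting it back to the
# original content is intended to be much harder.
print(np.round(safe[:4], 3))
```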

The intended outcome is a strong separation between value and exposure: the system can compute how valuable a contribution is, but it should not learn the sensitive details a user wants to keep private.

80% vs 20%: Privacy Protection Validated by Reconstruction Tests

Privacy claims must be backed by evidence, not marketing language. To test privacy robustness, the research uses a standard evaluation approach in the literature: reconstruction attacks. In this setting, an attacker attempts to "stitch back" the user's original data from the signals visible to the system.

Experimental comparisons indicate a clear gap; a sketch of how such similarity scores might be computed follows the list:

  • Some existing methods may produce reconstructions that preserve recognizable semantic cues. In reported examples, reconstructed outputs remain identifiable — such as being able to recognize objects like a tennis ball or racket — with similarity exceeding 80%. That level of reconstruction suggests that user privacy can still be "seen through" in practice.
  • In contrast, reconstructions under the proposed method largely lose identifiable semantics, with similarity around 20%, and the reduction in semantic leakage appears in both image and text settings.
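For intuition, similarity figures like these are often computed as cosine similarity between semantic embeddings of the original sample and the attacker's reconstruction. The sketch below shows that style of metric; it is an assumption for illustration, and the paper's exact evaluation protocol may differ.

```python
import numpy as np

# Toy version of a reconstruction-similarity score on a 0..1 (0%..100%) scale.
def semantic_similarity(orig_emb: np.ndarray, recon_emb: np.ndarray) -> float:
    cos = orig_emb @ recon_emb / (
        np.linalg.norm(orig_emb) * np.linalg.norm(recon_emb) + 1e-9)
    return float(max(0.0, cos))

rng = np.random.default_rng(0)
original = rng.normal(size=32)
close_recon = original + rng.normal(0, 0.3, size=32)  # attack largely succeeds
noisy_recon = rng.normal(size=32)                     # attack largely fails
print(round(semantic_similarity(original, close_recon), 2))  # high
print(round(semantic_similarity(original, noisy_recon), 2))  # low
```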

A straightforward interpretation: the system still understands the type of contribution and can measure its value, but it becomes much harder to infer what the exact content was or who the user is.

That is the privacy goal in a contribution economy: value remains usable, while personally identifiable content remains protected.

Not "Just Another Privacy Solution"

Within Crypto and Web3, related products and research directions often follow several recognizable routes:

  1. "Data stays local": Bringing computation to the data, allowing data to stay on personal devices rather than being directly transferred;
  2. Confidential computing / trusted hardware: Computation happens in protected environments, typically relying on specific hardware;
  3. Compute marketplaces: Resources purchased through transactions, auctions, or pricing mechanisms;
  4. AI network incentives / verification: Using scoring, weighted voting, or verifiable execution to distribute rewards.

Our positioning is best understood as closing a loop that many systems leave open. We combine privacy protection and quota allocation into a single mechanism: the system produces a privacy-preserving value signal for data contributions, then directly uses that value signal to adjust each user's AI usage quota.

The difference in one sentence: Many approaches address how to trade data or compute, or how to run private computation. This work concentrates on how to convert data contribution into explainable, enforceable AI usage rights — with privacy protection designed as the starting point rather than an afterthought.

Why This Matters Right Now

This research aligns with the rise of personalized AI services and AI agents. In the near future, a single user may operate multiple assistants: an email helper, a study coach, a travel planner, and a creative partner. These agents will require stable and predictable access to AI resources. Users may also be willing to contribute data that improves the system's overall intelligence, but they do not want privacy to become the cost of participation.

The proposed mechanism offers a practical path forward. It converts privacy-preserving data valuation into fair AI usage quotas while supporting a stable service experience — reducing wait times and preventing data contributors from being undervalued.

From an ecosystem perspective:

  • For users: Contribution becomes visible, measurable, and redeemable — making rights clearer
  • For platforms: A sustainable incentive for high-quality, unique data to flow in, improving service quality
  • For industry and governance: A gentler way to unlock data value by emphasizing controlled usage rather than uncontrolled disclosure

From the Research Team

"In the AI era, everyone's data contribution has unique value. We hope resource allocation will move beyond the idea of 'who pays gets priority' toward 'who contributes benefits.' Moreover, users should not have to sacrifice privacy to receive fair recognition. Our goal is to let the system identify contribution value without seeing raw data, then convert that value into real, usable AI quotas. This is how decentralized AI can become both more equitable and more trustworthy."— Ming Guo, Co-founder of LazAI

About the Research

This project is carried out through a Mitacs-supported academic partnership and talent development program, with ecosystem engagement that includes industry partners. The research was developed by a team affiliated with The University of British Columbia (UBC) in Vancouver, Canada, and is connected to Blockchain@UBC, a multidisciplinary research cluster exploring blockchain and other emerging technologies for the benefit of Canada and the broader global community. The cluster links academic insight with industry needs, including structured cross-disciplinary training through its Blockchain Graduate Training Pathway.

Industry ecosystem partners include:

Metis is building full-stack Web3 infrastructure designed for AI agents. Its settlement layer, Andromeda, is the first Layer 2 to implement a decentralized sequencer. Its scaling layer, Hyperion, provides a high-performance execution environment for AI workloads. And its application layer, LazAI, uses the DAT asset standard to transform AI data and agent behavior into verifiable, ownable, and revenue-generating on-chain assets. These three layers work in concert to power the decentralized agent economy.

LazAI Network is a Web3-native decentralized AI ecosystem for building agents and data-driven AI assets, with a mission to align AI with humanity. Through iDAO and the DAT standard, LazAI addresses three core challenges in AI: sharing data, evaluating data quality, and distributing revenue fairly. Its Alpha Mainnet is live.

About IEEE ICME 2026

The IEEE International Conference on Multimedia and Expo (ICME) is a flagship multimedia conference jointly sponsored by four IEEE societies: Computer, Circuits and Systems, Signal Processing, and Communications. Since its founding in 2000, ICME has served as a leading forum for advances in multimedia technologies, systems, and applications, attracting researchers, developers, and practitioners from academia and industry.

ICME 2026 will be held in Bangkok, Thailand, under the theme "Beyond Perception: Intelligent Media in the Age of Autonomous Agents." This year's conference received 3,810 submissions and accepted 1,101 papers, for an acceptance rate of 28.89%.

Get in Touch

If you are building AI services, operating Web3 platforms, designing incentive mechanisms, or researching privacy and fairness, we welcome discussion and collaboration around privacy-preserving data valuation and quota-based AI access. Paper, code, and demo links will be added when available.

Media Contact: [email protected]
