Demand for AI data will go viral, creating job opportunities for all, says OORT CEO
The artificial intelligence boom has lifted the economy to unprecedented heights, with major players like Nvidia, Microsoft, Google, and Apple reaching multi-trillion-dollar market caps. It’s a fast-rising tide, but unfortunately, it isn’t lifting everybody’s boats.
Instead, there’s a growing disparity between the AI industry’s “haves” and “have-nots”. While investors enjoy unrivaled returns, many of us are worried about whether AI is going to take our jobs.
But there’s still an opportunity for the masses. Dr. Max Li, founder and CEO of OORT, a data cloud company that supports decentralized AI applications, has emerged as one of the most vocal advocates for a new kind of economy centered on the commoditization of decentralized data. According to him, AI will become so prevalent that there will be insatiable demand for specialized, high-quality, domain-specific datasets in every conceivable niche, creating opportunities for everyone to participate in this nascent economy.
There’s simply not enough data for AI to achieve its potential, Dr. Li tells TechStartups. “Consider agriculture AI: training a robust AI model to detect early signs of crop disease requires tens of thousands of labeled images of greenhouse tomatoes with specific infestations,” he wrote for Crypto.News. “Even Google would struggle to source such niche, real-world data at scale and speed.”
A data economy driven by people
The only way AI companies can realistically source specialized agricultural data is to reach out to the people who specialize in agriculture itself. They’ll need to establish relationships with a global network of farmers who grow every kind of crop, spread across a multitude of climates and geographic locations.
Big tech can’t get enough data for AI because that data isn’t naturally centralized, Dr. Li says. “Millions of individuals and organizations across the globe hold valuable fragments of the world’s knowledge,” he wrote. “From farmers with smartphone cameras to doctors with anonymized medical records, the capacity to generate AI-ready data is widely distributed.”
Dr. Li’s solution is a decentralized data marketplace built on blockchain-based infrastructure that can both handle the massive volumes of information AI requires and distribute rewards to a global community of data contributors. In an article in Forbes, Dr. Li outlined how this model might work, with cryptocurrency as the primary means of payment.
“Collecting data globally means working with data contributors from diverse countries, many of which lack robust financial infrastructure,” Dr. Li said. “Paying individuals in traditional fiat currencies involves friction, high fees and in some cases, outright impossibility.”
Read more: “How to Make Passive Income from Home by Becoming a Decentralized Data Contributor” by Home Business Magazine
Crypto is perfect because digital tokens can be sent anywhere in the world instantly and with minuscule transaction fees. It means people won’t need bank accounts to get paid – all they need is a smartphone with internet access, and that device will also be a primary tool for collecting the data they want to sell, Dr. Li said. “This dramatically lowers the barrier for participation, allowing data collection platforms to tap into truly global networks,” he explained.
Blockchain also helps ensure data can be authenticated, so AI companies can trust they’re getting a high-quality dataset. It does this by recording the provenance of each data point on a transparent ledger that anyone can inspect, enabling companies to verify a dataset’s origins, when it was created, and how it was submitted. Smart contracts can be used to enforce quality checks, keep records of each contributor’s reputation, and handle dispute resolution, Li added.
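As a rough sketch of what such a provenance record and reputation mechanism might look like, here is a minimal Python illustration; the field names, hashing scheme, and scoring rule are assumptions made for the sake of example, not a description of OORT’s actual smart contracts.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
import hashlib

# Hypothetical provenance record, assuming the ledger stores a content hash
# plus basic origin metadata for every submitted data point.
@dataclass
class ProvenanceRecord:
    contributor_id: str   # pseudonymous wallet address of the contributor
    content_hash: str     # SHA-256 of the raw file, so buyers can verify integrity
    origin: str           # device or region reported at submission time
    submitted_at: str     # ISO-8601 timestamp recorded alongside the entry

    @classmethod
    def from_bytes(cls, contributor_id: str, raw: bytes, origin: str) -> "ProvenanceRecord":
        return cls(
            contributor_id=contributor_id,
            content_hash=hashlib.sha256(raw).hexdigest(),
            origin=origin,
            submitted_at=datetime.now(timezone.utc).isoformat(),
        )

# Illustrative reputation ledger: approved submissions raise a contributor's
# score, rejected ones lower it. A real system would enforce this on-chain
# via smart contracts rather than in ordinary Python.
reputation: dict[str, int] = {}

def review_submission(record: ProvenanceRecord, passed_quality_check: bool) -> None:
    delta = 1 if passed_quality_check else -1
    reputation[record.contributor_id] = reputation.get(record.contributor_id, 0) + delta
```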
The role of data tokens
Dr. Li believes that much of the world’s data may ultimately be “tokenized”, with datasets represented as digital tokens that make them easier to buy and sell. Whoever holds the tokens owns the underlying dataset they represent, allowing them to view and share the information within it and collect revenue from its use when AI developers pay for access. Tokenization also supports fractionalization, which allows multiple individuals or organizations to share ownership of a dataset.
This can enable new possibilities for data collaboration, Dr. Li said: “Imagine a research institution tokenizing specific scientific datasets, allowing other researchers to buy fractional access or contribute to a collective data pool.”
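To make the fractional-ownership idea concrete, here is a minimal Python sketch of a hypothetical tokenized dataset whose access revenue is split pro rata among token holders; the class, token counts, and payment figures are illustrative assumptions rather than any platform’s real mechanics.

```python
# Minimal sketch of fractional dataset ownership: token holders share revenue
# in proportion to the tokens they hold. Purely illustrative; no blockchain here.
class TokenizedDataset:
    def __init__(self, name: str, total_tokens: int):
        self.name = name
        self.total_tokens = total_tokens
        self.holdings: dict[str, int] = {}   # holder -> number of tokens held

    def transfer(self, to_holder: str, amount: int) -> None:
        # Issue or move tokens to a holder (issuance-only sketch for brevity).
        self.holdings[to_holder] = self.holdings.get(to_holder, 0) + amount

    def distribute_revenue(self, payment: float) -> dict[str, float]:
        # Split a payment for dataset access pro rata across token holders.
        issued = sum(self.holdings.values())
        return {holder: payment * tokens / issued
                for holder, tokens in self.holdings.items()}

# Example: a research institution keeps 60% of the tokens and sells 40% to two
# collaborators, then a model developer pays 1,000 units for access.
dataset = TokenizedDataset("greenhouse-tomato-disease-images", total_tokens=1_000)
dataset.transfer("research_institution", 600)
dataset.transfer("collaborator_a", 250)
dataset.transfer("collaborator_b", 150)
print(dataset.distribute_revenue(1_000.0))
# {'research_institution': 600.0, 'collaborator_a': 250.0, 'collaborator_b': 150.0}
```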
Tokenized data is a compelling proposition because the value of data lies in its use, Dr. Li explained. Data is only created to be used, he points out, and the more useful it is – unique, high-quality, correctly formatted, and verified – the more valuable it becomes. To trade such high-value assets, companies will need a reliable mechanism, especially when dealing with essentially unknown counterparties. An agricultural AI company, for instance, will likely be buying data from farmers in far-flung regions of Africa and Latin America, while a healthcare AI firm might want to acquire data from a network of dentists spanning dozens of countries.
“As decentralized AI grows, the market will demand decentralized, permissionless access to high-quality datasets, and tokenized data offers the most elegant infrastructure for that future,” Dr. Li stated.
Data supply chains will go viral
Decentralized marketplaces will eventually become the most trusted source of high-quality, domain-specific data, leading to economic opportunities for millions of potential contributors, Dr. Li predicts. Ultimately, they’ll evolve into what he calls “data launchpads”, where AI companies can post “jobs,” or requests for the specific data they need.
A developer who’s creating an algorithm that predicts crop failure might need thousands of photographs of local agricultural pests. To obtain that data, they could post a request for farm workers and gardeners from Florida, Kenya, Brazil, South Africa, and other parts of the world to snap photos of whatever pests they come across, in exchange for crypto rewards.
Others can contribute by helping to verify that the images are original, and by labeling and organizing them into comprehensive datasets ready for AI model training. They, too, will earn rewards for their work. And if the developer doesn’t have the money to pay for the data upfront, blockchain supports alternative economic models that let contributors be paid based on how often the resulting application is used, creating the possibility of a long-term income stream. “[They] would be part of a global data supply chain, coordinated not by a central corporation, but by cryptographic protocols and community governance,” Dr. Li said.
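Here is a minimal Python sketch of how such a “data launchpad” job might be coordinated, with a posted bounty split between a contributor and a verifier; the names, reward amounts, and 80/20 split are illustrative assumptions, not details from Dr. Li’s proposals.

```python
# Illustrative "data launchpad" flow: a developer posts a data request with a
# per-item crypto bounty, contributors submit labeled items, verifiers approve
# them, and rewards accrue to both roles.
from dataclasses import dataclass, field

@dataclass
class DataRequest:
    description: str
    reward_per_item: float                 # bounty in tokens for each accepted item
    verifier_share: float = 0.2            # fraction of each bounty paid to the verifier
    submissions: list[dict] = field(default_factory=list)
    balances: dict[str, float] = field(default_factory=dict)

    def submit(self, contributor: str, item_hash: str, label: str) -> int:
        self.submissions.append({"contributor": contributor, "hash": item_hash,
                                 "label": label, "approved": False})
        return len(self.submissions) - 1   # index of the new submission

    def verify(self, index: int, verifier: str, approved: bool) -> None:
        sub = self.submissions[index]
        sub["approved"] = approved
        if approved:
            # Split the bounty between the contributor and the verifier.
            self._credit(sub["contributor"], self.reward_per_item * (1 - self.verifier_share))
            self._credit(verifier, self.reward_per_item * self.verifier_share)

    def _credit(self, account: str, amount: float) -> None:
        self.balances[account] = self.balances.get(account, 0.0) + amount

# Example: a crop-failure model developer requests pest photos.
request = DataRequest("Photos of agricultural pests on tomato plants", reward_per_item=5.0)
idx = request.submit("farmer_kenya_01", item_hash="ab12...", label="tomato leafminer")
request.verify(idx, verifier="labeler_brazil_07", approved=True)
print(request.balances)   # {'farmer_kenya_01': 4.0, 'labeler_brazil_07': 1.0}
```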
As AI accelerates, the value of rich data will grow exponentially, and in time the biggest decentralized data marketplaces and launchpads might morph into platforms that look more like TikTok, Dr. Li said. He sees them growing virally, attracting both contributors looking for income and companies searching for rare data points, because in many cases this will be the only viable way for companies to source such data and ensure that it’s real, useful, and free from copyright issues.
“The question is not whether AI will scale, but who will fuel that scale,” Dr. Li stated. “It won’t just be the data scientists – it will be the data stewards, aggregators, contributors, and the platforms that bring them together.”

