Near Protocol unveils plan for next-generation AI model with 1.4T parameters

Near Protocol announced its ambitious plan to construct the world’s largest open-source artificial intelligence model, featuring 1.4 trillion parameters, at its Redacted conference in Bangkok. The model, three and a half times larger than Meta’s open-source LLaMA, will be developed collaboratively by thousands of contributors via Near’s new AI Research hub. Participants can immediately start training a smaller, 500-million-parameter prototype as the project scales toward the full 1.4-trillion-parameter model.
The model, which Near plans to scale across seven phases, will use competitive crowdsourcing to recruit top contributors, who advance to increasingly complex models as the project grows. This decentralized approach not only fuels rapid innovation but also incorporates robust privacy measures: Trusted Execution Environments will secure contributor rewards, enabling encrypted participation and paving the way for frequent, community-driven updates.
Privacy and decentralization drive AI future
Near co-founder Illia Polosukhin, a co-author of the transformer research that underpins models like ChatGPT, shared that the estimated $160 million cost for training and compute infrastructure will be funded by token sales. “We have a business model that allows token holders to earn returns based on the model’s usage, creating a loop where investment feeds back into future model development,” he explained.
Amid growing concerns about the risks of centralized AI, the Near initiative taps into a wider movement toward decentralized AI models that champion user privacy. The project's decentralized architecture seeks to avoid the potential surveillance state Edward Snowden warned against at the conference. Snowden emphasized the importance of a Web3-aligned AI model, stressing that true technological freedom requires decentralization to prevent monopolistic control.
Decentralized AI comes with technical challenges, however. Project co-founder Alex Skidanov explained that achieving the necessary GPU capacity for training would be a “massive” undertaking, likely requiring “tens of thousands of GPUs” in a traditional setup. To overcome these challenges, Near is exploring novel distributed training techniques, inspired by recent breakthroughs from AI research firm DeepMind. Skidanov also acknowledged that a collaborative approach across the AI and blockchain ecosystems may further fuel innovation and decentralization.
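The distributed techniques Skidanov alludes to generally trade communication for local computation: instead of synchronizing gradients every step, each participant trains locally for many steps and only the resulting parameters are averaged occasionally. A minimal sketch of that idea, in the spirit of DeepMind's low-communication training work (this is an illustration, not Near's actual implementation; the toy one-parameter model and data shards are assumptions for clarity):

```python
# Illustrative sketch of low-communication distributed training:
# each worker runs many local SGD steps on its own data shard, and
# only the averaged parameters are exchanged between rounds.
# Not Near's actual system -- a toy one-parameter model for intuition.
import random


def local_sgd(theta: float, data: list[float], lr: float = 0.1, steps: int = 20) -> float:
    """Run `steps` local gradient steps minimizing (theta - x)^2 over this shard."""
    for _ in range(steps):
        x = random.choice(data)
        grad = 2 * (theta - x)  # gradient of the squared error
        theta -= lr * grad
    return theta


def train_distributed(worker_data: list[list[float]], rounds: int = 10) -> float:
    theta = 0.0  # shared model parameter, broadcast at the start of each round
    for _ in range(rounds):
        # Each worker trains independently from the same starting point...
        local_params = [local_sgd(theta, shard) for shard in worker_data]
        # ...and only one number per worker crosses the network per round.
        theta = sum(local_params) / len(local_params)
    return theta


if __name__ == "__main__":
    random.seed(0)
    # Three workers, each holding a data shard centered near 3.0
    shards = [[2.8, 3.1], [2.9, 3.2], [3.0, 3.0]]
    print(train_distributed(shards))  # converges near the global data mean
```

The key property is that communication happens once per round rather than once per gradient step, which is what makes training across loosely connected, geographically scattered GPUs plausible at all.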
If successful, Near Protocol’s decentralized AI model could set a new precedent for large-scale AI in the Web3 ecosystem, challenging the dominance of tech giants. The Near model’s scalability, paired with its commitment to privacy, offers a compelling blueprint for future AI projects in cryptocurrency and beyond. With the initial prototype now live, the world is watching to see if Near can achieve what it envisions as “the most important technology right now—and probably in the future.”