SubQuery Launches Decentralized AI Hosting at Berlin Web3 Summit

SubQuery introduced its decentralized AI inference hosting at the Web3 Summit in Berlin, aiming to enhance Web3 application development. COO James Bayly showcased the Llama model running on a decentralized network of Node Operators. The initiative extends SubQuery's existing offering of decentralized data indexers and RPCs, giving developers an alternative to centralized services.

SubQuery's new service focuses on AI inference — running predictions with pre-trained models — while maintaining an open-source approach. The move is designed to challenge the dominance of centralized AI providers such as OpenAI and Google Cloud AI. By leveraging a decentralized network, SubQuery aims to preserve user privacy and foster a community-driven ecosystem, enabling scalable AI services for the Web3 landscape.
Disclaimer: The content provided on Phemex News is for informational purposes only. We do not guarantee the quality, accuracy, or completeness of the information sourced from third-party articles. The content on this page does not constitute financial or investment advice. We strongly encourage you to conduct your own research and consult with a qualified financial advisor before making any investment decisions.