SubQuery has introduced decentralized AI inference hosting at the Web3 Summit in Berlin, aiming to enhance Web3 application development. COO James Bayly demonstrated the Llama model running on a decentralized network of Node Operators. The initiative extends SubQuery's push to provide developers with decentralized data indexers and RPCs as alternatives to centralized services. The new service focuses on AI inference, serving predictions from pre-trained models rather than training new ones, while maintaining an open-source approach. The move is designed to challenge the dominance of centralized AI providers such as OpenAI and Google Cloud AI. By running inference across a decentralized network, SubQuery aims to preserve privacy and foster a community-driven ecosystem, enabling scalable AI services for the Web3 landscape.
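
For developers, inference-as-a-service typically amounts to sending a prompt to an HTTP endpoint and receiving the model's output. The sketch below illustrates what a request to a Node Operator's endpoint could look like; the URL, model identifier, and response shape are assumptions for illustration and not SubQuery's published API.

```typescript
// Minimal sketch: querying a decentralized inference endpoint hosted by a
// Node Operator. The endpoint URL, model name, and response shape here are
// hypothetical placeholders, not SubQuery's documented interface.

interface CompletionResponse {
  // Hypothetical response shape: the generated text returned by the node.
  output: string;
}

async function runInference(prompt: string): Promise<string> {
  // Hypothetical Node Operator endpoint.
  const endpoint = "https://example-node-operator.example/v1/completions";

  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama-3-8b", // hypothetical model identifier
      prompt,
      max_tokens: 128,
    }),
  });

  if (!res.ok) {
    throw new Error(`Inference request failed: ${res.status}`);
  }

  const data = (await res.json()) as CompletionResponse;
  return data.output;
}

// Example usage: send a prompt and print the model's reply.
runInference("Summarize what a decentralized RPC is.")
  .then((answer) => console.log(answer))
  .catch((err) => console.error(err));
```

In this model, the application developer's integration surface stays the same as with a centralized provider: a prompt goes in, a completion comes back, while the hosting and execution of the model are distributed across independent Node Operators.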