NEAR AI Cloud and Private Chat are live! We're bringing hardware-backed, verifiable privacy to AI inference for enterprises, developers, and end users. Privacy is a requirement for User-Owned AI, yet none of today's centralized AI products guarantee it for users and businesses. We're moving into a transformational stage where AI will make decisions on our behalf, run on all our devices, and hold more context about our lives. We need AI that is on our side and aligned with our success, not one that prioritizes corporate profits.
We should be able to use AI without exposing everything we do to the company running the inference, or even to the owner of the hardware. We should be able to trust that our chatbot isn't trying to sell us to the highest bidder and that our data isn't at risk of leaking onto a search engine. The same applies to businesses, which not only face compliance requirements for customer data but also want to make the most of their own IP and expert data.
NEAR AI uses decentralized confidential machine learning (DCML), processing data in a fully encrypted environment with Intel TDX and NVIDIA Confidential Computing. Inference is end-to-end encrypted, every interaction is private, and both user data and model weights are decrypted only inside the confidential enclave, so no one outside it can access either. Each inference generates a cryptographic attestation proving the model ran on genuine, verified hardware with the expected code, and that attestation can be independently validated.
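For developers, that verification step might look something like the following Python sketch. This is not the actual NEAR AI Cloud API: the endpoint, the response fields (`attestation`, `measurement`, `response_hash`), and the `verify_quote_signature` helper are hypothetical stand-ins for the vendor attestation tooling (Intel TDX / NVIDIA Confidential Computing verifiers) a real client would rely on.

```python
import hashlib
import requests  # assumption: a plain HTTPS client; the real SDK may differ

# Hypothetical base URL -- the real NEAR AI Cloud paths and fields may differ.
BASE_URL = "https://cloud.example-near-ai.invalid/v1"

def verify_quote_signature(quote: str) -> bool:
    """Placeholder: real verification uses Intel/NVIDIA attestation services."""
    return bool(quote)

def private_inference(prompt: str, expected_code_hash: str) -> str:
    """Send a prompt and accept the answer only if the attestation checks out."""
    resp = requests.post(
        f"{BASE_URL}/chat/completions",
        json={"model": "example-model",
              "messages": [{"role": "user", "content": prompt}]},
        timeout=30,
    )
    resp.raise_for_status()
    body = resp.json()

    # Hypothetical field: a hardware attestation returned with each response.
    att = body["attestation"]

    # 1. The quote must chain back to the hardware vendor's root of trust.
    if not verify_quote_signature(att["quote"]):
        raise RuntimeError("attestation quote not signed by genuine hardware")

    # 2. The measured code/model identity must match a known-good build.
    if att["measurement"] != expected_code_hash:
        raise RuntimeError("enclave is running unexpected code")

    # 3. Bind the attestation to this specific response to rule out replay.
    answer = body["choices"][0]["message"]["content"]
    if att.get("response_hash") != hashlib.sha256(answer.encode()).hexdigest():
        raise RuntimeError("attestation does not cover this response")

    return answer
```

The key idea is that the client, not the provider, decides whether to trust a response: the quote must come from genuine hardware, the measured code must match what is expected, and the attestation must be bound to the specific response it accompanies.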
Unlocking real, verifiable privacy means users and businesses can finally share full context with AI. That means better products, better results, and AI that is truly on our side. To learn more about NEAR AI Cloud and Private Chat, read the full announcement.