nilGPT: Towards Verifiable Trust
TL;DR: nilGPT is now open source and comes with attestations. This means you can now verify that the code actually running matches the published repository.
Two weeks ago we launched the first version of nilGPT with a clear mission: to give people a truly private option when using AI chatbots. This is important as most AI services today are not private by design—users are handing over their sensitive data and thoughts with no guarantees about how that information will be used or who may see it in the future.
Still, claiming privacy on its own isn't enough. Anybody can claim that their service is private.
We are taking all of the necessary steps to prove that our service is private by introducing a layer of verifiable trust to nilGPT, making it possible for anyone to confirm that nilGPT is behaving exactly as promised.
Alongside smaller upgrades like bug fixes, the two key features of this release are:
- open-sourcing the nilGPT repository.
- exposing attestations inside nilGPT.
Why should this matter to you? Because open source alone isn't a guarantee.
nilGPT could, in theory, run different code in the background. Attestations close that gap, giving you a way to check that the code running nilGPT is the same code published in the repository. That means you don't just have to trust us—you can verify it yourself.
In the rest of this post, we dive into the details.
1. Open-Sourcing the Code
The nilGPT codebase is now open source. You can find the repository here: https://github.com/NillionNetwork/nilgpt
This means anyone, not just our team, can look at the code, review it, and understand exactly what's happening under the hood.
Transparency is the foundation of verifiable trust, and open-sourcing the repository is our way of letting the world see, firsthand, how the app works.
For developers, it also opens the door to external contributions, or even to running nilGPT entirely locally if desired.
Open-sourcing the code, however, is not foolproof. Below, we dive into why attestations are a vital complement.
2. Attestations Exposed in the App
This release also gives users the ability to see and verify attestations inside nilGPT. These attestations are generated in nilCC (where nilGPT runs) within a TEE.
A TEE, or trusted execution environment, is specialized hardware that executes code inside a secure enclave, isolated from the rest of the system.
In short, you can now verify, in the app itself, that the code it is running matches the open-source repository and is not a modified version.

What are Attestations?
Attestations are cryptographic proofs generated by TEEs. They act like a seal of authenticity, proving that neither the program inside the TEE nor the TEE itself has been tampered with. The key part of the attestation that we are exposing in this release is the measurement hash.
What is a Measurement Hash and what do I need to check?
You can think of a measurement hash like a fingerprint for software. When the TEE starts up, it takes the code and configuration that will run inside it, and creates a unique cryptographic fingerprint (the hash).
- If the code changes, the hash changes.
- If the hash matches the hash produced from the open source repository, you know the code hasn't been altered.
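As an analogy, here is a minimal Python sketch of this "fingerprint" idea. It is purely illustrative: the real measurement is computed by the TEE over the code and configuration it boots, not by application-level file hashing, and the `measure` helper below is hypothetical.

```python
import hashlib

def measure(files: dict[str, bytes]) -> str:
    """Illustrative 'measurement': hash file names and contents in a
    stable order, so any change to the code changes the digest."""
    h = hashlib.sha256()
    for name in sorted(files):  # deterministic order
        h.update(name.encode())
        h.update(files[name])
    return h.hexdigest()

# The same inputs always produce the same fingerprint...
original = measure({"app.py": b"print('hello')"})
# ...while even a one-byte change produces a completely different one.
modified = measure({"app.py": b"print('hacked')"})
```

Because the hash is deterministic, anyone with the same inputs can reproduce it; because it is collision-resistant, a tampered build cannot masquerade as the original.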
As of this release, the measurement hash generated in the TEE can be fetched directly in nilGPT, meaning users can:
- See the measurement hash generated in nilCC (the TEE).
- Compare it with a pre-computed hash (generated from the open-source repository).
- If the two match, you can be confident the TEE is running exactly the same code as the open-source nilGPT.
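The comparison step itself can be sketched as follows; the function name and the normalization of hex case are our own illustrative choices, not part of the nilGPT app:

```python
import hmac

def hashes_match(attested: str, expected: str) -> bool:
    """Compare the measurement hash reported by the TEE with the
    pre-computed hash of the open-source build. Surrounding whitespace
    and hex case are normalized before comparing."""
    return hmac.compare_digest(attested.strip().lower(),
                               expected.strip().lower())
```

A plain `==` would also work here, since both hashes are public values; `hmac.compare_digest` is simply a good habit for comparing digests in general.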
This is a big step toward verifiable trust, as it removes the need to simply trust that the running code is the code from the open-source repository.
Note that currently, the pre-computed hash of the open-source codebase is exactly that: pre-computed by us. In future versions, we will give anyone the ability to easily generate this hash independently, further strengthening trust and transparency.
Looking Ahead
This release marks another milestone in nilGPT's journey toward verifiable trust.
We plan to release new versions and features continuously over the next few months.
Some forthcoming features to get excited about include: voice notes, image and PDF uploads, dark mode, and more.
Stay tuned!