Fund Data Passports & The NAV Registry
From file rooms to data utilities.
12/3/2025
Private equity has modern spreadsheets and medieval plumbing.
Every quarter, GPs upload NAVs, cash flow tables, capital account statements, and ESG reports into a patchwork of portals and data rooms. LPs then download those files, re-key them into internal models, and try very hard not to make subtle mistakes at 1 a.m.
Meanwhile, the same manager may be re-underwritten by multiple teams, multiple times, from multiple copies of the same data.
This post sketches a different shape:
- GPs publish standardized fund data once into a shared NAV Registry.
- GPs issue revocable data passports that define exactly which LPs (or tools) can see which slice of that data.
- LPs and analytics tools fetch the data directly, with cryptographic fingerprints and an audit trail.
Think of it as turning a data room from a pile of PDFs into a data utility.
This is not a spec. It is a system sketch: a mental model, some diagrams, and the outlines of how the pieces hang together.
The simple picture: three things and a passport
Before any schema registry or Rust microservice, imagine three simple objects:
- A Registry that holds standardized fund data.
- A Passport that says who can see which parts of that data, for what purpose.
- A Consumer (LP, consultant, internal LP-facing tool) that wants to read.
At this level, the whole system is just three arrows: the GP publishes into the Registry, the GP issues a Passport, and the Consumer presents that Passport to the Registry to read.
The core promises:
- The GP publishes once per period, not once per LP.
- The LP gets standardized, machine-readable data instead of bespoke spreadsheets.
- Everyone can prove, after the fact, what data was used in an IC memo or model.
Everything else is just how seriously you want to take identity, schemas, and audit.
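The three objects can be sketched in a few lines of Python. Everything here (the `Registry` and `Passport` names, their fields, the `read` method) is hypothetical illustration, not a real API:

```python
# Hypothetical sketch of the three objects; names and fields are invented.
from dataclasses import dataclass, field

@dataclass
class Passport:
    subject: str                                   # who may read (an LP id)
    scope: set[str] = field(default_factory=set)   # fund ids it covers

@dataclass
class Registry:
    datasets: dict[str, dict] = field(default_factory=dict)

    def publish(self, fund: str, data: dict) -> None:
        # The GP publishes once per period, for every LP at once.
        self.datasets[fund] = data

    def read(self, passport: Passport, fund: str) -> dict:
        # The passport, not folder permissions, decides what comes back.
        if fund not in passport.scope:
            raise PermissionError(f"{passport.subject} has no scope for {fund}")
        return self.datasets[fund]

registry = Registry()
registry.publish("fund-iv", {"period": "2025Q3", "nav": 1_250_000_000})

lp_passport = Passport(subject="lp-acme", scope={"fund-iv"})
nav_data = registry.read(lp_passport, "fund-iv")  # same data for every holder in scope
```

The point is the shape: `publish` happens once, and the passport decides what `read` returns.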
What problem this actually solves
It helps to connect this to the very boring, very real pain points:
- Repeated re-underwriting. You already have the manager in house. A new fund or continuation vehicle arrives, and suddenly you are re-collecting NAV histories and cash flows from scratch.
- Portal and data room sprawl. Each GP picks their own data room. Each LP has a file called "ManagerName_v4_final_FINAL_really.xlsx" somewhere.
- Version drift. Two PMs pull similar but not identical data from different vintages or different portals and end up arguing about whose number is "right."
- Weak lineage. When an LP or an internal committee asks "where did this figure come from?", you are often reconstructing the answer from emails and PDFs.
The NAV Registry + data passport idea attacks these head on:
- Reduce repeated data asks by making the registry the primary channel.
- Reduce confusion by enforcing standard schemas and obvious versioning.
- Reduce risk by giving every dataset a cryptographic fingerprint.
- Reduce friction by letting LPs integrate once into a stable, documented API instead of a new data room every time.
A slightly richer mental model
Let us upgrade our cartoon from three boxes to five:
- GPs and their internal systems.
- The Registry itself.
- A Passport service (issue, verify, revoke).
- A simple audit log for commits and access receipts.
- LPs and their tools.
Flow in plain language:
- GPs push standardized data into the Registry.
- The Registry validates and stores datasets, and logs a commit entry.
- GPs issue passports tied to specific LPs and scopes.
- LPs present passports when they read.
- The Registry logs each access as a receipt.
Now we zoom in.
The publish path: from spreadsheets to verifiable datasets
First, the "write lane": how a GP publishes into the Registry.
Design notes:
- The schemas are public, versioned contracts (NAV v1, capital account v1, ESG metric v1).
- The hash is a fingerprint of the exact bytes. If someone tampers with the dataset later, the hash no longer matches.
You can still drop CSVs or Excel files into the Registry. Under the hood, the system translates them into canonical JSON or columnar formats and applies the same machinery.
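The fingerprint idea is easy to make concrete. The sketch below assumes datasets are canonicalized to JSON (sorted keys, fixed separators) before hashing, so the same logical data always yields the same bytes, and any tampering changes the hash:

```python
import hashlib
import json

def dataset_fingerprint(dataset: dict) -> str:
    """Hash the exact canonical bytes of a dataset.

    Canonicalization matters: sorted keys and fixed separators ensure the
    same logical data always serializes to the same bytes, and therefore
    always produces the same fingerprint.
    """
    canonical = json.dumps(dataset, sort_keys=True, separators=(",", ":")).encode()
    return hashlib.sha256(canonical).hexdigest()

nav_v1 = {"schema": "nav.v1", "fund": "fund-iv", "period": "2025Q3", "nav": 1250000000}
original = dataset_fingerprint(nav_v1)

# A one-digit change to NAV produces a completely different fingerprint.
tampered = dict(nav_v1, nav=1250000001)
assert dataset_fingerprint(tampered) != original
```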
To a GP, the workflow can look like:
- "Upload this NAV JSON each quarter."
- Or "Drop a CSV on SFTP and let the agent handle it."
The PE industry does not have to become a Rust shop to participate.
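Validation against a versioned contract can also start far more modestly than a full JSON Schema deployment. The "NAV v1" field names below are invented for illustration; the pattern is the point:

```python
# Hypothetical "NAV v1" contract: field name -> required Python type.
NAV_V1 = {"fund": str, "period": str, "nav": int, "currency": str}

def validate(dataset: dict, schema: dict) -> list[str]:
    """Return a list of violations; an empty list means the dataset conforms."""
    errors = []
    for field_name, field_type in schema.items():
        if field_name not in dataset:
            errors.append(f"missing field: {field_name}")
        elif not isinstance(dataset[field_name], field_type):
            errors.append(f"wrong type for {field_name}")
    return errors

good = {"fund": "fund-iv", "period": "2025Q3", "nav": 1250000000, "currency": "USD"}
bad = {"fund": "fund-iv", "nav": "a lot"}

assert validate(good, NAV_V1) == []
assert validate(bad, NAV_V1) != []  # rejected before it ever reaches an LP
```

Swapping this for real JSON Schema later changes the validator, not the workflow: the Registry still rejects non-conforming publishes at the door.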
The access path: passports at the door
Now, the "read lane": how LPs and tools access the data.
A data passport is a signed credential that says:
- who the data is for (subject),
- which funds and data types it covers (scope),
- how long it is valid (time),
- what it can be used for (purpose, if you want to get fancy).
In practice this could be a signed JSON document or a simple bearer token in an early version.
The important bit is not the specific crypto; it is the shape of the interaction:
- Access is not an ad hoc "does this user have folder rights?" check.
- Access is a portable credential that GPs can issue and revoke, and LPs can reuse across multiple tools.
You can start with quite boring tokens (say, signed JSON) and evolve the credential format over time without changing the mental model.
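Here is what "quite boring tokens" might look like: an HMAC-signed JSON payload carrying subject, scope, and expiry. The key handling and claim names are assumptions for this sketch, and a production system would likely use JWTs or verifiable credentials instead:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"gp-signing-key"  # hypothetical: in practice, a managed per-GP key

def issue_passport(subject: str, scope: list[str], ttl_seconds: int) -> str:
    """Sign a boring JSON payload: subject, scope, and expiry."""
    payload = {"sub": subject, "scope": scope, "exp": int(time.time()) + ttl_seconds}
    body = base64.urlsafe_b64encode(json.dumps(payload).encode())
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig

def verify_passport(token: str):
    """Return the claims if the signature checks out and it has not expired."""
    body, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # forged or tampered
    payload = json.loads(base64.urlsafe_b64decode(body))
    if payload["exp"] < time.time():
        return None  # expired
    return payload

token = issue_passport("lp-acme", ["fund-iv:nav", "fund-iv:capital_account"], ttl_seconds=3600)
claims = verify_passport(token)
assert claims is not None and claims["sub"] == "lp-acme"
assert verify_passport(token + "0") is None  # a tampered token fails verification
```

Revocation can start as a denylist of token signatures checked alongside `verify_passport`; the interaction shape stays the same as the credential format evolves.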
How this relates to data rooms (and does not try to kill them)
Classic data rooms are very good at:
- Holding PDFs of legal documents, marketing decks, and scanned signatures.
- Providing a human-centric, folder-based view of a transaction.
- Managing Q&A threads and redacted documents.
They are not good at:
- Serving structured datasets for analytics.
- Ensuring every LP sees the same structured data without manual reconciliation.
- Proving that a specific figure in a model came from a specific dataset version.
The NAV Registry does not need to replace data rooms entirely. It can:
- Handle structured fund data (NAV, capital, cash flows, ESG) as its primary domain.
- Integrate with VDRs by linking documents to dataset ids and hashes.
- Act as the source behind LP-facing tools (your own portals or analytics).
You can picture it like a layered cake:
- Data rooms keep doing what they already do well.
- The Registry quietly upgrades the data path from "spreadsheet zoo" to "shared, verifiable utility."
Why this becomes a competitive advantage
You could treat this as yet another internal tooling project. Or you could frame it as a structural edge:
- Faster diligence. PMs get clean NAV and cash flow histories in minutes, not days. Re-underwriting becomes "load the manager" instead of "rebuild the model."
- Consistent numbers. IC discussions stop being "which spreadsheet did you use?" and start being "what do we think this means?", because everyone pulls from the same commit.
- Cleaner story for LPs. You can literally show the lineage from LP report back to the underlying dataset hash and GP publication.
- Platform for new products. NAV-based financing, secondaries platforms, and risk tools all need the same underlying plumbing: standardized data and strong identity. You only build that plumbing once.
There is a meta advantage too: the first allocator to offer "plug in once, get every GP's verified data" will look very attractive to time-poor, data-rich partners.
What this sketch deliberately leaves open
This post has ducked some thorny decisions:
- Which audit log format to use, if any.
- Whether passports stay as simple internal tokens or evolve over time.
- How much of the system to build in house vs. buy or co-develop.
That is intentional. The important part at this stage is the shape:
- Publish once to a Registry.
- Control access via portable Passports.
- Track everything in an append-only Log.
You can make quite a bit of progress with:
- JSON Schemas,
- a SQL database and S3,
- a boring REST API,
- and a simple append-only log.
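That "simple append-only log" can be as small as a hash chain, where each entry commits to the one before it. This in-memory sketch (the class and event shapes are invented) shows why rewriting history is detectable:

```python
import hashlib
import json

class AppendOnlyLog:
    """Minimal hash-chained log: each entry commits to the previous entry's
    hash, so altering any past entry breaks every later link. A real
    deployment would persist this; here it is an illustrative in-memory list."""

    def __init__(self) -> None:
        self.entries: list[dict] = []

    def append(self, event: dict) -> str:
        prev = self.entries[-1]["hash"] if self.entries else "genesis"
        body = json.dumps({"prev": prev, "event": event}, sort_keys=True)
        entry_hash = hashlib.sha256(body.encode()).hexdigest()
        self.entries.append({"prev": prev, "event": event, "hash": entry_hash})
        return entry_hash

    def verify(self) -> bool:
        prev = "genesis"
        for e in self.entries:
            body = json.dumps({"prev": prev, "event": e["event"]}, sort_keys=True)
            if e["prev"] != prev or hashlib.sha256(body.encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AppendOnlyLog()
log.append({"type": "commit", "dataset": "fund-iv/2025Q3/nav"})
log.append({"type": "receipt", "reader": "lp-acme", "dataset": "fund-iv/2025Q3/nav"})
assert log.verify()

log.entries[0]["event"]["dataset"] = "fund-iv/2025Q3/esg"  # tamper with history
assert not log.verify()
```

Commit entries and access receipts both fit this shape, which is all the audit trail needs at this stage.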
Broader connections
This idea is not unique to private equity. It rhymes with other pieces of modern infrastructure:
- Content-addressed software registries (npm, crates.io, PyPI) that use content hashes to anchor who published what, when.
- Identity providers and OIDC flows that separate "who are you" from "what are you allowed to do."
The twist here is applying those patterns to fund data, NAVs, capital accounts, and ESG metrics, which have historically lived inside PDFs and spreadsheets.
The NAV Registry and data passports sketch a future where:
- GPs stop spending time on bespoke data pulls.
- LPs stop spending time reconciling everyone else's spreadsheets.
- The boring plumbing quietly becomes an edge: less friction, better lineage, more time on actual investment judgment.