Merged
4 changes: 2 additions & 2 deletions .github/knip.jsonc
@@ -35,15 +35,15 @@
"ignoreFiles": ["src/custom.css"]
},
"packages/synapse-sdk": {
"entry": ["src/index.ts", "src/{payments,pdp,session,storage,subgraph,warm-storage,sp-registry,filbeam}/index.ts"]
"entry": ["src/index.ts", "src/{payments,session,storage,warm-storage,sp-registry,filbeam}/index.ts"]
},
"packages/synapse-react": {
"entry": ["src/index.ts", "src/filsnap.ts"]
},
"packages/synapse-core": {
"entry": [
"src/index.ts",
"src/sp.ts",
"src/sp/index.ts",
"src/chains.ts",
"src/piece.ts",
"src/usdfc.ts",
4 changes: 0 additions & 4 deletions AGENTS.md
@@ -22,10 +22,6 @@ packages/synapse-sdk/src/
├── storage/
│ ├── manager.ts # StorageManager (auto-managed contexts)
│ └── context.ts # StorageContext (explicit provider+dataset ops)
├── pdp/
│ ├── auth.ts # PDPAuthHelper (EIP-712 signatures)
│ ├── server.ts # PDPServer (Curio HTTP client)
│ └── verifier.ts # PDPVerifier (contract wrapper)
├── piece/ # PieceCID utilities
├── session/ # Session key support
├── subgraph/ # Subgraph queries
37 changes: 0 additions & 37 deletions docs/src/content/docs/developer-guides/components.mdx
@@ -13,9 +13,6 @@ The SDK is built from these core components:
- **`PaymentsService`** - SDK client for managing deposits, approvals, and payment rails (interacts with Filecoin Pay contract)
- **`StorageManager`**, **`StorageContext`** - Storage operation classes
- **`WarmStorageService`** - SDK client for storage coordination and pricing (interacts with WarmStorage contract)
- **`PDPVerifier`** - Client for PDPVerifier contract - get data set and piece status, create data sets and add pieces
- **`PDPServer`** - HTTP client for Curio providers - create data sets and add pieces
- **`PDPAuthHelper`** - Signature generation utility - Generate EIP-712 signatures for authenticated operations (create data sets and add pieces)

The following diagram illustrates how these components relate to each other and the external systems they interact with:

@@ -36,17 +33,11 @@ graph LR
subgraph "Lower-Level"
WSS[WarmStorageService]
SC[StorageContext]
PDPS[PDPServer]
PDPA[PDPAuthHelper]
PDPV[PDPVerifier]
end
Synapse --> SM
Synapse --> PS
SM --> SC
SM --> WSS
SC --> PDPS
SC --> PDPA
SC --> PDPV
PS --> SC
```

@@ -110,34 +101,6 @@ Check out the [Storage Context](/developer-guides/storage/storage-context/) guid

**API Reference**: [WarmStorageService API Reference](/reference/filoz/synapse-sdk/warmstorage/classes/warmstorageservice/)

### PDPComponents

#### PDPVerifier

**Purpose**: Client for PDPVerifier contract - get dataset and piece status, create data sets and add pieces.

**API Reference**: [PDPVerifier API Reference](/reference/filoz/synapse-sdk/pdp/classes/pdpverifier/)

**PDPVerifier Example**:

```ts twoslash
// @lib: esnext,dom
import { PDPVerifier } from "@filoz/synapse-sdk/pdp";
const dataSetId = 1n;
// ---cut---
const pdpVerifier = PDPVerifier.create();

// Check if data set is live
const isLive = await pdpVerifier.dataSetLive(dataSetId);

// Query data set information
const nextPieceId = await pdpVerifier.getNextPieceId(dataSetId);
const listener = await pdpVerifier.getDataSetListener(dataSetId);
const storageProvider = await pdpVerifier.getDataSetStorageProvider(dataSetId);
const leafCount = await pdpVerifier.getDataSetLeafCount(dataSetId);
const activePieces = await pdpVerifier.getActivePieces(dataSetId);
```

## Complete Data Flow

This sequence diagram shows the complete lifecycle of a file upload operation, from initialization through verification. Each step represents an actual blockchain transaction or API call.
@@ -188,7 +188,7 @@ interface StorageContextAPI {

// Upload & Download
upload(
data: Uint8Array | ArrayBuffer,
data: File,
options?: UploadOptions
): Promise<UploadResult>;
download(pieceCid: string | PieceCID): Promise<Uint8Array>;
@@ -231,7 +231,7 @@ const storageContext = await synapse.storage.createContext({
const llmModel = "sonnet-4.5";
const conversationId = "1234567890";

const data = new TextEncoder().encode("Deep research on decentralization...");
const data = new TextEncoder().encode("Deep research on decentralization...")

const preflight = await storageContext.preflightUpload(data.length);

@@ -71,7 +71,7 @@ Add metadata to organize uploads and enable faster data set reuse - SDK will reu
import { Synapse } from "@filoz/synapse-sdk";
import { privateKeyToAccount } from 'viem/accounts'

const data = null as unknown as Uint8Array;
const data = new Uint8Array([1, 2, 3, 4, 5]);
const synapse = Synapse.create({ account: privateKeyToAccount('0x...') });
// ---cut---
const context = await synapse.storage.createContext({
@@ -151,23 +151,6 @@ for await (const piece of context.getPieces()) {
console.log(`Found ${pieces.length} pieces`);
```

### Getting data set size

Calculate total storage size by summing piece sizes extracted from PieceCIDs:

```ts twoslash
// @lib: esnext,dom
import { PDPVerifier } from "@filoz/synapse-sdk/pdp";

const dataSetId = 1n;
// ---cut---
const pdpVerifier = PDPVerifier.create();

const leafCount = await pdpVerifier.getDataSetLeafCount(dataSetId);
const sizeInBytes = leafCount * 32n; // Each leaf is 32 bytes
console.log(`Data set size: ${sizeInBytes} bytes`);
```

### Getting a data set piece metadata

Access custom metadata attached to individual pieces for organization and filtering:
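The docs snippet removed above derived data set size from the PDPVerifier leaf count. The arithmetic itself is independent of the client being dropped and stays valid wherever a leaf count is available; a minimal standalone sketch (`dataSetSizeBytes` is an illustrative helper, not an SDK export):

```typescript
// Each PDP leaf commits to 32 bytes of data, so total size is leafCount * 32.
// `dataSetSizeBytes` is a hypothetical name for this document, not part of the SDK.
function dataSetSizeBytes(leafCount: bigint): bigint {
  return leafCount * 32n;
}

console.log(dataSetSizeBytes(4n)); // 128n
```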
25 changes: 15 additions & 10 deletions docs/src/content/docs/getting-started/index.mdx
@@ -90,13 +90,13 @@ async function main() {
console.log(`✅ USDFC deposit and Warm Storage service approval successful!`);

// 3) Upload
const data = new TextEncoder().encode(
const file = new TextEncoder().encode(
`🚀 Welcome to decentralized storage on Filecoin Onchain Cloud!
Your data is safe here.
🌍 You need to make sure to meet the minimum size
requirement of 127 bytes per upload.`
);
const { pieceCid, size } = await synapse.storage.upload(data)
const { pieceCid, size } = await synapse.storage.upload(file)
console.log(`✅ Upload complete!`);
console.log(`PieceCID: ${pieceCid}`);
console.log(`Size: ${size} bytes`);
@@ -179,13 +179,13 @@ import { privateKeyToAccount } from 'viem/accounts'
const synapse = Synapse.create({ account: privateKeyToAccount('0x...') })
// ---cut---
// Upload data - SDK automatically selects provider and creates data set if needed
const data = new TextEncoder().encode(
`🚀 Welcome to decentralized storage on Filecoin Onchain Cloud!
Your data is safe here.
🌍 You need to make sure to meet the minimum size
requirement of 127 bytes per upload.`
const file = new TextEncoder().encode(
`🚀 Welcome to decentralized storage on Filecoin Onchain Cloud!
Your data is safe here.
🌍 You need to make sure to meet the minimum size
requirement of 127 bytes per upload.`
);
const { pieceCid } = await synapse.storage.upload(data);
const { pieceCid } = await synapse.storage.upload(file);

// Download data from any provider that has it
const downloadedData = await synapse.storage.download(pieceCid);
@@ -224,9 +224,14 @@ For more control over provider selection and data set management:
```ts twoslash
// @lib: esnext,dom
import { Synapse } from "@filoz/synapse-sdk";
const data = {} as unknown as Uint8Array;
import { privateKeyToAccount } from 'viem/accounts'
const synapse = Synapse.create({ account: privateKeyToAccount('0x...') })
const file = new TextEncoder().encode(
`🚀 Welcome to decentralized storage on Filecoin Onchain Cloud!
Your data is safe here.
🌍 You need to make sure to meet the minimum size
requirement of 127 bytes per upload.`
);
// ---cut---
// Create a storage context with specific provider
const context = await synapse.storage.createContext({
@@ -239,7 +244,7 @@ const context = await synapse.storage.createContext({
});

// Upload to this specific context
const result = await context.upload(data);
const result = await context.upload(file);

// Download from this context
const downloaded = await context.download(result.pieceCid);
2 changes: 1 addition & 1 deletion examples/cli/package.json
@@ -16,7 +16,7 @@
"@clack/prompts": "^1.0.0",
"@filoz/synapse-core": "workspace:^",
"@filoz/synapse-sdk": "workspace:^",
"@remix-run/fs": "^0.3.0",
"@remix-run/fs": "^0.4.1",
"cleye": "^2.0.0",
"conf": "^15.0.2",
"terminal-link": "^5.0.0",
17 changes: 7 additions & 10 deletions examples/cli/src/commands/datasets-create.ts
@@ -4,7 +4,6 @@ import {
getPDPProvider,
getPDPProviders,
} from '@filoz/synapse-core/sp-registry'
import { createDataSet } from '@filoz/synapse-core/warm-storage'
import { type Command, command } from 'cleye'
import type { Account, Chain, Client, Transport } from 'viem'
import { privateKeyClient } from '../client.ts'
@@ -32,7 +31,6 @@ export const datasetsCreate: Command = command(
async (argv) => {
const { client, chain } = privateKeyClient(argv.flags.chain)

const spinner = p.spinner()
try {
const provider = argv._.providerId
? await getPDPProvider(client, {
@@ -43,27 +41,26 @@
p.log.info(
`Selected provider: #${provider.id} - ${provider.serviceProvider} ${provider.pdp.serviceURL}`
)
spinner.start(`Creating data set...`)
p.log.info(`Creating data set...`)

const result = await createDataSet(client, {
const result = await sp.createDataSet(client, {
payee: provider.payee,
payer: client.account.address,
endpoint: provider.pdp.serviceURL,
serviceURL: provider.pdp.serviceURL,
cdn: argv.flags.cdn,
})

spinner.message(
p.log.info(
`Waiting for tx ${hashLink(result.txHash, chain)} to be mined...`
)
const dataset = await sp.waitForDataSetCreationStatus(result)
const dataset = await sp.waitForCreateDataSet(result)

spinner.stop(`Data set created #${dataset.dataSetId}`)
p.log.info(`Data set created #${dataset.dataSetId}`)
} catch (error) {
if (argv.flags.debug) {
spinner.clear()
console.error(error)
} else {
spinner.error((error as Error).message)
p.log.error((error as Error).message)
}
}
}
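For orientation, the create flow in this file now reduces to the shape below. The `sp` object here is a local stand-in stub with guessed signatures inferred from the diff (`sp.createDataSet` taking `payee`, `payer`, `serviceURL`, and `cdn`; `sp.waitForCreateDataSet` resolving to a data set id), not the real `@filoz/synapse-core` API:

```typescript
// Hedged sketch of the new flow in datasets-create.ts.
// `sp` is stubbed locally; real signatures in @filoz/synapse-core may differ.
type CreateResult = { txHash: string };

const sp = {
  async createDataSet(
    _client: unknown,
    _opts: { payee: string; payer: string; serviceURL: string; cdn: boolean }
  ): Promise<CreateResult> {
    return { txHash: "0xstub" }; // stand-in for submitting the on-chain tx
  },
  async waitForCreateDataSet(_result: CreateResult): Promise<{ dataSetId: bigint }> {
    return { dataSetId: 1n }; // stand-in for waiting on the mined tx
  },
};

async function createFlow(): Promise<bigint> {
  const result = await sp.createDataSet(null, {
    payee: "0xpayee", // placeholder address
    payer: "0xpayer", // placeholder address
    serviceURL: "https://provider.example",
    cdn: false,
  });
  const { dataSetId } = await sp.waitForCreateDataSet(result);
  return dataSetId;
}

createFlow().then((id) => console.log(`Data set created #${id}`));
```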
61 changes: 9 additions & 52 deletions examples/cli/src/commands/datasets-terminate.ts
@@ -1,11 +1,9 @@
import * as p from '@clack/prompts'
import { getDataSets, terminateDataSet } from '@filoz/synapse-core/warm-storage'
import { terminateServiceSync } from '@filoz/synapse-core/warm-storage'
import { type Command, command } from 'cleye'
import type { Account, Chain, Client, Transport } from 'viem'
import { waitForTransactionReceipt } from 'viem/actions'
import { privateKeyClient } from '../client.ts'
import { globalFlags } from '../flags.ts'
import { hashLink } from '../utils.ts'
import { hashLink, selectDataSet } from '../utils.ts'

export const datasetsTerminate: Command = command(
{
@@ -23,67 +21,26 @@
async (argv) => {
const { client, chain } = privateKeyClient(argv.flags.chain)

const spinner = p.spinner()
try {
const dataSetId = argv._.dataSetId
? BigInt(argv._.dataSetId)
: await selectDataSet(client, argv.flags)
spinner.start(`Terminating data set ${dataSetId}...`)
p.log.info(`Terminating data set ${dataSetId}...`)

const tx = await terminateDataSet(client, {
const { event } = await terminateServiceSync(client, {
dataSetId,
onHash(hash) {
p.log.info(`Waiting for tx ${hashLink(hash, chain)} to be mined...`)
},
})

spinner.message(`Waiting for tx ${hashLink(tx, chain)} to be mined...`)
await waitForTransactionReceipt(client, {
hash: tx,
})

spinner.stop(`Data set terminated`)
p.log.info(`Data set #${event.args.dataSetId} terminated.`)
} catch (error) {
if (argv.flags.debug) {
spinner.clear()
console.error(error)
} else {
spinner.error((error as Error).message)
p.log.error((error as Error).message)
}
}
}
)

async function selectDataSet(
client: Client<Transport, Chain, Account>,
options: { debug?: boolean }
) {
const spinner = p.spinner()
spinner.start(`Fetching data sets...`)

try {
const dataSets = await getDataSets(client, {
address: client.account.address,
})
spinner.stop(`Fetching data sets complete`)

const dataSetId = await p.select({
message: 'Pick a data set to terminate.',
options: dataSets.map((dataSet) => ({
value: dataSet.dataSetId,
label: `#${dataSet.dataSetId} - SP: #${dataSet.providerId} ${dataSet.pdp.serviceURL} ${dataSet.pdpEndEpoch > 0n ? `Terminating at epoch ${dataSet.pdpEndEpoch}` : ''}`,
})),
})
if (p.isCancel(dataSetId)) {
p.cancel('Operation cancelled.')
process.exit(1)
}

return dataSetId
} catch (error) {
spinner.error('Failed to select data set')
if (options.debug) {
console.error(error)
} else {
p.log.error((error as Error).message)
}
process.exit(1)
}
}
8 changes: 4 additions & 4 deletions examples/cli/src/commands/datasets.ts
@@ -1,5 +1,5 @@
import * as p from '@clack/prompts'
import { getDataSets } from '@filoz/synapse-core/warm-storage'
import { getPdpDataSets } from '@filoz/synapse-core/warm-storage'
import { type Command, command } from 'cleye'
import { getBlockNumber } from 'viem/actions'
import { privateKeyClient } from '../client.ts'
@@ -27,13 +27,13 @@

spinner.start('Listing data sets...')
try {
const dataSets = await getDataSets(client, {
address: client.account.address,
const dataSets = await getPdpDataSets(client, {
client: client.account.address,
})
spinner.stop('Data sets:')
dataSets.forEach(async (dataSet) => {
p.log.info(
`#${dataSet.dataSetId} ${dataSet.cdn ? 'CDN' : ''} ${dataSet.pdp.serviceURL} ${dataSet.pdpEndEpoch > 0n ? `Terminating at epoch ${dataSet.pdpEndEpoch}` : ''} ${dataSet.live ? 'Live' : ''} ${dataSet.managed ? 'Managed' : ''}`
`#${dataSet.dataSetId} ${dataSet.cdn ? 'CDN' : ''} ${dataSet.provider.pdp.serviceURL} ${dataSet.pdpEndEpoch > 0n ? `Terminating at epoch ${dataSet.pdpEndEpoch}` : ''} ${dataSet.live ? 'Live' : ''} ${dataSet.managed ? 'Managed' : ''}`
)
})
p.log.warn(`Block number: ${blockNumber}`)