Import entire IPFS DAGs in a single request using CAR (Content Addressable aRchive) files. Your CIDs are preserved exactly — no re-chunking or re-hashing.
A CAR file packages an entire IPFS directory tree or DAG into a single portable archive. Each block is stored with its original CID, so the service imports everything as-is: directory structure, links, and root CID all survive the transfer unchanged.
```
POST /upload/new
```

Same endpoint as regular uploads; add `car: true` to signal a CAR import.
| Parameter | Type | Required | Description |
|---|---|---|---|
| `content` | string | Yes | Base64-encoded CAR file |
| `car` | boolean | Yes | Set to `true` to enable CAR import |
| `description` | string | No | Short description of the import |
| `folderId` | string | No | Folder ID to organize the imported content into |
| `metadata` | object | No | Custom key-value pairs (same rules as regular uploads) |
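Assembled in code, a minimal request body looks like this (a sketch only; the bytes below stand in for a real CAR archive):

```python
import base64
import json

# Placeholder bytes; in practice this would be the contents of a .car file
car_bytes = b"example CAR bytes"

body = {
    "content": base64.b64encode(car_bytes).decode(),  # base64-encoded CAR file
    "car": True,                                      # required flag for CAR import
    "description": "Example import",                  # optional
}

print(json.dumps(body))
```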
Step 1: Create a CAR file from a local directory using `ipfs-car`:

```shell
npx ipfs-car pack ./my-directory -o my-archive.car
```

Step 2: Upload the CAR file:
```shell
curl -X POST https://api.ipfs.ninja/upload/new \
  -H "X-Api-Key: bws_your_api_key_here" \
  -H "Content-Type: application/json" \
  -d "{
    \"content\": \"$(base64 -w0 my-archive.car)\",
    \"car\": true,
    \"description\": \"My directory import\"
  }"
```

```javascript
import fs from "fs";

const carBuffer = fs.readFileSync("my-archive.car");
const base64Content = carBuffer.toString("base64");

const response = await fetch("https://api.ipfs.ninja/upload/new", {
  method: "POST",
  headers: {
    "X-Api-Key": "bws_your_api_key_here",
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    content: base64Content,
    car: true,
    description: "My directory import"
  })
});

const result = await response.json();
console.log("Root CID:", result.cid);
console.log("Gateway:", result.uris.url);
```

```python
import base64
import requests

with open("my-archive.car", "rb") as f:
    car_content = base64.b64encode(f.read()).decode()

response = requests.post(
    "https://api.ipfs.ninja/upload/new",
    headers={
        "X-Api-Key": "bws_your_api_key_here",
        "Content-Type": "application/json"
    },
    json={
        "content": car_content,
        "car": True,
        "description": "My directory import"
    }
)

result = response.json()
print("Root CID:", result["cid"])
print("Gateway:", result["uris"]["url"])
```

200 OK

```json
{
  "cid": "bafybeigdyrzt5sfp7udm7hu76uh7y26nf3efuylqabf3oclgtqy55fbzdi",
  "sizeMB": 4.2,
  "car": true,
  "fileCount": 12,
  "uris": {
    "ipfs": "ipfs://bafybeigdyrzt5sfp7udm7hu76uh7y26nf3efuylqabf3oclgtqy55fbzdi",
    "url": "https://ipfs.ninja/ipfs/bafybeigdyrzt5sfp7udm7hu76uh7y26nf3efuylqabf3oclgtqy55fbzdi"
  }
}
```

Each file inside the CAR counts individually against your plan's file limit. A CAR containing 12 files uses 13 slots (12 children + 1 root CAR entry). A single-file CAR counts as 1 slot.
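The slot arithmetic can be sketched as a small helper (hypothetical code, mirroring the documented counting rule):

```python
def car_slots(file_count: int) -> int:
    """Slots used by a CAR import: one per child file plus one root
    CAR entry; a single-file CAR counts as just one slot."""
    return 1 if file_count <= 1 else file_count + 1

def would_exceed(file_count: int, used: int, limit: int) -> bool:
    """True if importing the CAR would push usage past the plan's file limit."""
    return used + car_slots(file_count) > limit

print(car_slots(12))                 # 13 slots: 12 children + 1 root entry
print(would_exceed(523, 800, 1000))  # True: 800 + 524 slots exceeds a 1000-file plan
```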
Imported files share a common parent identifier (`carParentCid`) so you can group them. If importing the CAR would exceed your plan's file count, the request is rejected with a 402 error and the content is unpinned automatically:

```json
{
  "error": "file limit exceeded: this CAR contains 523 files, you have 800/1000. Upgrade your plan."
}
```

Use the `x-amz-meta-import: car` header on a `PutObject` request to import a CAR file through the S3 API.
```javascript
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import fs from "fs";

const s3 = new S3Client({
  endpoint: "https://s3.ipfs.ninja",
  credentials: {
    accessKeyId: "bws_628bba35",
    secretAccessKey: "bws_628bba35e9e0079d9ff9c392b1b55a7b"
  },
  region: "us-east-1",
  forcePathStyle: true
});

const result = await s3.send(new PutObjectCommand({
  Bucket: "my-project",
  Key: "my-archive.car",
  Body: fs.readFileSync("my-archive.car"),
  ContentType: "application/vnd.ipld.car",
  Metadata: { import: "car" } // ← triggers CAR import
}));

console.log("Root CID:", result.ETag);
```

The `ipfs_import_car` tool is available in the MCP Server (v1.3.0+):
```
You: Import my-archive.car to IPFS
Claude: [calls ipfs_import_car with base64 content]
→ Root CID: bafybeig... — https://ipfs.ninja/ipfs/bafybeig...
```

Use the `ipfs-car` CLI tool:
```shell
# Install
npm install -g ipfs-car

# Pack a directory into a CAR file
ipfs-car pack ./my-directory -o my-archive.car

# Check the root CID before uploading
ipfs-car roots my-archive.car
# bafybeigdyrzt5sfp7udm7hu76uh7y26nf3efuylqabf3oclgtqy55fbzdi
```

Export any CID as a CAR file using the Kubo CLI:

```shell
ipfs dag export QmXyz... > my-archive.car
```

Use the `@ipld/car` library:
```javascript
import { CarWriter } from "@ipld/car";
import { CID } from "multiformats/cid";
import * as raw from "multiformats/codecs/raw";
import { sha256 } from "multiformats/hashes/sha2";

// Create a block and derive its CID
const block1 = new TextEncoder().encode("Hello, IPFS!");
const hash1 = await sha256.digest(block1);
const cid1 = CID.create(1, raw.code, hash1);

// Write the CAR, draining the output stream while blocks are added
const { writer, out } = CarWriter.create([cid1]);
const collected = (async () => {
  const chunks = [];
  for await (const chunk of out) chunks.push(chunk);
  return chunks;
})();
await writer.put({ cid: cid1, bytes: block1 });
await writer.close();
const carBuffer = Buffer.concat(await collected);
```

The key benefit of CAR import is CID preservation. You can verify the root CID matches before and after upload:
```shell
# 1. Pack directory and note the root CID
ipfs-car pack ./my-nft-collection -o collection.car
ipfs-car roots collection.car
# bafybeigdyrzt5sfp7udm7hu76uh7y26nf3efuylqabf3oclgtqy55fbzdi

# 2. Upload to IPFS Ninja
curl -s -X POST https://api.ipfs.ninja/upload/new \
  -H "X-Api-Key: bws_your_api_key" \
  -H "Content-Type: application/json" \
  -d "{\"content\": \"$(base64 -w0 collection.car)\", \"car\": true}" \
  | jq .cid
# "bafybeigdyrzt5sfp7udm7hu76uh7y26nf3efuylqabf3oclgtqy55fbzdi"
# ✓ CIDs match: content imported exactly as built locally
```

Migrating from Pinata:

```shell
# Export from Pinata via a local IPFS (Kubo) node
ipfs dag export QmYourCID > export.car

# Import to IPFS Ninja
curl -X POST https://api.ipfs.ninja/upload/new \
  -H "X-Api-Key: bws_your_api_key" \
  -H "Content-Type: application/json" \
  -d "{\"content\": \"$(base64 -w0 export.car)\", \"car\": true}"
```

Migrating from Filebase:

```shell
# Filebase supports CAR export via their S3 API
aws s3 cp s3://your-bucket/your-file.car export.car \
  --endpoint-url https://s3.filebase.com

# Import to IPFS Ninja
curl -X POST https://api.ipfs.ninja/upload/new \
  -H "X-Api-Key: bws_your_api_key" \
  -H "Content-Type: application/json" \
  -d "{\"content\": \"$(base64 -w0 export.car)\", \"car\": true}"
```

| Limit | Value |
|---|---|
| Max CAR file size | 100 MB |
| Root CIDs | Must have at least one root CID |
| CAR format | CARv1 (universally supported) |
| Availability | All plans (Dharma, Bodhi, Nirvana) |
Storage and file count limits from your plan apply: each file inside the imported CAR counts toward your file limit as described above, and the decoded CAR size is deducted from your storage quota.
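As a sketch of the storage accounting (the usage numbers are illustrative, reusing the 4.2 MB size from the example response):

```python
def fits_storage(car_size_mb: float, used_mb: float, quota_mb: float) -> bool:
    """The decoded CAR size is deducted from the plan's storage quota."""
    return used_mb + car_size_mb <= quota_mb

print(fits_storage(4.2, 90.0, 100.0))  # True: the import fits
print(fits_storage(4.2, 97.0, 100.0))  # False: it would exceed the quota
```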
The uploaded content is less than 40 bytes, which is too small to be a valid CAR file. Make sure you're sending the full base64-encoded CAR content.
The decoded CAR file exceeds 100 MB. Split your content into multiple smaller CAR files using the `carbites` package, or upload files individually.
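Both size errors can be caught locally before uploading. Here is a pre-flight sketch using the limits described above (under 40 bytes is too small to be a valid CAR, over 100 MB is rejected):

```python
import os

MIN_CAR_BYTES = 40                 # smaller than this cannot be a valid CAR file
MAX_CAR_BYTES = 100 * 1024 * 1024  # 100 MB CAR import limit

def preflight_check(path: str) -> int:
    """Raise ValueError if the CAR file is outside the documented size
    limits; return its size in bytes otherwise."""
    size = os.path.getsize(path)
    if size < MIN_CAR_BYTES:
        raise ValueError(f"{path}: too small to be a valid CAR file")
    if size > MAX_CAR_BYTES:
        raise ValueError(f"{path}: exceeds the 100 MB CAR import limit")
    return size
```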
The IPFS node couldn't process the CAR file. Verify the file is a valid CARv1 archive:

```shell
ipfs-car roots my-archive.car
```

If this command fails, the CAR file is malformed. Regenerate it with `ipfs-car pack` or `ipfs dag export`.
Your plan's storage limit has been reached. Delete unused files or upgrade your plan at ipfs.ninja/pricing.