Storage Advanced Features
Learn about advanced storage features including search, copy, move operations, batch uploads, and file size management.
Advanced Search & Filtering
Taruvi Storage provides two powerful ways to search and filter objects:
- GET with Query Parameters ⭐ Recommended - RESTful, cacheable, supports 40+ filters
- POST Search Endpoint - Legacy endpoint, limited filtering
Method 1: Advanced Query Filters (GET) ⭐ Recommended
The object list endpoint supports comprehensive filtering via query parameters, with query performance optimized through database indexes.
Endpoint
GET /api/apps/{app_slug}/storage/buckets/{bucket_slug}/objects/?[filters]
Authorization: Bearer YOUR_TOKEN
Available Filters (40+ Options)
Range Filters
- size__gte, size__lte, size__gt, size__lt - File size in bytes
- min_size, max_size - Friendly alternatives for size ranges
- created_at__gte, created_at__lte - Date/time ranges (ISO 8601)
- created_after, created_before - Friendly date alternatives
- updated_at__gte, updated_at__lte - Modification date ranges
Search Filters
- search - Search in both filename AND file path
- filename__icontains - Search in filename only
- prefix - Filter by bucket-relative path prefix (simple, recommended)
- file - Exact file path match
- file__icontains - Search anywhere in file path
- file__startswith - Filter by bucket-relative path prefix (fastest, explicit)
- file__istartswith - Case-insensitive bucket-relative path prefix filter
- metadata_search - Search in JSON metadata
MIME Type Filters
- mimetype - Exact MIME type match
- mimetype__in - Multiple types (comma-separated)
- mimetype_category - Filter by category: image, video, audio, application, text
Visibility & User Filters
- visibility - Filter by public or private
- created_by_me - Files created by current user (true/false)
- modified_by_me - Files modified by current user
- created_by__username - Filter by creator username
- created_by__username__icontains - Username contains
Sorting
- ordering - Sort by created_at, updated_at, size, filename, path (prefix with - for descending)
Performance Features
- ✅ Sub-second queries on 100K+ objects with proper indexing
- ✅ Composite indexes for multi-filter queries
- ✅ Optimized range queries with B-tree indexes
- ✅ Fast path prefix searches with indexed lookups
Examples
Files Between 1MB and 10MB
curl "https://your-api.com/api/apps/my-app/storage/buckets/documents/objects/?size__gte=1048576&size__lte=10485760" \
-H "Authorization: Bearer YOUR_TOKEN"
Files Uploaded This Month
curl "https://your-api.com/api/apps/my-app/storage/buckets/documents/objects/?created_at__gte=2024-01-01&created_at__lte=2024-01-31" \
-H "Authorization: Bearer YOUR_TOKEN"
All Images (Multiple MIME Types)
curl "https://your-api.com/api/apps/my-app/storage/buckets/assets/objects/?mimetype__in=image/png,image/jpeg,image/webp" \
-H "Authorization: Bearer YOUR_TOKEN"
All Images (By Category)
curl "https://your-api.com/api/apps/my-app/storage/buckets/assets/objects/?mimetype_category=image" \
-H "Authorization: Bearer YOUR_TOKEN"
My Public Files
curl "https://your-api.com/api/apps/my-app/storage/buckets/photos/objects/?created_by_me=true&visibility=public" \
-H "Authorization: Bearer YOUR_TOKEN"
Search in Metadata
curl "https://your-api.com/api/apps/my-app/storage/buckets/documents/objects/?metadata_search=invoice" \
-H "Authorization: Bearer YOUR_TOKEN"
Complex Multi-Filter Query
curl "https://your-api.com/api/apps/my-app/storage/buckets/documents/objects/ \
?mimetype=application/pdf \
&size__gte=5242880 \
&created_by__username=john-doe \
&created_at__gte=2024-01-01 \
&created_at__lte=2024-01-31 \
&ordering=-size" \
-H "Authorization: Bearer YOUR_TOKEN"
Search Full File Path (Not Just Filename)
# Finds /projects/financials/2024/report.pdf
curl "https://your-api.com/api/apps/my-app/storage/buckets/documents/objects/?file__icontains=financials" \
-H "Authorization: Bearer YOUR_TOKEN"
For complete filter reference, see Objects API - Query Parameters.
Method 2: POST Search Endpoint (Legacy)
The POST search endpoint provides basic filtering but is limited compared to GET query parameters.
Endpoint
POST /api/apps/{app_slug}/storage/buckets/{bucket_slug}/objects/search/
Authorization: Bearer YOUR_TOKEN
Content-Type: application/json
Request Body
{
"prefix": "users/john-doe/",
"search": "avatar",
"sortBy": {
"column": "created_at",
"order": "desc"
},
"limit": 100,
"offset": 0
}
Parameters
| Field | Type | Required | Description |
|---|---|---|---|
| prefix | string | No | Filter by bucket-relative path prefix (e.g., users/john-doe/) |
| search | string | No | Search in filename (case-insensitive) |
| sortBy | object | No | Sort configuration |
| sortBy.column | string | No | Column to sort by (created_at, updated_at, size, filename) |
| sortBy.order | string | No | Sort order: asc or desc (default: asc) |
| limit | integer | No | Max results (default: 100, max: 1000) |
| offset | integer | No | Skip first N results (default: 0) |
Examples
Search in Specific Folder
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/user-avatars/objects/search/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"prefix": "users/john-doe/",
"search": "profile",
"limit": 50
}'
Sort by Size (Largest First)
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/documents/objects/search/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"sortBy": {
"column": "size",
"order": "desc"
},
"limit": 20
}'
Pagination Example
# Page 1
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/media/objects/search/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"limit": 100,
"offset": 0
}'
# Page 2
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/media/objects/search/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"limit": 100,
"offset": 100
}'
Complex Search
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/documents/objects/search/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"prefix": "projects/project-123/",
"search": "report",
"sortBy": {
"column": "created_at",
"order": "desc"
},
"limit": 50,
"offset": 0
}'
Response (200 OK)
{
"success": true,
"message": "Objects retrieved successfully",
"status_code": 200,
"data": {
"objects": [
{
"id": 1,
"uuid": "7c9e6679-7425-40de-944b-e07fc1f90ae7",
"filename": "profile-avatar.jpg",
"file": "users/john-doe/profile-avatar.jpg",
"size": 245678,
"mimetype": "image/jpeg",
"created_at": "2025-01-15T10:05:00Z",
"updated_at": "2025-01-15T10:05:00Z"
}
],
"count": 1,
"bucket": "user-avatars"
}
}
TypeScript Example
async function searchObjects(
bucketSlug: string,
options: {
prefix?: string;
search?: string;
sortBy?: { column: string; order: 'asc' | 'desc' };
limit?: number;
offset?: number;
}
) {
const response = await fetch(
`https://your-api.com/api/apps/my-app/storage/buckets/${bucketSlug}/objects/search/`,
{
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_TOKEN',
'Content-Type': 'application/json',
},
body: JSON.stringify(options),
}
);
return response.json();
}
// Usage
const results = await searchObjects('user-avatars', {
prefix: 'users/john-doe/',
search: 'avatar',
sortBy: { column: 'created_at', order: 'desc' },
limit: 50,
});
Copy Objects
Copy files to new locations within the same or different buckets.
Endpoint
POST /api/apps/{app_slug}/storage/buckets/{bucket_slug}/objects/copy/
Authorization: Bearer YOUR_TOKEN
Content-Type: application/json
Request Body
{
"source_path": "users/john-doe/avatar.jpg",
"destination_bucket": "avatars-backup",
"destination_path": "backups/john-doe-avatar.jpg"
}
Parameters
| Field | Type | Required | Description |
|---|---|---|---|
| source_path | string | Yes | Path of source object in current bucket |
| destination_bucket | string | No | Target bucket slug (defaults to source bucket) |
| destination_path | string | Yes | Path for copied object |
Behavior
- Creates a new object with a new UUID
- Copies file content to the new location
- Copies metadata from source object
- Source object remains unchanged
Examples
Copy within Same Bucket
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/user-avatars/objects/copy/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"source_path": "users/john-doe/avatar.jpg",
"destination_path": "users/john-doe/avatar-backup.jpg"
}'
Copy to Different Bucket
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/user-avatars/objects/copy/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"source_path": "users/john-doe/avatar.jpg",
"destination_bucket": "avatars-archive",
"destination_path": "2025/john-doe-avatar.jpg"
}'
Batch Copy (Multiple Requests)
const filesToCopy = [
{ source: 'file1.jpg', dest: 'backup/file1.jpg' },
{ source: 'file2.jpg', dest: 'backup/file2.jpg' },
{ source: 'file3.jpg', dest: 'backup/file3.jpg' },
];
for (const file of filesToCopy) {
await fetch(
'https://your-api.com/api/apps/my-app/storage/buckets/my-bucket/objects/copy/',
{
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_TOKEN',
'Content-Type': 'application/json',
},
body: JSON.stringify({
source_path: file.source,
destination_path: file.dest,
}),
}
);
}
Response (201 Created)
{
"success": true,
"message": "Object copied successfully",
"status_code": 201,
"data": {
"id": 42,
"uuid": "a3c5e2b7-9d4f-4b2a-8e6c-1f5d3a8b9c0d",
"bucket": 2,
"bucket_slug": "avatars-backup",
"bucket_name": "Avatars Backup",
"filename": "john-doe-avatar.jpg",
"file": "backups/john-doe-avatar.jpg",
"size": 245678,
"mimetype": "image/jpeg",
"metadata": {
"copied_from": "users/john-doe/avatar.jpg"
},
"created_at": "2025-01-15T12:00:00Z",
"updated_at": "2025-01-15T12:00:00Z",
"created_by": 1,
"modified_by": null
}
}
Error Responses
Source Not Found (404)
{
"success": false,
"message": "Source file not found in storage",
"status_code": 404
}
Copy Failed (500)
{
"success": false,
"message": "Failed to copy object",
"status_code": 500,
"data": {
"detail": "Permission denied"
}
}
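JavaScript Example with Error Handling
A small helper that wraps the copy endpoint and surfaces the response and error cases documented above. This is a sketch only; the copyObject name is illustrative:
// Copy one object; throws on the documented 404 and failure responses.
async function copyObject(bucketSlug, sourcePath, destinationPath, destinationBucket) {
  const response = await fetch(
    `https://your-api.com/api/apps/my-app/storage/buckets/${bucketSlug}/objects/copy/`,
    {
      method: 'POST',
      headers: {
        'Authorization': 'Bearer YOUR_TOKEN',
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        source_path: sourcePath,
        destination_path: destinationPath,
        // destination_bucket is optional; omit it to copy within the same bucket
        ...(destinationBucket ? { destination_bucket: destinationBucket } : {}),
      }),
    }
  );

  const result = await response.json();
  if (response.status === 404) {
    throw new Error(`Source not found: ${sourcePath}`);
  }
  if (!response.ok) {
    throw new Error(result.message || 'Copy failed');
  }
  return result.data; // the newly created object
}

// Usage
// const copy = await copyObject('user-avatars', 'users/john-doe/avatar.jpg', 'users/john-doe/avatar-backup.jpg');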
Move/Rename Objects
Move or rename objects within a bucket. Since files are stored by UUID, this is a metadata-only operation.
Endpoint
POST /api/apps/{app_slug}/storage/buckets/{bucket_slug}/objects/move/
Authorization: Bearer YOUR_TOKEN
Content-Type: application/json
Request Body
{
"source_path": "users/john-doe/old-name.jpg",
"destination_path": "users/john-doe/new-name.jpg"
}
Parameters
| Field | Type | Required | Description |
|---|---|---|---|
| source_path | string | Yes | Current path |
| destination_path | string | Yes | New path |
Behavior
- Updates path and filename metadata
- Fast operation - no file copy required
- Instant completion
- Object UUID remains unchanged
Examples
Rename File
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/user-avatars/objects/move/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"source_path": "users/john-doe/avatar-old.jpg",
"destination_path": "users/john-doe/avatar-new.jpg"
}'
Move to Different Folder
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/documents/objects/move/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"source_path": "temp/report.pdf",
"destination_path": "reports/2025/q1-report.pdf"
}'
Reorganize User Files
// Move all files from old structure to new structure
const files = await listObjects('my-bucket', 'old-folder/');
for (const file of files.data) {
const newPath = file.path.replace('old-folder/', 'new-folder/');
await fetch(
'https://your-api.com/api/apps/my-app/storage/buckets/my-bucket/objects/move/',
{
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_TOKEN',
'Content-Type': 'application/json',
},
body: JSON.stringify({
source_path: file.path,
destination_path: newPath,
}),
}
);
}
Response (200 OK)
{
"success": true,
"message": "Object moved successfully",
"status_code": 200,
"data": {
"id": 1,
"uuid": "7c9e6679-7425-40de-944b-e07fc1f90ae7",
"bucket": 1,
"bucket_slug": "user-avatars",
"bucket_name": "User Avatars",
"filename": "avatar-new.jpg",
"file": "users/john-doe/avatar-new.jpg",
"size": 245678,
"mimetype": "image/jpeg",
"metadata": {},
"created_at": "2025-01-15T10:05:00Z",
"updated_at": "2025-01-15T12:30:00Z",
"created_by": 1,
"modified_by": 1
}
}
Performance
Move operations are extremely fast because:
- No file copy required in cloud storage
- Only metadata is updated
- No network transfer of file content
Perfect for:
- Renaming files
- Reorganizing folder structures
- Batch path updates
File Size Limitations
Taruvi Storage currently supports file uploads with the following limitations:
Upload Limits
Single File Upload
- Default limit: 50MB per file
- Configurable: Can be adjusted via the bucket's file_size_limit setting during bucket creation
- Upload methods: Regular upload, PUT/POST object endpoints
Batch Upload
- Max files: 10 files per request
- Total size limit: 100MB across all files combined
- Per-file limit: Respects the bucket's file_size_limit
Current Capabilities
| Upload Method | Max File Size | Best For |
|---|---|---|
| Single upload (POST/PUT) | 50MB (default, configurable) | Images, documents, small videos |
| Batch upload | 100MB total, 10 files max | Photo galleries, multiple documents |
| Raw binary upload | 50MB (default, configurable) | S3-compatible clients, API integrations |
Recommendations
For files within limits (<50MB):
- Use standard upload endpoints
- Fast and reliable for most use cases
- Ideal for images, documents, audio files, short videos
For larger files (>50MB):
- Increase the bucket's file_size_limit during bucket creation
- Note: Server memory constraints apply for very large files
- Consider file compression when appropriate (a client-side size check is sketched below)
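Example: Client-Side Size Check
Before uploading, you can check a file against the limit client-side so oversized files fail fast. A minimal sketch, assuming the documented 50MB default; substitute your bucket's actual file_size_limit:
// Hypothetical pre-upload check; DEFAULT_LIMIT mirrors the documented 50MB default.
const DEFAULT_LIMIT = 50 * 1024 * 1024; // bytes

function checkUploadSize(file, limit = DEFAULT_LIMIT) {
  if (file.size > limit) {
    throw new Error(
      `${file.name} is ${file.size} bytes and exceeds the ${limit}-byte limit. ` +
      'Raise the bucket file_size_limit or compress the file first.'
    );
  }
}

// Usage
// checkUploadSize(fileInput.files[0]);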
Example: Increasing Bucket Limit
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"name": "Large Videos",
"slug": "large-videos",
"file_size_limit": 524288000,
"description": "Bucket for video files up to 500MB"
}'
Upcoming Features
Multipart Upload (Coming Soon)
Future versions will include multipart upload support for:
- Files >100MB with chunked uploads
- Resumable uploads for unreliable connections
- Better memory efficiency for large files
- Progress tracking for long uploads
Batch Operations
Perform operations on multiple files efficiently.
Batch Delete Objects
Delete multiple objects in a single request with detailed success/failure tracking.
Endpoint
POST /api/apps/{app_slug}/storage/buckets/{bucket_slug}/objects/batch-delete/
Authorization: Bearer YOUR_TOKEN
Content-Type: application/json
Request Body
{
"paths": [
"users/123/avatar.jpg",
"users/123/banner.png",
"temp/file1.pdf"
]
}
Parameters
| Field | Type | Required | Description |
|---|---|---|---|
| paths | array | Yes | Array of object paths to delete (max 100 items) |
Behavior
- Max 100 objects per request
- Partial success - Continues deleting even if some fail
- Permission-aware - Skips objects user doesn't own (no errors for unauthorized)
- Detailed reporting - Returns which objects succeeded/failed and why
- Automatic deduplication - Duplicate paths are removed
Response Codes
| Code | Description |
|---|---|
| 200 OK | All objects deleted successfully |
| 207 Multi-Status | Some objects deleted, some failed |
| 400 Bad Request | Invalid request (empty array, >100 items) |
| 500 Internal Server Error | Server error during deletion |
Examples
Basic Batch Delete (curl)
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/user-files/objects/batch-delete/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"paths": [
"users/john-doe/old-avatar.jpg",
"users/john-doe/temp-file.pdf",
"users/john-doe/draft.docx"
]
}'
Success Response (200 OK)
{
"success": true,
"message": "Successfully deleted 3 objects",
"status_code": 200,
"data": {
"deleted_count": 3,
"failed": []
}
}
Partial Success Response (207 Multi-Status)
{
"success": true,
"message": "Deleted 2 of 3 objects",
"status_code": 207,
"data": {
"deleted_count": 2,
"failed": [
{
"path": "users/john-doe/missing.jpg",
"error": "Object not found"
}
]
}
}
JavaScript Example with Error Handling
async function batchDeleteFiles(bucketSlug, paths) {
const response = await fetch(
`https://your-api.com/api/apps/my-app/storage/buckets/${bucketSlug}/objects/batch-delete/`,
{
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_TOKEN',
'Content-Type': 'application/json',
},
body: JSON.stringify({ paths }),
}
);
const result = await response.json();
if (response.status === 200) {
console.log(`✓ All ${result.data.deleted_count} files deleted`);
} else if (response.status === 207) {
console.log(`⚠ ${result.data.deleted_count} deleted, ${result.data.failed.length} failed`);
result.data.failed.forEach(f => {
console.error(` ✗ ${f.path}: ${f.error}`);
});
} else {
console.error('✗ Batch delete failed:', result.message);
}
return result.data;
}
// Usage
await batchDeleteFiles('user-avatars', [
'users/123/old-photo.jpg',
'users/123/temp-file.pdf',
'users/123/draft.docx',
]);
Python Async Example
import aiohttp
import asyncio

async def batch_delete_objects(bucket_slug: str, paths: list[str]) -> dict:
    """Delete multiple objects in one request."""
    url = f"https://your-api.com/api/apps/my-app/storage/buckets/{bucket_slug}/objects/batch-delete/"

    async with aiohttp.ClientSession() as session:
        async with session.post(
            url,
            headers={
                'Authorization': 'Bearer YOUR_TOKEN',
                'Content-Type': 'application/json',
            },
            json={'paths': paths},
        ) as response:
            result = await response.json()
            data = result.get('data', {})

            if response.status == 200:
                print(f"✓ All {data['deleted_count']} files deleted")
            elif response.status == 207:
                print(f"⚠ {data['deleted_count']} deleted, {len(data['failed'])} failed")
                for item in data['failed']:
                    print(f"  ✗ {item['path']}: {item['error']}")

            return result
# Usage
paths_to_delete = [
'users/john/file1.jpg',
'users/john/file2.pdf',
'users/john/file3.png',
]
result = asyncio.run(batch_delete_objects('documents', paths_to_delete))
Delete All Files in a Folder
// Step 1: List all files in folder
const listResponse = await fetch(
'https://your-api.com/api/apps/my-app/storage/buckets/my-bucket/objects/?file__startswith=temp/',
{ headers: { 'Authorization': 'Bearer YOUR_TOKEN' } }
);
const { data } = await listResponse.json();
// Step 2: Extract paths
const paths = data.map(obj => obj.path);
// Step 3: Batch delete (handle batches of 100)
for (let i = 0; i < paths.length; i += 100) {
const batch = paths.slice(i, i + 100);
await fetch(
'https://your-api.com/api/apps/my-app/storage/buckets/my-bucket/objects/batch-delete/',
{
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_TOKEN',
'Content-Type': 'application/json',
},
body: JSON.stringify({ paths: batch }),
}
);
}
Error Responses
Empty Paths Array (400)
{
"paths": ["Paths list cannot be empty"]
}
Too Many Paths (400)
{
"paths": ["Cannot delete more than 100 objects at once"]
}
All Failed (207)
{
"deleted_count": 0,
"failed": [
{
"path": "file1.jpg",
"error": "Permission denied"
},
{
"path": "file2.pdf",
"error": "Object not found"
}
],
"message": "No objects were deleted"
}
Use Cases
1. Cleanup Temporary Files
// Delete all user's temporary uploads
const tempFiles = await listObjects('uploads', 'temp/user-123/');
const paths = tempFiles.data.map(f => f.path);
await batchDeleteFiles('uploads', paths);
2. Delete User Data (GDPR)
// Delete all files belonging to a user
const userFiles = await listObjects('user-data', `users/${userId}/`);
const paths = userFiles.data.map(f => f.path);
// Batch delete in chunks of 100
for (let i = 0; i < paths.length; i += 100) {
const batch = paths.slice(i, i + 100);
await batchDeleteFiles('user-data', batch);
}
3. Clean Up Old Files
// Delete files older than 30 days
const cutoffDate = new Date();
cutoffDate.setDate(cutoffDate.getDate() - 30);
const oldFiles = await fetch(
`https://your-api.com/api/apps/my-app/storage/buckets/temp-uploads/objects/?created_before=${cutoffDate.toISOString()}`
);
const { data } = await oldFiles.json();
const paths = data.map(f => f.path);
await batchDeleteFiles('temp-uploads', paths);
Best Practices
1. Batch Size
// ✓ Good: Respect the 100-item limit
const batches = [];
for (let i = 0; i < paths.length; i += 100) {
batches.push(paths.slice(i, i + 100));
}
for (const batch of batches) {
await batchDeleteFiles('my-bucket', batch);
}
2. Error Handling
// ✓ Good: Handle partial failures gracefully
const result = await batchDeleteFiles('my-bucket', paths);
if (result.failed.length > 0) {
// Log failures
console.error('Some files failed to delete:');
result.failed.forEach(f => {
logger.error(`Failed to delete ${f.path}: ${f.error}`);
});
// Optionally retry failures
const retryPaths = result.failed.map(f => f.path);
setTimeout(() => {
batchDeleteFiles('my-bucket', retryPaths);
}, 5000);
}
3. Performance vs Single Deletes
// ❌ Bad: Deleting files one by one (slow)
for (const path of paths) {
await deleteFile(bucketSlug, path); // 100 requests for 100 files
}
// ✓ Good: Use batch delete (fast)
await batchDeleteFiles(bucketSlug, paths); // 1 request for up to 100 files
4. Permission Awareness
// Batch delete automatically skips unauthorized files
// No need to pre-check permissions
const allFiles = ['my-file.jpg', 'other-user-file.jpg', 'public-file.jpg'];
const result = await batchDeleteFiles('my-bucket', allFiles);
// Only deletes files you own, skips others without error
Performance Benefits
- ~100x faster than individual deletes for 100 files
- Single database transaction for all deletions
- Efficient S3 batch API - up to 1000 files per S3 request
- Reduced network overhead - one HTTP request vs many
Comparison: Batch Delete vs Loop
| Metric | Loop (100 files) | Batch Delete (100 files) |
|---|---|---|
| HTTP Requests | 100 | 1 |
| Time (typical) | ~30-60 seconds | ~1-2 seconds |
| Network Overhead | High | Low |
| Error Tracking | Manual | Built-in |
| Partial Success | Not supported | Supported |
Batch Upload Objects
Upload multiple files in a single request for efficient bulk uploads.
Endpoint
POST /api/apps/{app_slug}/storage/buckets/{bucket_slug}/objects/batch-upload/
Authorization: Bearer YOUR_TOKEN
Content-Type: multipart/form-data
Request Format
Multipart form data with:
- files: Array of file uploads (field name: files)
- paths: JSON array of paths for each file
- metadata: Optional JSON array of metadata objects
Parameters
| Field | Type | Required | Description |
|---|---|---|---|
| files | file[] | Yes | Array of files to upload (max 10) |
| paths | JSON array | Yes | Corresponding paths for each file (must match files length) |
| metadata | JSON array | No | Optional metadata for each file (must match files length if provided) |
Constraints
- Max 10 files per request
- Total size limit: 100MB across all files
- Paths required: Must provide explicit path for each file
- Unique paths: No duplicate paths allowed
- Per-file validation: Each file validated against bucket constraints
Response Codes
| Code | Description |
|---|---|
| 200 OK | All files uploaded successfully |
| 207 Multi-Status | Some files uploaded, some failed |
| 400 Bad Request | Invalid request (validation errors) |
Examples
Basic Batch Upload (curl)
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/user-files/objects/batch-upload/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-F "files=@photo1.jpg" \
-F "files=@photo2.jpg" \
-F "files=@document.pdf" \
-F 'paths=["users/123/photo1.jpg","users/123/photo2.jpg","users/123/document.pdf"]'
With Metadata
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/documents/objects/batch-upload/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-F "files=@report1.pdf" \
-F "files=@report2.pdf" \
-F 'paths=["reports/2025/q1.pdf","reports/2025/q2.pdf"]' \
-F 'metadata=[{"quarter":"Q1","year":2025},{"quarter":"Q2","year":2025}]'
Success Response (200 OK)
{
"success": true,
"message": "Successfully uploaded 3 files",
"status_code": 200,
"data": {
"uploaded_count": 3,
"failed_count": 0,
"total": 3,
"successful": [
{
"index": 0,
"path": "users/123/photo1.jpg",
"object": {
"id": 101,
"uuid": "7c9e6679-7425-40de-944b-e07fc1f90ae7",
"filename": "photo1.jpg",
"file": "users/123/photo1.jpg",
"size": 245678,
"mimetype": "image/jpeg",
"created_at": "2025-01-15T10:05:00Z"
}
},
{
"index": 1,
"path": "users/123/photo2.jpg",
"object": { /* ... */ }
},
{
"index": 2,
"path": "users/123/document.pdf",
"object": { /* ... */ }
}
],
"failed": []
}
}
Partial Success Response (207 Multi-Status)
{
"success": true,
"message": "Uploaded 2 of 3 files",
"status_code": 207,
"data": {
"uploaded_count": 2,
"failed_count": 1,
"total": 3,
"successful": [
{
"index": 0,
"path": "users/123/photo1.jpg",
"object": { /* full object data */ }
},
{
"index": 2,
"path": "users/123/document.pdf",
"object": { /* full object data */ }
}
],
"failed": [
{
"index": 1,
"path": "users/123/large-video.mp4",
"error": "File size (60000000 bytes) exceeds bucket limit (50.0MB)"
}
]
}
}
JavaScript Example
async function batchUploadFiles(bucketSlug, filesWithPaths) {
const formData = new FormData();
// Add files
filesWithPaths.forEach(item => {
formData.append('files', item.file);
});
// Add paths as JSON
const paths = filesWithPaths.map(item => item.path);
formData.append('paths', JSON.stringify(paths));
// Optional: Add metadata
const metadata = filesWithPaths.map(item => item.metadata || {});
formData.append('metadata', JSON.stringify(metadata));
const response = await fetch(
`https://your-api.com/api/apps/my-app/storage/buckets/${bucketSlug}/objects/batch-upload/`,
{
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_TOKEN',
},
body: formData,
}
);
const result = await response.json();
if (response.status === 200) {
console.log(`✓ All ${result.data.uploaded_count} files uploaded`);
} else if (response.status === 207) {
console.log(`⚠ ${result.data.uploaded_count} uploaded, ${result.data.failed_count} failed`);
result.data.failed.forEach(f => {
console.error(` ✗ ${f.path}: ${f.error}`);
});
}
return result.data;
}
// Usage
const filesToUpload = [
{
file: fileInput1.files[0],
path: 'users/123/photo.jpg',
metadata: { uploaded_from: 'web', camera: 'iPhone' }
},
{
file: fileInput2.files[0],
path: 'users/123/document.pdf',
metadata: { document_type: 'invoice' }
}
];
await batchUploadFiles('user-files', filesToUpload);
Python Example
import json
import requests

def batch_upload_files(bucket_slug: str, files_data: list) -> dict:
    """
    Upload multiple files in one request.

    files_data: [
        {
            'file_path': '/path/to/file1.jpg',
            'storage_path': 'users/123/photo.jpg',
            'metadata': {'key': 'value'}
        },
        ...
    ]
    """
    url = f"https://your-api.com/api/apps/my-app/storage/buckets/{bucket_slug}/objects/batch-upload/"

    # Prepare files and form data
    files = []
    paths = []
    metadata_list = []

    for item in files_data:
        # Open file and add to files list
        f = open(item['file_path'], 'rb')
        files.append(('files', (item['file_path'].split('/')[-1], f)))
        # Collect paths and metadata
        paths.append(item['storage_path'])
        metadata_list.append(item.get('metadata', {}))

    # Prepare form data
    data = {
        'paths': json.dumps(paths),
        'metadata': json.dumps(metadata_list)
    }

    response = requests.post(
        url,
        headers={'Authorization': 'Bearer YOUR_TOKEN'},
        files=files,
        data=data
    )

    # Close file handles
    for _, (_, file_handle) in files:
        file_handle.close()

    result = response.json()
    payload = result.get('data', {})

    if response.status_code == 200:
        print(f"✓ All {payload['uploaded_count']} files uploaded")
    elif response.status_code == 207:
        print(f"⚠ {payload['uploaded_count']} uploaded, {payload['failed_count']} failed")
        for failure in payload['failed']:
            print(f"  ✗ {failure['path']}: {failure['error']}")

    return result
# Usage
files_to_upload = [
{
'file_path': '/tmp/photo.jpg',
'storage_path': 'users/123/photo.jpg',
'metadata': {'uploaded_from': 'python_script'}
},
{
'file_path': '/tmp/document.pdf',
'storage_path': 'users/123/document.pdf',
'metadata': {'document_type': 'report'}
}
]
result = batch_upload_files('user-files', files_to_upload)
React File Upload Component Example
import { useState } from 'react';

function BatchFileUploader({ bucketSlug }) {
const [files, setFiles] = useState([]);
const [uploading, setUploading] = useState(false);
const handleUpload = async () => {
if (files.length === 0) return;
setUploading(true);
const filesWithPaths = files.map((file, index) => ({
file: file,
path: `users/${userId}/${file.name}`,
metadata: {
original_name: file.name,
uploaded_at: new Date().toISOString()
}
}));
try {
const result = await batchUploadFiles(bucketSlug, filesWithPaths);
if (result.failed_count > 0) {
alert(`Upload completed with ${result.failed_count} failures`);
} else {
alert(`Successfully uploaded ${result.uploaded_count} files!`);
setFiles([]);
}
} catch (error) {
alert('Upload failed: ' + error.message);
} finally {
setUploading(false);
}
};
return (
<div>
<input
type="file"
multiple
onChange={(e) => setFiles(Array.from(e.target.files))}
disabled={uploading}
/>
<p>Selected: {files.length} files (max 10)</p>
<button onClick={handleUpload} disabled={uploading || files.length === 0}>
{uploading ? 'Uploading...' : `Upload ${files.length} files`}
</button>
</div>
);
}
Error Responses
Too Many Files (400)
{
"files": ["Cannot upload more than 10 files at once"]
}
Total Size Exceeds Limit (400)
{
"non_field_errors": ["Total file size (120000000 bytes) exceeds limit of 100MB"]
}
Paths Mismatch (400)
{
"non_field_errors": ["Paths array length (2) must match files array length (3)"]
}
Empty File (400)
{
"non_field_errors": ["File at index 1 ('users/123/empty.txt') is empty"]
}
Use Cases
1. Photo Gallery Upload
// User selects multiple photos from gallery
const photos = document.querySelector('#photo-input').files;
const filesWithPaths = Array.from(photos).map((photo, i) => ({
file: photo,
path: `galleries/${galleryId}/${Date.now()}-${i}.jpg`,
metadata: { gallery_id: galleryId, upload_date: new Date().toISOString() }
}));
await batchUploadFiles('photo-gallery', filesWithPaths);
2. Document Batch Import
// Import multiple documents at once
const documents = [
{ file: file1, path: 'documents/invoices/2025-001.pdf' },
{ file: file2, path: 'documents/invoices/2025-002.pdf' },
{ file: file3, path: 'documents/receipts/2025-jan.pdf' }
];
await batchUploadFiles('company-docs', documents);
3. User Onboarding Assets
// Upload profile picture, banner, and resume at once
const onboardingFiles = [
{ file: profilePic, path: `users/${userId}/profile.jpg` },
{ file: banner, path: `users/${userId}/banner.jpg` },
{ file: resume, path: `users/${userId}/resume.pdf` }
];
await batchUploadFiles('user-assets', onboardingFiles);
Best Practices
1. Respect File Limits
// ✓ Good: Check limits before upload
if (files.length > 10) {
alert('Maximum 10 files allowed. Please select fewer files.');
return;
}
const totalSize = files.reduce((sum, f) => sum + f.size, 0);
if (totalSize > 100 * 1024 * 1024) {
alert('Total size exceeds 100MB limit');
return;
}
2. Handle Partial Failures
// ✓ Good: Retry failed uploads
const result = await batchUploadFiles('my-bucket', filesWithPaths);
if (result.failed_count > 0) {
// Extract failed files and retry
const failedFiles = result.failed.map(f => {
const originalIndex = f.index;
return filesWithPaths[originalIndex];
});
console.log(`Retrying ${failedFiles.length} failed uploads...`);
await batchUploadFiles('my-bucket', failedFiles);
}
3. Show Upload Progress
// ✓ Good: Provide user feedback
function showProgress(uploaded, total) {
const percent = Math.round((uploaded / total) * 100);
progressBar.style.width = `${percent}%`;
progressText.textContent = `${uploaded} of ${total} files uploaded`;
}
// After batch upload
showProgress(result.uploaded_count, result.total);
4. Validate Files Client-Side
// ✓ Good: Pre-validate before uploading
function validateFiles(files) {
const errors = [];
files.forEach((file, index) => {
// Check file size (50MB per file)
if (file.size > 50 * 1024 * 1024) {
errors.push(`File ${file.name} is too large (max 50MB)`);
}
// Check file type
if (!file.type.startsWith('image/') && file.type !== 'application/pdf') {
errors.push(`File ${file.name} has invalid type`);
}
});
return errors;
}
Performance Benefits
- ~10x faster than sequential individual uploads
- Single HTTP request for up to 10 files
- Reduced network overhead and connection setup time
- Better user experience with consolidated progress tracking
Comparison: Batch Upload vs Individual Uploads
| Metric | Individual (10 files) | Batch Upload (10 files) |
|---|---|---|
| HTTP Requests | 10 | 1 |
| Time (typical) | ~10-20 seconds | ~2-3 seconds |
| Network Overhead | High (10x handshakes) | Low (1 handshake) |
| Progress Tracking | Complex (10 states) | Simple (1 response) |
| Error Handling | Per-request | Consolidated |
| Code Complexity | Higher | Lower |
Batch Delete (Bucket-Level)
When deleting a bucket, all files are automatically deleted:
curl -X DELETE https://your-api.com/api/apps/my-app/storage/buckets/old-bucket/ \
-H "Authorization: Bearer YOUR_TOKEN"
The system handles:
- Deletion of all files in the bucket
- Efficient batch processing for large numbers of files
- Complete cleanup of all objects and metadata
Manual Batch Operations
For custom batch operations, use loops:
Delete Multiple Objects
const objectsToDelete = ['file1.jpg', 'file2.jpg', 'file3.jpg'];
for (const path of objectsToDelete) {
await fetch(
`https://your-api.com/api/apps/my-app/storage/buckets/my-bucket/object/${path}`,
{
method: 'DELETE',
headers: { 'Authorization': 'Bearer YOUR_TOKEN' },
}
);
}
Copy Multiple Objects
import asyncio
import aiohttp

async def copy_files(files):
    async with aiohttp.ClientSession() as session:
        tasks = []
        for file in files:
            task = session.post(
                'https://your-api.com/api/apps/my-app/storage/buckets/source/objects/copy/',
                headers={'Authorization': 'Bearer YOUR_TOKEN'},
                json={
                    'source_path': file['source'],
                    'destination_bucket': 'destination',
                    'destination_path': file['dest'],
                }
            )
            tasks.append(task)
        results = await asyncio.gather(*tasks)
        return results
# Usage
files = [
{'source': 'file1.jpg', 'dest': 'backup/file1.jpg'},
{'source': 'file2.jpg', 'dest': 'backup/file2.jpg'},
]
asyncio.run(copy_files(files))
Folder Operations
While the API doesn't have explicit folder support, you can implement folder-like behavior using path prefixes.
List "Folder" Contents
curl "https://your-api.com/api/apps/my-app/storage/buckets/my-bucket/objects/?file__startswith=users/john-doe/" \
-H "Authorization: Bearer YOUR_TOKEN"
Create "Folder" Structure
Simply upload files with the desired path:
curl -X POST https://your-api.com/api/apps/my-app/storage/buckets/my-bucket/objects/upload/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-F "file=@file.jpg" \
-F "path=users/john-doe/photos/2025/file.jpg"
The "folders" (users, john-doe, photos, 2025) are created implicitly.
"Delete Folder" (Delete All Files with Prefix)
// 1. List all files in "folder"
const response = await fetch(
'https://your-api.com/api/apps/my-app/storage/buckets/my-bucket/objects/?file__startswith=users/john-doe/',
{ headers: { 'Authorization': 'Bearer YOUR_TOKEN' } }
);
const { data } = await response.json();
// 2. Delete each file
for (const file of data) {
await fetch(
`https://your-api.com/api/apps/my-app/storage/buckets/my-bucket/objects/${file.uuid}/`,
{
method: 'DELETE',
headers: { 'Authorization': 'Bearer YOUR_TOKEN' },
}
);
}
"Rename Folder" (Move All Files)
// Move all files from old-folder/ to new-folder/
const response = await fetch(
'https://your-api.com/api/apps/my-app/storage/buckets/my-bucket/objects/?file__startswith=old-folder/',
{ headers: { 'Authorization': 'Bearer YOUR_TOKEN' } }
);
const { data } = await response.json();
for (const file of data) {
const newPath = file.path.replace('old-folder/', 'new-folder/');
await fetch(
'https://your-api.com/api/apps/my-app/storage/buckets/my-bucket/objects/move/',
{
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_TOKEN',
'Content-Type': 'application/json',
},
body: JSON.stringify({
source_path: file.path,
destination_path: newPath,
}),
}
);
}
Performance Optimization
Efficient Queries
Use File Path Prefixes
# Fast: Optimized prefix search
?file__startswith=users/john-doe/
# Slower: Full text search
?search=john-doe
Limit Results
?limit=20 # Only fetch what you need
The API automatically optimizes these queries for best performance.
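For example, a prefix filter combined with explicit ordering and a small page size (all parameters documented above) keeps listing requests cheap:
// Prefix + ordering + small limit: an indexed, bounded listing query
const response = await fetch(
  'https://your-api.com/api/apps/my-app/storage/buckets/documents/objects/' +
  '?file__startswith=users/john-doe/&ordering=-created_at&limit=20',
  { headers: { 'Authorization': 'Bearer YOUR_TOKEN' } }
);
const recentFiles = await response.json();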
Caching Strategies
Cache Object Metadata
// Cache object metadata in localStorage/Redis
const cacheKey = `object:${objectUuid}`;
let metadata = cache.get(cacheKey);
if (!metadata) {
const response = await fetch(`/api/apps/my-app/storage/buckets/my-bucket/objects/${objectUuid}/`);
metadata = await response.json();
cache.set(cacheKey, metadata, { ttl: 3600 }); // 1 hour
}
CDN for Downloads
Put a CDN in front of download URLs for better performance:
https://cdn.example.com/api/apps/my-app/storage/buckets/my-bucket/object/file.jpg
Connection Pooling
For batch operations, use connection pooling:
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(total=3, backoff_factor=0.3)
adapter = HTTPAdapter(max_retries=retry, pool_connections=10, pool_maxsize=20)
session.mount('https://', adapter)

# Use session for all requests
for file in files:
    session.post(url, headers=headers, files={'file': file})
Security Best Practices
Path Validation
Always validate and sanitize paths:
function sanitizePath(path) {
// Remove parent directory references first
path = path.replace(/\.\./g, '');
// Collapse repeated slashes left behind
path = path.replace(/\/+/g, '/');
// Remove leading/trailing slashes
path = path.replace(/^\/+|\/+$/g, '');
return path;
}
const userPath = sanitizePath(userInput);
File Type Validation
Validate on both client and server:
const allowedTypes = ['image/jpeg', 'image/png', 'image/gif'];
function validateFile(file) {
// Check MIME type
if (!allowedTypes.includes(file.type)) {
throw new Error('Invalid file type');
}
// Check file extension
const ext = file.name.split('.').pop().toLowerCase();
if (!['jpg', 'jpeg', 'png', 'gif'].includes(ext)) {
throw new Error('Invalid file extension');
}
// Check file size (5MB)
if (file.size > 5 * 1024 * 1024) {
throw new Error('File too large');
}
return true;
}
Secure Metadata
Don't store sensitive information in metadata:
// ❌ Bad: Sensitive data in metadata
metadata: {
user_ssn: "123-45-6789",
credit_card: "4111-1111-1111-1111"
}
// ✓ Good: Safe metadata
metadata: {
uploaded_at: "2025-01-15T10:00:00Z",
image_dimensions: { width: 1920, height: 1080 },
tags: ["profile", "2025"]
}
Related Documentation
- Storage Overview - Architecture and concepts
- Storage Quickstart - Getting started
- Buckets API Reference - Bucket operations
- Objects API Reference - Basic file operations