Automate Google Drive File Uploads with n8n: Dynamic Folders & Bulk Upload

• Logic Workflow Team

#n8n #Google Drive #file upload #automation #workflow #tutorial

You just received 50 invoices via email. Manually downloading each one, creating monthly folders in Google Drive, and uploading them to the correct location will consume your entire morning. And you’ll do this again next week. And the week after.

This scenario repeats daily across organizations everywhere. Finance teams drowning in receipts. Marketing departments juggling campaign assets. HR managers organizing employee documents. The files keep coming, and manual organization doesn’t scale.

The Hidden Cost of Manual File Management

Manual file uploads create three problems that compound over time:

Inconsistent organization. When multiple team members upload files, folder naming conventions drift. “Invoices_January” becomes “January-Invoices” becomes “invoices jan.” Finding files later becomes a treasure hunt.

Lost time. Research shows knowledge workers spend nearly two hours daily searching for and gathering information. Much of that time goes toward finding files in poorly organized systems.

Human error. Files end up in wrong folders. Duplicates proliferate. Important documents get overwritten. These mistakes cascade into bigger problems downstream.

What You’ll Learn

  • How to configure the Google Drive node for reliable file uploads
  • Building dynamic folder structures that create themselves (year/month organization)
  • Handling bulk uploads from forms with multiple file attachments
  • Processing email attachments directly to organized Drive folders
  • Working with Shared Drives and team permissions
  • Error handling and retry patterns for production workflows
  • Performance optimization for large file batches

Understanding the Google Drive Node

The Google Drive node in n8n provides comprehensive file management capabilities. Before diving into advanced patterns, you need to understand what operations are available and how authentication works.

Available Operations

Resource | Operations
File     | Upload, Download, Copy, Delete, Move, Share, Update
Folder   | Create, Delete, Share
Drive    | Create, Delete, Get, List (for Shared Drives)

The most common workflow involves uploading files to specific folders, which requires combining multiple operations: searching for folders, creating them if needed, and then uploading files.

Authentication Setup

Google Drive requires OAuth2 authentication. The setup process involves:

  1. Creating credentials in the Google Cloud Console
  2. Enabling the Google Drive API
  3. Configuring OAuth consent screen
  4. Creating OAuth client credentials
  5. Adding the credentials to n8n

Important: OAuth tokens expire. If your workflows suddenly fail with authentication errors after working for weeks, the refresh token may have expired. See our authentication troubleshooting guide for solutions.

For detailed credential setup, check our n8n credential management guide.

Key Node Parameters

When configuring the Google Drive node for uploads, you’ll work with these parameters:

  • Resource: Select “File” for file operations
  • Operation: Choose “Upload” to add new files
  • Input Data Field Name: The binary data field containing your file (default: data)
  • File Name: The name for the uploaded file
  • Parent Drive: My Drive or a Shared Drive
  • Parent Folder: The destination folder (by list, URL, or ID)

Understanding these parameters is essential because most upload failures stem from incorrect configuration of the parent folder or binary data field.

Simple File Upload: The Foundation

Before building complex workflows, master the basic upload pattern. This foundation applies to every Google Drive automation you’ll build.

Basic Upload Configuration

The simplest upload workflow takes binary data from any source and sends it to a Google Drive folder:

{
  "nodes": [
    {
      "parameters": {},
      "name": "Manual Trigger",
      "type": "n8n-nodes-base.manualTrigger",
      "position": [250, 300]
    },
    {
      "parameters": {
        "url": "https://example.com/sample.pdf",
        "options": {}
      },
      "name": "Download File",
      "type": "n8n-nodes-base.httpRequest",
      "position": [450, 300]
    },
    {
      "parameters": {
        "resource": "file",
        "operation": "upload",
        "name": "sample-document.pdf",
        "folderId": {
          "__rl": true,
          "mode": "id",
          "value": "YOUR_FOLDER_ID_HERE"
        }
      },
      "name": "Google Drive",
      "type": "n8n-nodes-base.googleDrive",
      "position": [650, 300]
    }
  ]
}

Replace YOUR_FOLDER_ID_HERE with your actual Google Drive folder ID. You can find this ID in the folder’s URL: https://drive.google.com/drive/folders/FOLDER_ID_HERE.
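
If the folder URL arrives as workflow data instead (for example from a form field), a short expression can pull the ID out of it. This is a sketch; the driveFolderUrl field name is hypothetical:

// Extract the folder ID from a full Drive URL (field name is an assumption)
{{ $json.driveFolderUrl.match(/folders\/([^/?]+)/)[1] }}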

Understanding Binary Data in n8n

n8n handles files as binary data attached to items. When a node produces a file, it stores the file content in a binary property (usually named data).

The Google Drive upload node reads from this binary property. If your upstream node names the binary data differently, you must specify the correct field name in Input Data Field Name.

Common binary data field names:

Source                       | Default Binary Field
HTTP Request (file download) | data
Gmail (attachments)          | attachment_0, attachment_1, etc.
Read Binary Files            | data
Form submission              | Field name from form

Pro tip: Use the “Item Lists” panel in n8n to inspect what binary fields exist on your items. This prevents upload failures from incorrect field names.
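
If you prefer to check from inside the workflow, a short Code node can list the binary properties on each item. A minimal sketch (Run Once for All Items mode):

// List the binary property names present on each incoming item
return $input.all().map(item => ({
  json: { binaryKeys: Object.keys(item.binary || {}) }
}));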

Dynamic File Naming

Hard-coded file names cause overwrites. Use expressions to generate unique names:

// Include timestamp
{{ $now.format('yyyy-MM-dd_HHmmss') }}_invoice.pdf

// Include data from the item
{{ $json.customerName }}_{{ $json.invoiceNumber }}.pdf

// Preserve original filename
{{ $binary.data.fileName }}

For more expression patterns, see our n8n expressions guide.

Dynamic Folder Creation: The Game Changer

Static folder IDs work for simple use cases. But real workflows need folders that organize files automatically, such as creating monthly folders for invoices or client-specific folders for documents.

The Problem with Static Folders

Hard-coded folder IDs create several issues:

  • All files dump into one location regardless of context
  • Manual folder creation doesn’t scale
  • Changing folder structure requires workflow edits
  • No automatic organization by date, client, or category

Building the Check-Then-Create Pattern

The solution is a workflow pattern that:

  1. Determines what folder should exist (e.g., “Year/Month”)
  2. Searches Google Drive for that folder
  3. Creates the folder if it doesn’t exist
  4. Uploads the file to the folder (existing or newly created)

Here’s the complete workflow:

{
  "nodes": [
    {
      "parameters": {},
      "name": "Manual Trigger",
      "type": "n8n-nodes-base.manualTrigger",
      "position": [200, 300]
    },
    {
      "parameters": {
        "assignments": {
          "assignments": [
            {
              "name": "folderName",
              "value": "={{ $now.format('yyyy/MMMM') }}",
              "type": "string"
            },
            {
              "name": "yearFolder",
              "value": "={{ $now.format('yyyy') }}",
              "type": "string"
            },
            {
              "name": "monthFolder",
              "value": "={{ $now.format('MMMM') }}",
              "type": "string"
            }
          ]
        }
      },
      "name": "Set Folder Names",
      "type": "n8n-nodes-base.set",
      "position": [400, 300]
    },
    {
      "parameters": {
        "resource": "fileFolder",
        "operation": "search",
        "queryString": "=mimeType='application/vnd.google-apps.folder' and name='{{ $json.monthFolder }}' and trashed=false",
        "options": {
          "driveId": {
            "__rl": true,
            "mode": "list",
            "value": "MyDrive"
          }
        }
      },
      "name": "Search for Folder",
      "type": "n8n-nodes-base.googleDrive",
      "position": [600, 300],
      "alwaysOutputData": true
    },
    {
      "parameters": {
        "conditions": {
          "boolean": [
            {
              "value1": "={{ $json.id }}",
              "value2": ""
            }
          ]
        }
      },
      "name": "Folder Exists?",
      "type": "n8n-nodes-base.if",
      "position": [800, 300]
    },
    {
      "parameters": {
        "resource": "folder",
        "operation": "create",
        "name": "={{ $('Set Folder Names').item.json.monthFolder }}",
        "options": {}
      },
      "name": "Create Folder",
      "type": "n8n-nodes-base.googleDrive",
      "position": [1000, 400]
    },
    {
      "parameters": {
        "resource": "file",
        "operation": "upload",
        "name": "=invoice_{{ $now.format('yyyy-MM-dd') }}.pdf",
        "folderId": {
          "__rl": true,
          "mode": "id",
          "value": "={{ $json.id }}"
        }
      },
      "name": "Upload to Existing",
      "type": "n8n-nodes-base.googleDrive",
      "position": [1000, 200]
    },
    {
      "parameters": {
        "resource": "file",
        "operation": "upload",
        "name": "=invoice_{{ $now.format('yyyy-MM-dd') }}.pdf",
        "folderId": {
          "__rl": true,
          "mode": "id",
          "value": "={{ $json.id }}"
        }
      },
      "name": "Upload to New",
      "type": "n8n-nodes-base.googleDrive",
      "position": [1200, 400]
    }
  ]
}

Critical Setting: Always Output Data

Warning: The “Search for Folder” node must have “Always Output Data” enabled. Without this setting, the workflow stops when no folder is found, preventing folder creation.

This setting ensures the workflow continues executing even when the search returns empty results. Enable it in the node settings under “Settings” > “Always Output Data.”
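
When the search returns nothing, the pass-through item has no id field, so the "Folder Exists?" check only needs to test for it. One way to express that condition, shown here as a sketch rather than the only valid IF configuration:

// True when the search found a folder, false when it returned an empty item
{{ Boolean($json.id) }}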

Nested Folder Structures

For hierarchical organization (Year > Month), you need recursive folder creation. Each level requires its own search-and-create logic:

  1. Search for year folder
  2. Create year folder if missing
  3. Search for month folder within year folder
  4. Create month folder if missing
  5. Upload file to month folder

The key is using the parent folder ID from the previous step when creating child folders:

// Create month folder inside year folder
{
  "resource": "folder",
  "operation": "create",
  "name": "{{ $json.monthName }}",
  "options": {
    "parents": ["{{ $('Create Year Folder').item.json.id }}"]
  }
}

For complex folder logic, the IF node provides the conditional branching you need.

Putting It All Together

The complete workflow combines these elements:

  1. Form Trigger → Receives file uploads with a category field
  2. Set Node → Generates folder names from date expressions
  3. Search Year Folder → Checks if year folder exists (with “Always Output Data” enabled)
  4. IF Node → Routes based on folder existence
  5. Create Year Folder → Creates if missing
  6. Search Month Folder → Checks for month folder within year
  7. IF Node → Routes based on folder existence
  8. Create Month Folder → Creates if missing
  9. Code Node → Splits multiple files into individual items
  10. Loop Over Items → Processes uploads in batches
  11. Google Drive Upload → Uploads each file to the target folder

Ready-to-use template: The n8n template library has a similar workflow you can import directly and customize.

Bulk File Uploads: Processing Multiple Files

Single file uploads are straightforward. The complexity increases when handling multiple files, whether from form submissions, email attachments, or batch processing jobs.

Form-Triggered Bulk Uploads

n8n’s Form Trigger node accepts multiple file uploads. Each file arrives as a separate binary data property:

FILE_0: invoice1.pdf
FILE_1: invoice2.pdf
FILE_2: receipt.pdf

The challenge: uploading each file requires separate Google Drive node executions, but all files arrive in a single item.

Handling Multiple Binary Attachments

The solution uses a special n8n feature: the $binary reference. This object contains all binary data on the current item.

To iterate through multiple attachments:

  1. Extract binary keys to identify all files
  2. Split into separate items using a Code node
  3. Upload each item using Loop Over Items

Here’s the Code node that splits binary attachments into separate items:

const items = [];

// Get all binary keys from the input item
const binaryKeys = Object.keys($input.first().binary);

for (const key of binaryKeys) {
  items.push({
    json: {
      binaryKey: key,
      fileName: $input.first().binary[key].fileName,
      mimeType: $input.first().binary[key].mimeType
    },
    binary: {
      data: $input.first().binary[key]
    }
  });
}

return items;

This transforms one item with multiple binary properties into multiple items, each with a single data binary property that the Google Drive node expects.

For more Code node patterns, see our JavaScript in n8n guide.

Dynamic Binary Data Reference

When binary field names are unpredictable (like attachment_0, attachment_1), use dynamic expressions:

// Reference first binary key dynamically
{{ $binary.keys()[0] }}

// Get filename of first binary
{{ Object.values($binary)[0].fileName }}

This pattern handles any binary field name without hard-coding.

Loop Over Items for Controlled Uploads

After splitting files into separate items, use the Loop Over Items node to process uploads in batches:

{
  "parameters": {
    "batchSize": 5,
    "options": {}
  },
  "name": "Loop Over Items",
  "type": "n8n-nodes-base.splitInBatches",
  "position": [600, 300]
}

Why batch uploads?

  • Rate limiting: Google Drive API has usage quotas
  • Error isolation: One failed upload doesn’t break the entire batch
  • Memory management: Processing thousands of files at once exhausts memory

For deep coverage of batch processing patterns, read our n8n batch processing guide.

Advanced Pattern: Gmail to Google Drive

One of the most requested automations: automatically saving email attachments to organized Google Drive folders. This pattern combines Gmail triggers with dynamic folder creation.

Why This Pattern Matters

Every business receives important documents via email: invoices from vendors, contracts from clients, reports from partners. These attachments sit in inboxes until someone manually downloads and organizes them. That manual process creates bottlenecks:

  • Attachments get buried in email threads
  • Team members can’t find documents shared via email
  • No consistent organization across the team
  • Important files get lost when employees leave

Automating this flow ensures every attachment lands in the right folder immediately, searchable by anyone with access.

The Complete Workflow

{
  "nodes": [
    {
      "parameters": {
        "pollTimes": {
          "item": [{ "mode": "everyMinute" }]
        },
        "filters": {
          "q": "has:attachment"
        }
      },
      "name": "Gmail Trigger",
      "type": "n8n-nodes-base.gmailTrigger",
      "position": [250, 300]
    },
    {
      "parameters": {
        "assignments": {
          "assignments": [
            {
              "name": "senderName",
              "value": "={{ $json.from.value[0].name.replace(/[^a-zA-Z0-9]/g, '_') }}",
              "type": "string"
            },
            {
              "name": "datePath",
              "value": "={{ DateTime.fromISO($json.date).toFormat('yyyy/MM') }}",
              "type": "string"
            }
          ]
        }
      },
      "name": "Extract Metadata",
      "type": "n8n-nodes-base.set",
      "position": [450, 300]
    },
    {
      "parameters": {
        "jsCode": "const items = [];\nconst binaryKeys = Object.keys($input.first().binary || {});\n\nfor (const key of binaryKeys) {\n  items.push({\n    json: {\n      ....$input.first().json,\n      binaryKey: key,\n      fileName: $input.first().binary[key].fileName\n    },\n    binary: {\n      data: $input.first().binary[key]\n    }\n  });\n}\n\nreturn items.length > 0 ? items : $input.all();"
      },
      "name": "Split Attachments",
      "type": "n8n-nodes-base.code",
      "position": [650, 300]
    }
  ]
}

Organizing by Sender

Create folders based on who sent the email:

// Sanitize sender name for folder creation
{{ $json.from.value[0].name.replace(/[^a-zA-Z0-9]/g, '_') }}

This expression replaces special characters with underscores, keeping folder names clean and consistent.

Avoiding Duplicate Uploads

Prevent re-uploading the same attachments:

  1. Track message IDs in a database or Google Sheet
  2. Check before processing each new email
  3. Mark as processed after successful upload

A practical implementation stores processed message IDs in a Google Sheet:

// Check if message was already processed
const processedIds = $('Get Processed IDs').all().map(item => item.json.messageId);
const currentMessageId = $json.id;

if (processedIds.includes(currentMessageId)) {
  return []; // Skip this message
}

return $input.all();

After successful upload, add the message ID to your tracking sheet. This prevents duplicate processing even if the workflow runs multiple times on the same email.
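
A Code node after the upload can shape the row you append to that tracking sheet. A minimal sketch; the Gmail Trigger node name and the column names are assumptions to adapt:

// Build one tracking row per processed message (node and column names are hypothetical)
const messageId = $('Gmail Trigger').first().json.id;

return [{
  json: {
    messageId,
    processedAt: new Date().toISOString(),
    status: 'uploaded'
  }
}];

Feed this output into a Google Sheets append operation pointed at your tracking sheet.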

Filtering Specific Attachments

Not every attachment deserves a place in your Drive. Filter by file type, size, or sender:

// Gmail search query for specific criteria
from:[email protected] has:attachment filename:pdf

// Or filter in n8n after receiving
{{ $json.attachments.filter(a => a.mimeType === 'application/pdf') }}

Common filters include (a Code node sketch follows the list):

  • File type: Only PDFs, only images, only spreadsheets
  • Size threshold: Skip tiny signature images (< 10KB)
  • Sender domain: Only from specific vendors or partners
  • Subject keywords: Invoices, contracts, reports
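
Applied after the Split Attachments step, a Code node can enforce the file-type filter. A minimal sketch, assuming each item carries one file in the data binary property:

// Keep only PDF attachments; other items are dropped
return $input.all().filter(item => {
  const file = item.binary && item.binary.data;
  return Boolean(file) && file.mimeType === 'application/pdf';
});

The same pattern extends to size or sender checks by testing other fields on the item.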

For data transformation patterns that help with deduplication and filtering, see our data transformation guide.

Working with Shared Drives and Teams

Personal “My Drive” works for individual use. Team collaboration requires Shared Drives (formerly Team Drives).

Shared Drive vs My Drive

Aspect           | My Drive           | Shared Drive
Ownership        | Individual user    | Organization
Access           | Personal + shared  | Team members
File limit       | 500,000 per folder | 500,000 per folder
Trash behavior   | Owner controls     | Admins control
Service accounts | Limited            | Recommended

Permission Considerations

When uploading to Shared Drives:

  1. The authenticated account must have Contributor or higher access
  2. Parent folder ID must reference a location within the Shared Drive
  3. Set the driveId parameter to the Shared Drive ID
{
  "resource": "file",
  "operation": "upload",
  "driveId": {
    "__rl": true,
    "mode": "id",
    "value": "SHARED_DRIVE_ID"
  },
  "folderId": {
    "__rl": true,
    "mode": "id",
    "value": "FOLDER_ID_IN_SHARED_DRIVE"
  }
}

Service Account Limitations

According to Google’s documentation, service accounts cannot own files in a user’s My Drive. They must either:

  • Upload to Shared Drives
  • Use domain-wide delegation to act on behalf of users

If your automation runs without user interaction (scheduled jobs, webhooks), consider using a Shared Drive for reliable operation.

Finding Your Shared Drive ID

The Shared Drive ID appears in the URL when you open the drive:

https://drive.google.com/drive/folders/SHARED_DRIVE_ID_HERE

You can also retrieve it programmatically using the Google Drive node’s “List Drives” operation, which returns all Shared Drives accessible to the authenticated account.
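
Once that List Drives operation has run, downstream nodes can reference the result with an ordinary expression. A sketch, assuming the node is named "List Drives":

// ID of the first Shared Drive returned by the List operation
{{ $('List Drives').first().json.id }}

// Or select a drive by name ('Finance' is a placeholder)
{{ $('List Drives').all().find(d => d.json.name === 'Finance').json.id }}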

Folder Structure in Shared Drives

Shared Drives have their own root folder. When creating folder hierarchies:

  1. Use the Shared Drive ID as the initial parent
  2. Create top-level folders directly in the drive
  3. Build nested structures using folder IDs from previous operations

The same check-then-create pattern works, but you must specify the driveId parameter on every operation targeting the Shared Drive.

Error Handling and Reliability

Production workflows need robust error handling. Google Drive uploads can fail for various reasons, and your workflow should handle each gracefully.

Common Upload Failures

Error            | Cause              | Solution
403 Rate Limit   | Too many requests  | Add Wait node, reduce batch size
404 Not Found    | Invalid folder ID  | Verify folder exists, check permissions
400 Bad Request  | Invalid file name  | Sanitize names, remove special characters
413 Too Large    | File exceeds limit | Use resumable upload for large files
401 Unauthorized | Token expired      | Re-authenticate credentials

File Size Limits

Google Drive supports different upload methods based on file size:

Method           | Max Size | Best For
Simple upload    | 5 MB     | Small files, quick uploads
Multipart upload | 5 MB     | Small files with metadata
Resumable upload | 5 TB     | Large files, unreliable networks

n8n’s Google Drive node handles this automatically for most cases. For very large files (100MB+), consider chunked processing or direct API calls.

For more details, see Google’s upload documentation.

Retry Patterns for Failed Uploads

Implement retry logic using the Error Trigger node:

{
  "nodes": [
    {
      "parameters": {},
      "name": "Error Trigger",
      "type": "n8n-nodes-base.errorTrigger",
      "position": [250, 300]
    },
    {
      "parameters": {
        "unit": "seconds",
        "value": 30
      },
      "name": "Wait",
      "type": "n8n-nodes-base.wait",
      "position": [450, 300]
    },
    {
      "parameters": {
        "conditions": {
          "number": [
            {
              "value1": "={{ $json.retryCount || 0 }}",
              "operation": "smallerEqual",
              "value2": 3
            }
          ]
        }
      },
      "name": "Retry Limit?",
      "type": "n8n-nodes-base.if",
      "position": [650, 300]
    }
  ]
}
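
The "Retry Limit?" check reads a retryCount value, so something has to increment it before the upload runs again. A minimal Code node sketch for the retry branch, assuming the counter travels with the retried data:

// Increment the retry counter carried on each item before re-attempting the upload
return $input.all().map(item => ({
  json: {
    ...item.json,
    retryCount: (item.json.retryCount || 0) + 1
  }
}));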

For comprehensive rate limit handling, see our API rate limits guide.

Logging Upload Results

Track successful and failed uploads for debugging:

  1. Use the Execution Data node to store upload results
  2. Send failure notifications via email or Slack
  3. Log to external services for monitoring

A practical logging pattern captures upload outcomes:

// After successful upload
{
  "timestamp": "{{ $now.toISO() }}",
  "fileName": "{{ $json.name }}",
  "fileId": "{{ $json.id }}",
  "folderId": "{{ $json.parents[0] }}",
  "status": "success",
  "size": "{{ $json.size }}"
}

// After failed upload (in error workflow)
{
  "timestamp": "{{ $now.toISO() }}",
  "fileName": "{{ $json.fileName }}",
  "error": "{{ $json.error.message }}",
  "status": "failed"
}

Store these logs in Google Sheets, a database, or send to monitoring services like Datadog or PagerDuty for alerting.

Our n8n logging guide covers comprehensive monitoring strategies.

Handling Folder Name Conflicts

Google Drive allows multiple folders with identical names in the same location. This can cause confusion when your workflow searches for a folder:

// This query might return multiple results
mimeType='application/vnd.google-apps.folder' and name='Invoices' and trashed=false

Solutions:

  1. Use the first result: Accept that uploads go to whichever “Invoices” folder was created first
  2. Add uniqueness: Include dates or IDs in folder names (e.g., “Invoices_Q1” or “Invoices_ProjectName”)
  3. Specify parent: Add 'PARENT_FOLDER_ID' in parents to the search query

The safest approach combines unique naming with parent folder constraints.
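
Combining options 2 and 3, a parent-constrained version of the earlier query looks like this (replace PARENT_FOLDER_ID with the ID of the folder that should contain "Invoices"):

mimeType='application/vnd.google-apps.folder' and name='Invoices' and 'PARENT_FOLDER_ID' in parents and trashed=false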

Performance Optimization

Large-scale file operations require careful resource management. Without optimization, you’ll encounter memory errors, timeouts, and rate limits.

Batch Size Recommendations

Scenario               | Recommended Batch Size
Small files (< 1 MB)   | 10-20 per batch
Medium files (1-10 MB) | 5-10 per batch
Large files (> 10 MB)  | 1-3 per batch
Rate-limited API       | 1 per batch with Wait

When to Use Wait Nodes

Insert Wait nodes between batches when:

  • Google Drive returns 429 (rate limit) errors
  • Processing hundreds of files in sequence
  • External triggers fire faster than uploads complete
{
  "parameters": {
    "unit": "seconds",
    "value": 2
  },
  "name": "Wait",
  "type": "n8n-nodes-base.wait"
}

Memory Management for Large Files

Large files consume memory during processing. Strategies to reduce memory pressure:

  1. Process files individually rather than loading all into memory
  2. Use streaming where possible (direct HTTP to Google Drive)
  3. Clean up binary data after upload using an Edit Fields node (see the sketch after this list)
  4. Increase n8n memory if self-hosting: NODE_OPTIONS=--max-old-space-size=4096
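
The cleanup from point 3 can also be done with a short Code node that runs right after the upload. A minimal sketch:

// Drop binary payloads once the upload has succeeded, keeping only the JSON metadata
return $input.all().map(item => ({ json: item.json }));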

For comprehensive batch processing strategies, see our batch processing guide.

Real-World Use Cases

Understanding the patterns is one thing. Seeing how they combine in production scenarios shows their true power.

Invoice Processing Pipeline

A common accounting workflow:

  1. Trigger: Gmail receives invoice from *@vendor.com
  2. Extract: Pull PDF attachments from email
  3. Organize: Create folder structure: Invoices/Year/Month/VendorName
  4. Upload: Save PDF to organized location
  5. Notify: Send Slack message with link to uploaded file
  6. Track: Log invoice in Google Sheet for reconciliation

This pipeline handles 50+ invoices daily without manual intervention. The folder structure makes tax season audits straightforward.

Client Document Portal

Professional services firms use this pattern:

  1. Trigger: Form submission with client name and documents
  2. Validate: Check file types (only allow PDF, DOCX, XLSX)
  3. Create: Build client folder if first submission
  4. Upload: Save documents with standardized naming
  5. Share: Set appropriate permissions for client access
  6. Confirm: Send email with document links to client

Each client gets their own folder automatically. New clients don’t require manual folder creation.

Marketing Asset Library

Marketing teams organize campaign materials:

  1. Trigger: Webhook from design tool (Figma, Canva)
  2. Categorize: Determine asset type from metadata
  3. Organize: Route to appropriate folder (Social/Blog/Ads/Email)
  4. Upload: Save with campaign-specific naming
  5. Index: Add to asset database with searchable tags

This keeps creative assets organized as campaigns launch, preventing the “where’s that image?” scramble.

Professional Help

Building reliable Google Drive automations takes time to get right. If you need:

  • Custom workflow development for complex file management
  • Integration with existing systems
  • Production-ready error handling and monitoring

Our n8n workflow development service delivers production-ready automations. For troubleshooting existing workflows, use our free workflow debugger.

Frequently Asked Questions

How do I upload files to a dynamically generated path like Year/Month?

Use the check-then-create pattern. Set folder names using date expressions like {{ $now.format('yyyy/MM') }}, then follow this sequence:

  1. Search for the year folder
  2. Create if missing, capture the folder ID
  3. Search for the month folder within the year
  4. Create if missing
  5. Upload to the final destination

Critical: Enable “Always Output Data” on search nodes. Without this setting, your workflow stops when folders don’t exist instead of creating them.


Why does only one file upload when I have multiple attachments?

The Google Drive node processes one binary field per execution. Multiple attachments arrive as separate binary properties (attachment_0, attachment_1, etc.) on a single item, but the upload node only reads the field you specify.

The solution: Use a Code node to split attachments into separate items, where each item contains one file in the data binary property. Then process each item with Loop Over Items.

The Code example in the Bulk File Uploads section provides the exact pattern you need.


Can I upload to Google Drive Shared Drives with n8n?

Yes. Set the driveId parameter to your Shared Drive ID and ensure the authenticated account has Contributor access or higher.

Key considerations:

  • Find your Shared Drive ID in the URL when viewing the drive
  • Service accounts work better with Shared Drives than My Drive
  • Every operation targeting the Shared Drive needs the driveId parameter

How do I reference the folder ID from a previous node?

Use n8n expressions to reference output from earlier nodes. The syntax depends on your workflow structure:

  • From a named node: {{ $('Create Folder').item.json.id }}
  • From the previous node: {{ $json.id }}
  • From a specific item: {{ $('Search Folder').first().json.id }}

The expression editor shows available fields from previous nodes. Click the gear icon next to any field to explore what data is available.


What’s the maximum file size I can upload through n8n?

Google Drive supports up to 5 TB, but n8n’s practical limits are lower due to memory constraints.

Guidelines by file size:

  • Under 100 MB: Works reliably with default settings
  • 100 MB to 500 MB: Process files individually, not in batches
  • Over 500 MB: Increase n8n memory with NODE_OPTIONS=--max-old-space-size=4096
  • Multi-GB files: Consider direct API integration with chunked/resumable uploads

For self-hosted instances, memory allocation is the primary constraint. Cloud-hosted n8n may have additional limits based on your plan.
