Data Processing Nodes
Transform, parse, filter, and manipulate data as it flows through your workflows
Data processing nodes are your data transformers. They reshape, parse, filter, encrypt, and manipulate data as it flows through your workflow. Let's build some data processing muscle.
Data Transform Node
When to use: When you need to reshape or restructure data—combining fields, changing field names, or computing new values.
The Data Transform Node uses expressions to map source data to a new structure. It's perfect for transforming API responses to match your database schema, or flattening nested objects.
Configuration
transform:
name: "{{ input.first_name }} {{ input.last_name }}"
email: "{{ lower(input.email) }}"
tags: "{{ input.categories | join(',') }}"
Example 1: Normalize User Data
Incoming data from sign-up form:
{
"firstName": "Alice",
"lastName": "Johnson",
"emailAddress": "Alice.Johnson@EXAMPLE.COM",
"interests": ["tech", "design", "coffee"]
}
Transform Node Configuration:
transform:
first_name: "{{ input.firstName }}"
last_name: "{{ input.lastName }}"
email: "{{ lower(input.emailAddress) }}"
interests: "{{ input.interests | join(', ') }}"
signup_date: "{{ now() }}"
status: "active"
Output:
{
"first_name": "Alice",
"last_name": "Johnson",
"email": "alice.johnson@example.com",
"interests": "tech, design, coffee",
"signup_date": "2025-02-10T14:30:00Z",
"status": "active"
}
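Under the hood, this mapping is just field-by-field assignment. A minimal Python sketch of the same normalization (field names taken from the example; the dynamic now() timestamp is omitted so the result is deterministic):

```python
# Hypothetical Python equivalent of the Transform Node mapping above.
signup = {
    "firstName": "Alice",
    "lastName": "Johnson",
    "emailAddress": "Alice.Johnson@EXAMPLE.COM",
    "interests": ["tech", "design", "coffee"],
}

normalized = {
    "first_name": signup["firstName"],
    "last_name": signup["lastName"],
    "email": signup["emailAddress"].lower(),       # lower() expression
    "interests": ", ".join(signup["interests"]),   # join(', ') filter
    "status": "active",                            # literal value
}
```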
Example 2: Transform E-Commerce API Response
Incoming data from product API:
{
"id": "PROD-123",
"title": "Blue Sneakers",
"original_price": 129.99,
"discount_percent": 20,
"in_stock": true
}
Transform Node Configuration:
transform:
product_id: "{{ input.id }}"
name: "{{ input.title | upper }}"
original_price: "{{ input.original_price }}"
discount: "{{ (input.original_price * input.discount_percent / 100) | round(2) }}"
final_price: "{{ (input.original_price * (1 - input.discount_percent / 100)) | round(2) }}"
available: "{{ input.in_stock }}"
Output:
{
"product_id": "PROD-123",
"name": "BLUE SNEAKERS",
"original_price": 129.99,
"discount": 26.00,
"final_price": 103.99,
"available": true
}
Tip: Use the expression system for powerful transformations. The pipe character (|) chains operations: upper, lower, round(2), join(','), and so on.
Set Node
When to use: When you want to create or update variables that persist through your workflow.
The Set Node lets you define variables with computed values. These variables are available to all downstream nodes.
Configuration
variables:
- name: "total"
value: "{{ input.price * input.quantity }}"
- name: "status"
value: "processed"
Example 1: Compute Order Total
Incoming order data:
{
"items": [
{ "name": "Shirt", "price": 25, "quantity": 2 },
{ "name": "Pants", "price": 60, "quantity": 1 }
],
"tax_rate": 0.08,
"coupon_discount": 10
}
Set Node Configuration:
variables:
- name: "subtotal"
value: "{{ input.items | sum('price * quantity') }}"
- name: "tax"
value: "{{ (subtotal * input.tax_rate) | round(2) }}"
- name: "final_total"
value: "{{ (subtotal + tax - input.coupon_discount) | round(2) }}"
- name: "processing_timestamp"
value: "{{ now() }}"
- name: "order_status"
value: "ready_to_ship"
These variables are now available to all downstream nodes as {{ subtotal }}, {{ tax }}, etc.
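The arithmetic above can be checked by hand. A short Python sketch of the same calculation chain, using the example order data (the now() timestamp is left out so the result is deterministic):

```python
# Hypothetical Python equivalent of the Set Node calculations above.
order = {
    "items": [
        {"name": "Shirt", "price": 25, "quantity": 2},
        {"name": "Pants", "price": 60, "quantity": 1},
    ],
    "tax_rate": 0.08,
    "coupon_discount": 10,
}

# Each line mirrors one Set Node variable.
subtotal = sum(i["price"] * i["quantity"] for i in order["items"])   # 25*2 + 60*1
tax = round(subtotal * order["tax_rate"], 2)
final_total = round(subtotal + tax - order["coupon_discount"], 2)
```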
Example 2: User Engagement Score
variables:
- name: "page_views_score"
value: "{{ input.page_views * 5 }}"
- name: "time_on_site_score"
value: "{{ input.minutes_on_site / 10 | round(0) }}"
- name: "engagement_score"
value: "{{ input.page_views_score + input.time_on_site_score }}"
- name: "is_highly_engaged"
value: "{{ input.engagement_score > 100 }}"
Tip: Use Set nodes to break down complex calculations into readable steps.
Data Parser Node
When to use: When you have raw data in a structured format (JSON, CSV, XML, YAML) that needs to be converted to objects.
The Data Parser Node converts strings into usable data structures. Perfect for parsing CSV uploads, XML responses, or YAML configs.
Configuration
format: json | xml | csv | yaml | toml
options:
csv_delimiter: ","
csv_skip_rows: 1 # Skip headers
xml_array_tags: ["item"]
Example 1: Parse CSV Upload
Raw CSV data:
name,email,phone,signup_date
Alice Johnson,alice@example.com,555-0101,2025-01-15
Bob Smith,bob@example.com,555-0102,2025-01-16
Charlie Davis,charlie@example.com,555-0103,2025-01-17
Data Parser Configuration:
format: csv
options:
csv_delimiter: ","
csv_skip_rows: 1
Output:
[
{ "name": "Alice Johnson", "email": "alice@example.com", "phone": "555-0101", "signup_date": "2025-01-15" },
{ "name": "Bob Smith", "email": "bob@example.com", "phone": "555-0102", "signup_date": "2025-01-16" },
{ "name": "Charlie Davis", "email": "charlie@example.com", "phone": "555-0103", "signup_date": "2025-01-17" }
]
Connect this to a Loop Node to process each row!
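For reference, the parsing behavior shown above matches what Python's standard csv module does with the same input, which makes a handy way to sanity-check expected output:

```python
import csv
import io

raw = """name,email,phone,signup_date
Alice Johnson,alice@example.com,555-0101,2025-01-15
Bob Smith,bob@example.com,555-0102,2025-01-16
Charlie Davis,charlie@example.com,555-0103,2025-01-17"""

# DictReader consumes the header row and yields one dict per data row,
# mirroring the parser output above.
rows = list(csv.DictReader(io.StringIO(raw), delimiter=","))
```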
Example 2: Parse XML Response
Raw XML data:
<products>
<product>
<id>PROD-001</id>
<name>Widget</name>
<price>19.99</price>
</product>
<product>
<id>PROD-002</id>
<name>Gadget</name>
<price>39.99</price>
</product>
</products>
Data Parser Configuration:
format: xml
options:
xml_array_tags: ["product"]
Output:
{
"products": [
{ "id": "PROD-001", "name": "Widget", "price": "19.99" },
{ "id": "PROD-002", "name": "Gadget", "price": "39.99" }
]
}
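Note that XML has no native types, so prices come out as strings (convert them downstream if you need numbers). A Python sketch of the same conversion using the standard library:

```python
import xml.etree.ElementTree as ET

raw = """<products>
  <product><id>PROD-001</id><name>Widget</name><price>19.99</price></product>
  <product><id>PROD-002</id><name>Gadget</name><price>39.99</price></product>
</products>"""

root = ET.fromstring(raw)
# Collect each <product> element into a dict; values remain strings,
# just as in the parser output above.
products = [{child.tag: child.text for child in p} for p in root.findall("product")]
```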
Example 3: Parse JSON with Nesting
format: json
options:
flatten: true # Flatten nested objects
Note: Always validate that incoming data is well-formed before connecting downstream nodes. Invalid CSV or malformed JSON will cause the parser to error.
Text Processing Node
When to use: When you need to manipulate text—regex extraction, sentiment analysis, template rendering, or text comparisons.
The Text Processing Node handles various text operations beyond simple formatting.
Configuration
operation: regex | sentiment | diff | tokenize | template
pattern: "\\d{3}-\\d{4}" # For regex
template: "Hello, {{name}}!" # For template
Example 1: Extract Phone Numbers (Regex)
Incoming text:
Please call 555-0101 or email alice@example.com. Emergency: 555-9999
Text Processing Configuration:
operation: regex
pattern: "\\d{3}-\\d{4}"
Output:
{
"matches": ["555-0101", "555-9999"],
"count": 2
}
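The same extraction in plain Python, using the identical pattern, confirms the expected output:

```python
import re

text = "Please call 555-0101 or email alice@example.com. Emergency: 555-9999"

# findall returns every non-overlapping match, in order of appearance.
matches = re.findall(r"\d{3}-\d{4}", text)
result = {"matches": matches, "count": len(matches)}
```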
Example 2: Render Email Template
Incoming data:
{
"customer_name": "Alice",
"order_id": "ORD-123",
"total": 99.99
}
Text Processing Configuration:
operation: template
template: |
Dear {{customer_name}},
Thank you for your order {{order_id}}!
Your total is ${{total}}.
Best regards,
The Team
Output:
Dear Alice,
Thank you for your order ORD-123!
Your total is $99.99.
Best regards,
The Team
Example 3: Sentiment Analysis
Incoming text:
I absolutely love this product! It's amazing and worth every penny!
Text Processing Configuration:
operation: sentiment
Output:
{
"sentiment": "positive",
"score": 0.95,
"magnitude": 0.8
}
Perfect for categorizing customer reviews or feedback!
DateTime Operations Node
When to use: When you need to parse, format, calculate, or manipulate dates and times.
The DateTime Operations Node handles all your temporal needs—parsing date strings, formatting for display, calculating durations, etc.
Configuration
operation: parse | format | add | subtract | diff | convert_timezone
input_format: "yyyy-MM-dd"
output_format: "MMMM d, yyyy"
timezone: "America/New_York"
Example 1: Parse and Format Date
Incoming data:
{
"created_at": "2025-01-15T14:30:00Z"
}
DateTime Configuration:
operation: format
input_format: "yyyy-MM-ddThh:mm:ssZ"
output_format: "MMMM d, yyyy 'at' h:mm a"
Output:
{
"formatted_date": "January 15, 2025 at 2:30 PM"
}
Example 2: Add Days to Date (Expiration Calculation)
Incoming data:
{
"order_date": "2025-02-10",
"return_window_days": 30
}
DateTime Configuration:
operation: add
input_format: "yyyy-MM-dd"
output_format: "yyyy-MM-dd"
unit: days
amount: "{{ input.return_window_days }}"
Output:
{
"return_deadline": "2025-03-12"
}
Perfect for calculating trial expiration dates, return deadlines, etc.
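The date arithmetic is straightforward to verify. In Python, the same calculation with datetime.timedelta:

```python
from datetime import date, timedelta

# Feb 10 + 30 days lands on Mar 12 (2025 is not a leap year).
order_date = date.fromisoformat("2025-02-10")
return_deadline = order_date + timedelta(days=30)
```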
Example 3: Calculate Age (Date Difference)
Incoming data:
{
"birth_date": "1990-05-15",
"current_date": "2025-02-10"
}
DateTime Configuration:
operation: diff
input_format: "yyyy-MM-dd"
unit: years
Output:
{
"age": 34
}
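A year-based diff has one subtlety: subtract a year when this year's birthday hasn't happened yet. A Python sketch of that rule:

```python
from datetime import date

birth = date.fromisoformat("1990-05-15")
today = date.fromisoformat("2025-02-10")

# Tuple comparison handles the "birthday not yet reached" case:
# (2, 10) < (5, 15) is True, so one year is subtracted.
age = today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))
```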
Example 4: Convert Timezone
Incoming data:
{
"meeting_time": "2025-02-15 10:00:00",
"timezone": "America/Los_Angeles"
}
DateTime Configuration:
operation: convert_timezone
from_timezone: "America/Los_Angeles"
to_timezone: "Europe/London"
output_format: "yyyy-MM-dd HH:mm:ss"
Output:
{
"converted_time": "2025-02-15 18:00:00"
}
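The eight-hour offset checks out: on February 15, Los Angeles is on PST (UTC-8) and London is on GMT (UTC+0). In Python, the same conversion with the standard zoneinfo module (Python 3.9+; requires IANA time zone data on the host):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

meeting = datetime(2025, 2, 15, 10, 0, tzinfo=ZoneInfo("America/Los_Angeles"))
london = meeting.astimezone(ZoneInfo("Europe/London"))
converted = london.strftime("%Y-%m-%d %H:%M:%S")
```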
Tip: Always be explicit about timezones when working with dates. Time zones cause more bugs than any other date feature!
Encryption & Hashing Node
When to use: When you need to encrypt sensitive data, hash passwords, or sign messages for security.
The Encryption & Hashing Node provides cryptographic operations. Use it for PII (personally identifiable information), payment data, or any sensitive information.
Configuration
operation: encrypt | decrypt | hash | sign | verify | encode | decode
algorithm: aes-256-gcm | sha256 | rsa | hmac
key_reference: "credential_id"
Example 1: Hash Password
Incoming registration data:
{
"email": "alice@example.com",
"password": "MySecurePassword123!"
}
Encryption Configuration:
operation: hash
algorithm: sha256
Output:
{
"email": "alice@example.com",
"password_hash": "a1b2c3d4e5f6g7h8..."
}
Store the hash in your database, never the plain password! For production password storage, prefer a dedicated password-hashing algorithm (bcrypt, scrypt, or Argon2): a single fast SHA-256 pass is much easier to brute-force.
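For reference, SHA-256 produces a fixed 64-character hexadecimal digest regardless of input length, as this Python sketch shows:

```python
import hashlib

password = "MySecurePassword123!"

# SHA-256 always yields a 256-bit digest, 64 hex characters.
password_hash = hashlib.sha256(password.encode("utf-8")).hexdigest()
```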
Example 2: Encrypt Sensitive Data
Incoming user data:
{
"name": "Alice Johnson",
"ssn": "123-45-6789"
}
Encryption Configuration:
operation: encrypt
algorithm: aes-256-gcm
key_reference: "encryption_key_prod"
Output:
{
"name": "Alice Johnson",
"ssn_encrypted": "xK9mQ2Lp8nR5..."
}
Decrypt when you need to display or process the data.
Example 3: Sign Data for API Request
Outgoing webhook data:
{
"event": "order.created",
"order_id": "ORD-123",
"timestamp": 1707567000
}
Encryption Configuration:
operation: sign
algorithm: hmac
key_reference: "webhook_secret"
Output:
{
"event": "order.created",
"order_id": "ORD-123",
"timestamp": 1707567000,
"signature": "2Dh5Kx9pL7mQ..."
}
The receiving webhook endpoint can verify the signature to confirm the message came from you.
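The sign/verify round trip can be sketched in a few lines of Python with the standard hmac module (the secret here is an inline stand-in; the node reads it from the credential vault):

```python
import hashlib
import hmac
import json

payload = {"event": "order.created", "order_id": "ORD-123", "timestamp": 1707567000}
secret = b"webhook_secret"  # stand-in; never hardcode real secrets

# Sign a canonical JSON encoding so sender and receiver hash identical bytes.
body = json.dumps(payload, separators=(",", ":"), sort_keys=True).encode("utf-8")
signature = hmac.new(secret, body, hashlib.sha256).hexdigest()

def verify(body: bytes, signature: str) -> bool:
    # Receiver recomputes the HMAC and compares in constant time.
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```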
Warning: Never hardcode encryption keys! Use credential references instead. Keys are stored securely in your DeepChain vault.
Cache Storage Node
When to use: When you want to cache expensive operations (API calls, database queries) to speed up subsequent runs.
The Cache Storage Node provides key-value caching with optional TTL (time-to-live). Great for reducing API calls and improving performance.
Configuration
operation: get | set | delete | clear
key: "user_{{ input.user_id }}"
ttl: 3600 # Seconds (1 hour)
Example 1: Cache User Profile
First request—API call is expensive:
{
"user_id": "USER-123"
}
Cache Node Configuration (first run):
operation: set
key: "user_profile_{{ input.user_id }}"
ttl: 3600
Stores the user profile in cache for 1 hour.
Subsequent requests (within 1 hour):
operation: get
key: "user_profile_{{ input.user_id }}"
Returns cached result instantly—no API call needed!
Example 2: Cache API Rate Limit Status
operation: set
key: "api_rate_limit_{{ input.api_key }}"
ttl: 60 # 1 minute cache
Prevent checking rate limits on every call. Cache the status for 1 minute.
Example 3: Clear Cache on Update
When user profile is updated:
operation: delete
key: "user_profile_{{ input.user_id }}"
Next request will fetch fresh data from the API.
Tip: Use cache for read-only data that doesn't change frequently. For real-time data, use shorter TTLs (10-60 seconds).
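The get/set/delete semantics above boil down to a key-value store with expiry timestamps. A minimal in-memory sketch in Python (illustrative only; the node's cache is managed for you):

```python
import time

_cache: dict = {}

def cache_set(key, value, ttl):
    # Store the value with an absolute expiry time.
    _cache[key] = (value, time.monotonic() + ttl)

def cache_get(key):
    entry = _cache.get(key)
    if entry is None:
        return None          # cache miss
    value, expires = entry
    if time.monotonic() >= expires:
        del _cache[key]      # expired: evict and report a miss
        return None
    return value

def cache_delete(key):
    _cache.pop(key, None)

cache_set("user_profile_USER-123", {"name": "Alice"}, ttl=3600)
profile = cache_get("user_profile_USER-123")   # hit within the TTL
cache_delete("user_profile_USER-123")
missing = cache_get("user_profile_USER-123")   # miss after explicit delete
```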
Common Data Processing Patterns
Pattern 1: Normalize and Validate
Start
↓
Data Transform (reshape incoming data)
↓
Data Parser (if it's raw CSV/JSON string)
↓
Set Node (compute derived fields)
↓
Code Node (custom validation)
↓
Database Insert
Pattern 2: Cache Expensive Operations
Start
↓
Cache Node (get)
├─ Cache hit → continue downstream
└─ Cache miss ↓
HTTP Request (fetch from API)
↓
Cache Node (set)
↓
Continue
Pattern 3: Parse and Loop
Start (with raw CSV string)
↓
Data Parser (parse CSV to array)
↓
Loop (for each row)
├─ Transform (normalize)
├─ Database (insert)
└─ Next row
↓
End
Next Steps
- Need to make decisions? Check out Control Flow Nodes
- Want to call APIs? See Communication Nodes
- Need AI assistance? Visit AI & Intelligence Nodes
- Working with databases? Explore Database Nodes
- Writing complex logic? Try Code Node