AWS Connector
Integrate with Amazon Web Services: S3, Lambda, SQS, SNS, and DynamoDB
Harness the power of Amazon Web Services in your workflows. The AWS connector lets you work with S3 buckets, invoke Lambda functions, queue messages with SQS, publish events to SNS, and read or write DynamoDB items, all without leaving DeepChain.
Overview
The AWS connector gives you access to core AWS services with 17 operations across five services: S3, Lambda, SQS, SNS, and DynamoDB. Whether you're uploading files to S3, processing data with Lambda, or queuing work with SQS, we've got a node for that.
Authentication
You have two options for authenticating with AWS. Pick whichever fits your setup:
Option 1: IAM User Credentials (Simple)
Use an IAM user's access key and secret key. This is great for development or simple integrations:
auth_type: iam
access_key_id: "AKIAIOSFODNN7EXAMPLE"
secret_access_key: "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
region: "us-east-1"
Tip: Create a dedicated IAM user with minimal permissions (least privilege). Attach only the policies the connector actually needs.
Option 2: IAM Role (Recommended for Production)
If you're running DeepChain on EC2 or using Lambda, use an IAM role. It's more secure because you don't need to manage keys:
auth_type: role
role_arn: "arn:aws:iam::123456789012:role/MyRole"
region: "us-east-1"
Available Operations
Here's what you can do with each AWS service:
S3 (File Storage)
Perfect for uploading reports, exporting data, or storing files:
| Operation | What It Does |
|---|---|
| listBuckets | List all your S3 buckets |
| listObjects | See files in a bucket |
| getObject | Download a file from S3 |
| putObject | Upload a file to S3 |
| deleteObject | Remove a file |
| copyObject | Copy files between buckets |
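For example, a minimal getObject node looks like this (the node id, bucket, and key are placeholders, not required names):

- id: fetch_report
  type: aws_connector
  config:
    operation: getObject
    service: s3
    # Placeholder bucket and key; point these at a real object
    bucket: "my-reports"
    key: "daily-reports/2025-02-10/report.json"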
Lambda (Serverless Compute)
Run custom code on demand without managing servers:
| Operation | What It Does |
|---|---|
| invokeFunction | Trigger a Lambda function and wait for the result |
| listFunctions | See all your Lambda functions |
SQS (Message Queue)
Decouple parts of your system by queuing messages:
| Operation | What It Does |
|---|---|
| sendMessage | Add a message to a queue |
| receiveMessage | Pull messages from a queue |
| deleteMessage | Remove a message after processing |
SNS (Publish/Subscribe)
Broadcast events to subscribers (emails, SMS, HTTP endpoints, etc.):
| Operation | What It Does |
|---|---|
| publish | Send a message to a topic |
| listTopics | List all your SNS topics |
DynamoDB (NoSQL Database)
Store and retrieve data in a serverless database:
| Operation | What It Does |
|---|---|
| getItem | Fetch a single item by key |
| putItem | Create or update an item |
| query | Find items matching a key condition |
| scan | Browse all items in a table |
Practical Examples
Example 1: Upload a Report to S3
Save a daily report to S3 with a date-stamped filename:
- id: upload_report
  type: aws_connector
  config:
    operation: putObject
    service: s3
    bucket: "my-reports"
    key: "daily-reports/{{ formatDate(now(), 'yyyy-MM-dd') }}/report.json"
    body: "{{ json_stringify(transform_1.output) }}"
    contentType: "application/json"
After this node runs, you can access the file at:
s3://my-reports/daily-reports/2025-02-10/report.json
Example 2: Process Data with Lambda
Invoke a Lambda function to do heavy lifting (image processing, ML predictions, etc.):
- id: process_image
  type: aws_connector
  config:
    operation: invokeFunction
    service: lambda
    functionName: "image-processor"
    payload:
      imageUrl: "{{ input.image_url }}"
      format: "{{ input.desired_format }}"
      quality: 85
The response will contain whatever your Lambda function returns.
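For instance, a downstream node could read a field from that result. The response.Payload path here is an assumption about how the connector exposes the parsed return value, and resizedUrl is a hypothetical field; check the node's output schema in your workspace:

- id: use_result
  type: transform
  config:
    # Assumes the invocation result is exposed as response.Payload and
    # that the function returned a resizedUrl field (hypothetical)
    expression: "{{ process_image.response.Payload.resizedUrl }}"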
Example 3: Queue a Task with SQS
Send a task to a queue so another service can pick it up later:
- id: queue_job
  type: aws_connector
  config:
    operation: sendMessage
    service: sqs
    queueUrl: "https://sqs.us-east-1.amazonaws.com/123456789012/task-queue"
    messageBody: "{{ json_stringify({
      jobId: input.job_id,
      userId: input.user_id,
      data: input.payload
    }) }}"
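If the target is a FIFO queue, SQS also requires a message group ID (and, unless content-based deduplication is enabled on the queue, a deduplication ID). The field names below are assumptions about how the connector surfaces these SQS parameters:

- id: queue_job_fifo
  type: aws_connector
  config:
    operation: sendMessage
    service: sqs
    queueUrl: "https://sqs.us-east-1.amazonaws.com/123456789012/task-queue.fifo"
    messageBody: "{{ json_stringify({ jobId: input.job_id }) }}"
    # Assumed field names; FIFO queues process messages with the same
    # group ID in order
    messageGroupId: "{{ input.user_id }}"
    messageDeduplicationId: "{{ input.job_id }}"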
Example 4: Publish an Event to SNS
Alert multiple subscribers about an important event:
- id: notify_subscribers
  type: aws_connector
  config:
    operation: publish
    service: sns
    topicArn: "arn:aws:sns:us-east-1:123456789012:important-events"
    message: "Order {{ input.order_id }} has been shipped!"
    subject: "Order Shipped"
Subscribers could include email addresses, phone numbers (SMS), or HTTP endpoints—all triggered by one publish.
Example 5: Fetch Data from DynamoDB
Retrieve an item from a NoSQL table:
- id: get_user
  type: aws_connector
  config:
    operation: getItem
    service: dynamodb
    table: "users"
    key:
      userId: "{{ input.user_id }}"
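To fetch multiple items that share a partition key, use the query operation instead. The keyCondition and values fields below are assumptions about how the connector maps onto DynamoDB's KeyConditionExpression, so adjust them to match the node's actual schema:

- id: get_user_orders
  type: aws_connector
  config:
    operation: query
    service: dynamodb
    table: "orders"
    # Assumed mapping onto DynamoDB's KeyConditionExpression
    keyCondition: "userId = :uid"
    values:
      ":uid": "{{ input.user_id }}"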
Rate Limits & Quotas
AWS services have different limits depending on your account type and service. Here's what to expect:
- S3: 3,500 PUT/COPY/POST/DELETE operations per second per prefix, 5,500 GET/HEAD per second
- Lambda: 1,000 concurrent executions (default; can be increased)
- SQS: Nearly unlimited throughput for standard queues (FIFO queues are limited)
- DynamoDB: Depends on your capacity mode (on-demand scales automatically)
Note: DeepChain automatically handles retries and backoff, so you rarely need to worry about hitting these limits. If a node does get throttled, you can tune its retry behavior, as sketched below.
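A possible per-node override follows; the retry block and its field names are assumptions, not confirmed connector options, so check the node reference for the exact schema:

- id: upload_report
  type: aws_connector
  config:
    operation: putObject
    service: s3
    bucket: "my-reports"
    key: "report.json"
    body: "{{ input.report }}"
  # Assumed retry options: up to 5 attempts with exponential backoff
  retry:
    maxAttempts: 5
    backoff: exponential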
Error Handling
Things can go wrong—let's talk about what to do when they do:
Common AWS Errors
| Error | What It Means | How to Fix |
|---|---|---|
| AccessDenied | Your IAM user/role doesn't have permission | Check the IAM policy and add the required permissions |
| NoSuchBucket | The bucket doesn't exist | Verify the bucket name (typos are common!) |
| ResourceNotFoundException | Resource not found (function, table, etc.) | Check that the resource exists in the region you specified |
| ThrottlingException | You're exceeding the rate limit | Slow down or request a quota increase from AWS |
| InvalidParameterException | Bad data sent to the operation | Check that field names and data types match the operation |
Debugging
Enable debug logging to see the exact request/response:
Node Configuration:
debug: true
logRequest: true
logResponse: true
This will show you what AWS received and what it returned, which usually reveals the issue right away.
Permission Errors?
If you get AccessDenied, your IAM user/role needs the right permissions. Here's a minimal policy to get started:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:ListBucket",
        "lambda:InvokeFunction",
        "sqs:SendMessage",
        "sns:Publish",
        "dynamodb:GetItem",
        "dynamodb:Query"
      ],
      "Resource": "*"
    }
  ]
}
Security Tip: For production, restrict resources to only what you need (e.g., specific bucket ARNs, not "*").
Best Practices
1. Use Environment-Specific Credentials
Keep development and production separate:
# In development
credential_id: cred_aws_dev
# In production
credential_id: cred_aws_prod
2. Stream Large Files
For very large S3 uploads, break the transfer into chunks. DeepChain can handle multipart uploads; see the sketch below, and check the docs or contact support for details.
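As a sketch, a multipart upload might be configured like this; the multipart and partSizeMb options are assumptions, not confirmed connector fields:

- id: upload_large_file
  type: aws_connector
  config:
    operation: putObject
    service: s3
    bucket: "my-reports"
    key: "exports/full-dump.parquet"
    body: "{{ extract_1.output }}"
    # Assumed options; splits the upload into 16 MB parts
    multipart: true
    partSizeMb: 16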
3. Monitor Lambda Errors
Always check the Lambda response for errors. Lambda returns a 200 status even if your function crashed:
- id: check_lambda
  type: aws_connector
  config:
    operation: invokeFunction
    service: lambda
    functionName: "my-processor"
    payload: "{{ input }}"
- id: handle_response
  type: conditional
  config:
    # Route based on whether the invocation reported a FunctionError
    condition: "{{ check_lambda.response.FunctionError }}"
    true_path: "error_handler"
    false_path: "success_handler"
4. Set a Visibility Timeout on Queue Messages
In SQS, set a visibility timeout so that messages that fail processing become visible again and get retried:
- id: receive_task
  type: aws_connector
  config:
    operation: receiveMessage
    service: sqs
    queueUrl: "{{ input.queue_url }}"
    visibilityTimeout: 300  # 5 minutes
    maxMessages: 10
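After a message is processed successfully, delete it so it isn't redelivered when the visibility timeout expires. The receiptHandle path below is an assumption about the receive node's output shape:

- id: ack_task
  type: aws_connector
  config:
    operation: deleteMessage
    service: sqs
    queueUrl: "{{ input.queue_url }}"
    # Assumed output path; each received message carries a receipt handle
    receiptHandle: "{{ receive_task.messages[0].ReceiptHandle }}"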
Next Steps
- Connectors Overview — See all 13 connectors
- Lambda Examples — Learn Lambda best practices
- S3 Documentation — Deep dive into S3
- SQS vs SNS — Choose the right queue service