Optim

Automagically optimize your images on S3 with the magic of AWS Lambda.

Optim is a super-simple Lambda function that listens to an S3 bucket for uploads and runs everything it can through imagemin.
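Under the hood the flow is straightforward: the S3 event carries the bucket and key, the object is downloaded, run through imagemin, and uploaded back. The sketch below illustrates that flow; it is not the actual source of this repo, and the plugin set and SDK version are assumptions:

```typescript
import { S3 } from 'aws-sdk';
import imagemin from 'imagemin';
import imageminMozjpeg from 'imagemin-mozjpeg';
import imageminPngquant from 'imagemin-pngquant';

const s3 = new S3();

// Invoked for each s3:ObjectCreated:* event the function is hooked up to.
export async function handler(event: any): Promise<void> {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    // S3 event keys are URL-encoded (spaces arrive as '+').
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));

    const original = await s3.getObject({ Bucket: bucket, Key: key }).promise();
    const optimized = await imagemin.buffer(original.Body as Buffer, {
      plugins: [imageminMozjpeg(), imageminPngquant()],
    });

    // Only overwrite the object when optimization actually shrank it.
    if (optimized.length < (original.Body as Buffer).length) {
      await s3.putObject({
        Bucket: bucket,
        Key: key,
        Body: optimized,
        ContentType: original.ContentType,
      }).promise();
    }
  }
}
```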

Setup

  • Clone this repo

  • Run npm install

  • Fill in AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in .env with a set of credentials that can create Lambda functions (alternatively, have these already set in your environment)

  • Create an IAM role for Optim to use. It needs the following permissions on all the S3 buckets you want to use (allowing these actions on the * ARN is the easiest way to start; see the example policy after this list):

    • s3:GetObject
    • s3:PutObject
    • s3:PutObjectAcl
  • Find the ARN for this role. It looks something like arn:aws:iam::1234567890:role/rolename.

  • Fill in AWS_ROLE_ARN in .env

  • Run npm run deploy

  • Hurrah, your Lambda function is now deployed! It'll be created with the name optim-production unless you changed values in .env.

  • You can now hook this function up to any S3 bucket you like in the management console. The easiest way is to follow AWS's guide.
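For reference, an IAM policy along these lines grants the role the three S3 actions listed above (Resource is left as * for simplicity; scope it down to your buckets once things work):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl"],
      "Resource": "*"
    }
  ]
}
```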

Configuration

There are two sets of configuration here. The .env file contains configuration related to setup and deployment, while event_sources.json configures the bucket the function listens to.

In .env:

  • AWS_ACCESS_KEY_ID: the AWS access key used to deploy the Lambda function
  • AWS_SECRET_ACCESS_KEY: the corresponding secret access key
  • AWS_ROLE_ARN: the role the Lambda function will be executed with
  • AWS_REGION: the AWS region to deploy to
  • AWS_FUNCTION_NAME and AWS_ENVIRONMENT: control the naming of the Lambda function that is created (e.g. optim-production)
  • AWS_MEMORY_SIZE: the amount of memory (in MB) given to your Lambda. This also determines how much CPU share it gets; since optimizing images is fairly intensive, it's probably best to keep this high
  • AWS_TIMEOUT: the runtime timeout for the Lambda in seconds, up to 5 minutes. Again, image optimization is fairly intensive, so you'll probably want to leave this at the maximum of 300
  • EXCLUDE_PREFIX: skip optimizing images whose filename starts with this prefix
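A filled-in .env could look like the following (the variable names are the ones listed above; all values are placeholders):

```
AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
AWS_ROLE_ARN=arn:aws:iam::1234567890:role/optim
AWS_REGION=eu-west-1
AWS_FUNCTION_NAME=optim
AWS_ENVIRONMENT=production
AWS_MEMORY_SIZE=1024
AWS_TIMEOUT=300
EXCLUDE_PREFIX=thumbs/
```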

In event_sources.json:

  • Bucket: the name of the bucket the function should listen to.
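The exact shape of this file is defined by the repo's own template; a minimal sketch containing only the Bucket field described above might be:

```json
{
  "Bucket": "my-images-bucket"
}
```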

Lambda deployment

After configuring, deploy the Lambda with npm run deploy.

Current Bucket Optimization

This project can also optimize all the images already in the bucket. You need to run npm run package and then node dist/optimizeAll.js:
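```sh
npm run package
node dist/optimizeAll.js
```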

It will optimize images using all the CPUs on your machine.

Stats

I've used this repo to successfully optimize a bucket with 490k images (90% JPG, 9% PNG, 1% other) in less than 2 hours with a 16-core EC2 instance.
