Amazon’s S3 service can be really annoying. Invalidating a couple of thousand objects will not only cost you a lot, it is supposedly not even possible without using the API. Here is a neat little workaround to invalidate 30,000 objects either with a DNS change alone or with nothing more than the sed and split commands, without having to program against the API.

TRY THIS FIRST: There is a short version using DNS

There is a far easier method that I have used before, and it only requires a DNS change. If your TTL is set to a really high value and you fear DNS propagation will take too long, you can skip this.

1. Create a new distribution using the same origin as your original distribution!

2. Leave the CNAME field empty, or you will get the error “One Or More Of The Cnames You Provided Are Already Associated With A Different Resource”

[Screenshot: “One Or More Of The Cnames You Provided Are Already Associated With A Different Resource” error]

3. Now go into your DNS control panel and point your CNAME at the new distribution’s CloudFront URL (it looks like *.cloudfront.net).

[Screenshot: CloudFront distribution URL in the DNS control panel]

Make sure to enter a really low TTL (Time To Live) in case you want to change it again soon; otherwise the change can take up to 72 hours to propagate, which is really unnecessary when using a service such as DNSMadeEasy. Keep in mind that a lower TTL means higher expenses for a while (more lookups = more DNS requests); 10 million DNS requests are included in the business membership of DNSMadeEasy. If you are not a member yet, you can sign up – it is by far the most reliable DNS service out there.

You can simply remove the old distribution after a while.


If you still want to invalidate objects without using the API, you can use the approach below.

Step 1: Gather Your Filelist

Ok, I assume you have a backup of your S3 objects somewhere on your drive? If you don’t, grab a copy of BucketExplorer and download all your files.

Then we can simply echo all filenames into a file called output.txt.

1. cd directly into the folder and run this command:

ls -l | awk '{print $9}' > ../output.txt

What this command does is list all files and then use awk to print only the filename column (in this case it is $9, but if that echoes something else for you, try changing it to $6, $7 or $8 for starters).

The output.txt will be stored one level above the current folder (we don’t want to spend time searching for this txt file amongst a thousand other files).
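If the awk column number gives you trouble, `ls -1` (dash one) is a simpler alternative that prints one bare filename per line. The directory and filenames below are made up purely for the demonstration:

```shell
# demo directory with two dummy files; ls -1 prints one bare filename per line
mkdir -p demo && touch demo/a.jpg demo/b.jpg
ls -1 demo > output.txt
cat output.txt
```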

Step 2: Split Text File Into Several

Let’s assume the text file output.txt is over 30,000 lines long. You can’t paste all of that into the CloudFront invalidation interface at once.

I have not tested it 100%, but CloudFront invalidation supports roughly 500–1000 objects in a single batch (on the web interface).
To be safe, I went with 500 lines per text file, even if that means pasting a few more text files.
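The splitting itself is done with the split command. The sketch below fabricates a 1,200-line output.txt so the chunk sizes are easy to verify; with the default two-letter suffixes you get files named splitaa, splitab, splitac, and so on:

```shell
# fabricate a dummy 1,200-line file list (filenames are made up)
seq 1 1200 | sed 's/^/file-/' > output.txt
# split it into 500-line chunks named splitaa, splitab, splitac
split -l 500 output.txt split
wc -l splitaa splitab splitac
```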

Step 3: Add Leading Paths

Next we are going to use sed to prepend a leading path. This is easier than it sounds:

sed 's/^/\/pictures\//' splitaa

Let me break this down for you. A simpler example that will only prepend “pictures” (without the quotes):

sed 's/^/pictures/' splitaa

But a file path contains slashes, so we escape each slash with a backslash and end up with the command above.

Alright, now we need to return the results somewhere.

sed 's/^/\/pictures\//' splitaa > splitaa2

This writes the results directly into the file splitaa2.
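As an aside, sed accepts almost any character as the delimiter in the s command, so you can skip the backslash escaping entirely by using | instead of /. The two dummy filenames below are only there to make the sketch self-contained:

```shell
# same substitution, using | as the sed delimiter so the slashes need no escaping
printf 'img1.jpg\nimg2.jpg\n' > splitaa
sed 's|^|/pictures/|' splitaa > splitaa2
cat splitaa2
```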

Next, we need a for loop to process all the files, instead of having to execute the command once per file:

for i in split*; do
    sed 's/^/\/pictures\//' "$i" > "$i-2"
done

Ok, next we are going to create a little shell script to put this in:

pico splitter

Copy and paste the 3 lines. Hit CTRL + O to write and CTRL + X to exit.

Then add execute permissions:

chmod a+x splitter

Alright, we are done. Run the splitter file and you will now have plenty of files with the correct paths!
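Put together, the whole pipeline looks like this dry run on fabricated data (every filename here is made up). Note the split?? glob, which matches only the two-letter-suffix chunks and therefore skips the -2 output files if you run it twice:

```shell
# end-to-end dry run on dummy data
seq 1 1200 | sed 's/^/img-/' > output.txt
split -l 500 output.txt split
for i in split??; do
    sed 's/^/\/pictures\//' "$i" > "$i-2"
done
head -2 splitaa-2
```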

Step 4: Run The Invalidation

Head over to the AWS Management Console and select CloudFront

1. Select your distribution and click Distribution settings

2. The invalidation tab is on the far right:

[Screenshot: creating a CloudFront S3 object invalidation]

3. Copy and paste the contents of each split file by hand

Alternatively, you can use the API, but spending 10 minutes pasting a few files is considerably less work than programming against the API for a one-off job.

Eventually I will look into the API for invalidation purposes, which may be smoother than this, but as a workaround the method above certainly works.
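If you do eventually go the API route, the AWS CLI can consume the same split files. The sketch below only shows the testable part, turning one path file (dummy data here) into a --paths argument; the actual invalidation call is left commented out because the distribution ID is a placeholder and the command needs credentials:

```shell
# turn one path file (dummy data) into a space-separated --paths argument
printf '/pictures/a.jpg\n/pictures/b.jpg\n' > splitaa-2
PATHS=$(tr '\n' ' ' < splitaa-2)
echo "$PATHS"
# hypothetical call (EDFDVBD6EXAMPLE is a placeholder distribution ID):
# aws cloudfront create-invalidation --distribution-id EDFDVBD6EXAMPLE --paths $PATHS
```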

Here are some other workarounds:



– If you have time, re-upload the entire path, e.g. /files/ to /files2014/
– Within your CMS, replace all file paths referencing /files/ with /files2014/