Because Blogofile (at least the 0.7.1 version I'm using) regenerates every file and directory every time, it's hard to update just the changed files on Amazon's S3 (Simple Storage Service), which can scale incredibly high.
So I wrote a shell script that uploads only the changed files and also sends a ping so that Google and other services come along and read your update.
Over time, the difference between what the old page 2 on S3 contains and what it should contain will grow. There's a simple answer, which is to just upload all of the page files occasionally.

Every 10 blog posts or so, you could also update all the category and archive files. Maybe that's something to automate too; a rough sketch follows.
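If all you want is that periodic full refresh, s3cmd's sync command compares local files against what's in the bucket and uploads only what differs. A minimal sketch, assuming the same $BLOGHOME and $BLOGURL variables the script below defines:

# Occasionally push the whole generated site; sync skips unchanged files.
s3cmd sync $BLOGHOME/_site/ s3://$BLOGURL/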
#!/bin/bash
#
# jdeibele [ at ] gmail . com
#
# What this does:
#   builds your blog with blogofile, then builds a list of key
#   files that have changed. Blogofile is a python program
#   that uses Mako to generate static pages.
#   Blogofile: https://www.blogofile.com
#
# For me, these files are:
#   the post itself
#   the first page of the categories it appears in
#   feeds for the categories (RSS and atom)
#   the archive page (year/month)
#   the first page of /page
#   feed for the blog (RSS and atom)
#   the main page (index.html) for the blog
BLOGURL="www.siriusventures.com"
BLOGNAME="Sirius Stuff"
BLOGHOME=$HOMEDIR/siriusventures

cd $BLOGHOME
blogofile build
# blogofile builds extra directories for these but they're empty
rmdir *
cd $BLOGHOME/_posts
# the newest file in _posts is the post we just wrote
file=$(ls -t * | head -1)
# the directory (relative to _site) where the new post lands, from its permalink: header
permalink=$(grep "^permalink:" $file | cut -f3 -d: | cut -f4 -d"/")
# year/month from the date: header, used for the archive page
archive=$(grep "^date:" $file | cut -f2 -d: | cut -f1,2 -d/ | sed 's/ //g')
"slugify" the category names
categories=grep "^categories:" $file | tr [:upper:] [:lower:] | cut -f2 -d: | sed s'/^ //' | sed s'/ /-/'
echo $categories
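# With a hypothetical header of "categories: Web Development, Mac Stuff",
# $categories now holds "web-development,mac-stuff".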
cd $BLOGHOME/_site
# IFS=',' so the for loop splits $categories on commas;
# start the list of changed files with the main page
IFS=','
echo index.html > /tmp/postit
for category in $categories
do
    echo category/$category/index.html >> /tmp/postit
    echo category/$category/1/index.html >> /tmp/postit
    echo category/$category/feed/index.xml >> /tmp/postit
    echo category/$category/feed/atom/index.xml >> /tmp/postit
done
echo archive/$archive/1/index.html >> /tmp/postit
echo feed/index.xml >> /tmp/postit
echo feed/atom/index.xml >> /tmp/postit
echo page/1/index.html >> /tmp/postit
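# At this point /tmp/postit lists one S3 key per line; for a hypothetical
# post in a single "mac-stuff" category dated 2011/02, it would hold:
#   index.html
#   category/mac-stuff/index.html
#   category/mac-stuff/1/index.html
#   category/mac-stuff/feed/index.xml
#   category/mac-stuff/feed/atom/index.xml
#   archive/2011/02/1/index.html
#   feed/index.xml
#   feed/atom/index.xml
#   page/1/index.html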
# s3cmd can be installed with Homebrew: brew install s3cmd
# s3cmd needs --recursive to upload a new directory
s3cmd put --recursive $BLOGHOME/_site/$permalink s3://$BLOGURL
# upload each changed file from the list we just built
while read file
do
    s3cmd put $BLOGHOME/_site/$file s3://$BLOGURL/$file
    echo $BLOGHOME/_site/$file
done < /tmp/postit
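The ping mentioned at the top isn't in the listing above. A minimal sketch of that step, assuming Google's old blog search ping endpoint (the URL and parameters are my assumptions, not from the original script), using the $BLOGNAME and $BLOGURL variables the script already defines:

# Tell Google's blog search service the blog has been updated.
# (The endpoint and parameters below are assumptions.)
curl -s -G "http://blogsearch.google.com/ping" \
    --data-urlencode "name=$BLOGNAME" \
    --data-urlencode "url=http://$BLOGURL"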