Tuesday, September 16, 2008
 
Delivery from S3
Got S3 delivery working last night on my local test system. It's still at the proof-of-concept stage, but it seems fairly credible. Page images are delivered from S3 and come through about as fast as they do from our existing live servers.

There's a new storage abstraction layer which coordinates actions through the appropriate storage service: one for local storage and another for S3. Each title will be associated with one service or the other as we make the transition. The services are mainly stubs at the moment but will get fleshed out over the next couple of days.
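
The shape of it is roughly what the sketch below shows: a common interface, a local implementation, an S3 implementation, and the title's storage-platform field deciding which one handles a request. Every name here (StorageService, get_page_image, the boto3 client, the key layout) is invented for illustration; this is a sketch of the pattern under those assumptions, not our actual code.

# Illustrative sketch of the storage abstraction layer; class names,
# key layout and the use of boto3 are assumptions, not the real code.
import os
import boto3


class StorageService:
    """Common interface for fetching a title's assets (page images, wordmaps, ...)."""

    def get_page_image(self, title, issue, page):
        raise NotImplementedError


class LocalStorageService(StorageService):
    """Serves assets from the existing local filesystem layout."""

    def __init__(self, root):
        self.root = root

    def get_page_image(self, title, issue, page):
        path = os.path.join(self.root, title, issue, f"{page}.jpg")
        with open(path, "rb") as f:
            return f.read()


class S3StorageService(StorageService):
    """Serves the same assets out of an S3 bucket."""

    def __init__(self, bucket, access_key, secret_key):
        self.bucket = bucket
        self.client = boto3.client(
            "s3",
            aws_access_key_id=access_key,
            aws_secret_access_key=secret_key,
        )

    def get_page_image(self, title, issue, page):
        key = f"{title}/{issue}/{page}.jpg"
        return self.client.get_object(Bucket=self.bucket, Key=key)["Body"].read()


def service_for(magazine, local_service, s3_service):
    """Each title carries a storage-platform field; dispatch on it during the transition."""
    return s3_service if magazine.storage_platform == "s3" else local_service

A request handler just asks service_for() for the right backend and never needs to care where the bytes actually live.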

The plan:

  1. Pick up S3 connection parameters from our per-machine config file (see the config sketch after this list).
  2. Add a storage platform field to Magazine.
  3. Release.
  4. Upload all issues of a test title to S3 and switch its platform.
  5. Test performance.
  6. Pick up wordmaps, links, keywords and page images through the storage layer.
  7. Enable caching.
  8. Integrate index rebuilds.
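
For step 1, the only interesting bit is that each machine's config file carries its own S3 credentials and bucket name, so the test box and the live boxes can point at different places. Something along these lines; the section and key names are made up for the sketch, and our real per-machine config file isn't necessarily INI-style:

# Hypothetical per-machine config stanza (INI-style for the sketch):
#
#   [s3]
#   access_key = <machine's AWS access key>
#   secret_key = <machine's AWS secret key>
#   bucket     = <bucket holding page images, wordmaps, links>
import configparser


def load_s3_params(path):
    """Read S3 connection parameters from the per-machine config file."""
    cfg = configparser.ConfigParser()
    cfg.read(path)
    return {
        "access_key": cfg.get("s3", "access_key"),
        "secret_key": cfg.get("s3", "secret_key"),
        "bucket": cfg.get("s3", "bucket"),
    }

The result feeds straight into the S3 storage service sketched above, so pointing a machine at a different bucket is just a config edit.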



Unforeseen benefit: my test server can now deliver anything that's available through the live site, as they're both serving the same content (well, it'll still use a local database, but the page images, links and wordmaps will all be common). No more staring at screens of placeholders while testing...

T
