
S3 Lifecycle Tiering Cost vs Retrieval | SAA-C03

Jeff Taakey
Author | 21+ Year Enterprise Architect | Multi-Cloud Architect & Strategist.

While preparing for the AWS SAA-C03, many candidates get confused by S3 storage class transitions and lifecycle policies. In the real world, this is fundamentally a decision about Storage Cost vs. Retrieval Requirements. Let’s drill into a simulated scenario.

The Scenario

GlobalHealth Analytics, a medical research company, maintains comprehensive patient outcome datasets that must be retained for regulatory compliance. Currently, all 500 TB of historical research data resides in Amazon S3 Standard storage class, costing the company approximately $11,500 per month in storage fees alone.

The compliance team has clarified three critical requirements:

  • All data must be retained for at least 25 years (regulatory mandate)
  • Data from the most recent 2 years must remain highly available and support instant retrieval for active research projects
  • Data older than 2 years is rarely accessed (less than once per year) but must be retrievable when needed

The CFO has challenged the Solutions Architect to significantly reduce storage costs while maintaining full compliance.

Key Requirements

Minimize total cost of ownership (TCO) for 25-year data retention while ensuring the most recent 2 years of data remains highly available with instant retrieval capability.

The Options

  • A) Configure an S3 lifecycle policy to immediately transition all objects to S3 Glacier Deep Archive storage.
  • B) Configure an S3 lifecycle policy to transition objects to S3 Glacier Deep Archive storage after 2 years.
  • C) Use S3 Intelligent-Tiering storage class and enable archive access tiers to automatically move data to S3 Glacier Deep Archive.
  • D) Configure an S3 lifecycle policy to immediately transition objects to S3 One Zone-IA storage, then transition to S3 Glacier Deep Archive after 2 years.

Correct Answer

Option B — Configure an S3 lifecycle policy to transition objects to S3 Glacier Deep Archive storage after 2 years.

Step-by-Step Winning Logic

This solution perfectly balances cost optimization with business requirements:

  1. Requirement Alignment: Data remains in S3 Standard (or Standard-IA if access is infrequent) for the first 2 years, providing millisecond retrieval latency and 99.99% availability for active research.

  2. Cost Optimization: After 2 years, data automatically transitions to Glacier Deep Archive ($0.00099/GB/month), reducing storage costs by ~95% for data that’s rarely accessed but must be retained.

  3. Simplicity: Single lifecycle rule with age-based transition—no monitoring overhead, no manual intervention, no complex access pattern analysis required.

  4. Compliance: All data remains durable (99.999999999%) regardless of storage class, meeting the 25-year retention mandate.
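As a minimal sketch, the winning design is a single age-based lifecycle rule. The configuration below uses the real S3 API shape; the bucket name is hypothetical, and applying it requires AWS credentials:

```python
# Option B as a lifecycle configuration: transition every object to
# Glacier Deep Archive 730 days (2 years) after creation.
lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-after-2-years",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # empty prefix = apply to all objects
            "Transitions": [
                {"Days": 730, "StorageClass": "DEEP_ARCHIVE"},
            ],
        }
    ]
}

# Applying it would look like this (requires AWS credentials;
# the bucket name is illustrative only):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_lifecycle_configuration(
#     Bucket="globalhealth-research-data",
#     LifecycleConfiguration=lifecycle_config,
# )
```

One rule, one transition: this is the "no monitoring overhead" point above made concrete.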

Monthly Cost Impact (500 TB dataset, assuming uniform age distribution):

  • Current state (all S3 Standard): ~$11,500/month
  • After optimization: ~$920/month for recent data (2 years) + ~$455/month for archived data (23 years) = ~$1,375/month total
  • Savings: ~$10,125/month or 88% cost reduction
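The figures above can be sanity-checked with a few lines of arithmetic, using the list prices quoted in this article and treating 1 TB as 1,000 GB to match the round numbers:

```python
# Back-of-envelope check of the monthly cost figures.
STANDARD = 0.023        # $/GB-month, S3 Standard (as quoted above)
DEEP_ARCHIVE = 0.00099  # $/GB-month, Glacier Deep Archive

total_gb = 500_000                  # 500 TB
recent_gb = total_gb * 2 / 25       # newest 2 of 25 years (uniform ages)
archived_gb = total_gb - recent_gb  # remaining 23 years

current = total_gb * STANDARD
optimized = recent_gb * STANDARD + archived_gb * DEEP_ARCHIVE
savings = current - optimized
print(round(current), round(optimized), round(savings))
```

The output reproduces the ~$11,500 / ~$1,375 / ~$10,125 split, an ~88% reduction.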

💎 The Architect’s Deep Dive: Why Options Fail

The Traps (Distractor Analysis)

  • Why not Option A?

    • Violates the 2-year instant retrieval requirement. Moving all data immediately to Glacier Deep Archive means even yesterday’s data would require 12-hour retrieval—unacceptable for active research projects.
    • Cost savings are marginal compared to Option B since you’d pay significantly higher retrieval fees for frequently accessed recent data.
  • Why not Option C?

    • Over-engineered and more expensive. S3 Intelligent-Tiering costs $0.023/GB/month (same as Standard) plus $0.0025 per 1,000 objects monitoring fee.
    • Intelligent-Tiering is designed for unpredictable access patterns. This scenario has perfectly predictable patterns (recent = frequent, old = rare).
    • Archive tiers in Intelligent-Tiering still require 3-12 hours retrieval, but you’re paying monitoring fees for data that has a known access pattern.
  • Why not Option D?

    • Unnecessary complexity with availability risk. S3 One Zone-IA stores data in a single AZ (99.5% availability vs. 99.99% for Standard).
    • The intermediate transition adds no value—if data must be instantly retrievable for 2 years, it should remain in a high-availability class.
    • One Zone-IA only saves ~20% vs. Standard-IA, but introduces data loss risk if the AZ fails—unacceptable for regulatory data.
    • You’re adding a lifecycle transition step that provides minimal benefit while introducing availability SLA degradation.
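To put the Option C monitoring fee in perspective, here is an illustrative calculation. The scenario never states an object count, so the average object size below is an assumption:

```python
# Intelligent-Tiering charges $0.0025 per 1,000 monitored objects
# per month (as quoted above). Object count is hypothetical:
# assume 500 TB stored as 100 MB objects on average.
MONITORING_PER_1000 = 0.0025
avg_object_mb = 100                       # assumed average object size
objects = 500_000_000 / avg_object_mb     # 500 TB expressed in MB
monitoring_fee = objects / 1000 * MONITORING_PER_1000
print(monitoring_fee)  # extra $/month on top of Standard-rate storage
```

At this object size the fee itself is modest; the stronger argument against Option C is the one above—access-driven tiering delays transitions that a lifecycle rule could schedule deterministically, while data sits at the Standard rate.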


The Architect Blueprint

```mermaid
graph TD
    A[New Research Data Upload] -->|Day 0| B[S3 Standard Storage]
    B -->|0-730 days| C{Active Research Period}
    C -->|Instant Retrieval| D[Research Team Access]
    C -->|Day 730| E[Lifecycle Policy Trigger]
    E -->|Automatic Transition| F[S3 Glacier Deep Archive]
    F -->|730-9,125 days| G[Long-Term Compliance Storage]
    G -->|Rare Access| H[12-hour Retrieval Process]

    style B fill:#ff9900,stroke:#232f3e,color:#fff
    style F fill:#1e73be,stroke:#232f3e,color:#fff
    style E fill:#32cd32,stroke:#232f3e,color:#000
```


Diagram Note: Objects automatically transition from high-cost, instant-access S3 Standard to ultra-low-cost Glacier Deep Archive at the 2-year mark, aligning storage tier with actual business access requirements.

Real-World Practitioner Insight

Exam Rule

For the SAA-C03 exam, when you see:

  • “Long-term retention” (>1 year) + “reduce cost” → Think Glacier tiers
  • “Instant retrieval” or “high availability” for recent data → S3 Standard/Standard-IA
  • Time-based access pattern → Lifecycle policies (not Intelligent-Tiering)
  • “Deep Archive” appears in options → Check if instant retrieval is required for all data or just recent data

The formula: Identify the time boundary where access requirements change, then apply lifecycle transitions at that boundary.

Real World

In production environments, we would likely enhance this approach:

  1. Standard-IA for cost optimization: Transition objects to S3 Standard-IA after 30-90 days (if infrequent access is acceptable) before the final transition to Glacier Deep Archive. This adds another 45% savings during the 2-year active period.

  2. Glacier Flexible Retrieval as intermediate tier: For data 2-5 years old that occasionally needs faster retrieval (3-5 hours vs. 12 hours), add an intermediate Glacier Flexible Retrieval tier ($0.0036/GB/month) before Deep Archive.

  3. Intelligent-Tiering with Archive: For datasets with unpredictable access patterns (e.g., patient records where some cases are referenced repeatedly), Intelligent-Tiering with Deep Archive access tier enabled makes sense—but not for this scenario with clear time-based patterns.

  4. S3 Storage Lens: Implement S3 Storage Lens metrics to validate assumptions about access patterns and optimize lifecycle rules based on actual retrieval frequency.

  5. Object tagging for exceptions: Use object tags to exclude certain datasets (e.g., ongoing longitudinal studies) from automatic archival, keeping them in Standard storage beyond the 2-year threshold.
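Enhancements 1 and 2 above can be combined into a single multi-step rule; a sketch follows, with the day thresholds chosen for illustration. The storage-class names are the real S3 API values (`GLACIER` is Flexible Retrieval). Note that lifecycle filters are inclusive, so the tag-based exceptions in item 5 are handled by scoping the rule to objects tagged as archivable rather than by excluding tags:

```python
# Production-style tiered lifecycle: Standard -> Standard-IA at 90
# days -> Glacier Flexible Retrieval at 2 years -> Deep Archive at
# 5 years. Thresholds beyond the scenario's 730 days are examples.
enhanced_config = {
    "Rules": [
        {
            "ID": "tiered-archival",
            "Status": "Enabled",
            # Hypothetical tag scoping the rule to archivable data,
            # so tagged exceptions (item 5) simply never match it.
            "Filter": {"Tag": {"Key": "archival", "Value": "eligible"}},
            "Transitions": [
                {"Days": 90, "StorageClass": "STANDARD_IA"},
                {"Days": 730, "StorageClass": "GLACIER"},
                {"Days": 1825, "StorageClass": "DEEP_ARCHIVE"},
            ],
        }
    ]
}
```

Transition days within a rule must be strictly increasing, which S3 validates at configuration time.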

The exam tests foundational service selection; production requires operational maturity and continuous optimization.
