
COST03-BP01 - Configure billing and cost management tools

One-Click Remediation

Deploy CloudFormation stacks to implement this best practice with a single click.

Cost Management Baseline
Configure baseline billing exports, cost visibility tooling, and notifications for organization-wide cost management

Stacks deploy to your AWS account. Review parameters before creating. Standard AWS charges apply.

Implementation guidance

AWS provides a comprehensive suite of billing and cost management tools that, when properly configured, provide detailed visibility into your cloud spending patterns. These tools form the foundation of effective cost monitoring and enable data-driven optimization decisions.

Core Billing and Cost Management Tools

AWS Cost Explorer: Provides interactive cost analysis and visualization capabilities with pre-built reports and custom analysis options. Essential for understanding spending patterns and identifying trends.

AWS Cost and Usage Report (CUR): Delivers the most comprehensive and detailed cost and usage data available, suitable for advanced analytics and custom reporting requirements.

AWS Budgets: Enables creation of custom budgets with automated alerts and actions, providing proactive cost management capabilities.

AWS Cost Anomaly Detection: Uses machine learning to automatically identify unusual spending patterns and provides early warning of potential cost issues.

Tool Configuration Strategy

Comprehensive Coverage: Configure tools to monitor all AWS accounts, services, and resources within your organization to ensure complete cost visibility.

Appropriate Granularity: Set up monitoring at the right level of detail for different use cases, from high-level executive summaries to detailed resource-level analysis.

Automated Processing: Configure automated data collection, processing, and reporting to reduce manual effort and ensure timely cost insights.

Integration Ready: Set up tools with appropriate APIs, data exports, and integration capabilities to support broader cost management workflows.

Configuration Best Practices

Centralized Management: Use AWS Organizations to centrally manage billing and cost tools across all accounts in your organization.

Role-Based Access: Configure appropriate access controls to ensure different stakeholders have access to relevant cost information without compromising security.

Data Retention: Configure appropriate data retention policies to maintain historical cost data for trend analysis and forecasting.

Export and Integration: Set up automated data exports and integrations to support advanced analytics and business intelligence requirements.

AWS Services to Consider

AWS Cost Explorer

Configure Cost Explorer with custom filters, groupings, and saved reports. Set up regular report generation and sharing for different stakeholder groups.

AWS Cost and Usage Report (CUR)

Configure CUR to deliver detailed cost data to S3 buckets. Set up appropriate data formats, compression, and delivery schedules for your analytics needs.

AWS Budgets

Create comprehensive budget structures covering different organizational dimensions. Configure automated alerts and actions for proactive cost management.

AWS Cost Anomaly Detection

Configure anomaly detection monitors for different cost dimensions. Set up appropriate alert thresholds and notification channels.

AWS Organizations

Configure consolidated billing and organizational structure to support centralized cost management across multiple accounts.
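Centralized cost management usually starts from an inventory of member accounts. A minimal sketch using the Organizations ListAccounts API, with the pure helper separated out so it can be exercised without AWS credentials:

```python
def flatten_active_accounts(pages):
    """Collect the IDs of ACTIVE accounts from ListAccounts result pages."""
    return [
        account["Id"]
        for page in pages
        for account in page["Accounts"]
        if account["Status"] == "ACTIVE"
    ]


def list_active_accounts():
    """Enumerate active member accounts (requires organizations:ListAccounts)."""
    import boto3  # imported lazily so the helper above stays AWS-free

    org = boto3.client("organizations")
    return flatten_active_accounts(org.get_paginator("list_accounts").paginate())
```

The returned IDs can then seed per-account budgets or anomaly monitors.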

AWS Identity and Access Management (IAM)

Configure appropriate permissions and roles for cost management tools. Ensure secure access to billing information based on organizational needs.

Amazon S3

Configure S3 buckets for storing cost and usage reports. Set up appropriate lifecycle policies and access controls for cost data storage.
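As a sketch of such a lifecycle policy (the prefix and retention thresholds are assumptions; adjust them to your retention requirements):

```python
def build_cost_report_lifecycle(days_to_ia=90, days_to_archive=365):
    """Lifecycle rule moving aging report objects to cheaper storage classes."""
    return {
        "Rules": [
            {
                "ID": "age-out-cost-reports",
                "Status": "Enabled",
                "Filter": {"Prefix": "cur-reports/"},
                "Transitions": [
                    {"Days": days_to_ia, "StorageClass": "STANDARD_IA"},
                    {"Days": days_to_archive, "StorageClass": "GLACIER"},
                ],
            }
        ]
    }


def apply_cost_report_lifecycle(bucket):
    """Apply the rule (requires s3:PutLifecycleConfiguration on the bucket)."""
    import boto3  # lazy import: the builder above has no AWS dependency

    boto3.client("s3").put_bucket_lifecycle_configuration(
        Bucket=bucket,
        LifecycleConfiguration=build_cost_report_lifecycle(),
    )
```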

AWS CloudFormation

Use CloudFormation to automate the deployment and configuration of cost management tools across multiple accounts and regions.

Implementation Steps

1. Assess Current Configuration

  • Review existing billing and cost management tool configurations
  • Identify gaps in cost visibility and monitoring coverage
  • Document current access patterns and user requirements
  • Assess integration needs with existing business systems

2. Plan Tool Configuration

  • Define requirements for each cost management tool
  • Plan organizational structure and account hierarchy
  • Design access control and permission strategies
  • Plan data export and integration requirements

3. Configure Core Tools

  • Set up AWS Cost Explorer with appropriate filters and reports
  • Configure AWS Cost and Usage Report delivery
  • Create comprehensive budget structure with AWS Budgets
  • Set up AWS Cost Anomaly Detection monitors

4. Implement Access Controls

  • Configure IAM roles and policies for cost management access
  • Set up cross-account access for centralized cost management
  • Implement role-based access to different cost management features
  • Configure audit logging for cost management activities

5. Set Up Data Export and Integration

  • Configure automated data exports to support analytics
  • Set up integration with business intelligence tools
  • Implement data processing pipelines for cost data
  • Configure APIs and webhooks for real-time integration

6. Validate and Optimize

  • Test all configured tools and integrations
  • Validate data accuracy and completeness
  • Optimize performance and cost of monitoring infrastructure
  • Document configuration and provide user training
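One simple validation is checking that CUR files are actually landing. A sketch, assuming the bucket and prefix used in the CUR example later on this page; CUR data is normally refreshed at least daily:

```python
def newest_object_age_days(objects, now):
    """Age in days of the most recently modified object, or None if empty."""
    if not objects:
        return None
    return (now - max(obj["LastModified"] for obj in objects)).days


def cur_delivery_is_fresh(bucket, prefix="cur-reports/", max_age_days=2):
    """Check that CUR files have been delivered recently."""
    import boto3  # lazy import keeps the age helper testable offline
    from datetime import datetime, timezone

    response = boto3.client("s3").list_objects_v2(Bucket=bucket, Prefix=prefix)
    age = newest_object_age_days(
        response.get("Contents", []), datetime.now(timezone.utc)
    )
    return age is not None and age <= max_age_days
```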

Cost Explorer Configuration

Report Configuration

{
  "ReportName": "Monthly Cost by Service",
  "TimePeriod": {
    "Start": "2024-01-01",
    "End": "2024-12-31"
  },
  "Granularity": "MONTHLY",
  "Metrics": ["BlendedCost", "UnblendedCost", "UsageQuantity"],
  "GroupBy": [
    {
      "Type": "DIMENSION",
      "Key": "SERVICE"
    },
    {
      "Type": "TAG",
      "Key": "Environment"
    }
  ],
  "Filter": {
    "Dimensions": {
      "Key": "LINKED_ACCOUNT",
      "Values": ["123456789012", "123456789013"]
    }
  }
}
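The settings above map onto the Cost Explorer GetCostAndUsage API (the report name itself is Console metadata, not an API field). A minimal sketch with placeholder account IDs, separating the request builder from the call so the request shape can be checked offline:

```python
def build_cost_and_usage_request(start, end, account_ids):
    """Build a GetCostAndUsage request mirroring the report settings above."""
    return {
        "TimePeriod": {"Start": start, "End": end},
        "Granularity": "MONTHLY",
        "Metrics": ["BlendedCost", "UnblendedCost", "UsageQuantity"],
        "GroupBy": [
            {"Type": "DIMENSION", "Key": "SERVICE"},
            {"Type": "TAG", "Key": "Environment"},
        ],
        "Filter": {
            "Dimensions": {"Key": "LINKED_ACCOUNT", "Values": account_ids}
        },
    }


def fetch_monthly_costs(start, end, account_ids):
    """Page through Cost Explorer results (requires ce:GetCostAndUsage)."""
    import boto3  # lazy import: the request builder needs no AWS dependency

    ce = boto3.client("ce")
    request = build_cost_and_usage_request(start, end, account_ids)
    results = []
    while True:
        response = ce.get_cost_and_usage(**request)
        results.extend(response["ResultsByTime"])
        token = response.get("NextPageToken")
        if not token:
            return results
        request["NextPageToken"] = token
```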

Saved Reports Setup

  • Executive Summary: High-level cost overview for leadership
  • Service Breakdown: Detailed analysis by AWS service
  • Account Analysis: Cost breakdown by AWS account
  • Tag-Based Reports: Cost analysis by business dimensions

Custom Filters and Groupings

  • Time-based Analysis: Daily, weekly, monthly, and yearly views
  • Service Groupings: Logical grouping of related AWS services
  • Cost Type Analysis: Separate views for different cost types
  • Regional Analysis: Cost breakdown by AWS region

Cost and Usage Report Configuration

CUR Setup Parameters

ReportName: "detailed-billing-report"
TimeUnit: "HOURLY"
Format: "Parquet"
Compression: "Parquet"  # Parquet format requires Parquet compression
AdditionalSchemaElements:
  - "RESOURCES"
  - "SPLIT_COST_ALLOCATION_DATA"
AdditionalArtifacts:
  - "ATHENA"  # the only artifact type compatible with Parquet format
RefreshClosedReports: true
ReportVersioning: "OVERWRITE_REPORT"
S3Bucket: "company-cost-reports"
S3Prefix: "cur-reports/"
S3Region: "us-east-1"
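These parameters correspond to the CUR PutReportDefinition API, which is only available in us-east-1. Note that with Parquet format the API requires Parquet compression and the ATHENA artifact. A hedged sketch:

```python
def build_report_definition(bucket, prefix="cur-reports/", region="us-east-1"):
    """Report definition matching the parameters above."""
    return {
        "ReportName": "detailed-billing-report",
        "TimeUnit": "HOURLY",
        "Format": "Parquet",
        "Compression": "Parquet",  # required when Format is Parquet
        "AdditionalSchemaElements": ["RESOURCES", "SPLIT_COST_ALLOCATION_DATA"],
        "AdditionalArtifacts": ["ATHENA"],  # only valid artifact for Parquet
        "RefreshClosedReports": True,
        "ReportVersioning": "OVERWRITE_REPORT",
        "S3Bucket": bucket,
        "S3Prefix": prefix,
        "S3Region": region,
    }


def create_cur_report(bucket):
    """Create the report (the CUR API endpoint lives in us-east-1)."""
    import boto3  # lazy import: the builder above needs no AWS dependency

    cur = boto3.client("cur", region_name="us-east-1")
    cur.put_report_definition(ReportDefinition=build_report_definition(bucket))
```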

Data Processing Pipeline

import io
import boto3
import pandas as pd
from datetime import datetime, timedelta

def process_cur_data():
    """Process Cost and Usage Report data for analysis"""
    s3 = boto3.client('s3')
    
    # List available CUR files
    bucket = 'company-cost-reports'
    prefix = 'cur-reports/'
    
    response = s3.list_objects_v2(
        Bucket=bucket,
        Prefix=prefix
    )
    objects = response.get('Contents', [])
    if not objects:
        raise RuntimeError(f'No CUR files found under s3://{bucket}/{prefix}')
    
    # Process latest CUR file
    latest_file = max(objects, key=lambda x: x['LastModified'])
    
    # Download and process the file (read_parquet needs a seekable buffer,
    # so load the streaming body into memory first)
    obj = s3.get_object(Bucket=bucket, Key=latest_file['Key'])
    df = pd.read_parquet(io.BytesIO(obj['Body'].read()))
    
    # Aggregate data by key dimensions
    summary = df.groupby([
        'bill/BillingEntity',
        'product/ProductName',
        'lineItem/UsageAccountId',
        'resourceTags/user:Environment'
    ]).agg({
        'lineItem/BlendedCost': 'sum',
        'lineItem/UnblendedCost': 'sum',
        'lineItem/UsageAmount': 'sum'
    }).reset_index()
    
    # Store processed data (writing directly to s3:// paths requires s3fs)
    processed_key = f"processed/{datetime.now().strftime('%Y-%m-%d')}/cost-summary.parquet"
    summary.to_parquet(f's3://{bucket}/{processed_key}')
    
    return summary

def create_cost_dashboard_data():
    """Create data for cost dashboards"""
    summary = process_cur_data()
    
    # Create different views for dashboards
    dashboard_data = {
        'total_cost': summary['lineItem/BlendedCost'].sum(),
        'cost_by_service': summary.groupby('product/ProductName')['lineItem/BlendedCost'].sum().to_dict(),
        'cost_by_account': summary.groupby('lineItem/UsageAccountId')['lineItem/BlendedCost'].sum().to_dict(),
        'cost_by_environment': summary.groupby('resourceTags/user:Environment')['lineItem/BlendedCost'].sum().to_dict()
    }
    
    return dashboard_data

Budget Configuration

Budget Structure Design

import boto3
from datetime import datetime

def create_comprehensive_budgets():
    """Create a comprehensive budget structure"""
    budgets_client = boto3.client('budgets')
    account_id = boto3.client('sts').get_caller_identity()['Account']
    
    # Master budget for total AWS spending
    master_budget = {
        'BudgetName': 'Total-AWS-Spending',
        'BudgetLimit': {
            'Amount': '100000',
            'Unit': 'USD'
        },
        'TimeUnit': 'MONTHLY',
        'BudgetType': 'COST',
        'CostFilters': {},
        'TimePeriod': {
            'Start': datetime(2024, 1, 1),
            'End': datetime(2024, 12, 31)
        }
    }
    budgets_client.create_budget(AccountId=account_id, Budget=master_budget)
    
    # Service-specific budgets
    service_budgets = [
        {'service': 'Amazon Elastic Compute Cloud - Compute', 'amount': '30000'},
        {'service': 'Amazon Simple Storage Service', 'amount': '15000'},
        {'service': 'Amazon Relational Database Service', 'amount': '20000'},
        {'service': 'Amazon CloudFront', 'amount': '5000'}
    ]
    
    for service_budget in service_budgets:
        budget = {
            'BudgetName': f"Budget-{service_budget['service'].replace(' ', '-')}",
            'BudgetLimit': {
                'Amount': service_budget['amount'],
                'Unit': 'USD'
            },
            'TimeUnit': 'MONTHLY',
            'BudgetType': 'COST',
            'CostFilters': {
                'Service': [service_budget['service']]
            }
        }
        
        # Create budget with alerts
        budgets_client.create_budget(
            AccountId=account_id,
            Budget=budget,
            NotificationsWithSubscribers=[
                {
                    'Notification': {
                        'NotificationType': 'ACTUAL',
                        'ComparisonOperator': 'GREATER_THAN',
                        'Threshold': 80,
                        'ThresholdType': 'PERCENTAGE'
                    },
                    'Subscribers': [
                        {
                            'SubscriptionType': 'EMAIL',
                            'Address': 'finance-team@company.com'
                        }
                    ]
                }
            ]
        )

def create_environment_budgets():
    """Create budgets for different environments"""
    environments = ['production', 'staging', 'development', 'testing']
    budget_amounts = {'production': '60000', 'staging': '15000', 'development': '15000', 'testing': '10000'}
    
    budgets_client = boto3.client('budgets')
    account_id = boto3.client('sts').get_caller_identity()['Account']
    
    for env in environments:
        budget = {
            'BudgetName': f'Environment-{env.title()}',
            'BudgetLimit': {
                'Amount': budget_amounts[env],
                'Unit': 'USD'
            },
            'TimeUnit': 'MONTHLY',
            'BudgetType': 'COST',
            'CostFilters': {
                # Budgets tag filters use the 'TagKeyValue' dimension with
                # values of the form 'user:<tag-key>$<tag-value>'
                'TagKeyValue': [f'user:Environment${env}']
            }
        }
        
        # Create budget with multiple alert thresholds
        notifications = []
        for threshold in [50, 80, 100]:
            notifications.append({
                'Notification': {
                    'NotificationType': 'FORECASTED' if threshold < 100 else 'ACTUAL',
                    'ComparisonOperator': 'GREATER_THAN',
                    'Threshold': threshold,
                    'ThresholdType': 'PERCENTAGE'
                },
                'Subscribers': [
                    {
                        'SubscriptionType': 'EMAIL',
                        'Address': f'{env}-team@company.com'
                    }
                ]
            })
        
        budgets_client.create_budget(
            AccountId=account_id,
            Budget=budget,
            NotificationsWithSubscribers=notifications
        )

Cost Anomaly Detection Configuration

Anomaly Detection Setup

import boto3

def setup_cost_anomaly_detection():
    """Configure comprehensive cost anomaly detection"""
    ce_client = boto3.client('ce')
    
    # Service-level anomaly detection: a DIMENSIONAL monitor on SERVICE
    # tracks each AWS service independently
    service_monitor = {
        'MonitorName': 'Service-Level-Anomalies',
        'MonitorType': 'DIMENSIONAL',
        'MonitorDimension': 'SERVICE'
    }
    
    # Account-level anomaly detection: a CUSTOM monitor scoped to
    # specific linked accounts via a Cost Explorer expression
    account_monitor = {
        'MonitorName': 'Account-Level-Anomalies',
        'MonitorType': 'CUSTOM',
        'MonitorSpecification': {
            'Dimensions': {
                'Key': 'LINKED_ACCOUNT',
                'Values': ['123456789012', '123456789013', '123456789014']
            }
        }
    }
    
    # Create each monitor and attach an alert subscription
    for monitor in [service_monitor, account_monitor]:
        response = ce_client.create_anomaly_monitor(AnomalyMonitor=monitor)
        monitor_arn = response['MonitorArn']
        
        subscription = {
            'SubscriptionName': f"Subscription-{monitor['MonitorName']}",
            'MonitorArnList': [monitor_arn],
            'Subscribers': [
                {
                    'Address': 'cost-alerts@company.com',
                    'Type': 'EMAIL'
                }
            ],
            'Threshold': 100.0,  # Alert for anomalies over $100
            'Frequency': 'DAILY'
        }
        
        ce_client.create_anomaly_subscription(AnomalySubscription=subscription)

def create_custom_anomaly_detection():
    """Create custom anomaly detection for specific use cases"""
    ce_client = boto3.client('ce')
    
    # Cost-category-based anomaly detection (the category name and values
    # are placeholders for a cost category defined in your account)
    high_cost_monitor = {
        'MonitorName': 'High-Cost-Resources',
        'MonitorType': 'CUSTOM',
        'MonitorSpecification': {
            'CostCategories': {
                'Key': 'high-cost-resources',
                'Values': ['HighCost']
            }
        }
    }
    
    # Development environment anomaly detection via a tag expression
    # (tag-based monitors must use a CUSTOM monitor specification)
    dev_monitor = {
        'MonitorName': 'Development-Environment-Anomalies',
        'MonitorType': 'CUSTOM',
        'MonitorSpecification': {
            'Tags': {
                'Key': 'Environment',
                'Values': ['development']
            }
        }
    }
    
    for monitor in [high_cost_monitor, dev_monitor]:
        try:
            ce_client.create_anomaly_monitor(AnomalyMonitor=monitor)
            print(f"Created monitor: {monitor['MonitorName']}")
        except Exception as e:
            print(f"Error creating monitor {monitor['MonitorName']}: {e}")

Access Control Configuration

IAM Roles and Policies

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "CostExplorerAccess",
      "Effect": "Allow",
      "Action": [
        "ce:GetCostAndUsage",
        "ce:GetDimensionValues",
        "ce:GetReservationCoverage",
        "ce:GetReservationPurchaseRecommendation",
        "ce:GetReservationUtilization",
        "ce:GetUsageReport",
        "ce:DescribeCostCategoryDefinition",
        "ce:GetRightsizingRecommendation"
      ],
      "Resource": "*"
    },
    {
      "Sid": "BudgetsReadAccess",
      "Effect": "Allow",
      "Action": [
        "budgets:ViewBudget",
        "budgets:DescribeBudgets",
        "budgets:DescribeBudgetPerformanceHistory"
      ],
      "Resource": "*"
    },
    {
      "Sid": "CURAccess",
      "Effect": "Allow",
      "Action": [
        "cur:DescribeReportDefinitions"
      ],
      "Resource": "*"
    },
    {
      "Sid": "S3CostDataAccess",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::company-cost-reports",
        "arn:aws:s3:::company-cost-reports/*"
      ]
    }
  ]
}
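One way to deploy a policy document like this is as a customer-managed IAM policy that can be attached to analyst roles or groups. A sketch (the policy name is an example), with a small shape check separated out so it can run without AWS:

```python
import json


def render_cost_policy(document):
    """Validate basic policy shape, then serialize for the IAM API."""
    if document.get("Version") != "2012-10-17":
        raise ValueError("policy must use the 2012-10-17 policy language version")
    for statement in document["Statement"]:
        if statement.get("Effect") not in ("Allow", "Deny"):
            raise ValueError("each statement needs an Allow or Deny Effect")
    return json.dumps(document)


def create_cost_viewer_policy(document, name="CostViewerReadOnly"):
    """Register the document as a customer-managed policy."""
    import boto3  # lazy import so render_cost_policy works without AWS installed

    response = boto3.client("iam").create_policy(
        PolicyName=name,
        PolicyDocument=render_cost_policy(document),
        Description="Read-only access to billing and cost management tooling",
    )
    return response["Policy"]["Arn"]
```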

Cross-Account Access Setup

import boto3
import json

def setup_cross_account_cost_access():
    """Set up cross-account access for centralized cost management"""
    iam = boto3.client('iam')
    
    # Create role for cross-account cost access
    # (replace MASTER-ACCOUNT-ID below with your management account ID)
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {
                    "AWS": "arn:aws:iam::MASTER-ACCOUNT-ID:root"
                },
                "Action": "sts:AssumeRole",
                "Condition": {
                    "StringEquals": {
                        "sts:ExternalId": "cost-management-external-id"
                    }
                }
            }
        ]
    }
    
    # Create the role
    role_response = iam.create_role(
        RoleName='CrossAccountCostManagement',
        AssumeRolePolicyDocument=json.dumps(trust_policy),
        Description='Role for cross-account cost management access'
    )
    
    # Attach cost management policy
    cost_policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "ce:*",
                    "budgets:Describe*",
                    "budgets:View*",
                    "cur:DescribeReportDefinitions",
                    "organizations:ListAccounts",
                    "organizations:DescribeOrganization"
                ],
                "Resource": "*"
            }
        ]
    }
    
    iam.put_role_policy(
        RoleName='CrossAccountCostManagement',
        PolicyName='CostManagementPolicy',
        PolicyDocument=json.dumps(cost_policy)
    )
    
    return role_response['Role']['Arn']

Monitoring and Validation

Configuration Validation

  • Data Accuracy: Verify that cost data is accurate and complete
  • Access Testing: Test access controls and permissions
  • Integration Testing: Validate integrations with external systems
  • Performance Testing: Ensure monitoring infrastructure performs adequately

Ongoing Monitoring

  • Tool Performance: Monitor the performance of cost management tools
  • Data Quality: Continuously monitor data quality and completeness
  • Access Patterns: Monitor access patterns and usage of cost tools
  • Cost of Monitoring: Track the cost of monitoring infrastructure itself
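The cost of monitoring can itself be tracked through Cost Explorer, for example by tagging the monitoring infrastructure and filtering on that tag. A sketch; the tag key and value here are hypothetical, so substitute whatever tag your monitoring resources actually carry:

```python
def build_monitoring_cost_query(start, end,
                                tag_key="CostCenter",
                                tag_value="cost-monitoring"):
    """Cost Explorer query scoped to the monitoring stack by a resource tag.
    The default tag key/value are placeholders, not an AWS convention."""
    return {
        "TimePeriod": {"Start": start, "End": end},
        "Granularity": "MONTHLY",
        "Metrics": ["UnblendedCost"],
        "Filter": {"Tags": {"Key": tag_key, "Values": [tag_value]}},
    }


def monitoring_cost(start, end):
    """Sum the monitoring stack's unblended cost over the period."""
    import boto3  # lazy import: the query builder needs no AWS dependency

    ce = boto3.client("ce")
    response = ce.get_cost_and_usage(**build_monitoring_cost_query(start, end))
    return sum(
        float(r["Total"]["UnblendedCost"]["Amount"])
        for r in response["ResultsByTime"]
    )
```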

Optimization Opportunities

  • Tool Utilization: Identify underutilized features and capabilities
  • Process Efficiency: Optimize cost management processes and workflows
  • Automation Opportunities: Identify opportunities for further automation
  • Integration Improvements: Enhance integrations with business systems

Common Challenges and Solutions

Challenge: Complex Multi-Account Setup

Solution: Use AWS Organizations for centralized management. Implement standardized configurations across accounts. Use infrastructure as code for consistent deployment. Create comprehensive documentation and training.

Challenge: Data Volume and Processing

Solution: Implement efficient data processing pipelines. Use appropriate data formats and compression. Set up automated data lifecycle management. Consider using AWS analytics services for large-scale processing.

Challenge: Tool Integration Complexity

Solution: Use standard APIs and data formats for integration. Implement robust error handling and retry logic. Create modular integration architectures. Provide comprehensive testing and validation.

Challenge: Access Control Complexity

Solution: Design role-based access control that matches your organizational structure. Apply the principle of least privilege. Implement regular access reviews. Provide clear documentation and training.

Challenge: Cost of Monitoring Infrastructure

Solution: Optimize monitoring infrastructure costs through right-sizing and automation. Use managed services where appropriate. Implement cost controls for monitoring resources. Review and optimize monitoring costs on a regular cadence.