How to use this page
This document groups the most-used AWS services by domain, then breaks each area into focused subtopics so you can move from platform overview to specific service choices without hunting through long vendor menus.
Use the left navigation rail for broad topics, then jump into the linked subtopics beneath each category. The layout is intended for fast scanning during design discussions, documentation reviews, and solution planning.
At a glance, the top-level categories cover:
- services that run compute, storage, and databases;
- networking, edge delivery, and private access patterns;
- identity, encryption, logging, and account-level controls;
- warehouse, ETL, streaming, and analytical query services.
Choose from virtual machines, serverless functions, and managed container orchestration depending on workload duration, scaling profile, and operational ownership.
Amazon Elastic Compute Cloud (EC2) provides resizable virtual machines for workloads that need operating system access, custom networking, or predictable long-running execution.
AWS Lambda provides serverless, event-driven compute for short-lived functions with automatic scaling and no server management; it is ideal for APIs, event handlers, and automation tasks.
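A Lambda function for an API Gateway proxy integration is just a handler that receives an event dict and returns a response dict. A minimal sketch, using the standard proxy-integration event and response shapes (the query parameter `name` and the greeting are illustrative):

```python
import json

def lambda_handler(event, context):
    """Minimal API Gateway (proxy integration) handler sketch.

    `event` carries the HTTP request; the returned dict is the
    proxy-integration response shape API Gateway expects.
    """
    # queryStringParameters is null when the request has no query string
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# The same function can be exercised locally with a sample event:
resp = lambda_handler({"queryStringParameters": {"name": "dev"}}, None)
```

Because the handler is a plain function, it can be unit-tested without deploying anything.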
Managed container platforms let teams standardize packaging while choosing between AWS-native orchestration, Kubernetes compatibility, and serverless task execution.
These services target higher-level deployment automation and queued compute execution for teams that want AWS to manage more of the infrastructure lifecycle.
AWS storage spans object, block, and file abstractions. The right choice depends on access pattern, latency needs, durability requirements, and sharing model.
Amazon Simple Storage Service (S3) is the default durable object store for backups, static sites, logs, data lakes, ML datasets, and application assets.
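For data-lake use, the object key itself encodes the layout; a common convention is Hive-style `key=value` partitioning, which query engines such as Athena can prune on. A small sketch (the `logs/clicks` prefix and file name are placeholders):

```python
from datetime import date

def partitioned_key(prefix: str, table: str, day: date, filename: str) -> str:
    """Build a Hive-style partitioned S3 object key, e.g.
    logs/clicks/year=2024/month=05/day=17/part-0000.parquet"""
    return (
        f"{prefix}/{table}/"
        f"year={day.year}/month={day.month:02d}/day={day.day:02d}/{filename}"
    )

key = partitioned_key("logs", "clicks", date(2024, 5, 17), "part-0000.parquet")
```

Zero-padding the month and day keeps keys lexicographically sortable, which makes prefix listings and range scans predictable.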
These services cover persistent volumes for instances, shared elastic file systems, and managed file systems optimized for specific operating environments.
AWS Storage Gateway provides a hybrid bridge between on-premises systems and AWS storage services for backup, archival, and cloud-backed file or tape workflows.
AWS Backup is a central, policy-driven service for coordinating retention and recovery across storage, database, and compute-integrated resources.
AWS supports relational, key-value, document, graph, ledger, and cache layers. Pick based on consistency needs, access model, and operational overhead.
Amazon RDS offers managed relational databases for PostgreSQL, MySQL, MariaDB, SQL Server, and Oracle, with Aurora providing AWS-optimized, cloud-native MySQL- and PostgreSQL-compatible engines.
Amazon DynamoDB is a fully managed NoSQL key-value and document database built for single-digit-millisecond performance at any scale.
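That performance depends on designing partition and sort keys around known access patterns rather than ad hoc queries. A sketch of a common single-table convention, where entity-type prefixes share a partition (the `CUSTOMER#`/`ORDER#` scheme and ids here are made up for illustration):

```python
def order_keys(customer_id: str, order_id: str) -> dict:
    """Composite keys for a hypothetical single-table design:
    all of one customer's orders live in the same partition (PK)
    and sort by order id (SK)."""
    return {"PK": f"CUSTOMER#{customer_id}", "SK": f"ORDER#{order_id}"}

item = order_keys("c-42", "o-1001")
# A query for every order of customer c-42 would then use a key
# condition like: PK = "CUSTOMER#c-42" AND begins_with(SK, "ORDER#")
```

The point of the prefixes is that one table can hold several entity types while each access pattern stays a single key-condition query.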
Amazon ElastiCache is a managed Redis and Memcached service for low-latency caching, session storage, rate limiting, and transient data workloads.
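The pattern ElastiCache usually backs is cache-aside: check the cache, and on a miss load from the source and store the result with a TTL. A sketch with an in-process dict standing in for Redis (the `TTLCache` class and loader are illustrative, not an AWS API):

```python
import time

class TTLCache:
    """In-process stand-in for a Redis-style cache with per-key TTL."""
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expires_at)

    def get_or_load(self, key, loader):
        entry = self.store.get(key)
        if entry and entry[1] > time.monotonic():
            return entry[0]                              # cache hit
        value = loader(key)                              # miss: hit the source
        self.store[key] = (value, time.monotonic() + self.ttl)
        return value

cache = TTLCache(ttl_seconds=60)
calls = []
def load_user(key):
    calls.append(key)          # track how often the "database" is hit
    return {"id": key}

first = cache.get_or_load("u1", load_user)   # miss: loads from source
second = cache.get_or_load("u1", load_user)  # hit: source not touched
```

Swapping the dict for a Redis client keeps the calling code identical, which is the appeal of the pattern.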
AWS also provides purpose-built databases such as Neptune for graph, DocumentDB for document models, QLDB for ledgers, and Timestream for time-series metrics.
AWS networking services define connectivity, traffic routing, segmentation, and public edge delivery for cloud-native and hybrid architectures.
Amazon Virtual Private Cloud (VPC) is the core network boundary for AWS workloads, giving teams control over IP ranges, subnets, routing, and traffic isolation.
These services handle global content delivery, DNS routing, traffic steering, and entry-point optimization for public-facing systems.
Managed traffic distribution across instances, containers, IPs, and Lambda targets using application, network, and gateway load balancers.
API Gateway, PrivateLink, and Direct Connect help expose services cleanly while managing internal and external access boundaries.
Security services shape how identities are granted access, how data is encrypted, how events are audited, and how threats are blocked at multiple layers.
Identity and Access Management provides the permission model for AWS accounts, services, roles, and automation, while Identity Center handles workforce access and SSO.
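IAM permissions are expressed as JSON policy documents. A sketch of a least-privilege policy granting read-only access to a single S3 prefix (the bucket name and `Sid` are placeholders; `2012-10-17` is the current policy-language version):

```python
import json

policy = {
    "Version": "2012-10-17",  # IAM policy language version, not a date you choose
    "Statement": [
        {
            "Sid": "ReadReports",
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            # Only objects under the reports/ prefix of one bucket
            "Resource": "arn:aws:s3:::example-bucket/reports/*",
        }
    ],
}
policy_json = json.dumps(policy, indent=2)
```

Keeping `Action` and `Resource` this narrow is the practical meaning of least privilege: widen them only when a concrete access pattern requires it.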
These services protect data, credentials, and public surfaces through encryption key control, secret rotation, web filtering, and DDoS mitigation.
Governance and detection services collect audit trails, track configuration drift, surface suspicious activity, and centralize security findings.
Account-level governance services standardize multi-account environments using organizational units, service control policies, and baseline landing zones.
AWS analytical tooling supports warehousing, ETL pipelines, ad hoc SQL, streaming ingestion, and managed lakehouse patterns built around S3.
These services form a common AWS data platform foundation for raw storage, metadata catalogs, ETL jobs, serverless SQL, and warehouse analytics.
Streaming services support real-time event ingestion, buffering, and downstream processing for telemetry, clickstreams, fraud detection, and operational analytics.
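In Kinesis Data Streams, for example, each record's partition key is MD5-hashed into a 128-bit space that is divided among shards, so records with the same key always land on the same shard and stay ordered. A sketch of that routing, assuming evenly split shard ranges:

```python
import hashlib

def shard_for(partition_key: str, shard_count: int) -> int:
    """Mimic Kinesis-style routing: MD5 the partition key into a
    128-bit integer, then map it onto evenly sized shard ranges."""
    h = int.from_bytes(hashlib.md5(partition_key.encode()).digest(), "big")
    range_size = (2 ** 128) // shard_count
    return min(h // range_size, shard_count - 1)

shard = shard_for("device-123", 4)
```

The practical consequence: a hot partition key concentrates traffic on one shard, so key choice (per device, per user, per session) is the main throughput lever.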
EMR provides managed big data runtimes while OpenSearch supports indexed search, observability, and log analytics workloads.
Amazon QuickSight is a managed business intelligence service for dashboards, embedded analytics, and lightweight self-service reporting on top of AWS and external data sources.
AWS splits AI and ML workflows across two major platforms: SageMaker for building, training, tuning, and operating custom machine learning systems, and Bedrock for consuming foundation models through managed generative AI APIs.
SageMaker is AWS's end-to-end managed machine learning platform for data preparation, notebook-based experimentation, training jobs, hyperparameter tuning, model registry workflows, hosted inference, and MLOps automation.
Bedrock is AWS's managed generative AI platform for accessing foundation models from multiple providers through a unified API. It adds guardrails, knowledge bases, agents, evaluation workflows, and enterprise security controls without requiring teams to manage GPU infrastructure directly.
The choice depends on whether you are building custom predictive models and ML infrastructure or consuming managed foundation models for generative AI experiences.
Beyond SageMaker and Bedrock, AWS offers targeted AI services that solve specific workloads without requiring a full ML platform buildout.
These services connect distributed systems through messaging, event buses, and workflow orchestration so applications can scale beyond synchronous request chains.
Core messaging primitives for decoupling systems, buffering spikes, broadcasting notifications, and building resilient asynchronous architectures.
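SQS standard queues, for instance, deliver messages at least once, so consumers are typically written to be idempotent: track which message ids have been handled and skip redeliveries. A sketch with an in-memory seen-set (in practice the set lives in DynamoDB or a cache so it survives restarts):

```python
processed_ids = set()
results = []

def handle(message: dict) -> None:
    """Process a message at most once despite redelivery."""
    msg_id = message["id"]
    if msg_id in processed_ids:
        return                      # duplicate delivery: skip
    processed_ids.add(msg_id)
    results.append(message["body"])  # stand-in for real side effects

# Simulated deliveries, including one redelivered duplicate:
for m in [{"id": "1", "body": "a"},
          {"id": "2", "body": "b"},
          {"id": "1", "body": "a"}]:
    handle(m)
```

Designing handlers this way is what makes at-least-once delivery safe to build on.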
Workflow and event routing services help model business processes, connect AWS services, and coordinate retries, branching, and scheduled automations.
Managed brokers and data movement tools help teams integrate with legacy messaging protocols and external SaaS systems when modern event buses are not enough.
In practice, AWS application integration often combines API Gateway, Lambda, queues, and event routing to separate user traffic from backend processing.
This layer covers provisioning, monitoring, deployments, systems management, and operational controls for running AWS estates at scale.
Operational telemetry services for metrics, logs, dashboards, alarms, and distributed tracing across applications and infrastructure.
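CloudWatch can also extract metrics directly from structured log lines via its Embedded Metric Format: a JSON record whose `_aws` block tells CloudWatch which fields to publish as metrics. A sketch of one EMF record (the namespace, dimension, and metric names are placeholders):

```python
import json
import time

def emf_record(service: str, latency_ms: float) -> str:
    """One Embedded Metric Format log line: CloudWatch reads the
    `_aws` metadata and publishes LatencyMs under the Service dimension."""
    return json.dumps({
        "_aws": {
            "Timestamp": int(time.time() * 1000),  # epoch milliseconds
            "CloudWatchMetrics": [{
                "Namespace": "ExampleApp",
                "Dimensions": [["Service"]],
                "Metrics": [{"Name": "LatencyMs", "Unit": "Milliseconds"}],
            }],
        },
        # The actual values referenced by the metadata above:
        "Service": service,
        "LatencyMs": latency_ms,
    })

line = emf_record("checkout", 12.5)
```

The appeal is that one `print`-style log line yields both a searchable log event and a time-series metric, with no separate metrics API call.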
Infrastructure as code tools define AWS resources declaratively so environments can be recreated, versioned, and reviewed like application code.
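The smallest useful illustration is a CloudFormation template declaring a single resource. A sketch built as JSON (the logical id `ArtifactBucket` is arbitrary; `2010-09-09` is the fixed template format version):

```python
import json

template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Minimal example: one versioned S3 bucket.",
    "Resources": {
        "ArtifactBucket": {  # logical id, referenced elsewhere in the stack
            "Type": "AWS::S3::Bucket",
            "Properties": {
                "VersioningConfiguration": {"Status": "Enabled"}
            },
        }
    },
}
template_json = json.dumps(template, indent=2)
```

Checked into version control, a file like this is what lets an environment be diffed, reviewed, and recreated like application code.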
Native CI/CD services automate build pipelines, artifact packaging, test stages, and application deployment into AWS targets.
Operational tooling for patching, remote commands, inventory, parameter storage, and high-level optimization recommendations.