
Step 3: Scale Out IISExample

Important

The AWS OpsWorks Stacks service reached end of life on May 26, 2024 and has been disabled for both new and existing customers. We strongly recommend customers migrate their workloads to other solutions as soon as possible. If you have questions about migration, reach out to the AWS Support Team on AWS re:Post or through AWS Premium Support.

If your incoming user requests start to approach the limit of what you can handle with a single t2.micro instance, you will need to increase your server capacity. You can move to a larger instance type, but vertical scaling has its limits. A more flexible approach is to add instances to your stack and put them behind a load balancer. The basic architecture looks something like the following.

Diagram: OpsWorks stack architecture with a load balancer, Windows instances, and external repositories.

Among other advantages, this approach is much more robust than a single large instance.

  • If one of your instances fails, the load balancer will distribute incoming requests to the remaining instances, and your application will continue to function.

  • If you put instances in different Availability Zones (the recommended practice), your application will continue to function even if an Availability Zone encounters problems. The sketch after this list shows one way to create an instance in a different Availability Zone.
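
For reference, here is a minimal sketch, using the AWS SDK for Python (boto3), of one way to create and start a second 24/7 instance in a different Availability Zone. The region, stack ID, layer ID, and Availability Zone below are placeholder values; because the service has reached end of life, the calls are shown for illustration only.

    import boto3

    # Placeholder IDs -- replace with the IDs of your own stack and layer.
    STACK_ID = "REPLACE_WITH_STACK_ID"
    LAYER_ID = "REPLACE_WITH_LAYER_ID"

    # AWS OpsWorks Stacks is a regional service; us-east-1 is assumed here.
    opsworks = boto3.client("opsworks", region_name="us-east-1")

    # Create a second 24/7 instance in a different Availability Zone
    # than the first instance.
    response = opsworks.create_instance(
        StackId=STACK_ID,
        LayerIds=[LAYER_ID],
        InstanceType="t2.micro",
        AvailabilityZone="us-east-1b",
    )
    instance_id = response["InstanceId"]

    # 24/7 instances run until you stop them, so start the new instance now.
    opsworks.start_instance(InstanceId=instance_id)
    print(f"Started instance {instance_id}")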

AWS OpsWorks Stacks makes it easy to scale out stacks. This section describes the basics of how to scale out a stack by adding a second 24/7 instance to IISExample and putting both instances behind an Elastic Load Balancing load balancer. You can easily extend the procedure to add an arbitrary number of 24/7 instances, or you can use time-based instances to have AWS OpsWorks Stacks scale your stack automatically. For more information, see Managing load with time-based and load-based instances.
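
Similarly, the sketch below shows how the load-balancer and time-based pieces might look with boto3. The load balancer name, layer ID, and instance ID are placeholders; the load balancer must be an existing Classic Load Balancer in the same region as the stack, and the time-based schedule applies to an instance created with AutoScalingType="timer". Again, this is for illustration only.

    import boto3

    # Placeholder names and IDs -- substitute your own values.
    LAYER_ID = "REPLACE_WITH_LAYER_ID"
    ELB_NAME = "REPLACE_WITH_LOAD_BALANCER_NAME"  # existing Classic Load Balancer
    TIMER_INSTANCE_ID = "REPLACE_WITH_TIME_BASED_INSTANCE_ID"

    opsworks = boto3.client("opsworks", region_name="us-east-1")

    # Attach the load balancer to the layer; AWS OpsWorks Stacks then registers
    # the layer's online instances with it automatically.
    opsworks.attach_elastic_load_balancer(
        ElasticLoadBalancerName=ELB_NAME,
        LayerId=LAYER_ID,
    )

    # For a time-based instance, define the hours it should run --
    # here, weekdays from 09:00 through 17:59.
    business_hours = {str(hour): "on" for hour in range(9, 18)}
    opsworks.set_time_based_auto_scaling(
        InstanceId=TIMER_INSTANCE_ID,
        AutoScalingSchedule={
            "Monday": business_hours,
            "Tuesday": business_hours,
            "Wednesday": business_hours,
            "Thursday": business_hours,
            "Friday": business_hours,
        },
    )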
