Galaxy

Containers

Understand container status, monitor resources, and scale your app to handle growth.

Ever wonder what's actually running when your app handles traffic? Each container is a separate instance of your application working independently.

This guide shows you how to monitor containers, understand what the metrics mean, and scale when traffic grows.

You're in Control

Galaxy shows you exactly what's happening with your app in real time. No guesswork. Just data.


Understanding Containers

A container is a running copy of your app. Small apps might have one. High-traffic apps might have several running in parallel, with Galaxy distributing traffic across them all.

Think of containers like individual servers. Each one runs your code independently. If you have three containers running, Galaxy is essentially running your app three times at once, spreading user traffic across all three.
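
To see what "running independently" means in practice, here's a minimal sketch in plain Node-style TypeScript (illustrative only, not Galaxy-specific code): any state kept in process memory, like the counter below, exists separately in every container.

```typescript
// Each container runs its own copy of this process, so `requestCount`
// below is per-container, not shared across your whole app.
import * as http from "node:http";

let requestCount = 0; // lives only in this container's memory

const server = http.createServer((req, res) => {
  requestCount += 1;
  // With three containers there are three independent counters; users hitting
  // different containers see different numbers. Anything that must be shared
  // belongs in a database or cache, not in process memory.
  res.end(`Requests handled by this container: ${requestCount}\n`);
});

server.listen(Number(process.env.PORT ?? 3000));
```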


Reading the Container Display

At the top of the Containers section, you'll see resource usage information for all containers combined.

gCPU: Total CPU power being used across all containers. This helps you understand how hard your app is working right now.

Memory: Total RAM allocation. Shows how much memory all your containers are consuming.

Current Plan: Your hosting plan (Free, Essentials, Professional, or Business).

Replica Count: Something like "2/3 REPLICAS". The first number is how many containers are running right now. The second number is how many should be running based on your configuration. If these don't match, something might be scaling up or down.


Container Status

Each container displays its current status with a badge. Watch these statuses change when you deploy or scale your app.

🟡 Pending: Container is queued to start but hasn't launched yet.

🔵 Creating: Container is starting up for the first time or after a scale event.

🟢 Running: Container is healthy and handling requests.

🟠 Terminating: Container is being shut down (usually during a scale-down or deployment).

Stopped: Container is stopped or suspended.

🟡 Waiting: Container is in a generic waiting state.

🔴 Failed: Container encountered an error state (CrashLoop, ImagePull, etc.).

🟣 Succeeded: Container finished its task (usually for one-time jobs).

When you deploy new code or adjust container count, you'll watch these statuses change in real time. Containers in the Creating state take a moment to spin up; containers in the Terminating state shut down gracefully without dropping traffic.
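
Part of that graceful shutdown is up to your app. Containerized platforms generally send a SIGTERM signal before stopping a process (a standard container convention, assumed here rather than documented on this page), so a handler like this plain Node-style TypeScript sketch lets in-flight requests finish before the process exits.

```typescript
// Sketch: finish in-flight requests when the container is asked to stop.
// Assumes the platform sends SIGTERM before shutdown, which is standard
// for containerized environments.
import * as http from "node:http";

const server = http.createServer((req, res) => {
  res.end("ok\n");
});

server.listen(Number(process.env.PORT ?? 3000));

process.on("SIGTERM", () => {
  console.log("SIGTERM received, draining connections...");
  // Stop accepting new connections; responses already in progress finish first.
  server.close(() => {
    console.log("All connections closed, exiting.");
    process.exit(0);
  });
});
```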


Real-Time Metrics

Each container shows metrics that update in real time.

CPU Usage: How much processing power this container is using right now.

Memory Usage: How much RAM this container is consuming.

Uptime: How long this container has been running since it started.

Network: Data in and out.

These metrics help you understand how your app is performing at the container level. See one container consistently using way more CPU than others? That might indicate a load balancing issue. One container spiking in memory usage? That could signal a memory leak.

Watch container metrics when you're testing under load. You'll understand how your app behaves when traffic increases.
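
If you want traffic to watch, a tiny load generator is enough. The sketch below uses Node 18+'s built-in fetch to send batches of concurrent requests; the URL and request counts are placeholders you'd replace with your own values.

```typescript
// Sketch: generate some concurrent traffic so you can watch per-container
// CPU, memory, and network metrics react. Requires Node 18+ for built-in fetch.
// APP_URL, CONCURRENCY, and ROUNDS are placeholders, not Galaxy values.
const APP_URL = process.env.APP_URL ?? "https://your-app.example.com/";
const CONCURRENCY = 20;
const ROUNDS = 50;

async function hitOnce(): Promise<number> {
  const res = await fetch(APP_URL);
  await res.arrayBuffer(); // read the body so the request fully completes
  return res.status;
}

async function main() {
  for (let round = 0; round < ROUNDS; round++) {
    // Fire CONCURRENCY requests in parallel, then wait for all of them.
    const results = await Promise.allSettled(
      Array.from({ length: CONCURRENCY }, () => hitOnce())
    );
    const ok = results.filter(
      (r) => r.status === "fulfilled" && r.value < 500
    ).length;
    console.log(`round ${round + 1}: ${ok}/${CONCURRENCY} succeeded`);
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```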


What Containers Tell You

The Containers view is most useful when something's not quite right. Here's what to look for.

All containers running smoothly? Everything's fine. Your app is healthy.

One container stuck in "Creating" state? Give it a moment. It takes a few seconds to spin up. If it never finishes, check Logs to see what's blocking startup.

Replica count mismatch (like "1/3 REPLICAS")? You're probably in the middle of a scale event or deployment. Wait a moment and check again. If it doesn't correct itself, there might be an issue preventing containers from starting.

Uneven resource usage across containers? This is normal for stateless apps (traffic distribution is based on many factors). If usage is drastically different, it might indicate a slow container that needs investigation.


Scaling Containers

You can't directly add or remove containers from this page. Instead, head to Plans to adjust how many containers you want running.

This page is just for monitoring what's happening right now. It shows you the real-time picture of your containers in action.

Need more containers? Go to Plans and increase your container count.

Need fewer? Decrease the count.

Galaxy handles the rest.

Scaling is Non-Disruptive

Changing container count doesn't require downtime. Galaxy adds or removes containers gracefully while your app keeps serving traffic.

Free Plan Limitation

Apps on the Free plan are limited to a single container and cannot be scaled. To add more containers or change container size, upgrade your plan from the Plans page.


Troubleshooting

Containers Keep Restarting?

Check Logs. Repeated restarts usually mean your app is crashing, and the logs will tell you why.
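
Logs are only as useful as what your process writes to them. Assuming, as on most container platforms, that the Logs view captures your process's stdout and stderr, a small amount of error handling like this plain Node-style TypeScript sketch (not Galaxy-specific) makes the crash reason show up before the container exits.

```typescript
// Sketch: make crash causes visible in the container logs.
// Log the error to stderr before exiting instead of dying silently.
process.on("uncaughtException", (err) => {
  console.error("Uncaught exception, container will restart:", err);
  process.exit(1);
});

process.on("unhandledRejection", (reason) => {
  console.error("Unhandled promise rejection:", reason);
  process.exit(1);
});

async function start() {
  // ... connect to your database, start your HTTP server, etc.
}

start().catch((err) => {
  console.error("Startup failed:", err);
  process.exit(1);
});
```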

High CPU or memory usage? This might indicate an inefficient query, memory leak, or a need to scale up to larger containers. Check your app code and consider upgrading container size from Plans.

One container consistently lagging? Sometimes individual containers can get into a bad state. Try restarting your app from the Overview page. This usually clears up transient issues.


Container Count: Single vs Multiple

Most apps start with one container. But you have options.

Single container: Simple, cheaper. Fine for low to moderate traffic. If that one container crashes, your app goes down until it restarts.

Multiple containers: More resilient. Galaxy distributes traffic across all containers. If one crashes, others keep handling requests. Better performance under load.

For production apps, at least two containers are recommended; for mission-critical apps, three or more. This gives you redundancy and better performance.


High-Availability

If you're running three or more containers sized Standard (1GB) or larger, high-availability is enabled automatically. This distributes your containers across different availability zones.

See the Overview page for more details on high-availability.


Changing Your Plan or Container Size

Ready to scale? It's simple.

  1. Open your app and navigate to Plans
  2. Select your new plan (if changing tiers)
  3. Pick your new container size
  4. Adjust container count if needed
  5. Click Apply

Galaxy applies the change and triggers your next deployment. Your app doesn't go down; the change takes effect as your containers restart during that deployment.

Scaling is Instant

Want to test a bigger container? Change it here and test. If it doesn't help, change back. No harm in experimenting.


Downscaling to Save Money

Running more containers than you need? Or paying for bigger containers than necessary?

Downscaling is just as easy as scaling up.

  1. Go to Plans
  2. Reduce container size or count
  3. Click Apply

Galaxy applies the change and triggers a new deployment. Your monthly bill decreases immediately.

For temporary cost savings, you can also Stop your app from the Overview page. Stopped apps don't cost anything, but users can't access them until you restart.


Different Plans for Different App Types

Different app types (Meteor, Node.js, Python, AdonisJS) have different plan names and container options. The concept is the same: pick a tier, choose a size. But specific options vary by runtime.

Check our billing documentation for details specific to what you're running.


Cost Estimation

The Plans page shows hourly and monthly costs. Monthly estimates assume 24/7 running (720 hours in a month).

Don't worry if your app doesn't run 24/7. Galaxy charges by the minute, so you only pay for time your app actually runs.

Running only during business hours? That's roughly 8/24 of the daily cost.

Running only weekdays? That's roughly 5/7 of the weekly cost.

Adjust the estimates in your head to match your actual usage pattern.
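
If you'd rather not do that math in your head, here's a small sketch of the same arithmetic; the hourly rate below is a made-up placeholder, not a Galaxy price, so take the real number from your Plans page.

```typescript
// Sketch: rough monthly cost from an hourly rate and a usage pattern.
// HOURLY_RATE is a placeholder; use the rate shown on your Plans page.
const HOURLY_RATE = 0.04; // $/hour per container (example value)
const CONTAINERS = 2;

const HOURS_PER_MONTH = 720; // 24/7, as the Plans page assumes

// Full-time estimate.
const fullTime = HOURLY_RATE * CONTAINERS * HOURS_PER_MONTH;

// Business hours only (~8 of 24 hours each day).
const businessHours = fullTime * (8 / 24);

// Weekdays only (~5 of 7 days each week).
const weekdaysOnly = fullTime * (5 / 7);

console.log(`24/7:           $${fullTime.toFixed(2)}/month`);
console.log(`Business hours: $${businessHours.toFixed(2)}/month`);
console.log(`Weekdays only:  $${weekdaysOnly.toFixed(2)}/month`);
```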


Quick Reference

Containers are instances: Each one runs your app independently.

Monitor in real-time: Watch CPU, memory, and network metrics as they happen.

Scale from Plans page: Add or remove containers, or upgrade container size.

High-availability: Automatically enabled when you run three or more containers sized Standard (1GB) or larger.

Scaling is non-disruptive: Your app keeps serving traffic during changes.

Check Logs if containers restart: The logs show what's causing the issue.

Free plan is single-container only: Upgrade to scale beyond one container.


What's Next?

Your containers are running. Your app is live. Your users are happy. Now you've got the tools to keep it that way as your traffic grows.