
RAM consumption of self-hosted sentry hovering at 22GB after upgrade #3467

Open
1 task done
itsafire opened this issue Dec 12, 2024 · 2 comments

Comments

@itsafire

Self-Hosted Version

24.11.1

CPU Architecture

x86_64

Docker Version

24.0.6

Docker Compose Version

2.28.1

Machine Specification

  • My system meets the minimum system requirements of Sentry

Steps to Reproduce

We just upgraded from 23.6.2 to 24.11.1. On 23.6.2 we were able to run Sentry comfortably on a 16GB machine, as advised in the documentation. Since the upgrade we were forced to migrate to a machine with 32GB of RAM, because memory usage is now around 22GB. There are now a lot of "small" Sentry (23) and Snuba (20) containers, each consuming between 200MB and 300MB of RAM.
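
To quantify this, a per-container snapshot can be taken with `docker stats`; a minimal sketch, assuming a standard docker compose deployment on a single host (the sort is only approximate for human-readable sizes):

```bash
# One-off snapshot of memory usage per container, roughly sorted largest first
docker stats --no-stream --format "{{.MemUsage}}\t{{.Name}}" | sort -rh
```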

Expected Result

Since the documentation states 16GB of RAM as the minimum requirement, we expected version 24.11.1 to work with this configuration.

Actual Result

Various containers of our updated Sentry were OOM-killed with 16GB of RAM.

A clean install of Sentry 24.11.1 without any projects results in 13GB of RAM usage.

A clean install of Sentry 23.6.2 with no projects resulted in 8GB of RAM usage.

Event ID

No response

@aldy505
Collaborator

aldy505 commented Dec 17, 2024

Do you have high event ingestion throughput? The number of projects shouldn't be an issue; what could be the issue is the amount of event throughput.

Can you specify which containers ran out of memory?
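
One way to identify them (a sketch, assuming the containers are still present in `docker ps -a` on the host) is to check Docker's per-container `OOMKilled` flag:

```bash
# Print every container that the kernel OOM killer has terminated
for c in $(docker ps -a --format '{{.Names}}'); do
  if [ "$(docker inspect -f '{{.State.OOMKilled}}' "$c")" = "true" ]; then
    echo "$c"
  fi
done
```

The host's kernel log (`dmesg | grep -i oom`) shows the same kills from the kernel's side.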

@itsafire
Author

itsafire commented Dec 19, 2024

We have 3.4 million transactions in 30 days. I will scale back to 16GB over the weekend to see which containers are crashing. But since system memory is completely exhausted (50MB free), OOM kills are probably hitting random containers.
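
For context, 3.4 million transactions over 30 days works out to roughly 3,400,000 / (30 × 86,400) ≈ 1.3 transactions per second on average, assuming ingestion is spread evenly over the day.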

@getsantry getsantry bot moved this from Waiting for: Community to Waiting for: Product Owner in GitHub Issues with 👀 3 Dec 19, 2024