this post was submitted on 21 Oct 2023

I have a very simple setup running Gitea, which I love. However, I enabled Elasticsearch because it makes searching much faster than the default method.

I have a VPS with 16GB of memory. The only things running on it are Nginx, PHP, MySQL, Docker, and a few other things. I very rarely hit over 6GB of usage.

The issue comes when I enable Elasticsearch. As soon as I start it up, it wipes me out at 15.7GB of usage out of 16GB.

I searched online and found out about /etc/elasticsearch/jvm.options.d/jvm.options and adding

-Xmx<size>g
-Xms<size>g
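For example, to pin the heap to 2GB (just a starting point, not a recommendation), the file could contain:

```
-Xms2g
-Xmx2g
```

Setting -Xms and -Xmx to the same value keeps the JVM from resizing the heap at runtime.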

The question is, what should this amount be? I read that by default Elastic uses 50% of system memory, but when I started it up, it was wiping out my memory and making the system almost have a stroke.

But setting it to 2GB seems to make the Gitea website less responsive, sometimes even timing it out.

So I'm not sure what "range" I should be using here, or if I'm going to have to upgrade my VPS to 32GB to run this properly.
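One way to pick a starting value is to give the heap a fixed fraction of total RAM, say a quarter on a box shared with other services (that fraction is my own guess for this situation, not an official Elasticsearch recommendation):

```shell
#!/bin/sh
# Compute ~25% of total RAM (in whole GB) as a candidate -Xms/-Xmx value.
# Assumes Linux (/proc/meminfo); the 25% fraction is an assumption for a
# shared box, not an Elasticsearch default.
total_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
heap_gb=$(( total_kb / 1024 / 1024 / 4 ))
[ "$heap_gb" -lt 1 ] && heap_gb=1   # never drop below 1GB
echo "-Xms${heap_gb}g"
echo "-Xmx${heap_gb}g"
```

On a 16GB VPS this suggests 4GB, which splits the difference between the 2GB that made Gitea sluggish and the default that ate everything.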

[–] [email protected] 1 points 11 months ago (1 children)

Thanks, I saw the last link when I first set this up, but not the first two. I'll go through them and see if I can find the sweet spot.

It's hard to tell. I'm the only user of my Gitea site, which is pretty much a personal GitHub. However, from what I've read, even with only one or two users, Elastic's memory usage depends heavily on how much code it has to index. When you search for something, Elastic has to go through all of that indexed code.

So from what I understand, the more code you have in your repos, the harder Elastic has to work, which makes figuring out the right memory a bit of a guessing game.

[–] [email protected] 1 points 11 months ago

I haven't had first-hand experience with Gitea, but some fine-tuning might ease the memory usage. What backend have you deployed? You can make some config adjustments to it. If you're memory-constrained, swappiness could be tuned, and any monitoring could be disabled or kept to a bare minimum. I read somewhere that pprof (https://github.com/google/pprof) is useful for getting visual insight into memory usage, though I haven't used it.
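On the backend point: if the Elasticsearch heap can't be made to fit, Gitea also ships a built-in bleve indexer, which is lighter though slower. A sketch of the relevant app.ini section (option names taken from Gitea's config cheat sheet, so double-check them against your version):

```ini
[indexer]
; use Gitea's embedded bleve indexer instead of Elasticsearch
REPO_INDEXER_ENABLED = true
REPO_INDEXER_TYPE = bleve
; or keep Elasticsearch and point Gitea at it:
; REPO_INDEXER_TYPE = elasticsearch
; REPO_INDEXER_CONN_STR = http://localhost:9200
```

Switching the repo indexer to bleve would let you drop Elasticsearch entirely and reclaim that memory, at the cost of slower searches on large repos.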