This is an automated archive.
The original was posted on /r/sysadmin by /u/Comfortable_Onion318 on 2024-01-23 09:59:59+00:00.
Hello,
we have an application server running software that is used heavily within our company, and even customers use it for different purposes. The RAM usage of this server sits constantly at 97%, 24/7. Does that make sense from a technical standpoint? If I were planning the integration of a system, I would usually make sure it has more than enough resources so that RAM usage, for example, stays below about 70% on average. That is how I "learned" it in school. Now, this server is not hosted by us but by a cloud service, and the provided resources are probably virtualized anyway, but still, is this "normal" or expected design?
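What I was thinking of doing first is checking whether that 97% is memory the application actually holds on to, or mostly OS file cache that would be released under pressure. This is just a rough sketch of how I'd check (assuming the server runs Linux and I can install psutil; both of those are my assumptions, not something I've confirmed):

```python
# Rough sketch (my assumption: Linux VM, psutil installed) to see whether the
# 97% is resident application memory or mostly file cache the kernel would
# give back under pressure.
import psutil

vm = psutil.virtual_memory()
swap = psutil.swap_memory()

gib = 1024 ** 3
print(f"total:     {vm.total / gib:6.1f} GiB")
print(f"used:      {vm.used / gib:6.1f} GiB ({vm.percent}%)")
print(f"available: {vm.available / gib:6.1f} GiB")   # what apps could still claim
print(f"cached:    {getattr(vm, 'cached', 0) / gib:6.1f} GiB")  # Linux-only field
print(f"swap used: {swap.used / gib:6.1f} GiB")       # steady swapping would worry me
```

My thinking: if "available" is still large and swap stays near zero, the high percentage is probably just cache and not really a problem; if "available" is tiny and swap keeps growing, that would be my argument for more RAM.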
I know this question sounds dumb, and it's probably not how it should be, but maybe I'm just making a big deal out of it and it isn't as problematic as I make it out to be. My goal would be to increase the resources, but I also want good reasoning or arguments to give my employer for why I would want to do that. None of the other IT staff has ANY knowledge regarding this stuff.