Ajay Chowdary Kandula asked:
Bump up the maximum user limit in Linux above the default 1024
Bumping the maximum user limit in Linux above the default 1024 causes issues that bring down application throughput:
Gateway timeout issues
Maximum process count increased
Issue observed during performance testing
Provide your exact OS info, like a dump of /etc/os-release.
Also describe exactly how you increased the maximum user limit, whether by editing a file under /etc or by some other means.
Then, as arnold suggested, describe what "application" means here, and also what "application throughput" means.
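The information requested above can be gathered in a few commands. This is a minimal sketch; run it in a shell on the affected server (as, or via su to, the user the application runs under, since limits are per-user):

```shell
# Exact distribution and version, as asked for above
cat /etc/os-release

# Limits in effect for the current shell session
ulimit -n    # max open file descriptors (the "1024" in question)
ulimit -u    # max user processes
```

Note that `ulimit` reports the limits of the shell you run it in, which may differ from what a daemon started at boot actually received.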
ASKER
@arnold - It's a physical server. Application: APIGEE.
@David - I can't share the logs due to the confidential nature of the job.
Increased the limit by editing limits.conf.
Throughput: the app starts throwing timeout errors, which degrades the number of requests processed successfully.
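For reference, a limits.conf change of the kind described usually looks like the sketch below. The user name "apigee" is an assumption; substitute the account the gateway actually runs as, and pick values appropriate to your load:

```
# /etc/security/limits.conf (sketch; "apigee" is an assumed service account)
# <domain>  <type>  <item>   <value>
apigee      soft    nofile   65536
apigee      hard    nofile   65536
apigee      soft    nproc    8192
apigee      hard    nproc    8192
```

One common pitfall: limits.conf is applied by pam_limits at session login, so it does not affect already-running processes, and it may be bypassed entirely for services managed by systemd, which instead take `LimitNOFILE=` in the unit file's `[Service]` section. If the timeouts persisted after the edit, this is worth checking.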
What are the specs of the server, and what is the demand on the application?
Is this the sole application running on the system?
I think it is better to include all the info: the server specs, the number of expected users on whose basis the server was spec'ed, and the actual utilization.
From your comment, it sounds under-spec'ed for what it serves and what you expect.
Presumably the need to exceed 1024 files is because the user under whose credentials the application runs hits this limit, perhaps because of backlog. 1024 is the limit on open file handles at one time per process; the number of processes that can be spawned is also set/limited.
Your approach might be addressing the wrong issue/cause.
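To check whether the process is in fact exhausting its file-handle limit, as suggested above, you can inspect the running process directly. A sketch, assuming the gateway process can be found by name ("apigee" here is a placeholder):

```shell
# Find the oldest matching process; fall back to the current shell
# so the commands below still demonstrate the output format.
PID=$(pgrep -o apigee || echo $$)

# Limits the *running* process actually received (limits.conf edits
# only apply to sessions started after the change)
grep -E 'open files|processes' /proc/$PID/limits

# File descriptors currently in use by that process
ls /proc/$PID/fd | wc -l
```

If the fd count is close to the "Max open files" soft limit, the 1024 ceiling is the real bottleneck; if it is nowhere near, the timeouts likely have another cause, as arnold suggests.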
If the data is confidential, it is likely best to hire someone to assist you.
Fixing this type of problem tends to be simple and fast if all the data is available.
ASKER
I will share as much data as I can.
I will get it to you as soon as possible.
You are not providing details: what application, and which user? Apache, httpd, nobody?