ulimits and my limits
- May 24
- Posted by TKH Specialist
- Posted in Analysis, java, linux, Oracle, System Administration, System Resources
Even Linux systems have their limits! One of the systems I manage started logging a “Too many open files” error in /var/log/messages. No error is good, but this one is fixable: a little tuning and it goes away. This particular server runs the Oracle Client and a Java application, two very hungry pieces of software. The first step is figuring out where the problem comes from.
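Before tuning anything, it helps to see where the limits currently stand. A quick sketch using the standard /proc interfaces (paths are standard on modern Linux, but worth double-checking on your distribution):

```shell
# Per-process open-file limit for the current shell
ulimit -n

# System-wide ceiling on open file handles
cat /proc/sys/fs/file-max

# Current usage: allocated handles, unused handles, maximum
cat /proc/sys/fs/file-nr
```

If the first number in file-nr is creeping toward file-max, the whole system is at risk, not just one process.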
You can use lsof to see who is opening so many files. lsof is a powerful tool that usually drowns you in output, but with a little help from standard Linux utilities we can extract some useful data. Here is a count of open files by process:
$ lsof | awk '{ print $2 " " $1; }' | sort -rn | uniq -c | sort -rn | head -20
  78232 29634 java
    144 2369 splunkd
    124 2285 virt-who
     75 2593 sshfs
     68 1024 python
     43 2590 ssh
     17 4341 lsof
     16 1032 tuned
     14 4152 bash
     12 4347 lsof
     12 4342 awk
     11 4345 sort
     11 4343 sort
      9 4346 head
      9 4344 uniq
      8 688 dbus-daem
      8 687 JS
      8 656 auditd
      8 1025 virt-who
      4 9 rcu_sched
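One caveat: lsof also lists entries that are not open file descriptors (the current directory, memory-mapped libraries, per-thread duplicates), so its totals run higher than the real descriptor count. The process's actual descriptor table lives in /proc/<pid>/fd, which makes for a quick cross-check. A sketch, using the current shell's PID as a stand-in for the Java PID from the listing above:

```shell
# Substitute the PID you care about, e.g. 29634 for the Java process above;
# $$ (the current shell) is used here only so the example runs anywhere.
pid=$$

# Count the entries in the process's file descriptor table
ls /proc/"$pid"/fd | wc -l
```

If this number is close to the process's `ulimit -n` ceiling, that process is your culprit.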
The Oracle Client is called from within Java, so it doesn't show up separately in the listing, but we still have to account for its file handles when we tune the kernel.
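When the ceiling really is too low, both the system-wide and per-user limits usually need raising. A sketch of the tuning, run as root; the value 500000 and the user name "oracle" are illustrative assumptions, not recommendations for your workload:

```shell
# Raise the system-wide file handle ceiling for the running kernel
sysctl -w fs.file-max=500000

# Persist the kernel setting across reboots
echo 'fs.file-max = 500000' >> /etc/sysctl.conf

# Raise the per-process limit for the account running the Java app
# (append to /etc/security/limits.conf; takes effect at next login):
#   oracle  soft  nofile  65536
#   oracle  hard  nofile  65536
```

Remember that a long-running process picks up the new nofile limit only when it is restarted under the new settings.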