This is caused by a strange occurrence often dubbed ``feeping creaturism''. Larry is always adding one more feature, always getting Perl to handle one more problem. Hence, it keeps growing. Once you've worked with perl long enough, you will probably start to do the same thing, and you'll notice the problem as your own scripts become larger and larger.
Oh, wait... you meant a currently running program and its stack size. Mea culpa, I misunderstood you. ;) While there may be a real memory leak in the Perl source code or even whichever malloc() you're using, common causes are incomplete eval()s or local()s in loops.
An eval() which terminates in error because the parse failed will leave a bit of memory unusable.
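For instance, here's a minimal sketch of the kind of loop that can slowly nibble away memory on such perls (the string being eval()ed is just an arbitrary syntax error):

for (1..1000) {
    eval '1 +';    # parse fails, $@ is set, and a little memory is lost each time
}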
A local() inside a loop:
for (1..100) { local(@array); }
will build up 100 versions of @array before the loop is done. The work-around is:
local(@array); for (1..100) { undef @array; }
This local array behaviour has been fixed for perl5, but a failed eval() still leaks.
One other possibility stems from the way reference counting works: you've introduced a circularity into a data structure that would normally go out of scope and become unreachable. For example:
sub oops { my $x; $x = \$x; }
When $x goes out of scope, the memory can't be reclaimed, because there's still something pointing to $x (itself, in this case). A full garbage collection system could solve this, but at the cost of a great deal of complexity in perl itself and some inevitable performance problems as well. If you're making a circular data structure that you want freed eventually, you'll have to break the self-reference links yourself.
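As a sketch of breaking such a link by hand (the subroutine name is made up for illustration):

sub no_oops {
    my $x;
    $x = \$x;      # $x now refers to itself
    # ... use the structure ...
    undef $x;      # drop the self-reference so the count can reach zero
}

With the self-reference gone, the scalar's reference count falls to zero when $x goes out of scope, and the memory is reclaimed as usual.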