PGF90 - Maximum # of Open Files???

Hi All - I’m trying to write out 60000 files (one for each horizontal point of a 2D gridded data set). To speed up the I/O, I would like to keep these files open rather than repeatedly opening, appending, and closing them. It looks like pgf90 is limiting me to around 1000 files open at once. Is there a workaround for this?
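For reference, here is a stripped-down sketch of the pattern in question; the file names, unit numbering, and IOSTAT handling below are illustrative placeholders, not the actual code.

! Open one output file per grid point, checking IOSTAT so the program
! reports where the OPEN fails instead of aborting.
program open_many
   implicit none
   integer, parameter :: npts = 60000
   integer :: i, ios
   character(len=32) :: fname

   do i = 1, npts
      write(fname,'(a,i6.6,a)') 'point_', i, '.dat'
      open(unit=100+i, file=fname, status='replace', action='write', iostat=ios)
      if (ios /= 0) then
         print *, 'OPEN failed at file', i, '(iostat =', ios, ')'
         print *, 'likely the per-process descriptor limit'
         exit
      end if
   end do
end program open_many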

Thanks much.

System Info:

[outfiles]$ uname -a
Linux 2.6.9-55.0.6.plus.c4smp #1 SMP Sat Sep 8 00:24:13 EDT 2007 x86_64 x86_64 x86_64 GNU/Linux

[outfiles]$ pgf90 -V
pgf90 6.0-5 64-bit target on x86-64 Linux
Copyright 1989-2000, The Portland Group, Inc. All Rights Reserved.
Copyright 2000-2005, STMicroelectronics, Inc. All Rights Reserved.

Hi ChuckA,

You most likely need to increase your OS file descriptor limit. Though, depending on your OS, your system may also have a hard limit that is smaller than 60000.

For example, on my SLES11 system using csh, even though I have set my limits to “unlimited”, I’m only allowed 8192 descriptors.

% limit
cputime unlimited
filesize unlimited
datasize unlimited
stacksize unlimited
coredumpsize unlimited
memoryuse unlimited
vmemoryuse unlimited
descriptors 8192
memorylocked 2097152 kbytes
maxproc 143360
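On Linux, the soft descriptor limit can usually be raised up to the hard limit from within the program itself. Below is a minimal sketch of that idea, assuming Linux x86_64 (where RLIMIT_NOFILE is 7 and rlim_t is a 64-bit integer) and a compiler with Fortran 2003 C interoperability; pgf90 6.0 may not support ISO_C_BINDING, so treat this as illustrative only.

! Sketch: raise the soft descriptor limit to the hard limit at run time.
! Assumes Linux x86_64 and ISO_C_BINDING support.
program raise_nofile
   use iso_c_binding
   implicit none

   ! Matches struct rlimit from <sys/resource.h> on x86_64 Linux.
   type, bind(c) :: rlimit_t
      integer(c_long) :: rlim_cur   ! soft limit
      integer(c_long) :: rlim_max   ! hard limit
   end type

   interface
      integer(c_int) function getrlimit(resource, rlim) bind(c, name='getrlimit')
         import :: c_int, rlimit_t
         integer(c_int), value :: resource
         type(rlimit_t)        :: rlim
      end function
      integer(c_int) function setrlimit(resource, rlim) bind(c, name='setrlimit')
         import :: c_int, rlimit_t
         integer(c_int), value :: resource
         type(rlimit_t)        :: rlim
      end function
   end interface

   integer(c_int), parameter :: RLIMIT_NOFILE = 7
   type(rlimit_t) :: rl

   if (getrlimit(RLIMIT_NOFILE, rl) /= 0) stop 'getrlimit failed'
   print *, 'soft limit:', rl%rlim_cur, '  hard limit:', rl%rlim_max

   rl%rlim_cur = rl%rlim_max          ! raise soft limit to the hard limit
   if (setrlimit(RLIMIT_NOFILE, rl) /= 0) stop 'setrlimit failed'
end program raise_nofile

Going beyond the hard limit still requires the system administrator (for example via /etc/security/limits.conf), so the program can only recover whatever headroom the hard limit allows.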

  • Mat