intrinsic pack segfaults for large (not too large) arrays

This sample code segfaults with pgfortran, but not with gfortran (for example). The problem arises in the pack function: it works with a 10-times-smaller array, but there should be no memory problem at this size either. I see some pack-related posts from the past claiming the problem was fixed.
I have pgfortran 17.10-0, and it behaves the same on both macOS and Linux. This doesn't even use the GPU; the sample code is the result of hours of debugging.

  integer :: A(5000000), B(10)
  integer :: N

  A = 0; B = 0; A(1) = 1; A(5000000) = 2
  N = count(A > 0)
  B(1:N) = pack(A, mask=(A > 0))

Hi abalogh,

I just tried your example but for good or bad, it runs fine for me.

% cat test.f90
   program foo
   integer :: A(5000000), B(10)
   integer ::N

   A=0; B=0;  A(1)=1; A(5000000)=2
   N=count(A>0)
   print *, N
   B(1:N)=pack(A, mask=(A>0))
   print *, B(1:N)
   end program foo
% pgfortran test.f90 -V17.10 -fast
% a.out
            2
            1            2

Is there more to the code?

Another possibility is that you're encountering a stack overflow. What's your stacksize? Does the problem go away if you set your stacksize to unlimited?
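For reference, on Linux and macOS the stack limit is inherited from the shell, so it can be checked and raised with the `ulimit` builtin before running the executable (a sketch; exact units and the maximum allowed value vary by shell and OS):

```shell
# Show the current stack limit (bash reports it in KB)
ulimit -s

# Try to raise it for this shell session; child processes inherit it.
# "unlimited" may be rejected on some systems, so fall back silently.
ulimit -s unlimited 2>/dev/null || true

# Confirm the (possibly new) limit, then run the program from this shell
ulimit -s
```

Note that `ulimit` only affects the current shell and its children, so it has to be run in the same session (or in the job script) that launches the program.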

-Mat

I should also mention that if you’re running on Windows, the stack size is set at link time via the “-stack” flag.

PGI$ pgfortran test.f90
PGI$ ./test.exe
Segmentation fault
PGI$ pgfortran test.f90 -stack=50000000
PGI$ ./test.exe
            2
            1            2

-Mat

It was the stacksize, thank you very much for the help!

Since it was part of a much larger code with a lot of other things, and since it worked with gfortran, I would never have thought that a single function could hit a stacksize problem. I also haven't encountered stacksize issues in more than 10 years.
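For anyone landing on this thread later: besides raising the stack limit, a common workaround is to make the large array allocatable, so it lives on the heap instead of the stack. This is a minimal sketch of the same test (standard Fortran, not verified against 17.10); note it removes the large local array from the stack, though the compiler may still create a stack temporary for the pack result:

```fortran
program foo_heap
   implicit none
   integer, allocatable :: A(:)   ! heap-allocated, so no stack pressure from A
   integer :: B(10)
   integer :: N

   allocate(A(5000000))
   A = 0; B = 0; A(1) = 1; A(5000000) = 2
   N = count(A > 0)
   B(1:N) = pack(A, mask=(A > 0))
   print *, B(1:N)
   deallocate(A)
end program foo_heap
```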