Hello,
I have a problem on a DS20 machine (4.0F). I have a C program that processes a
large amount of data. The program runs fine on an old DEC Alpha (4.0D), and it
also ran fine on a DS20 that I used a month ago. On my current DS20 machine it
always fails with a segmentation fault.
So it seems that just the configuration of my current machine is wrong. I
have 4 GB of RAM, and the program should use only 1 to 1.5 GB.
I thought the failure could be caused by too-small memory settings, but
all settings seemed OK to me.
Here are the 'proc' settings of my current machine; in brackets are the values
of the working DEC Alpha:
max proc per user            256            (64)
max threads per user         256            (256)
per proc stack size          268,435,456    (2,097,152)
max per proc stack size      536,870,912    (134,217,728)
per proc data size           8,589,934,592  (134,217,728)
max per proc data size       8,589,934,592  (9,266,765,824)
max per proc address space   8,589,934,592  (9,266,765,824)
per proc address space       8,589,934,592  (9,266,765,824)
autonice                     0              (0)
autonice time                600            (600)
autonice penalty             4              (4)
open max soft                4096           (4096)
open max hard                4096           (4096)
ncallout alloc size          8192           (8192)
round robin switch rate      0              (0)
sched min idle               0              (0)
give boost                   1              (1)
max users                    2048           (32)
task max                     16405          (277)
thread max                   32808          (552)
num wait queues              1024           (64)
Sorry if this is a strange question, but I really don't know much about
the memory setup of an Alpha machine.
Thanks for your help,
Harald
Received on Tue Apr 18 2000 - 14:15:21 NZST