Hi all ...
We recently purchased a fairly large DS20 system, 2 CPUs and 4 GB of
memory, etc.
We have some researchers who want to run some very large memory
modeling programs, and they are having difficulty exploiting the
memory of this machine ... they claim they have no problems
running these programs on other platforms. (Some of these programs
are home grown.)
I need some direction as to some of the ins/outs of running large
memory programs.
-- Are there any special cc compile options to support large memory?
-- Pros/cons of "limit datasize"
-- Pros/cons of "limit stacksize"
-- Are there any other "limit" settings that need tweaking?
First off, I'm not a C programmer ... so bear with me. The
researcher prefers statically declared arrays in the C code as
opposed to using malloc. I noted that arrays declared with a fixed
size in the code seem to use stacksize, while malloc'd arrays use
datasize memory.
I upped the system sysconfigtab with the following:
proc:
max-per-proc-stack-size = 67108864
max-per-proc-address-space = 3221225472
max-per-proc-data-size = 3221225472
How far can I go with these settings before I start causing
overall system problems?
I'd love to hear from anyone who has had experiences with running
big memory programs and can help. As always, I'll summarize back
to the group.
Thanks, Jon.
-----------------------------------------------------------------------
Jon Eidson (J.Eidson_at_tcu.edu) Information Services
Senior Technical Analyst Texas Christian University
(817) 257-6835 Fort Worth, Texas 76129
-----------------------------------------------------------------------
I know God won't give me anything I can't handle ...
I just wish he didn't trust me so much.
Received on Fri Apr 28 2000 - 21:26:41 NZST