Followup II: out of memory (Matlab)

From: Dr. Otto Titze, Kernphysik TUD, +49 6151 162916 <titze_at_ikp.tu-darmstadt.de>
Date: Fri, 13 Aug 1999 10:41:08 +0100 (CET)

Hi managers,

may I bother you again with a followup (I promise it is the last one)?
I am trying to use this chance to learn more about Unix. As a
physicist, one is always trying to get things consistent.

After unlimiting most of the process parameters, I made some
interesting observations:

- Running Matlab, its main memory usage grew to about 400 MB (our
  system has 512 MB) before it started using virtual memory. Once it
  began using virtual memory (swap space), its main memory usage
  dropped to about 190 MB and stayed there, while the virtual memory
  usage climbed to about 700 to 800 MB. (Matlab was the only
  significant user.)

  Question: Is this reasonable? For performance reasons one would
  want a large working set resident in main memory.
  (Sorry, this is VMS language...)
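  The resident-versus-virtual distinction can also be watched from
  inside a process. A minimal sketch (Python on a modern Linux/BSD,
  not the original Tru64 setup): reserve a buffer, touch its pages,
  and observe the peak resident set size grow.

```python
import resource

# Sketch: resident vs. virtual memory, observed from inside a process.
# ru_maxrss is reported in kB on Linux and in bytes on macOS/BSD.
def peak_rss():
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = peak_rss()
buf = bytearray(50 * 1024 * 1024)   # reserve ~50 MB of address space
for i in range(0, len(buf), 4096):  # touch one byte per page so the
    buf[i] = 1                      # pages actually become resident
after = peak_rss()
print("peak RSS grew by about", after - before,
      "units while virtual size grew by ~50 MB")
```

  The virtual size grows as soon as the buffer is allocated; the
  resident set only grows as pages are touched, which is roughly the
  distinction the pager was making on the workstation.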

- Even so, I did not completely solve my problem, measured against
  what should be possible with the existing hardware (Ultimate
  Workstations, 2 processors, 512 MB, 1.5 GB primary swap partition,
  swap mode deferred, i.e. no swapdefault). Our researcher could
  provide me with an even bigger data set. The stack and data
  segment parameters were set to their full size (see below).

  When Matlab was accumulating memory and reached about 810 MB
  (swapon -s showed a swap usage of about 50 to 52%), it crashed
  again with "out of memory".
  (Just before, Matlab gave an internal message: Color Table has no
  available R/W Slots and no R/O slots.)

  Question: Are there other parameters which I overlooked?
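  One candidate answer, from the limits listed below: vmemory is
  capped at 1048576 kB, i.e. 1 GB of address space per process. A
  quick arithmetic sketch (the 810 MB figure is the crash point
  above; reading the remainder as program text, shared libraries,
  stack, and allocator overhead is my interpretation, not a
  confirmed diagnosis):

```python
# Arithmetic on the per-process limits quoted below.
vmemory_kb = 1048576                     # vmemory(kbytes) from the listing
vmemory_mb = vmemory_kb / 1024           # 1 GB address-space cap
matlab_peak_mb = 810                     # where the crash occurred
headroom_mb = vmemory_mb - matlab_peak_mb
print("vmemory cap:", vmemory_mb, "MB; headroom at crash:", headroom_mb, "MB")
```

  If that reading is right, raising vmemory (and data) above 1 GB
  would be the parameter to look at, independent of the swap space
  still being only half used.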

Thanks and regards

Otto

Current Limits:
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) 1048576
stack(kbytes) 1048576
memory(kbytes) 500656
coredump(blocks) unlimited
nofiles(descriptors) 4096
vmemory(kbytes) 1048576
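For comparison, a sketch of how the same limits look from inside a
process on a modern Unix, using Python's resource module (on the
workstation itself they were set with the csh limit/unlimit builtins
or sh ulimit; the raise-to-hard step mirrors the "unlimiting"
described above):

```python
import resource

# Read the soft/hard limits corresponding to the data(kbytes) and
# stack(kbytes) lines above, then raise the soft data limit to the
# hard limit -- the "unlimiting" step.
for name, rlim in [("data", resource.RLIMIT_DATA),
                   ("stack", resource.RLIMIT_STACK)]:
    soft, hard = resource.getrlimit(rlim)
    print(name, "soft:", soft, "hard:", hard)

soft, hard = resource.getrlimit(resource.RLIMIT_DATA)
resource.setrlimit(resource.RLIMIT_DATA, (hard, hard))
```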
Received on Fri Aug 13 1999 - 08:44:26 NZST
