SUMMARY: Followup II: out of memory (Matlab)

From: Dr. Otto Titze, Kernphysik TUD, +49 6151 162916 <TITZE_at_ikp.tu-darmstadt.de>
Date: Wed, 18 Aug 1999 17:08:21 +0100 (CET)

Thanks to all who responded:

Arne Steinkamm <arne_at_steinkamm.com>
Paul Henderson <henderson_at_unx.dec.com>
"Dr. Tom Blinn, 603-884-0646" <tpb_at_doctor.zk3.dec.com>
"Don" rye_at_jwfc.acom.mil
Arno Hahma <arno_at_utu.fi>

Because I am about to go on vacation, I want to post the summary before
I leave. I received a lot of good hints and explanations of why Unix
simply works this way.

Indeed, I had forgotten to increase vm-maxvas, which was still set to a
value at which the process runs out of memory. But increasing that, too,
did not solve my problem: Matlab still crashed with "out of memory" after
getting about half of the swap space. (There is still a parameter, shown
as vmemory by ulimit, sitting at this value, but so far I do not know
what it means, which sysconfig parameter it corresponds to, or how to
change it.)
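
For reference, this is roughly how I query and change these values on
Tru64. It is only a sketch: the example value is arbitrary, and the idea
that the vmemory limit reported by ulimit corresponds to the proc
attributes per-proc-address-space / max-per-proc-address-space is my
assumption, not something I have verified.

    # Query the current maximum per-process virtual address space
    # of the vm subsystem (value in bytes):
    sysconfig -q vm vm-maxvas

    # Raise it at run time, if the attribute is dynamic on this
    # release (example value: 2 GB); to make the change permanent
    # it has to go into /etc/sysconfigtab, e.g. via sysconfigdb:
    sysconfig -r vm vm-maxvas=2147483648

    # List the proc subsystem attributes; per-proc-address-space and
    # max-per-proc-address-space are presumably what ulimit reports
    # as vmemory (my assumption):
    sysconfig -q proc

    # Compare with the soft limits the shell actually sees:
    ulimit -a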

I asked the Matlab people whether there is an internal restriction in the
application that limits it to only half of the resources, but they did
not know.

Nevertheless, all that tuning was successful in the sense that the user
can now do nearly all of his normal work, except for that particular
extreme example. After my vacation I may look into the problem again; I
will also try sys_check, as was recommended to me, to gather additional
information.
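
If I understood the recommendation correctly, something along these
lines should produce a full report; the -all option and the HTML output
are from memory, so the man page should be checked first.

    # Generate a full configuration and tuning report (run as root);
    # sys_check also flags sysconfig attributes that look too small:
    sys_check -all > /var/tmp/sys_check_report.html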

Thanks to everybody for your patience.

Regards

Otto


My original question was:

After unlimiting most of the process parameters, I made some
interesting observations:

- While Matlab was running, its main memory usage increased to about
  400 MB (our system has 512 MB) before it began to use virtual memory.
  Once it started using virtual memory (swap space), the main memory
  usage dropped to about 190 MB and stayed there, while the virtual
  memory usage went up to about 700 to 800 MB. (Matlab was the only
  significant user.)

  Question: Is this reasonable? For performance reasons one would want
  a large working set resident in main memory. (Sorry, this is VMS
  language...) A monitoring sketch follows after this list.

- Even so, I did not completely solve my problem, measured against what
  should be possible with the existing hardware (Ultimate Workstation,
  2 processors, 512 MB, 1.5 GB primary swap partition, swap mode
  deferred, i.e. no swapdefault). Our researcher could provide me with
  an even bigger data set. The parameters for the stack and data
  segment were set to their full size (see below).

  When Matlab was accumulating memory and reached about 810 MB
  (swapon -s showed a swap usage of about 50 to 52%), it crashed again
  with "out of memory". (Just before that, Matlab printed an internal
  message: Color Table has no available R/W Slots and no R/O slots.)

  Question: Are there other parameters that I overlooked?
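
For completeness, a minimal sketch of how the memory and swap
consumption can be watched while Matlab is running, using only standard
tools (swapon -s is where the percentages above come from; the rest is
just the obvious approach, nothing Matlab-specific is assumed):

    # Overall memory and paging activity, sampled every 5 seconds:
    vmstat 5

    # Virtual size (VSZ) and resident set size (RSS) of the
    # Matlab process:
    ps aux | grep -i matlab

    # Swap partitions and their current usage:
    swapon -s

    # Soft per-process limits (data, stack, vmemory) of the shell
    # that will start Matlab:
    ulimit -a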

 -------------------------------------------------------------------------
| Dr. Otto Titze, Kernphysik TU, Schlossgartenstr. 9, D-64289 Darmstadt |
| titze_at_ikp.tu-darmstadt.de   Tel: +49 (6151) 16-2916, FAX: 16-4321  |
 -------------------------------------------------------------------------