Matlab: Out of memory

From: Dr. Otto Titze, Kernphysik TUD, +49 6151 162916 <titze_at_ikp.tu-darmstadt.de>
Date: Wed, 11 Aug 1999 16:59:23 +0100 (CET)

Hi,

anyone using MATLAB on DUnix?

A user working on large data files (about 100,000 data points) often gets
the message "out of memory" from MATLAB, along with the hint that the
sysadmin should increase swap space. The system has 512 MB of memory and
about 1.5 GB of swap area (SW: Digital Unix V4.0D, Patch 4). When the user
was running MATLAB, no one else was on the system, and swapon showed 99%
of swap space available.
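For scale, a rough back-of-envelope check (assuming the data points are stored as 8-byte double-precision values, MATLAB's default numeric type) suggests the raw data itself is tiny compared to the configured limits:

```shell
# Rough estimate of the raw data size.
# Assumption: 100,000 points, each an 8-byte double (MATLAB default).
points=100000
bytes=$((points * 8))
kbytes=$((bytes / 1024))
echo "${bytes} bytes (~${kbytes} kB)"
```

So the data alone is well under a megabyte; any "out of memory" here would have to come from MATLAB's intermediate copies or from a per-process limit, not from the raw data size.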

The "out of memory" problem is apparently known at The MathWorks
(the manufacturer of MATLAB). Their hints range from "increase swap
space" and "add more memory" to "reduce the size of the data" and
"look for external constraints (limits)".

Has anyone encountered a similar problem?

The only idea I have so far is that the data seg size is too small (see
the system parameters below). But this is apparently hardcoded to
131 MBytes. May I change this with sysconfig, and how large a value is
reasonable? Or where else can I increase it?
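For reference, a sketch of the two places such a limit could be raised on a system like this one (assuming Digital Unix V4.0D and the proc subsystem attributes shown in the system parameters below; the 512 MB value is only illustrative, not a recommendation):

```shell
# Raise the per-process data segment size at run time (value in bytes).
# Assumption: the proc subsystem accepts this attribute via sysconfig -r,
# as on Digital Unix V4.0D.
/sbin/sysconfig -r proc per-proc-data-size=536870912

# To make the change survive a reboot, add a stanza to /etc/sysconfigtab:
#   proc:
#       per-proc-data-size = 536870912

# The per-process soft limit can also be raised from the shell, up to the
# hard limit; ulimit -d takes the data seg size in kbytes.
ulimit -d 524288
```

Note that the value must stay below max-per-proc-address-space, and the new limit only applies to processes started after the change.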

Sorry about these silly questions, but I am not so familiar with Unix.

Any other idea?

Thanks and regards

Otto

System Parameters:
/sbin/sysconfig -q proc
per-proc-data-size = 134217728
max-per-proc-address-space = 1073741824

/sbin/sysconfig -q vm
vm-maxvas = 1073741824

ulimit -a
virtual memory (kbytes)  1048576
data seg size (kbytes)   131072
max memory size (kbytes) 500656
stack size (kbytes)      2048

 -------------------------------------------------------------------------
| Dr. Otto Titze, Kernphysik TU, Schlossgartenstr. 9, D-64289 Darmstadt |
| titze_at_ikp.tu-darmstadt.de  Tel: +49(6151)16-2916, FAX: 16-4321      |
 -------------------------------------------------------------------------
Received on Wed Aug 11 1999 - 15:02:47 NZST