[Beowulf] NASTRAN on cluster

Roland Krause rokrau at yahoo.com
Tue Apr 12 09:32:02 PDT 2005


Hi,

--- "Lombard, David N" <david.n.lombard at intel.com> wrote:
[...]
> [...]
> >
> >on ia32, TASK_UNMAPPED_BASE, by default, is at 1GB rather than ~1.3.
> >easy to change, though, at least to give 2-2.5 GB on ia32.
> 
> It may *not* be easy to change, depending on the distro and glibc. 

It *is* relatively easy to change. In fact, in 2.4 kernels
TASK_UNMAPPED_BASE (TUB) is not fixed but set to 1/3 of TASK_SIZE, the
size of the user virtual address space, which is 3G by default on ia32
(the full 4G address space minus 1G reserved for the kernel). This is
set in <linux-2.4.xx>/include/asm-i386/processor.h. You can change the
divisor relatively safely from 3 to 12, which lowers the mmap base and
lets you allocate closer to ~3G using mmap. 
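Concretely, the 2.4 change amounts to editing one define. A sketch of
the relevant line in include/asm-i386/processor.h (the exact macro
spelling may differ between 2.4 releases, so treat this as
illustrative):

```c
/* include/asm-i386/processor.h (2.4.xx) -- illustrative sketch.
 * With the default 3G TASK_SIZE, TASK_SIZE/3 puts the mmap base at 1G;
 * TASK_SIZE/12 lowers it to 256M, leaving close to ~3G for mmap. */
#define TASK_UNMAPPED_BASE	(PAGE_ALIGN(TASK_SIZE / 12))	/* default: / 3 */
```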

There also seems to be some confusion - at least on my part, and please 
correct me if I am wrong here - about TUB and mmap vs. brk/sbrk. 

Afaik, glibc's malloc decides by request size (the M_MMAP_THRESHOLD
parameter, 128K by default) whether to allocate using mmap or brk/sbrk.
Anything smaller than the threshold comes from the brk/sbrk heap;
larger requests are mmap'd into the region starting at TUB - hence the
name "unmapped base". If you allocate all of your memory using
brk/sbrk, the heap can grow towards the top of the 3G user address
space (PAE extends *physical* memory to 64G on a 32-bit machine, but
each process still sees at most a 4G virtual address space). Most
"legacy" Fortran codes (including the one we sell) allocate memory in
one large chunk, so your compiler's runtime or glibc will use mmap for
it, and hence you are stuck with roughly 2G - the span from TUB up to
TASK_SIZE. 

A while ago Greg Lindahl posted a little code snippet that prevents
glibc's malloc from using mmap altogether. Grep the archives for it. 

Finally, the whole problem is history with the arrival of 2.6.9, where
the kernel's address-space layout was changed so that the heap and the
mmap area now grow towards each other from opposite ends, and one can
avoid these tricks altogether, at least on 64-bit machines :-) 

Good luck allocating,
Roland



> But,
> if you do whatever work is needed, you can push the memory allocation
> up
> to about 2.1-2.3 GiB.
> 
[...]

> 



		


