[Beowulf] [External] Re: Clustering vs Hadoop/spark [EXT]

Tim Cutts tjrc at sanger.ac.uk
Wed Nov 25 11:40:12 UTC 2020


Except of course, you do really.  Java applications can end up with huge memory leaks, because programmers really do need to understand the mechanism by which objects are moved from the Eden and Survivor spaces into Tenured space.

Tenured space effectively never shrinks, so every object that ends up there holds on to its memory for the life of the process.  If it was actually an ephemeral object, tough: you’ve lost that memory.
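
A minimal sketch of that failure mode (hypothetical code, not from any real application): buffers that are logically ephemeral, but with a live set large enough that each one survives several minor GCs and gets promoted into Tenured space before it is dropped.

    import java.util.ArrayDeque;

    public class PromotionLeak {
        public static void main(String[] args) {
            // Rolling window of ~256 KB buffers: each buffer is dropped
            // after 1000 further allocations, so it is logically
            // ephemeral -- but with a live set this size it will usually
            // have survived several minor GCs first and been tenured.
            ArrayDeque<byte[]> recent = new ArrayDeque<>();
            for (int i = 0; i < 100_000; i++) {
                recent.add(new byte[256 * 1024]);   // allocated in Eden
                if (recent.size() > 1000) {
                    // The oldest buffer is garbage from the program's
                    // point of view, but if it has already been promoted
                    // it is only reclaimed by a full collection.
                    recent.poll();
                }
            }
        }
    }

Run it with -verbose:gc (or -Xlog:gc on JDK 9 and later) and you can watch the old generation fill up even though the program’s live set is constant.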

So ideally, the programmer needs to understand the size of their working set of ephemeral objects and make sure the Eden/Survivor spaces are large enough; otherwise objects are promoted prematurely and they effectively have a memory leak.  If they make the ephemeral spaces too large, though, each garbage collection takes longer and performance decreases, so it’s a balance.
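
The knobs for that balance are the standard HotSpot options; the sizes below are purely illustrative and MyApp is a placeholder:

    # -Xmn sizes the whole young generation (Eden plus the two Survivor
    #   spaces); -XX:SurvivorRatio=8 makes Eden 8x the size of each
    #   Survivor space; -XX:MaxTenuringThreshold=15 requires an object
    #   to survive 15 minor GCs before promotion; -Xlog:gc (JDK 9+,
    #   -verbose:gc on older JVMs) shows what actually happens.
    java -Xms4g -Xmx4g -Xmn1g \
         -XX:SurvivorRatio=8 -XX:MaxTenuringThreshold=15 \
         -Xlog:gc MyApp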

I’m not sure you can ever really get rid of the need to understand memory allocation… you just push the problem around!

Tim

On 24 Nov 2020, at 18:32, Prentice Bisbal via Beowulf <beowulf at beowulf.org> wrote:

Also, with Java, you don't have to worry about low-level issues like allocating and freeing memory or doing pointer arithmetic. Not having to worry about those low-level issues lets a student focus more on the programming concepts. I know I screwed up dereferencing pointers A LOT when learning C and C++.




-- 
 The Wellcome Sanger Institute is operated by Genome Research 
 Limited, a charity registered in England with number 1021457 and a 
 company registered in England with number 2742969, whose registered 
 office is 215 Euston Road, London, NW1 2BE.