
Re: [abinit-forum] Memory problem


  • From: Paul Tangney <tangney@civet.berkeley.edu>
  • To: forum@abinit.org
  • Subject: Re: [abinit-forum] Memory problem
  • Date: Tue, 01 Apr 2003 16:06:07 -0800

Hi Jorge,

I have been having similar problems on an SP3 machine.
I found out the following, and apologise in advance for mistakes... I'm
trying to repeat the computer jargon that I have learned without actually
understanding any of it.

By default, my Fortran compiler (xlf) allows a maximum
data size of 128 MB and a maximum stack size of 32 MB.
I think this means that the maximum amount of memory I can dynamically
allocate in one go is 128 MB, and perhaps that the maximum static allocation
is 32 MB (but I'm really not sure what "stacksize" means).
My Hamiltonian matrix needs 390 MB, so the
program crashes even though I should have 16 GB of memory at my disposal.
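
(If I've got this right, the shell's resource limits are a separate thing
from the per-binary values that xlf writes in at link time; you can at least
see the shell's own limits with the csh builtin, as Jorge does below.)

   # csh builtin: print the current data and stack segment limits
   limit datasize
   limit stacksize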

To cure this, the compiler needs flags which increase these maximum sizes.
The flags for stack size and data size are "-bmaxstack:A" and "-bmaxdata:B", where A and B
are byte counts in hexadecimal. Rather than worrying about hexadecimal, I think you can
use "-bmaxdata:0xn0000000", where n is the number of units of 256 MB
(since 0x10000000 bytes = 256 MB).
For example, "-bmaxdata:0x70000000" should allow me to allocate 7 x 256 MB = 1792 MB.
For parallel programs there are some limits on the values you can use, which are
imposed by MPI.
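
In practice these are just link-time options; a minimal sketch of the build
line (the output name and source list here are placeholders for however you
actually build abinit):

   # Raise the 32-bit data segment limit to 1792 MB and the stack to 256 MB.
   # These are AIX linker options, passed through the xlf driver.
   xlf90 -o abinit *.f90 -bmaxdata:0x70000000 -bmaxstack:0x10000000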

In 64-bit mode, much larger data allocations are allowed by default, so,
as Michel Cote suggested, this may solve the problem... but will the program
slow down?
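
If I understand the 64-bit route correctly, it would look something like this
(again a sketch: I believe the relevant xlf flag is -q64 and that the AIX
tools want OBJECT_MODE=64 in the environment, but check your compiler docs):

   # Build in 64-bit mode instead of raising the 32-bit segment limits.
   setenv OBJECT_MODE 64
   xlf90 -q64 -o abinit *.f90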


As I said, I don't understand any of this... I'm trying to interpret what I read
here: http://hpcf.nersc.gov/software/ibm/sp_memory.html
and you'll probably have to adapt it to your hardware and compiler.

Regards,

Paul




Jorge Iniguez wrote:
Dear abinitioners,

I have a problem when trying to run relatively big jobs. The program fails
to allocate the required amount of memory even though the system has
enough memory available.

For instance, in the log file I get something like:

"Test failed to allocate 1843.383 Mbytes"

but if I type "limit" the system says:

cputime         unlimited
filesize        unlimited
datasize        unlimited
stacksize       524288 kbytes
coredumpsize    unlimited
memoryuse       16625280 kbytes
vmemoryuse      unlimited
descriptors     200
threads         1024

i.e., there should not be any memory problem as far as I can see...

The problem happens for both serial and parallel jobs (the above example
corresponds to a serial job). The job is nothing fancy: I want to
calculate energy, forces and stresses for a 22-atom unit-cell system.
Other details: I am using an SGI 2000 with a MIPS R14000 processor and 16 GB
of memory (IRIX 6.5).
 
Any help or suggestions would be greatly appreciated!
 
Thanks,

Jorge Iniguez


------------------------------------------------------------------------------
   Jorge Iniguez             Bldg 235, Room E19
                             NIST Center for Neutron Research
                             100 Bureau Drive, Mail Stop 8562
                             Gaithersburg, MD 20899-8562
                             Phone: 1-301-975-8367 Fax: 1-301-921-9847
                             E-mail address: jiniguez@nist.gov
------------------------------------------------------------------------------


  



