- From: <beran_k@utpb.edu>
- To: forum@abinit.org
- Subject: [abinit-forum] Restarting Calcs
- Date: Wed, 25 Feb 2009 16:56:19 +0100 (CET)
I am running the latest version of Abinit (compiled myself) on TACC's Ranger
platform. There are several issues that I would like to bring up in order to
improve the efficiency and completion rate of my jobs.
1. Ranger imposes a 24-hour limit on job runs, and several of my medium-to-large
jobs do not finish in the allotted time. I have looked at Abinit's FAQ page and
found the "Restarting Calculation" notes, but they do not cover what I am
looking for. Is there a way to save the state of the last iteration completed
within the 24-hour window and restart the calculation from that point? I should
specify that these jobs are single-point calculations.
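For concreteness, from what I gather in the documentation, something like the following might be the intended restart mechanism for a ground-state run (a sketch only; the `run_o`/`run_i` prefixes are hypothetical and would have to match the names in my .files file):

```
# Sketch of a wavefunction restart for a single-point run.
# Before resubmitting, rename the last output wavefunction file so
# ABINIT finds it as input, e.g.:
#   cp run_o_WFK run_i_WFK     (prefixes must match the .files file)
irdwfk 1    # read starting wavefunctions from the _WFK input file
```

Is this the recommended approach, or is there a better way to checkpoint mid-SCF?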
2. I profiled how well Abinit was performing on Ranger, and the results are
discouraging: A) in a 20-hour job that completed, 76% of the wall time was spent
in MPI communication (i.e., only 24% in actual computation); B) Abinit was
running at 0.15 GFlop/s per MPI task. Obviously, terrible efficiency! Any
suggestions on how I can improve it? No one at TACC is really familiar with
Abinit, since TACC does not provide this software to its users. Any tricks,
compiling or otherwise, would be greatly appreciated.
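In case it is relevant, the only tuning knobs I have found so far are the k-point and band/FFT distribution variables. Here is the kind of thing I have been experimenting with (the values are illustrative only, assuming a recent build with paral_kgb support):

```
# Example band/FFT parallel settings (values are illustrative;
# the product npkpt*npband*npfft must equal the number of MPI tasks).
paral_kgb 1    # enable combined k-point/band/FFT parallelization
npkpt  4       # processes distributed over k-points
npband 8       # processes distributed over bands
npfft  2       # processes distributed over FFT planes
```

Are there known-good distributions for a system like mine, or compiler/MPI flags that matter on an InfiniBand cluster like Ranger?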
3. I am investigating the vacuum-level shift that occurs when a small organic
molecule (~17 atoms) is placed on a metal surface, Ni for example. Using a
4x4x1 k-point grid with Ecut = 28 Ha on a system consisting of the organic
molecule and a three-layer, 4x4 slab, what kind of performance (in wall time)
should I expect from Abinit? Would this system be considered "large"? Is this
a system that Abinit typically handles efficiently?
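For scale, here is my rough atom count (assuming one Ni atom per 1x1 surface cell, so 16 atoms per layer in the 4x4 supercell):

```python
# Rough atom count for the slab-plus-molecule system
# (assumes one Ni atom per 1x1 surface cell).
layers = 3
atoms_per_layer = 4 * 4               # 4x4 surface supercell
slab_atoms = layers * atoms_per_layer # 48 metal atoms
total_atoms = slab_atoms + 17         # ~17-atom organic molecule
print(slab_atoms, total_atoms)        # 48 65
```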
Again, any suggestions referring to items 1-3 would be greatly appreciated.
Regards,
Kyle A. Beran