Re: [abinit-forum] input file size...


  • From: Ravinder Abrol <rabrol@us.ibm.com>
  • To: verstraete@pcpm.ucl.ac.be
  • Cc: forum@abinit.org
  • Subject: Re: [abinit-forum] input file size...
  • Date: Sun, 9 May 2004 21:53:54 -0400


Hi Matthieu,
Changing strlen to whatever size I need for the input file worked!
Thanks,
Ravi

-----
Ravinder  Abrol,  Ph. D.
25-143, Department of Physical Sciences
IBM Thomas J. Watson Research Center
Yorktown Heights, NY 10598
Phone: 914-945-1617
Email: rabrol@us.ibm.com



From: verstraete@pcpm.ucl.ac.be
Date: 05/06/2004 02:37 AM
To: forum@abinit.org
Subject: Re: [abinit-forum] input file size...

Hello Ravi: check the robodoc!

The internal parser in abinit is fairly primitive by modern standards
(which takes nothing away from Doug and Xavier's painstaking job of
writing it in Fortran). As a result, the whole input file that is parsed
is contained in one string, with a fixed maximum length of

integer, parameter :: strlen=32000

characters, defined in defs_basis, allocated in abinit.f, and read in by
subroutine instrng. I'm not sure it's enough to increase strlen and
recompile; some other routines may make assumptions about the total
length of all input records, but it's worth a try. Make sure all the
routines which use module defs_basis (practically all of them) are
recompiled and linked (do a make clean first; the dependencies in the
makefile are not all reliable).
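
If you want to check beforehand whether a given input file will overflow
that buffer, a rough character count is enough. A small helper along these
lines (hypothetical, not part of abinit; it just compares the raw file size
against the strlen value quoted above, which is conservative since instrng
may strip comments first):

import sys

STRLEN = 32000  # value of the strlen parameter in defs_basis

def check_input_size(path, limit=STRLEN):
    # crude check: raw character count of the input file vs. the buffer size
    with open(path) as f:
        n = len(f.read())
    if n >= limit:
        print(f"{path}: {n} characters - larger than strlen={limit};")
        print("increase strlen in defs_basis and rebuild (make clean; make)")
    else:
        print(f"{path}: {n} characters - fits within strlen={limit}")

if __name__ == "__main__":
    check_input_size(sys.argv[1])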

Do you really have the computing power to do one run efficiently with 30
datasets? I usually make a meta-script which generates input files and
then runs abinit on each of them. It can be trivially parallelized, as
long as your datasets are not chained.
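
A rough sketch of such a meta-script (the template file, the variable lists
and the classic "abinit < run.files > run.log" invocation are only
illustrative; adapt them to your own setup and prepare one .files file per
run):

import subprocess

# common part of the input file, without ndtset; assumed to exist
template = open("template.in").read()

# one set of abinit variables per run, instead of 30 datasets in one file
cases = [{"ecut": 30}, {"ecut": 35}, {"ecut": 40}]

for i, case in enumerate(cases, start=1):
    with open(f"run{i}.in", "w") as f:
        f.write(template + "\n")
        for name, value in case.items():
            f.write(f"{name}  {value}\n")
    # run abinit as you normally would for a single input; here the classic
    # "abinit < run.files > run.log" form, with a .files file you prepared
    subprocess.run(f"abinit < run{i}.files > run{i}.log",
                   shell=True, check=True)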

Matthieu



On Thu, 6 May 2004 rabrol@us.ibm.com wrote:

> Hi AbInit Developers and Users,
> I am trying to run abinit with a large input file
> with 30 datasets (ndtset) or more.
> I get the following error message in my log file:
>    -------------------
>    0: instrng : ERROR --
>    0:  The size of your input file is such that the internal
>    0:  character string that should contain it is too small.
>    0:  Action : decrease the size of your input file,
>    0:  or contact the ABINIT group.
>    -------------------
>
> The documentation says that jdtset (the dataset id) can go
> up to 99, but my input is crashing even at 30.
> What is the problem and how do I fix it?
> I would prefer not to run AbInit with only a few datasets
> at a time, as I need to do a lot of them.
> Here's a glimpse of the beginning of my input file:
> Thanks,
> Ravi
>
> -----------------input file----------------------
>  mkmem    0
>  mffmem   0
>  ixc      15
>  nband    54
>  ecut     30
>  nstep    40
>  diemac   2.0
>  ntypat  2
>  znucl   6 1
>  natom   36
>  typat 1 2 1 2 1 2 1 2 1 2 1 2 1 2 1 2 1 1  1 2 1 2 1 2 1 2 1 2 1 2 1 2 1 2 1 1
>
>  ndtset   30
>
> .....30 datasets specified here.....
>
> ----------------input file--------------------------
>

--
===================================================================
Matthieu Verstraete               mailto:verstraete@pcpm.ucl.ac.be
PCPM, Boltzmann, pl. Croix du Sud, 1            tel: 010/ 47 33 59
B-1348 Louvain-la-Neuve  Belgium                fax: 010/ 47 34 52



