GMS, SMS, and WMS User Forum

Parallel version of ADCIRC


Recommended Posts


My university has a parallel computing cluster, and I would like to know whether there is a parallel version of the ADCIRC model that SMS could run on that cluster, using e.g. 20 processors. If so, can you give me some information about how it works?




Only the serial version, pre-compiled for Windows, comes with SMS. I had the same problem, so I can tell you that you'll have to get the code from the development group and compile it on your own. The code doesn't cost anything for university research. If you know Jason Fleming at UNC, he is the one who gave me v47.27, so he could set you up. Otherwise, I'd suggest contacting Drs. Luettich or Westerink via the addresses on the ADCIRC homepage (http://www.adcirc.org/development_group.html).




If you ask on the ADCIRC development web page, they should point you to the ftp site where they store the ADCIRC source code, which will probably be the latest version. Assuming it still follows the same general principle as the older version I am running, you basically need to compile the pre- and post-processing programs and the padcirc (parallel ADCIRC) executable on your cluster. You will need to edit the makefile to point to the relevant directories. I didn't find this straightforward, so if you can get someone else to do it for you, that would make your life easier :) Of course, you may enjoy sorting this sort of thing out, but I didn't :)
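In case it helps, here is a minimal sketch of the build step, assuming the usual ADCIRC source layout (a work/ directory containing the makefile and a cmplrflags.mk file with per-platform compiler flags). The directory, file, and target names are assumptions from the versions I have seen and may well differ in yours, so check against your source tree:

```shell
# Minimal build sketch -- assumes the common ADCIRC source layout with a
# work/ directory holding the makefile and cmplrflags.mk. Directory and
# target names here are assumptions; check them against your source tree.
TARGETS="adcprep padcirc"        # serial pre-processor + parallel solver
if [ -d adcirc/work ]; then
    cd adcirc/work
    # Edit cmplrflags.mk first so it points at your cluster's Fortran
    # and MPI compilers, then build both executables:
    make $TARGETS
fi
```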

Hope it works out well



  • 3 years later...

I am resurrecting this 2009 thread to ask about parallel ADCIRC, which is now distributed with SMS. (I am using 11.1 64-bit.) I have run ADCIRC successfully with one processor, and I have multiple cores available to me. When I set the number of processors to 6 as a test and ran the model, I got an error from AdcPrep Init:


ADCPREP Fortran90 Version 2.3  10/18/2006
Serial version of ADCIRC Pre-processor

...<description of parameters here>

calling: prepinput
use_default = F
partition   = T
prep_all    = F
prep_15     = F
prep_13     = F
hot_local   = F
hot_global  = F
Enter the name of the ADCIRC UNIT 14 (Grid) file:
Enter the name of the ADCIRC UNIT 14 (Grid) file:
forrtl: severe (24): end-of-file during read, unit -4, file CONIN$
Image         PC        Routine   Line     Source
adcprep.exe   0053D310  Unknown   Unknown  Unknown
adcprep.exe   004FF7BF  Unknown   Unknown  Unknown
adcprep.exe   004EADA7  Unknown   Unknown  Unknown
adcprep.exe   004EA0EA  Unknown   Unknown  Unknown
adcprep.exe   004CFD06  Unknown   Unknown  Unknown
adcprep.exe   00443D48  Unknown   Unknown  Unknown
adcprep.exe   0044E61E  Unknown   Unknown  Unknown
adcprep.exe   0044E0C7  Unknown   Unknown  Unknown
adcprep.exe   0054CB73  Unknown   Unknown  Unknown
adcprep.exe   00527489  Unknown   Unknown  Unknown
kernel32.dll  762C33AA  Unknown   Unknown  Unknown
ntdll.dll     77099F72  Unknown   Unknown  Unknown
ntdll.dll     77099F45  Unknown   Unknown  Unknown


Do you know what these errors mean and what I need to do to run ADCIRC in parallel? I checked my SMS installation directory, and it includes, in the ADCIRC folder, 32- and 64-bit copies of ADCIRC.exe, ADCPREP.exe, and PADCIRC.exe, so I know they're there. Thanks.




I have succeeded in running ADCIRC in parallel mode from the command line, without SMS, although I still get errors when I try to do it in SMS by setting the number of processors greater than 1 in the ADCIRC model control window. I'm not sure whether that's something other people are able to do.
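For anyone finding this later, the command-line workflow I used looks roughly like the sketch below. The two-pass adcprep step (partition the mesh, then localize the input files) is typical for ADCIRC of this vintage, but the exact interactive answers fed to adcprep and the MPI launcher (mpiexec here) are assumptions you should check against your own version:

```shell
# Sketch of a parallel ADCIRC run from the command line (not via SMS).
# The interactive answers piped to adcprep (processor count, menu choice,
# grid file name) are assumptions based on typical adcprep prompts.
NP=6                              # number of subdomains / MPI ranks
if command -v adcprep >/dev/null 2>&1; then
    # Pass 1: partition the fort.14 mesh into $NP subdomains.
    adcprep <<EOF
$NP
1
fort.14
EOF
    # Pass 2: localize all input files for each subdomain.
    adcprep <<EOF
$NP
2
EOF
    # Then launch the parallel solver with your MPI launcher, e.g.:
    # mpiexec -n $NP padcirc
fi
```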

Edited by RCLwood
