Distribution file: aenz_v1_1.tar.gz (1466 Kbytes)
Manuscript Title: MCMC2 (version 1.1): A Monte Carlo code for multiply charged clusters
Authors: David A. Bonhommeau, Marius Lewerenz, Marie-Pierre Gaigeot
Program title: MCMC2
Catalogue identifier: AENZ_v1_1
Distribution format: tar.gz
Journal reference: Comput. Phys. Commun. 185 (2014) 1188
Programming language: Fortran 90 with MPI extensions for parallelization.
Computer: x86 and IBM platforms.
Operating system:
  1. CentOS 5.6, Intel Xeon X5670 2.93 GHz, gfortran/ifort (version 13.1.0) + MPICH2.
  2. CentOS 5.3, Intel Xeon E5520 2.27 GHz, gfortran/g95/pgf90 + MPICH2.
  3. Red Hat Enterprise 5.3, Intel Xeon X5650 2.67 GHz, gfortran + IntelMPI.
  4. IBM Power 6 4.7 GHz, xlf + PESS (IBM parallel library).
Has the code been vectorized or parallelized?: Yes, parallelized using MPI extensions. Number of CPUs used: up to 999.
RAM: 10-20 MB per CPU core. The physical memory needed for the simulation depends on the cluster size; the values indicated are typical for small clusters (N ≤ 300-400).
Supplementary material: PDF documents containing the "Summary of revisions" information accompany this version.
Keywords: Monte Carlo simulations, Coarse-grained models, Charged clusters, Charged droplets, Electrospray ionisation, Parallel Tempering, Parallel Charging.
PACS: 05.10.Ln, 36.40.Ei, 36.40.Qv, 36.40.Wa.
Classification: 23.

Does the new version supersede the previous version?: Yes

Nature of problem:
We provide a general parallel code to investigate structural and thermodynamic properties of multiply charged clusters.

Solution method:
Parallel Monte Carlo methods are implemented for the exploration of the configuration space of multiply charged clusters. Two parallel Monte Carlo methods were found appropriate for this purpose: Parallel Tempering, where replicas of the same cluster at different temperatures are distributed among different CPUs, and Parallel Charging, where replicas at the same temperature but with different particle charges or numbers of charged particles are distributed among different CPUs.
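
To illustrate the Parallel Tempering step, the Fortran 90 sketch below shows the standard Metropolis criterion for exchanging the configurations of two replicas held at different temperatures. It is not extracted from MCMC2; all variable names, the reduced units, and the numerical values are assumptions made for illustration only.

  ! Minimal sketch of a Parallel Tempering swap test (not the MCMC2 source).
  ! Replicas i and j, at inverse temperatures beta_i and beta_j, exchange
  ! configurations with probability min(1, exp((beta_i - beta_j)*(e_i - e_j))).
  program pt_swap_sketch
    implicit none
    integer, parameter :: dp = kind(1.0d0)
    real(dp) :: beta_i, beta_j   ! inverse temperatures of the two replicas
    real(dp) :: e_i, e_j         ! current potential energies of the two replicas
    real(dp) :: p_acc, r

    beta_i = 1.0_dp / 10.0_dp    ! assumed values in reduced units
    beta_j = 1.0_dp / 12.0_dp
    e_i    = -105.0_dp
    e_j    = -98.0_dp

    ! Metropolis acceptance probability for swapping the two configurations
    p_acc = min(1.0_dp, exp((beta_i - beta_j) * (e_i - e_j)))

    call random_number(r)
    if (r < p_acc) then
       print *, 'swap accepted, p_acc =', p_acc
    else
       print *, 'swap rejected, p_acc =', p_acc
    end if
  end program pt_swap_sketch

In a Parallel Charging run the same kind of exchange test is applied between replicas that differ in particle charges or in the number of charged particles rather than in temperature.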

Reasons for new version:
The new version corrects some bugs identified in the previous version. It also provides the user with new functionalities, such as separate histograms for positively and negatively charged particles, a new scheme for performing parallel Monte Carlo simulations, and a new random-number generator.

Summary of revisions:
See Supplementary material.

Restrictions:
The current version of the code uses Lennard-Jones interactions as the main cohesive interaction between spherical particles, together with electrostatic interactions (charge-charge, charge-induced dipole, induced dipole-induced dipole, polarisation). Monte Carlo simulations can only be performed in the NVT ensemble in the present code.
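
As a purely illustrative sketch of the leading pairwise terms listed above (the 12-6 Lennard-Jones term plus the bare charge-charge term), the Fortran 90 fragment below is not taken from MCMC2; the parameter names, reduced units, and numerical values are assumptions.

  ! Illustrative pair energy: 12-6 Lennard-Jones plus charge-charge term
  ! (reduced units). A sketch of the interaction type, not MCMC2 code.
  program pair_energy_sketch
    implicit none
    integer, parameter :: dp = kind(1.0d0)
    real(dp) :: u
    ! Assumed parameters: r = 1.2, eps = 1, sigma = 1, charges +1 and -1
    u = pair_energy(1.2_dp, 1.0_dp, 1.0_dp, 1.0_dp, -1.0_dp)
    print *, 'pair energy =', u
  contains
    function pair_energy(r, eps, sigma, qi, qj) result(u)
      real(dp), intent(in) :: r          ! interparticle distance
      real(dp), intent(in) :: eps, sigma ! Lennard-Jones well depth and diameter
      real(dp), intent(in) :: qi, qj     ! particle charges
      real(dp) :: u, sr6
      sr6 = (sigma / r)**6
      ! Lennard-Jones cohesion plus bare Coulomb repulsion/attraction
      u = 4.0_dp * eps * (sr6*sr6 - sr6) + qi * qj / r
    end function pair_energy
  end program pair_energy_sketch

The induction terms (charge-induced dipole, induced dipole-induced dipole, polarisation) handled by the code are many-body contributions and are not reproduced in this sketch.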

Unusual features:
The Parallel Charging methods, based on the same philosophy as Parallel Tempering but with particle charges and numbers of charged particles as parameters instead of temperature, are an interesting new approach to exploring energy landscapes. Simulations may be split into several runs, and averages are updated accordingly (see the sketch below).
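
The Fortran 90 sketch below illustrates how averages from a split simulation can be recombined by weighting each segment by its number of accumulated Monte Carlo samples. It mirrors the idea only and is not the actual MCMC2 restart bookkeeping; all numbers are assumed.

  ! Sketch: recombining an observable average from two simulation segments,
  ! weighted by the number of Monte Carlo samples in each segment.
  ! Illustrative only; not the MCMC2 restart implementation.
  program combine_averages_sketch
    implicit none
    integer, parameter :: dp = kind(1.0d0)
    integer  :: n1, n2
    real(dp) :: avg1, avg2, avg_total

    n1 = 500000;  avg1 = -101.3_dp   ! first segment: sample count and mean (assumed)
    n2 = 250000;  avg2 = -101.1_dp   ! second segment after restart (assumed)

    avg_total = (real(n1,dp)*avg1 + real(n2,dp)*avg2) / real(n1+n2, dp)
    print *, 'combined average =', avg_total
  end program combine_averages_sketch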

Running time:
The running time depends on the number of Monte Carlo steps, the cluster size, and the type of interactions selected (e.g., polarisation turned on or off, and the method used for calculating the induced dipoles). Typically, a complete simulation can last from a few tens of minutes to a few hours for small clusters (N ≤ 100, polarisation interactions not included), about one week for large clusters (N ≥ 1000, polarisation interactions not included), and several weeks for large clusters (N ≥ 1000) when polarisation interactions are included. A restart procedure has been implemented that enables splitting of the accumulation phase of a simulation.

References:
[1] D. A. Bonhommeau, M.-P. Gaigeot, Comput. Phys. Commun. 184 (2013) 873.
[2] D. A. Bonhommeau, R. Spezia, M.-P. Gaigeot, J. Chem. Phys. 136 (2012) 184503.
[3] M. A. Miller, D. A. Bonhommeau, C. J. Heard, Y. Shin, R. Spezia, M.-P. Gaigeot, J. Phys.: Condens. Matter 24 (2012) 284130.
[4] S. Kirkpatrick, E. P. Stoll, J. Comput. Phys. 40 (1981) 517.
[5] G. Bhanot, D. Duke, R. Salvador, Phys. Rev. B 33 (1986) 7841.
[6] M. Lewerenz, J. Chem. Phys. 106 (1997) 4596.
[7] M. Mladenović, M. Lewerenz, Chem. Phys. Lett. 321 (2000) 135.
[8] D. Bonhommeau, P. T. Lake, Jr., C. L. Quiniou, M. Lewerenz, N. Halberstadt, J. Chem. Phys. 126 (2007) 051104.