Nishadh KA

WRF CHEM compile QN AWS

2014-12-29


#####Compile wrf chem in AWS: QUICK NOTE#####

  1. It is based on the complete NCAR WRF-Chem compile process and an earlier note on compiling WRF-Chem.
  2. Create an Ubuntu Trusty based image. Make a small 8 GB elastic block storage (EBS) volume; it stays permanent even while the instance is stopped. Depending on the instance there is also storage space in /mnt for up to 300 GB; c3.large, for example, has it. This storage gets erased when the instance is stopped, but not the /home folder where the small 8 GB EBS volume is mounted.
  3. Install the basic required libraries:

            sudo apt-get install g++
            sudo apt-get install gfortran
            sudo apt-get install gcc

  4. Run the basic environment tests; make sure all the library checks pass: gcc, gfortran, gcc+gfortran, perl, csh.
  5. Make sure the gcc and gfortran versions match; in the current case the version is 4.8.2 for both gcc and gfortran.
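A minimal sketch for the version check in the step above; it assumes both compilers are on PATH and simply compares the strings that `-dumpversion` reports:

```shell
#!/bin/sh
# Compare the versions reported by gcc and gfortran; a mismatched
# pair (e.g. gcc 4.8 with gfortran 4.6) can break the WRF build.
gcc_ver=$(gcc -dumpversion)
gfortran_ver=$(gfortran -dumpversion)
if [ "$gcc_ver" = "$gfortran_ver" ]; then
    echo "OK: gcc and gfortran both report version $gcc_ver"
else
    echo "WARNING: gcc reports $gcc_ver but gfortran reports $gfortran_ver" >&2
fi
```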
  6. To get the required libraries and other program source code, run the following commands in the console:

            wget http://www2.mmm.ucar.edu/wrf/src/WRFV3.4.1.TAR.gz
            wget http://www2.mmm.ucar.edu/wrf/src/WPSV3.4.1.TAR.gz
            wget http://www2.mmm.ucar.edu/wrf/src/WRFV3-Chem-3.4.1.TAR.gz
            wget http://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/mpich-3.0.4.tar.gz
            wget ftp://ftp.unidata.ucar.edu/pub/netcdf/netcdf-4.1.1.tar.gz
  7. Compile the basic libraries MPICH and NETCDF. Setting up the environment and building NETCDF v4.1.1:

            tar xzvf netcdf-4.1.1.tar.gz
            cd netcdf-4.1.1
            export DIR=/home/ubuntu/wrf2/libsw/
            export CC=gcc
            export CXX=g++
            export FC=gfortran
            export FCFLAGS=-m64
            export F77=gfortran
            export FFLAGS=-m64
            ./configure --prefix=$DIR/netcdf --disable-dap --disable-netcdf-4 --disable-shared
            make            # this command ends in an error related to documentation; there is a patch for it
            make install
            export PATH=$DIR/netcdf/bin:$PATH
            export NETCDF=$DIR/netcdf

#setting up PATH for mpich and compiling

            tar xzvf mpich-3.0.4.tar.gz
            cd mpich-3.0.4
            ./configure --prefix=$DIR/mpich
            make
            make install
            export PATH=$DIR/mpich/bin:$PATH

  1. The remaining required libraries were installed with the normal sudo apt-get commands; the libs required were:

    sudo apt-get install m4
    sudo apt-get install libjasper-dev
    sudo apt-get install libpng-dev
    sudo apt-get install libnetcdf-dev
    sudo apt-get install zlib1g-dev
    sudo apt-get install build-essential
    sudo apt-get install libcloog-ppl-dev
    sudo apt-get install tcsh
    sudo apt-get install libcloog-ppl0
  1. A basic library check, which tests netcdf and mpich, was then run; this test has to pass.

######compiling WRF######

  2. Edited the file /etc/bash.bashrc (sudo nano /etc/bash.bashrc) and added the following lines in it:

            export NETCDF=/home/ubuntu/wrf2/libsw/netcdf
            export JASPERLIB=/usr/lib
            export JASPERINC=/usr/include
            export WRF_EM_CORE=1
            export WRF_NMM_CORE=0
            export WRF_CHEM=1
            export WRF_KPP=0
            export WRFIO_NCD_LARGE_FILE_SUPPORT=1
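After reopening the shell (or sourcing /etc/bash.bashrc), the variables above can be sanity-checked with a short loop; this is just a convenience sketch, not part of the build:

```shell
#!/bin/sh
# Report which of the WRF-Chem build variables are set in the
# current shell; an empty value means bash.bashrc was not sourced.
for var in NETCDF JASPERLIB JASPERINC WRF_EM_CORE WRF_NMM_CORE WRF_CHEM WRF_KPP WRFIO_NCD_LARGE_FILE_SUPPORT; do
    eval val=\"\$$var\"
    if [ -n "$val" ]; then
        echo "$var=$val"
    else
        echo "MISSING: $var" >&2
    fi
done
```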
    
  3. Inside the WRF-Chem folder, into which the chem folder has already been copied, run the following commands:

            cd /home/hoopoe/wrfchem341/WRFV3_mpichA4/WRFV3/
    ./configure
    
  4. The configure.wrf file has to be edited since the gfortran and gcc used are version 4.8.2. The lines starting with FORMAT_FIXED and FORMAT_FREE have to read as below:

                FORMAT_FIXED    =       -ffixed-form -cpp
                FORMAT_FREE     =       -ffree-form -ffree-line-length-none -cpp
    
  5. Then execute the commands ./compile em_real and ./compile emi_conv. This completes the WRF-Chem compilation.
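To confirm the compile actually produced the executables, one can list every `.exe` under the build tree; a quick sketch, run from the WRFV3 directory (a successful em_real build leaves at least wrf.exe and real.exe under main/, and the exact set of files depends on the build options):

```shell
#!/bin/sh
# List every executable the compile produced, so a silent build
# failure is caught before moving on to WPS.
find . -name '*.exe' -exec ls -l {} \;
```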

  6. For compiling WPS, unzip the WPS folder and run ./configure, choosing the serial WPS build; it ends in the creation of the file configure.wps.

  7. Edit the file configure.wps and change the lines starting with COMPRESSION_LIBS and COMPRESSION_INC exactly as below:

                #### Architecture specific settings ####

                #
                #   Settings for Linux x86_64, gfortran    (serial)
                #
                COMPRESSION_LIBS    = -L/usr/lib -ljasper -lpng -lz
                COMPRESSION_INC     = -I/usr/include
    
  8. Due to the gfortran version used (4.8.2), change the CPP line in configure.wps to: CPP = /usr/bin/cpp -P -traditional

  9. Check that the WRFV3 location mentioned in configure.wps points to the compiled WRFV3 directory.

  10. Then execute the command ./compile; it will generate the executables required for WPS.

######compiling PREPCHEMSRC######

  1. Faced lots of errors in compiling and running this program. The program requires netcdf version 4.1.1, HDF5 1.8.14, and zlib 1.2.7; other versions will generate errors. Moreover, the HDF5 library has to be compiled into the folder /usr/local. After compiling, run the command ldconfig to make the links correct for the compiled exe. This can be checked with the command ldd prep_chem_sources_RADM_WRF_FIM.exe; if any dependent library is broken, it will show as "not found".
  2. The source files for the program and packages can be downloaded by the following commands:

                wget http://zlib.net/fossils/zlib-1.2.7.tar.gz
                tar xzvf zlib-1.2.7.tar.gz
                wget https://www.hdfgroup.org/ftp/HDF5/releases/hdf5-1.8.14/src/hdf5-1.8.14.tar.gz
                tar xzvf hdf5-1.8.14.tar.gz
                wget ftp://aftp.fsl.noaa.gov/divisions/taq/global_emissions/prep_chem_sources_v1.4_08aug2013.tar.gz
  1. Compiling zlib 1.2.7: inside the source directory, give the following commands:

            FC=gfortran CC=gcc ./configure --prefix=/home/ubuntu/wrf2/libsw/zlib
            make
            make install

  2. Compiling hdf5-1.8.14: inside the source directory, give the following commands:

            ./configure --prefix=/usr/local --enable-shared --enable-hl --with-zlib=/home/ubuntu/wrf2/libsw/zlib --enable-fortran
    make
    sudo apt-get install checkinstall
    sudo checkinstall
    
  3. Then edit the prep_chem_sources configure file, include.mk.gfortran.wrf:

            tar xzvf prep_chem_sources_v1.4_08aug2013.tar.gz
            cd /home/ubuntu/wrf2/wrfchem/PREP-CHEM-SRC-1.4/bin/build/
            nano include.mk.gfortran.wrf

     and set the following variables in it:

            NETCDF=/home/ubuntu/wrf2/libsw/netcdf
            HDF5=/usr/local
            HDF5_LIB=-L$(HDF5)/lib -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -L/home/ubuntu/wrf2/libsw/zlib/lib/ -lz

  1. Made sure that the line containing the big-endian memory option reads: F_OPTS= -Xpreprocessor -D$(CHEM) -O2 -fconvert=big-endian -frecord-marker=4
  2. Then saved the configure file and gave the commands:

      make OPT=gfortran.wrf CHEM=RADM_WRF_FIM
      ldconfig
    
  3. This will generate the exe; check it using the command ldd prep_chem_sources_RADM_WRF_FIM.exe

#########Running WRFCHEM in AWS##########

  1. The geog data necessary for geogrid.exe is a huge download of nearly 16 GB. The wrfems script ems_install.pl was used to fetch the geog folder so that it matches the currently used one. The c3.large AWS instance provides 300 GB of SSD, which is useful for holding files of this size. The SSD is mounted at /mnt; after giving permissions on the folder with sudo chmod -R ugo+rw /mnt (which lets any user on the system copy or edit files under /mnt), the script was transferred there using FileZilla. It was then made executable with cd /mnt; chmod a+x ems_install.pl and run with perl ems_install.pl --install. During execution the script asks for a download location; /mnt was specified and the files are saved there. The geog folder location is /mnt/wrfems/data/geog.
  2. The GLOBAL emission files necessary for prep_chem_sources.exe are downloaded by this command:

      wget ftp://aftp.fsl.noaa.gov/divisions/taq/global_emissions/global_emissions_v3_02aug2012.tar.gz
      tar xzvf global_emissions_v3_02aug2012.tar.gz
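The /mnt preparation described in step 1 above can be consolidated as a sketch; it assumes the instance SSD is already mounted at /mnt and that ems_install.pl has been uploaded there (e.g. via FileZilla):

```shell
#!/bin/sh
# Open up the ephemeral SSD, make the installer executable, and run it.
sudo chmod -R ugo+rw /mnt        # let any user copy or edit files on the SSD
cd /mnt
chmod a+x ems_install.pl         # make the installer executable
perl ems_install.pl --install    # when asked, give /mnt as the download location
ls /mnt/wrfems/data/geog         # the geog data should end up here
```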
  1. The global emissions folder does not have the files located as prep_chem_sources requires; the following copies have to be made to avoid file-not-found errors:

cp /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/emissions/OCanthronoship2006.nc /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/gocart_bg/

cp /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/emissions/BCanthronoship2006.nc /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/gocart_bg/

cp /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/emissions/SO2anthronoship2006.nc /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/gocart_bg/

cp /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/dmsdata/dms1x1.25.nc /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/gocart_bg/

cp -r /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/erod /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/gocart_bg/

cp -r /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/dmsdata /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/gocart_bg/

cd /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/gocart_bg/
mkdir gocart_bg

cp -r * /home/ubuntu/wrfout/Globalemissionsv3/Emissiondata/GOCART/gocart_bg/

cp gmi* gocart_bg/

  1. The folder is about 5 GB in size and it also must be kept in the /mnt folder.
  2. Based on the above folder locations, the variables have to be edited in namelist.wps and prep_chem_sources.inp.
  3. A fact to keep in mind: the 8 GB magnetic AWS volume used here has to be in the same availability zone as an instance to be detached from one and attached to another.

  4. A spot instance is a better way to save cost. Stopping the instance without terminating the root volume retains the instance; likewise, taking an AMI of the instance just copies it with the whole root system, so it can be used next time.

  5. SSH into AWS is as follows:

ssh -i /home/swl-sacon-dst/Desktop/wrfa.pem ubuntu@ec2-54-198-57-249.compute-1.amazonaws.com

  1. Mounting a volume in AWS is based on the simple mount command, based on this:

lsblk                      #list the attached volumes

du -hs *                   #list each directory with its size

sudo file -s /dev/xvda1    #check the file system on the drive

sudo mkdir mount_point

  1. StarCluster is another helpful tool to look into

home/wrf2/wrfchem/WRFV3/test/em_real/rsl.error.0000

ssh -i /home/swl-sacon-dst/Desktop/wrfa.pem ubuntu@ec2-54-88-7-150.compute-1.amazonaws.com

# added by Anaconda 2.0.1 installer

export PATH="/home/wrf/anaconda/bin:$PATH"

#editing the fstab of the wrfchem AMI
#to know the file system on the device
sudo file -s /dev/xvdf

sudo nano /etc/fstab

/dev/xvdf /home/ubuntu/wrf2 ext4 defaults,nofail,nobootwait 0 2

/dev/xvdg /home/ubuntu/wrf ext4 defaults,nofail,nobootwait 0 2

/dev/xvdh /home/ubuntu/wrfout ext4 defaults,nofail,nobootwait 0 2

#after editing

sudo mount -a

#processing the EDGAR file in AWS; refer to the ncks.py file for more info

#a new run in r3.4xlarge

  1. took 2.03 hours for a six-hour simulation

#compiling the anthro_emiss in AWS

  1. setting up gfortran:

            export FC=gfortran
            export NETCDF=/home/ubuntu/wrf2/libsw/netcdf

  2. then give the command make_anthro

#a new run in c3.8xlarge ec2

  1. attached a 50 GB magnetic hard disk to the instance:

            sudo mkfs -t ext4 /dev/xvdi    # never use this on a disk already holding data; it erases the file system
            mkdir wrfout1
            sudo mount /dev/xvdi /home/ubuntu/wrfout1
            sudo chown -R ubuntu:ubuntu wrfout1
            sudo chmod a+x wrfout1

export DIR=/home/ubuntu/wrf2/libsw/
export PATH=$DIR/netcdf/bin:$PATH
export NETCDF=$DIR/netcdf
export PATH=$DIR/mpich/bin:$PATH

anthro_emis < anthroemisA4.inp

  1. Now using screen for an 18-hour simulation in c3.8xlarge

  * To start a screen with a name: screen -S "name without mark"
  * To reattach to a running screen: screen -r
  * To get into a particular screen: screen -xr pid (of the screen)
  * To get out of a screen: CTRL+A then D
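The long run itself can be wrapped in the commands above like this; a sketch, where the session name and the mpirun invocation are placeholders for whatever the actual run uses:

```shell
# Start a named session for the 18-hour run, launch the model inside
# it, then detach and reattach at will.
screen -S wrf18h                 # create the named session
# inside the session: mpirun -np 32 ./wrf.exe >& run.log
# detach with CTRL+A then D; the run keeps going
screen -ls                       # list sessions to find it again
screen -r wrf18h                 # reattach by name
```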

  1. For including dust emission in the simulation, it is required to add the erod directory in the geog/data folder and replace the old GEOGRID.TBL as follows:

            cp GEOGRID.TBL GEOGRID.TBL_OLD
            cp GEOGRID.TBL.ARW_CHEM GEOGRID.TBL

  1. ./util/g2print.exe /home/ubuntu/wrf2/wrfchem/gfs/grib1/gfsanl_3_20150113_1200_000.grb

####errors and issues

  1. ----------------- ERROR -------------------
     namelist    : num_metgrid_soil_levels = 4
     input files : NUM_METGRID_SOIL_LEVELS = 0 (from met_em files)
     -------------- FATAL CALLED ---------------

     sol: http://www.researchgate.net/profile/Hector_Sepulveda2/topics

  2. FATAL CALLED FROM FILE: LINE: 597 Mismatch between namelist and global attribute NUM_METGRID_SOIL_LEVELS

  1. WARNING: Couldn’t open file NARI:2006-12-19_12 for input. ERROR: The mandatory field TT was not found in any input data.

  2. Problem with GFS 0.5

sol: http://forum.wrfforum.com/viewtopic.php?f=12&t=8494
sol: http://forum.wrfforum.com/viewtopic.php?f=6&t=8539&start=10
sol: http://forum.wrfforum.com/viewtopic.php?f=22&t=50

  1. finding the bugs in a wrf run: http://www2.mmm.ucar.edu/wrf/OnLineTutorial/Class/cases/find_the_bugs.php#