docs refactoring

cookieenick
2025-11-14 13:46:40 -07:00
parent b136334a63
commit c43746b2f1
48 changed files with 96 additions and 3166 deletions

@@ -1,5 +1,5 @@
# Configuration file for the Sphinx documentation builder.
import os
# -- Project information
project = 'kaiju'
@@ -47,3 +47,17 @@ html_theme_options = {
html_css_files = [
'css/sidebar_theme.css',
]
exclude_patterns = [
"_build",
"Thumbs.db",
".DS_Store",
]
if not os.environ.get("BUILD_ALL"):
exclude_patterns.append("misc/**")
def setup(app):
if os.environ.get('BUILD_ALL'):
app.tags.add('misc')
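A minimal sketch of how this toggle behaves (illustrative standalone Python, not part of ``conf.py``):

```python
def sphinx_excludes(environ):
    """Mirror the conf.py snippet above: misc/** is excluded unless BUILD_ALL is set."""
    exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]
    if not environ.get("BUILD_ALL"):
        exclude_patterns.append("misc/**")
    return exclude_patterns

# A public build ({}) skips the internal docs; setting BUILD_ALL includes them.
print(sphinx_excludes({}))
print(sphinx_excludes({"BUILD_ALL": "1"}))
```

Combined with the ``app.tags.add('misc')`` call, the same environment variable also controls the ``.. only:: misc`` blocks in the toctree.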

@@ -33,3 +33,13 @@ Table of contents
makeitso/index
tools/index
roadrules
.. only:: misc
Internal Docs
-------------
.. toctree::
:maxdepth: 2
misc/index

@@ -1,4 +0,0 @@
Computing Ground Magnetic Field Perturbations (calcdb.x)
========================================================
TBD

@@ -52,7 +52,7 @@ the start time of the simulation being used for the calculation.
``tFin`` is the stop time the test particle trajectories, in seconds from
the start time of the simulation being used for the calculation.
For other parameters see the :doc:`Chimp XML </userGuide/chimp/chimpXML>` page.
For other parameters see the :doc:`Chimp XML <chimpXML>` page.
Create the PBS file
^^^^^^^^^^^^^^^^^^^

@@ -1,234 +0,0 @@
Extract a 3D subdomain from GAMERA output (chop.x)
==================================================
Introduction
------------
chop.x extracts a 3D portion of the domain from GAMERA output, either on the
MAGE grid or interpolated onto a cartesian or spherical grid, and can perform
additional calculations such as field line tracing. Output is an hdf5 file
that can be visualized and analyzed similarly to GAMERA output files.
Example XML file
----------------
.. code-block:: xml
<?xml version="1.0" ?>
<Kaiju>
<Chimp>
<sim runid="chopRTP"/>
<time T0="26100.0" dt="30.0" tFin="26520.1"/>
<fields doMHD="T" ebfile="msphere" grType="LFM" isMPI="T"/>
<parallel Ri="8" Rj="8" Rk="1"/>
<domain dtype="LFM" xSun="30.0" xTail="-50.0" yzMax="30.0"/>
<units uid="EARTH"/>
<chop grType="RTP" x1Max="20" x1Min="2" x2Max="180" x2Min="0" x3Max="360" x3Min="0" Nx1="225" Nx2="120" Nx3="240"/>
<interp wgt="TSC"/>
<tracer epsds="0.20"/>
<output doTrc="T"/>
</Chimp>
</Kaiju>
Parameter Descriptions
----------------------
``<sim>`` (optional): Specify identifying information for this computation.
``runid`` (optional, default ``"Sim"``): String specifying an identifier
for this run of ``chop.x``. A best practice is to use the ``runid`` in the
name of the XML file.
``<time>`` (optional): Specify time range and interval for magnetic field
calculation.
``T0`` (optional, default ``"0.0"``): Start time (simulated seconds) for
ground magnetic field calculation, relative to start of simulation results
used as input.
``dt`` (optional, default ``"1.0"``): Time interval and output cadence
(simulated seconds) for ground magnetic field calculation.
``tFin`` (optional, default ``"60.0"``): Stop time (simulated seconds) for
ground magnetic field calculation, relative to start of simulation results
used as input.
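As a rough sketch (assuming one output at ``T0`` and then every ``dt`` seconds up to ``tFin``), the number of output times follows directly from these three attributes:

```python
def n_outputs(T0, dt, tFin):
    """Count output times T0, T0 + dt, ... that do not exceed tFin."""
    return int((tFin - T0) // dt) + 1

# Example XML above: T0=26100.0, dt=30.0, tFin=26520.1
print(n_outputs(26100.0, 30.0, 26520.1))  # 15 output times
```

Note that ``tFin`` values like ``26520.1`` are chosen slightly past the last intended output time so that it is not lost to floating-point round-off.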
``<fields>`` (required): Describes the input data from a MAGE model run.
``ebfile`` (optional, default ``"ebdata.h5"``): Path to HDF5 file
containing the electric and magnetic fields computed by a MAGE model run.
``grType`` (optional, default ``"EGG"``): String specifying grid type used
by the MAGE output file. Valid values are ``"EGG"``, ``"LFM"``, ``"SPH"``.
If the string is not one of the supported grid types, the default value
(``"EGG"``) is used, and a warning message is printed.
``doEBFix`` (optional, default ``"false"``): Set to ``"true"`` to "clean"
the electric field E so that the dot product of the electric and magnetic
fields is 0. See ``ebinterp.F90``.
``doMHD`` (optional, default ``"false"``): Set to ``"true"`` to pass the
full set of magnetohydrodynamic variables to CHIMP, rather than just electric
and magnetic fields. Includes velocity vector, density and pressure in the
output file. See ``ebtypes.F90``.
``isMPI`` (optional, default ``"false"``): Set to ``"true"`` if the MAGE
results file was generated with an MPI version of the model. See
``eblCstd.F90``.
``<domain>`` (optional): Options for the problem domain.
``dtype`` (optional, default ``"SPH"``): Domain over which to perform
CHIMP calculations. Separate from the grid, this enables the user to perform
the calculation on a subset of the grid to reduce computation where it is not
needed. See ``gridloc.F90``. Valid values are ``"SPH"``, ``"LFM"``,
``"LFMCYL"``, ``"MAGE"``, ``"EGG"``, ``"ELL"``.
``rClosed`` (optional, default set by choice of ``units/uid``): Radial value
that a field line endpoint must reach to be considered closed. See
``chmpunits.F90``.
``rmax`` (optional, default computed): Maximum radius of the Domain region.
See ``gridloc.F90``.
``rmin`` (optional, default computed): Minimum radius of the Domain region.
See ``gridloc.F90``.
``xSun`` (optional, default ``"20.0"``): If dtype is ``"LFM"`` or ``"MAGE"``,
the Domain region includes all i-shells whose distances along the Earth-Sun
line are less than this value (in Re).
``xTail`` (optional, default ``"-100.0"``): If dtype is ``"LFM"`` or
``"MAGE"``, the Domain region includes cells in the magnetotail out to this
value (in Re).
``yzMax`` (optional, default ``"40.0"``): If dtype is ``"LFM"`` or ``"MAGE"``,
the Domain region includes cells with Y and Z coordinates between +/- yzMax
(in Re).
``<output>`` (optional): Options related to driver output.
``timer`` (optional, default ``"false"``): Set to ``"true"`` to turn on
timing flags. See ``starter.F90``.
``tsOut`` (optional, default ``"10"``): Cadence for writing diagnostics to
the run-log file. See ``starter.F90``.
``doEQProj`` (optional, default ``"false"``): Set to ``"true"`` to include
equatorial variables, projected down to the magnetic equator along the field
line from the cell location (i.e. Xeq, Yeq, whether the field line is open or
closed, etc.). See ``chmpdefs.F90``.
``doSlim`` (optional, default ``"false"``): Set to ``"true"`` to remove
vector electric field and current data from slice.x output. See
``chmpdefs.F90``.
``doTrc`` (optional, default ``"false"``): Similar to doEQProj, used in
chop.x. See ``chmpdefs.F90``.
``<parallel>`` (optional): Options used if ebfile was generated using an MPI
version of the code (read if fields/isMPI is set to ``"true"``; file names are
of the form ebfile_Ri_Rj_Rk_i_j_k.gam.h5).
``Ri`` (optional, default ``"1"``): Number of ranks used in decomposition of
the ``"i"`` dimension. See ``iotable.F90``.
``Rj`` (optional, default ``"1"``): Number of ranks used in decomposition of
the ``"j"`` dimension. See ``iotable.F90``.
``Rk`` (optional, default ``"1"``): Number of ranks used in decomposition of
the ``"k"`` dimension. See ``iotable.F90``.
``doOldNaming`` (optional, default ``"false"``): Allows backward
compatibility with MHD files generated with the now-deprecated naming
convention. See ``chmpdefs.F90``.
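The per-rank naming convention can be illustrated with a short sketch (a hypothetical helper; any zero-padding or ordering details are defined by the actual code):

```python
def mpi_filenames(ebfile, Ri, Rj, Rk):
    """Enumerate per-rank file names of the form ebfile_Ri_Rj_Rk_i_j_k.gam.h5."""
    return [
        f"{ebfile}_{Ri}_{Rj}_{Rk}_{i}_{j}_{k}.gam.h5"
        for i in range(Ri)
        for j in range(Rj)
        for k in range(Rk)
    ]

# The 8x8x1 decomposition from the example XML implies 64 per-rank files.
names = mpi_filenames("msphere", 8, 8, 1)
print(len(names), names[0])
```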
``<tracer>`` (optional): Options related to field line tracing performed by
CHIMP.
``epsds`` (optional, default ``"1.0e-2"``): Tolerance for field line tracing
computations. See ``chmpdefs.F90``.
``<units>`` (optional): Name of the units system used in the model run.
``uID`` (optional, default ``"Earth"``): See ``chmpunits.F90``. Valid values
are ``"EARTH"``, ``"EARTHCODE"``, ``"JUPITER"``, ``"JUPITERCODE"``,
``"SATURN"``, ``"SATURNCODE"``, ``"HELIO"``, ``"LFM"``, ``"LFMJUPITER"``.
``<interp>`` (optional): Options related to interpolation
``wgt`` (optional, default ``"TSC"``): Sets 1D interpolation type. Valid
values are ``"TSC"`` (1D triangular shaped cloud), ``"LIN"`` (linear),
``"QUAD"`` (parabolic). See ``starter.F90``.
``<chop>`` (optional): Options specific to the chop.x driver. See ``chopio.F90``.
``grType`` (optional, default ``"XYZ"``): String specifying an identifier
for the grid used for 3D data extraction. Valid values are ``"XYZ"``
(cartesian), ``"RTP"`` (spherical), ``"LFM"`` (MAGE grid).
``Nx1`` (optional, default ``"64"``): Number of cells in X or R depending on
grid specified. Not used if grType is set to ``"LFM"``.
``Nx2`` (optional, default ``"64"``): Number of cells in Y or Theta
depending on grid specified. Not used if grType is set to ``"LFM"``.
``Nx3`` (optional, default ``"64"``): Number of cells in Z or Phi depending
on grid specified. Not used if grType is set to ``"LFM"``.
``x1Max`` (optional, default ``"10.0"``): Maximum value of the X1 dimension
used to initialize the chop grid. If ``grType="LFM"``, x1Max is used similarly
to domain/xSun.
``x1Min`` (optional, default ``"-x1Max"``): Minimum value of the X1
dimension used to initialize the chop grid. Not used if grType is set to
``"LFM"``.
``x2Max`` (optional, default ``"10.0"``): Maximum value of the X2 dimension
used to initialize the chop grid. Not used if grType is set to ``"LFM"``.
``x2Min`` (optional, default ``"-x2Max"``): Minimum value of the X2
dimension used to initialize the chop grid. Not used if grType is set to
``"LFM"``.
``x3Max`` (optional, default ``"10.0"``): Maximum value of the X3 dimension
used to initialize the chop grid. Not used if grType is set to ``"LFM"``.
``x3Min`` (optional, default ``"-x3Max"``): Minimum value of the X3
dimension used to initialize the chop grid. Not used if grType is set to
``"LFM"``.
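For the uniform ``"XYZ"`` and ``"RTP"`` grids, the relationship between the min/max bounds and the N values can be sketched as follows (an illustrative assumption; the actual gridding is done inside chop.x):

```python
def cell_edges(xmin, xmax, n):
    """Uniform cell edges for one chop dimension: n cells -> n + 1 edges."""
    dx = (xmax - xmin) / n
    return [xmin + i * dx for i in range(n + 1)]

# Radial dimension of the RTP example above: 225 cells from 2 to 20 Re.
r = cell_edges(2.0, 20.0, 225)
print(len(r))  # 226 edges
```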
Run Script example
------------------
An example PBS script on ``cheyenne``, ``RunChop.pbs``, to submit a chop job:
.. code-block:: bash
#!/bin/bash
#PBS -A P28100045
#PBS -N chopRTP
#PBS -j oe
#PBS -q regular
#PBS -l walltime=12:00:00
#PBS -l select=1:ncpus=72:ompthreads=72
export EXE=${chop}
export RUNID=${PBS_JOBNAME}
#Replace this with your module set
module purge
module restore kaiju
module list
hostname
date
export OMP_NUM_THREADS=72
export KMP_STACKSIZE=128M
echo "Running $EXE"
./chop.x ${RUNID}.xml > ${RUNID}.out
date
This job can be submitted with the command
.. code-block:: bash
qsub RunChop.pbs

@@ -9,10 +9,8 @@ This page provides links to documentation for CHIMP.
.. toctree::
:maxdepth: 1
calcdb.x
chimpQuickStart
chimpXML
chop.x
psd.x
push.x
slice.x
trace.x

@@ -431,5 +431,5 @@ Output
Creates runID.job#.h5part files (one for each job, holding the particles it
simulated). These files can be read directly into VisIt or ParaView. It is
beneficial to also extract a 2D plane from the equator of the MHD data (see
:doc:`slice.x <slice.x>`) and visualize particle location and MHD solution
slice.x documentation) and visualize particle location and MHD solution
together for context.

@@ -1,280 +0,0 @@
2D Slice from GAMERA output (slice.x)
=====================================
Introduction
------------
slice.x extracts a 2D slice (at Z=z0 or Y=y0) from GAMERA output, either on
the MAGE grid or interpolated onto a cartesian or polar grid. Output is an
hdf5 file that can be visualized and analyzed similarly to GAMERA output
files.
Example XML file
----------------
.. code-block:: xml
<?xml version="1.0" ?>
<Kaiju>
<Chimp>
<sim runid="ebXY"/>
<time T0="73800.0" dt="15.0" tFin="81000.1"/>
<fields doMHD="T" ebfile="msphere" grType="LFM" isMPI="T"/>
<parallel Ri="8" Rj="8" Rk="1"/>
<domain dtype="LFM" xSun="30.0" xTail="-50.0" yzMax="30.0"/>
<units uid="EARTH"/>
<slice doXY="T" grType="XY" z0="1.0" xSun="20" xTail="-20" yM="20" Nx1="400" Nx2="400"/>
<interp wgt="TSC"/>
<tracer epsds="0.20"/>
<output doSlim="T" doTrc="T"/>
<plasmapause doPP="F"/>
</Chimp>
</Kaiju>
Parameter Descriptions
----------------------
``<sim>`` (optional): Specify identifying information for this computation.
``runid`` (optional, default ``"Sim"``): String specifying an identifier for
this run of ``slice.x``. A best practice is to use the ``runid`` in the name
of the XML file.
``<time>`` (optional): Specify time range and interval for magnetic field
calculation.
``T0`` (optional, default ``"0.0"``): Start time (simulated seconds) for
ground magnetic field calculation, relative to start of simulation results
used as input.
``dt`` (optional, default ``"1.0"``): Time interval and output cadence
(simulated seconds) for ground magnetic field calculation.
``tFin`` (optional, default ``"60.0"``): Stop time (simulated seconds) for
ground magnetic field calculation, relative to start of simulation results
used as input.
``<fields>`` (required): Describes the input data from a MAGE model run.
``ebfile`` (optional, default ``"ebdata.h5"``): Path to HDF5 file containing
the electric and magnetic fields computed by a MAGE model run.
``grType`` (optional, default ``"EGG"``): String specifying grid type used by
the MAGE output file. Valid values are ``"EGG"``, ``"LFM"``, ``"SPH"``. If the
string is not one of the supported grid types, the default value (``"EGG"``)
is used, and a warning message is printed.
``doEBFix`` (optional, default ``"false"``): Set to ``"true"`` to "clean" the
electric field E so that the dot product of the electric and magnetic fields
is 0. See ``ebinterp.F90``.
``doMHD`` (optional, default ``"false"``): Set to ``"true"`` to pass the full
set of magnetohydrodynamic variables to CHIMP, rather than just electric and
magnetic fields. See ``ebtypes.F90``.
``isMPI`` (optional, default ``"false"``): Set to ``"true"`` if the MAGE
results file was generated with an MPI version of the model. See
``eblCstd.F90``.
``<domain>`` (optional): Options for the problem domain.
``dtype`` (optional, default ``"SPH"``): Domain over which to perform
CHIMP calculations. Separate from the grid, this enables the user to perform
the calculation on a subset of the grid to reduce computation where it is not
needed. See ``gridloc.F90``. Valid values are ``"SPH"``, ``"LFM"``,
``"LFMCYL"``, ``"MAGE"``, ``"EGG"``, ``"ELL"``.
``rClosed`` (optional, default set by choice of ``units/uid``): Radial value
that a field line endpoint must reach to be considered closed. See
``chmpunits.F90``.
``rmax`` (optional, default computed): Maximum radius of the Domain region.
See ``gridloc.F90``.
``rmin`` (optional, default computed): Minimum radius of the Domain region.
See ``gridloc.F90``.
``xSun`` (optional, default ``"20.0"``): If dtype is ``"LFM"`` or ``"MAGE"``,
the Domain region includes all i-shells whose distances along the Earth-Sun
line are less than this value (in Re).
``xTail`` (optional, default ``"-100.0"``): If dtype is ``"LFM"`` or
``"MAGE"``, the Domain region includes cells in the magnetotail out to this
value (in Re).
``yzMax`` (optional, default ``"40.0"``): If dtype is ``"LFM"`` or ``"MAGE"``,
the Domain region includes cells with Y and Z coordinates between +/- yzMax
(in Re).
``<output>`` (optional): Options related to driver output.
``timer`` (optional, default ``"false"``): Set to ``"true"`` to turn on
timing flags. See ``starter.F90``.
``tsOut`` (optional, default ``"10"``): Cadence for writing diagnostics to
the run-log file. See ``starter.F90``.
``doEQProj`` (optional, default ``"false"``): Set to ``"true"`` to include
equatorial variables, projected down to the magnetic equator along the field
line from the cell location (i.e. Xeq, Yeq, whether the field line is open or
closed, etc.). See ``chmpdefs.F90``.
``doSlim`` (optional, default ``"false"``): Set to ``"true"`` to remove
vector electric field and current data from slice.x output. See
``chmpdefs.F90``.
``doTrc`` (optional, default ``"false"``): Similar to doEQProj, used in
slice.x. See ``chmpdefs.F90``.
``<parallel>`` (optional): Options used if ebfile was generated using an MPI
version of the code (read if fields/isMPI is set to ``"true"``; file names are
of the form ebfile_Ri_Rj_Rk_i_j_k.gam.h5).
``Ri`` (optional, default ``"1"``): Number of ranks used in decomposition of
the ``"i"`` dimension. See ``iotable.F90``.
``Rj`` (optional, default ``"1"``): Number of ranks used in decomposition of
the ``"j"`` dimension. See ``iotable.F90``.
``Rk`` (optional, default ``"1"``): Number of ranks used in decomposition of
the ``"k"`` dimension. See ``iotable.F90``.
``doOldNaming`` (optional, default ``"false"``): Allows backward
compatibility with MHD files generated with the now-deprecated naming
convention. See ``chmpdefs.F90``.
``<tracer>`` (optional): Options related to field line tracing performed by
CHIMP.
``epsds`` (optional, default ``"1.0e-2"``): Tolerance for field line tracing
computations. See ``chmpdefs.F90``.
``<units>`` (optional): Name of the units system used in the model run.
``uID`` (optional, default ``"Earth"``): See ``chmpunits.F90``. Valid values
are ``"EARTH"``, ``"EARTHCODE"``, ``"JUPITER"``, ``"JUPITERCODE"``,
``"SATURN"``, ``"SATURNCODE"``, ``"HELIO"``, ``"LFM"``, ``"LFMJUPITER"``.
``<interp>`` (optional): Options related to interpolation
``wgt`` (optional, default ``"TSC"``): Sets 1D interpolation type. Valid
values are ``"TSC"`` (1D triangular shaped cloud), ``"LIN"`` (linear),
``"QUAD"`` (parabolic). See ``starter.F90``.
``<slice>`` (optional): Options specific to the slice.x driver. See ``sliceio.F90``.
``doXY`` (optional, default ``"true"``): Perform a slice in the XY plane; if
``"false"``, perform a slice in the XZ plane.
``z0`` (optional, default ``"0.0"``): Take a slice at Z=z0, if doXY is
``"true"``.
``y0`` (optional, default ``"0.0"``): Take a slice at Y=y0, if doXY is
``"false"``.
``grType`` (optional, default ``"XY"``): String specifying an identifier for
the grid to be used in the sliced plane. Valid values are ``"XY"``
(cartesian), ``"RP"`` (polar), ``"LFM2D"`` (MAGE grid).
``Nx1`` (optional, default ``"128"``): Number of cells in X or R depending on
grid specified. Not used if grType is set to ``"LFM2D"``.
``Nx2`` (optional, default ``"256"``): Number of cells in Y/Z or Phi
depending on the grid specified. Not used if grType is set to ``"LFM2D"``.
``Npow`` (optional, default ``"0"``): Number of times to increase the grid
used in the sliced plane by a factor of 2.
``xSun`` (optional, default ``"12.5"``): If ``grType="XY"``, sets the maximum
value of X used to initialize the slice grid. If ``grType="LFM2D"``, used
similarly to domain/xSun: all i-shells whose distance along the Earth-Sun
line is less than this value (in Re) are included in the sliced grid.
``xTail`` (optional, default ``"-20.0"``): If ``grType="XY"``, sets the
minimum value of X used to initialize the slice grid.
``yM`` (optional, default ``"20.0"``): If ``grType="XY"``, the minimum/maximum
values of the grid of the second axis (either Y or Z depending on the value of
doXY) are set to +/-yM.
``Rin`` (optional, default ``"Rin of ebFile grid"``): If ``grType="RP"``, sets
the minimum value of R used to initialize the slice grid.
``Rout`` (optional, default ``"Max R in the +X direction of ebFile grid"``):
If ``grType="RP"``\ , sets the maximum value of R used to initialize the slice
grid.
``Pin`` (optional, default ``"0.0"``): If ``grType="RP"``, sets the minimum
value of longitude (in degrees) used to set the slice grid. A value of 0
corresponds to the +X direction.
``Pout`` (optional, default ``"360.0"``): If ``grType="RP"``, sets the maximum
value of longitude (in degrees) used to set the slice grid.
``doLog`` (optional, default ``"false"``): Set to ``"true"`` to distribute
cells uniformly in log-space between Rin/Rout values.
``dSlc`` (optional, default ``"0.05"``): Spacing used to average slice over
y0+/-dSlc or z0+/-dSlc, depending on the value of doXY.
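For an ``"XY"`` slice, the extent parameters above determine the cell size in each direction; a minimal sketch (assuming a uniform grid, which the parameters imply):

```python
def xy_cell_size(xSun=12.5, xTail=-20.0, yM=20.0, Nx1=128, Nx2=256):
    """Cell size (dx, dy) of a uniform XY slice grid built from the extents above."""
    dx = (xSun - xTail) / Nx1          # X spans [xTail, xSun]
    dy = 2.0 * yM / Nx2                # second axis spans [-yM, +yM]
    return dx, dy

# Example XML above: xSun=20, xTail=-20, yM=20, Nx1=Nx2=400
print(xy_cell_size(20.0, -20.0, 20.0, 400, 400))  # 0.1 Re cells in both directions
```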
``<plasmapause>`` (optional): Options for the calculation to determine the
plasmapause location in the MHD file.
``doPP`` (optional, default ``"false"``): Set to ``"true"`` to calculate the
plasmapause location in the equator and include it in the slice.x output
file. See ``chmpfields.F90``.
``Lin`` (optional, default ``"2.0"``): Minimum L-shell value at which to
begin the plasmapause calculation. See ``plasmaputils.F90``.
``Lout`` (optional, default ``"6.0"``): Maximum L-shell value at which to
end the plasmapause calculation. See ``plasmaputils.F90``.
``Nl`` (optional, default ``"80"``): Number of cells/steps in L-shell. See
``plasmaputils.F90``.
``Nphi`` (optional, default ``"72"``): Number of cells/steps in longitude.
See ``plasmaputils.F90``.
``phi0`` (optional, default ``"0.0"``): Minimum longitude for the
plasmapause calculation (in radians; 0 is in the +X direction). See
``plasmaputils.F90``.
``phi1`` (optional, default ``"2*PI"``): Maximum longitude for the
plasmapause calculation. See ``plasmaputils.F90``.
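The L-shell/longitude search grid implied by these parameters can be sketched as follows (assuming uniformly spaced steps; whether the endpoints are cell centers or edges is an implementation detail of ``plasmaputils.F90``):

```python
import math

def plasmapause_samples(Lin=2.0, Lout=6.0, Nl=80,
                        phi0=0.0, phi1=2.0 * math.pi, Nphi=72):
    """Uniform sample points in L-shell and longitude for the plasmapause scan."""
    L = [Lin + i * (Lout - Lin) / (Nl - 1) for i in range(Nl)]
    phi = [phi0 + j * (phi1 - phi0) / (Nphi - 1) for j in range(Nphi)]
    return L, phi

L, phi = plasmapause_samples()
print(len(L), len(phi))  # 80 72
```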
Run Script example
------------------
An example PBS script on ``cheyenne``, ``RunSlice.pbs``, to submit a slice job:
.. code-block:: bash
#!/bin/bash
#PBS -A P28100045
#PBS -N sliceXZ
#PBS -j oe
#PBS -q regular
#PBS -l walltime=4:00:00
#PBS -l select=1:ncpus=72:ompthreads=72
export EXE=${slice}
export RUNID=${PBS_JOBNAME}
#Replace this with your module set
module purge
module restore kaiju
module list
hostname
date
export OMP_NUM_THREADS=72
export KMP_STACKSIZE=128M
echo "Running $EXE"
./slice.x ${RUNID}.xml > ${RUNID}.out
date
This job can be submitted with the command
.. code-block:: bash
qsub RunSlice.pbs

@@ -1,14 +0,0 @@
Computational Costs
===================
Various estimates/values of the computational cost of different run
configurations on different machines. Useful for HEC requests and such.
GR (GAMERA+RCM)
---------------
GTR (GAMERA+RCM+TIEGCM)
-----------------------
* `Cheyenne estimates <https://docs.google.com/spreadsheets/d/e/2PACX-1vSn5llccs3AIBRdqMym3VOMY6ltwTWp_VtMNv_h2bAW6sJc2ONMz_UE6M0SpdZ_doByBgPZbxXpz98B/pubhtml>`_

@@ -1,13 +0,0 @@
Computational Costs
===================
Introduction
------------
This page provides links to documentation on the computational resources
needed to run the MAGE model.
.. toctree::
:maxdepth: 1
compCosts

@@ -1,11 +0,0 @@
Distributing
============
Table of Contents
-----------------
.. toctree::
:maxdepth: 1
kaipy_for_pip
kaipy_for_conda

@@ -1,135 +0,0 @@
Building a ``conda``-installable distribution of ``kaipy``
==========================================================
Introduction
------------
This page will walk you through the process of creating a packaged version of
the ``kaipy`` software that can be installed with the ``conda`` Python package
management tool. The ``conda`` tool can install packages from local package
files, as well as from web-based repositories such as the standard Anaconda
repositories.
Setup
-----
Preparing ``conda``-installable packages requires modules beyond the standard
python distribution. You can install these modules into your current python
environment using the ``conda`` package manager:
.. code-block:: bash
conda install conda-build
conda install anaconda-client
Building the package
--------------------
The process of building the ``conda``-installable packages is very simple -
but it assumes you have already performed a ``pip``\ -based package build and
upload to PyPI. This approach allows maintenance of both package flavors with
a minimum of effort. If needed, this procedure can be updated in the future to
remove the ``pip``\ -package prerequisite.
Once you have cloned the ``kaiju`` repository, move into the repository
directory and run the following commands to build the package:
.. code-block:: bash
cd kaiju
conda build kaipy
This command reads the file ``kaiju/meta.yaml``\ , and performs all of the
steps required to build a ``conda``\ -installable package. The package files,
by default, are created in a subdirectory of your ``conda``\ -based ``python``
installation. For example, if your ``python`` is installed under:
.. code-block:: bash
$HOME/miniconda3
you would find the package files in a path similar to:
.. code-block:: bash
$HOME/miniconda3/envs/PYTHON_ENVIRONMENT_NAME/conda-bld/osx-64/kaipy-0.0.2-py38_0.tar.bz2
where ``PYTHON_ENVIRONMENT_NAME`` is the name of the ``conda``-based
``python`` environment that you used to run the ``conda build`` command. Note
that in this directory you may see more than one file, since previous builds
are not removed. Also, the version numbers you see in the filenames will
increase with time.
Testing the package locally
---------------------------
Once the package has been created locally, perform a test installation into
your local ``python`` environment. Assuming you have run the build described
in the previous step, you can perform the local installation with the command:
.. code-block:: bash
conda install --use-local $HOME/miniconda3/envs/kaiju-3.8/conda-bld/osx-64/kaipy-0.0.2-py38_0.tar.bz2
where ``kaiju-3.8`` should be replaced with the name of your ``conda``-based
``python`` environment.
Now launch your ``python`` and load the ``kaipy`` package:
.. code-block:: bash
python -c "import kaipy"
If this command produces no output, it worked - you have installed ``kaipy``
from your local ``pip`` package.
Uploading the package to the Anaconda repository
------------------------------------------------
Once you have built and tested the package locally, you can upload it to the
standard Anaconda repository so that others can easily install ``kaipy``
using ``conda``. Uploading to the Anaconda repository requires the use of an
Anaconda account and password.
.. code-block:: bash
anaconda login
# Enter account email and password when prompted.
anaconda upload $HOME/miniconda3/envs/kaiju-3.8/conda-bld/osx-64/kaipy-0.0.2-py38_0.tar.bz2
anaconda logout
where ``kaiju-3.8`` should be replaced with the name of your ``conda``-based
``python`` environment.
*IMPORTANT NOTE*\ : Currently, the ``kaipy`` package on Anaconda is managed by
the ``cgsapl`` account, which is attached to the email address
``eric.winter@jhuapl.edu``. If you need to upload a ``conda``-installable
package, please see `Eric Winter <mailto:eric.winter@jhuapl.edu>`_ for the CGS
Anaconda account credentials.
Testing the package from the repository
---------------------------------------
Once the package has been uploaded to the Anaconda repository, perform a test
installation into your local ``conda``\ -based ``python`` environment.
Assuming you have run the steps described above, you can perform the
installation from the Anaconda repository with the command:
.. code-block:: bash
# Uninstall any previous version for now.
conda uninstall kaipy
conda install -c cgsapl kaipy
*NOTE*\ : The ``-c cgsapl`` indicates that the package will be installed from
our "channel" (\ ``cgsapl``\ ) on the Anaconda repository.
Now launch your ``python`` and load the ``kaipy`` package:
.. code-block:: bash
python -c "import kaipy"
If this command produces no output, it worked - you have installed ``kaipy``
from the Anaconda repository.

@@ -1,119 +0,0 @@
Building a ``pip``-installable distribution of ``kaipy``
========================================================
Introduction
------------
This page will walk you through the process of creating a packaged version of
the ``kaipy`` software that can be installed with the ``pip`` Python package
management tool. The ``pip`` tool can install packages from local package
files, as well as from web-based repositories such as
`PyPI (the Python Package Index) <https://pypi.org>`_.
Setup
-----
The standard python distribution already contains the tools needed to build
``pip``-installable packages, but an additional module
(`twine <https://pypi.org/project/twine/>`_) is needed to permit the package
to be uploaded to PyPI so it can be installed by any user.
You can install ``twine`` into your current python environment using the
``pip`` package manager:
.. code-block:: bash
pip install twine
If you are using a ``conda``-based python environment, you can install
``twine`` using the ``conda`` package manager:
.. code-block:: bash
conda install twine
Building the package
--------------------
The process of building the ``pip``\ -installable packages is very simple.
Once you have cloned the ``kaiju`` repository, move into the repository
directory and run the ``setup.py`` script like this:
.. code-block:: bash
cd kaiju
python setup.py sdist
This will create a subdirectory called ``dist``, containing an installable
tarball for the ``kaipy`` package:
.. code-block:: bash
(kaiju-3.8) C02FC1QCMD6T-ML:kaiju winteel1$ ls dist
kaipy-0.0.1.tar.gz kaipy-0.0.2.tar.gz
Note that you may see more than one file, since previous builds are not
removed. Also, the version numbers you see in the filenames will increase with
time.
Testing the package locally
---------------------------
Once the package has been created locally, perform a test installation into
your local python environment. Assuming you have run the build described in
the previous step, you can perform the local installation with the command:
.. code-block:: bash
pip install dist/kaipy-0.0.2.tar.gz
Now launch your ``python`` and load the ``kaipy`` package:
.. code-block:: bash
python -c "import kaipy"
If this command produces no output, it worked - you have installed ``kaipy``
from your local ``pip`` package.
Uploading the package
---------------------
Once you have built and tested the package locally, you can upload it to PyPI
so that others can easily install ``kaipy`` using ``pip``. Uploading to PyPI
requires the use of a PyPI account and password.
.. code-block:: bash
python -m twine upload dist/*
# Enter account email and password when prompted.
*IMPORTANT NOTE*\ : Currently, the ``kaipy`` package on PyPI is managed by
the "Center for Geospace Storms" account, which is attached to the email
address ``eric.winter@jhuapl.edu``. If you need to upload a
``pip``-installable package, please see
`Eric Winter <mailto:eric.winter@jhuapl.edu>`_ for the CGS PyPI account
credentials.
Testing the package from the repository
---------------------------------------
Once the package has been uploaded to PyPI, perform a test installation into
your local python environment. Assuming you have run the steps described
above, you can perform the installation from PyPI with the command:
.. code-block:: bash
# Uninstall any previous version for now.
pip uninstall kaipy
pip install kaipy
Now launch your ``python`` and load the ``kaipy`` package:
.. code-block:: bash
python -c "import kaipy"
If this command produces no output, it worked - you have installed ``kaipy``
from PyPI.

@@ -4,6 +4,8 @@ Heliosphere Simulations with MAGE
.. toctree::
:maxdepth: 1
helioQuickStart
create_gamhelio_ensemble
compCostsHelio
gibson-low
helio-cme

@@ -0,0 +1,15 @@
Internal ``kaiju`` documentation
======================================
These are documents that are not ready or not intended for public consumption.
.. toctree::
:maxdepth: 2
chimp/index
heliosphere/index
magnetosphere/index
rcm/index
testing/index
user_rules/index

@@ -1,80 +0,0 @@
.. code-block:: yaml
name: kaiju-3.7
channels:
- defaults
dependencies:
- ca-certificates=2022.07.19=hecd8cb5_0
- certifi=2022.6.15=py37hecd8cb5_0
- libcxx=12.0.0=h2f01273_0
- libffi=3.3=hb1e8313_2
- ncurses=6.3=hca72f7f_3
- openssl=1.1.1q=hca72f7f_0
- pip=22.1.2=py37hecd8cb5_0
- python=3.7.13=hdfd78df_0
- readline=8.1.2=hca72f7f_1
- sqlite=3.39.2=h707629a_0
- tk=8.6.12=h5d9f67b_0
- wheel=0.37.1=pyhd3eb1b0_0
- xz=5.2.5=hca72f7f_1
- zlib=1.2.12=h4dc903c_2
- pip:
- ai-cdas==1.2.3
- aioftp==0.21.3
- aiohttp==3.8.1
- aiosignal==1.2.0
- astropy==4.3.1
- async-timeout==4.0.2
- asynctest==0.13.0
- attrs==22.1.0
- cdasws==1.7.40
- cdflib==0.4.4
- cftime==1.6.1
- charset-normalizer==2.1.0
- configparser==5.2.0
- cycler==0.11.0
- dataclasses-json==0.5.7
- fonttools==4.35.0
- frozenlist==1.3.1
- h5py==3.7.0
- idna==3.3
- importlib-metadata==4.12.0
- iniconfig==1.1.1
- kiwisolver==1.4.4
- marshmallow==3.17.0
- marshmallow-enum==1.5.1
- matplotlib==3.5.2
- multidict==6.0.2
- mypy-extensions==0.4.3
- netcdf4==1.6.0
- numpy==1.21.6
- packaging==21.3
- pandas==1.3.5
- parfive==1.5.1
- pillow==9.2.0
- pluggy==1.0.0
- progressbar==2.5
- py==1.11.0
- pyerfa==2.0.0.1
- pyhdf==0.10.5
- pyparsing==3.0.9
- pytest==7.1.2
- python-dateutil==2.8.2
- pytz==2022.2.1
- requests==2.28.1
- scipy==1.7.3
- setuptools==65.0.2
- six==1.16.0
- spacepy==0.3.0
- sunpy==3.1.8
- tomli==2.0.1
- tqdm==4.64.0
- typing-extensions==4.3.0
- typing-inspect==0.7.1
- urllib3==1.26.11
- wget==3.2
- xarray==0.20.2
- yarl==1.8.1
- zipp==3.8.1
prefix: /Users/winteel1/miniconda3/envs/kaiju-3.7

@@ -1,83 +0,0 @@
.. code-block:: yaml
name: kaiju-3.7
channels:
- defaults
dependencies:
- ca-certificates=2022.07.19=hecd8cb5_0
- certifi=2022.6.15=py37hecd8cb5_0
- libcxx=12.0.0=h2f01273_0
- libffi=3.3=hb1e8313_2
- ncurses=6.3=hca72f7f_3
- openssl=1.1.1q=hca72f7f_0
- pip=22.1.2=py37hecd8cb5_0
- python=3.7.13=hdfd78df_0
- readline=8.1.2=hca72f7f_1
- sqlite=3.39.2=h707629a_0
- tk=8.6.12=h5d9f67b_0
- wheel=0.37.1=pyhd3eb1b0_0
- xz=5.2.5=hca72f7f_1
- zlib=1.2.12=h4dc903c_2
- pip:
- about-time==4.2.1
- ai-cdas==1.2.3
- aioftp==0.21.3
- aiohttp==3.8.1
- aiosignal==1.2.0
- alive-progress==3.0.1
- astropy==4.3.1
- async-timeout==4.0.2
- asynctest==0.13.0
- attrs==22.1.0
- cdasws==1.7.40
- cdflib==0.4.4
- cftime==1.6.1
- charset-normalizer==2.1.0
- configparser==5.2.0
- cycler==0.11.0
- dataclasses-json==0.5.7
- fonttools==4.35.0
- frozenlist==1.3.1
- grapheme==0.6.0
- h5py==3.7.0
- idna==3.3
- importlib-metadata==4.12.0
- iniconfig==1.1.1
- kiwisolver==1.4.4
- marshmallow==3.17.0
- marshmallow-enum==1.5.1
- matplotlib==3.5.2
- multidict==6.0.2
- mypy-extensions==0.4.3
- netcdf4==1.6.0
- numpy==1.21.6
- packaging==21.3
- pandas==1.3.5
- parfive==1.5.1
- pillow==9.2.0
- pluggy==1.0.0
- progressbar==2.5
- py==1.11.0
- pyerfa==2.0.0.1
- pyhdf==0.10.5
- pyparsing==3.0.9
- pytest==7.1.2
- python-dateutil==2.8.2
- pytz==2022.2.1
- requests==2.28.1
- scipy==1.7.3
- setuptools==65.0.2
- six==1.16.0
- spacepy==0.3.0
- sunpy==3.1.8
- tomli==2.0.1
- tqdm==4.64.0
- typing-extensions==4.3.0
- typing-inspect==0.7.1
- urllib3==1.26.11
- wget==3.2
- xarray==0.20.2
- yarl==1.8.1
- zipp==3.8.1
prefix: /Users/winteel1/miniconda3/envs/kaiju-3.7

View File

@@ -1,85 +0,0 @@
.. code-block::
name: kaiju-3.7
channels:
- defaults
dependencies:
- _libgcc_mutex=0.1=main
- _openmp_mutex=5.1=1_gnu
- ca-certificates=2022.07.19=h06a4308_0
- certifi=2022.6.15=py37h06a4308_0
- ld_impl_linux-64=2.38=h1181459_1
- libffi=3.3=he6710b0_2
- libgcc-ng=11.2.0=h1234567_1
- libgomp=11.2.0=h1234567_1
- libstdcxx-ng=11.2.0=h1234567_1
- ncurses=6.3=h5eee18b_3
- openssl=1.1.1q=h7f8727e_0
- pip=22.1.2=py37h06a4308_0
- python=3.7.13=h12debd9_0
- readline=8.1.2=h7f8727e_1
- sqlite=3.39.2=h5082296_0
- tk=8.6.12=h1ccaba5_0
- wheel=0.37.1=pyhd3eb1b0_0
- xz=5.2.5=h7f8727e_1
- zlib=1.2.12=h7f8727e_2
- pip:
- ai-cdas==1.2.3
- aioftp==0.21.3
- aiohttp==3.8.1
- aiosignal==1.2.0
- astropy==4.3.1
- async-timeout==4.0.2
- asynctest==0.13.0
- attrs==22.1.0
- cdasws==1.7.40
- cdflib==0.4.4
- cftime==1.6.1
- charset-normalizer==2.1.0
- configparser==5.2.0
- cycler==0.11.0
- dataclasses-json==0.5.7
- fonttools==4.36.0
- frozenlist==1.3.1
- h5py==3.7.0
- idna==3.3
- importlib-metadata==4.12.0
- iniconfig==1.1.1
- kiwisolver==1.4.4
- marshmallow==3.17.0
- marshmallow-enum==1.5.1
- matplotlib==3.5.2
- multidict==6.0.2
- mypy-extensions==0.4.3
- netcdf4==1.6.0
- numpy==1.21.6
- packaging==21.3
- pandas==1.3.5
- parfive==1.5.1
- pillow==9.2.0
- pluggy==1.0.0
- progressbar==2.5
- py==1.11.0
- pyerfa==2.0.0.1
- pyhdf==0.10.5
- pyparsing==3.0.9
- pytest==7.1.2
- python-dateutil==2.8.2
- pytz==2022.2.1
- requests==2.28.1
- scipy==1.7.3
- setuptools==65.0.2
- six==1.16.0
- spacepy==0.3.0
- sunpy==3.1.8
- tomli==2.0.1
- tqdm==4.64.0
- typing-extensions==4.3.0
- typing-inspect==0.8.0
- urllib3==1.26.11
- wget==3.2
- xarray==0.20.2
- yarl==1.8.1
- zipp==3.8.1
prefix: /home3/ewinter/miniconda3/envs/kaiju-3.7

View File

@@ -1,78 +0,0 @@
.. code-block::
name: kaiju-3.8
channels:
- defaults
dependencies:
- ca-certificates=2022.07.19=hecd8cb5_0
- certifi=2022.6.15=py38hecd8cb5_0
- libcxx=12.0.0=h2f01273_0
- libffi=3.3=hb1e8313_2
- ncurses=6.3=hca72f7f_3
- openssl=1.1.1q=hca72f7f_0
- pip=22.1.2=py38hecd8cb5_0
- python=3.8.13=hdfd78df_0
- readline=8.1.2=hca72f7f_1
- sqlite=3.39.2=h707629a_0
- tk=8.6.12=h5d9f67b_0
- wheel=0.37.1=pyhd3eb1b0_0
- xz=5.2.5=hca72f7f_1
- zlib=1.2.12=h4dc903c_2
- pip:
- ai-cdas==1.2.3
- aioftp==0.21.3
- aiohttp==3.8.1
- aiosignal==1.2.0
- astropy==4.3.1
- async-timeout==4.0.2
- attrs==22.1.0
- cdasws==1.7.40
- cdflib==0.4.4
- cftime==1.6.1
- charset-normalizer==2.1.0
- configparser==5.2.0
- cycler==0.11.0
- dataclasses-json==0.5.7
- fonttools==4.35.0
- frozenlist==1.3.1
- h5py==3.7.0
- idna==3.3
- iniconfig==1.1.1
- kiwisolver==1.4.4
- marshmallow==3.17.0
- marshmallow-enum==1.5.1
- matplotlib==3.5.2
- multidict==6.0.2
- mypy-extensions==0.4.3
- netcdf4==1.6.0
- numpy==1.21.6
- packaging==21.3
- pandas==1.4.3
- parfive==1.5.1
- pillow==9.2.0
- pluggy==1.0.0
- progressbar==2.5
- py==1.11.0
- pyerfa==2.0.0.1
- pyhdf==0.10.5
- pyparsing==3.0.9
- pytest==7.1.2
- python-dateutil==2.8.2
- pytz==2022.2.1
- pyyaml==6.0
- requests==2.28.1
- scipy==1.7.3
- setuptools==65.0.2
- six==1.16.0
- spacepy==0.3.0
- sunpy==3.1.8
- tomli==2.0.1
- tqdm==4.64.0
- typing-extensions==4.3.0
- typing-inspect==0.7.1
- urllib3==1.26.11
- wget==3.2
- xarray==0.20.2
- yarl==1.8.1
prefix: /Users/winteel1/miniconda3/envs/kaiju-3.8

View File

@@ -1,107 +0,0 @@
.. code-block::
name: kaiju-3.8
channels:
- defaults
dependencies:
- bleach=4.1.0=pyhd3eb1b0_0
- brotlipy=0.7.0=py38h9ed2024_1003
- ca-certificates=2022.10.11=hecd8cb5_0
- certifi=2022.6.15=py38hecd8cb5_0
- cffi=1.15.1=py38hc55c11b_0
- cmarkgfm=0.4.2=py38h9ed2024_0
- colorama=0.4.5=py38hecd8cb5_0
- cryptography=38.0.1=py38hf6deb26_0
- docutils=0.18.1=py38hecd8cb5_3
- future=0.18.2=py38_1
- importlib-metadata=4.11.3=py38hecd8cb5_0
- importlib_metadata=4.11.3=hd3eb1b0_0
- keyring=23.4.0=py38hecd8cb5_0
- libcxx=12.0.0=h2f01273_0
- libffi=3.3=hb1e8313_2
- ncurses=6.3=hca72f7f_3
- openssl=1.1.1s=hca72f7f_0
- packaging=21.3=pyhd3eb1b0_0
- pip=22.1.2=py38hecd8cb5_0
- pkginfo=1.8.3=py38hecd8cb5_0
- pycparser=2.21=pyhd3eb1b0_0
- pygments=2.11.2=pyhd3eb1b0_0
- pyopenssl=22.0.0=pyhd3eb1b0_0
- pyparsing=3.0.9=py38hecd8cb5_0
- pysocks=1.7.1=py38_1
- python=3.8.13=hdfd78df_0
- readline=8.1.2=hca72f7f_1
- readme_renderer=24.0=py38hecd8cb5_0
- requests=2.28.1=py38hecd8cb5_0
- requests-toolbelt=0.9.1=pyhd3eb1b0_0
- rfc3986=1.4.0=pyhd3eb1b0_0
- six=1.16.0=pyhd3eb1b0_1
- sqlite=3.39.2=h707629a_0
- tk=8.6.12=h5d9f67b_0
- twine=3.7.1=pyhd3eb1b0_0
- webencodings=0.5.1=py38_1
- wheel=0.37.1=pyhd3eb1b0_0
- xz=5.2.5=hca72f7f_1
- zipp=3.8.0=py38hecd8cb5_0
- zlib=1.2.12=h4dc903c_2
- pip:
- about-time==4.2.1
- ai-cdas==1.2.3
- aioftp==0.21.3
- aiohttp==3.8.1
- aiosignal==1.2.0
- alive-progress==3.0.1
- astropy==4.3.1
- async-timeout==4.0.2
- attrs==22.1.0
- cartopy==0.21.0
- cdasws==1.7.40
- cdflib==0.4.4
- cftime==1.6.1
- charset-normalizer==2.1.0
- configparser==5.2.0
- cycler==0.11.0
- dataclasses-json==0.5.7
- fonttools==4.35.0
- frozenlist==1.3.1
- grapheme==0.6.0
- h5py==3.7.0
- idna==3.3
- iniconfig==1.1.1
- kiwisolver==1.4.4
- marshmallow==3.17.0
- marshmallow-enum==1.5.1
- matplotlib==3.5.2
- multidict==6.0.2
- mypy-extensions==0.4.3
- netcdf4==1.6.0
- numpy==1.21.6
- pandas==1.4.3
- parfive==1.5.1
- pillow==9.2.0
- pluggy==1.0.0
- progressbar==2.5
- py==1.11.0
- pyerfa==2.0.0.1
- pyhdf==0.10.5
- pyproj==3.4.0
- pyshp==2.3.1
- pytest==7.1.2
- python-dateutil==2.8.2
- pytz==2022.2.1
- pyyaml==6.0
- scipy==1.7.3
- setuptools==65.0.2
- shapely==1.8.4
- spacepy==0.3.0
- sunpy==3.1.8
- tomli==2.0.1
- tqdm==4.64.0
- typing-extensions==4.3.0
- typing-inspect==0.7.1
- urllib3==1.26.11
- wget==3.2
- xarray==0.20.2
- yarl==1.8.1
prefix: /Users/winteel1/miniconda3/envs/kaiju-3.8

View File

@@ -1,83 +0,0 @@
.. code-block::
name: kaiju-3.8
channels:
- defaults
dependencies:
- _libgcc_mutex=0.1=main
- _openmp_mutex=5.1=1_gnu
- ca-certificates=2022.07.19=h06a4308_0
- certifi=2022.6.15=py38h06a4308_0
- ld_impl_linux-64=2.38=h1181459_1
- libffi=3.3=he6710b0_2
- libgcc-ng=11.2.0=h1234567_1
- libgomp=11.2.0=h1234567_1
- libstdcxx-ng=11.2.0=h1234567_1
- ncurses=6.3=h5eee18b_3
- openssl=1.1.1q=h7f8727e_0
- pip=22.1.2=py38h06a4308_0
- python=3.8.13=h12debd9_0
- readline=8.1.2=h7f8727e_1
- sqlite=3.39.2=h5082296_0
- tk=8.6.12=h1ccaba5_0
- wheel=0.37.1=pyhd3eb1b0_0
- xz=5.2.5=h7f8727e_1
- zlib=1.2.12=h7f8727e_2
- pip:
- ai-cdas==1.2.3
- aioftp==0.21.3
- aiohttp==3.8.1
- aiosignal==1.2.0
- astropy==4.3.1
- async-timeout==4.0.2
- attrs==22.1.0
- cdasws==1.7.40
- cdflib==0.4.4
- cftime==1.6.1
- charset-normalizer==2.1.0
- configparser==5.2.0
- cycler==0.11.0
- dataclasses-json==0.5.7
- fonttools==4.36.0
- frozenlist==1.3.1
- h5py==3.7.0
- idna==3.3
- iniconfig==1.1.1
- kiwisolver==1.4.4
- marshmallow==3.17.0
- marshmallow-enum==1.5.1
- matplotlib==3.5.2
- multidict==6.0.2
- mypy-extensions==0.4.3
- netcdf4==1.6.0
- numpy==1.21.6
- packaging==21.3
- pandas==1.4.3
- parfive==1.5.1
- pillow==9.2.0
- pluggy==1.0.0
- progressbar==2.5
- py==1.11.0
- pyerfa==2.0.0.1
- pyhdf==0.10.5
- pyparsing==3.0.9
- pytest==7.1.2
- python-dateutil==2.8.2
- pytz==2022.2.1
- pyyaml==6.0
- requests==2.28.1
- scipy==1.7.3
- setuptools==65.0.2
- six==1.16.0
- spacepy==0.3.0
- sunpy==3.1.8
- tomli==2.0.1
- tqdm==4.64.0
- typing-extensions==4.3.0
- typing-inspect==0.8.0
- urllib3==1.26.11
- wget==3.2
- xarray==0.20.2
- yarl==1.8.1
prefix: /home3/ewinter/miniconda3/envs/kaiju-3.8

View File

@@ -31,8 +31,7 @@ Compilation
Assume Kaiju repository has been installed successfully at $KAIJUDIR, and
$KAIJUDIR/scripts has been added to $PATH and $KAIJUDIR added to $PYTHONPATH.
(Instructions for installation can be found
:doc:`here </quickStart/building/index>`).
The commands to compile for MPI parallelized Voltron, which is the mostly used
mode, include:
@@ -236,8 +235,7 @@ MPI Differences
Running a coupled Gamera-RCM case with MPI support requires three things:
#. Building the MPI version of the coupled executable - see the
:doc:`build instructions </quickStart/building/index>`.
#. Building the MPI version of the coupled executable.
#. Modifying case XML to supply additional MPI decomposition information
#. Modifying the submission script to request multiple nodes and use mpirun

View File

@@ -12,7 +12,7 @@ Get set up on Pleiades
#. Setup RSA tokens: copy from bitbucket to Pleiades Font End (pfe) and
`follow wiki <https://www.nas.nasa.gov/hecc/support/kb/enabling-your-rsa-securid-hard-token-(fob>`_\ _59.html)
`follow wiki <https://www.nas.nasa.gov/hecc/support/kb/enabling-your-rsa-securid-hard-token-fob_59.html>`_
#. Setup of ssh pass through,
`follow wiki <https://www.nas.nasa.gov/hecc/support/kb/setting-up-ssh-passthrough_232.html>`_
@@ -26,7 +26,7 @@ SecureID mobile app (dual-factor authentication):
#. Setup the sup client
`from wiki <https://www.nas.nasa.gov/hecc/support/kb/using-the-secure-unattended-proxy-(sup>`_\ _145.html)
`from wiki <https://www.nas.nasa.gov/hecc/support/kb/using-the-secure-unattended-proxy-sup_145.html>`_
This will enable you to send large files between remote and local servers with
``shiftc``

View File

@@ -5,11 +5,11 @@ Magnetosphere Simulations with MAGE
:maxdepth: 1
analysisTools/index
compCosts/index
exoOuterPlanets/index
hidra/index
mage/index
xml/index
planetaryQuickStart
Gamerasphere
gameraRCM
gameraRemix

View File

@@ -1,207 +0,0 @@
Comparison of heliosphere simulation results with satellite data
================================================================
Introduction
------------
Comparison of simulation results to observed data is a critical part of model
validation. The MAGE team has developed a set of tools to facilitate
comparison of the results of heliospheric simulations with measured spacecraft
data. The initial version of these tools supports comparison of the results
from runs of ``gamhelio.x`` or ``gamhelio_mpi.x`` to data from the
`ACE spacecraft <https://solarsystem.nasa.gov/missions/ace/in-depth/>`_. The
observations for the desired time period are retrieved from
`CDAWeb <https://cdaweb.gsfc.nasa.gov/>`_. Other spacecraft and data sources
will be added in the future as time permits.
The satellite comparison script
-------------------------------
The basic tool for heliospheric comparisons is the script ``helioSatComp.py``.
This script is available under the ``kaiju/scripts/datamodel`` directory of
your clone of the ``kaiju`` repository. Executing this script with the ``-h``
option will provide the following help text:
.. code-block:: bash
usage: helioSatComp.py [-h] [-c command] [-d] [--deltaT deltaT] [-i runid]
[-k] [-n number_segments] [-p path] [--Rin Rin]
[--Rout Rout] [-s satellite_id] [-v]
Extract satellite trajectory and observations for various
heliospheric spacecraft from CDAWeb. Produce comparisons between the
observations and corresponding gamhelio model results.
optional arguments:
-h, --help show this help message and exit
-c command, --cmd command
Full path to sctrack.x command (default: $KAIJUHOME/build/bin/sctrack.x).
-d, --debug Print debugging output (default: False).
--deltaT deltaT Time interval (seconds) for ephemeris points from CDAWeb (default: 3600.0).
-i runid, --id runid ID string of the run (default: hsphere)
-k, --keep Keep intermediate files (default: False).
-n number_segments, --numSeg number_segments
Number of segments to simultaneously process (default: 1).
-p path, --path path Path to directory containing gamhelio results (default: .)
--Rin Rin Inner boundary (Rsun) of gamhelio grid (default: 21.5).
--Rout Rout Outer boundary (Rsun) of gamhelio grid (default: 220.0).
-s satellite_id, --satId satellite_id
Name of Satellite to compare
-v, --verbose Print verbose output (default: False).
The ``-h, --help`` argument will print the help message above.
The ``-c command, --cmd command`` argument specifies the path to the Fortran
program ``sctrack.x``, which is used to interpolate the simulation results
to the times and locations for the ephemeris points for the specified
satellite. If you have added ``$KAIJUHOME/build/bin`` to your ``PATH``
environment variable, you should not have to set this option.
The ``-d, --debug`` argument causes the script to output a variety of
debugging output.
The ``--deltaT deltaT`` argument allows you to specify the time interval for
the satellite data that will be retrieved from CDAWeb. The default value of
3600 seconds (1 hour) is usually sufficient, but smaller values of this
argument will allow more detailed comparisons.
The ``-i runid, --id runid`` argument allows you to specify the run ID of the
simulation results file you are using. The run ID is the identifying substring
of the file name. For example, in the simulation results file
``msphere.gam.h5``, the run ID is ``msphere``. Set this argument as needed for
the file you are using for your comparison.
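In other words, the run ID is the leading token of the results file name. A minimal sketch (the helper ``run_id`` is hypothetical, not part of the kaiju scripts):

```python
def run_id(filename):
    """Return the run ID: the leading token of a results file name.

    E.g. "msphere.gam.h5" -> "msphere".
    """
    return filename.split(".", 1)[0]
```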
The ``-k, --keep`` argument forces the script to retain intermediate files
generated during the data retrieval from CDAWeb. This is useful for debugging.
The ``-n number_segments, --numSeg number_segments`` argument allows you to
specify the number of segments to use when processing the ephemeris. A segment
is a chunk of the ephemeris that is processed in parallel with other chunks.
For example, if you use the argument ``-n 2``, the ephemeris will be split
into 2 separate portions and processed in parallel.
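The segmenting described above can be sketched as follows; the equal-length split is an assumption based on this description, and ``split_interval`` is a hypothetical helper, not part of ``helioSatComp.py``:

```python
def split_interval(t0, t1, num_seg):
    """Split the time interval [t0, t1] into num_seg contiguous,
    equal-length segments, returned as (start, stop) pairs."""
    dt = (t1 - t0) / num_seg
    return [(t0 + i * dt, t0 + (i + 1) * dt) for i in range(num_seg)]
```

Each segment can then be handed to a separate ``sctrack.x`` invocation and the interpolated results concatenated afterwards.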
The ``-p path, --path path`` argument allows you to specify the path to a
directory containing the simulation results file you wish to use for the
satellite comparison. By default, the current directory is searched for the
file.
The ``--Rin Rin`` and ``--Rout Rout`` arguments allow you to specify the inner
and outer radii (in solar radii) of your simulation results. You should not
need to change these values.
The ``-s satellite_id, --satId satellite_id`` argument allows you to specify
the name of the spacecraft to use for the data comparison. Currently, only
'ACE' is supported.
The ``-v, --verbose`` argument causes the script to print informational
messages during execution.
Procedure
---------
The basic procedure for the satellite comparison is simple:

#. Run the ``helioSatComp.py`` script.
#. Examine the resulting plots.
You will almost always have to specify at least a few arguments to the
``helioSatComp.py`` script in order to perform the comparison with satellite
data. This is best explained using an example.
Example
-------
For this example, assume we have a directory containing a set of one or more
HDF5 files generated by ``gamhelio.x`` or ``gamhelio_mpi.x``. It might look
like this, if generated by ``gamhelio.x``:
.. code-block:: bash
heliogrid.h5 wsa.gam.Res.00000.h5 wsa.gam.h5
innerbc.h5 wsa.gam.Res.XXXXX.h5
The simulation results are in the file ``wsa.gam.h5``. To compare the
simulation results in this file to data measured by the ACE spacecraft during
the same time period, you will need to specify the run ID (``wsa``) and the
satellite ID (``ACE``). And for the sake of illustration, assume you have
built your code in a non-standard location (``$KAIJUHOME/build_serial`` rather
than ``$KAIJUHOME/build``). You can run the comparison with the following
command:
.. code-block:: bash
helioSatComp.py -v --id=wsa --cmd=$KAIJUHOME/build_serial/bin/sctrack.x --satId=ACE
When this command completes, your directory will contain several new files:
.. code-block:: bash
ACE-traj.png
ACE-error.txt
ACE.png
ACE.comp.cdf
sctrack.out
The file ``ACE-traj.png`` contains a set of plots illustrating the portion of
the spacecraft trajectory used in the comparison. An example is shown below:
.. image:: https://bitbucket.org/repo/kMoBzBp/images/3648821514-ACE-traj.png
:target: https://bitbucket.org/repo/kMoBzBp/images/3648821514-ACE-traj.png
:alt: ACE-traj.png
The file ``ACE-error.txt`` contains the error statistics for the variables
examined in the comparison. For the ACE spacecraft, these variables are:
``MagneticField``, ``Density``, ``Speed``, and ``Temperature``. Other
variables may be added later. This file looks something like this:
.. code-block:: bash
Errors for: Density
MAE: 12.838838182545178
MSE: 171.86416943415404
RMSE: 13.109697534045324
MAPE: 0.9660806312821668
RSE: 24.439823909708686
PE: -23.439823909708686
Errors for: Temperature
MAE: 16256587733.017332
MSE: 2.642806561220219e+20
RMSE: 16256711110.246805
MAPE: 214182.88316333634
RSE: 44568596463.807556
PE: -44568596462.807556
Errors for: Speed
MAE: 499.42446903122
MSE: 260272.54722637442
RMSE: 510.1691359013934
MAPE: 0.9986149471382262
RSE: 23.99323569578531
PE: -22.99323569578531
Errors for: Br
MAE: 3.0560021664580166
MSE: 12.410819281122311
RMSE: 3.5228992720658807
MAPE: 0.9957802458984243
RSE: 1.0323378185670626
PE: -0.03233781856706264
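As a sketch of how such statistics can be computed (``error_stats`` is a hypothetical helper; the RSE and PE definitions are assumptions, chosen so that PE = 1 - RSE, consistent with the values above):

```python
def error_stats(obs, model):
    """Compute comparison error statistics for paired observed/modeled values."""
    n = len(obs)
    mean_obs = sum(obs) / n
    diffs = [o - m for o, m in zip(obs, model)]
    mae = sum(abs(d) for d in diffs) / n          # mean absolute error
    mse = sum(d * d for d in diffs) / n           # mean squared error
    rmse = mse ** 0.5                             # root mean squared error
    # MAPE expressed as a fraction of the observed value (not a percentage)
    mape = sum(abs(d / o) for d, o in zip(diffs, obs)) / n
    # Relative squared error against a mean-value baseline (assumed definition)
    rse = sum(d * d for d in diffs) / sum((o - mean_obs) ** 2 for o in obs)
    pe = 1.0 - rse                                # prediction efficiency
    return {"MAE": mae, "MSE": mse, "RMSE": rmse,
            "MAPE": mape, "RSE": rse, "PE": pe}
```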
The file ``ACE.png`` contains plots which show the comparison of the
simulation and measured data. An example of this file is shown below.
.. image:: https://bitbucket.org/repo/kMoBzBp/images/1302341116-ACE.png
:target: https://bitbucket.org/repo/kMoBzBp/images/1302341116-ACE.png
:alt: ACE.png
The file ``ACE.comp.cdf`` is a `CDF file <https://cdf.gsfc.nasa.gov/>`_
containing the measured and interpolated simulated data for comparison.
The file ``sctrack.out`` contains a log generated by the program
``sctrack.x``, which performs the interpolation of the simulation results to
the ephemeris points.

View File

@@ -1,8 +0,0 @@
Satellite Comparisons
=====================
.. toctree::
:maxdepth: 1
heliosatcomp
msphsatcomp

View File

@@ -1,216 +0,0 @@
Comparison of magnetosphere simulation results with satellite data
==================================================================
Introduction
------------
Comparison of simulation results to observed data is a critical part of model
validation. The MAGE team has developed a set of tools to facilitate
comparison of the results of terrestrial magnetospheric simulations with data
measured by spacecraft. The initial version of these tools supports comparison
of the results from runs of ``voltron.x`` or ``voltron_mpi.x`` to data from
several spacecraft. The observations for the desired time period are retrieved
from `CDAWeb <https://cdaweb.gsfc.nasa.gov/>`_. Other spacecraft and data
sources will be added in the future as time permits.
The satellite comparison script
-------------------------------
The basic tool for terrestrial magnetospheric comparisons is the script
``msphSatComp.py``. This script is available under the
``kaiju/scripts/datamodel`` directory of your clone of the ``kaiju``
repository. Executing this script with the ``-h`` option will provide the
following help text:
.. code-block:: bash
usage: msphSatComp.py [-h] [-id runid] [-path path] [-cmd command] [-satId Satellite Id] [-numSeg Number of segments] [--keep]
Extracts information from satellite trajectory for various
spacecraft. Space craft data is pulled from CDAWeb. Output CDF files
contain data pulled from CDAWeb along with data extracted from GAMERA.
Image files of satellite comparisons are also produced.
optional arguments:
-h, --help show this help message and exit
-id runid RunID of data (default: msphere)
-path path Path to directory containing REMIX files (default: .)
-cmd command Full path to sctrack.x command
-satId Satellite Id Name of Satellite to compare
-numSeg Number of segments
Number of segments to simulateously process
--keep Keep intermediate files
The ``-h, --help`` argument will print the help message above.
The ``-id runid`` argument allows you to specify the run ID of the simulation
results file you are using. The run ID is the identifying substring of the
file name. For example, in the simulation results file ``msphere.gam.h5``, the
run ID is ``msphere``. Set this argument as needed for the file you are using
for your comparison.
The ``-path path`` argument allows you to specify the path to a directory
containing the simulation results files you wish to use for the satellite
comparison. By default, the current directory is searched for the file.
The ``-cmd command`` argument specifies the path to the Fortran program
``sctrack.x``, which is used to interpolate the simulation results to the
times and locations for the ephemeris points for the specified satellite. If
you have added ``$KAIJUHOME/build/bin`` to your ``PATH`` environment variable,
you should not have to set this option.
The ``-satId Satellite Id`` argument allows you to specify the name of the
spacecraft to use for the data comparison. The currently supported spacecraft
are ``GOES11``, ``GOES12``, ``GEOTAIL``, ``RBSPA``, ``RBSPB``, ``CLUSTER1``,
``CLUSTER2``, ``CLUSTER3``, ``CLUSTER4``, ``THEMISA``, ``THEMISB``,
``THEMISC``, ``THEMISD``, ``THEMISE``, ``MMS1``, ``MMS2``, ``MMS3``, ``MMS4``.
The ``-numSeg Number of segments`` argument allows you to specify the number
of segments to use when processing your simulation results. A segment is a
chunk of the ephemeris that is processed in parallel with other chunks. For
example, if you use the argument ``-numSeg 2``, the ephemeris will be split
into 2 separate portions and processed in parallel.
The ``--keep`` argument forces the script to retain intermediate files
generated during the data retrieval from CDAWeb. This is useful for debugging.
Procedure
---------
The basic procedure for the satellite comparison is simple:

#. Run the ``msphSatComp.py`` script.
#. Examine the resulting plots.
You will almost always have to specify at least a few arguments to the
``msphSatComp.py`` script in order to perform the comparison with satellite
data. This is best explained using an example.
Example
-------
For this example, assume we have a directory containing a set of HDF5 files
generated by ``voltron.x``. It might look like this:
.. code-block:: bash
bcwind.h5
lfmD.h5
msphere.RCM.Res.00000.h5
msphere.RCM.Res.00001.h5
msphere.RCM.Res.00002.h5
msphere.RCM.Res.XXXXX.h5
msphere.gam.Res.00000.h5
msphere.gam.Res.00001.h5
msphere.gam.Res.00002.h5
msphere.gam.Res.XXXXX.h5
msphere.gam.h5
msphere.mhd2imag.Res.00000.h5
msphere.mhd2imag.Res.00001.h5
msphere.mhd2imag.Res.00002.h5
msphere.mhd2imag.Res.XXXXX.h5
msphere.mhdrcm.h5
msphere.mix.Res.00000.h5
msphere.mix.Res.00001.h5
msphere.mix.Res.00002.h5
msphere.mix.Res.XXXXX.h5
msphere.mix.h5
msphere.rcm.h5
msphere.volt.Res.00000.h5
msphere.volt.Res.00001.h5
msphere.volt.Res.00002.h5
msphere.volt.Res.XXXXX.h5
msphere.volt.h5
rcmconfig.h5
The simulation results are in the file named ``msphere.gam.h5``. To compare
the simulation results in these files to data measured by the CLUSTER2
spacecraft during the same time period, you will need to specify the run ID
(``msphere``) and the satellite ID (``CLUSTER2``). And for the sake of
illustration, assume you have built your code in a non-standard location
(``$KAIJUHOME/build_serial`` rather than ``$KAIJUHOME/build``). You can run
the comparison with the following command:
.. code-block:: bash
msphSatComp.py -id msphere -cmd $KAIJUHOME/build_serial/bin/sctrack.x -satId CLUSTER2
When this command completes, your directory will contain several new files:
.. code-block:: bash
CLUSTER2-error.txt
CLUSTER2-traj.png
CLUSTER2.comp.cdf
CLUSTER2.png
The file ``CLUSTER2-error.txt`` contains the error statistics for the
variables examined in the comparison. This file looks something like this:
.. code-block:: bash
Errors for: MagneticField,0
MAE: 1.9302265646500913
MSE: 3.9882785771605453
RMSE: 1.9970674943928524
MAPE: 0.9822277021232483
RSE: 15.696381926540374
PE: -14.696381926540374
Errors for: MagneticField,1
MAE: 0.3233028097934419
MSE: 0.18348260708329495
RMSE: 0.4283486980058361
MAPE: 0.16666691285364862
RSE: 0.3886266092261268
PE: 0.6113733907738732
Errors for: MagneticField,2
MAE: 0.33418238031124625
MSE: 0.20086373069554195
RMSE: 0.4481782354103576
MAPE: 0.8533060324302149
RSE: 0.3807987168111359
PE: 0.619201283188864
The file ``CLUSTER2-traj.png`` contains a set of plots illustrating the
portion of the spacecraft trajectory used in the comparison. An example is
shown below:
.. image:: https://bitbucket.org/repo/kMoBzBp/images/2466003589-CLUSTER2-traj.png
:target: https://bitbucket.org/repo/kMoBzBp/images/2466003589-CLUSTER2-traj.png
:alt: CLUSTER2-traj.png
The file ``CLUSTER2.png`` contains plots which show the comparison of the
simulation and measured data. An example of this file is shown below.
.. image:: https://bitbucket.org/repo/kMoBzBp/images/3007037072-CLUSTER2.png
:target: https://bitbucket.org/repo/kMoBzBp/images/3007037072-CLUSTER2.png
:alt: CLUSTER2.png
The file ``CLUSTER2.comp.cdf`` is a `CDF file <https://cdf.gsfc.nasa.gov/>`_
containing the measured and interpolated simulated data for comparison.
Parallel processing
-------------------
If you have a long simulation interval, you may want to use one of the
parallel-processing versions of the magnetosphere satellite comparison scripts
to speed up the comparison with the satellite observations. In general, these
scripts have the same options as the serial version, but they allow you to run
interactively on multiple processors, or to submit a batch job on NCAR's HPC
system, to compute the interpolations across N time slices in parallel.
The ``msphParallelComp.py`` script computes the interpolation in parallel in
an interactive session: it uses the ``numSeg`` option to break the interval
into ``numSeg`` fractions of equal length, then combines the results back into
a single file.
The ``msphPBsSatComp.py`` version of the script submits jobs to NCAR's data
analysis machine Casper to complete parallel processing of the specified
output files. To use this script, users need to supply a valid NCAR project
number with the ``-acct`` option. Users do not need to specify the number of
segments; the script looks at the run length and computes the number of jobs
needed to complete the parallel processing.
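The job sizing described above can be sketched as follows; ``num_jobs`` is a hypothetical helper, and the per-job segment length and ceiling rule are assumptions, since the script's actual sizing logic is not shown here:

```python
import math

def num_jobs(run_length_s, segment_length_s):
    """Number of equal-length PBS jobs needed to cover the full run,
    assuming each job processes one segment of fixed length."""
    return max(1, math.ceil(run_length_s / segment_length_s))
```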

View File

@@ -1,254 +0,0 @@
A simple example
================
The instructions below walk you through the process of running a simple
magnetosphere model to test your build of the ``kaiju`` code.
Preparing to run a ``kaiju`` model
----------------------------------
To set up your environment to run the ``kaiju`` software, the following steps
are required.
1. Load the same modules that you loaded when you built the ``kaiju``
software. For example, on ``derecho``, you would run the following commands:
.. code-block:: bash
$ module load ncarenv/23.06
$ module load cmake/3.26.3
$ module load craype/2.7.20
$ module load intel/2023.0.0
$ module load ncarcompilers/1.0.0
$ module load cray-mpich/8.1.25
$ module load hdf5-mpi/1.12.2
2. *Source* (not *run*) the environment setup script for the ``kaiju``
software. For example, if the root of your ``kaiju`` repository clone is
at ``$HOME/kaiju``, then you would run:
.. code-block:: bash
$ source $HOME/kaiju/scripts/setupEnvironment.sh
Running a simple magnetosphere problem
--------------------------------------
The ``kaiju`` software needs several files in order to run. The detailed steps
for creating these files have been combined into a script called
``makeitso.py``. The script is provided in the ``kaiju`` code repository. You
can see the options supported by ``makeitso.py`` by running it with the
``--help`` or ``-h`` command-line option.
.. code-block:: bash
$ makeitso.py --help
usage: makeitso.py [-h] [--clobber] [--debug] [--mode MODE] [--options_path OPTIONS_PATH] [--verbose]
Interactive script to prepare a MAGE magnetosphere model run.
optional arguments:
-h, --help show this help message and exit
--clobber Overwrite existing options file (default: False).
--debug, -d Print debugging output (default: False).
--mode MODE User mode (BASIC|INTERMEDIATE|EXPERT) (default: BASIC).
--options_path OPTIONS_PATH, -o OPTIONS_PATH
Path to JSON file of options (default: None)
--verbose, -v Print verbose output (default: False).
For this example, we will run the code on ``derecho``, and use the default
``BASIC`` mode, which requires the minimum amount of input from the user. At
each prompt, you can either type in a value, or hit the ``Return`` key to
accept the default value (shown in square brackets at the end of the prompt).
To get started, run ``makeitso.py`` with no arguments:
.. code-block:: bash
$ source ~/local/cdf/3.9.0/bin/definitions.B
$ makeitso.py
Name to use for PBS job(s) [geospace]:
Do you have an existing boundary condition file to use? (Y|N) [N]:
Start date for simulation (yyyy-mm-ddThh:mm:ss) [2016-08-09T09:00:00]:
Stop date for simulation (yyyy-mm-ddThh:mm:ss) [2016-08-09T11:00:00]:
Do you want to split your job into multiple segments? (Y|N) [N]:
GAMERA grid type (D|Q|O|H) [Q]:
Name of HPC system (derecho|pleiades) [pleiades]: derecho
PBS account name [ewinter]: <YOUR_ACCOUNT_HERE>
Run directory [.]:
Path to kaiju installation [/glade/u/home/ewinter/cgs/aplkaiju/kaiju-private/development/kaiju-private]: <YOUR_KAIJUHOME_HERE>
Path to kaiju build directory [/glade/u/home/ewinter/cgs/aplkaiju/kaiju-private/development/kaiju-private/build_mpi]: <YOUR_BUILD_DIRECTORY_HERE>
PBS queue name (develop|main|preempt) [main]:
Job priority (regular|economy) [economy]:
WARNING: You are responsible for ensuring that the wall time is sufficient to run a segment of your simulation!
Requested wall time for each PBS job segment (HH:MM:SS) [01:00:00]: 12:00:00
(GAMERA) Relative path to HDF5 file containing solar wind boundary conditions [bcwind.h5]:
(VOLTRON) File output cadence in simulated seconds [60.0]:
After these inputs, the script fetches data from CDAWeb for the specified time
range to use in the solar wind boundary condition file, and generates XML and
PBS files for the run, as well as a grid file for use in the model.
You should see output similar to this:
.. code-block:: bash
Generating Quad LFM-style grid ...
Output: lfmQ.h5
Size: (96,96,128)
Inner Radius: 2.000000
Sunward Outer Radius: 30.000000
Tail Outer Radius: 322.511578
Low-lat BC: 45.000000
Ring params:
<ring gid="lfm" doRing="T" Nr="8" Nc1="8" Nc2="16" Nc3="32" Nc4="32" Nc5="64" Nc6="64" Nc7="64" Nc8="64"/>
Writing to lfmQ.h5
/glade/u/home/ewinter/miniconda3/envs/kaiju-3.8/lib/python3.8/site-packages/spacepy/time.py:2367: UserWarning: Leapseconds may be out of date. Use spacepy.toolbox.update(leapsecs=True)
warnings.warn('Leapseconds may be out of date.'
Retrieving f10.7 data from CDAWeb
Retrieving solar wind data from CDAWeb
Using Bx fields
Bx Fit Coefficients are [-3.78792744 -0.77915822 -1.0774984 ]
Saving "OMNI_HRO_1MIN.txt_bxFit.png"
Converting to Gamera solar wind file
Found 21 variables and 120 lines
Offsetting from LFM start ( 0.00 min) to Gamera start ( 0.00 min)
Saving "OMNI_HRO_1MIN.txt.png"
Writing Gamera solar wind to bcwind.h5
Reading /glade/derecho/scratch/ewinter/cgs/aplkaiju/kaipy-private/development/kaipy-private/kaipy/rcm/dktable
Reading /glade/derecho/scratch/ewinter/cgs/aplkaiju/kaipy-private/development/kaipy-private/kaipy/rcm/wmutils/chorus_polynomial.txt
Dimension of parameters in Chorus wave model, Kp: 6 MLT: 97 L: 41 Ek: 155
Wrote RCM configuration to rcmconfig.h5
Template creation complete!
The PBS scripts ['./geospace-00.pbs'] have been created, each with a corresponding XML file. To submit the jobs with the proper dependency (to ensure each segment runs in order), please run the script geospace_pbs.sh like this:
bash geospace_pbs.sh
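The dependency chaining performed by ``geospace_pbs.sh`` can be sketched as follows. This is a minimal illustration of the submission logic, not the actual script contents; the segment names are placeholders, and the job ID that ``qsub`` would print is stubbed out so the control flow can be read without a scheduler:

```shell
# Chain PBS segments so each one starts only after the previous
# segment finishes successfully (an afterok dependency).
prev=""
for seg in geospace-00 geospace-01; do
  if [ -z "$prev" ]; then
    cmd="qsub $seg.pbs"
  else
    cmd="qsub -W depend=afterok:$prev $seg.pbs"
  fi
  echo "$cmd"
  prev="$seg.jobid"   # the real script captures the job ID printed by qsub
done
```

With real job IDs, each later segment is held in the queue until its predecessor completes without error.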
You should now see the following files in your run directory:
.. code-block:: bash
$ ls
bcwind.h5 geospace.json OMNI_HRO_1MIN.txt_bxFit.png
geospace-00.pbs geospace_pbs.sh OMNI_HRO_1MIN.txt.png
geospace-00.xml lfmQ.h5 rcmconfig.h5
The image files are summaries of the CDAWeb data used in the initial condition
file (``bcwind.h5``). Those plots should look like this:
.. image:: Bx_fit.png
.. image:: sw.png
Finally, submit the model run using the script generated by ``makeitso.py``.
You will see the resulting PBS job ID.
.. code-block:: bash
$ bash geospace_pbs.sh
7808651.desched1
Once the job is started in the queue, it should take about 80 minutes to run
(on ``derecho``). When complete, you will see the following in your run
directory:
.. code-block:: bash
$ ls
bcwind.h5 geospace_0004_0004_0001_0002_0003_0000.gam.Res.00002.h5
geospace_0004_0004_0001_0000_0000_0000.gam.h5 geospace_0004_0004_0001_0002_0003_0000.gam.Res.00003.h5
geospace_0004_0004_0001_0000_0000_0000.gam.Res.00000.h5 geospace_0004_0004_0001_0002_0003_0000.gam.Res.00004.h5
geospace_0004_0004_0001_0000_0000_0000.gam.Res.00001.h5 geospace_0004_0004_0001_0002_0003_0000.gam.Res.XXXXX.h5
geospace_0004_0004_0001_0000_0000_0000.gam.Res.00002.h5 geospace_0004_0004_0001_0003_0000_0000.gam.h5
geospace_0004_0004_0001_0000_0000_0000.gam.Res.00003.h5 geospace_0004_0004_0001_0003_0000_0000.gam.Res.00000.h5
geospace_0004_0004_0001_0000_0000_0000.gam.Res.00004.h5 geospace_0004_0004_0001_0003_0000_0000.gam.Res.00001.h5
geospace_0004_0004_0001_0000_0000_0000.gam.Res.XXXXX.h5 geospace_0004_0004_0001_0003_0000_0000.gam.Res.00002.h5
geospace_0004_0004_0001_0000_0001_0000.gam.h5 geospace_0004_0004_0001_0003_0000_0000.gam.Res.00003.h5
geospace_0004_0004_0001_0000_0001_0000.gam.Res.00000.h5 geospace_0004_0004_0001_0003_0000_0000.gam.Res.00004.h5
geospace_0004_0004_0001_0000_0001_0000.gam.Res.00001.h5 geospace_0004_0004_0001_0003_0000_0000.gam.Res.XXXXX.h5
geospace_0004_0004_0001_0000_0001_0000.gam.Res.00002.h5 geospace_0004_0004_0001_0003_0001_0000.gam.h5
geospace_0004_0004_0001_0000_0001_0000.gam.Res.00003.h5 geospace_0004_0004_0001_0003_0001_0000.gam.Res.00000.h5
geospace_0004_0004_0001_0000_0001_0000.gam.Res.00004.h5 geospace_0004_0004_0001_0003_0001_0000.gam.Res.00001.h5
geospace_0004_0004_0001_0000_0001_0000.gam.Res.XXXXX.h5 geospace_0004_0004_0001_0003_0001_0000.gam.Res.00002.h5
geospace_0004_0004_0001_0000_0002_0000.gam.h5 geospace_0004_0004_0001_0003_0001_0000.gam.Res.00003.h5
geospace_0004_0004_0001_0000_0002_0000.gam.Res.00000.h5 geospace_0004_0004_0001_0003_0001_0000.gam.Res.00004.h5
geospace_0004_0004_0001_0000_0002_0000.gam.Res.00001.h5 geospace_0004_0004_0001_0003_0001_0000.gam.Res.XXXXX.h5
geospace_0004_0004_0001_0000_0002_0000.gam.Res.00002.h5 geospace_0004_0004_0001_0003_0002_0000.gam.h5
geospace_0004_0004_0001_0000_0002_0000.gam.Res.00003.h5 geospace_0004_0004_0001_0003_0002_0000.gam.Res.00000.h5
geospace_0004_0004_0001_0000_0002_0000.gam.Res.00004.h5 geospace_0004_0004_0001_0003_0002_0000.gam.Res.00001.h5
geospace_0004_0004_0001_0000_0002_0000.gam.Res.XXXXX.h5 geospace_0004_0004_0001_0003_0002_0000.gam.Res.00002.h5
geospace_0004_0004_0001_0000_0003_0000.gam.h5 geospace_0004_0004_0001_0003_0002_0000.gam.Res.00003.h5
geospace_0004_0004_0001_0000_0003_0000.gam.Res.00000.h5 geospace_0004_0004_0001_0003_0002_0000.gam.Res.00004.h5
geospace_0004_0004_0001_0000_0003_0000.gam.Res.00001.h5 geospace_0004_0004_0001_0003_0002_0000.gam.Res.XXXXX.h5
geospace_0004_0004_0001_0000_0003_0000.gam.Res.00002.h5 geospace_0004_0004_0001_0003_0003_0000.gam.h5
geospace_0004_0004_0001_0000_0003_0000.gam.Res.00003.h5 geospace_0004_0004_0001_0003_0003_0000.gam.Res.00000.h5
geospace_0004_0004_0001_0000_0003_0000.gam.Res.00004.h5 geospace_0004_0004_0001_0003_0003_0000.gam.Res.00001.h5
geospace_0004_0004_0001_0000_0003_0000.gam.Res.XXXXX.h5 geospace_0004_0004_0001_0003_0003_0000.gam.Res.00002.h5
geospace_0004_0004_0001_0001_0000_0000.gam.h5 geospace_0004_0004_0001_0003_0003_0000.gam.Res.00003.h5
geospace_0004_0004_0001_0001_0000_0000.gam.Res.00000.h5 geospace_0004_0004_0001_0003_0003_0000.gam.Res.00004.h5
geospace_0004_0004_0001_0001_0000_0000.gam.Res.00001.h5 geospace_0004_0004_0001_0003_0003_0000.gam.Res.XXXXX.h5
geospace_0004_0004_0001_0001_0000_0000.gam.Res.00002.h5 geospace-00.o7808651
geospace_0004_0004_0001_0001_0000_0000.gam.Res.00003.h5 geospace-00.pbs
geospace_0004_0004_0001_0001_0000_0000.gam.Res.00004.h5 geospace-00.xml
geospace_0004_0004_0001_0001_0000_0000.gam.Res.XXXXX.h5 geospace.gamCpl.h5
geospace_0004_0004_0001_0001_0001_0000.gam.h5 geospace.gamCpl.Res.00000.h5
geospace_0004_0004_0001_0001_0001_0000.gam.Res.00000.h5 geospace.gamCpl.Res.00001.h5
geospace_0004_0004_0001_0001_0001_0000.gam.Res.00001.h5 geospace.gamCpl.Res.00002.h5
geospace_0004_0004_0001_0001_0001_0000.gam.Res.00002.h5 geospace.gamCpl.Res.00003.h5
geospace_0004_0004_0001_0001_0001_0000.gam.Res.00003.h5 geospace.gamCpl.Res.00004.h5
geospace_0004_0004_0001_0001_0001_0000.gam.Res.00004.h5 geospace.gamCpl.Res.XXXXX.h5
geospace_0004_0004_0001_0001_0001_0000.gam.Res.XXXXX.h5 geospace.json
geospace_0004_0004_0001_0001_0002_0000.gam.h5 geospace.mhd2imag.Res.00000.h5
geospace_0004_0004_0001_0001_0002_0000.gam.Res.00000.h5 geospace.mhd2imag.Res.00001.h5
geospace_0004_0004_0001_0001_0002_0000.gam.Res.00001.h5 geospace.mhd2imag.Res.00002.h5
geospace_0004_0004_0001_0001_0002_0000.gam.Res.00002.h5 geospace.mhd2imag.Res.00003.h5
geospace_0004_0004_0001_0001_0002_0000.gam.Res.00003.h5 geospace.mhd2imag.Res.00004.h5
geospace_0004_0004_0001_0001_0002_0000.gam.Res.00004.h5 geospace.mhd2imag.Res.XXXXX.h5
geospace_0004_0004_0001_0001_0002_0000.gam.Res.XXXXX.h5 geospace.mhdrcm.h5
geospace_0004_0004_0001_0001_0003_0000.gam.h5 geospace.mix.h5
geospace_0004_0004_0001_0001_0003_0000.gam.Res.00000.h5 geospace.mix.Res.00000.h5
geospace_0004_0004_0001_0001_0003_0000.gam.Res.00001.h5 geospace.mix.Res.00001.h5
geospace_0004_0004_0001_0001_0003_0000.gam.Res.00002.h5 geospace.mix.Res.00002.h5
geospace_0004_0004_0001_0001_0003_0000.gam.Res.00003.h5 geospace.mix.Res.00003.h5
geospace_0004_0004_0001_0001_0003_0000.gam.Res.00004.h5 geospace.mix.Res.00004.h5
geospace_0004_0004_0001_0001_0003_0000.gam.Res.XXXXX.h5 geospace.mix.Res.XXXXX.h5
geospace_0004_0004_0001_0002_0000_0000.gam.h5 geospace_pbs.sh
geospace_0004_0004_0001_0002_0000_0000.gam.Res.00000.h5 geospace.rcm.h5
geospace_0004_0004_0001_0002_0000_0000.gam.Res.00001.h5 geospace.RCM.Res.00000.h5
geospace_0004_0004_0001_0002_0000_0000.gam.Res.00002.h5 geospace.RCM.Res.00001.h5
geospace_0004_0004_0001_0002_0000_0000.gam.Res.00003.h5 geospace.RCM.Res.00002.h5
geospace_0004_0004_0001_0002_0000_0000.gam.Res.00004.h5 geospace.RCM.Res.00003.h5
geospace_0004_0004_0001_0002_0000_0000.gam.Res.XXXXX.h5 geospace.RCM.Res.00004.h5
geospace_0004_0004_0001_0002_0001_0000.gam.h5 geospace.RCM.Res.XXXXX.h5
geospace_0004_0004_0001_0002_0001_0000.gam.Res.00000.h5 geospace.volt.h5
geospace_0004_0004_0001_0002_0001_0000.gam.Res.00001.h5 geospace.volt.Res.00000.h5
geospace_0004_0004_0001_0002_0001_0000.gam.Res.00002.h5 geospace.volt.Res.00001.h5
geospace_0004_0004_0001_0002_0001_0000.gam.Res.00003.h5 geospace.volt.Res.00002.h5
geospace_0004_0004_0001_0002_0001_0000.gam.Res.00004.h5 geospace.volt.Res.00003.h5
geospace_0004_0004_0001_0002_0001_0000.gam.Res.XXXXX.h5 geospace.volt.Res.00004.h5
geospace_0004_0004_0001_0002_0002_0000.gam.h5 geospace.volt.Res.XXXXX.h5
geospace_0004_0004_0001_0002_0002_0000.gam.Res.00000.h5 lfmQ.h5
geospace_0004_0004_0001_0002_0002_0000.gam.Res.00001.h5 nodefile.7808651.desched1
geospace_0004_0004_0001_0002_0002_0000.gam.Res.00002.h5 OMNI_HRO_1MIN.txt_bxFit.png
geospace_0004_0004_0001_0002_0002_0000.gam.Res.00003.h5 OMNI_HRO_1MIN.txt.png
geospace_0004_0004_0001_0002_0002_0000.gam.Res.00004.h5 rcmconfig.h5
geospace_0004_0004_0001_0002_0002_0000.gam.Res.XXXXX.h5 tmp
geospace_0004_0004_0001_0002_0003_0000.gam.h5 voltron_mpi.x
geospace_0004_0004_0001_0002_0003_0000.gam.Res.00000.h5 voltron_mpi.x-geospace-00.out
geospace_0004_0004_0001_0002_0003_0000.gam.Res.00001.h5
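The long numeric tags in these filenames appear to encode the MPI decomposition and the tile owned by each rank (here a 4x4x1 decomposition). Assuming the pattern ``<runid>_<Ni>_<Nj>_<Nk>_<i>_<j>_<k>.gam.h5`` — an inference from the listing above, not a documented guarantee — the tags can be picked apart with plain shell parameter expansion:

```shell
# Assumed filename pattern (inferred from the listing, not documented):
#   <runid>_<Ni>_<Nj>_<Nk>_<i>_<j>_<k>.gam.h5
# Ni,Nj,Nk = MPI decomposition; i,j,k = this rank's tile index.
fname="geospace_0004_0004_0001_0002_0003_0000.gam.h5"
base="${fname%.gam.h5}"            # strip the suffix
k="${base##*_}";  base="${base%_*}"
j="${base##*_}";  base="${base%_*}"
i="${base##*_}";  base="${base%_*}"
nk="${base##*_}"; base="${base%_*}"
nj="${base##*_}"; base="${base%_*}"
ni="${base##*_}"; runid="${base%_*}"
echo "runid=$runid decomp=${ni}x${nj}x${nk} tile=${i},${j},${k}"
```

Files without the six-field tag (for example ``geospace.volt.h5``) are single-file outputs that do not follow this per-rank convention.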
Now perform a quick visualization of the results from your model using the
``msphpic.py`` script, provided in the ``kaipy`` package.
.. code-block:: bash
$ msphpic.py -id geospace
This script will create a file called ``qkmsphpic.png``, which should look
like this:
.. image:: qkmsphpic.png
View File
@@ -1,7 +1,7 @@
MPI Test Template
=================
.. code-block:: fortran
.. code-block:: none
module <INSERT MODULE NAME>
use testHelperMpi
View File
@@ -1,701 +0,0 @@
Running a test case
===================
.. Serial
.. ------
.. Before you begin
.. ~~~~~~~~~~~~~~~~
.. Before proceeding, initialize your ``kaiju`` environment by loading the required modules. For the serial version of ``kaiju`` on Pleiades, these commands should work:
.. .. code-block:: shell
.. module purge
.. module load pkgsrc/2021Q2
.. module load comp-intel/2020.4.304
.. module load hdf5/1.8.18_serial
.. For the MPI version of ``kaiju`` on Pleiades, use:
.. .. code-block:: shell
.. module purge
.. module load pkgsrc/2021Q2
.. module load comp-intel/2020.4.304
.. module load mpi-hpe/mpt.2.23
.. module load hdf5/1.8.18_mpt
.. Next, you need to run the setup script for the CDF software used by the ``kaiju`` software:
.. .. code-block:: shell
.. source CDF_INSTALL_DIR/bin/definitions.B
.. where ``CDF_INSTALL_DIR`` is the installation directory for your CDF software. The ``definitions.B`` script is for the ``bash`` shell. Setup scripts for other shells are in the same directory.
.. At this point, you should make sure your python environment is properly configured. For these examples, we assume the use of a ``conda``-based virtual environment for Python 3.8 called ``kaiju-python-3.8``, which should contain all of the required python modules described in LINK_TO_PYTHON_SECTION. Assuming this environment was already created, activate it with the command:
.. .. code-block:: shell
.. conda activate kaiju-python-3.8
.. Next, run the ``kaiju`` setup script appropriate for your shell. For example, using the ``bash`` shell on Pleiades, run:
.. .. code-block:: shell
.. source KAIJU_INSTALL_DIR/scripts/setupEnvironment.sh
.. where ``KAIJU_INSTALL_DIR`` is the path to the directory created when you cloned the ``kaiju`` repository. This script sets the ``KAIJUHOME`` environment variable (it will be the same as ``KAIJU_INSTALL_DIR``), and adds ``kaiju``-specific entries in the ``PATH`` and ``PYTHONPATH`` environment variables.
.. Finally, some of the ``kaiju`` scripts require the ``geopack-2008`` compiled Python module. Add it to your Python module search path using the command:
.. .. code-block:: shell
.. export PYTHONPATH=$KAIJUHOME/external/geopack-2008/build/lib.linux-x86_64-3.8:$PYTHONPATH
.. where the ``lib.XXX`` string is for Pleiades; it may differ for your machine.
.. NOTE: The scripts used below are available as part of the ``kaiju`` code distribution. They were designed to illustrate the steps needed to run a simple model and examine the results. The scripts illustrate how to set up a run, and how to extract data from the result files. Feel free to use these scripts as the starting point for your own scripts using the ``kaiju`` code.
.. NOTE: All examples below assume use of the ``bash`` shell. Modify as needed for your preferred shell. The primary differences will
.. be replacing ``export X=Y`` with ``setenv X Y`` when setting environment variables in the ``csh``/``tcsh`` shells.
.. 2-D field loop convection
.. ~~~~~~~~~~~~~~~~~~~~~~~~~
.. The first example we will run is the ``loop2d`` model. This is a simple 2-D model illustrating the convective transport of a loop-shaped magnetic field created by a linear current. This test case is discussed at `Field Loop Test <https://www.astro.princeton.edu/~jstone/Athena/tests/field-loop/Field-loop.html>`_. This test case uses the *serial* version of the ``kaiju`` code.
.. Begin by adding the directory containing the serial ``kaiju`` binaries to your ``PATH`` environment variable.
.. .. code-block:: shell
.. export PATH=$KAIJUHOME/build/bin:$PATH
.. Next, add the directory containing the ``loop2d`` example scripts to your ``PATH`` environment variable:
.. .. code-block:: shell
.. export PATH=$KAIJUHOME/quickstart/loop2d:$PATH
.. Now create a new directory to run the ``loop2d`` example (it can be anywhere).
.. .. code-block:: shell
.. cd $HOME
.. mkdir -p kaiju_test/loop2d
.. cd kaiju_test/loop2d
.. The next step is to generate the configuration files for the ``loop2d`` model. This is done using the ``prepare_loop2d.py`` utility:
.. .. code-block:: shell
.. prepare_loop2d.py -v
.. This command will create 3 files in your current directory:
.. * ``loop2d.ini`` - An .ini-format initialization file for running ``gamera.x`` on the ``loop2d`` example.
.. * ``loop2d.xml`` - An XML-format initialization file for running ``gamera.x`` on the ``loop2d`` example, created from ``loop2d.ini``.
.. * ``loop2d.pbs`` - A PBS job script for running the ``gamera.x`` binary on the ``loop2d`` example.
.. NOTE: The conversion from ``.ini`` to ``.xml`` is still under development, so both initialization files are created from templates.
.. NOTE: The PBS script is designed for use on the Pleiades system at NASA Ames. If you are working in a non-HPC environment, the commands listed in the ``loop2d.pbs`` file can be executed manually on your command line.
.. Next, submit the PBS job script for execution. On Pleiades, the job is submitted using the ``qsub`` command:
.. .. code-block:: shell
.. qsub loop2d.pbs
.. Once the job has been accepted in the queue, the run should take about 20-30 seconds (on Pleiades).
.. When complete, you should see in your working directory the input files created by ``prepare_loop2d.py``, and the output files created by ``gamera.x``:
.. .. code-block:: shell
.. bash-4.2$ ls -1
.. gamera.x.loop2d.out
.. loop2d.gam.Res.00000.h5
.. loop2d.gam.Res.XXXXX.h5
.. loop2d.gam.h5
.. loop2d.ini
.. loop2d.o12913999
.. loop2d.pbs
.. loop2d.xml
.. NOTE: The ``gamera.x.loop2d.out`` file contains the terminal output generated by the ``gamera.x`` executable. The file ``loop2d.o12913999`` (the digits will be different for your run) contains the PBS log for the job. The ``.h5`` files are the HDF5 output files created by ``gamera.x``.
.. We can now analyze the results of the model run. A simple example analysis can be run using the utility ``run_loop2d_checks.py``:
.. .. code-block:: shell
.. $ run_loop2d_checks.py -v
.. Computing volume-integrated magnetic pressure.
.. Volume-integrated magnetic pressure (SUM(Pb*dV), code units):
.. At start: 1.3978992818223555e-07
.. At end: 1.3276572728297693e-07
.. Your values for the volume-integrated magnetic pressure should be very close to these values. If they are significantly different, please double-check your build and run procedure. If you are unable to identify the cause of the discrepancy, please contact the ``kaiju`` team.
.. Finally, generate a quick-look plot illustrating the model results. For this case, the quick-look plot shows the magnetic pressure in the first and last simulation frames:
.. .. code-block:: shell
.. $ create_loop2d_quicklook.py -v
.. This script will create the file ``loop2d_quicklook.png``. It should look like this:
.. .. image:: https://bitbucket.org/repo/kMoBzBp/images/720765616-loop2d_quicklook.png
.. :target: https://bitbucket.org/repo/kMoBzBp/images/720765616-loop2d_quicklook.png
.. :alt: loop2d_quicklook.png
.. **Note for Cheyenne users**: Sometimes you may get an error message of
.. *"No module h5py"*. If ncar_pylib has been loaded, try to
.. ``deactivate`` and then re-activate the environment.
.. Geospace example (serial)
.. ~~~~~~~~~~~~~~~~~~~~~~~~~
.. An additional serial test case is available, in the ``geo_serial`` model. This model examines the terrestrial magnetosphere during the period 2016-08-09T09:00:00 to 2016-08-09T11:00:00.
.. Begin by adding the directory containing the serial ``kaiju`` binaries to your ``PATH`` environment variable.
.. .. code-block:: shell
.. export PATH=$KAIJUHOME/build/bin:$PATH
.. To use the ``geo_serial`` test case:
.. .. code-block:: shell
.. export PATH=$KAIJUHOME/quickstart/geo_serial:$PATH
.. This is a more complex example of using the serial ``kaiju`` code, as it includes several preprocessing steps, and uses the ``voltron.x`` binary rather than the ``gamera.x`` binary.
.. The next step is to generate the configuration and input files for the ``geo_serial`` model. This is done using the ``prepare_geo_serial.py`` utility:
.. .. code-block:: shell
.. prepare_geo_serial.py -v
.. This command will create 7 files in your current directory:
.. * ``geo_serial.ini`` - An .ini-format initialization file for running ``voltron.x`` on the ``geo_serial`` example.
.. * ``geo_serial.pbs`` - A PBS job script for running ``voltron.x`` on the ``geo_serial`` example.
.. * ``geo_serial.xml`` - An XML-format initialization file for running ``voltron.x`` on the ``geo_serial`` example, created from ``geo_serial.ini``.
.. * ``bcwind.h5`` - An HDF5 file containing solar wind data from `OMNIWeb <https://omniweb.gsfc.nasa.gov/ow.html>`_ to use as a boundary condition for the simulation.
.. * ``lfmD.h5`` - An HDF5 file containing the grid to use for the simulation.
.. * ``OMNI_HRO_1MIN.txt.png`` - A quick-look plot of the solar wind data from OMNIWeb.
.. * ``rcmconfig.h5`` - An HDF5 file containing configuration parameters for the Rice Convection Model (RCM) used by ``voltron.x``.
.. NOTE: The detailed steps for manually generating these files are described `here <geoQuickStart>`_.
.. NOTE: The conversion from ``.ini`` to ``.xml`` is still under development, so both initialization files are created from templates.
.. The PBS script is designed for use on the Pleiades system at NASA Ames. If you are working in a non-HPC environment, the commands listed in this file can be executed manually on your command line. On Pleiades, the job is submitted using the ``qsub`` command:
.. .. code-block:: shell
.. qsub geo_serial.pbs
.. Once the job has been accepted in the queue, the run should take about an hour (on Pleiades).
.. When complete, you should see in your working directory the output files created by ``voltron.x``, along with the files created by ``prepare_geo_serial.py``:
.. .. code-block:: shell
.. bash-4.2$ ls -1
.. OMNI_HRO_1MIN.txt.png
.. bcwind.h5
.. geo_serial.ini
.. geo_serial.o12920410
.. geo_serial.pbs
.. geo_serial.xml
.. lfmD.h5
.. msphere.RCM.Res.00000.h5
.. msphere.RCM.Res.00001.h5
.. msphere.RCM.Res.00002.h5
.. msphere.RCM.Res.XXXXX.h5
.. msphere.gam.Res.00000.h5
.. msphere.gam.Res.00001.h5
.. msphere.gam.Res.00002.h5
.. msphere.gam.Res.XXXXX.h5
.. msphere.gam.h5
.. msphere.mhd2imag.Res.00000.h5
.. msphere.mhd2imag.Res.00001.h5
.. msphere.mhd2imag.Res.00002.h5
.. msphere.mhd2imag.Res.XXXXX.h5
.. msphere.mhdrcm.h5
.. msphere.mix.Res.00000.h5
.. msphere.mix.Res.00001.h5
.. msphere.mix.Res.00002.h5
.. msphere.mix.Res.XXXXX.h5
.. msphere.mix.h5
.. msphere.rcm.h5
.. msphere.volt.Res.00000.h5
.. msphere.volt.Res.00001.h5
.. msphere.volt.Res.00002.h5
.. msphere.volt.Res.XXXXX.h5
.. msphere.volt.h5
.. rcmconfig.h5
.. voltron.x.geo_serial.out
.. The ``voltron.x.geo_serial.out`` file contains the terminal output generated by the ``voltron.x`` executable. The file ``geo_serial.o12920410`` (the digits will be different) contains the PBS log for the job.
.. We can now analyze the results of the model run. A simple analysis can be run using the utility ``run_geo_serial_checks.py``:
.. .. code-block:: shell
.. $ run_geo_serial_checks.py -v --runid=msphere
.. Computing volume-integrated magnetic pressure.
.. Volume-integrated magnetic pressure (SUM(Pb*dV), code units):
.. At start: 450951034.31847197
.. At end: 742967690.5682685
.. Your values for the volume-integrated magnetic pressure should be very close to these values. This is actually a dummy statistic - it is not scientifically useful, but the code illustrates how to access and manipulate the results using the ``kaiju`` software.
.. Finally, generate a quick-look plot illustrating the model results. For this case, the quick-look plot shows several plots of different data generated by the model:
.. .. code-block:: shell
.. $ create_geo_serial_quicklook.py -v --runid=msphere
.. This script will create the file ``qkpic.png``. It should look like this:
.. .. image:: https://bitbucket.org/repo/kMoBzBp/images/4238837198-qkpic.png
.. :target: https://bitbucket.org/repo/kMoBzBp/images/4238837198-qkpic.png
.. :alt: qkpic.png
.. Heliosphere example (serial)
.. ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. The ``helio_serial`` example is a heliospheric model, using the heliosphere-specific build of ``gamera`` (``gamhelio.x``).
.. Begin by adding the directory containing the serial ``kaiju`` binaries to your ``PATH`` environment variable.
.. .. code-block:: shell
.. export PATH=$KAIJUHOME/build/bin:$PATH
.. To use the ``helio_serial`` test case, add the model directory to your command path:
.. .. code-block:: shell
.. export PATH=$KAIJUHOME/quickstart/helio_serial:$PATH
.. The next step is to generate the configuration and input files for the ``helio_serial`` model. This is done using the ``prepare_helio_serial.py`` utility:
.. .. code-block:: shell
.. prepare_helio_serial.py -v
.. This command will create several files in your current directory:
.. * ``helio_serial.ini`` - An .ini-format initialization file for running ``gamhelio.x`` on the ``helio_serial`` example.
.. * ``helio_serial.pbs`` - A PBS job script for running ``gamhelio.x`` on the ``helio_serial`` example.
.. * ``helio_serial.xml`` - An XML-format initialization file for running ``gamhelio.x`` on the ``helio_serial`` example, created from ``helio_serial.ini``.
.. * ``heliogrid.h5`` - An HDF5 file containing the grid to use for the simulation.
.. * ``innerbc.h5`` - An HDF5 file containing the inner boundary conditions derived from the WSA (Wang-Sheeley-Arge) model used for this example.
.. NOTE: The detailed steps for manually generating these files are described `here <helioQuickStart>`_.
.. NOTE: The conversion from ``.ini`` to ``.xml`` is still under development, so both initialization files are created from templates.
.. The PBS script is designed for use on the Pleiades system at NASA Ames. If you are working in a non-HPC environment, the commands listed in this file can be executed manually on your command line. On Pleiades, the job is submitted using the ``qsub`` command:
.. .. code-block:: shell
.. qsub helio_serial.pbs
.. Once the job has been accepted in the queue, the run should take about 30 minutes (on Pleiades).
.. When complete, you should see in your working directory the output files created by ``gamhelio.x``, along with the files created by ``prepare_helio_serial.py``:
.. .. code-block:: shell
.. bash-4.2$ ls -1
.. gamhelio.x.helio_serial.out
.. helio_serial.ini
.. helio_serial.o13122765
.. helio_serial.pbs
.. helio_serial.xml
.. heliogrid.h5
.. innerbc.h5
.. wsa.gam.Res.00000.h5
.. wsa.gam.Res.XXXXX.h5
.. wsa.gam.h5
.. The ``gamhelio.x.helio_serial.out`` file contains the terminal output generated by the heliosphere version of the ``gamera`` executable (``gamhelio.x``). The file ``helio_serial.o13122765`` (the digits will be different) contains the PBS log for the job.
.. We can now analyze the results of the model run. A simple analysis can be run using the utility ``run_helio_serial_checks.py``:
.. .. code-block:: shell
.. $ run_helio_serial_checks.py -v --runid=wsa
.. Computing volume-integrated magnetic pressure.
.. Volume-integrated magnetic pressure (SUM(Pb*dV), code units):
.. At start: 103622.17348083727
.. At end: 90669.31440751288
.. Your values for the volume-integrated magnetic pressure should be very close to these values. This is actually a dummy statistic - it is not scientifically useful, but the code illustrates how to access and manipulate the results using the ``kaiju`` software.
.. Finally, generate a quick-look plot illustrating the model results. For this case, the quick-look plot shows several plots of different data generated by the model:
.. .. code-block:: shell
.. $ create_helio_serial_quicklook.py -v --runid=wsa
.. This script will create the file ``qkpichelio.png``. It should look like this:
.. .. image:: https://bitbucket.org/repo/kMoBzBp/images/2400191851-qkpichelio-min.png
.. :target: https://bitbucket.org/repo/kMoBzBp/images/2400191851-qkpichelio-min.png
.. :alt: qkpichelio.png
.. Running a test case (MPI)
.. ^^^^^^^^^^^^^^^^^^^^^^^^^
.. 3-D blast wave
.. ~~~~~~~~~~~~~~
.. The first MPI-based example is the ``bw3d`` model. This is a simple 3-D model of a blast wave with an applied magnetic field. This test case is discussed at `Blast Wave Test <https://www.astro.princeton.edu/~jstone/Athena/tests/blast/blast.html>`_.
.. Begin by adding the directory containing the MPI kaiju binaries to your ``PATH`` environment variable.
.. .. code-block:: shell
.. export PATH=$KAIJUHOME/build_mpi/bin:$PATH
.. Next, add the directory containing the ``bw3d`` example scripts to your ``PATH`` environment variable:
.. .. code-block:: shell
.. export PATH=$KAIJUHOME/quickstart/bw3d:$PATH
.. Create a new directory to run the ``bw3d`` example (it can be anywhere).
.. .. code-block:: shell
.. cd $HOME
.. mkdir -p kaiju_test/bw3d
.. cd kaiju_test/bw3d
.. The next step is to generate the configuration files for the ``bw3d`` model. This is done using the ``prepare_bw3d.py`` utility:
.. .. code-block:: shell
.. prepare_bw3d.py -v
.. This command will create 3 files in your current directory:
.. * ``bw3d.ini`` - An ``.ini``-format initialization file for running ``gamera_mpi.x`` on the ``bw3d`` example.
.. * ``bw3d.pbs`` - A PBS job script for running ``gamera_mpi.x`` on the ``bw3d`` example.
.. * ``bw3d.xml`` - An XML-format initialization file for running ``gamera_mpi.x`` on the ``bw3d`` example, created from ``bw3d.ini``.
.. NOTE: The conversion from ``.ini`` to ``.xml`` is still under development, so both initialization files are created from templates.
.. The PBS script is designed for use on the Pleiades system at NASA Ames. If you are working in a non-HPC environment, the commands listed in this file can be executed manually on your command line. On Pleiades, the job is submitted using the ``qsub`` command:
.. .. code-block:: shell
.. qsub bw3d.pbs
.. Once the job has been accepted in the queue, the run should take about an hour (on Pleiades).
.. When complete, you should see in your working directory the output files created by ``gamera_mpi.x``:
.. .. code-block:: shell
.. bash-4.2$ ls -1
.. bw3d_0002_0002_0002_0000_0000_0000.gam.h5
.. bw3d_0002_0002_0002_0000_0000_0000.gam.Res.00000.h5
.. bw3d_0002_0002_0002_0000_0000_0000.gam.Res.XXXXX.h5
.. bw3d_0002_0002_0002_0000_0000_0001.gam.h5
.. bw3d_0002_0002_0002_0000_0000_0001.gam.Res.00000.h5
.. bw3d_0002_0002_0002_0000_0000_0001.gam.Res.XXXXX.h5
.. bw3d_0002_0002_0002_0000_0001_0000.gam.h5
.. bw3d_0002_0002_0002_0000_0001_0000.gam.Res.00000.h5
.. bw3d_0002_0002_0002_0000_0001_0000.gam.Res.XXXXX.h5
.. bw3d_0002_0002_0002_0000_0001_0001.gam.h5
.. bw3d_0002_0002_0002_0000_0001_0001.gam.Res.00000.h5
.. bw3d_0002_0002_0002_0000_0001_0001.gam.Res.XXXXX.h5
.. bw3d_0002_0002_0002_0001_0000_0000.gam.h5
.. bw3d_0002_0002_0002_0001_0000_0000.gam.Res.00000.h5
.. bw3d_0002_0002_0002_0001_0000_0000.gam.Res.XXXXX.h5
.. bw3d_0002_0002_0002_0001_0000_0001.gam.h5
.. bw3d_0002_0002_0002_0001_0000_0001.gam.Res.00000.h5
.. bw3d_0002_0002_0002_0001_0000_0001.gam.Res.XXXXX.h5
.. bw3d_0002_0002_0002_0001_0001_0000.gam.h5
.. bw3d_0002_0002_0002_0001_0001_0000.gam.Res.00000.h5
.. bw3d_0002_0002_0002_0001_0001_0000.gam.Res.XXXXX.h5
.. bw3d_0002_0002_0002_0001_0001_0001.gam.h5
.. bw3d_0002_0002_0002_0001_0001_0001.gam.Res.00000.h5
.. bw3d_0002_0002_0002_0001_0001_0001.gam.Res.XXXXX.h5
.. bw3d.ini
.. bw3d.o12920469
.. bw3d.pbs
.. bw3d.xml
.. gamera_mpi.x.bw3d.out
.. The ``gamera_mpi.x.bw3d.out`` file contains the terminal output generated by the ``gamera_mpi.x`` executable. The file ``bw3d.o12920469`` (the digits will be different) contains the PBS log for the job.
.. We can now analyze the results of the model run. A simple analysis can be run using the utility ``run_bw3d_checks.py``:
.. .. code-block:: shell
.. $ run_bw3d_checks.py -v
.. Asymmetry metric (SUM(ABS(Pb - Pb.T)*dV), code units):
.. At start: 0.0
.. At end: 0.001626661792365313
.. Your values for the asymmetry metric should be very close to these values.
.. Finally, generate a quick-look plot illustrating the model results. For this case, the quick-look plot shows the magnetic pressure in the first and last simulation frames:
.. .. code-block:: shell
.. $ create_bw3d_quicklook.py -v
.. This script will create the file ``bw3d_quicklook.png``. It should look like this:
.. .. image:: https://bitbucket.org/repo/kMoBzBp/images/431662986-bw3d_quicklook.png
.. :target: https://bitbucket.org/repo/kMoBzBp/images/431662986-bw3d_quicklook.png
.. :alt: bw3d_quicklook.png
.. Heliosphere example (MPI)
.. ~~~~~~~~~~~~~~~~~~~~~~~~~
.. The next example is an MPI version of the serial heliosphere model described above. The mechanics of the MPI version are slightly different, but the results should be identical.
.. Begin by adding the directory containing the MPI kaiju binaries to your ``PATH`` environment variable.
.. .. code-block:: shell
.. export PATH=$KAIJUHOME/build_mpi/bin:$PATH
.. Next, add the directory containing the ``helio_mpi`` example scripts to your ``PATH`` environment variable:
.. .. code-block:: shell
.. export PATH=$KAIJUHOME/quickstart/helio_mpi:$PATH
.. The next step is to generate the configuration and input files for the ``helio_mpi`` model. This is done using the ``prepare_helio_mpi.py`` utility:
.. .. code-block:: shell
.. prepare_helio_mpi.py -v
.. This command will create the following files in your current directory:
.. * ``helio_mpi.ini`` - An ``.ini``-format initialization file for running ``gamera_mpi.x`` on the ``helio_mpi`` example.
.. * ``helio_mpi.pbs`` - A PBS job script for running ``gamera_mpi.x`` on the ``helio_mpi`` example.
.. * ``helio_mpi.xml`` - An XML-format initialization file for running ``gamera_mpi.x`` on the ``helio_mpi`` example, created from ``helio_mpi.ini``.
.. * ``heliogrid.h5`` - An HDF5 file containing the grid to use for the simulation.
.. * ``innerbc.h5`` - An HDF5 file containing the inner boundary conditions derived from the WSA (Wang-Sheeley-Arge) model used for this example.
.. NOTE: The detailed steps for manually generating these files are described `here <helioQuickStart>`_.
.. NOTE: The conversion from ``.ini`` to ``.xml`` is still under development, so both initialization files are created from templates.
.. The PBS script is designed for use on the Pleiades system at NASA Ames. If you are working in a non-HPC environment, the commands listed in this file can be executed manually on your command line. On Pleiades, the job is submitted using the ``qsub`` command:
.. .. code-block:: shell
.. qsub helio_mpi.pbs
.. Once the job has been accepted in the queue, the run should take about 30 minutes (on Pleiades).
.. When complete, you should see in your working directory the output files created by ``gamera_mpi.x``, along with the files created by ``prepare_helio_mpi.py``:
.. .. code-block:: shell
.. bash-4.2$ ls -1
.. gamera_mpi.x.helio_mpi.out
.. helio_mpi.ini
.. helio_mpi.o13124130
.. helio_mpi.pbs
.. helio_mpi.xml
.. heliogrid.h5
.. innerbc.h5
.. qkpichelio.png
.. wsa_0004_0002_0004_0000_0000_0000.gam.Res.00000.h5
.. wsa_0004_0002_0004_0000_0000_0000.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0000_0000_0000.gam.h5
.. wsa_0004_0002_0004_0000_0000_0001.gam.Res.00000.h5
.. wsa_0004_0002_0004_0000_0000_0001.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0000_0000_0001.gam.h5
.. wsa_0004_0002_0004_0000_0000_0002.gam.Res.00000.h5
.. wsa_0004_0002_0004_0000_0000_0002.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0000_0000_0002.gam.h5
.. wsa_0004_0002_0004_0000_0000_0003.gam.Res.00000.h5
.. wsa_0004_0002_0004_0000_0000_0003.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0000_0000_0003.gam.h5
.. wsa_0004_0002_0004_0000_0001_0000.gam.Res.00000.h5
.. wsa_0004_0002_0004_0000_0001_0000.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0000_0001_0000.gam.h5
.. wsa_0004_0002_0004_0000_0001_0001.gam.Res.00000.h5
.. wsa_0004_0002_0004_0000_0001_0001.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0000_0001_0001.gam.h5
.. wsa_0004_0002_0004_0000_0001_0002.gam.Res.00000.h5
.. wsa_0004_0002_0004_0000_0001_0002.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0000_0001_0002.gam.h5
.. wsa_0004_0002_0004_0000_0001_0003.gam.Res.00000.h5
.. wsa_0004_0002_0004_0000_0001_0003.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0000_0001_0003.gam.h5
.. wsa_0004_0002_0004_0001_0000_0000.gam.Res.00000.h5
.. wsa_0004_0002_0004_0001_0000_0000.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0001_0000_0000.gam.h5
.. wsa_0004_0002_0004_0001_0000_0001.gam.Res.00000.h5
.. wsa_0004_0002_0004_0001_0000_0001.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0001_0000_0001.gam.h5
.. wsa_0004_0002_0004_0001_0000_0002.gam.Res.00000.h5
.. wsa_0004_0002_0004_0001_0000_0002.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0001_0000_0002.gam.h5
.. wsa_0004_0002_0004_0001_0000_0003.gam.Res.00000.h5
.. wsa_0004_0002_0004_0001_0000_0003.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0001_0000_0003.gam.h5
.. wsa_0004_0002_0004_0001_0001_0000.gam.Res.00000.h5
.. wsa_0004_0002_0004_0001_0001_0000.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0001_0001_0000.gam.h5
.. wsa_0004_0002_0004_0001_0001_0001.gam.Res.00000.h5
.. wsa_0004_0002_0004_0001_0001_0001.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0001_0001_0001.gam.h5
.. wsa_0004_0002_0004_0001_0001_0002.gam.Res.00000.h5
.. wsa_0004_0002_0004_0001_0001_0002.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0001_0001_0002.gam.h5
.. wsa_0004_0002_0004_0001_0001_0003.gam.Res.00000.h5
.. wsa_0004_0002_0004_0001_0001_0003.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0001_0001_0003.gam.h5
.. wsa_0004_0002_0004_0002_0000_0000.gam.Res.00000.h5
.. wsa_0004_0002_0004_0002_0000_0000.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0002_0000_0000.gam.h5
.. wsa_0004_0002_0004_0002_0000_0001.gam.Res.00000.h5
.. wsa_0004_0002_0004_0002_0000_0001.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0002_0000_0001.gam.h5
.. wsa_0004_0002_0004_0002_0000_0002.gam.Res.00000.h5
.. wsa_0004_0002_0004_0002_0000_0002.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0002_0000_0002.gam.h5
.. wsa_0004_0002_0004_0002_0000_0003.gam.Res.00000.h5
.. wsa_0004_0002_0004_0002_0000_0003.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0002_0000_0003.gam.h5
.. wsa_0004_0002_0004_0002_0001_0000.gam.Res.00000.h5
.. wsa_0004_0002_0004_0002_0001_0000.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0002_0001_0000.gam.h5
.. wsa_0004_0002_0004_0002_0001_0001.gam.Res.00000.h5
.. wsa_0004_0002_0004_0002_0001_0001.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0002_0001_0001.gam.h5
.. wsa_0004_0002_0004_0002_0001_0002.gam.Res.00000.h5
.. wsa_0004_0002_0004_0002_0001_0002.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0002_0001_0002.gam.h5
.. wsa_0004_0002_0004_0002_0001_0003.gam.Res.00000.h5
.. wsa_0004_0002_0004_0002_0001_0003.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0002_0001_0003.gam.h5
.. wsa_0004_0002_0004_0003_0000_0000.gam.Res.00000.h5
.. wsa_0004_0002_0004_0003_0000_0000.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0003_0000_0000.gam.h5
.. wsa_0004_0002_0004_0003_0000_0001.gam.Res.00000.h5
.. wsa_0004_0002_0004_0003_0000_0001.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0003_0000_0001.gam.h5
.. wsa_0004_0002_0004_0003_0000_0002.gam.Res.00000.h5
.. wsa_0004_0002_0004_0003_0000_0002.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0003_0000_0002.gam.h5
.. wsa_0004_0002_0004_0003_0000_0003.gam.Res.00000.h5
.. wsa_0004_0002_0004_0003_0000_0003.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0003_0000_0003.gam.h5
.. wsa_0004_0002_0004_0003_0001_0000.gam.Res.00000.h5
.. wsa_0004_0002_0004_0003_0001_0000.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0003_0001_0000.gam.h5
.. wsa_0004_0002_0004_0003_0001_0001.gam.Res.00000.h5
.. wsa_0004_0002_0004_0003_0001_0001.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0003_0001_0001.gam.h5
.. wsa_0004_0002_0004_0003_0001_0002.gam.Res.00000.h5
.. wsa_0004_0002_0004_0003_0001_0002.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0003_0001_0002.gam.h5
.. wsa_0004_0002_0004_0003_0001_0003.gam.Res.00000.h5
.. wsa_0004_0002_0004_0003_0001_0003.gam.Res.XXXXX.h5
.. wsa_0004_0002_0004_0003_0001_0003.gam.h5
.. The ``gamera_mpi.x.helio_mpi.out`` contains the terminal output generated by the heliosphere version of the ``gamera_mpi.x`` executable. The file ``helio_mpi.o13124130`` (the digits will be different) contains the PBS log for the job.
.. We can now analyze the results of the model run. A simple analysis can be run using the utility ``run_helio_mpi_checks.py``:
.. .. code-block:: shell
.. $ run_helio_mpi_checks.py -v --runid=wsa
.. Computing volume-integrated magnetic pressure.
.. Volume-integrated magnetic pressure (SUM(Pb*dV), code units):
.. At start: 103622.17348083727
.. At end: 90669.31440751288
.. Your values for the volume-integrated magnetic pressure should be very close to these values. This is actually a dummy statistic - it is not scientifically useful, but the code illustrates how to access and manipulate the results using the ``kaiju`` software.
.. Finally, generate a quick-look plot illustrating the model results. For this case, the quick-look plot shows several plots of different data generated by the model:
.. .. code-block:: shell
.. $ create_helio_mpi_quicklook.py -v --runid=wsa
.. This script will create the file ``qkpichelio.png``. It should look like this:
.. .. image:: https://bitbucket.org/repo/kMoBzBp/images/4029922679-qkpichelio-min.png
.. :target: https://bitbucket.org/repo/kMoBzBp/images/4029922679-qkpichelio-min.png
.. :alt: qkpichelio-min.png


@@ -6,7 +6,7 @@ Introduction
This guide assumes that you have already checked out a local copy of our git
repository using the other pages in our wiki. If not, please check other pages
first such as :doc:`the quickstart guide </quickStart/index>`.
first.
This guide will talk about concepts such as pull requests and branching.
Details about them are beyond the scope of this guide, so for additional


@@ -1,7 +0,0 @@
Viewing the results of a ``kaiju`` run
======================================
.. toctree::
:maxdepth: 1
quickLook



@@ -1,291 +0,0 @@
Quick-look plots for model results
==================================
.. ``dbpic.py``
.. ----------------
.. This script visualizes output from the program ``calcdb.x``\ , which computes
.. ground magnetometer readings (delta B) corresponding to a ``gamera``
.. terrestrial magnetosphere run. The script is found at:
.. .. code-block::
.. KAIJUHOME/scripts/quicklook/dbpic.py
.. Help text for the script can be found using the ``--help`` command-line option:
.. .. code-block::
.. usage: dbpic.py [-h] [--debug] [-d directory] [-n step] [-id runid] [-Jr] [-k0 layer] [--spacecraft spacecraft] [-v]
.. Plot the ground magnetic field perturbations for a MAGE magnetosphere run.
.. optional arguments:
.. -h, --help show this help message and exit
.. --debug Print debugging output (default: False).
.. -d directory Directory containing data to read (default: /glade/derecho/scratch/ewinter/cgs/runs/test/makeitso)
.. -n step Time slice to plot (default: -1)
.. -id runid Run ID of data (default: msphere)
.. -Jr Show radial component of anomalous current (default: False).
.. -k0 layer Vertical layer to plot (default: 0)
.. --spacecraft spacecraft
.. Names of spacecraft to plot magnetic footprints, separated by commas (default: None)
.. -v, --verbose Print verbose output (default: False).
.. As an example, running ``dbpic.py`` on the output generated by the magnetosphere quickstart case generates the following file (``qkdbpic.png``):
.. .. image:: https://bitbucket.org/repo/kMoBzBp/images/1367296445-qkdbpic.png
.. :target: https://bitbucket.org/repo/kMoBzBp/images/1367296445-qkdbpic.png
.. :alt: qkdbpic.png
``dstpic.py``
-------------
This script visualizes Dst output from gamera-RCM. Help text for the script
can be displayed using the ``--help`` command-line option:
.. code-block:: bash
$ dstpic.py --help
usage: dstpic.py [-h] [-d directory] [-id runid] [-tpad time padding] [-swfile filename] [--dps]
Creates simple plot comparing SYM-H from OMNI dataset to Gamera-RCM.
Need to run or point to directory that has the bcwind and msphere.gam
files of interest. Note this calculation only includes the
magnetospheric currents and does not include the ionospheric currents or
the FACS. It should be used with CAUTION and never in a scientific
publication.
optional arguments:
-h, --help show this help message and exit
-d directory Directory to read from (default: /glade/derecho/scratch/ewinter/cgs/runs/test/makeitso)
-id runid RunID of data (default: msphere)
-tpad time padding Time beyond MHD data (in hours) to plot (default: 8)
-swfile filename Solar wind file name (default: bcwind.h5)
--dps Also plot the DPS Dst (default: False)
As an example, running ``dstpic.py`` on the output generated by the standard
magnetosphere quickstart case generates the following file (``qkdstpic.png``):
.. image:: qkdstpic.png
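When scripting many runs, the options above can be assembled programmatically
before being passed to ``subprocess``. A minimal sketch, where
``build_dstpic_cmd`` is a hypothetical helper (not part of ``kaiju``) and the
flags are taken from the ``--help`` text above:

```python
def build_dstpic_cmd(directory=".", runid="msphere", tpad_hours=8,
                     swfile="bcwind.h5", dps=False):
    """Assemble an argument list for dstpic.py (flags from its --help text)."""
    cmd = ["dstpic.py", "-d", directory, "-id", runid,
           "-tpad", str(tpad_hours), "-swfile", swfile]
    if dps:
        cmd.append("--dps")  # also plot the DPS Dst
    return cmd

# Example: compare SYM-H for a run in ./run1 with 4 hours of padding.
print(build_dstpic_cmd(directory="./run1", tpad_hours=4, dps=True))
```

The resulting list can be handed to ``subprocess.run`` from a batch-analysis
script.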
.. ``heliopic.py``
.. -------------------
.. This script visualizes results from ``gamhelio.x``. The script is found at:
.. .. code-block::
.. KAIJUHOME/scripts/quicklook/heliopic.py
.. Help text for the script can be found using the ``-h`` command-line option:
.. .. code-block::
.. usage: heliopic.py [-h] [-d directory] [-id runid] [-n step] [-den] [-nompi]
.. [-p pictype]
.. Creates simple multi-panel figure for Gamera helio run
.. Top Panel - density and radial velocity in equatorial plane
.. Bottom Panel - density and radial velocity in meridional plane
.. optional arguments:
.. -h, --help show this help message and exit
.. -d directory Directory to read from (default: /Users/winteel1)
.. -id runid RunID of data (default: wsa)
.. -n step Time slice to plot (default: -1)
.. -den Show density instead of pressure (default: False)
.. -nompi Don't show MPI boundaries (default: False)
.. -p pictype Type of output image (default: pic1)
.. As an example, running ``heliopic.py`` on the output generated by the standard
.. ``helio_mpi`` quickstart case generates the following file
.. (``qkheliopic.png``) (NOTE: Image is large, so it has been reduced in size for
.. use on the wiki):
.. .. image:: https://bitbucket.org/repo/kMoBzBp/images/2600792652-qkpichelio_reduced.png
.. :target: https://bitbucket.org/repo/kMoBzBp/images/2600792652-qkpichelio_reduced.png
.. :alt: qkpichelio_reduced.png
``mixpic.py``
-------------
This script visualizes results from REMIX in a magnetosphere run. Help text for
the script can be displayed using the ``--help`` command-line option:
.. code-block:: bash
$ mixpic.py --help
usage: mixpic.py [-h] [--debug] [-id runid] [-n step] [-nflux] [-print] [--spacecraft spacecraft] [-v] [-GTYPE] [-PP] [-vid] [-overwrite]
[--ncpus ncpus] [-nohash] [--coord coord]
Creates simple multi-panel REMIX figure for a GAMERA magnetosphere run.
Top Row - FAC (with potential contours overplotted), Pedersen and Hall Conductances
Bottom Row - Joule heating rate, particle energy and energy flux
optional arguments:
-h, --help show this help message and exit
--debug Print debugging output (default: False).
-id runid Run ID of data (default: msphere)
-n step Time slice to plot (default: -1)
-nflux Show number flux instead of energy flux (default: False)
-print Print list of all steps and time labels, then exit (default: False)
--spacecraft spacecraft
Names of spacecraft to plot magnetic footprints, separated by commas (default: None)
-v, --verbose Print verbose output (default: False).
-GTYPE Show RCM grid type in the eflx plot (default: False)
-PP Show plasmapause (10/cc) in the eflx/nflx plot (default: False)
-vid Make a video and store in mixVid directory (default: False)
-overwrite Overwrite existing vid files (default: False)
--ncpus ncpus Number of threads to use with --vid (default: 1)
-nohash Don't display branch/hash info (default: False)
--coord coord Coordinate system to use (default: SM)
As an example, running ``mixpic.py`` on the output generated by the standard
magnetosphere quickstart case generates the following files (``remix_n.png``,
``remix_s.png``).
.. image:: remix_n.png
.. image:: remix_s.png
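When using ``-vid`` with ``--ncpus``, frames for the video are rendered by
multiple workers. The underlying idea of splitting time steps across workers
can be sketched as follows (a hypothetical helper, not the script's actual
implementation):

```python
def split_steps(steps, ncpus):
    """Deal time steps round-robin across ncpus workers."""
    return [steps[i::ncpus] for i in range(ncpus)]

# 10 time steps shared among 3 workers.
print(split_steps(list(range(10)), 3))
```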
``msphpic.py``
--------------
This script visualizes results from GAMERA in a magnetosphere run. Help text
for the script can be displayed using the ``--help`` command-line option:
.. code-block:: bash
$ msphpic.py --help
usage: msphpic.py [-h] [--debug] [-d directory] [-id runid] [-n step] [-bz] [-den] [-jy] [-ephi] [-noion] [-nompi] [-norcm] [-bigrcm]
[-src] [-v] [--spacecraft spacecraft] [-vid] [-overwrite] [--ncpus ncpus] [-nohash]
[-size {small,std,big,bigger,fullD,dm}]
Creates simple multi-panel figure for Gamera magnetosphere run
Top Panel - Residual vertical magnetic field
Bottom Panel - Pressure (or density) and hemispherical insets
NOTE: There is an optional -size argument for domain bounds options
(default: std), which is passed to kaiViz functions.
optional arguments:
-h, --help show this help message and exit
--debug Print debugging output (default: False).
-d directory Directory containing data to read (default: /glade/derecho/scratch/ewinter/cgs/runs/test/makeitso)
-id runid Run ID of data (default: msphere)
-n step Time slice to plot (default: -1)
-bz Show Bz instead of dBz (default: False).
-den Show density instead of pressure (default: False).
-jy Show Jy instead of pressure (default: False).
-ephi Show Ephi instead of pressure (default: False).
-noion Don't show ReMIX data (default: False).
-nompi Don't show MPI boundaries (default: False).
-norcm Don't show RCM data (default: False).
-bigrcm Show entire RCM domain (default: False).
-src Show source term (default: False).
-v, --verbose Print verbose output (default: False).
--spacecraft spacecraft
Names of spacecraft to plot positions, separated by commas (default: None)
-vid Make a video and store in mixVid directory (default: False)
-overwrite Overwrite existing vid files (default: False)
--ncpus ncpus Number of threads to use with --vid (default: 1)
-nohash Don't display branch/hash info (default: False)
-size {small,std,big,bigger,fullD,dm}
Domain bounds options (default: std)
As an example, running ``msphpic.py`` on the output generated by the standard
magnetosphere quickstart case generates the following file
(``qkmsphpic.png``).
.. image:: qkmsphpic.png
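Note that ``--spacecraft`` expects a single comma-separated string and
``-size`` accepts only the listed choices. A minimal sketch of assembling a
valid command line (the ``build_msphpic_cmd`` helper is illustrative only, not
part of ``kaiju``):

```python
VALID_SIZES = {"small", "std", "big", "bigger", "fullD", "dm"}

def build_msphpic_cmd(runid="msphere", step=-1, size="std", spacecraft=None):
    """Assemble an msphpic.py argument list (flags from its --help text)."""
    if size not in VALID_SIZES:
        raise ValueError(f"unknown size {size!r}; choose one of {sorted(VALID_SIZES)}")
    cmd = ["msphpic.py", "-id", runid, "-n", str(step), "-size", size]
    if spacecraft:
        # --spacecraft expects names separated by commas
        cmd += ["--spacecraft", ",".join(spacecraft)]
    return cmd

print(build_msphpic_cmd(step=10, size="big", spacecraft=["themisA", "themisB"]))
```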
``rcmpic.py``
-------------
This script visualizes results from RCM in a magnetosphere run. Help text
for the script can be displayed using the ``--help`` command-line option:
.. code-block::
$ rcmpic.py --help
usage: rcmpic.py [-h] [-beta] [-big] [-bmin] [--debug] [-d directory] [-elec] [-fac] [-id runid] [-kt] [-n step] [--spacecraft spacecraft]
[-tbnc] [-v] [-vol] [-wgt] [-vid] [-overwrite] [--ncpus ncpus] [-nohash]
Creates simple multi-panel figure for RCM magnetosphere run
Top Panel - XXX
Bottom Panel - XXX
optional arguments:
-h, --help show this help message and exit
-beta Show beta instead of FTE (default: False)
-big Show entire RCM grid (default: False).
-bmin Show B-min (default: False)
--debug Print debugging output (default: False).
-d directory Directory containing data to read (default: /glade/derecho/scratch/ewinter/cgs/runs/test/makeitso)
-elec Show electron pressure (default: False)
-fac Show FAC (default: False)
-id runid Run ID of data (default: msphere)
-kt Show temperature instead of FTE (default: False)
-n step Time slice to plot (default: -1)
--spacecraft spacecraft
Names of spacecraft to plot trajectories, separated by commas (default: None)
-tbnc Show Tb instead of FTE (default: False)
-v, --verbose Print verbose output (default: False).
-vol Show FTV instead of FTE (default: False)
-wgt Show wRCM instead of FTE (default: False)
-vid Make a video and store in mixVid directory (default: False)
-overwrite Overwrite existing vid files (default: False)
--ncpus ncpus Number of threads to use with --vid (default: 1)
-nohash Don't display branch/hash info (default: False)
As an example, running ``rcmpic.py`` on the output generated by the standard
magnetosphere quickstart case generates the following file
(``qkrcmpic.png``).
.. image:: qkrcmpic.png
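Several of the flags above (``-beta``, ``-kt``, ``-tbnc``, ``-vol``, ``-wgt``)
each replace the default FTE panel, so only one of them makes sense per
invocation. A small guard along these lines could catch conflicts before
launching the script (a hypothetical check, not part of ``rcmpic.py`` itself):

```python
# Flags that each replace the default FTE panel, per the --help text above.
FTE_ALTERNATIVES = ["-beta", "-kt", "-tbnc", "-vol", "-wgt"]

def check_rcmpic_flags(flags):
    """Return the FTE-replacing flags present, erroring if more than one."""
    chosen = [f for f in flags if f in FTE_ALTERNATIVES]
    if len(chosen) > 1:
        raise ValueError(f"choose at most one of {FTE_ALTERNATIVES}, got {chosen}")
    return chosen

print(check_rcmpic_flags(["-big", "-kt"]))
```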
.. ``swpic.py``
.. ----------------
.. This script creates a time vs distance plot from a 2D slice created by
.. ``slice.x``. The script is found at:
.. .. code-block::
.. KAIJUHOME/scripts/quicklook/swpic.py
.. Help text for the script can be found using the ``-h`` command-line option:
.. .. code-block::
.. usage: swpic.py [-h] [-d directory] [-id swid] [-type type]
.. Creates simple multi-panel figure for the
.. solar wind conditions within the bcwind file and saves it as
.. swBCplot.pdf
.. optional arguments:
.. -h, --help show this help message and exit
.. -d directory Directory to read from (default: /Users/winteel1)
.. -id swid Solar wind file used (default: bcwind.h5)
.. -type type Image type (default: png)
.. As an example, running ``swpic.py`` on the output generated by the standard
.. ``geo_mpi`` quickstart case generates the following file (``bcwind.png``):
.. .. image:: https://bitbucket.org/repo/kMoBzBp/images/360434816-bcwind.png
.. :target: https://bitbucket.org/repo/kMoBzBp/images/360434816-bcwind.png
.. :alt: bcwind.png



@@ -0,0 +1,10 @@
Generic guides for building the ``kaiju`` software
================================================================================
.. toctree::
:maxdepth: 1
build_cdf
build_hdf5
install_intel_compilers
install_python


@@ -32,13 +32,6 @@ These instructions should work on all platforms (MacOS, Linux, HPC).
If version conflicts arise, you can install different versions of each package individually with ``pip``. You may need to adjust package versions based on your operating system and python version. For reference, we provide the following environment files generated from working ``conda``-based environments on a Big Sur Mac, on ``cheyenne.ucar.edu``, and on ``pleiades.nas.nasa.gov``, for python 3.7 and 3.8:
* :doc:`Python 3.7 on a Big Sur Mac <kaiju-3.7_BigSur_20220817.yml>`
* :doc:`Python 3.8 on a Big Sur Mac <kaiju-3.8_BigSur_20220817.yml>`
* :doc:`Python 3.7 on a Monterey Mac <kaiju-3.7_Monterey_20230105.yml>`
* :doc:`Python 3.8 on a Monterey Mac <kaiju-3.8_Monterey_20230105.yml>`
* :doc:`Python 3.7 on pleiades <kaiju-3.7_pleiades_20220817.yml>` - NEEDS UPDATE FOR PROGRESS BAR
* :doc:`Python 3.8 on pleiades <kaiju-3.8_pleiades_20220817.yml>` - NEEDS UPDATE FOR PROGRESS BAR
You can create copies of these environments with the command:
.. code-block:: shell


@@ -0,0 +1,10 @@
Build guides documentation
======================================
.. toctree::
:maxdepth: 4
generic/index
centos-stream-9/index
ubuntu-20.04/index
macos/index


@@ -102,7 +102,6 @@ Building the ``kaiju`` software on Ubuntu 20.04
.. version of ``kaiju`` is not supported on Ubuntu 20.04 at this time.
.. toctree::
:maxdepth: 1
ubuntu-20.04_build_cdf
ubuntu-20.04_build_hdf5


@@ -7,3 +7,4 @@ Code Information
codeOrg
derivation_of_precipitation
interpolationInMAGE
dataCompression


@@ -0,0 +1,23 @@
Internal ``kaiju`` documentation
======================================
This is the internal documentation for the ``kaiju`` development team. ``kaiju`` includes the
Multiscale Atmosphere-Geospace Environment (`MAGE
<https://cgs.jhuapl.edu/Models/>`_) model developed by the `Center for
Geospace Storms <https://cgs.jhuapl.edu/>`_ as well as other scientific
software for simulation of heliospheric environments such as planetary
magnetospheres and the solar wind. This documentation focuses on
`MAGE <https://cgs.jhuapl.edu/Models/>`_ and `GAMERA-helio <https://cgs.jhuapl.edu/Models/gamera.php>`_, i.e., the geospace
and inner heliosphere applications of the ``kaiju`` software.
====================
Table of contents
====================
.. toctree::
:maxdepth: 1
build_guides/index
codeInformation/index
tools/index
_obsolete/index


@@ -1,133 +0,0 @@
JupyterLab on NASA HEC Systems
==============================
The steps below describe how to set up and use JupyterLab on Pleiades. Replace
``rmalbarr`` with your own Pleiades username. Note that you will have a specific
``/nobackup`` directory number (for example, ``/nobackupp12/rmalbarr``); change
it according to your number. Similarly, replace the home directory
``/home7/rmalbarr`` with your own.
Get set up on Pleiades
----------------------
Set up RSA tokens (copy from Bitbucket to the Pleiades Front End, pfe) and
`follow the wiki <https://www.nas.nasa.gov/hecc/support/kb/enabling-your-rsa-securid-hard-token-(fob)_59.html>`_.
Set up SSH passthrough,
`following the wiki <https://www.nas.nasa.gov/hecc/support/kb/setting-up-ssh-passthrough_232.html>`_.
This should enable you to log in to pfe, where the passcode is given by the
RSA SecurID mobile app (two-factor authentication):
.. code-block:: bash
ssh pfe
Set up the SUP client,
`following the wiki <https://www.nas.nasa.gov/hecc/support/kb/using-the-secure-unattended-proxy-(sup)_145.html>`_.
This will enable you to transfer large files between remote and local servers
with ``shiftc``.
From local machine, for example, run:
.. code-block:: bash
sup shiftc <files> rmalbarr@pfe:/nobackupp12/rmalbarr
Clone repo to your nobackup or home directory on pfe. Here, for example, check
out the hidra branch of kaiju. From pfe prompt, run:
.. code-block:: bash
install home-brew
git lfs install
git branch
git checkout hidra
From
`kaiju wiki <https://bitbucket.org/aplkaiju/kaiju/wiki/quickStart/prerequisites>`_,
run:
.. code-block:: bash
git clone https://YOUR_BITBUCKET_USERNAME@bitbucket.org/aplkaiju/kaiju.git
export KAIJUHOME=$HOME/kaiju
where ``$HOME``, in this example, is set to ``/nobackupp12/rmalbarr``. Change
this according to your username and desired directory. Note: use an Atlassian
app password (found in your Bitbucket profile) for the clone command above.
Using Jupyter notebook on Pleiades
----------------------------------
Set up the ``kaiju-3.8`` conda environment and a Jupyter notebook kernel. From the
`kaiju wiki <https://bitbucket.org/aplkaiju/kaiju/wiki/quickStart/install_python.md>`_,
copy the .yml file called ``kaiju-3.8_pleiades_20220817.yml`` to your pfe home
directory. Edit the last line in the .yml file to your correct path:
``/swbuild/analytix/tools/miniconda3_220407/envs/kaiju-3.8``
From pfe home directory prompt, run:
.. code-block:: bash
module use -a /swbuild/analytix/tools/modulefiles
module load miniconda3/v4
conda env create -f kaiju-3.8_pleiades_20220817.yml
source activate kaiju-3.8
pip install ipykernel
python -m ipykernel install --user --name=kaiju-npl
If done correctly, it should output:
.. code-block:: bash
Installed kernelspec kaiju-npl in /home7/rmalbarr/.local/share/jupyter/kernels/kaiju-npl
Note: You can check the available environments by running:
.. code-block:: bash
conda info --envs
Reserve a dedicated node for Jupyter analysis. The command below requests an
Ivy Bridge node (``model=ivy``), a common choice for dedicated nodes. See the
`wiki here for more information <https://www.nas.nasa.gov/hecc/support/kb/using-jupyter-notebook-for-machine-learning-development-on-nas-systems_576.html>`_.
From pfe, start an interactive session and note the number of the pfe front end
shown at your command prompt (which may change with each pfe login), e.g.
``rmalbarr@pfe22``. From pfe run:
.. code-block:: bash
qsub -I -lselect=1:ncpus=20:model=ivy,walltime=2:00:00
Once a dedicated node has been assigned (e.g., node ``r437i4n1``), ssh to it,
navigate to the directory containing your Jupyter notebook files, and run:
.. code-block:: bash
ssh r437i4n1
module use -a /swbuild/analytix/tools/modulefiles
module load miniconda3/v4
source activate jupyterlab
jupyter lab --no-browser
From your local machine terminal, using the pfe number from which you reserved
the dedicated node (e.g., ``pfe22``) and the node number on which you started
JupyterLab (e.g., ``r437i4n1``), run (here for username ``rmalbarr``):
.. code-block:: bash
ssh -l rmalbarr -o "StrictHostKeyChecking ask" -L 8080:localhost:8888 -o ProxyJump=sfe,pfe22 r437i4n1
Then, from your local machine browser, navigate to http://localhost:8080/ or
http://127.0.0.1:8080. You should now be able to access your Jupyter notebook
files, and the ``kaiju-npl`` kernel (from the ``kaiju-3.8`` environment) should
be available and working.
Please reach out with any comments or questions to Robert Albarran, Ph.D., via
email at albarran1@atmos.ucla.edu.