Mirror of https://github.com/JHUAPL/Terrasaur.git, synced 2026-01-09 20:57:54 -05:00

Compare commits

14 Commits
- 9e00d48789
- 72c19fb941
- 6212144215
- f826c9acb8
- dfaa734383
- ac0806dc78
- 2d146a8de2
- 5454dd8c3e
- db9ffb5fe2
- da6d458dbd
- 50b82de885
- 901d851e00
- 8ef0a95e1c
- 9b95d9cfb1
CHANGELOG.md (23 lines changed)

@@ -1,8 +1,25 @@
-# SBCLT Changelog
+# Terrasaur Changelog
 
-Repository is https://gitlab.jhuapl.edu/sbmt/sbclt/command-line-tools. Git hash for each merge to main branch (except the most recent) are listed below in reverse chronological order.
+## July 30, 2025 - v1.1.0
 
-## November 26, 2024
+- Updates to existing tools
+  - AdjustShapeModelToOtherShapeModel
+    - fix intersection bug
+  - CreateSBMTStructure
+    - new options: -flipX, -flipY, -spice, -date, -observer, -target, -cameraFrame
+  - ValidateNormals
+    - new option: -fast to only check for overhangs if center and normal point in opposite directions
+
+- New tools:
+  - FacetInfo: Print info about a facet
+  - PointCloudOverlap: Find points in a point cloud which overlap a reference point cloud
+  - TriAx: Generate a triaxial ellipsoid in ICQ format
+
+## April 28, 2025 - v1.0.1
+
+- Add MIT license to repository and source code
+
+## April 27, 2025 - v1.0.0
+
 - Initial release
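The `-fast` option described in the changelog entry above reduces the overhang pre-check to a sign test: a facet is examined further only when its center vector and normal point in opposite directions, i.e. their dot product is negative. A sketch of that reading (illustrative interpretation of the changelog wording, not the ValidateNormals source):

```python
def fast_overhang_candidate(center, normal):
    """One plausible reading of ValidateNormals -fast: only run the
    full overhang check when the facet center vector and its normal
    point in opposite directions, i.e. center . normal < 0."""
    dot = sum(c * n for c, n in zip(center, normal))
    return dot < 0.0
```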
LICENSE.md (new file, 21 lines)

@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2025 Johns Hopkins Applied Physics Laboratory
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
BIN doc/dist/Terrasaur-2025.03.03-e1a0e14-src.tar.gz (vendored): Binary file not shown.
BIN doc/dist/Terrasaur-2025.03.03-e1a0e14_*.tar.gz (vendored): Binary file not shown.
@@ -16,7 +16,7 @@ The Terrasaur package requires Java 21 or later. Some freely available versions
 Download
 ~~~~~~~~
 
-Binary packages for use on Mac OS X and Linux are available at ...
+Binary packages for use on Mac OS X and Linux are available at `GitHub <https://github.com/JHUAPL/Terrasaur/releases>`__.
 
 We have not tried using the software on Microsoft Windows, but users may try the Linux package with the `Windows Subsystem for Linux <https://docs.microsoft.com/en-us/windows/wsl/>`__.
doc/make.bat (70 lines changed)

@@ -1,35 +1,35 @@
 @ECHO OFF
 
 pushd %~dp0
 
 REM Command file for Sphinx documentation
 
 if "%SPHINXBUILD%" == "" (
 	set SPHINXBUILD=sphinx-build
 )
 set SOURCEDIR=.
 set BUILDDIR=_build
 
 if "%1" == "" goto help
 
 %SPHINXBUILD% >NUL 2>NUL
 if errorlevel 9009 (
 	echo.
 	echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
 	echo.installed, then set the SPHINXBUILD environment variable to point
 	echo.to the full path of the 'sphinx-build' executable. Alternatively you
 	echo.may add the Sphinx directory to PATH.
 	echo.
 	echo.If you don't have Sphinx installed, grab it from
 	echo.http://sphinx-doc.org/
 	exit /b 1
 )
 
 %SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
 goto end
 
 :help
 %SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
 
 :end
 popd
doc/tools/ColorSpots.rst (new file, 70 lines)

@@ -0,0 +1,70 @@
+.. _ColorSpots:
+
+##########
+ColorSpots
+##########
+
+ColorSpots takes as input a shape model and a file containing (x, y, z, value),
+or (lat, lon, value). It writes out the mean and standard deviation of values
+within a specified range for each facet.
+
+.. include:: ../toolDescriptions/ColorSpots.txt
+   :literal:
+
+********
+Examples
+********
+
+Download the :download:`Apophis<./support_files/apophis_g_15618mm_rdr_obj_0000n00000_v001.obj>`
+shape model and the :download:`info<./support_files/xyzrandom.txt>` file containing
+cartesian coordinates and a random value.
+
+Run ColorSpots:
+
+::
+
+   ColorSpots -obj apophis_g_15618mm_rdr_obj_0000n00000_v001.obj -xyz \
+      -info xyzrandom.txt -outFile apophis_value_at_vertex.csv -noWeight \
+      -allFacets -additionalFields n -searchRadius 0.015 -writeVertices
+
+The first few lines of apophis_value_at_vertex.csv look like:
+
+::
+
+   % head apophis_value_at_vertex.csv
+   0.000000e+00, 0.000000e+00, 1.664960e-01, -3.805764e-02, 5.342315e-01, 4.000000e+01
+   1.589500e-02, 0.000000e+00, 1.591030e-01, 6.122849e-02, 6.017192e-01, 5.000000e+01
+   7.837000e-03, 1.486800e-02, 1.591670e-01, -6.072964e-03, 5.220682e-01, 5.700000e+01
+   -7.747000e-03, 1.506300e-02, 1.621040e-01, 9.146163e-02, 5.488631e-01, 4.900000e+01
+   -1.554900e-02, 0.000000e+00, 1.657970e-01, -8.172811e-03, 5.270302e-01, 3.400000e+01
+   -7.982000e-03, -1.571100e-02, 1.694510e-01, -2.840524e-02, 5.045911e-01, 3.900000e+01
+   8.060000e-03, -1.543300e-02, 1.655150e-01, 3.531959e-02, 5.464390e-01, 4.900000e+01
+   3.179500e-02, 0.000000e+00, 1.515820e-01, -1.472434e-02, 5.967265e-01, 5.400000e+01
+   2.719700e-02, 1.658200e-02, 1.508930e-01, -9.050683e-03, 5.186966e-01, 4.700000e+01
+   1.554100e-02, 2.901300e-02, 1.530770e-01, -7.053547e-02, 4.980369e-01, 7.000000e+01
+
+The columns are:
+
+.. list-table:: ColorSpots Vertex output
+   :header-rows: 1
+
+   * - Column
+     - Value
+   * - 1
+     - X
+   * - 2
+     - Y
+   * - 3
+     - Z
+   * - 4
+     - mean value in region
+   * - 5
+     - standard deviation in region
+   * - 6
+     - number of points in region
+
+.. figure:: images/ColorSpots-n.png
+   :alt: Number of points in region at each vertex
+
+   This image shows the number of points in the region at each vertex.
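The columns in the ColorSpots vertex output describe a per-vertex radius search: every input point within `-searchRadius` of a vertex contributes its value to that vertex's mean, standard deviation, and count. A minimal brute-force sketch of that statistic (illustrative only, not the ColorSpots implementation; the `-noWeight` run above treats all points equally):

```python
import math

def colorspots_stats(vertices, points, values, search_radius):
    """For each vertex, gather the values of all (x, y, z) points within
    search_radius and report (mean, standard deviation, count), mirroring
    columns 4-6 of the ColorSpots vertex output (brute-force sketch)."""
    out = []
    r2 = search_radius ** 2
    for vx, vy, vz in vertices:
        near = [v for (px, py, pz), v in zip(points, values)
                if (px - vx) ** 2 + (py - vy) ** 2 + (pz - vz) ** 2 <= r2]
        n = len(near)
        if n == 0:
            out.append((float("nan"), float("nan"), 0))
            continue
        mean = sum(near) / n
        std = math.sqrt(sum((v - mean) ** 2 for v in near) / n)
        out.append((mean, std, n))
    return out
```

A spatial index (e.g. a k-d tree) would replace the inner loop for realistic model sizes.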
@@ -23,7 +23,7 @@ Local Model Comparison
 
 Download the :download:`reference<./support_files/EVAL20_wtr.obj>` and :download:`comparison<./support_files/EVAL20.obj>`
 shape models. You can view them in a tool such as
-`ParaView<https://www.paraview.org/>`.
+`ParaView <https://www.paraview.org/>`__.
 
 .. figure:: images/CompareOBJ_local_1.png
@@ -32,8 +32,8 @@ shape models. You can view them in a tool such as
 Run CompareOBJ to find the optimal transform to align the comparison with the reference:
 ::
 
-   CompareOBJ -computeOptimalRotationAndTranslation -model F3H-1/EVAL20.obj \
-      -reference F3H-1/EVAL20_wtr.obj -computeVerticalError verticalError.txt \
+   CompareOBJ -computeOptimalRotationAndTranslation -model EVAL20.obj \
+      -reference EVAL20_wtr.obj -computeVerticalError verticalError.txt \
       -saveOptimalShape optimal.obj -savePlateDiff plateDiff.txt -savePlateIndex plateIndex.txt
 
 The screen output is
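CompareOBJ's `-computeOptimalRotationAndTranslation` option reports the rigid transform that best aligns the comparison model with the reference. For point sets with known one-to-one correspondence, the standard least-squares solution is the Kabsch algorithm; the planar (2-D) case has a closed form and keeps the sketch short (illustrative only, and CompareOBJ's actual solver is not documented here; among other things it must also establish correspondence):

```python
import math

def best_rigid_transform_2d(model, reference):
    """Closed-form least-squares rotation angle and translation aligning
    2-D model points to reference points with known correspondence
    (the planar case of the Kabsch algorithm)."""
    n = len(model)
    cmx = sum(p[0] for p in model) / n
    cmy = sum(p[1] for p in model) / n
    crx = sum(q[0] for q in reference) / n
    cry = sum(q[1] for q in reference) / n
    num = den = 0.0
    for (px, py), (qx, qy) in zip(model, reference):
        px, py = px - cmx, py - cmy          # center the model point
        qx, qy = qx - crx, qy - cry          # center the reference point
        num += px * qy - py * qx             # sum of cross products
        den += px * qx + py * qy             # sum of dot products
    theta = math.atan2(num, den)             # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = crx - (c * cmx - s * cmy)           # translation after rotation
    ty = cry - (s * cmx + c * cmy)
    return theta, (tx, ty)
```

In 3-D the same centering step is followed by an SVD of the cross-covariance matrix to obtain the rotation.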
@@ -77,7 +77,7 @@ model for comparison:
 
 ::
 
-   ShapeFormatConverter -input Bennu/Bennu49k.obj -output BennuComparison.obj \
+   ShapeFormatConverter -input Bennu49k.obj -output BennuComparison.obj \
       -rotate 5,0,0,1 -translate 0.01,-0.01,0.01
 
 This rotates the shape model by 5 degrees about the z axis and then translates
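The rotate-then-translate step above can be written out directly: a 5 degree rotation about +z followed by the translation (0.01, -0.01, 0.01) applied to every vertex. A sketch (illustrative; reading the `-rotate 5,0,0,1` argument as angle in degrees plus rotation axis is an assumption, not the tool's documented parsing):

```python
import math

def rotate_z_then_translate(vertices, angle_deg, translation):
    """Rotate vertices about the +z axis by angle_deg, then add the
    translation vector. Sketch of the effect of
    ShapeFormatConverter -rotate 5,0,0,1 -translate 0.01,-0.01,0.01."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    tx, ty, tz = translation
    return [(c * x - s * y + tx,   # x' = x cos(a) - y sin(a)
             s * x + c * y + ty,   # y' = x sin(a) + y cos(a)
             z + tz)               # z is unchanged by a z-axis rotation
            for x, y, z in vertices]
```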
@@ -94,11 +94,11 @@ Run CompareOBJ to find the optimal transform to align the comparison with the re
 
    CompareOBJ -computeOptimalRotationAndTranslation \
       -model BennuComparison.obj \
-      -reference Bennu/Bennu49k.obj \
-      -computeVerticalError CompareOBJ/terrasaur-verticalError.txt \
-      -saveOptimalShape CompareOBJ/terrasaur-optimal.obj \
-      -savePlateDiff CompareOBJ/terrasaur-plateDiff.txt \
-      -savePlateIndex CompareOBJ/terrasaur-plateIndex.txt
+      -reference Bennu49k.obj \
+      -computeVerticalError terrasaur-verticalError.txt \
+      -saveOptimalShape terrasaur-optimal.obj \
+      -savePlateDiff terrasaur-plateDiff.txt \
+      -savePlateIndex terrasaur-plateIndex.txt
 
 The screen output is
doc/tools/PointCloudOverlap.rst (new file, 38 lines)

@@ -0,0 +1,38 @@
+.. _PointCloudOverlap:
+
+#################
+PointCloudOverlap
+#################
+
+*****
+Usage
+*****
+
+.. include:: ../toolDescriptions/PointCloudOverlap.txt
+   :literal:
+
+********
+Examples
+********
+
+Download the :download:`reference<./support_files/EVAL20_wtr.obj>` and :download:`comparison<./support_files/EVAL20.obj>`
+shape models. You can view them in a tool such as
+`ParaView <https://www.paraview.org/>`__.
+
+.. figure:: images/PointCloudOverlap_1.png
+
+   This image shows the reference (pink) and input (blue) shape models.
+
+Run PointCloudOverlap:
+
+::
+
+   PointCloudOverlap -inputFile EVAL20.obj -referenceFile EVAL20_wtr.obj -outputFile overlap.vtk
+
+Note that OBJ is supported as an input format but not as an output format.
+
+.. figure:: images/PointCloudOverlap_2.png
+
+   The points in white are those in the input model that overlap the reference.
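One way to read "points which overlap a reference point cloud" is a nearest-neighbour cutoff test: keep each input point whose nearest reference point lies within some distance. A brute-force sketch of that reading (illustrative only; PointCloudOverlap's actual criterion and any cutoff option are given in its usage text):

```python
def overlapping_points(input_pts, reference_pts, cutoff):
    """Return the input points whose nearest reference point lies
    within cutoff. Brute-force O(n*m) sketch; a k-d tree would be
    used for real point clouds."""
    c2 = cutoff ** 2

    def near_reference(p):
        return any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
                   + (p[2] - q[2]) ** 2 <= c2
                   for q in reference_pts)

    return [p for p in input_pts if near_reference(p)]
```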
doc/tools/TriAx.rst (new file, 43 lines)

@@ -0,0 +1,43 @@
+.. _TriAx:
+
+=====
+TriAx
+=====
+
+TriAx is an implementation of the SPC tool TRIAX, which generates a triaxial ellipsoid in ICQ format.
+
+*****
+Usage
+*****
+
+.. include:: ../toolDescriptions/TriAx.txt
+   :literal:
+
+*******
+Example
+*******
+
+Generate an ellipsoid with dimensions 10, 8, 6, with q = 8.
+
+::
+
+   TriAx -A 10 -B 8 -C 6 -Q 8 -output triax.icq -saveOBJ
+
+The following ellipsoid is generated:
+
+.. container:: figures-row
+
+   .. figure:: images/TriAx_X.png
+      :alt: looking down from the +X direction
+
+      looking down from the +X direction
+
+   .. figure:: images/TriAx_Y.png
+      :alt: looking down from the +Y direction
+
+      looking down from the +Y direction
+
+   .. figure:: images/TriAx_Z.png
+      :alt: looking down from the +Z direction
+
+      looking down from the +Z direction
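A triaxial ellipsoid with semi-axes A, B, C is the surface x^2/A^2 + y^2/B^2 + z^2/C^2 = 1. A sketch that evaluates points on that surface from a parametric latitude and longitude (illustrative; TriAx itself writes an ICQ quadrilateral mesh whose resolution is controlled by -Q):

```python
import math

def ellipsoid_point(a, b, c, lat_deg, lon_deg):
    """Point on the triaxial ellipsoid x^2/a^2 + y^2/b^2 + z^2/c^2 = 1
    at parametric latitude/longitude in degrees. Sketch only; the ICQ
    format tiles the surface with quads rather than a lat/lon grid."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    return (a * math.cos(lat) * math.cos(lon),
            b * math.cos(lat) * math.sin(lon),
            c * math.sin(lat))
```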
BIN doc/tools/images/PointCloudOverlap_1.png (new file, 174 KiB): Binary file not shown.
BIN doc/tools/images/PointCloudOverlap_2.png (new file, 139 KiB): Binary file not shown.
BIN doc/tools/images/TriAx_X.png (new file, 23 KiB): Binary file not shown.
BIN doc/tools/images/TriAx_Y.png (new file, 24 KiB): Binary file not shown.
BIN doc/tools/images/TriAx_Z.png (new file, 29 KiB): Binary file not shown.
@@ -89,7 +89,7 @@ function build_jar() {
 
 function make_scripts() {
 
-    classes=$(jar tf "${scriptPath}"/target/${packageName}.jar | grep $appSrcDir | grep -v '\$' | grep -v "package-info" | grep class)
+    classes=$(jar tf ${scriptPath}/target/${packageName}.jar | grep $appSrcDir | grep -v '\$' | grep -v "package-info" | grep -v "Immutable" | grep class)
 
     for class in $classes; do
         base=$(basename "$class" ".class")
@@ -247,7 +247,7 @@ for arch in 3rd-party/*; do
 done
 
 mvn -q -Dmdep.copyPom=true dependency:copy-dependencies
-rsync -a README.md CHANGELOG.md mkPackage.bash pom.xml doc src target/dependency "${pkgBase}"-src/
+rsync -a README.md CHANGELOG.md LICENSE.md mkPackage.bash pom.xml doc src target/dependency "${pkgBase}"-src/
 tar cfz ./dist/"${pkgBase}"-${rev}-src.tar.gz ./"${pkgBase}"-src
 echo -e "\nCreated ./dist/${pkgBase}-${rev}-src.tar.gz"
 /bin/rm -fr ./"${pkgBase}" ./"${pkgBase}"-src
pom.xml (102 lines changed)

@@ -6,21 +6,28 @@
 <artifactId>Terrasaur</artifactId>
 <version>0.0.1-SNAPSHOT</version>
 
-<!-- Specifies organization -->
+<inceptionYear>2025</inceptionYear>
 <organization>
-  <name>Johns Hopkins University Applied Physics Lab</name>
+  <name>Johns Hopkins University Applied Physics Laboratory</name>
   <url>https://www.jhuapl.edu</url>
 </organization>
 
-<!-- publish to surfshop -->
-<distributionManagement>
-  <repository>
-    <id>central</id>
-    <name>surfshop-snapshots</name>
-    <url>http://surfshop.jhuapl.edu:8081/artifactory/libs-snapshot-local
-    </url>
-  </repository>
-</distributionManagement>
+<developers>
+  <developer>
+    <id>HariNairJHUAPL</id>
+    <name>Hari Nair</name>
+    <email>Hari.Nair@jhuapl.edu</email>
+    <organization>JHUAPL</organization>
+    <organizationUrl>https://www.jhuapl.edu/</organizationUrl>
+  </developer>
+</developers>
+
+<licenses>
+  <license>
+    <name>MIT License</name>
+    <url>https://opensource.org/license/mit/</url>
+  </license>
+</licenses>
 
 <properties>
 <package>Terrasaur</package>
@@ -36,7 +43,7 @@
 <maven.compiler.release>21</maven.compiler.release>
 <javafx.version>21.0.5</javafx.version>
 
-<immutables.version>2.10.1</immutables.version>
+<immutables.version>2.11.1</immutables.version>
 <jackfruit.version>1.1.1</jackfruit.version>
 </properties>
 
@@ -45,9 +52,38 @@
 <sourceDirectory>src/main/java</sourceDirectory>
 <plugins>
 
+<plugin>
+  <groupId>com.mycila</groupId>
+  <artifactId>license-maven-plugin</artifactId>
+  <version>5.0.0</version>
+  <configuration>
+    <licenseSets>
+      <licenseSet>
+        <header>com/mycila/maven/plugin/license/templates/MIT.txt</header>
+        <includes>
+          <include>**/*.java</include>
+        </includes>
+        <excludes>
+          <exclude>3rd-party/**/*.java</exclude>
+          <exclude>support-libraries/3rd-party/**/*.java</exclude>
+        </excludes>
+      </licenseSet>
+    </licenseSets>
+  </configuration>
+  <executions>
+    <execution>
+      <id>check-license</id>
+      <phase>verify</phase>
+      <goals>
+        <goal>check</goal>
+      </goals>
+    </execution>
+  </executions>
+</plugin>
+
 <plugin>
 <artifactId>maven-compiler-plugin</artifactId>
-<version>3.13.0</version>
+<version>3.14.0</version>
 <configuration>
 <release>${maven.compiler.release}</release>
 <annotationProcessorPaths>
@@ -93,7 +129,7 @@
 <plugin>
 <groupId>org.apache.maven.plugins</groupId>
 <artifactId>maven-surefire-plugin</artifactId>
-<version>3.5.2</version>
+<version>3.5.3</version>
 <configuration>
 <argLine>
 -Djava.library.path=${project.basedir}/3rd-party/${env.ARCH}/spice/JNISpice/lib
@@ -103,7 +139,7 @@
 <plugin>
 <groupId>org.apache.maven.plugins</groupId>
 <artifactId>maven-enforcer-plugin</artifactId>
-<version>3.5.0</version>
+<version>3.6.1</version>
 <executions>
 <execution>
 <id>enforce-maven</id>
@@ -177,7 +213,7 @@
 <plugin>
 <groupId>org.codehaus.mojo</groupId>
 <artifactId>exec-maven-plugin</artifactId>
-<version>3.5.0</version>
+<version>3.5.1</version>
 <executions>
 <execution>
 <phase>generate-sources</phase>
@@ -192,6 +228,18 @@
 </execution>
 </executions>
 </plugin>
 
+<plugin>
+  <groupId>com.diffplug.spotless</groupId>
+  <artifactId>spotless-maven-plugin</artifactId>
+  <version>2.46.1</version>
+  <configuration>
+    <java>
+      <palantirJavaFormat />
+    </java>
+  </configuration>
+</plugin>
+
 </plugins>
 </build>
 <dependencies>
@@ -234,7 +282,7 @@
 <dependency>
 <groupId>commons-beanutils</groupId>
 <artifactId>commons-beanutils</artifactId>
-<version>1.10.0</version>
+<version>1.11.0</version>
 </dependency>
 <dependency>
 <groupId>commons-cli</groupId>
@@ -244,17 +292,22 @@
 <dependency>
 <groupId>org.apache.commons</groupId>
 <artifactId>commons-configuration2</artifactId>
-<version>2.11.0</version>
+<version>2.12.0</version>
 </dependency>
 <dependency>
 <groupId>org.apache.commons</groupId>
 <artifactId>commons-csv</artifactId>
-<version>1.13.0</version>
+<version>1.14.1</version>
 </dependency>
 <dependency>
 <groupId>commons-io</groupId>
 <artifactId>commons-io</artifactId>
-<version>2.18.0</version>
+<version>2.20.0</version>
+</dependency>
+<dependency>
+  <groupId>org.apache.commons</groupId>
+  <artifactId>commons-text</artifactId>
+  <version>1.14.0</version>
 </dependency>
 <dependency>
 <groupId>com.beust</groupId>
@@ -264,7 +317,7 @@
 <dependency>
 <groupId>com.google.code.gson</groupId>
 <artifactId>gson</artifactId>
-<version>2.12.1</version>
+<version>2.13.1</version>
 </dependency>
 
 <dependency>
@@ -285,18 +338,13 @@
 <dependency>
 <groupId>gov.nasa.gsfc.heasarc</groupId>
 <artifactId>nom-tam-fits</artifactId>
-<version>1.20.2</version>
+<version>1.21.1</version>
 </dependency>
 <dependency>
 <groupId>gov.nasa.jpl.naif</groupId>
 <artifactId>spice</artifactId>
 <version>N0067</version>
 </dependency>
-<dependency>
-<groupId>org.apache.commons</groupId>
-<artifactId>commons-text</artifactId>
-<version>1.13.0</version>
-</dependency>
 <dependency>
 <groupId>org.apache.logging.log4j</groupId>
 <artifactId>log4j-api</artifactId>
@@ -30,8 +30,30 @@ mkdir -p $(dirname $srcFile)
 touch $srcFile
 
 cat <<EOF > $srcFile
-package terrasaur.utils;
+/*
+ * The MIT License
+ * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
+ *
+ * Permission is hereby granted, free of charge, to any person obtaining a copy
+ * of this software and associated documentation files (the "Software"), to deal
+ * in the Software without restriction, including without limitation the rights
+ * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ * copies of the Software, and to permit persons to whom the Software is
+ * furnished to do so, subject to the following conditions:
+ *
+ * The above copyright notice and this permission notice shall be included in
+ * all copies or substantial portions of the Software.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+ * THE SOFTWARE.
+ */
+
+package terrasaur.utils;
 
 public class AppVersion {
 public final static String lastCommit = "$lastCommit";
@@ -1,3 +1,25 @@
+/*
+ * The MIT License
+ * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
+ *
+ * Permission is hereby granted, free of charge, to any person obtaining a copy
+ * of this software and associated documentation files (the "Software"), to deal
+ * in the Software without restriction, including without limitation the rights
+ * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ * copies of the Software, and to permit persons to whom the Software is
+ * furnished to do so, subject to the following conditions:
+ *
+ * The above copyright notice and this permission notice shall be included in
+ * all copies or substantial portions of the Software.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+ * THE SOFTWARE.
+ */
 package terrasaur.altwg.pipeline;
 
 import terrasaur.enums.AltwgDataType;
@@ -7,319 +29,314 @@ import terrasaur.utils.StringUtil;
 
 public class ALTWGProductNamer implements ProductNamer {
 
   public ALTWGProductNamer() {
     super();
   }
 
-  /**
-   * Parse the productName and return the portion of the name corresponding to a given field. Fields
-   * are assumed separated by "_" in the filename.
-   *
-   * @param productName
-   * @param fieldNum
-   * @return
-   */
-  @Override
-  public String getNameFrag(String productName, int fieldNum) {
-
-    String[] fields = productName.split("_");
-    String returnField = "ERROR";
-    if (fieldNum > fields.length) {
-      System.out.println(
-          "ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
-      System.out.println("returning:" + returnField);
-    } else {
-      returnField = fields[fieldNum];
-    }
-    return returnField;
-  }
-
-  @Override
-  public String productbaseName(
-      FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
-
-    String gsd = "gsd";
-    String dataSrc = "dataSrc";
-    String productType = altwgProduct.getFileFrag();
-    String prodVer = getVersion(hdrBuilder);
-
-    // extract ground sample distance. gsdD is in mm!
-    double gsdD = gsdFromHdr(hdrBuilder);
-
-    int gsdI = (int) Math.round(gsdD);
-    String fileUnits = "mm";
-    gsd = String.format("%05d", gsdI) + fileUnits;
-
-    // System.out.println("gsd:" + gsd);
-    // System.out.println("file units:" + fileUnits);
-
-    HeaderTag key = HeaderTag.DATASRC;
-    if (hdrBuilder.containsKey(key)) {
-      dataSrc = hdrBuilder.getCard(key).getValue().toLowerCase();
-
-      // check whether dataSrc needs to be modified
-      dataSrc = HeaderTag.getSDP(dataSrc);
-      // data source should only be 3 chars long
-      if (dataSrc.length() > 3) {
-        dataSrc = dataSrc.substring(0, 3);
-      }
-    }
-
-    key = HeaderTag.CLON;
-    String cLon = null;
-    if (hdrBuilder.containsKey(key)) {
-      cLon = hdrBuilder.getCard(key).getValue();
-    }
-    if (cLon == null) {
-      if (isGlobal) {
-        // set center longitude to 0.0 if value not parsed and this is a global product
-        cLon = "0.0";
-      } else {
-        String errMesg = "ERROR! Could not parse CLON from fits header!";
-        throw new RuntimeException(errMesg);
-      }
-    }
-    // System.out.println("clon:" + cLon);
-    key = HeaderTag.CLAT;
-    String cLat = null;
-    if (hdrBuilder.containsKey(key)) {
-      cLat = hdrBuilder.getCard(key).getValue();
-    }
-    if (cLat == null) {
-      if (isGlobal) {
-        // set center latitude to 0.0 if value not parsed and this is a global product
-        cLat = "0.0";
-      } else {
-        String errMesg = "ERROR! Could not parse CLAT from fits header!";
-        throw new RuntimeException(errMesg);
-      }
-    }
-    // System.out.println("clat" + cLat);
-
-    String region = "l";
-    if (isGlobal) {
-      region = "g";
-    }
-
-    String clahLon = ALTWGProductNamer.clahLon(cLat, cLon);
-
-    // pds likes having filenames all in the same case, so chose lowercase
-    String outFile =
-        ALTWGProductNamer.altwgBaseName(region, gsd, dataSrc, productType, clahLon, prodVer);
-    return outFile;
-  }
-
-  /**
-   * Retrieve the product version string. Returns initial default value if product version keyword
-   * not found in builder.
-   *
-   * @param hdrBuilder
-   */
-  @Override
-  public String getVersion(FitsHdrBuilder hdrBuilder) {
-    String prodVer = "prodVer";
-
-    // note: this has been changed to MAP_VER in the SIS
-    HeaderTag key = HeaderTag.MAP_VER;
-    // key = HeaderTag.PRODVERS;
+  /**
+   * Parse the productName and return the portion of the name corresponding to a given field. Fields
+   * are assumed separated by "_" in the filename.
+   *
+   * @param productName
+   * @param fieldNum
+   * @return
+   */
+  @Override
+  public String getNameFrag(String productName, int fieldNum) {
+
+    String[] fields = productName.split("_");
+    String returnField = "ERROR";
+    if (fieldNum > fields.length) {
+      System.out.println("ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
+      System.out.println("returning:" + returnField);
+    } else {
+      returnField = fields[fieldNum];
|
|
||||||
if (hdrBuilder.containsKey(key)) {
|
|
||||||
prodVer = hdrBuilder.getCard(key).getValue();
|
|
||||||
prodVer = prodVer.replaceAll("\\.", "");
|
|
||||||
}
|
|
||||||
|
|
||||||
return prodVer;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Given the fields return the altwg PDS base name
|
|
||||||
public static String altwgBaseName(
|
|
||||||
String region, String gsd, String dataSource, String desc, String lahLon, String version) {
|
|
||||||
|
|
||||||
StringBuilder sb = new StringBuilder();
|
|
||||||
String delim = "_";
|
|
||||||
sb.append(region);
|
|
||||||
sb.append(delim);
|
|
||||||
sb.append(gsd);
|
|
||||||
sb.append(delim);
|
|
||||||
|
|
||||||
// data source should only be 3 characters long
|
|
||||||
if (dataSource.length() > 3) {
|
|
||||||
System.out.println("WARNING! dataSource:" + dataSource + " longer than 3 chars!");
|
|
||||||
dataSource = dataSource.substring(0, 3);
|
|
||||||
System.out.println(
|
|
||||||
"Will set data source to:"
|
|
||||||
+ dataSource
|
|
||||||
+ " but"
|
|
||||||
+ " this might NOT conform to the ALTWG naming convention!");
|
|
||||||
}
|
|
||||||
sb.append(dataSource);
|
|
||||||
sb.append(delim);
|
|
||||||
sb.append(desc);
|
|
||||||
sb.append(delim);
|
|
||||||
sb.append(lahLon);
|
|
||||||
sb.append(delim);
|
|
||||||
sb.append("v");
|
|
||||||
|
|
||||||
// remove '.' from version string
|
|
||||||
version = version.replaceAll("\\.", "");
|
|
||||||
sb.append(version);
|
|
||||||
|
|
||||||
// pds likes having filenames all in the same case, so chose lowercase
|
|
||||||
String outFile = sb.toString().toLowerCase();
|
|
||||||
|
|
||||||
return outFile;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Parse center lat, lon strings and return the formatted clahLon portion of the PDS filename.
|
|
||||||
*
|
|
||||||
* @param clat
|
|
||||||
* @param clon
|
|
||||||
* @return
|
|
||||||
*/
|
|
||||||
public static String clahLon(String clat, String clon) {
|
|
||||||
|
|
||||||
// String cLon = "";
|
|
||||||
|
|
||||||
// remove all whitespace that may exist in the strings
|
|
||||||
clat = clat.replaceAll("\\s+", "");
|
|
||||||
clon = clon.replaceAll("\\s+", "");
|
|
||||||
|
|
||||||
double cLonD = StringUtil.parseSafeD(clon);
|
|
||||||
double cLatD = StringUtil.parseSafeD(clat);
|
|
||||||
|
|
||||||
String clahLon = clahLon(cLatD, cLonD);
|
|
||||||
return clahLon;
|
|
||||||
}
|
|
||||||
|
|
||||||
public static String clahLon(double cLatD, double cLonD) {
|
|
||||||
|
|
||||||
String cLon = "";
|
|
||||||
|
|
||||||
if (cLonD == Double.NaN) {
|
|
||||||
|
|
||||||
// unable to parse center longitude using normal method (see V in getProductCards())
|
|
||||||
cLon = "xxxxxx";
|
|
||||||
|
|
||||||
} else {
|
|
||||||
|
|
||||||
if (cLonD < 0) {
|
|
||||||
// transform to 0-360
|
|
||||||
cLonD = cLonD + 360D;
|
|
||||||
}
|
|
||||||
// format double to 2 significant digits
|
|
||||||
cLon = String.format("%06.2f", cLonD);
|
|
||||||
}
|
|
||||||
|
|
||||||
// remove decimal point
|
|
||||||
cLon = cLon.replace(".", "");
|
|
||||||
|
|
||||||
String cLat = "";
|
|
||||||
|
|
||||||
// System.out.println("cLatD:" + Double.toString(cLatD));
|
|
||||||
if (cLatD == Double.NaN) {
|
|
||||||
|
|
||||||
// unable to parse center latitude
|
|
||||||
cLat = "xxxxxx";
|
|
||||||
|
|
||||||
} else {
|
|
||||||
|
|
||||||
double tol = 0.0101D;
|
|
||||||
|
|
||||||
// determine whether latitude is within tolerance of its rounded value.
|
|
||||||
// if so then use rounded value
|
|
||||||
double roundValue = Math.round(cLatD);
|
|
||||||
double diffTol = Math.abs(roundValue - cLatD);
|
|
||||||
if (diffTol < tol) {
|
|
||||||
cLatD = roundValue;
|
|
||||||
}
|
|
||||||
String hemiSphere = (cLatD >= 0) ? "N" : "S";
|
|
||||||
|
|
||||||
if (cLatD < 0) {
|
|
||||||
// remove negative sign if in southern hemisphere
|
|
||||||
cLatD = cLatD * -1.0D;
|
|
||||||
}
|
|
||||||
// format cLat to 2 significant digits
|
|
||||||
cLat = String.format("%05.2f", cLatD);
|
|
||||||
cLat = cLat.replace(".", "");
|
|
||||||
|
|
||||||
// trim to length 4.
|
|
||||||
cLat = cLat.substring(0, Math.min(cLat.length(), 4));
|
|
||||||
cLat = cLat + hemiSphere;
|
|
||||||
}
|
|
||||||
|
|
||||||
String clahLon = cLat + cLon;
|
|
||||||
return clahLon;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* return GSD parsed from FitsHdrBuilder. Returns 0 if valid GSD could not be parsed. GSD is in
|
|
||||||
* mm.
|
|
||||||
*
|
|
||||||
* @param hdrBuilder
|
|
||||||
* @return
|
|
||||||
*/
|
|
||||||
@Override
|
|
||||||
public double gsdFromHdr(FitsHdrBuilder hdrBuilder) {
|
|
||||||
|
|
||||||
String gsd = "gsd";
|
|
||||||
double gsdD = Double.NaN;
|
|
||||||
|
|
||||||
// extract ground sample distance using GSD first
|
|
||||||
HeaderTag key = HeaderTag.GSD;
|
|
||||||
if (hdrBuilder.containsKey(key)) {
|
|
||||||
gsd = hdrBuilder.getCard(key).getValue();
|
|
||||||
gsdD = StringUtil.parseSafeD(gsd);
|
|
||||||
if (gsdD < 0D) {
|
|
||||||
// keyword value not initialized
|
|
||||||
gsdD = Double.NaN;
|
|
||||||
System.out.println("WARNING! keyword GSD not set!");
|
|
||||||
}
|
|
||||||
} else {
|
|
||||||
System.out.println("could not find " + key.toString() + " to parse GSD from.");
|
|
||||||
}
|
|
||||||
if (Double.isNaN(gsdD)) {
|
|
||||||
// could not parse GSD into valid number, try GSDI
|
|
||||||
key = HeaderTag.GSDI;
|
|
||||||
if (hdrBuilder.containsKey(key)) {
|
|
||||||
gsdD = StringUtil.parseSafeD(hdrBuilder.getCard(key).getValue());
|
|
||||||
if (gsdD < 0D) {
|
|
||||||
// keyword value not initialized
|
|
||||||
gsdD = Double.NaN;
|
|
||||||
System.out.println("WARNING! keyword GSDI not set!");
|
|
||||||
}
|
}
|
||||||
} else {
|
return returnField;
|
||||||
System.out.println("could not find " + key.toString() + " to parse GSD from.");
|
|
||||||
}
|
|
||||||
if (Double.isNaN(gsdD)) {
|
|
||||||
// still cannot parse gsd. Set to -999
|
|
||||||
System.out.println(
|
|
||||||
"WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
|
|
||||||
System.out.println("Setting gsd = -999");
|
|
||||||
gsdD = -999D;
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
|
@Override
|
||||||
|
public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
|
||||||
|
|
||||||
// mandated to use mm! change the units
|
String gsd = "gsd";
|
||||||
gsdD = gsdD * 10.0D;
|
String dataSrc = "dataSrc";
|
||||||
|
String productType = altwgProduct.getFileFrag();
|
||||||
|
String prodVer = getVersion(hdrBuilder);
|
||||||
|
|
||||||
|
// extract ground sample distance. gsdD is in mm!
|
||||||
|
double gsdD = gsdFromHdr(hdrBuilder);
|
||||||
|
|
||||||
|
int gsdI = (int) Math.round(gsdD);
|
||||||
|
String fileUnits = "mm";
|
||||||
|
gsd = String.format("%05d", gsdI) + fileUnits;
|
||||||
|
|
||||||
|
// System.out.println("gsd:" + gsd);
|
||||||
|
// System.out.println("file units:" + fileUnits);
|
||||||
|
|
||||||
|
HeaderTag key = HeaderTag.DATASRC;
|
||||||
|
if (hdrBuilder.containsKey(key)) {
|
||||||
|
dataSrc = hdrBuilder.getCard(key).getValue().toLowerCase();
|
||||||
|
|
||||||
|
// check whether dataSrc needs to be modified
|
||||||
|
dataSrc = HeaderTag.getSDP(dataSrc);
|
||||||
|
// data source should only be 3 chars long
|
||||||
|
if (dataSrc.length() > 3) {
|
||||||
|
dataSrc = dataSrc.substring(0, 3);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
key = HeaderTag.CLON;
|
||||||
|
String cLon = null;
|
||||||
|
if (hdrBuilder.containsKey(key)) {
|
||||||
|
cLon = hdrBuilder.getCard(key).getValue();
|
||||||
|
}
|
||||||
|
if (cLon == null) {
|
||||||
|
if (isGlobal) {
|
||||||
|
// set center longitude to 0.0 if value not parsed and this is a global product
|
||||||
|
cLon = "0.0";
|
||||||
|
} else {
|
||||||
|
String errMesg = "ERROR! Could not parse CLON from fits header!";
|
||||||
|
throw new RuntimeException(errMesg);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
// System.out.println("clon:" + cLon);
|
||||||
|
key = HeaderTag.CLAT;
|
||||||
|
String cLat = null;
|
||||||
|
if (hdrBuilder.containsKey(key)) {
|
||||||
|
cLat = hdrBuilder.getCard(key).getValue();
|
||||||
|
}
|
||||||
|
if (cLat == null) {
|
||||||
|
if (isGlobal) {
|
||||||
|
// set center latitude to 0.0 if value not parsed and this is a global product
|
||||||
|
cLat = "0.0";
|
||||||
|
} else {
|
||||||
|
String errMesg = "ERROR! Could not parse CLAT from fits header!";
|
||||||
|
throw new RuntimeException(errMesg);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
// System.out.println("clat" + cLat);
|
||||||
|
|
||||||
|
String region = "l";
|
||||||
|
if (isGlobal) {
|
||||||
|
region = "g";
|
||||||
|
}
|
||||||
|
|
||||||
|
String clahLon = ALTWGProductNamer.clahLon(cLat, cLon);
|
||||||
|
|
||||||
|
// pds likes having filenames all in the same case, so chose lowercase
|
||||||
|
String outFile = ALTWGProductNamer.altwgBaseName(region, gsd, dataSrc, productType, clahLon, prodVer);
|
||||||
|
return outFile;
|
||||||
}
|
}
|
||||||
|
|
||||||
return gsdD;
|
/**
|
||||||
}
|
* Retrieve the product version string. Returns initial default value if product version keyword
|
||||||
|
* not found in builder.
|
||||||
|
*
|
||||||
|
* @param hdrBuilder
|
||||||
|
*/
|
||||||
|
@Override
|
||||||
|
public String getVersion(FitsHdrBuilder hdrBuilder) {
|
||||||
|
String prodVer = "prodVer";
|
||||||
|
|
||||||
@Override
|
// note: this has been changed to MAP_VER in the SIS
|
||||||
public NameConvention getNameConvention() {
|
HeaderTag key = HeaderTag.MAP_VER;
|
||||||
return NameConvention.ALTPRODUCT;
|
// key = HeaderTag.PRODVERS;
|
||||||
}
|
if (hdrBuilder.containsKey(key)) {
|
||||||
|
prodVer = hdrBuilder.getCard(key).getValue();
|
||||||
|
prodVer = prodVer.replaceAll("\\.", "");
|
||||||
|
}
|
||||||
|
|
||||||
@Override
|
return prodVer;
|
||||||
/** Parse the filename using the ALTWGProduct naming convention and return the GSD value. */
|
}
|
||||||
public double gsdFromFilename(String filename) {
|
|
||||||
String[] splitStr = filename.split("_");
|
// Given the fields return the altwg PDS base name
|
||||||
// GSD is second element
|
public static String altwgBaseName(
|
||||||
String gsd = splitStr[1];
|
String region, String gsd, String dataSource, String desc, String lahLon, String version) {
|
||||||
gsd = gsd.replace("mm", "");
|
|
||||||
return StringUtil.parseSafeD(gsd);
|
StringBuilder sb = new StringBuilder();
|
||||||
}
|
String delim = "_";
|
||||||
|
sb.append(region);
|
||||||
|
sb.append(delim);
|
||||||
|
sb.append(gsd);
|
||||||
|
sb.append(delim);
|
||||||
|
|
||||||
|
// data source should only be 3 characters long
|
||||||
|
if (dataSource.length() > 3) {
|
||||||
|
System.out.println("WARNING! dataSource:" + dataSource + " longer than 3 chars!");
|
||||||
|
dataSource = dataSource.substring(0, 3);
|
||||||
|
System.out.println("Will set data source to:"
|
||||||
|
+ dataSource
|
||||||
|
+ " but"
|
||||||
|
+ " this might NOT conform to the ALTWG naming convention!");
|
||||||
|
}
|
||||||
|
sb.append(dataSource);
|
||||||
|
sb.append(delim);
|
||||||
|
sb.append(desc);
|
||||||
|
sb.append(delim);
|
||||||
|
sb.append(lahLon);
|
||||||
|
sb.append(delim);
|
||||||
|
sb.append("v");
|
||||||
|
|
||||||
|
// remove '.' from version string
|
||||||
|
version = version.replaceAll("\\.", "");
|
||||||
|
sb.append(version);
|
||||||
|
|
||||||
|
// pds likes having filenames all in the same case, so chose lowercase
|
||||||
|
String outFile = sb.toString().toLowerCase();
|
||||||
|
|
||||||
|
return outFile;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Parse center lat, lon strings and return the formatted clahLon portion of the PDS filename.
|
||||||
|
*
|
||||||
|
* @param clat
|
||||||
|
* @param clon
|
||||||
|
* @return
|
||||||
|
*/
|
||||||
|
public static String clahLon(String clat, String clon) {
|
||||||
|
|
||||||
|
// String cLon = "";
|
||||||
|
|
||||||
|
// remove all whitespace that may exist in the strings
|
||||||
|
clat = clat.replaceAll("\\s+", "");
|
||||||
|
clon = clon.replaceAll("\\s+", "");
|
||||||
|
|
||||||
|
double cLonD = StringUtil.parseSafeD(clon);
|
||||||
|
double cLatD = StringUtil.parseSafeD(clat);
|
||||||
|
|
||||||
|
String clahLon = clahLon(cLatD, cLonD);
|
||||||
|
return clahLon;
|
||||||
|
}
|
||||||
|
|
||||||
|
public static String clahLon(double cLatD, double cLonD) {
|
||||||
|
|
||||||
|
String cLon = "";
|
||||||
|
|
||||||
|
if (cLonD == Double.NaN) {
|
||||||
|
|
||||||
|
// unable to parse center longitude using normal method (see V in getProductCards())
|
||||||
|
cLon = "xxxxxx";
|
||||||
|
|
||||||
|
} else {
|
||||||
|
|
||||||
|
if (cLonD < 0) {
|
||||||
|
// transform to 0-360
|
||||||
|
cLonD = cLonD + 360D;
|
||||||
|
}
|
||||||
|
// format double to 2 significant digits
|
||||||
|
cLon = String.format("%06.2f", cLonD);
|
||||||
|
}
|
||||||
|
|
||||||
|
// remove decimal point
|
||||||
|
cLon = cLon.replace(".", "");
|
||||||
|
|
||||||
|
String cLat = "";
|
||||||
|
|
||||||
|
// System.out.println("cLatD:" + Double.toString(cLatD));
|
||||||
|
if (cLatD == Double.NaN) {
|
||||||
|
|
||||||
|
// unable to parse center latitude
|
||||||
|
cLat = "xxxxxx";
|
||||||
|
|
||||||
|
} else {
|
||||||
|
|
||||||
|
double tol = 0.0101D;
|
||||||
|
|
||||||
|
// determine whether latitude is within tolerance of its rounded value.
|
||||||
|
// if so then use rounded value
|
||||||
|
double roundValue = Math.round(cLatD);
|
||||||
|
double diffTol = Math.abs(roundValue - cLatD);
|
||||||
|
if (diffTol < tol) {
|
||||||
|
cLatD = roundValue;
|
||||||
|
}
|
||||||
|
String hemiSphere = (cLatD >= 0) ? "N" : "S";
|
||||||
|
|
||||||
|
if (cLatD < 0) {
|
||||||
|
// remove negative sign if in southern hemisphere
|
||||||
|
cLatD = cLatD * -1.0D;
|
||||||
|
}
|
||||||
|
// format cLat to 2 significant digits
|
||||||
|
cLat = String.format("%05.2f", cLatD);
|
||||||
|
cLat = cLat.replace(".", "");
|
||||||
|
|
||||||
|
// trim to length 4.
|
||||||
|
cLat = cLat.substring(0, Math.min(cLat.length(), 4));
|
||||||
|
cLat = cLat + hemiSphere;
|
||||||
|
}
|
||||||
|
|
||||||
|
String clahLon = cLat + cLon;
|
||||||
|
return clahLon;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* return GSD parsed from FitsHdrBuilder. Returns 0 if valid GSD could not be parsed. GSD is in
|
||||||
|
* mm.
|
||||||
|
*
|
||||||
|
* @param hdrBuilder
|
||||||
|
* @return
|
||||||
|
*/
|
||||||
|
@Override
|
||||||
|
public double gsdFromHdr(FitsHdrBuilder hdrBuilder) {
|
||||||
|
|
||||||
|
String gsd = "gsd";
|
||||||
|
double gsdD = Double.NaN;
|
||||||
|
|
||||||
|
// extract ground sample distance using GSD first
|
||||||
|
HeaderTag key = HeaderTag.GSD;
|
||||||
|
if (hdrBuilder.containsKey(key)) {
|
||||||
|
gsd = hdrBuilder.getCard(key).getValue();
|
||||||
|
gsdD = StringUtil.parseSafeD(gsd);
|
||||||
|
if (gsdD < 0D) {
|
||||||
|
// keyword value not initialized
|
||||||
|
gsdD = Double.NaN;
|
||||||
|
System.out.println("WARNING! keyword GSD not set!");
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
System.out.println("could not find " + key.toString() + " to parse GSD from.");
|
||||||
|
}
|
||||||
|
if (Double.isNaN(gsdD)) {
|
||||||
|
// could not parse GSD into valid number, try GSDI
|
||||||
|
key = HeaderTag.GSDI;
|
||||||
|
if (hdrBuilder.containsKey(key)) {
|
||||||
|
gsdD = StringUtil.parseSafeD(hdrBuilder.getCard(key).getValue());
|
||||||
|
if (gsdD < 0D) {
|
||||||
|
// keyword value not initialized
|
||||||
|
gsdD = Double.NaN;
|
||||||
|
System.out.println("WARNING! keyword GSDI not set!");
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
System.out.println("could not find " + key.toString() + " to parse GSD from.");
|
||||||
|
}
|
||||||
|
if (Double.isNaN(gsdD)) {
|
||||||
|
// still cannot parse gsd. Set to -999
|
||||||
|
System.out.println("WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
|
||||||
|
System.out.println("Setting gsd = -999");
|
||||||
|
gsdD = -999D;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
|
||||||
|
|
||||||
|
// mandated to use mm! change the units
|
||||||
|
gsdD = gsdD * 10.0D;
|
||||||
|
}
|
||||||
|
|
||||||
|
return gsdD;
|
||||||
|
}
|
||||||
|
|
||||||
|
@Override
|
||||||
|
public NameConvention getNameConvention() {
|
||||||
|
return NameConvention.ALTPRODUCT;
|
||||||
|
}
|
||||||
|
|
||||||
|
@Override
|
||||||
|
/** Parse the filename using the ALTWGProduct naming convention and return the GSD value. */
|
||||||
|
public double gsdFromFilename(String filename) {
|
||||||
|
String[] splitStr = filename.split("_");
|
||||||
|
// GSD is second element
|
||||||
|
String gsd = splitStr[1];
|
||||||
|
gsd = gsd.replace("mm", "");
|
||||||
|
return StringUtil.parseSafeD(gsd);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
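As a cross-check of the formatting rules in `clahLon(double, double)` above, the sketch below re-implements them as a free-standing class. `ClahLonSketch` is a hypothetical name invented for illustration, not repository code, and it pins the format locale so the decimal separator is always `.`.

```java
import java.util.Locale;

// Hypothetical standalone sketch of the clahLon() rules above (not repository code):
// wrap negative longitude into 0-360, format lat/lon to two decimal places,
// strip the decimal point, trim latitude to 4 digits, append the hemisphere letter.
public class ClahLonSketch {

  public static String clahLon(double cLatD, double cLonD) {
    String cLon;
    if (Double.isNaN(cLonD)) {
      cLon = "xxxxxx"; // longitude could not be parsed
    } else {
      if (cLonD < 0) {
        cLonD += 360.0; // transform to 0-360
      }
      cLon = String.format(Locale.ROOT, "%06.2f", cLonD).replace(".", "");
    }

    String cLat;
    if (Double.isNaN(cLatD)) {
      cLat = "xxxxxx"; // latitude could not be parsed
    } else {
      // snap to the rounded value when within tolerance
      double roundValue = Math.round(cLatD);
      if (Math.abs(roundValue - cLatD) < 0.0101) {
        cLatD = roundValue;
      }
      String hemisphere = (cLatD >= 0) ? "N" : "S";
      cLatD = Math.abs(cLatD); // hemisphere letter carries the sign
      cLat = String.format(Locale.ROOT, "%05.2f", cLatD).replace(".", "");
      cLat = cLat.substring(0, Math.min(cLat.length(), 4)) + hemisphere;
    }
    return cLat + cLon;
  }

  public static void main(String[] args) {
    System.out.println(clahLon(12.3456, -45.0)); // prints 1235N31500
  }
}
```

A center of (12.3456°, −45°) becomes `1235N31500`: the latitude `12.35` loses its decimal point and gains `N`, and −45° wraps to `315.00`.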
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.altwg.pipeline;

import terrasaur.enums.AltwgDataType;
@@ -7,184 +29,182 @@ import terrasaur.utils.StringUtil;
public class AltwgMLNNamer implements ProductNamer {
|
public class AltwgMLNNamer implements ProductNamer {
|
||||||
|
|
||||||
public AltwgMLNNamer() {
|
public AltwgMLNNamer() {
|
||||||
super();
|
super();
|
||||||
}
|
|
||||||
|
|
||||||
@Override
|
|
||||||
public String getNameFrag(String productName, int fieldNum) {
|
|
||||||
|
|
||||||
String nameFrag = "";
|
|
||||||
|
|
||||||
return nameFrag;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* ALTWG MLN naming convention applies only to one productType - the ALTWG NFT-MLN.
|
|
||||||
*
|
|
||||||
* @param hdrBuilder - contains values that are used to create the MLN according to naming
|
|
||||||
* convention.
|
|
||||||
* @param altwgProduct - N/A. Included here as part of the interface structure.
|
|
||||||
* @param isGlobal - N/A. Included here as part of the interface structure.
|
|
||||||
*/
|
|
||||||
@Override
|
|
||||||
public String productbaseName(
|
|
||||||
FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
|
|
||||||
|
|
||||||
// initialize string fragments for NFT name. This will help identify
|
|
||||||
// which string fragments have not been updated by the method.
|
|
||||||
String gsd = "gsd";
|
|
||||||
String dataSrc = "dataSrc";
|
|
||||||
String dataSrcfile = "dataSrcFile";
|
|
||||||
String productType = "nftdtm";
|
|
||||||
String cLon = "cLon";
|
|
||||||
String cLat = "cLat";
|
|
||||||
String prodVer = "prodVer";
|
|
||||||
String productID = "prodID";
|
|
||||||
|
|
||||||
// find relevant information in the hdrBuilder map.
|
|
||||||
double gsdD = gsdFromHdr(hdrBuilder);
|
|
||||||
int gsdI = (int) Math.round(gsdD);
|
|
||||||
String fileUnits = "mm";
|
|
||||||
gsd = String.format("%05d", gsdI) + fileUnits;
|
|
||||||
// HeaderTag key = HeaderTag.GSD;
|
|
||||||
// if (hdrBuilder.containsKey(key)) {
|
|
||||||
// gsd = hdrBuilder.getCard(key).getValue();
|
|
||||||
//
|
|
||||||
// double gsdD = Double.parseDouble(gsd);
|
|
||||||
// String fileUnits = "";
|
|
||||||
// if (hdrBuilder.getCard(key).getComment().contains("[mm]")) {
|
|
||||||
// fileUnits = "mm";
|
|
||||||
// } else if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
|
|
||||||
//
|
|
||||||
// // mandated to use mm! change the units
|
|
||||||
// gsdD = gsdD * 10.0D;
|
|
||||||
// gsdI = (int) Math.round(gsdD);
|
|
||||||
// }
|
|
||||||
//
|
|
||||||
// System.out.println("gsd:" + gsd);
|
|
||||||
// System.out.println("file units:" + fileUnits);
|
|
||||||
//
|
|
||||||
// }
|
|
||||||
|
|
||||||
HeaderTag key = HeaderTag.DATASRC;
|
|
||||||
if (hdrBuilder.containsKey(key)) {
|
|
||||||
dataSrc = hdrBuilder.getCard(key).getValue().toLowerCase();
|
|
||||||
// data source should only be 3 chars long
|
|
||||||
if (dataSrc.length() > 3) {
|
|
||||||
dataSrc = dataSrc.substring(0, 3);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
key = HeaderTag.CLON;
|
@Override
|
||||||
if (hdrBuilder.containsKey(key)) {
|
public String getNameFrag(String productName, int fieldNum) {
|
||||||
cLon = hdrBuilder.getCard(key).getValue();
|
|
||||||
|
String nameFrag = "";
|
||||||
|
|
||||||
|
return nameFrag;
|
||||||
}
|
}
|
||||||
|
|
||||||
key = HeaderTag.CLAT;
|
/**
|
||||||
if (hdrBuilder.containsKey(key)) {
|
* ALTWG MLN naming convention applies only to one productType - the ALTWG NFT-MLN.
|
||||||
cLat = hdrBuilder.getCard(key).getValue();
|
*
|
||||||
}
|
* @param hdrBuilder - contains values that are used to create the MLN according to naming
|
||||||
|
* convention.
|
||||||
key = HeaderTag.DATASRCF;
|
* @param altwgProduct - N/A. Included here as part of the interface structure.
|
||||||
if (hdrBuilder.containsKey(key)) {
|
* @param isGlobal - N/A. Included here as part of the interface structure.
|
||||||
dataSrcfile = hdrBuilder.getCard(key).getValue();
|
|
||||||
}
|
|
||||||
|
|
||||||
key = HeaderTag.PRODVERS;
|
|
||||||
if (hdrBuilder.containsKey(key)) {
|
|
||||||
prodVer = hdrBuilder.getCard(key).getValue();
|
|
||||||
prodVer = prodVer.replaceAll("\\.", "");
|
|
||||||
}
|
|
||||||
|
|
||||||
// hardcode region to local
|
|
||||||
String region = "l";
|
|
||||||
|
|
||||||
StringBuilder sb = new StringBuilder();
|
|
||||||
String delim = "_";
|
|
||||||
sb.append(region);
|
|
||||||
sb.append(delim);
|
|
||||||
sb.append(gsd);
|
|
||||||
sb.append(delim);
|
|
||||||
sb.append(dataSrc);
|
|
||||||
sb.append(delim);
|
|
||||||
sb.append(productType);
|
|
||||||
sb.append(delim);
|
|
||||||
|
|
||||||
/*
|
|
||||||
* determine product ID. For OLA it is the center lat,lon For SPC it is the NFT feature id,
|
|
||||||
* which is assumed to be the first 5 chars in DATASRC
|
|
||||||
*/
|
*/
|
||||||
if (dataSrc.contains("ola")) {
|
@Override
|
||||||
|
public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
|
||||||
|
|
||||||
// follow ALTWG product naming convention for center lat, lon
|
// initialize string fragments for NFT name. This will help identify
|
||||||
productID = ALTWGProductNamer.clahLon(cLat, cLon);
|
// which string fragments have not been updated by the method.
|
||||||
} else {
|
String gsd = "gsd";
|
||||||
productID = dataSrcfile.substring(0, 5);
|
String dataSrc = "dataSrc";
|
||||||
}
|
String dataSrcfile = "dataSrcFile";
|
||||||
sb.append(productID);
|
String productType = "nftdtm";
|
||||||
sb.append(delim);
|
String cLon = "cLon";
|
||||||
sb.append("v");
|
String cLat = "cLat";
|
||||||
sb.append(prodVer);
|
String prodVer = "prodVer";
|
||||||
|
String productID = "prodID";
|
||||||
|
|
||||||
return sb.toString().toLowerCase();
|
// find relevant information in the hdrBuilder map.
|
||||||
}
|
double gsdD = gsdFromHdr(hdrBuilder);
|
||||||
|
int gsdI = (int) Math.round(gsdD);
|
||||||
|
String fileUnits = "mm";
|
||||||
|
gsd = String.format("%05d", gsdI) + fileUnits;
|
||||||
|
// HeaderTag key = HeaderTag.GSD;
|
||||||
|
// if (hdrBuilder.containsKey(key)) {
|
||||||
|
// gsd = hdrBuilder.getCard(key).getValue();
|
||||||
|
//
|
||||||
|
// double gsdD = Double.parseDouble(gsd);
|
||||||
|
// String fileUnits = "";
|
||||||
|
// if (hdrBuilder.getCard(key).getComment().contains("[mm]")) {
|
||||||
|
// fileUnits = "mm";
|
||||||
|
// } else if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
|
||||||
|
//
|
||||||
|
// // mandated to use mm! change the units
|
||||||
|
// gsdD = gsdD * 10.0D;
|
||||||
|
// gsdI = (int) Math.round(gsdD);
|
||||||
|
// }
|
||||||
|
//
|
||||||
|
// System.out.println("gsd:" + gsd);
|
||||||
|
// System.out.println("file units:" + fileUnits);
|
||||||
|
//
|
||||||
|
// }
|
||||||
|
|
||||||
@Override
|
HeaderTag key = HeaderTag.DATASRC;
|
||||||
public String getVersion(FitsHdrBuilder hdrBuilder) {
|
if (hdrBuilder.containsKey(key)) {
|
||||||
|
dataSrc = hdrBuilder.getCard(key).getValue().toLowerCase();
|
||||||
|
// data source should only be 3 chars long
|
||||||
|
if (dataSrc.length() > 3) {
|
||||||
|
dataSrc = dataSrc.substring(0, 3);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
String version = "";
|
key = HeaderTag.CLON;
|
||||||
|
if (hdrBuilder.containsKey(key)) {
|
||||||
|
cLon = hdrBuilder.getCard(key).getValue();
|
||||||
|
}
|
||||||
|
|
||||||
return version;
|
key = HeaderTag.CLAT;
|
||||||
}
|
if (hdrBuilder.containsKey(key)) {
|
||||||
|
cLat = hdrBuilder.getCard(key).getValue();
|
||||||
|
}
|
||||||
|
|
||||||
/**
|
key = HeaderTag.DATASRCF;
|
||||||
* Extract ground sample distance from FitsHdrBuilder. GSD is needed as part of naming convention.
|
if (hdrBuilder.containsKey(key)) {
|
||||||
* GSD in units of mm.
|
dataSrcfile = hdrBuilder.getCard(key).getValue();
|
||||||
*/
|
}
|
||||||
@Override
|
|
||||||
public double gsdFromHdr(FitsHdrBuilder hdrBuilder) {
|
|
||||||
|
|
||||||
// find relevant information in the hdrBuilder map.
|
key = HeaderTag.PRODVERS;
|
||||||
double gsdD = Double.NaN;
|
if (hdrBuilder.containsKey(key)) {
|
||||||
HeaderTag key = HeaderTag.GSD;
|
prodVer = hdrBuilder.getCard(key).getValue();
|
||||||
if (hdrBuilder.containsKey(key)) {
|
prodVer = prodVer.replaceAll("\\.", "");
|
||||||
String gsd = hdrBuilder.getCard(key).getValue();
|
}
|
||||||
|
|
||||||
gsdD = Double.parseDouble(gsd);
|
// hardcode region to local
|
||||||
String fileUnits = "";
|
String region = "l";
|
||||||
if (hdrBuilder.getCard(key).getComment().contains("[mm]")) {
|
|
||||||
fileUnits = "mm";
|
|
||||||
} else if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
|
|
||||||
|
|
||||||
// mandated to use mm! change the units
|
StringBuilder sb = new StringBuilder();
|
||||||
gsdD = gsdD * 10.0D;
|
String delim = "_";
|
||||||
fileUnits = "mm";
|
sb.append(region);
|
||||||
}
|
sb.append(delim);
|
||||||
System.out.println("gsd:" + gsd);
|
sb.append(gsd);
|
||||||
System.out.println("file units:" + fileUnits);
|
sb.append(delim);
|
||||||
|
sb.append(dataSrc);
|
||||||
|
sb.append(delim);
|
||||||
|
sb.append(productType);
|
||||||
|
sb.append(delim);
|
||||||
|
|
||||||
} else {
|
/*
|
||||||
String errMesg =
|
* determine product ID. For OLA it is the center lat,lon For SPC it is the NFT feature id,
|
||||||
"ERROR! Could not find keyword:" + HeaderTag.GSD.toString() + " in hdrBuilder";
|
* which is assumed to be the first 5 chars in DATASRC
|
||||||
throw new RuntimeException(errMesg);
|
*/
|
||||||
|
if (dataSrc.contains("ola")) {
|
||||||
|
|
||||||
|
// follow ALTWG product naming convention for center lat, lon
|
||||||
      productID = ALTWGProductNamer.clahLon(cLat, cLon);
    } else {
      productID = dataSrcfile.substring(0, 5);
    }
    sb.append(productID);
    sb.append(delim);
    sb.append("v");
    sb.append(prodVer);

    return sb.toString().toLowerCase();
  }

  @Override
  public String getVersion(FitsHdrBuilder hdrBuilder) {
    String version = "";
    return version;
  }

  /**
   * Extract ground sample distance from FitsHdrBuilder. GSD is needed as part of naming convention.
   * GSD in units of mm.
   */
  @Override
  public double gsdFromHdr(FitsHdrBuilder hdrBuilder) {

    // find relevant information in the hdrBuilder map.
    double gsdD = Double.NaN;
    HeaderTag key = HeaderTag.GSD;
    if (hdrBuilder.containsKey(key)) {
      String gsd = hdrBuilder.getCard(key).getValue();
      gsdD = Double.parseDouble(gsd);
      String fileUnits = "";
      if (hdrBuilder.getCard(key).getComment().contains("[mm]")) {
        fileUnits = "mm";
      } else if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
        // mandated to use mm! change the units
        gsdD = gsdD * 10.0D;
        fileUnits = "mm";
      }
      System.out.println("gsd:" + gsd);
      System.out.println("file units:" + fileUnits);
    } else {
      String errMesg = "ERROR! Could not find keyword:" + HeaderTag.GSD.toString() + " in hdrBuilder";
      throw new RuntimeException(errMesg);
    }

    return gsdD;
  }

  @Override
  public NameConvention getNameConvention() {
    return NameConvention.ALTNFTMLN;
  }

  /** Parse the filename using the ALTWG MLN naming convention and return the GSD value. */
  @Override
  public double gsdFromFilename(String filename) {
    String[] splitStr = filename.split("_");
    // GSD is second element
    String gsd = splitStr[1];
    gsd = gsd.replace("mm", "");
    return StringUtil.parseSafeD(gsd);
  }
}
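The `gsdFromFilename` convention above (fields separated by `_`, ground sample distance in the second field with an `mm` suffix) can be sketched standalone. This is an illustrative sketch, not repository code: the filename is invented, and `Double.parseDouble` stands in for the repo's `StringUtil.parseSafeD`.

```java
// Sketch of the "GSD is the second underscore-separated field, suffixed with
// mm" filename convention. Hypothetical filename; not taken from the repo.
public class GsdFromFilenameDemo {

  static double gsdFromFilename(String filename) {
    String[] splitStr = filename.split("_");
    // GSD is the second element, e.g. "00050mm"
    String gsd = splitStr[1].replace("mm", "");
    return Double.parseDouble(gsd);
  }

  public static void main(String[] args) {
    // prints 50.0 for this made-up product name
    System.out.println(gsdFromFilename("l_00050mm_alt_dtm_didy_0000n00000_v001.fits"));
  }
}
```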
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.altwg.pipeline;

import java.util.HashMap;
@@ -14,291 +36,287 @@ import terrasaur.utils.StringUtil;
 */
public class DartNamer implements ProductNamer {

  public static String getBaseName(Map<NameFields, String> nameFragments) {

    // check to see if the map contains all the fragments needed. Throw runtimeexception if it
    // doesn't
    validateMap(nameFragments);

    StringBuilder sb = new StringBuilder();
    String delim = "_";
    sb.append(nameFragments.get(NameFields.REGION));
    sb.append(delim);
    sb.append(nameFragments.get(NameFields.GSD));
    sb.append(delim);

    // data source should only be 3 characters long
    String dataSource = nameFragments.get(NameFields.DATASRC);
    if (dataSource.length() > 3) {
      System.out.println("WARNING! dataSource:" + dataSource + " longer than 3 chars!");
      dataSource = dataSource.substring(0, 3);
      System.out.println("Will set data source to:" + dataSource + " but"
          + " this might NOT conform to the ALTWG naming convention!");
    }
    sb.append(dataSource);
    sb.append(delim);

    sb.append(nameFragments.get(NameFields.DATATYPE));
    sb.append(delim);
    sb.append(nameFragments.get(NameFields.TBODY));
    sb.append(delim);
    sb.append(nameFragments.get(NameFields.CLATLON));
    sb.append(delim);
    sb.append("v");

    // remove '.' from version string
    String version = nameFragments.get(NameFields.VERSION);
    version = version.replaceAll("\\.", "");
    sb.append(version);

    // pds likes having filenames all in the same case, so chose lowercase
    String outFile = sb.toString().toLowerCase();

    return outFile;
  }

  /**
   * Parse the productName and return the portion of the name corresponding to a given field. Fields
   * are assumed separated by "_" in the filename.
   *
   * @param productName
   * @param fieldNum
   * @return
   */
  @Override
  public String getNameFrag(String productName, int fieldNum) {

    String[] fields = productName.split("_");
    String returnField = "ERROR";
    if (fieldNum > fields.length) {
      System.out.println("ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
      System.out.println("returning:" + returnField);
    } else {
      returnField = fields[fieldNum];
    }
    return returnField;
  }

  @Override
  public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {

    String gsd = "gsd";
    String dataSrc = "dataSrc";

    Map<NameFields, String> nameFragments = new HashMap<NameFields, String>();

    // data type
    String productType = altwgProduct.getFileFrag();
    nameFragments.put(NameFields.DATATYPE, productType);

    // product version
    String prodVer = getVersion(hdrBuilder);
    nameFragments.put(NameFields.VERSION, prodVer);

    // extract ground sample distance. gsdD is in mm!
    double gsdD = gsdFromHdr(hdrBuilder);

    int gsdI = (int) Math.round(gsdD);
    String fileUnits = "mm";
    gsd = String.format("%05d", gsdI) + fileUnits;
    nameFragments.put(NameFields.GSD, gsd);

    // data source
    HeaderTag key = HeaderTag.DATASRC;
    if (hdrBuilder.containsKey(key)) {
      dataSrc = hdrBuilder.getCard(key).getValue().toLowerCase();
      // check whether dataSrc needs to be modified
      dataSrc = HeaderTag.getSDP(dataSrc);
      // data source should only be 3 chars long
      if (dataSrc.length() > 3) {
        dataSrc = dataSrc.substring(0, 3);
      }
    }
    nameFragments.put(NameFields.DATASRC, dataSrc);

    // center lon
    key = HeaderTag.CLON;
    String cLon = null;
    if (hdrBuilder.containsKey(key)) {
      cLon = hdrBuilder.getCard(key).getValue();
    }
    if (cLon == null) {
      if (isGlobal) {
        // set center longitude to 0.0 if value not parsed and this is a global product
        cLon = "0.0";
      } else {
        String errMesg = "ERROR! Could not parse CLON from fits header!";
        throw new RuntimeException(errMesg);
      }
    }

    // center lat
    key = HeaderTag.CLAT;
    String cLat = null;
    if (hdrBuilder.containsKey(key)) {
      cLat = hdrBuilder.getCard(key).getValue();
    }
    if (cLat == null) {
      if (isGlobal) {
        // set center latitude to 0.0 if value not parsed and this is a global product
        cLat = "0.0";
      } else {
        String errMesg = "ERROR! Could not parse CLAT from fits header!";
        throw new RuntimeException(errMesg);
      }
    }

    String clahLon = ALTWGProductNamer.clahLon(cLat, cLon);
    nameFragments.put(NameFields.CLATLON, clahLon);

    // region
    String region = "l";
    if (isGlobal) {
      region = "g";
    }
    nameFragments.put(NameFields.REGION, region);

    // target body
    key = HeaderTag.TARGET;
    String tBody = "unkn";
    if (hdrBuilder.containsKey(key)) {
      tBody = hdrBuilder.getCard(key).getValue();
      tBody = getBodyStFrag(tBody);
    }
    nameFragments.put(NameFields.TBODY, tBody);

    // pds likes having filenames all in the same case, so chose lowercase
    String outFile = DartNamer.getBaseName(nameFragments);
    return outFile;
  }

  @Override
  public String getVersion(FitsHdrBuilder hdrBuilder) {

    String prodVer = "prodVer";

    // note: this has been changed to MAP_VER in the SIS
    HeaderTag key = HeaderTag.MAP_VER;
    // key = HeaderTag.PRODVERS;
    if (hdrBuilder.containsKey(key)) {
      prodVer = hdrBuilder.getCard(key).getValue();
      prodVer = prodVer.replaceAll("\\.", "");
    }

    return prodVer;
  }

  @Override
  public double gsdFromHdr(FitsHdrBuilder hdrBuilder) {

    String gsd = "gsd";
    double gsdD = Double.NaN;

    // extract ground sample distance using GSD first
    HeaderTag key = HeaderTag.GSD;
    if (hdrBuilder.containsKey(key)) {
      gsd = hdrBuilder.getCard(key).getValue();
      gsdD = StringUtil.parseSafeD(gsd);
      if (gsdD < 0D) {
        // keyword value not initialized
        gsdD = Double.NaN;
        System.out.println("WARNING! keyword GSD not set!");
      }
    } else {
      System.out.println("could not find " + key.toString() + " to parse GSD from.");
    }
    if (Double.isNaN(gsdD)) {
      // could not parse GSD into valid number, try GSDI
      key = HeaderTag.GSDI;
      if (hdrBuilder.containsKey(key)) {
        gsdD = StringUtil.parseSafeD(hdrBuilder.getCard(key).getValue());
        if (gsdD < 0D) {
          // keyword value not initialized
          gsdD = Double.NaN;
          System.out.println("WARNING! keyword GSDI not set!");
        }
      } else {
        System.out.println("could not find " + key.toString() + " to parse GSD from.");
      }
      if (Double.isNaN(gsdD)) {
        // still cannot parse gsd. Set to -999
        System.out.println("WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
        System.out.println("Setting gsd = -999");
        gsdD = -999D;
      }
    }

    if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
      // mandated to use mm! change the units
      gsdD = gsdD * 10.0D;
    }

    return gsdD;
  }

  @Override
  public NameConvention getNameConvention() {
    return NameConvention.DARTPRODUCT;
  }

  /**
   * Parse target body string to get the proper string fragment for the target body name.
   *
   * @param tBody
   * @return
   */
  private String getBodyStFrag(String tBody) {

    String returnFrag = tBody;

    if (tBody.toLowerCase().contains("didy")) {
      returnFrag = "didy";
    } else {
      if (tBody.toLowerCase().contains("dimo")) {
        returnFrag = "dimo";
      } else {
        System.out.println("Could not parse target string fragment from:" + tBody);
      }
    }

    return returnFrag;
  }

  private static void validateMap(Map<NameFields, String> nameFragments) {

    NameFields[] reqFields = new NameFields[7];
    reqFields[0] = NameFields.REGION;
    reqFields[1] = NameFields.GSD;
    reqFields[2] = NameFields.DATASRC;
    reqFields[3] = NameFields.DATATYPE;
    reqFields[4] = NameFields.TBODY;
    reqFields[5] = NameFields.CLATLON;
    reqFields[6] = NameFields.VERSION;

    for (NameFields requiredField : reqFields) {
      if (!nameFragments.containsKey(requiredField)) {
        String errMesg = "ERROR! Missing required field:" + requiredField.toString();
        throw new RuntimeException(errMesg);
      }
    }
  }

  /** Parse the filename using the DART naming convention and return the GSD value. */
  @Override
  public double gsdFromFilename(String filename) {

    String[] splitStr = filename.split("_");
    // GSD is second element
    String gsd = splitStr[1];
    gsd = gsd.replace("mm", "");
    return StringUtil.parseSafeD(gsd);
  }
}
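The `getBaseName` assembly above follows a fixed field order: `region_gsd_datasrc_datatype_tbody_clatlon_vVERSION`, with dots stripped from the version and the whole name lowercased. A minimal self-contained sketch of that ordering (the enum is redeclared locally and all field values are invented for illustration; this is not the repo's class):

```java
import java.util.EnumMap;
import java.util.Map;

// Sketch of the DartNamer.getBaseName field order, with hypothetical values.
public class BaseNameDemo {

  enum NameFields { REGION, GSD, DATASRC, DATATYPE, TBODY, CLATLON, VERSION }

  static String getBaseName(Map<NameFields, String> f) {
    String delim = "_";
    StringBuilder sb = new StringBuilder();
    sb.append(f.get(NameFields.REGION)).append(delim);
    sb.append(f.get(NameFields.GSD)).append(delim);
    sb.append(f.get(NameFields.DATASRC)).append(delim);
    sb.append(f.get(NameFields.DATATYPE)).append(delim);
    sb.append(f.get(NameFields.TBODY)).append(delim);
    sb.append(f.get(NameFields.CLATLON)).append(delim);
    // version gets a "v" prefix and loses its dots
    sb.append("v").append(f.get(NameFields.VERSION).replaceAll("\\.", ""));
    // PDS prefers a single case, so lowercase the whole name
    return sb.toString().toLowerCase();
  }

  static String demo() {
    Map<NameFields, String> f = new EnumMap<>(NameFields.class);
    f.put(NameFields.REGION, "g");
    f.put(NameFields.GSD, "00050mm");
    f.put(NameFields.DATASRC, "alt");
    f.put(NameFields.DATATYPE, "dtm");
    f.put(NameFields.TBODY, "didy");
    f.put(NameFields.CLATLON, "0000N00000");
    f.put(NameFields.VERSION, "0.0.1");
    return getBaseName(f);
  }

  public static void main(String[] args) {
    // prints g_00050mm_alt_dtm_didy_0000n00000_v001
    System.out.println(demo());
  }
}
```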
@@ -1,26 +1,50 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.altwg.pipeline;

/**
 * Enum to store the different types of naming conventions.
 *
 * @author espirrc1
 *
 */
public enum NameConvention {
  ALTPRODUCT,
  ALTNFTMLN,
  DARTPRODUCT,
  NOMATCH,
  NONEUSED;

  public static NameConvention parseNameConvention(String name) {
    for (NameConvention nameConvention : values()) {
      if (nameConvention.toString().toLowerCase().equals(name.toLowerCase())) {
        System.out.println("parsed naming convention:" + nameConvention.toString());
        return nameConvention;
      }
    }
    NameConvention nameConvention = NameConvention.NOMATCH;
    System.out.println("NameConvention.parseNameConvention()" + " could not parse naming convention:" + name
        + ". Returning:" + nameConvention.toString());
    return nameConvention;
  }
}
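`parseNameConvention` above does a case-insensitive scan over the enum constants and falls back to `NOMATCH`. A self-contained sketch of that lookup (the enum is redeclared locally so the snippet compiles on its own; the repo's version also prints diagnostics):

```java
// Sketch of a case-insensitive enum lookup with a NOMATCH fallback, in the
// style of NameConvention.parseNameConvention. Not the repo's class.
public class ParseConventionDemo {

  enum NameConvention { ALTPRODUCT, ALTNFTMLN, DARTPRODUCT, NOMATCH, NONEUSED }

  static NameConvention parse(String name) {
    for (NameConvention c : NameConvention.values()) {
      // compare ignoring case, since config files may use any casing
      if (c.toString().equalsIgnoreCase(name)) {
        return c;
      }
    }
    return NameConvention.NOMATCH;
  }

  public static void main(String[] args) {
    System.out.println(parse("dartproduct")); // DARTPRODUCT
    System.out.println(parse("bogus"));       // NOMATCH
  }
}
```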
@@ -1,13 +1,40 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.altwg.pipeline;

/**
 * Enum to describe the different parts of the product name. Used by concrete classes implementing
 * ProductNamer.
 *
 * @author espirrc1
 *
 */
public enum NameFields {
  GSD,
  DATATYPE,
  VERSION,
  DATASRC,
  CLATLON,
  REGION,
  TBODY;
}
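Classes like `DartNamer` require every one of these fields to be present before assembling a name (its `validateMap` enumerates the same seven constants). A minimal sketch of such a required-fields check, assuming a locally redeclared enum and `values()` as the required set (the repo builds the list explicitly):

```java
import java.util.EnumMap;
import java.util.Map;

// Sketch of a validateMap-style required-fields check over NameFields.
// Local redeclaration for a self-contained example; not the repo's class.
public class ValidateFieldsDemo {

  enum NameFields { GSD, DATATYPE, VERSION, DATASRC, CLATLON, REGION, TBODY }

  static void validateMap(Map<NameFields, String> nameFragments) {
    for (NameFields required : NameFields.values()) {
      if (!nameFragments.containsKey(required)) {
        // same failure mode as the pipeline: abort with a RuntimeException
        throw new RuntimeException("ERROR! Missing required field:" + required);
      }
    }
  }

  public static void main(String[] args) {
    Map<NameFields, String> fragments = new EnumMap<>(NameFields.class);
    fragments.put(NameFields.GSD, "00050mm");
    try {
      validateMap(fragments); // incomplete map: throws
    } catch (RuntimeException e) {
      System.out.println(e.getMessage());
    }
  }
}
```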
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.altwg.pipeline;

import java.io.File;
@@ -13,99 +35,97 @@ import terrasaur.fits.FitsHdr.FitsHdrBuilder;
 */
public class NamingFactory {

  public static ProductNamer getNamingConvention(NameConvention namingConvention) {

    switch (namingConvention) {
      case ALTPRODUCT:
        return new ALTWGProductNamer();

      case ALTNFTMLN:
        return new AltwgMLNNamer();

      case DARTPRODUCT:
        return new DartNamer();

      default:
        System.err.println("ERROR! Naming convention:" + namingConvention.toString() + " not supported!");
        throw new RuntimeException();
    }
  }

  /**
   * Parse for keyword in pipeline config file that specifies what naming convention to use.
   *
   * @param pipeConfig
   * @return
   */
  public static ProductNamer parseNamingConvention(Map<AltPipelnEnum, String> pipeConfig, boolean verbose) {

    ProductNamer productNamer = null;

    if ((pipeConfig.containsKey(AltPipelnEnum.NAMINGCONVENTION))) {
      String value = pipeConfig.get(AltPipelnEnum.NAMINGCONVENTION);
      productNamer = parseNamingConvention(value);
    } else {
      String errMesg = "ERROR! Naming convention should have been defined in pipeConfig!";
      throw new RuntimeException(errMesg);
    }

    return productNamer;
  }

  /**
   * Parse string to determine the naming convention to use. Naming convention supplied by
   * ProductNamer interface.
   *
   * @param value
   * @return
   */
  public static ProductNamer parseNamingConvention(String value) {

    if (value.length() < 1) {
      String errMesg = "ERROR! Cannot pass empty string to NamingFactory.parseNamingConvention!";
      throw new RuntimeException(errMesg);
    }
    NameConvention nameConvention = NameConvention.parseNameConvention(value);
    return NamingFactory.getNamingConvention(nameConvention);
  }

  /**
   * Given the naming convention, hdrBuilder, productType, and original output file, return the
   * renamed output file and cross reference file. Output is returned as a File[] array where
   * array[0] is the output basename, array[1] is the cross-reference file. If no naming convention
   * is specified (NONEUSED) then array[0] is the same as outfile, array[1] is null.
   *
   * @param namingConvention
   * @param hdrBuilder
   * @param productType
   * @param isGlobal
   * @param outfile - proposed output filename. If Naming convention results in a renamed OBJ then
   *     this is not used. If no naming convention specified then outputFiles[0] = outfile.
   * @return
   */
  public static File[] getBaseNameAndCrossRef(
      NameConvention namingConvention,
      FitsHdrBuilder hdrBuilder,
|
|
||||||
AltwgDataType productType,
|
|
||||||
boolean isGlobal,
|
|
||||||
String outfile) {
|
|
||||||
|
|
||||||
File[] outputFiles = new File[2];
|
|
||||||
|
|
||||||
// default to no renaming.
|
|
||||||
File crossrefFile = null;
|
|
||||||
String basename = outfile;
|
|
||||||
|
|
||||||
if (namingConvention != NameConvention.NONEUSED) {
|
|
||||||
ProductNamer productNamer = NamingFactory.getNamingConvention(namingConvention);
|
|
||||||
basename = productNamer.productbaseName(hdrBuilder, productType, isGlobal);
|
|
||||||
crossrefFile = new File(outfile + ".crf");
|
|
||||||
}
|
}
|
||||||
|
|
||||||
outputFiles[0] = new File(basename);
|
/**
|
||||||
outputFiles[1] = crossrefFile;
|
* Parse string to determine the naming convention to use. Naming convention supplied by
|
||||||
return outputFiles;
|
* ProductNamer interface.
|
||||||
}
|
*
|
||||||
|
* @param value
|
||||||
|
* @return
|
||||||
|
*/
|
||||||
|
public static ProductNamer parseNamingConvention(String value) {
|
||||||
|
|
||||||
|
if (value.length() < 1) {
|
||||||
|
String errMesg = "ERROR! Cannot pass empty string to NamingFactory.parseNamingConvention!";
|
||||||
|
throw new RuntimeException(errMesg);
|
||||||
|
}
|
||||||
|
NameConvention nameConvention = NameConvention.parseNameConvention(value);
|
||||||
|
return NamingFactory.getNamingConvention(nameConvention);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Given the naming convention, hdrBuilder, productType, and original output file, return the
|
||||||
|
* renamed output file and cross reference file. Output is returned as a File[] array where
|
||||||
|
* array[0] is the output basename, array[1] is the cross-reference file. If no naming convention
|
||||||
|
* is specified (NONEUSED) then array[0] is the same as outfile, array[1] is null.
|
||||||
|
*
|
||||||
|
* @param namingConvention
|
||||||
|
* @param hdrBuilder
|
||||||
|
* @param productType
|
||||||
|
* @param isGlobal
|
||||||
|
* @param outfile - proposed output filename. If Naming convention results in a renamed OBJ then
|
||||||
|
* this is not used. If no naming convention specified then outputFiles[0] = outfile.
|
||||||
|
* @return
|
||||||
|
*/
|
||||||
|
public static File[] getBaseNameAndCrossRef(
|
||||||
|
NameConvention namingConvention,
|
||||||
|
FitsHdrBuilder hdrBuilder,
|
||||||
|
AltwgDataType productType,
|
||||||
|
boolean isGlobal,
|
||||||
|
String outfile) {
|
||||||
|
|
||||||
|
File[] outputFiles = new File[2];
|
||||||
|
|
||||||
|
// default to no renaming.
|
||||||
|
File crossrefFile = null;
|
||||||
|
String basename = outfile;
|
||||||
|
|
||||||
|
if (namingConvention != NameConvention.NONEUSED) {
|
||||||
|
ProductNamer productNamer = NamingFactory.getNamingConvention(namingConvention);
|
||||||
|
basename = productNamer.productbaseName(hdrBuilder, productType, isGlobal);
|
||||||
|
crossrefFile = new File(outfile + ".crf");
|
||||||
|
}
|
||||||
|
|
||||||
|
outputFiles[0] = new File(basename);
|
||||||
|
outputFiles[1] = crossrefFile;
|
||||||
|
return outputFiles;
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
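The switch above is a classic factory: a NameConvention enum value selects a ProductNamer implementation, and unsupported values fail fast. A minimal, self-contained sketch of the same dispatch pattern, using hypothetical Convention and Namer types as stand-ins for the project's NameConvention and ProductNamer:

```java
// Hypothetical stand-ins for NameConvention and ProductNamer; this sketches the
// dispatch pattern only, not the actual Terrasaur classes or naming schemes.
interface Namer {
  String baseName(String product);
}

enum Convention { ALTPRODUCT, DARTPRODUCT }

final class FactorySketch {
  // Mirror of getNamingConvention: unsupported values fail fast with a RuntimeException.
  static Namer forConvention(Convention c) {
    switch (c) {
      case ALTPRODUCT:
        return p -> "altwg_" + p.toLowerCase();
      case DARTPRODUCT:
        return p -> "dart_" + p.toLowerCase();
      default:
        throw new RuntimeException("Naming convention " + c + " not supported!");
    }
  }

  public static void main(String[] args) {
    System.out.println(FactorySketch.forConvention(Convention.ALTPRODUCT).baseName("DTM"));
  }
}
```

Callers never touch a concrete namer class; adding a new convention means one new enum constant and one new case.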
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.altwg.pipeline;

import terrasaur.enums.AltwgDataType;
@@ -5,17 +27,15 @@ import terrasaur.fits.FitsHdr.FitsHdrBuilder;

public interface ProductNamer {

  public String getNameFrag(String productName, int fieldNum);

  public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal);

  public String getVersion(FitsHdrBuilder hdrBuilder);

  public double gsdFromHdr(FitsHdrBuilder hdrBuilder);

  public NameConvention getNameConvention();

  public double gsdFromFilename(String filename);
}
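One part of the ProductNamer contract above, gsdFromFilename, recovers the ground sample distance encoded in a product filename. A self-contained sketch of that idea, assuming a hypothetical `_<n>cm` filename field (an assumption for illustration, not the actual ALTWG or DART naming convention):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

final class GsdSketch {
  // Hypothetical filename field: "_<n>cm" encodes the ground sample distance in centimeters.
  private static final Pattern GSD_FIELD = Pattern.compile("_(\\d+)cm");

  // Returns the ground sample distance in meters parsed from the filename.
  static double gsdFromFilename(String filename) {
    Matcher m = GSD_FIELD.matcher(filename);
    if (!m.find()) throw new IllegalArgumentException("no GSD field in " + filename);
    return Integer.parseInt(m.group(1)) / 100.0;
  }

  public static void main(String[] args) {
    System.out.println(GsdSketch.gsdFromFilename("dtm_50cm.obj")); // 0.5
  }
}
```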
@@ -1,9 +1,30 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.io.File;
import java.nio.charset.Charset;
import java.util.*;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
@@ -11,14 +32,14 @@ import org.apache.commons.io.FileUtils;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import spice.basic.Plane;
import spice.basic.Vector3;
import terrasaur.smallBodyModel.BoundingBox;
import terrasaur.smallBodyModel.SmallBodyModel;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.Log4j2Configurator;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.PolyDataUtil;
import vtk.vtkGenericCell;
import vtk.vtkPoints;
import vtk.vtkPolyData;
@@ -33,51 +54,45 @@ import vtk.vtksbCellLocator;
 */
public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Adjust vertices of one shape model to lie on the surface of another.";
  }

  @Override
  public String fullDescription(Options options) {

    String header =
        """
        \n
        This program takes 2 shape models in OBJ format and tries to adjust
        the vertices of the first shape model so they lie on the surface of the
        second shape model. It does this by shooting a ray starting from the origin
        in the direction of each point of the first model into the second model and
        then changes the point of the first model to the intersection point.""";
    return TerrasaurTool.super.fullDescription(options, header, "");
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("from")
        .hasArg()
        .desc("path to first shape model in OBJ format which will get shifted to the second shape model")
        .build());
    options.addOption(Option.builder("to")
        .hasArg()
        .desc("path to second shape model in OBJ format which the first shape model will try to match to")
        .build());
    options.addOption(Option.builder("output")
        .hasArg()
        .desc(
            "path to adjusted shape model in OBJ format generated by this program by shifting first to second")
        .build());
    options.addOption(Option.builder("filelist")
        .desc(
            """
            If specified then the second required argument to this program,
            "to" is a file containing a list of OBJ files to match to.
            In this situation the ray is shot into each of the shape models in this
@@ -86,204 +101,177 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
            list may be only a piece of the complete shape model (e.g. a mapola).
            However, the global shape model formed when all these pieces are
            combined together, may not have any holes or gaps.""")
        .build());
    options.addOption(Option.builder("fit-plane-radius")
        .hasArg()
        .desc(
            """
            If present, find a local normal at each point in the first shape
            model by fitting a plane to all points within the specified radius.
            Use this normal to adjust the point to the second shape model rather
            than the radial vector.""")
        .build());
    options.addOption(Option.builder("local")
        .desc(
            """
            Use when adjusting a local OBJ file to another. The best fit plane to the
            first shape model is used to adjust the vertices rather than the radial vector
            for each point.
            """)
        .build());
    return options;
  }

  static Vector3D computeMeanPoint(List<Vector3D> points) {
    Vector3D meanPoint = new Vector3D(0., 0., 0.);
    for (Vector3D point : points) meanPoint = meanPoint.add(point);
    meanPoint = meanPoint.scalarMultiply(1. / points.size());
    return meanPoint;
  }

  public static void adjustShapeModelToOtherShapeModel(
      vtkPolyData frompolydata, ArrayList<vtkPolyData> topolydata, double planeRadius, boolean localModel)
      throws Exception {
    vtkPoints points = frompolydata.GetPoints();
    long numberPoints = frompolydata.GetNumberOfPoints();

    boolean fitPlane = (planeRadius > 0);
    SmallBodyModel sbModel = new SmallBodyModel(frompolydata);
    double diagonalLength = new BoundingBox(frompolydata.GetBounds()).getDiagonalLength();

    ArrayList<vtksbCellLocator> cellLocators = new ArrayList<>();
    for (vtkPolyData polydata : topolydata) {
      vtksbCellLocator cellLocator = new vtksbCellLocator();
      cellLocator.SetDataSet(polydata);
      cellLocator.CacheCellBoundsOn();
      cellLocator.AutomaticOn();
      cellLocator.BuildLocator();
      cellLocators.add(cellLocator);
    }

    vtkGenericCell cell = new vtkGenericCell();
    double tol = 1e-6;
    double[] t = new double[1];
    double[] pcoords = new double[3];
    int[] subId = new int[1];
    long[] cell_id = new long[1];

    double[] localNormal = null;
    if (localModel) {
      // fit a plane to the local model and check that the normal points outward
      Plane localPlane = PolyDataUtil.fitPlaneToPolyData(frompolydata);
      Vector3 localNormalVector = localPlane.getNormal();
      if (localNormalVector.dot(localPlane.getPoint()) < 0) localNormalVector = localNormalVector.negate();
      localNormal = localNormalVector.toArray();
    }

    double[] p = new double[3];
    Vector3D origin = new Vector3D(0., 0., 0.);
    for (int i = 0; i < numberPoints; ++i) {
      points.GetPoint(i, p);
      Vector3D thisPoint = new Vector3D(p);

      Vector3D lookDir;

      if (fitPlane) {
        // fit a plane to the local area
        System.arraycopy(p, 0, origin.toArray(), 0, 3);
        lookDir = new Vector3D(sbModel.getNormalAtPoint(p, planeRadius)).normalize();
      } else if (localModel) {
        System.arraycopy(p, 0, origin.toArray(), 0, 3);
        lookDir = new Vector3D(localNormal).normalize();
      } else {
        // use radial vector
        lookDir = new Vector3D(p).normalize();
      }

      Vector3D lookPt = lookDir.scalarMultiply(diagonalLength);
      lookPt = lookPt.add(thisPoint);

      List<Vector3D> intersections = new ArrayList<>();
      for (vtksbCellLocator cellLocator : cellLocators) {
        double[] intersectPoint = new double[3];

        // trace ray from thisPoint to the lookPt - Assume cell intersection is the closest one if
        // there are multiple?
        // NOTE: result should return 1 in case of intersection but doesn't sometimes.
        // Use the norm of intersection point to test for intersection instead.
        int result = cellLocator.IntersectWithLine(
            thisPoint.toArray(), lookPt.toArray(), tol, t, intersectPoint, pcoords, subId, cell_id, cell);
        Vector3D intersectVector = new Vector3D(intersectPoint);

        NavigableMap<Double, Vector3D> pointsMap = new TreeMap<>();
        if (intersectVector.getNorm() > 0) {
          pointsMap.put(thisPoint.subtract(intersectVector).getNorm(), intersectVector);
        }

        // look in the other direction
        lookPt = lookDir.scalarMultiply(-diagonalLength);
        lookPt = lookPt.add(thisPoint);
        result = cellLocator.IntersectWithLine(
            thisPoint.toArray(), lookPt.toArray(), tol, t, intersectPoint, pcoords, subId, cell_id, cell);

        intersectVector = new Vector3D(intersectPoint);
        if (intersectVector.getNorm() > 0) {
          pointsMap.put(thisPoint.subtract(intersectVector).getNorm(), intersectVector);
        }

        if (!pointsMap.isEmpty()) intersections.add(pointsMap.get(pointsMap.firstKey()));
      }

      if (intersections.isEmpty()) throw new Exception("Error: no intersections at all");

      Vector3D meanIntersectionPoint = computeMeanPoint(intersections);
      points.SetPoint(i, meanIntersectionPoint.toArray());
    }
  }

  public static void main(String[] args) throws Exception {
    TerrasaurTool defaultOBJ = new AdjustShapeModelToOtherShapeModel();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    boolean loadListFromFile = cl.hasOption("filelist");
    double planeRadius = Double.parseDouble(cl.getOptionValue("fit-plane-radius", "-1"));
    boolean localModel = cl.hasOption("local");

    NativeLibraryLoader.loadVtkLibraries();

    String fromfile = cl.getOptionValue("from");
    String tofile = cl.getOptionValue("to");
    String outfile = cl.getOptionValue("output");

    Log4j2Configurator.getInstance();
    logger.info("loading <from-obj-file>: {}", fromfile);
    vtkPolyData frompolydata = PolyDataUtil.loadShapeModelAndComputeNormals(fromfile);

    ArrayList<vtkPolyData> topolydata = new ArrayList<>();
    if (loadListFromFile) {
      List<String> lines = FileUtils.readLines(new File(tofile), Charset.defaultCharset());
      for (String file : lines) {

        // checking length prevents trying to load an empty line, such as the
        // last line of the file.
        if (file.length() > 1) {
          logger.info("loading <to-obj-file>: {}", file);
          topolydata.add(PolyDataUtil.loadShapeModelAndComputeNormals(file));
        }
      }
    } else {
      logger.info("loading <to-obj-file>: {}", tofile);
      topolydata.add(PolyDataUtil.loadShapeModelAndComputeNormals(tofile));
    }

    adjustShapeModelToOtherShapeModel(frompolydata, topolydata, planeRadius, localModel);

    PolyDataUtil.saveShapeModelAsOBJ(frompolydata, outfile);

    logger.info("wrote {}", outfile);
  }
}
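In the tool's default (radial) mode above, each vertex of the first model moves along the ray from the origin through that vertex until it meets the second surface. A minimal sketch of that projection, using a sphere of radius R as a stand-in "to" surface (the real tool intersects rays with an arbitrary mesh via a VTK cell locator, and also searches the opposite direction):

```java
final class RadialAdjustSketch {
  // Intersection of the ray from the origin through v with a sphere of the given
  // radius: rescale v to that radius. A sphere stands in for the "to" mesh here.
  static double[] projectToSphere(double[] v, double radius) {
    double norm = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    double s = radius / norm;
    return new double[] {v[0] * s, v[1] * s, v[2] * s};
  }

  public static void main(String[] args) {
    double[] adjusted = RadialAdjustSketch.projectToSphere(new double[] {3, 4, 0}, 10);
    System.out.printf("%.1f %.1f %.1f%n", adjusted[0], adjusted[1], adjusted[2]); // 6.0 8.0 0.0
  }
}
```

The -fit-plane-radius and -local options replace the radial direction with a locally fitted plane normal, but the move-to-intersection step is the same.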
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.util.Map;
@@ -22,121 +44,117 @@ import vtk.vtkPolyData;
 */
public class AppendOBJ implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Combine multiple shape files (OBJ or VTK format) into one.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "This program combines input shape models into a single shape model.";
    return TerrasaurTool.super.fullDescription(options, header, "");
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(
        Option.builder("logFile")
            .hasArg()
            .desc("If present, save screen output to log file.")
            .build());
    StringBuilder sb = new StringBuilder();
    for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
    options.addOption(
        Option.builder("logLevel")
            .hasArg()
            .desc(
                "If present, print messages above selected priority. Valid values are "
                    + sb.toString().trim()
                    + ". Default is INFO.")
            .build());

    options.addOption(
        Option.builder("boundary")
            .desc("Only save out boundary. This option implies -vtk.")
            .build());
    options.addOption(
        Option.builder("decimate")
|
|
||||||
.hasArg()
|
|
||||||
.desc(
|
|
||||||
"Reduce the number of facets in the output shape model. The argument should be between 0 and 1. "
|
|
||||||
+ "For example, if a model has 100 facets and <arg> is 0.90, "
|
|
||||||
+ "there will be approximately 10 facets after the decimation.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("input")
|
|
||||||
.required()
|
|
||||||
.hasArgs()
|
|
||||||
.desc(
|
|
||||||
"input file(s) to read. Format is derived from the allowed extension: "
|
|
||||||
+ "icq, llr, obj, pds, plt, ply, stl, or vtk. Multiple files can be specified "
|
|
||||||
+ "with a single -input option, separated by whitespace. Alternatively, you may "
|
|
||||||
+ "specify -input multiple times.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("output").required().hasArg().desc("output file to write.").build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("vtk").desc("Save output file in VTK format rather than OBJ.").build());
|
|
||||||
return options;
|
|
||||||
}
|
|
||||||
|
|
||||||
public static void main(String[] args) throws Exception {
|
|
||||||
|
|
||||||
TerrasaurTool defaultOBJ = new AppendOBJ();
|
|
||||||
|
|
||||||
Options options = defineOptions();
|
|
||||||
|
|
||||||
CommandLine cl = defaultOBJ.parseArgs(args, options);
|
|
||||||
|
|
||||||
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
|
|
||||||
for (MessageLabel ml : startupMessages.keySet())
|
|
||||||
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
|
|
||||||
|
|
||||||
boolean boundaryOnly = cl.hasOption("boundary");
|
|
||||||
boolean vtkFormat = boundaryOnly || cl.hasOption("vtk");
|
|
||||||
boolean decimate = cl.hasOption("decimate");
|
|
||||||
double decimationPercentage =
|
|
||||||
decimate ? Double.parseDouble(cl.getOptionValue("decimate")) : 1.0;
|
|
||||||
|
|
||||||
NativeLibraryLoader.loadVtkLibraries();
|
|
||||||
|
|
||||||
String outfile = cl.getOptionValue("output");
|
|
||||||
String[] infiles = cl.getOptionValues("input");
|
|
||||||
|
|
||||||
vtkAppendPolyData append = new vtkAppendPolyData();
|
|
||||||
append.UserManagedInputsOn();
|
|
||||||
append.SetNumberOfInputs(infiles.length);
|
|
||||||
|
|
||||||
for (int i = 0; i < infiles.length; ++i) {
|
|
||||||
logger.info("loading {} {} / {}", infiles[i], i + 1, infiles.length);
|
|
||||||
|
|
||||||
vtkPolyData polydata = PolyDataUtil.loadShapeModel(infiles[i]);
|
|
||||||
|
|
||||||
if (polydata == null) {
|
|
||||||
logger.warn("Cannot load {}", infiles[i]);
|
|
||||||
} else {
|
|
||||||
if (boundaryOnly) {
|
|
||||||
vtkPolyData boundary = PolyDataUtil.getBoundary(polydata);
|
|
||||||
boundary.GetCellData().SetScalars(null);
|
|
||||||
polydata.DeepCopy(boundary);
|
|
||||||
}
|
|
||||||
|
|
||||||
append.SetInputDataByNumber(i, polydata);
|
|
||||||
}
|
|
||||||
System.gc();
|
|
||||||
vtkObjectBase.JAVA_OBJECT_MANAGER.gc(false);
|
|
||||||
}
|
}
|
||||||
|
|
||||||
append.Update();
|
@Override
|
||||||
|
public String fullDescription(Options options) {
|
||||||
|
String header = "This program combines input shape models into a single shape model.";
|
||||||
|
return TerrasaurTool.super.fullDescription(options, header, "");
|
||||||
|
}
|
||||||
|
|
||||||
vtkPolyData outputShape = append.GetOutput();
|
private static Options defineOptions() {
|
||||||
if (decimate) PolyDataUtil.decimatePolyData(outputShape, decimationPercentage);
|
Options options = TerrasaurTool.defineOptions();
|
||||||
|
options.addOption(Option.builder("logFile")
|
||||||
|
.hasArg()
|
||||||
|
.desc("If present, save screen output to log file.")
|
||||||
|
.build());
|
||||||
|
StringBuilder sb = new StringBuilder();
|
||||||
|
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
|
||||||
|
options.addOption(Option.builder("logLevel")
|
||||||
|
.hasArg()
|
||||||
|
.desc("If present, print messages above selected priority. Valid values are "
|
||||||
|
+ sb.toString().trim()
|
||||||
|
+ ". Default is INFO.")
|
||||||
|
.build());
|
||||||
|
|
||||||
if (vtkFormat) PolyDataUtil.saveShapeModelAsVTK(outputShape, outfile);
|
options.addOption(Option.builder("boundary")
|
||||||
else PolyDataUtil.saveShapeModelAsOBJ(append.GetOutput(), outfile);
|
.desc("Only save out boundary. This option implies -vtk.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("decimate")
|
||||||
|
.hasArg()
|
||||||
|
.desc(
|
||||||
|
"Reduce the number of facets in the output shape model. The argument should be between 0 and 1. "
|
||||||
|
+ "For example, if a model has 100 facets and <arg> is 0.90, "
|
||||||
|
+ "there will be approximately 10 facets after the decimation.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("input")
|
||||||
|
.required()
|
||||||
|
.hasArgs()
|
||||||
|
.desc("input file(s) to read. Format is derived from the allowed extension: "
|
||||||
|
+ "icq, llr, obj, pds, plt, ply, stl, or vtk. Multiple files can be specified "
|
||||||
|
+ "with a single -input option, separated by whitespace. Alternatively, you may "
|
||||||
|
+ "specify -input multiple times.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("output")
|
||||||
|
.required()
|
||||||
|
.hasArg()
|
||||||
|
.desc("output file to write.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("vtk")
|
||||||
|
.desc("Save output file in VTK format rather than OBJ.")
|
||||||
|
.build());
|
||||||
|
return options;
|
||||||
|
}
|
||||||
|
|
||||||
logger.info("Wrote " + outfile);
|
public static void main(String[] args) throws Exception {
|
||||||
}
|
|
||||||
|
TerrasaurTool defaultOBJ = new AppendOBJ();
|
||||||
|
|
||||||
|
Options options = defineOptions();
|
||||||
|
|
||||||
|
CommandLine cl = defaultOBJ.parseArgs(args, options);
|
||||||
|
|
||||||
|
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
|
||||||
|
for (MessageLabel ml : startupMessages.keySet())
|
||||||
|
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
|
||||||
|
|
||||||
|
boolean boundaryOnly = cl.hasOption("boundary");
|
||||||
|
boolean vtkFormat = boundaryOnly || cl.hasOption("vtk");
|
||||||
|
boolean decimate = cl.hasOption("decimate");
|
||||||
|
double decimationPercentage = decimate ? Double.parseDouble(cl.getOptionValue("decimate")) : 1.0;
|
||||||
|
|
||||||
|
NativeLibraryLoader.loadVtkLibraries();
|
||||||
|
|
||||||
|
String outfile = cl.getOptionValue("output");
|
||||||
|
String[] infiles = cl.getOptionValues("input");
|
||||||
|
|
||||||
|
vtkAppendPolyData append = new vtkAppendPolyData();
|
||||||
|
append.UserManagedInputsOn();
|
||||||
|
append.SetNumberOfInputs(infiles.length);
|
||||||
|
|
||||||
|
for (int i = 0; i < infiles.length; ++i) {
|
||||||
|
logger.info("loading {} {} / {}", infiles[i], i + 1, infiles.length);
|
||||||
|
|
||||||
|
vtkPolyData polydata = PolyDataUtil.loadShapeModel(infiles[i]);
|
||||||
|
|
||||||
|
if (polydata == null) {
|
||||||
|
logger.warn("Cannot load {}", infiles[i]);
|
||||||
|
} else {
|
||||||
|
if (boundaryOnly) {
|
||||||
|
vtkPolyData boundary = PolyDataUtil.getBoundary(polydata);
|
||||||
|
boundary.GetCellData().SetScalars(null);
|
||||||
|
polydata.DeepCopy(boundary);
|
||||||
|
}
|
||||||
|
|
||||||
|
append.SetInputDataByNumber(i, polydata);
|
||||||
|
}
|
||||||
|
System.gc();
|
||||||
|
vtkObjectBase.JAVA_OBJECT_MANAGER.gc(false);
|
||||||
|
}
|
||||||
|
|
||||||
|
append.Update();
|
||||||
|
|
||||||
|
vtkPolyData outputShape = append.GetOutput();
|
||||||
|
if (decimate) PolyDataUtil.decimatePolyData(outputShape, decimationPercentage);
|
||||||
|
|
||||||
|
if (vtkFormat) PolyDataUtil.saveShapeModelAsVTK(outputShape, outfile);
|
||||||
|
else PolyDataUtil.saveShapeModelAsOBJ(append.GetOutput(), outfile);
|
||||||
|
|
||||||
|
logger.info("Wrote " + outfile);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
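The -decimate option's help text gives the arithmetic in words: a model with 100 facets and an argument of 0.90 keeps approximately 10 facets. A minimal sketch of that relationship, using a hypothetical helper class that is not part of AppendOBJ:

```java
/**
 * Sketch of the arithmetic behind the -decimate option (hypothetical helper,
 * not part of the Terrasaur source): decimating n facets by fraction f keeps
 * roughly n * (1 - f) facets.
 */
public class DecimateMath {
  public static long expectedFacets(long facets, double fraction) {
    if (fraction < 0.0 || fraction > 1.0)
      throw new IllegalArgumentException("fraction must be between 0 and 1");
    // Approximate post-decimation facet count
    return Math.round(facets * (1.0 - fraction));
  }

  public static void main(String[] args) {
    // The option help text's example: 100 facets, -decimate 0.90
    System.out.println(expectedFacets(100, 0.90));
  }
}
```

The actual count produced by vtkDecimatePro is only approximate, which is why the help text hedges with "approximately".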
@@ -1,3 +1,25 @@
/* MIT License header (identical to the one above) */
package terrasaur.apps;

import java.io.IOException;
@@ -17,72 +39,65 @@ import terrasaur.utils.batch.GridType;

public class BatchSubmit implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Run a command on a cluster.";
  }

  @Override
  public String fullDescription(Options options) {
    String footer = "\nRun a command on a cluster.\n";
    return TerrasaurTool.super.fullDescription(options, "", footer);
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("command")
        .required()
        .hasArgs()
        .desc("Required. Command(s) to run.")
        .build());

    StringBuilder sb = new StringBuilder();
    for (GridType type : GridType.values()) sb.append(String.format("%s ", type.name()));
    options.addOption(Option.builder("gridType")
        .hasArg()
        .desc("Grid type. Valid values are " + sb + ". Default is LOCAL.")
        .build());

    options.addOption(Option.builder("workingDir")
        .hasArg()
        .desc("Working directory to run command. Default is current directory.")
        .build());
    return options;
  }

  public static void main(String[] args) {

    TerrasaurTool defaultOBJ = new BatchSubmit();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    List<String> cmdList = Arrays.asList(cl.getOptionValues("command"));
    BatchType batchType = BatchType.GRID_ENGINE;
    GridType gridType = cl.hasOption("gridType") ? GridType.valueOf(cl.getOptionValue("gridType")) : GridType.LOCAL;

    BatchSubmitI submitter = BatchSubmitFactory.getBatchSubmit(cmdList, batchType, gridType);
    String workingDir = "";
    try {
      submitter.runBatchSubmitinDir(workingDir);
    } catch (InterruptedException | IOException e) {
      logger.error(e.getLocalizedMessage(), e);
    }
  }
}
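BatchSubmit resolves -gridType with a ternary: absent option means GridType.LOCAL, otherwise Enum.valueOf parses the argument. A self-contained sketch of that default-enum pattern; the nested GridType enum and its GRID_ENGINE constant here are stand-ins, not the real terrasaur.utils.batch.GridType:

```java
/**
 * Sketch of the -gridType default handling in BatchSubmit. The enum below is a
 * stand-in for terrasaur.utils.batch.GridType; only LOCAL is confirmed by the
 * option's help text ("Default is LOCAL.").
 */
public class GridTypeDefault {
  public enum GridType { LOCAL, GRID_ENGINE }

  public static GridType parse(String value) {
    // Missing option -> LOCAL; otherwise the name must match an enum constant,
    // or valueOf throws IllegalArgumentException.
    return value == null ? GridType.LOCAL : GridType.valueOf(value);
  }

  public static void main(String[] args) {
    System.out.println(parse(null));
    System.out.println(parse("GRID_ENGINE"));
  }
}
```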
|
|||||||
@@ -1,3 +1,25 @@
|
|||||||
|
/*
|
||||||
|
* The MIT License
|
||||||
|
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
|
||||||
|
*
|
||||||
|
* Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||||
|
* of this software and associated documentation files (the "Software"), to deal
|
||||||
|
* in the Software without restriction, including without limitation the rights
|
||||||
|
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||||
|
* copies of the Software, and to permit persons to whom the Software is
|
||||||
|
* furnished to do so, subject to the following conditions:
|
||||||
|
*
|
||||||
|
* The above copyright notice and this permission notice shall be included in
|
||||||
|
* all copies or substantial portions of the Software.
|
||||||
|
*
|
||||||
|
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||||
|
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||||
|
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||||
|
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||||
|
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||||
|
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
|
||||||
|
* THE SOFTWARE.
|
||||||
|
*/
|
||||||
package terrasaur.apps;
|
package terrasaur.apps;
|
||||||
|
|
||||||
import java.io.File;
|
import java.io.File;
|
||||||
@@ -42,35 +64,46 @@ import terrasaur.utils.math.MathConversions;
|
|||||||
|
|
||||||
public class CKFromSumFile implements TerrasaurTool {
|
public class CKFromSumFile implements TerrasaurTool {
|
||||||
|
|
||||||
private final static Logger logger = LogManager.getLogger();
|
private static final Logger logger = LogManager.getLogger();
|
||||||
|
|
||||||
@Override
|
@Override
|
||||||
public String shortDescription() {
|
public String shortDescription() {
|
||||||
return "Create a CK from a list of sumfiles.";
|
return "Create a CK from a list of sumfiles.";
|
||||||
}
|
}
|
||||||
|
|
||||||
@Override
|
@Override
|
||||||
public String fullDescription(Options options) {
|
public String fullDescription(Options options) {
|
||||||
String footer = "Create a CK from a list of sumfiles.";
|
String footer = "Create a CK from a list of sumfiles.";
|
||||||
return TerrasaurTool.super.fullDescription(options, "", footer);
|
return TerrasaurTool.super.fullDescription(options, "", footer);
|
||||||
}
|
}
|
||||||
|
|
||||||
private static Options defineOptions() {
|
private static Options defineOptions() {
|
||||||
Options options = TerrasaurTool.defineOptions();
|
Options options = TerrasaurTool.defineOptions();
|
||||||
options.addOption(Option.builder("config").required().hasArg()
|
options.addOption(Option.builder("config")
|
||||||
.desc("Required. Name of configuration file.").build());
|
.required()
|
||||||
options.addOption(Option.builder("dumpConfig").hasArg()
|
.hasArg()
|
||||||
.desc("Write out an example configuration to the named file.").build());
|
.desc("Required. Name of configuration file.")
|
||||||
options.addOption(Option.builder("logFile").hasArg()
|
.build());
|
||||||
.desc("If present, save screen output to log file.").build());
|
options.addOption(Option.builder("dumpConfig")
|
||||||
StringBuilder sb = new StringBuilder();
|
.hasArg()
|
||||||
for (StandardLevel l : StandardLevel.values())
|
.desc("Write out an example configuration to the named file.")
|
||||||
sb.append(String.format("%s ", l.name()));
|
.build());
|
||||||
options.addOption(Option.builder("logLevel").hasArg()
|
options.addOption(Option.builder("logFile")
|
||||||
.desc("If present, print messages above selected priority. Valid values are "
|
.hasArg()
|
||||||
+ sb.toString().trim() + ". Default is INFO.")
|
.desc("If present, save screen output to log file.")
|
||||||
.build());
|
.build());
|
||||||
options.addOption(Option.builder("sumFile").hasArg().required().desc("""
|
StringBuilder sb = new StringBuilder();
|
||||||
|
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
|
||||||
|
options.addOption(Option.builder("logLevel")
|
||||||
|
.hasArg()
|
||||||
|
.desc("If present, print messages above selected priority. Valid values are "
|
||||||
|
+ sb.toString().trim() + ". Default is INFO.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("sumFile")
|
||||||
|
.hasArg()
|
||||||
|
.required()
|
||||||
|
.desc(
|
||||||
|
"""
|
||||||
Required. File listing sumfiles to read. This is a text file, one per line.
|
Required. File listing sumfiles to read. This is a text file, one per line.
|
||||||
Lines starting with # are ignored.
|
Lines starting with # are ignored.
|
||||||
|
|
||||||
@@ -84,212 +117,212 @@ public class CKFromSumFile implements TerrasaurTool {
|
|||||||
D717506131G0.SUM
|
D717506131G0.SUM
|
||||||
# This is a comment
|
# This is a comment
|
||||||
D717506132G0.SUM
|
D717506132G0.SUM
|
||||||
""").build());
|
""")
|
||||||
return options;
|
.build());
|
||||||
}
|
return options;
|
||||||
|
|
||||||
private final CKFromSumFileConfig config;
|
|
||||||
private final NavigableMap<SumFile, String> sumFiles;
|
|
||||||
|
|
||||||
private CKFromSumFile(){config=null;sumFiles=null;}
|
|
||||||
|
|
||||||
public CKFromSumFile(CKFromSumFileConfig config, NavigableMap<SumFile, String> sumFiles) {
|
|
||||||
this.config = config;
|
|
||||||
this.sumFiles = sumFiles;
|
|
||||||
}
|
|
||||||
|
|
||||||
public String writeMSOPCKFiles(String basename, List<String> comments) throws SpiceException {
|
|
||||||
|
|
||||||
ReferenceFrame instrFrame = new ReferenceFrame(config.instrumentFrameName());
|
|
||||||
ReferenceFrame scFrame = new ReferenceFrame(config.spacecraftFrame());
|
|
||||||
ReferenceFrame j2000 = new ReferenceFrame("J2000");
|
|
||||||
ReferenceFrame bodyFixed = new ReferenceFrame(config.bodyFrame());
|
|
||||||
|
|
||||||
ReferenceFrame ref = config.J2000() ? j2000 : bodyFixed;
|
|
||||||
|
|
||||||
logger.debug("Body fixed frame: {}", bodyFixed.getName());
|
|
||||||
logger.debug("Instrument frame: {}", instrFrame.getName());
|
|
||||||
logger.debug("Spacecraft frame: {}", scFrame.getName());
|
|
||||||
logger.debug(" Reference frame: {}", ref.getName());
|
|
||||||
|
|
||||||
File commentFile = new File(basename + "-comments.txt");
|
|
||||||
if (commentFile.exists())
|
|
||||||
if (!commentFile.delete())
|
|
||||||
logger.error("{} exists but cannot be deleted!", commentFile.getPath());
|
|
||||||
|
|
||||||
String setupFile = basename + ".setup";
|
|
||||||
String inputFile = basename + ".inp";
|
|
||||||
|
|
||||||
try (PrintWriter pw = new PrintWriter(commentFile)) {
|
|
||||||
StringBuilder sb = new StringBuilder();
|
|
||||||
|
|
||||||
DateTimeFormatter dtf = DateTimeFormatter.ofPattern("uuuu-MM-dd HH:mm:ss z");
|
|
||||||
ZonedDateTime now = ZonedDateTime.now(ZoneId.of("UTC"));
|
|
||||||
|
|
||||||
sb.append("This CK was created on ").append(dtf.format(now)).append(" from the following sumFiles:\n");
|
|
||||||
for (SumFile sumFile : sumFiles.keySet()) {
|
|
||||||
sb.append(String.format("\t%s %s\n", sumFile.utcString(), sumFiles.get(sumFile)));
|
|
||||||
}
|
|
||||||
sb.append("\n");
|
|
||||||
sb.append("providing the orientation of ").append(scFrame.getName()).append(" with respect to ").append(config.J2000() ? "J2000" : bodyFixed.getName()).append(". ");
|
|
||||||
double first = new TDBTime(sumFiles.firstKey().utcString()).getTDBSeconds();
|
|
||||||
double last = new TDBTime(sumFiles.lastKey().utcString()).getTDBSeconds() + config.extend();
|
|
||||||
sb.append("The coverage period is ").append(new TDBTime(first).toUTCString("ISOC", 3)).append(" to ").append(new TDBTime(last).toUTCString("ISOC", 3)).append(" UTC.");
|
|
||||||
|
|
||||||
String allComments = sb.toString();
|
|
||||||
for (String comment : allComments.split("\\r?\\n"))
|
|
||||||
pw.println(WordUtils.wrap(comment, 80));
|
|
||||||
} catch (FileNotFoundException e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
|
||||||
}
|
}
|
||||||
|
|
||||||
Map<String, String> map = new TreeMap<>();
|
private final CKFromSumFileConfig config;
|
||||||
map.put("LSK_FILE_NAME", "'" + config.lsk() + "'");
|
private final NavigableMap<SumFile, String> sumFiles;
|
||||||
map.put("SCLK_FILE_NAME", "'" + config.sclk() + "'");
|
|
||||||
map.put("CK_TYPE", "3");
|
|
||||||
map.put("COMMENTS_FILE_NAME", String.format("'%s'", commentFile.getPath()));
|
|
||||||
map.put("INSTRUMENT_ID", String.format("%d", scFrame.getIDCode()));
|
|
||||||
map.put("REFERENCE_FRAME_NAME",
|
|
||||||
String.format("'%s'", config.J2000() ? "J2000" : bodyFixed.getName()));
|
|
||||||
if (!config.fk().isEmpty())
|
|
||||||
map.put("FRAMES_FILE_NAME", "'" + config.fk() + "'");
|
|
||||||
map.put("ANGULAR_RATE_PRESENT", "'MAKE UP/NO AVERAGING'");
|
|
||||||
map.put("INPUT_TIME_TYPE", "'UTC'");
|
|
||||||
map.put("INPUT_DATA_TYPE", "'SPICE QUATERNIONS'");
|
|
||||||
map.put("PRODUCER_ID", "'Hari.Nair@jhuapl.edu'");
|
|
||||||
|
|
||||||
try (PrintWriter pw = new PrintWriter(setupFile)) {
|
private CKFromSumFile() {
|
||||||
pw.println("\\begindata");
|
config = null;
|
||||||
for (String key : map.keySet()) {
|
sumFiles = null;
|
||||||
pw.printf("%s = %s\n", key, map.get(key));
|
|
||||||
}
|
|
||||||
} catch (FileNotFoundException e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
|
||||||
}
|
}
|
||||||
|
|
||||||
NavigableMap<Double, SpiceQuaternion> attitudeMap = new TreeMap<>();
|
public CKFromSumFile(CKFromSumFileConfig config, NavigableMap<SumFile, String> sumFiles) {
|
||||||
for (SumFile s : sumFiles.keySet()) {
|
this.config = config;
|
||||||
TDBTime t = new TDBTime(s.utcString());
|
this.sumFiles = sumFiles;
|
||||||
|
|
||||||
Vector3[] rows = new Vector3[3];
|
|
||||||
rows[0] = MathConversions.toVector3(s.cx());
|
|
||||||
rows[1] = MathConversions.toVector3(s.cy());
|
|
||||||
rows[2] = MathConversions.toVector3(s.cz());
|
|
||||||
|
|
||||||
Vector3 row0 = rows[Math.abs(config.flipX()) - 1];
|
|
||||||
Vector3 row1 = rows[Math.abs(config.flipY()) - 1];
|
|
||||||
Vector3 row2 = rows[Math.abs(config.flipZ()) - 1];
|
|
||||||
|
|
||||||
if (config.flipX() < 0)
|
|
||||||
row0 = row0.negate();
|
|
||||||
if (config.flipY() < 0)
|
|
||||||
row1 = row1.negate();
|
|
||||||
if (config.flipZ() < 0)
|
|
||||||
row2 = row2.negate();
|
|
||||||
|
|
||||||
Matrix33 refToInstr = new Matrix33(row0, row1, row2);
|
|
||||||
|
|
||||||
if (config.J2000()) {
|
|
||||||
Matrix33 j2000ToBodyFixed = j2000.getPositionTransformation(bodyFixed, t);
|
|
||||||
refToInstr = refToInstr.mxm(j2000ToBodyFixed);
|
|
||||||
}
|
|
||||||
|
|
||||||
Matrix33 instrToSc = instrFrame.getPositionTransformation(scFrame, t);
|
|
||||||
Matrix33 refToSc = instrToSc.mxm(refToInstr);
|
|
||||||
|
|
||||||
SpiceQuaternion q = new SpiceQuaternion(refToSc);
|
|
||||||
attitudeMap.put(t.getTDBSeconds(), q);
|
|
||||||
}
|
}
|
||||||
|
|
||||||
if (config.extend() > 0) {
|
public String writeMSOPCKFiles(String basename, List<String> comments) throws SpiceException {
|
||||||
var lastEntry = attitudeMap.lastEntry();
|
|
||||||
attitudeMap.put(lastEntry.getKey() + config.extend(), lastEntry.getValue());
|
ReferenceFrame instrFrame = new ReferenceFrame(config.instrumentFrameName());
|
||||||
|
ReferenceFrame scFrame = new ReferenceFrame(config.spacecraftFrame());
|
||||||
|
ReferenceFrame j2000 = new ReferenceFrame("J2000");
|
||||||
|
ReferenceFrame bodyFixed = new ReferenceFrame(config.bodyFrame());
|
||||||
|
|
||||||
|
ReferenceFrame ref = config.J2000() ? j2000 : bodyFixed;
|
||||||
|
|
||||||
|
logger.debug("Body fixed frame: {}", bodyFixed.getName());
|
||||||
|
logger.debug("Instrument frame: {}", instrFrame.getName());
|
||||||
|
logger.debug("Spacecraft frame: {}", scFrame.getName());
|
||||||
|
logger.debug(" Reference frame: {}", ref.getName());
|
||||||
|
|
||||||
|
File commentFile = new File(basename + "-comments.txt");
|
||||||
|
if (commentFile.exists())
|
||||||
|
if (!commentFile.delete()) logger.error("{} exists but cannot be deleted!", commentFile.getPath());
|
||||||
|
|
||||||
|
String setupFile = basename + ".setup";
|
||||||
|
String inputFile = basename + ".inp";
|
||||||
|
|
||||||
|
try (PrintWriter pw = new PrintWriter(commentFile)) {
|
||||||
|
StringBuilder sb = new StringBuilder();
|
||||||
|
|
||||||
|
DateTimeFormatter dtf = DateTimeFormatter.ofPattern("uuuu-MM-dd HH:mm:ss z");
|
||||||
|
ZonedDateTime now = ZonedDateTime.now(ZoneId.of("UTC"));
|
||||||
|
|
||||||
|
sb.append("This CK was created on ").append(dtf.format(now)).append(" from the following sumFiles:\n");
|
||||||
|
for (SumFile sumFile : sumFiles.keySet()) {
|
||||||
|
sb.append(String.format("\t%s %s\n", sumFile.utcString(), sumFiles.get(sumFile)));
|
||||||
|
}
|
||||||
|
sb.append("\n");
|
||||||
|
sb.append("providing the orientation of ")
|
||||||
|
.append(scFrame.getName())
|
||||||
|
.append(" with respect to ")
|
||||||
|
.append(config.J2000() ? "J2000" : bodyFixed.getName())
|
||||||
|
.append(". ");
|
||||||
|
double first = new TDBTime(sumFiles.firstKey().utcString()).getTDBSeconds();
|
||||||
|
double last = new TDBTime(sumFiles.lastKey().utcString()).getTDBSeconds() + config.extend();
|
||||||
|
sb.append("The coverage period is ")
|
||||||
|
.append(new TDBTime(first).toUTCString("ISOC", 3))
|
||||||
|
.append(" to ")
|
||||||
|
.append(new TDBTime(last).toUTCString("ISOC", 3))
|
||||||
|
.append(" UTC.");
|
||||||
|
|
||||||
|
String allComments = sb.toString();
|
||||||
|
for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
|
||||||
|
} catch (FileNotFoundException e) {
|
||||||
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
|
}
|
||||||
|
|
||||||
|
Map<String, String> map = new TreeMap<>();
|
||||||
|
map.put("LSK_FILE_NAME", "'" + config.lsk() + "'");
|
||||||
|
map.put("SCLK_FILE_NAME", "'" + config.sclk() + "'");
|
||||||
|
map.put("CK_TYPE", "3");
|
||||||
|
map.put("COMMENTS_FILE_NAME", String.format("'%s'", commentFile.getPath()));
|
||||||
|
map.put("INSTRUMENT_ID", String.format("%d", scFrame.getIDCode()));
|
||||||
|
map.put("REFERENCE_FRAME_NAME", String.format("'%s'", config.J2000() ? "J2000" : bodyFixed.getName()));
|
||||||
|
if (!config.fk().isEmpty()) map.put("FRAMES_FILE_NAME", "'" + config.fk() + "'");
|
||||||
|
map.put("ANGULAR_RATE_PRESENT", "'MAKE UP/NO AVERAGING'");
|
||||||
|
map.put("INPUT_TIME_TYPE", "'UTC'");
|
||||||
|
map.put("INPUT_DATA_TYPE", "'SPICE QUATERNIONS'");
|
||||||
|
map.put("PRODUCER_ID", "'Hari.Nair@jhuapl.edu'");
|
||||||
|
|
||||||
|
try (PrintWriter pw = new PrintWriter(setupFile)) {
|
||||||
|
pw.println("\\begindata");
|
||||||
|
for (String key : map.keySet()) {
|
||||||
|
pw.printf("%s = %s\n", key, map.get(key));
|
||||||
|
}
|
||||||
|
} catch (FileNotFoundException e) {
|
||||||
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
|
}
|
||||||
|
|
||||||
|
    NavigableMap<Double, SpiceQuaternion> attitudeMap = new TreeMap<>();
    for (SumFile s : sumFiles.keySet()) {
      TDBTime t = new TDBTime(s.utcString());

      Vector3[] rows = new Vector3[3];
      rows[0] = MathConversions.toVector3(s.cx());
      rows[1] = MathConversions.toVector3(s.cy());
      rows[2] = MathConversions.toVector3(s.cz());

      Vector3 row0 = rows[Math.abs(config.flipX()) - 1];
      Vector3 row1 = rows[Math.abs(config.flipY()) - 1];
      Vector3 row2 = rows[Math.abs(config.flipZ()) - 1];

      if (config.flipX() < 0) row0 = row0.negate();
      if (config.flipY() < 0) row1 = row1.negate();
      if (config.flipZ() < 0) row2 = row2.negate();

      Matrix33 refToInstr = new Matrix33(row0, row1, row2);

      if (config.J2000()) {
        Matrix33 j2000ToBodyFixed = j2000.getPositionTransformation(bodyFixed, t);
        refToInstr = refToInstr.mxm(j2000ToBodyFixed);
      }

      Matrix33 instrToSc = instrFrame.getPositionTransformation(scFrame, t);
      Matrix33 refToSc = instrToSc.mxm(refToInstr);

      SpiceQuaternion q = new SpiceQuaternion(refToSc);
      attitudeMap.put(t.getTDBSeconds(), q);
    }

    if (config.extend() > 0) {
      var lastEntry = attitudeMap.lastEntry();
      attitudeMap.put(lastEntry.getKey() + config.extend(), lastEntry.getValue());
    }

    try (PrintWriter pw = new PrintWriter(new FileWriter(inputFile))) {
      for (double t : attitudeMap.keySet()) {
        SpiceQuaternion q = attitudeMap.get(t);
        pw.printf(
            "%s %.14e %.14e %.14e %.14e\n",
            new TDBTime(t).toUTCString("ISOC", 6), q.getElt(0), q.getElt(1), q.getElt(2), q.getElt(3));
      }
    } catch (IOException e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    return String.format("msopck %s %s %s.bc", setupFile, inputFile, basename);
  }

  public static void main(String[] args) throws SpiceException, IOException {
    TerrasaurTool defaultOBJ = new CKFromSumFile();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    if (cl.hasOption("dumpConfig")) {
      CKFromSumFileConfigFactory factory = new CKFromSumFileConfigFactory();
      PropertiesConfiguration config = factory.toConfig(factory.getTemplate());
      try {
        String filename = cl.getOptionValue("dumpConfig");
        config.write(new PrintWriter(filename));
        logger.info("Wrote {}", filename);
      } catch (ConfigurationException | IOException e) {
        logger.error(e.getLocalizedMessage(), e);
      }
      System.exit(0);
    }

    NativeLibraryLoader.loadSpiceLibraries();

    PropertiesConfiguration config = null;
    CKFromSumFileConfigFactory factory = new CKFromSumFileConfigFactory();
    try {
      config = new Configurations().properties(new File(cl.getOptionValue("config")));
    } catch (ConfigurationException e1) {
      logger.error(e1.getLocalizedMessage(), e1);
    }

    CKFromSumFileConfig appConfig = factory.fromConfig(config);

    for (String kernel : appConfig.metakernel()) KernelDatabase.load(kernel);

    NavigableMap<SumFile, String> sumFiles = new TreeMap<>((o1, o2) -> {
      try {
        return Double.compare(
            new TDBTime(o1.utcString()).getTDBSeconds(), new TDBTime(o2.utcString()).getTDBSeconds());
      } catch (SpiceErrorException e) {
        logger.error(e.getLocalizedMessage(), e);
      }
      return 0;
    });

    List<String> lines = FileUtils.readLines(new File(cl.getOptionValue("sumFile")), Charset.defaultCharset());
    for (String line : lines) {
      if (line.strip().startsWith("#")) continue;
      String[] parts = line.strip().split("\\s+");
      String filename = parts[0].trim();
      sumFiles.put(SumFile.fromFile(new File(filename)), FilenameUtils.getBaseName(filename));
    }

    CKFromSumFile app = new CKFromSumFile(appConfig, sumFiles);
    TDBTime begin = new TDBTime(sumFiles.firstKey().utcString());
    TDBTime end = new TDBTime(sumFiles.lastKey().utcString());
    String picture = "YYYY_DOY";
    String command = app.writeMSOPCKFiles(
        String.format("ck_%s_%s", begin.toString(picture), end.toString(picture)), new ArrayList<>());

    logger.info("To generate the CK, run:\n\t{}", command);

    logger.info("Finished.");
  }
}
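The flipX/flipY/flipZ handling in writeMSOPCKFiles treats each flip value as a signed 1-based axis index: the magnitude selects a row of the camera orientation matrix, and a negative sign negates the selected row. A minimal Python sketch of that convention (the helper name `remap_axes` is illustrative, not part of Terrasaur):

```python
def remap_axes(rows, flip_x, flip_y, flip_z):
    """Reorder/negate matrix rows. Each flip value is a signed 1-based
    index into rows; a negative value negates the selected row."""
    out = []
    for flip in (flip_x, flip_y, flip_z):
        row = rows[abs(flip) - 1]
        if flip < 0:
            row = [-c for c in row]
        out.append(row)
    return out

# Identity rows; swap X and Y, negate Z:
rows = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(remap_axes(rows, 2, 1, -3))  # → [[0, 1, 0], [1, 0, 0], [0, 0, -1]]
```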
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.awt.*;

@@ -23,195 +45,223 @@ import vtk.vtkPolyData;

public class CreateSBMTStructure implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  /**
   * This doesn't need to be private, or even declared, but you might want to if you have other
   * constructors.
   */
  private CreateSBMTStructure() {}

  @Override
  public String shortDescription() {
    return "Construct ellipses from user-defined points on an image.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "This tool creates an SBMT ellipse file from a set of points on an image.";
    String footer = "";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  /**
   * Create an Ellipse as an SBMT structure from three points. The first two points define the long
   * axis and the third point defines the short axis.
   *
   * @param p1 First point
   * @param p2 Second point
   * @param p3 Third point
   * @return An SBMT structure describing the ellipse
   */
  private static SBMTEllipseRecord createRecord(int id, String name, Vector3D p1, Vector3D p2, Vector3D p3) {
    // Create a local coordinate system where X axis contains long axis and Y axis contains short
    // axis

    Vector3D origin = p1.add(p2).scalarMultiply(0.5);
    Vector3D X = p1.subtract(p2).normalize();
    Vector3D Y = p3.subtract(origin).normalize();

    // Create a rotation matrix to go from body fixed frame to this local coordinate system
    Rotation globalToLocal = RotationUtils.IprimaryJsecondary(X, Y);

    // All of these vectors should have a Z coordinate of zero
    Vector3D p1Local = globalToLocal.applyTo(p1);
    Vector3D p2Local = globalToLocal.applyTo(p2);
    Vector3D p3Local = globalToLocal.applyTo(p3);

    // fit an ellipse to the three points on the plane
    Vector2D a = new Vector2D(p1Local.getX(), p1Local.getY());
    Vector2D b = new Vector2D(p2Local.getX(), p2Local.getY());
    Vector2D c = new Vector2D(p3Local.getX(), p3Local.getY());

    Vector2D center = a.add(b).scalarMultiply(0.5);
    double majorAxis = a.subtract(b).getNorm();
    double minorAxis = 2 * c.subtract(center).getNorm();

    double rotation = Math.atan2(b.getY() - a.getY(), b.getX() - a.getX());
    double flattening = (majorAxis - minorAxis) / majorAxis;

    ImmutableSBMTEllipseRecord.Builder record = ImmutableSBMTEllipseRecord.builder()
        .id(id)
        .name(name)
        .x(origin.getX())
        .y(origin.getY())
        .z(origin.getZ())
        .lat(origin.getDelta())
        .lon(origin.getAlpha())
        .radius(origin.getNorm())
        .slope(0)
        .elevation(0)
        .acceleration(0)
        .potential(0)
        .diameter(majorAxis)
        .flattening(flattening)
        .angle(rotation)
        .color(Color.BLACK)
        .dummy("")
        .label("");
    return record.build();
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("flipX")
        .desc("If present, negate the X coordinate of the input points.")
        .build());
    options.addOption(Option.builder("flipY")
        .desc("If present, negate the Y coordinate of the input points.")
        .build());
    options.addOption(Option.builder("input")
        .required()
        .hasArg()
        .desc(
            """
            Required. Name of input file. This is a text file with a pair of pixel coordinates (X and Y) per line. The pixel
            coordinates are offsets from the image center. For example:

            # My test file

            89.6628 285.01
            97.8027 280.126
            95.0119 285.01
            -13.8299 323.616
            -1.9689 331.756
            -11.7367 330.826

            Empty lines or lines beginning with # are ignored.

            Each set of three points is used to create the SBMT structures. The first two points are the long
            axis and the third is a location for the semi-minor axis.""")
        .build());
    options.addOption(Option.builder("objFile")
        .required()
        .hasArg()
        .desc("Required. Name of OBJ shape file.")
        .build());
    options.addOption(Option.builder("output")
        .required()
        .hasArg()
        .desc("Required. Name of output file.")
        .build());
    options.addOption(Option.builder("spice")
        .hasArg()
        .desc(
            "If present, name of metakernel to read. Other required options with -spice are -date, -observer, -target, and -cameraFrame.")
        .build());
    options.addOption(Option.builder("date")
        .hasArgs()
        .desc("Only used with -spice. Date of image (e.g. 2022 SEP 26 23:11:12.649).")
        .build());
    options.addOption(Option.builder("observer")
        .hasArg()
        .desc("Only used with -spice. Observing body (e.g. DART).")
        .build());
    options.addOption(Option.builder("target")
        .hasArg()
        .desc("Only used with -spice. Target body (e.g. DIMORPHOS).")
        .build());
    options.addOption(Option.builder("cameraFrame")
        .hasArg()
        .desc("Only used with -spice. Camera frame (e.g. DART_DRACO).")
        .build());
    options.addOption(Option.builder("sumFile")
        .required()
        .hasArg()
        .desc(
            "Required. Name of sum file to read. This is still required with -spice, but only used as a template to create a new sum file.")
        .build());
    return options;
  }

  public static void main(String[] args) {
    TerrasaurTool defaultOBJ = new CreateSBMTStructure();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));

    NativeLibraryLoader.loadSpiceLibraries();
    NativeLibraryLoader.loadVtkLibraries();

    SumFile sumFile = SumFile.fromFile(new File(cl.getOptionValue("sumFile")));

    try {
      String objFile = cl.getOptionValue("objFile");
      vtkPolyData polyData = PolyDataUtil.loadShapeModel(objFile);
      if (polyData == null) {
        logger.error("Cannot read shape model {}!", objFile);
        System.exit(0);
      }
      RangeFromSumFile rfsf = new RangeFromSumFile(sumFile, polyData);

      boolean flipX = cl.hasOption("flipX");
      boolean flipY = cl.hasOption("flipY");
      List<Vector3D> intercepts = new ArrayList<>();
      List<String> lines = FileUtils.readLines(new File(cl.getOptionValue("input")), Charset.defaultCharset());
      for (String line : lines.stream()
          .filter(s -> !(s.isBlank() || s.strip().startsWith("#")))
          .toList()) {
        String[] parts = line.split("\\s+");
        int ix = (int) Math.round(Double.parseDouble(parts[0]));
        int iy = (int) Math.round(Double.parseDouble(parts[1]));

        if (flipX) ix *= -1;
        if (flipY) iy *= -1;

        Map.Entry<Long, Vector3D> entry = rfsf.findIntercept(ix, iy);
        long cellID = entry.getKey();
        if (cellID > -1) intercepts.add(entry.getValue());
      }

      logger.info("Found {} sets of points", intercepts.size() / 3);

      List<SBMTEllipseRecord> records = new ArrayList<>();
      for (int i = 0; i < intercepts.size(); i += 3) {

        // p1 and p2 define the long axis of the ellipse
        Vector3D p1 = intercepts.get(i);
        Vector3D p2 = intercepts.get(i + 1);

        // p3 lies on the short axis
        Vector3D p3 = intercepts.get(i + 2);

        SBMTEllipseRecord record = createRecord(i / 3, String.format("Ellipse %d", i / 3), p1, p2, p3);
        records.add(record);
      }

      try (PrintWriter pw = new PrintWriter(cl.getOptionValue("output"))) {
        for (SBMTEllipseRecord record : records) pw.println(record.toString());
      }
      logger.info("Wrote {}", cl.getOptionValue("output"));
    } catch (Exception e) {
      logger.error(e.getLocalizedMessage(), e);
      throw new RuntimeException(e);
    }

    logger.info("Finished");
  }
}
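createRecord in CreateSBMTStructure reduces the 3-D problem to 2-D and then fits the ellipse from the three projected points: the first two points are the long-axis endpoints, and the third fixes the semi-minor axis. The same planar arithmetic in a standalone Python sketch (the function name is illustrative, not part of Terrasaur):

```python
import math

def ellipse_from_points(a, b, c):
    """a, b: endpoints of the long axis; c: a point on the short axis.
    All points are (x, y) tuples in the ellipse plane."""
    center = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
    major_axis = math.dist(a, b)
    minor_axis = 2 * math.dist(c, center)
    rotation = math.atan2(b[1] - a[1], b[0] - a[0])
    flattening = (major_axis - minor_axis) / major_axis
    return center, major_axis, minor_axis, rotation, flattening

center, major, minor, rot, flat = ellipse_from_points((-2, 0), (2, 0), (0, 1))
print(center, major, minor, rot, flat)  # → (0.0, 0.0) 4.0 2.0 0.0 0.5
```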
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.io.File;

@@ -21,176 +43,168 @@ import terrasaur.templates.TerrasaurTool;

public class DSK2OBJ implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Create an OBJ from a DSK.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "";
    String footer = "\nCreate an OBJ from a DSK.\n";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("body")
        .hasArg()
        .desc("If present, convert shape for named body. Default is to use the first body in the DSK.")
        .build());
    options.addOption(Option.builder("dsk")
        .hasArg()
        .required()
        .desc("Required. Name of input DSK.")
        .build());
    options.addOption(Option.builder("logFile")
        .hasArg()
        .desc("If present, save screen output to log file.")
        .build());
    StringBuilder sb = new StringBuilder();
    for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
    options.addOption(Option.builder("logLevel")
        .hasArg()
        .desc("If present, print messages above selected priority. Valid values are "
            + sb.toString().trim()
            + ". Default is INFO.")
        .build());
    options.addOption(
        Option.builder("obj").hasArg().desc("Name of output OBJ.").build());
    options.addOption(Option.builder("printBodies")
        .desc("If present, print bodies and surface ids in DSK.")
        .build());
    options.addOption(Option.builder("surface")
        .hasArg()
        .desc("If present, use specified surface id. Default is to use the first surface id for the body.")
        .build());
    return options;
  }

  public static void main(String[] args) {
    TerrasaurTool defaultOBJ = new DSK2OBJ();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    System.loadLibrary("JNISpice");

    String dskName = cl.getOptionValue("dsk");
    File dskFile = new File(dskName);
    if (!dskFile.exists()) {
      logger.warn("Input DSK " + dskName + " does not exist!");
      System.exit(0);
    }

    try {
      DSK dsk = DSK.openForRead(dskName);
      Body[] bodies = dsk.getBodies();

      if (cl.hasOption("printBodies")) {
        logger.info("found bodies and surface ids:");
        for (int i = 0; i < bodies.length; i++) {
          Body b = bodies[i];
          Surface[] surfaces = dsk.getSurfaces(b);
          StringBuilder sb = new StringBuilder();
          sb.append(String.format("%d) %s", i, b.getName()));
          for (Surface s : surfaces) sb.append(String.format(" %d", s.getIDCode()));
          logger.info(sb.toString());
        }
      }

      Body b = cl.hasOption("body") ? new Body(cl.getOptionValue("body")) : bodies[0];
      boolean missingBody = true;
      for (Body body : bodies) {
|
||||||
if (b.equals(body)) {
|
|
||||||
missingBody = false;
|
|
||||||
break;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
if (missingBody) {
|
|
||||||
logger.warn(String.format("Body %s not found in DSK! Valid bodies are:", b.getName()));
|
|
||||||
for (Body body : bodies) logger.warn(body.getName());
|
|
||||||
System.exit(0);
|
|
||||||
}
|
|
||||||
|
|
||||||
Surface[] surfaces = dsk.getSurfaces(b);
|
if (cl.hasOption("printBodies")) {
|
||||||
Surface s =
|
logger.info("found bodies and surface ids:");
|
||||||
cl.hasOption("surface")
|
for (int i = 0; i < bodies.length; i++) {
|
||||||
? new Surface(Integer.parseInt(cl.getOptionValue("surface")), b)
|
Body b = bodies[i];
|
||||||
: surfaces[0];
|
Surface[] surfaces = dsk.getSurfaces(b);
|
||||||
boolean missingSurface = true;
|
StringBuilder sb = new StringBuilder();
|
||||||
for (Surface surface : surfaces) {
|
sb.append(String.format("%d) %s", i, b.getName()));
|
||||||
if (s.equals(surface)) {
|
for (Surface s : surfaces) sb.append(String.format(" %d", s.getIDCode()));
|
||||||
missingSurface = false;
|
logger.info(sb.toString());
|
||||||
break;
|
}
|
||||||
}
|
|
||||||
}
|
|
||||||
if (missingSurface) {
|
|
||||||
logger.warn(
|
|
||||||
String.format(
|
|
||||||
"Surface %d for body %s not found in DSK! Valid surfaces are:",
|
|
||||||
s.getIDCode(), b.getName()));
|
|
||||||
for (Surface surface : surfaces) logger.warn(Integer.toString(surface.getIDCode()));
|
|
||||||
System.exit(0);
|
|
||||||
}
|
|
||||||
|
|
||||||
DLADescriptor dladsc = dsk.beginBackwardSearch();
|
|
||||||
boolean found = true;
|
|
||||||
while (found) {
|
|
||||||
|
|
||||||
DSKDescriptor dskdsc = dsk.getDSKDescriptor(dladsc);
|
|
||||||
|
|
||||||
if (b.getIDCode() == dskdsc.getCenterID() && s.getIDCode() == dskdsc.getSurfaceID()) {
|
|
||||||
|
|
||||||
// number of plates and vertices
|
|
||||||
int[] np = new int[1];
|
|
||||||
int[] nv = new int[1];
|
|
||||||
CSPICE.dskz02(dsk.getHandle(), dladsc.toArray(), nv, np);
|
|
||||||
|
|
||||||
double[][] vertices = CSPICE.dskv02(dsk.getHandle(), dladsc.toArray(), 1, nv[0]);
|
|
||||||
int[][] plates = CSPICE.dskp02(dsk.getHandle(), dladsc.toArray(), 1, np[0]);
|
|
||||||
|
|
||||||
if (cl.hasOption("obj")) {
|
|
||||||
try (PrintWriter pw = new PrintWriter(cl.getOptionValue("obj"))) {
|
|
||||||
for (double[] v : vertices) {
|
|
||||||
pw.printf("v %20.16f %20.16f %20.16f\r\n", v[0], v[1], v[2]);
|
|
||||||
}
|
|
||||||
for (int[] p : plates) {
|
|
||||||
pw.printf("f %d %d %d\r\n", p[0], p[1], p[2]);
|
|
||||||
}
|
|
||||||
logger.info(
|
|
||||||
String.format(
|
|
||||||
"Wrote %d vertices and %d plates to %s for body %d surface %d",
|
|
||||||
nv[0],
|
|
||||||
np[0],
|
|
||||||
cl.getOptionValue("obj"),
|
|
||||||
dskdsc.getCenterID(),
|
|
||||||
dskdsc.getSurfaceID()));
|
|
||||||
} catch (FileNotFoundException e) {
|
|
||||||
logger.warn(e.getLocalizedMessage());
|
|
||||||
}
|
}
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
found = dsk.hasPrevious(dladsc);
|
Body b = cl.hasOption("body") ? new Body(cl.getOptionValue("body")) : bodies[0];
|
||||||
if (found) {
|
boolean missingBody = true;
|
||||||
dladsc = dsk.getPrevious(dladsc);
|
for (Body body : bodies) {
|
||||||
}
|
if (b.equals(body)) {
|
||||||
}
|
missingBody = false;
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (missingBody) {
|
||||||
|
logger.warn(String.format("Body %s not found in DSK! Valid bodies are:", b.getName()));
|
||||||
|
for (Body body : bodies) logger.warn(body.getName());
|
||||||
|
System.exit(0);
|
||||||
|
}
|
||||||
|
|
||||||
} catch (SpiceException e) {
|
Surface[] surfaces = dsk.getSurfaces(b);
|
||||||
logger.warn(e.getLocalizedMessage());
|
Surface s = cl.hasOption("surface")
|
||||||
|
? new Surface(Integer.parseInt(cl.getOptionValue("surface")), b)
|
||||||
|
: surfaces[0];
|
||||||
|
boolean missingSurface = true;
|
||||||
|
for (Surface surface : surfaces) {
|
||||||
|
if (s.equals(surface)) {
|
||||||
|
missingSurface = false;
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (missingSurface) {
|
||||||
|
logger.warn(String.format(
|
||||||
|
"Surface %d for body %s not found in DSK! Valid surfaces are:", s.getIDCode(), b.getName()));
|
||||||
|
for (Surface surface : surfaces) logger.warn(Integer.toString(surface.getIDCode()));
|
||||||
|
System.exit(0);
|
||||||
|
}
|
||||||
|
|
||||||
|
DLADescriptor dladsc = dsk.beginBackwardSearch();
|
||||||
|
boolean found = true;
|
||||||
|
while (found) {
|
||||||
|
|
||||||
|
DSKDescriptor dskdsc = dsk.getDSKDescriptor(dladsc);
|
||||||
|
|
||||||
|
if (b.getIDCode() == dskdsc.getCenterID() && s.getIDCode() == dskdsc.getSurfaceID()) {
|
||||||
|
|
||||||
|
// number of plates and vertices
|
||||||
|
int[] np = new int[1];
|
||||||
|
int[] nv = new int[1];
|
||||||
|
CSPICE.dskz02(dsk.getHandle(), dladsc.toArray(), nv, np);
|
||||||
|
|
||||||
|
double[][] vertices = CSPICE.dskv02(dsk.getHandle(), dladsc.toArray(), 1, nv[0]);
|
||||||
|
int[][] plates = CSPICE.dskp02(dsk.getHandle(), dladsc.toArray(), 1, np[0]);
|
||||||
|
|
||||||
|
if (cl.hasOption("obj")) {
|
||||||
|
try (PrintWriter pw = new PrintWriter(cl.getOptionValue("obj"))) {
|
||||||
|
for (double[] v : vertices) {
|
||||||
|
pw.printf("v %20.16f %20.16f %20.16f\r\n", v[0], v[1], v[2]);
|
||||||
|
}
|
||||||
|
for (int[] p : plates) {
|
||||||
|
pw.printf("f %d %d %d\r\n", p[0], p[1], p[2]);
|
||||||
|
}
|
||||||
|
logger.info(String.format(
|
||||||
|
"Wrote %d vertices and %d plates to %s for body %d surface %d",
|
||||||
|
nv[0],
|
||||||
|
np[0],
|
||||||
|
cl.getOptionValue("obj"),
|
||||||
|
dskdsc.getCenterID(),
|
||||||
|
dskdsc.getSurfaceID()));
|
||||||
|
} catch (FileNotFoundException e) {
|
||||||
|
logger.warn(e.getLocalizedMessage());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
found = dsk.hasPrevious(dladsc);
|
||||||
|
if (found) {
|
||||||
|
dladsc = dsk.getPrevious(dladsc);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
} catch (SpiceException e) {
|
||||||
|
logger.warn(e.getLocalizedMessage());
|
||||||
|
}
|
||||||
}
|
}
|
||||||
}
|
|
||||||
}
|
}
File diff suppressed because it is too large
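The DSK2OBJ write loop reduces to two printf formats: one `v` record per vertex and one `f` record per plate, where the SPICE `dskp02` plate indices are 1-based, matching OBJ's 1-based face indexing. A minimal, dependency-free sketch of that formatting — the class and method names here are illustrative, not part of the tool (Locale.US is used so the decimal separator is stable across platforms):

```java
import java.util.Locale;

public class ObjWriterSketch {

  // Format one OBJ vertex record, mirroring the tool's "v %20.16f %20.16f %20.16f".
  static String vertexLine(double[] v) {
    return String.format(Locale.US, "v %20.16f %20.16f %20.16f", v[0], v[1], v[2]);
  }

  // Format one OBJ face record; OBJ face indices are 1-based.
  static String faceLine(int[] p) {
    return String.format("f %d %d %d", p[0], p[1], p[2]);
  }

  public static void main(String[] args) {
    // A single triangle: three vertices, one plate referencing them by 1-based index.
    double[][] vertices = {{0, 0, 0}, {1, 0, 0}, {0, 1, 0}};
    int[][] plates = {{1, 2, 3}};
    for (double[] v : vertices) System.out.println(vertexLine(v));
    for (int[] p : plates) System.out.println(faceLine(p));
  }
}
```

Any OBJ viewer should accept this three-line mesh; the real tool simply streams the arrays returned by `dskv02` and `dskp02` through the same two formats.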
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.io.File;

@@ -21,56 +43,54 @@ import terrasaur.utils.AppVersion;

public class DumpConfig implements TerrasaurTool {

  @Override
  public String shortDescription() {
    return "Write a sample configuration file to use with Terrasaur, using defaults for DART.";
  }

  @Override
  public String fullDescription(Options options) {
    return WordUtils.wrap(
        "This program writes out sample configuration files to be used with Terrasaur. "
            + "It takes a single argument, which is the name of the directory that will contain "
            + "the configuration files to be written.",
        80);
  }

  public static void main(String[] args) {
    // if no arguments, print the usage and exit
    if (args.length == 0) {
      System.out.println(new DumpConfig().fullDescription(null));
      System.exit(0);
    }

    // if -shortDescription is specified, print short description and exit.
    for (String arg : args) {
      if (arg.equals("-shortDescription")) {
        System.out.println(new DumpConfig().shortDescription());
        System.exit(0);
      }
    }

    File path = Paths.get(args[0]).toFile();

    ConfigBlock configBlock = TerrasaurConfig.getTemplate();
    try (PrintWriter pw = new PrintWriter(path)) {
      PropertiesConfiguration config = new ConfigBlockFactory().toConfig(configBlock);
      PropertiesConfigurationLayout layout = config.getLayout();

      String now = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
          .withLocale(Locale.getDefault())
          .withZone(ZoneOffset.UTC)
          .format(Instant.now());
      layout.setHeaderComment(
          String.format("Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));

      config.write(pw);
    } catch (ConfigurationException | IOException e) {
      throw new RuntimeException(e);
    }

    System.out.println("Wrote config file to " + path.getAbsolutePath());
  }
}
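DumpConfig's header comment stamps the file with a UTC timestamp built from `DateTimeFormatter`. A standalone sketch of just that formatting, with a fixed instant so the result is reproducible (the class name is illustrative):

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class TimestampSketch {

  // Format an Instant as "yyyy-MM-dd HH:mm:ss" in UTC, the pattern DumpConfig uses.
  static String utcStamp(Instant instant) {
    return DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
        .withLocale(Locale.getDefault())
        .withZone(ZoneOffset.UTC)
        .format(instant);
  }

  public static void main(String[] args) {
    // prints "2025-04-27 12:34:56"
    System.out.println(utcStamp(Instant.parse("2025-04-27T12:34:56Z")));
  }
}
```

Attaching the zone via `withZone(ZoneOffset.UTC)` is what lets an `Instant` be formatted directly; without it the formatter has no time zone and throws.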
261
src/main/java/terrasaur/apps/FacetInfo.java
Normal file
@@ -0,0 +1,261 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.io.PrintWriter;
import java.util.Arrays;
import java.util.Map;
import java.util.NavigableSet;
import java.util.TreeSet;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.math3.geometry.euclidean.threed.SphericalCoordinates;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.immutables.value.Value;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.CellInfo;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.PolyDataStatistics;
import terrasaur.utils.PolyDataUtil;
import vtk.vtkIdList;
import vtk.vtkOBBTree;
import vtk.vtkPolyData;

public class FacetInfo implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  /**
   * This doesn't need to be private, or even declared, but you might want to if you have other
   * constructors.
   */
  private FacetInfo() {}

  @Override
  public String shortDescription() {
    return "Print info about a facet.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "Prints information about facet(s).";
    String footer =
        """

        This tool prints out facet center, normal, angle between center and
        normal, and other information about the specified facet(s).""";

    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private vtkPolyData polyData;
  private vtkOBBTree searchTree;
  private Vector3D origin;

  private FacetInfo(vtkPolyData polyData) {
    this.polyData = polyData;
    PolyDataStatistics stats = new PolyDataStatistics(polyData);
    origin = new Vector3D(stats.getCentroid());

    logger.info("Origin is at {}", origin);

    logger.info("Creating search tree");
    searchTree = new vtkOBBTree();
    searchTree.SetDataSet(polyData);
    searchTree.SetTolerance(1e-12);
    searchTree.BuildLocator();
  }

  /**
   * @param cellId id of this cell
   * @return Set of neighboring cells (ones which share a vertex with this one)
   */
  private NavigableSet<Long> neighbors(long cellId) {
    NavigableSet<Long> neighborCellIds = new TreeSet<>();

    vtkIdList vertexIdlist = new vtkIdList();
    CellInfo.getCellInfo(polyData, cellId, vertexIdlist);

    vtkIdList facetIdlist = new vtkIdList();
    for (long i = 0; i < vertexIdlist.GetNumberOfIds(); i++) {
      long vertexId = vertexIdlist.GetId(i);
      polyData.GetPointCells(vertexId, facetIdlist);
    }
    for (long i = 0; i < facetIdlist.GetNumberOfIds(); i++) {
      long id = facetIdlist.GetId(i);
      if (id == cellId) continue;
      neighborCellIds.add(id);
    }

    return neighborCellIds;
  }

  @Value.Immutable
  public abstract static class FacetInfoLine {

    public abstract long index();

    public abstract Vector3D radius();

    public abstract Vector3D normal();

    /**
     * @return facets between this and origin
     */
    public abstract NavigableSet<Long> interiorIntersections();

    /**
     * @return facets between this and infinity
     */
    public abstract NavigableSet<Long> exteriorIntersections();

    public static String getHeader() {
      return "# Index, "
          + "Center Lat (deg), "
          + "Center Lon (deg), "
          + "Radius, "
          + "Radial Vector, "
          + "Normal Vector, "
          + "Angle between radial and normal (deg), "
          + "facets between this and origin, "
          + "facets between this and infinity";
    }

    public String toCSV() {

      SphericalCoordinates spc = new SphericalCoordinates(radius());

      StringBuilder sb = new StringBuilder();

      sb.append(String.format("%d, ", index()));
      sb.append(String.format("%.4f, ", 90 - Math.toDegrees(spc.getPhi())));
      sb.append(String.format("%.4f, ", Math.toDegrees(spc.getTheta())));
      sb.append(String.format("%.6f, ", spc.getR()));
      sb.append(String.format("%.6f %.6f %.6f, ", radius().getX(), radius().getY(), radius().getZ()));
      sb.append(String.format("%.6f %.6f %.6f, ", normal().getX(), normal().getY(), normal().getZ()));
      sb.append(String.format("%.3f, ", Math.toDegrees(Vector3D.angle(radius(), normal()))));
      sb.append(String.format("%d", interiorIntersections().size()));
      if (!interiorIntersections().isEmpty()) {
        for (long id : interiorIntersections()) sb.append(String.format(" %d", id));
      }
      sb.append(", ");
      sb.append(String.format("%d", exteriorIntersections().size()));
      if (!exteriorIntersections().isEmpty()) {
        for (long id : exteriorIntersections()) sb.append(String.format(" %d", id));
      }
      return sb.toString();
    }
  }

  private FacetInfoLine getFacetInfoLine(long cellId) {
    CellInfo ci = CellInfo.getCellInfo(polyData, cellId, new vtkIdList());

    vtkIdList cellIds = new vtkIdList();
    searchTree.IntersectWithLine(origin.toArray(), ci.center().toArray(), null, cellIds);

    // count up all crossings of the surface between the origin and the facet.
    NavigableSet<Long> insideIds = new TreeSet<>();
    for (long j = 0; j < cellIds.GetNumberOfIds(); j++) {
      if (cellIds.GetId(j) == cellId) continue;
      insideIds.add(cellIds.GetId(j));
    }

    Vector3D infinity = ci.center().scalarMultiply(1e9);

    cellIds = new vtkIdList();
    searchTree.IntersectWithLine(infinity.toArray(), ci.center().toArray(), null, cellIds);

    // count up all crossings of the surface between infinity and the facet.
    NavigableSet<Long> outsideIds = new TreeSet<>();
    for (long j = 0; j < cellIds.GetNumberOfIds(); j++) {
      if (cellIds.GetId(j) == cellId) continue;
      outsideIds.add(cellIds.GetId(j));
    }

    return ImmutableFacetInfoLine.builder()
        .index(cellId)
        .radius(ci.center())
        .normal(ci.normal())
        .interiorIntersections(insideIds)
        .exteriorIntersections(outsideIds)
        .build();
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("facet")
        .required()
        .hasArgs()
        .desc("Facet(s) to query. Separate multiple indices with whitespace.")
        .build());
    options.addOption(Option.builder("obj")
        .required()
        .hasArg()
        .desc("Shape model to validate.")
        .build());
    options.addOption(Option.builder("output")
        .required()
        .hasArg()
        .desc("CSV file to write.")
        .build());
    return options;
  }

  public static void main(String[] args) {
    TerrasaurTool defaultOBJ = new FacetInfo();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));

    NativeLibraryLoader.loadVtkLibraries();

    try {
      vtkPolyData polydata = PolyDataUtil.loadShapeModel(cl.getOptionValue("obj"));
      FacetInfo app = new FacetInfo(polydata);
      try (PrintWriter pw = new PrintWriter(cl.getOptionValue("output"))) {
        pw.println(FacetInfoLine.getHeader());
        for (long cellId : Arrays.stream(cl.getOptionValues("facet"))
            .mapToLong(Long::parseLong)
            .toArray()) {
          pw.println(app.getFacetInfoLine(cellId).toCSV());

          NavigableSet<Long> neighbors = app.neighbors(cellId);
          for (long neighborCellId : neighbors)
            pw.println(app.getFacetInfoLine(neighborCellId).toCSV());
        }
      }
      logger.info("Wrote {}", cl.getOptionValue("output"));
    } catch (Exception e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    logger.info("Finished.");
  }
}
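FacetInfo's CSV line converts the facet-center vector to latitude and longitude (commons-math's `SphericalCoordinates` returns the colatitude φ, hence the `90 - toDegrees(phi)`), and reports the angle between the radial and normal vectors. A dependency-free sketch of those conversions — method names here are illustrative, not from the tool:

```java
public class FacetMathSketch {

  // Latitude in degrees from a body-fixed vector: 90 minus the colatitude.
  static double latitudeDeg(double x, double y, double z) {
    double r = Math.sqrt(x * x + y * y + z * z);
    return 90.0 - Math.toDegrees(Math.acos(z / r));
  }

  // Longitude in degrees, measured from +X toward +Y.
  static double longitudeDeg(double x, double y) {
    return Math.toDegrees(Math.atan2(y, x));
  }

  // Angle in degrees between two vectors, e.g. facet radial and normal.
  static double angleDeg(double[] a, double[] b) {
    double dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    double na = Math.sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2]);
    double nb = Math.sqrt(b[0] * b[0] + b[1] * b[1] + b[2] * b[2]);
    return Math.toDegrees(Math.acos(dot / (na * nb)));
  }

  public static void main(String[] args) {
    System.out.println(latitudeDeg(0, 0, 1));  // north pole: 90 degrees
    System.out.println(longitudeDeg(0, 1));    // +Y axis: 90 degrees
    System.out.println(angleDeg(new double[] {1, 0, 0}, new double[] {0, 1, 0}));
  }
}
```

A large radial/normal angle is the same signal ValidateNormals' `-fast` option keys on: a normal pointing away from the center vector suggests an overhang or a flipped facet.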
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.io.File;

@@ -156,24 +178,24 @@ import vtk.vtkPolyData;
 */
public class GetSpots implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  private GetSpots() {}

  @Override
  public String shortDescription() {
    return "find relevant OSIRIS-REx data for assigning values to facets in an OBJ file.";
  }

  @Override
  public String fullDescription(Options options) {
    String header =
        """
        This program identifies those times when the boresight of instrumenttype intersects the surface
        of Bennu less than a specified distance from the center of individual facets in shape model.
        """;
    String footer =
        """
        All output is written to standard output in the following format:
        F1
        sclkvalue dist pos frac flag inc ems phs [otherdata]
@@ -203,424 +225,403 @@ public class GetSpots implements TerrasaurTool {
        phs is the phase angle in degrees
        [otherdata] is all textual data on the line following the sclk value, up to the linefeed, unchanged.
        """;
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private String spicemetakernel;
  private String objfile;
  private String instrument;
  private String sclkfile;
  private int debugLevel;
  private double maxdist;
  private vtkPolyData polydata;
  private SmallBodyModel smallBodyModel;
  private HashMap<Integer, String> coverageMap;

  private int SC_ID;
  private String SC_ID_String;
  private ReferenceFrame BodyFixed;
  private String TARGET;
  private final Vector3 NORTH = new Vector3(0, 0, 1e6);
  private int instID;

  private PrintStream outputStream;

  public GetSpots(
      String spicemetakernel,
      String objfile,
      String instrument,
      String sclkfile,
      double maxdist,
      int debugLevel) {
    this.spicemetakernel = spicemetakernel;
    this.objfile = objfile;
    this.instrument = instrument;
    this.sclkfile = sclkfile;
    this.maxdist = maxdist;
    this.debugLevel = debugLevel;

    coverageMap = new HashMap<>();

    outputStream = System.out;
  }

  /**
   * Find facets covered by the FOV of the instrument. For each facet, find the distance and
   * position angle of the instrument boresight and the fraction of the facet covered by the FOV.
   *
   * @param line String where the first "word" is an sclk time
   * @throws SpiceException
   */
  private void findCoverage(String line) throws SpiceException {
    String[] parts = line.split(" ");
    if (parts.length == 0) return;

    SCLKTime sclkTime = new SCLKTime(new SCLK(SC_ID), parts[0]);
    TDBTime tdbTime = new TDBTime(sclkTime.getTDBSeconds());

    Instrument instrument = new Instrument(instID);
    FOV fov = new FOV(instrument);
    Matrix33 instrToBodyFixed =
        fov.getReferenceFrame().getPositionTransformation(BodyFixed, tdbTime);
    Vector3 bsightBodyFixed = instrToBodyFixed.mxv(fov.getBoresight());

    StateRecord sr =
        new StateRecord(
            new Body(SC_ID_String),
            tdbTime,
            BodyFixed,
            new AberrationCorrection("LT+S"),
            new Body(TARGET));
    Vector3 scposBodyFixed = sr.getPosition();

    PositionVector sunPos =
        new StateRecord(
            new Body("SUN"),
            tdbTime,
            BodyFixed,
            new AberrationCorrection("LT+S"),
            new Body(TARGET))
        .getPosition();

    double[] double3 = new double[3];
    long cellID =
        smallBodyModel.computeRayIntersection(
            scposBodyFixed.toArray(), bsightBodyFixed.hat().toArray(), double3);
    if (cellID == -1) return; // no boresight intersection

    Vector3 bsightIntersectVector = new Vector3(double3);

    if (debugLevel > 1) {
      LatitudinalCoordinates lc = new LatitudinalCoordinates(bsightIntersectVector);
      System.out.printf(
          "# %s %f %f %s\n",
          sclkTime,
          Math.toDegrees(lc.getLatitude()),
          Math.toDegrees(lc.getLongitude()),
          bsightIntersectVector);
    }

    // flag is 1 if any portion of the spot does not intersect the surface
    int flag = 0;
    Vector<Vector3> boundaryBodyFixed = new Vector<>();
    if (fov.getShape().equals("RECTANGLE") || fov.getShape().equals("POLYGON")) {
      for (Vector3 boundary : fov.getBoundary()) {
        boundaryBodyFixed.add(instrToBodyFixed.mxv(boundary));
      }
    } else if (fov.getShape().equals("CIRCLE")) {
      // bounds contains a single vector parallel to a ray that lies in the cone
      // that makes up the boundary of the FOV
      Vector3[] bounds = fov.getBoundary();

      for (int i = 0; i < 8; i++) {
        // not ideal, but check every 45 degrees along the perimeter of the circle
        // for intersection with the surface
        Matrix33 rotateAlongPerimeter = new Matrix33(fov.getBoresight(), i * Math.toRadians(45));
        Vector3 perimeterVector = rotateAlongPerimeter.mxv(bounds[0]);
        boundaryBodyFixed.add(instrToBodyFixed.mxv(perimeterVector));
      }
    } else {
      // TODO: add ELLIPSE
|
||||||
System.err.printf(
|
String spicemetakernel,
|
||||||
"Instrument %s: Unsupported FOV shape %s\n", instrument.getName(), fov.getShape());
|
String objfile,
|
||||||
System.exit(0);
|
String instrument,
|
||||||
|
String sclkfile,
|
||||||
|
double maxdist,
|
||||||
|
int debugLevel) {
|
||||||
|
this.spicemetakernel = spicemetakernel;
|
||||||
|
this.objfile = objfile;
|
||||||
|
this.instrument = instrument;
|
||||||
|
this.sclkfile = sclkfile;
|
||||||
|
this.maxdist = maxdist;
|
||||||
|
this.debugLevel = debugLevel;
|
||||||
|
|
||||||
|
coverageMap = new HashMap<>();
|
||||||
|
|
||||||
|
outputStream = System.out;
|
||||||
}
|
}
|
||||||
|
|
||||||
// check all of the boundary vectors for surface intersections
|
/**
|
||||||
for (Vector3 vector : boundaryBodyFixed) {
|
* Find facets covered by the FOV of the instrument. For each facet, find the distance and
|
||||||
cellID =
|
* position angle of the instrument boresight and the fraction of the facet covered by the FOV.
|
||||||
smallBodyModel.computeRayIntersection(
|
*
|
||||||
scposBodyFixed.toArray(), vector.hat().toArray(), double3);
|
* @param line String where the first "word" is an sclk time
|
||||||
if (cellID == -1) {
|
* @throws SpiceException
|
||||||
flag = 1;
|
*/
|
||||||
break;
|
private void findCoverage(String line) throws SpiceException {
|
||||||
}
|
String[] parts = line.split(" ");
|
||||||
}
|
if (parts.length == 0) return;
|
||||||
|
|
||||||
vtkIdList idList = new vtkIdList();
|
SCLKTime sclkTime = new SCLKTime(new SCLK(SC_ID), parts[0]);
|
||||||
for (int i = 0; i < polydata.GetNumberOfCells(); ++i) {
|
TDBTime tdbTime = new TDBTime(sclkTime.getTDBSeconds());
|
||||||
|
|
||||||
polydata.GetCellPoints(i, idList);
|
Instrument instrument = new Instrument(instID);
|
||||||
double[] pt0 = polydata.GetPoint(idList.GetId(0));
|
FOV fov = new FOV(instrument);
|
||||||
double[] pt1 = polydata.GetPoint(idList.GetId(1));
|
Matrix33 instrToBodyFixed = fov.getReferenceFrame().getPositionTransformation(BodyFixed, tdbTime);
|
||||||
double[] pt2 = polydata.GetPoint(idList.GetId(2));
|
Vector3 bsightBodyFixed = instrToBodyFixed.mxv(fov.getBoresight());
|
||||||
|
|
||||||
CellInfo ci = CellInfo.getCellInfo(polydata, i, idList);
|
StateRecord sr = new StateRecord(
|
||||||
Vector3 facetNormal = MathConversions.toVector3(ci.normal());
|
new Body(SC_ID_String), tdbTime, BodyFixed, new AberrationCorrection("LT+S"), new Body(TARGET));
|
||||||
Vector3 facetCenter = MathConversions.toVector3(ci.center());
|
Vector3 scposBodyFixed = sr.getPosition();
|
||||||
|
|
||||||
// check that facet faces the observer
|
PositionVector sunPos = new StateRecord(
|
||||||
Vector3 facetToSC = scposBodyFixed.sub(facetCenter);
|
new Body("SUN"), tdbTime, BodyFixed, new AberrationCorrection("LT+S"), new Body(TARGET))
|
||||||
double emission = facetToSC.sep(facetNormal);
|
.getPosition();
|
||||||
if (emission > Math.PI / 2) continue;
|
|
||||||
|
|
||||||
double dist =
|
double[] double3 = new double[3];
|
||||||
findDist(scposBodyFixed, bsightIntersectVector, facetCenter) * 1e3; // milliradians
|
long cellID = smallBodyModel.computeRayIntersection(
|
||||||
if (dist < maxdist) {
|
scposBodyFixed.toArray(), bsightBodyFixed.hat().toArray(), double3);
|
||||||
|
if (cellID == -1) return; // no boresight intersection
|
||||||
|
|
||||||
Vector3 facetToSun = sunPos.sub(facetCenter);
|
Vector3 bsightIntersectVector = new Vector3(double3);
|
||||||
double incidence = facetToSun.sep(facetNormal);
|
|
||||||
double phase = facetToSun.sep(facetToSC);
|
|
||||||
|
|
||||||
Vector3 pt0v = new Vector3(pt0);
|
if (debugLevel > 1) {
|
||||||
Vector3 pt1v = new Vector3(pt1);
|
LatitudinalCoordinates lc = new LatitudinalCoordinates(bsightIntersectVector);
|
||||||
Vector3 pt2v = new Vector3(pt2);
|
System.out.printf(
|
||||||
|
"# %s %f %f %s\n",
|
||||||
Vector3 span1 = pt1v.sub(pt0v);
|
|
||||||
Vector3 span2 = pt2v.sub(pt0v);
|
|
||||||
|
|
||||||
Plane facetPlane = new Plane(pt0v, span1, span2);
|
|
||||||
Vector3 localNorth = facetPlane.project(NORTH).sub(facetCenter);
|
|
||||||
Vector3 bsightIntersectProjection =
|
|
||||||
facetPlane.project(bsightIntersectVector).sub(facetCenter);
|
|
||||||
|
|
||||||
// 0 = North, 90 = East
|
|
||||||
double pos =
|
|
||||||
Math.toDegrees(Math.acos(localNorth.hat().dot(bsightIntersectProjection.hat())));
|
|
||||||
if (localNorth.cross(bsightIntersectProjection).dot(facetNormal) > 0) pos = 360 - pos;
|
|
||||||
|
|
||||||
int nCovered = 0;
|
|
||||||
if (SPICEUtil.isInFOV(fov, instrToBodyFixed.mtxv(pt0v.sub(scposBodyFixed)))) nCovered++;
|
|
||||||
if (SPICEUtil.isInFOV(fov, instrToBodyFixed.mtxv(pt1v.sub(scposBodyFixed)))) nCovered++;
|
|
||||||
if (SPICEUtil.isInFOV(fov, instrToBodyFixed.mtxv(pt2v.sub(scposBodyFixed)))) nCovered++;
|
|
||||||
double frac;
|
|
||||||
if (nCovered == 3) {
|
|
||||||
frac = 1;
|
|
||||||
} else {
|
|
||||||
final double sep012 = span1.negate().sep(pt2v.sub(pt1v)); // angle at vertex 1
|
|
||||||
final double sep021 = span2.negate().sep(pt1v.sub(pt2v)); // angle at vertex 2
|
|
||||||
|
|
||||||
// check 0.5*nPts^2 points if they fall in FOV
|
|
||||||
int nPts = 50;
|
|
||||||
Vector<Vector3> pointsInFacet = new Vector<>();
|
|
||||||
for (int ii = 0; ii < nPts; ii++) {
|
|
||||||
Vector3 x = pt0v.add(span1.scale(ii / (nPts - 1.)));
|
|
||||||
for (int jj = 0; jj < nPts; jj++) {
|
|
||||||
Vector3 y = x.add(span2.scale(jj / (nPts - 1.)));
|
|
||||||
|
|
||||||
// if outside the facet, angle 01y will be larger than angle 012
|
|
||||||
if (span1.negate().sep(y.sub(pt1v)) > sep012) continue;
|
|
||||||
// if outside the facet, angle 02y will be larger than angle 021
|
|
||||||
if (span2.negate().sep(y.sub(pt2v)) > sep021) continue;
|
|
||||||
pointsInFacet.add(instrToBodyFixed.mtxv(y.sub(scposBodyFixed)));
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
if (pointsInFacet.isEmpty()) {
|
|
||||||
frac = 0;
|
|
||||||
} else {
|
|
||||||
List<Boolean> isInFOV = SPICEUtil.isInFOV(fov, pointsInFacet);
|
|
||||||
nCovered = 0;
|
|
||||||
for (boolean b : isInFOV) if (b) nCovered++;
|
|
||||||
frac = ((double) nCovered) / pointsInFacet.size();
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
StringBuilder output =
|
|
||||||
new StringBuilder(
|
|
||||||
String.format(
|
|
||||||
"%s %.4f %5.1f %.1f %d %.1f %.1f %.1f",
|
|
||||||
sclkTime,
|
sclkTime,
|
||||||
dist,
|
Math.toDegrees(lc.getLatitude()),
|
||||||
pos,
|
Math.toDegrees(lc.getLongitude()),
|
||||||
frac * 100,
|
bsightIntersectVector);
|
||||||
flag,
|
|
||||||
Math.toDegrees(incidence),
|
|
||||||
Math.toDegrees(emission),
|
|
||||||
Math.toDegrees(phase)));
|
|
||||||
for (int j = 1; j < parts.length; j++) output.append(String.format(" %s", parts[j]));
|
|
||||||
output.append("\n");
|
|
||||||
String coverage = coverageMap.get(i);
|
|
||||||
if (coverage == null) {
|
|
||||||
coverageMap.put(i, output.toString());
|
|
||||||
} else {
|
|
||||||
coverage += output;
|
|
||||||
coverageMap.put(i, coverage);
|
|
||||||
}
|
}
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
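The coverage-fraction sampling in findCoverage walks a grid over the parallelogram spanned by two triangle edges, keeps only the samples inside the facet, and counts how many pass the FOV test. A minimal 2D sketch of the same idea, with the SPICE FOV membership test replaced by a simple disk test; the class name `FacetCoverage`, the disk FOV, and the `u + v <= 1` point-in-triangle check (in place of the angle comparisons used above) are simplifications of this example, not the tool's code:

```java
public class FacetCoverage {

  // Estimated fraction of the 2D triangle (p0, p1, p2) covered by a disk of
  // radius r centered at c. Same sampling idea as findCoverage(): walk an
  // nPts x nPts grid over the parallelogram spanned by (p1 - p0) and (p2 - p0),
  // keep only grid points inside the triangle, and count those "in the FOV".
  static double coveredFraction(
      double[] p0, double[] p1, double[] p2, double[] c, double r, int nPts) {
    double[] span1 = {p1[0] - p0[0], p1[1] - p0[1]};
    double[] span2 = {p2[0] - p0[0], p2[1] - p0[1]};
    int inTriangle = 0, inFov = 0;
    for (int i = 0; i < nPts; i++) {
      for (int j = 0; j < nPts; j++) {
        double u = i / (nPts - 1.), v = j / (nPts - 1.);
        if (u + v > 1) continue; // outside the triangular half of the parallelogram
        double x = p0[0] + u * span1[0] + v * span2[0];
        double y = p0[1] + u * span1[1] + v * span2[1];
        inTriangle++;
        double dx = x - c[0], dy = y - c[1];
        if (dx * dx + dy * dy <= r * r) inFov++;
      }
    }
    return inTriangle == 0 ? 0 : ((double) inFov) / inTriangle;
  }

  public static void main(String[] args) {
    double[] p0 = {0, 0}, p1 = {1, 0}, p2 = {0, 1};
    // disk far larger than the triangle: every sample is covered
    System.out.printf("%.2f%n", coveredFraction(p0, p1, p2, new double[] {0, 0}, 10, 50));
    // disk that misses the triangle entirely
    System.out.printf("%.2f%n", coveredFraction(p0, p1, p2, new double[] {5, 5}, 1, 50));
  }
}
```

As in the tool, the fraction is the ratio of covered samples to samples that actually fall inside the facet, not to the full grid.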
  /**
   * Find the angular distance between pt1 and pt2 as seen from scPos. All coordinates are in the
   * body fixed frame.
   *
   * @param scPos Spacecraft position
   * @param pt1 Point 1
   * @param pt2 Point 2
   * @return distance between pt1 and pt2 in radians.
   */
  private double findDist(Vector3 scPos, Vector3 pt1, Vector3 pt2) {
    Vector3 scToPt1 = pt1.sub(scPos).hat();
    Vector3 scToPt2 = pt2.sub(scPos).hat();

    return Math.acos(scToPt1.dot(scToPt2));
  }
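findDist reduces to the separation angle between two unit look vectors. A self-contained sketch using plain double[] arrays in place of the JNISpice Vector3 class; the `AngularDistance` class and its array helpers are assumptions of this example:

```java
public class AngularDistance {

  // Angular separation, in radians, between pt1 and pt2 as seen from scPos.
  // Mirrors findDist(): form the two look vectors, normalize, acos of the dot.
  static double findDist(double[] scPos, double[] pt1, double[] pt2) {
    double[] a = unit(sub(pt1, scPos));
    double[] b = unit(sub(pt2, scPos));
    double dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    // clamp to guard against rounding pushing |dot| slightly past 1
    return Math.acos(Math.max(-1, Math.min(1, dot)));
  }

  static double[] sub(double[] u, double[] v) {
    return new double[] {u[0] - v[0], u[1] - v[1], u[2] - v[2]};
  }

  static double[] unit(double[] u) {
    double n = Math.sqrt(u[0] * u[0] + u[1] * u[1] + u[2] * u[2]);
    return new double[] {u[0] / n, u[1] / n, u[2] / n};
  }

  public static void main(String[] args) {
    double[] sc = {0, 0, 0};
    // from the origin, +x and +y are 90 degrees apart; +x and (1,1,0) are 45
    System.out.printf("%.1f\n",
        Math.toDegrees(findDist(sc, new double[] {1, 0, 0}, new double[] {0, 1, 0})));
    System.out.printf("%.1f\n",
        Math.toDegrees(findDist(sc, new double[] {1, 0, 0}, new double[] {1, 1, 0})));
  }
}
```

The clamp before Math.acos is a small hardening the original does not need because Vector3.hat() keeps the vectors well normalized.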
  public void printMap() {
    if (debugLevel > 0) {
      for (int i = 0; i < polydata.GetNumberOfCells(); ++i) {
        outputStream.printf("F%d\n", i + 1);
        String output = coverageMap.get(i);
        if (output != null) outputStream.print(coverageMap.get(i));
      }
    } else {
      List<Integer> list = new ArrayList<>(coverageMap.keySet());
      Collections.sort(list);
      for (Integer i : list) {
        outputStream.printf("F%d\n", i + 1);
        outputStream.print(coverageMap.get(i));
      }
    }
    outputStream.println("END");
  }
  public void process() throws Exception {
    boolean useNEAR = false;
    if (instrument.equalsIgnoreCase("OLA_LOW")) {
      // instID = -64400; // ORX_OLA_BASE
      // instID = -64401; // ORX_OLA_ART
      instID = -64403; // ORX_OLA_LOW
    }
    if (instrument.equalsIgnoreCase("OLA_HIGH")) {
      instID = -64402; // ORX_OLA_HIGH
    } else if (instrument.equalsIgnoreCase("OTES")) {
      instID = -64310; // ORX_OTES
    } else if (instrument.equalsIgnoreCase("OVIRS_SCI")) {
      // instID = -64320; // ORX_OVIRS <- no instrument kernel for this
      instID = -64321; // ORX_OVIRS_SCI
      // instID = -64322; // ORX_OVIRS_SUN
    } else if (instrument.equalsIgnoreCase("REXIS")) {
      instID = -64330; // ORX_REXIS
    } else if (instrument.equalsIgnoreCase("REXIS_SXM")) {
      instID = -64340; // ORX_REXIS_SXM
    } else if (instrument.equalsIgnoreCase("POLYCAM")) {
      instID = -64360; // ORX_OCAMS_POLYCAM
    } else if (instrument.equalsIgnoreCase("MAPCAM")) {
      instID = -64361; // ORX_OCAMS_MAPCAM
    } else if (instrument.equalsIgnoreCase("SAMCAM")) {
      instID = -64362; // ORX_OCAMS_SAMCAM
    } else if (instrument.equalsIgnoreCase("NAVCAM")) {
      // instID = -64070; // ORX_NAVCAM <- no frame kernel for this
      // instID = -64081; // ORX_NAVCAM1 <- no instrument kernel for this
      instID = -64082; // ORX_NAVCAM2 <- no instrument kernel for this
    } else if (instrument.equalsIgnoreCase("NIS_RECT")) {
      useNEAR = true;
      // instID = -93021;
      instID = -93023; // relative to NEAR_NIS_BASE
    } else if (instrument.equalsIgnoreCase("NIS_SQUARE")) {
      useNEAR = true;
      // instID = -93022;
      instID = -93024; // relative to NEAR_NIS_BASE
    }

    NativeLibraryLoader.loadVtkLibraries();
    polydata = PolyDataUtil.loadShapeModelAndComputeNormals(objfile);
    smallBodyModel = new SmallBodyModel(polydata);

    NativeLibraryLoader.loadSpiceLibraries();
    CSPICE.furnsh(spicemetakernel);

    if (useNEAR) {
      SC_ID = -93;
      SC_ID_String = "-93"; // "NEAR";
      TARGET = "2000433";
      BodyFixed = new ReferenceFrame("IAU_EROS");
    } else {
      SC_ID = -64;
      SC_ID_String = "-64"; // "ORX_SPACECRAFT";
      TARGET = "2101955";
      BodyFixed = new ReferenceFrame("IAU_BENNU");
    }

    List<String> sclkLines = FileUtils.readLines(new File(sclkfile), Charset.defaultCharset());
    boolean foundBegin = false;
    for (String line : sclkLines) {
      String trimLine = line.trim();
      if (trimLine.startsWith("#")) continue;
      if (trimLine.startsWith("BEGIN")) {
        foundBegin = true;
        continue;
      }
      if (foundBegin && !trimLine.startsWith("END")) {
        findCoverage(trimLine);
      }
    }
  }
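process() consumes the sclk file with a small BEGIN/END state machine: comments are skipped, and only lines between the BEGIN and END markers are fed to findCoverage. A standalone sketch of that parsing, which stops at the first END as the -sclk option text describes; the `SclkFileParser` class name is invented for this example:

```java
import java.util.ArrayList;
import java.util.List;

public class SclkFileParser {

  // Extract the sclk entries between BEGIN and END, skipping '#' comments.
  // Nothing before BEGIN or after END is used.
  static List<String> parse(List<String> lines) {
    List<String> result = new ArrayList<>();
    boolean foundBegin = false;
    for (String line : lines) {
      String trimLine = line.trim();
      if (trimLine.startsWith("#")) continue;
      if (trimLine.startsWith("BEGIN")) {
        foundBegin = true;
        continue;
      }
      if (trimLine.startsWith("END")) break;
      if (foundBegin && !trimLine.isEmpty()) result.add(trimLine);
    }
    return result;
  }

  public static void main(String[] args) {
    List<String> lines =
        List.of("# comment", "ignored", "BEGIN", "3/605862045.24157", "END", "ignored too");
    System.out.println(parse(lines));
  }
}
```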
  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("spice").required().hasArg().desc("SPICE metakernel").build());
    options.addOption(Option.builder("obj").required().hasArg().desc("Shape file").build());
    options.addOption(
        Option.builder("instype")
            .required()
            .hasArg()
            .desc(
                "one of OLA_LOW, OLA_HIGH, OTES, OVIRS_SCI, REXIS, REXIS_SXM, POLYCAM, MAPCAM, SAMCAM, or NAVCAM")
            .build());
    options.addOption(
        Option.builder("sclk")
            .required()
            .hasArg()
            .desc(
                """
                file containing sclk values for instrument observation times. All values between the strings BEGIN and END will be processed.
                For example:
                BEGIN
                3/605862045.24157
                END""")
            .build());
    options.addOption(
        Option.builder("maxdist")
            .required()
            .hasArg()
            .desc("maximum distance of boresight from facet center in milliradians")
            .build());
    options.addOption(
        Option.builder("all-facets")
            .desc(
                "Optional. If present, entries for all facets will be output, even if there is no intersection.")
            .build());
    options.addOption(
        Option.builder("verbose")
            .hasArg()
            .desc(
                "Optional. A level of 1 is equivalent to -all-facets. A level of 2 or higher will print out the boresight intersection position at each sclk.")
            .build());
    return options;
  }
  public static void main(String[] args) {

    TerrasaurTool defaultOBJ = new GetSpots();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    String spicemetakernel = cl.getOptionValue("spice");
    String objfile = cl.getOptionValue("obj");
    String instrumenttype = cl.getOptionValue("instype");
    String sclkfile = cl.getOptionValue("sclk");
    double distance = Double.parseDouble(cl.getOptionValue("maxdist"));
    int debugLevel = Integer.parseInt(cl.getOptionValue("verbose", "0"));
    if (cl.hasOption("all-facets")) debugLevel = debugLevel == 0 ? 1 : debugLevel + 1;

    GetSpots gs =
        new GetSpots(spicemetakernel, objfile, instrumenttype, sclkfile, distance, debugLevel);
    try {
      gs.process();
      gs.printMap();
    } catch (Exception e) {
      logger.error(e.getLocalizedMessage(), e);
    }
  }
}
File diff suppressed because it is too large
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.io.BufferedInputStream;
@@ -35,492 +57,470 @@ import vtk.vtkPolyDataWriter;
public class PointCloudFormatConverter implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Convert an input point cloud to a new format.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "";
    String footer =
        """
        This program converts an input point cloud to a new format.

        Supported input formats are ASCII, BIN3 (x,y,z), BIN4 (x, y, z, w), BIN7 (t, x, y, z, s/c x, y, z), FITS, ICQ, OBJ, PLY, and VTK. Supported output formats are ASCII, BIN3, OBJ, and VTK.

        ASCII format is white spaced delimited x y z coordinates. BINARY files must contain double precision x y z coordinates.""";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private FORMATS inFormat;
  private FORMATS outFormat;
  private vtkPoints pointsXYZ;
  // private List<Double> receivedIntensity;
  private vtkPolyData polyData;
  private Vector3 center;
  private int halfSize;
  private double groundSampleDistance;
  private double clip;
  private String additionalGMTArgs;
  private double mapRadius;

  public static vtkPoints readPointCloud(String filename) {
    PointCloudFormatConverter pcfc = new PointCloudFormatConverter(filename, FORMATS.VTK);
    pcfc.read(filename, false);
    return pcfc.getPoints();
  }

  private PointCloudFormatConverter() {}

  public PointCloudFormatConverter(FORMATS inFormat, String outFilename) {
    this(inFormat, FORMATS.formatFromExtension(outFilename));
  }

  public PointCloudFormatConverter(String inFilename, FORMATS outFormat) {
    this(FORMATS.formatFromExtension(inFilename), outFormat);
  }

  public PointCloudFormatConverter(String inFilename, String outFilename) {
    this(FORMATS.formatFromExtension(inFilename), FORMATS.formatFromExtension(outFilename));
  }

  public PointCloudFormatConverter(FORMATS inFormat, FORMATS outFormat) {
    this.inFormat = inFormat;
    this.outFormat = outFormat;
    this.pointsXYZ = new vtkPoints();
    this.polyData = null;

    this.center = null;
    this.mapRadius = Math.sqrt(2);
    this.halfSize = -1;
    this.groundSampleDistance = -1;
    this.clip = 1;
    this.additionalGMTArgs = "";
  }
||||||
public PointCloudFormatConverter setPoints(vtkPoints pointsXYZ) {
|
|
||||||
this.pointsXYZ = pointsXYZ;
|
|
||||||
return this;
|
|
||||||
}
|
|
||||||
|
|
||||||
public vtkPoints getPoints() {
|
|
||||||
return pointsXYZ;
|
|
||||||
}
|
|
||||||
|
|
||||||
public void setClip(Double clip) {
|
|
||||||
this.clip = clip;
|
|
||||||
}
|
|
||||||
|
|
||||||
public void setCenter(double[] centerPt) {
|
|
||||||
center = new Vector3(centerPt);
|
|
||||||
}
|
|
||||||
|
|
||||||
public void setMapRadius(double mapRadius) {
|
|
||||||
this.mapRadius = mapRadius;
|
|
||||||
}
|
|
||||||
|
|
||||||
public PointCloudFormatConverter setHalfSize(int halfSize) {
|
|
||||||
this.halfSize = halfSize;
|
|
||||||
return this;
|
|
||||||
}
|
|
||||||
|
|
||||||
public PointCloudFormatConverter setGroundSampleDistance(double groundSampleDistance) {
|
|
||||||
this.groundSampleDistance = groundSampleDistance;
|
|
||||||
return this;
|
|
||||||
}
|
|
||||||
|
|
||||||
public PointCloudFormatConverter setGMTArgs(String args) {
|
|
||||||
this.additionalGMTArgs = args;
|
|
||||||
return this;
|
|
||||||
}
|
|
||||||
|
|
||||||
public void read(String inFile, boolean inLLR) {
|
|
||||||
switch (inFormat) {
|
|
||||||
case ASCII:
|
|
||||||
try (BufferedReader br = new BufferedReader(new FileReader(inFile))) {
|
|
||||||
String line = br.readLine();
|
|
||||||
while (line != null) {
|
|
||||||
line = line.trim();
|
|
||||||
if (!line.isEmpty() && !line.startsWith("#")) {
|
|
||||||
String[] parts = line.split("\\s+");
|
|
||||||
if (inLLR) {
|
|
||||||
double lon = Math.toRadians(Double.parseDouble(parts[0].trim()));
|
|
||||||
double lat = Math.toRadians(Double.parseDouble(parts[1].trim()));
|
|
||||||
double range = Double.parseDouble(parts[2].trim());
|
|
||||||
double[] xyz = new Vector3D(lon, lat).scalarMultiply(range).toArray();
|
|
||||||
pointsXYZ.InsertNextPoint(xyz);
|
|
||||||
} else {
|
|
||||||
double[] xyz = new double[3];
|
|
||||||
xyz[0] = Double.parseDouble(parts[0].trim());
|
|
||||||
xyz[1] = Double.parseDouble(parts[1].trim());
|
|
||||||
xyz[2] = Double.parseDouble(parts[2].trim());
|
|
||||||
pointsXYZ.InsertNextPoint(xyz);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
line = br.readLine();
|
|
||||||
}
|
|
||||||
} catch (IOException e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
|
||||||
}
|
|
||||||
break;
|
|
||||||
case BIN3:
|
|
||||||
case BIN4:
|
|
||||||
case BIN7:
|
|
||||||
try (DataInputStream dis =
|
|
||||||
new DataInputStream(new BufferedInputStream(new FileInputStream(inFile)))) {
|
|
||||||
while (dis.available() > 0) {
|
|
||||||
if (inFormat == FORMATS.BIN7) {
|
|
||||||
// skip time field
|
|
||||||
BinaryUtils.readDoubleAndSwap(dis);
|
|
||||||
}
|
|
||||||
if (inLLR) {
|
|
||||||
double lon = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis));
|
|
||||||
double lat = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis));
|
|
||||||
double range = BinaryUtils.readDoubleAndSwap(dis);
|
|
||||||
double[] xyz = new Vector3D(lon, lat).scalarMultiply(range).toArray();
|
|
||||||
pointsXYZ.InsertNextPoint(xyz);
|
|
||||||
} else {
|
|
||||||
double[] xyz = new double[3];
|
|
||||||
xyz[0] = BinaryUtils.readDoubleAndSwap(dis);
|
|
||||||
xyz[1] = BinaryUtils.readDoubleAndSwap(dis);
|
|
||||||
xyz[2] = BinaryUtils.readDoubleAndSwap(dis);
|
|
||||||
pointsXYZ.InsertNextPoint(xyz);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
} catch (IOException e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
|
||||||
}
|
|
||||||
case ICQ:
|
|
||||||
case OBJ:
|
|
||||||
case PLT:
|
|
||||||
case PLY:
|
|
||||||
case VTK:
|
|
||||||
try {
|
|
||||||
polyData = PolyDataUtil.loadShapeModel(inFile);
|
|
||||||
pointsXYZ.DeepCopy(polyData.GetPoints());
|
|
||||||
} catch (Exception e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
|
||||||
}
|
|
||||||
break;
|
|
||||||
case FITS:
|
|
||||||
try {
|
|
||||||
polyData = PolyDataUtil.loadFITShapeModel(inFile);
|
|
||||||
pointsXYZ.DeepCopy(polyData.GetPoints());
|
|
||||||
} catch (Exception e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
|
||||||
}
|
|
||||||
break;
|
|
||||||
default:
|
|
||||||
break;
|
|
||||||
}
|
}
|
||||||
|
|
||||||
if (clip != 1) {
|
private FORMATS inFormat;
|
||||||
BoundingBox bbox = new BoundingBox();
|
private FORMATS outFormat;
|
||||||
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
|
private vtkPoints pointsXYZ;
|
||||||
double[] point = pointsXYZ.GetPoint(i);
|
// private List<Double> receivedIntensity;
|
||||||
bbox.update(new UnwritableVectorIJK(point[0], point[1], point[2]));
|
private vtkPolyData polyData;
|
||||||
}
|
private Vector3 center;
|
||||||
BoundingBox clipped = bbox.getScaledBoundingBox(clip);
|
private int halfSize;
|
||||||
vtkPoints clippedPoints = new vtkPoints();
|
private double groundSampleDistance;
|
||||||
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
|
private double clip;
|
||||||
if (clipped.contains(pointsXYZ.GetPoint(i)))
|
private String additionalGMTArgs;
|
||||||
clippedPoints.InsertNextPoint(pointsXYZ.GetPoint(i));
|
private double mapRadius;
|
||||||
}
|
|
||||||
pointsXYZ = clippedPoints;
|
public static vtkPoints readPointCloud(String filename) {
|
||||||
polyData = null;
|
PointCloudFormatConverter pcfc = new PointCloudFormatConverter(filename, FORMATS.VTK);
|
||||||
|
pcfc.read(filename, false);
|
||||||
|
return pcfc.getPoints();
|
||||||
}
|
}
|
||||||
}
|
|
||||||
|
|
||||||
public void write(String outFile, boolean outLLR) {
|
private PointCloudFormatConverter() {}
|
||||||
switch (outFormat) {
|
|
||||||
case ASCII:
|
|
||||||
try (PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter(outFile)))) {
|
|
||||||
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
|
|
||||||
double[] thisPoint = pointsXYZ.GetPoint(i);
|
|
||||||
if (outLLR) {
|
|
||||||
Vector3D v = new Vector3D(thisPoint);
|
|
||||||
out.printf(
|
|
||||||
"%f %f %f\n",
|
|
||||||
Math.toDegrees(v.getAlpha()), Math.toDegrees(v.getDelta()), v.getNorm());
|
|
||||||
} else {
|
|
||||||
out.printf("%f %f %f\n", thisPoint[0], thisPoint[1], thisPoint[2]);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
} catch (IOException e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
|
||||||
}
|
|
||||||
break;
|
|
||||||
case BIN3:
|
|
||||||
try (DataOutputStream os =
|
|
||||||
new DataOutputStream(new BufferedOutputStream(new FileOutputStream(outFile)))) {
|
|
||||||
|
|
||||||
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
|
public PointCloudFormatConverter(FORMATS inFormat, String outFilename) {
|
||||||
double[] thisPoint = pointsXYZ.GetPoint(i);
|
this(inFormat, FORMATS.formatFromExtension(outFilename));
|
||||||
if (outLLR) {
|
}
|
||||||
Vector3D v = new Vector3D(thisPoint);
|
|
||||||
|
|
||||||
BinaryUtils.writeDoubleAndSwap(os, Math.toDegrees(v.getAlpha()));
|
public PointCloudFormatConverter(String inFilename, FORMATS outFormat) {
|
||||||
BinaryUtils.writeDoubleAndSwap(os, Math.toDegrees(v.getDelta()));
|
this(FORMATS.formatFromExtension(inFilename), outFormat);
|
||||||
BinaryUtils.writeDoubleAndSwap(os, v.getNorm());
|
}
|
||||||
} else {
|
|
||||||
for (int ii = 0; ii < 3; ii++) BinaryUtils.writeDoubleAndSwap(os, thisPoint[ii]);
|
public PointCloudFormatConverter(String inFilename, String outFilename) {
|
||||||
}
|
this(FORMATS.formatFromExtension(inFilename), FORMATS.formatFromExtension(outFilename));
|
||||||
}
|
}
|
||||||
} catch (IOException e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
public PointCloudFormatConverter(FORMATS inFormat, FORMATS outFormat) {
|
||||||
|
this.inFormat = inFormat;
|
||||||
|
this.outFormat = outFormat;
|
||||||
|
this.pointsXYZ = new vtkPoints();
|
||||||
|
this.polyData = null;
|
||||||
|
|
||||||
|
this.center = null;
|
||||||
|
this.mapRadius = Math.sqrt(2);
|
||||||
|
this.halfSize = -1;
|
||||||
|
this.groundSampleDistance = -1;
|
||||||
|
this.clip = 1;
|
||||||
|
this.additionalGMTArgs = "";
|
||||||
|
}
|
||||||
|
|
||||||
|
public PointCloudFormatConverter setPoints(vtkPoints pointsXYZ) {
|
||||||
|
this.pointsXYZ = pointsXYZ;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
public vtkPoints getPoints() {
|
||||||
|
return pointsXYZ;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setClip(Double clip) {
|
||||||
|
this.clip = clip;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setCenter(double[] centerPt) {
|
||||||
|
center = new Vector3(centerPt);
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setMapRadius(double mapRadius) {
|
||||||
|
this.mapRadius = mapRadius;
|
||||||
|
}
|
||||||
|
|
||||||
|
public PointCloudFormatConverter setHalfSize(int halfSize) {
|
||||||
|
this.halfSize = halfSize;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
public PointCloudFormatConverter setGroundSampleDistance(double groundSampleDistance) {
|
||||||
|
this.groundSampleDistance = groundSampleDistance;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
public PointCloudFormatConverter setGMTArgs(String args) {
|
||||||
|
this.additionalGMTArgs = args;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void read(String inFile, boolean inLLR) {
|
||||||
|
switch (inFormat) {
|
||||||
|
case ASCII:
|
||||||
|
try (BufferedReader br = new BufferedReader(new FileReader(inFile))) {
|
||||||
|
String line = br.readLine();
|
||||||
|
while (line != null) {
|
||||||
|
line = line.trim();
|
||||||
|
if (!line.isEmpty() && !line.startsWith("#")) {
|
||||||
|
String[] parts = line.split("\\s+");
|
||||||
|
if (inLLR) {
|
||||||
|
double lon = Math.toRadians(Double.parseDouble(parts[0].trim()));
|
||||||
|
double lat = Math.toRadians(Double.parseDouble(parts[1].trim()));
|
||||||
|
double range = Double.parseDouble(parts[2].trim());
|
||||||
|
double[] xyz = new Vector3D(lon, lat)
|
||||||
|
.scalarMultiply(range)
|
||||||
|
.toArray();
|
||||||
|
pointsXYZ.InsertNextPoint(xyz);
|
||||||
|
} else {
|
||||||
|
double[] xyz = new double[3];
|
||||||
|
xyz[0] = Double.parseDouble(parts[0].trim());
|
||||||
|
xyz[1] = Double.parseDouble(parts[1].trim());
|
||||||
|
xyz[2] = Double.parseDouble(parts[2].trim());
|
||||||
|
pointsXYZ.InsertNextPoint(xyz);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
line = br.readLine();
|
||||||
|
}
|
||||||
|
} catch (IOException e) {
|
||||||
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
|
}
|
||||||
|
break;
|
||||||
|
case BIN3:
|
||||||
|
case BIN4:
|
||||||
|
case BIN7:
|
||||||
|
try (DataInputStream dis = new DataInputStream(new BufferedInputStream(new FileInputStream(inFile)))) {
|
||||||
|
while (dis.available() > 0) {
|
||||||
|
if (inFormat == FORMATS.BIN7) {
|
||||||
|
// skip time field
|
||||||
|
BinaryUtils.readDoubleAndSwap(dis);
|
||||||
|
}
|
||||||
|
if (inLLR) {
|
||||||
|
double lon = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis));
|
||||||
|
double lat = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis));
|
||||||
|
double range = BinaryUtils.readDoubleAndSwap(dis);
|
||||||
|
double[] xyz =
|
||||||
|
new Vector3D(lon, lat).scalarMultiply(range).toArray();
|
||||||
|
pointsXYZ.InsertNextPoint(xyz);
|
||||||
|
} else {
|
||||||
|
double[] xyz = new double[3];
|
||||||
|
xyz[0] = BinaryUtils.readDoubleAndSwap(dis);
|
||||||
|
xyz[1] = BinaryUtils.readDoubleAndSwap(dis);
|
||||||
|
xyz[2] = BinaryUtils.readDoubleAndSwap(dis);
|
||||||
|
pointsXYZ.InsertNextPoint(xyz);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} catch (IOException e) {
|
||||||
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
|
}
|
||||||
|
case ICQ:
|
||||||
|
case OBJ:
|
||||||
|
case PLT:
|
||||||
|
case PLY:
|
||||||
|
case VTK:
|
||||||
|
try {
|
||||||
|
polyData = PolyDataUtil.loadShapeModel(inFile);
|
||||||
|
pointsXYZ.DeepCopy(polyData.GetPoints());
|
||||||
|
} catch (Exception e) {
|
||||||
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
|
}
|
||||||
|
break;
|
||||||
|
case FITS:
|
||||||
|
try {
|
||||||
|
polyData = PolyDataUtil.loadFITShapeModel(inFile);
|
||||||
|
pointsXYZ.DeepCopy(polyData.GetPoints());
|
||||||
|
} catch (Exception e) {
|
||||||
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
|
}
|
||||||
|
break;
|
||||||
|
default:
|
||||||
|
break;
|
||||||
}
|
}
|
||||||
|
|
||||||
break;
|
if (clip != 1) {
|
||||||
case OBJ:
|
BoundingBox bbox = new BoundingBox();
|
||||||
if (polyData != null) {
|
|
||||||
try {
|
|
||||||
PolyDataUtil.saveShapeModelAsOBJ(polyData, outFile);
|
|
||||||
} catch (IOException e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
|
||||||
}
|
|
||||||
} else {
|
|
||||||
if (halfSize < 0 || groundSampleDistance < 0) {
|
|
||||||
System.out.printf(
|
|
||||||
"Must supply -halfSize and -groundSampleDistance for %s output\n",
|
|
||||||
outFormat);
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
|
|
||||||
final double radius = mapRadius * halfSize * groundSampleDistance;
|
|
||||||
vtkPoints vtkPoints = pointsXYZ;
|
|
||||||
if (center != null) {
|
|
||||||
vtkPoints = new vtkPoints();
|
|
||||||
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
|
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
|
||||||
Vector3 pt = new Vector3(pointsXYZ.GetPoint(i));
|
double[] point = pointsXYZ.GetPoint(i);
|
||||||
if (center.sub(new Vector3(pt)).norm() > radius) continue;
|
bbox.update(new UnwritableVectorIJK(point[0], point[1], point[2]));
|
||||||
vtkPoints.InsertNextPoint(pt.toArray());
|
|
||||||
}
|
}
|
||||||
}
|
BoundingBox clipped = bbox.getScaledBoundingBox(clip);
|
||||||
|
vtkPoints clippedPoints = new vtkPoints();
|
||||||
PointCloudToPlane pctp = new PointCloudToPlane(vtkPoints, halfSize, groundSampleDistance);
|
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
|
||||||
pctp.getGMU().setFieldToHeight();
|
if (clipped.contains(pointsXYZ.GetPoint(i))) clippedPoints.InsertNextPoint(pointsXYZ.GetPoint(i));
|
||||||
pctp.getGMU().setGMTArgs(additionalGMTArgs);
|
}
|
||||||
try {
|
pointsXYZ = clippedPoints;
|
||||||
double[][][] regridField = pctp.getGMU().regridField();
|
polyData = null;
|
||||||
vtkPolyData griddedXYZ = PolyDataUtil.loadLocalFitsLLRModelN(regridField);
|
|
||||||
PolyDataUtil.saveShapeModelAsOBJ(griddedXYZ, outFile);
|
|
||||||
} catch (Exception e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
break;
|
}
|
||||||
case VTK:
|
|
||||||
if (polyData == null) {
|
public void write(String outFile, boolean outLLR) {
|
||||||
polyData = new vtkPolyData();
|
switch (outFormat) {
|
||||||
polyData.SetPoints(pointsXYZ);
|
case ASCII:
|
||||||
|
try (PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter(outFile)))) {
|
||||||
|
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
|
||||||
|
double[] thisPoint = pointsXYZ.GetPoint(i);
|
||||||
|
if (outLLR) {
|
||||||
|
Vector3D v = new Vector3D(thisPoint);
|
||||||
|
out.printf(
|
||||||
|
"%f %f %f\n",
|
||||||
|
Math.toDegrees(v.getAlpha()), Math.toDegrees(v.getDelta()), v.getNorm());
|
||||||
|
} else {
|
||||||
|
out.printf("%f %f %f\n", thisPoint[0], thisPoint[1], thisPoint[2]);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} catch (IOException e) {
|
||||||
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
|
}
|
||||||
|
break;
|
||||||
|
case BIN3:
|
||||||
|
try (DataOutputStream os =
|
||||||
|
new DataOutputStream(new BufferedOutputStream(new FileOutputStream(outFile)))) {
|
||||||
|
|
||||||
|
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
|
||||||
|
double[] thisPoint = pointsXYZ.GetPoint(i);
|
||||||
|
if (outLLR) {
|
||||||
|
Vector3D v = new Vector3D(thisPoint);
|
||||||
|
|
||||||
|
BinaryUtils.writeDoubleAndSwap(os, Math.toDegrees(v.getAlpha()));
|
||||||
|
BinaryUtils.writeDoubleAndSwap(os, Math.toDegrees(v.getDelta()));
|
||||||
|
BinaryUtils.writeDoubleAndSwap(os, v.getNorm());
|
||||||
|
} else {
|
||||||
|
for (int ii = 0; ii < 3; ii++) BinaryUtils.writeDoubleAndSwap(os, thisPoint[ii]);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} catch (IOException e) {
|
||||||
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
|
}
|
||||||
|
|
||||||
|
break;
|
||||||
|
case OBJ:
|
||||||
|
if (polyData != null) {
|
||||||
|
try {
|
||||||
|
PolyDataUtil.saveShapeModelAsOBJ(polyData, outFile);
|
||||||
|
} catch (IOException e) {
|
||||||
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
if (halfSize < 0 || groundSampleDistance < 0) {
|
||||||
|
logger.error("Must supply -halfSize and -groundSampleDistance for {} output", outFormat);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
final double radius = mapRadius * halfSize * groundSampleDistance;
|
||||||
|
vtkPoints vtkPoints = pointsXYZ;
|
||||||
|
if (center != null) {
|
||||||
|
vtkPoints = new vtkPoints();
|
||||||
|
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
|
||||||
|
Vector3 pt = new Vector3(pointsXYZ.GetPoint(i));
|
||||||
|
if (center.sub(new Vector3(pt)).norm() > radius) continue;
|
||||||
|
vtkPoints.InsertNextPoint(pt.toArray());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
PointCloudToPlane pctp = new PointCloudToPlane(vtkPoints, halfSize, groundSampleDistance);
|
||||||
|
pctp.getGMU().setFieldToHeight();
|
||||||
|
pctp.getGMU().setGMTArgs(additionalGMTArgs);
|
||||||
|
try {
|
||||||
|
double[][][] regridField = pctp.getGMU().regridField();
|
||||||
|
vtkPolyData griddedXYZ = PolyDataUtil.loadLocalFitsLLRModelN(regridField);
|
||||||
|
PolyDataUtil.saveShapeModelAsOBJ(griddedXYZ, outFile);
|
||||||
|
} catch (Exception e) {
|
||||||
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
break;
|
||||||
|
case VTK:
|
||||||
|
if (polyData == null) {
|
||||||
|
polyData = new vtkPolyData();
|
||||||
|
polyData.SetPoints(pointsXYZ);
|
||||||
|
}
|
||||||
|
|
||||||
|
vtkCellArray cells = new vtkCellArray();
|
||||||
|
vtkFloatArray albedo = new vtkFloatArray();
|
||||||
|
albedo.SetName("albedo");
|
||||||
|
polyData.SetPolys(cells);
|
||||||
|
polyData.GetPointData().AddArray(albedo);
|
||||||
|
|
||||||
|
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
|
||||||
|
vtkIdList idList = new vtkIdList();
|
||||||
|
idList.InsertNextId(i);
|
||||||
|
cells.InsertNextCell(idList);
|
||||||
|
albedo.InsertNextValue(0.5f);
|
||||||
|
}
|
||||||
|
|
||||||
|
vtkPolyDataWriter writer = new vtkPolyDataWriter();
|
||||||
|
writer.SetInputData(polyData);
|
||||||
|
writer.SetFileName(outFile);
|
||||||
|
writer.SetFileTypeToBinary();
|
||||||
|
writer.Update();
|
||||||
|
break;
|
||||||
|
default:
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private static Options defineOptions() {
|
||||||
|
Options options = TerrasaurTool.defineOptions();
|
||||||
|
options.addOption(Option.builder("logFile")
|
||||||
|
.hasArg()
|
||||||
|
.desc("If present, save screen output to log file.")
|
||||||
|
.build());
|
||||||
|
StringBuilder sb = new StringBuilder();
|
||||||
|
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
|
||||||
|
options.addOption(Option.builder("logLevel")
|
||||||
|
.hasArg()
|
||||||
|
.desc("If present, print messages above selected priority. Valid values are "
|
||||||
|
+ sb.toString().trim()
|
||||||
|
+ ". Default is INFO.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("inputFormat")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Format of input file. If not present format will be inferred from inputFile extension.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("inputFile")
|
||||||
|
.required()
|
||||||
|
.hasArg()
|
||||||
|
.desc("Required. Name of input file.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("outputFormat")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Format of output file. If not present format will be inferred from outputFile extension.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("outputFile")
|
||||||
|
.required()
|
||||||
|
.hasArg()
|
||||||
|
.desc("Required. Name of output file.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("inllr")
|
||||||
|
.desc(
|
||||||
|
"Only used with ASCII or BINARY formats. If present, input values are assumed to be lon, lat, rad. Default is x, y, z.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("outllr")
|
||||||
|
.desc(
|
||||||
|
"Only used with ASCII or BINARY formats. If present, output values will be lon, lat, rad. Default is x, y, z.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("centerXYZ")
|
||||||
|
.hasArg()
|
||||||
|
.desc(
|
||||||
|
"Only used to generate OBJ output. Center output shape on supplied coordinates. Specify XYZ coordinates as three floating point numbers separated"
|
||||||
|
+ " by commas.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("centerLonLat")
|
||||||
|
.hasArg()
|
||||||
|
.desc(
|
||||||
|
"Only used to generate OBJ output. Center output shape on supplied lon,lat. Specify lon,lat in degrees as floating point numbers separated"
|
||||||
|
+ " by a comma. Shape will be centered on the point closest to this lon,lat pair.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("halfSize")
|
||||||
|
.hasArg()
|
||||||
|
.desc(
|
||||||
|
"Only used to generate OBJ output. Used with -groundSampleDistance to resample to a uniform grid. Grid dimensions are (2*halfSize+1)x(2*halfSize+1).")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("groundSampleDistance")
|
||||||
|
.hasArg()
|
||||||
|
.desc(
|
||||||
|
"Used with -halfSize to resample to a uniform grid. Spacing between grid points. Only used to generate OBJ output. "
|
||||||
|
+ "Units are the same as the input file, usually km.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("mapRadius")
|
||||||
|
.hasArg()
|
||||||
|
.desc(
|
||||||
|
"Only used to generate OBJ output. Used with -centerXYZ to resample to a uniform grid. Only include points within "
|
||||||
|
+ "mapRadius*groundSampleDistance*halfSize of centerXYZ. Default value is sqrt(2).")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("gmtArgs")
|
||||||
|
.hasArg()
|
||||||
|
.longOpt("gmt-args")
|
||||||
|
.desc(
|
||||||
|
"Only used to generate OBJ output. Pass additional options to GMTSurface. May be used multiple times, use once per additional argument.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("clip")
|
||||||
|
.hasArg()
|
||||||
|
.desc(
|
||||||
|
"Shrink bounding box to a relative size of <arg> and clip any points outside of it. Default is 1 (no clipping).")
|
||||||
|
.build());
|
||||||
|
return options;
|
||||||
|
}
|
||||||
|
|
||||||
|
public static void main(String[] args) {
|
||||||
|
TerrasaurTool defaultOBJ = new PointCloudFormatConverter();
|
||||||
|
|
||||||
|
Options options = defineOptions();
|
||||||
|
|
||||||
|
CommandLine cl = defaultOBJ.parseArgs(args, options);
|
||||||
|
|
||||||
|
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
|
||||||
|
for (MessageLabel ml : startupMessages.keySet())
|
||||||
|
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
|
||||||
|
|
||||||
|
NativeLibraryLoader.loadVtkLibraries();
|
||||||
|
NativeLibraryLoader.loadSpiceLibraries();
|
||||||
|
|
||||||
|
String inFile = cl.getOptionValue("inputFile");
|
||||||
|
String outFile = cl.getOptionValue("outputFile");
|
||||||
|
boolean inLLR = cl.hasOption("inllr");
|
||||||
|
boolean outLLR = cl.hasOption("outllr");
|
||||||
|
|
||||||
|
FORMATS inFormat = cl.hasOption("inputFormat")
|
||||||
|
? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
|
||||||
|
: FORMATS.formatFromExtension(cl.getOptionValue("inputFile"));
|
||||||
|
FORMATS outFormat = cl.hasOption("outputFormat")
|
||||||
|
? FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase())
|
||||||
|
: FORMATS.formatFromExtension(cl.getOptionValue("outputFile"));
|
||||||
|
|
||||||
|
PointCloudFormatConverter pcfc = new PointCloudFormatConverter(inFormat, outFormat);
|
||||||
|
|
||||||
|
if (cl.hasOption("centerXYZ")) {
|
||||||
|
String[] params = cl.getOptionValue("centerXYZ").split(",");
|
||||||
|
double[] array = new double[3];
|
||||||
|
for (int i = 0; i < 3; i++) array[i] = Double.parseDouble(params[i].trim());
|
||||||
|
pcfc.setCenter(array);
|
||||||
}
|
}
|
||||||
|
|
||||||
vtkCellArray cells = new vtkCellArray();
|
if (cl.hasOption("clip")) {
|
||||||
vtkFloatArray albedo = new vtkFloatArray();
|
pcfc.setClip(Double.valueOf(cl.getOptionValue("clip")));
|
||||||
albedo.SetName("albedo");
|
|
||||||
polyData.SetPolys(cells);
|
|
||||||
polyData.GetPointData().AddArray(albedo);
|
|
||||||
|
|
||||||
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
|
|
||||||
vtkIdList idList = new vtkIdList();
|
|
||||||
idList.InsertNextId(i);
|
|
||||||
cells.InsertNextCell(idList);
|
|
||||||
albedo.InsertNextValue(0.5f);
|
|
||||||
}
|
}
|
||||||
|
|
||||||
vtkPolyDataWriter writer = new vtkPolyDataWriter();
|
if (cl.hasOption("gmtArgs")) {
|
||||||
writer.SetInputData(polyData);
|
StringBuilder gmtArgs = new StringBuilder();
|
||||||
writer.SetFileName(outFile);
|
for (String arg : cl.getOptionValues("gmtArgs")) gmtArgs.append(String.format("%s ", arg));
|
||||||
writer.SetFileTypeToBinary();
|
pcfc.setGMTArgs(gmtArgs.toString());
|
||||||
writer.Update();
|
|
||||||
break;
|
|
||||||
default:
|
|
||||||
break;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
private static Options defineOptions() {
|
|
||||||
Options options = TerrasaurTool.defineOptions();
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("logFile")
|
|
||||||
.hasArg()
|
|
||||||
.desc("If present, save screen output to log file.")
|
|
||||||
.build());
|
|
||||||
StringBuilder sb = new StringBuilder();
|
|
||||||
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("logLevel")
|
|
||||||
.hasArg()
|
|
||||||
.desc(
|
|
||||||
"If present, print messages above selected priority. Valid values are "
|
|
||||||
+ sb.toString().trim()
|
|
||||||
+ ". Default is INFO.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("inputFormat")
|
|
||||||
.hasArg()
|
|
||||||
.desc(
|
|
||||||
"Format of input file. If not present format will be inferred from inputFile extension.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("inputFile")
|
|
||||||
.required()
|
|
||||||
.hasArg()
|
|
||||||
.desc("Required. Name of input file.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("outputFormat")
|
|
||||||
.hasArg()
|
|
||||||
.desc(
|
|
||||||
"Format of output file. If not present format will be inferred from outputFile extension.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("outputFile")
|
|
||||||
.required()
|
|
||||||
.hasArg()
|
|
||||||
.desc("Required. Name of output file.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("inllr")
|
|
||||||
.desc(
|
|
||||||
"Only used with ASCII or BINARY formats. If present, input values are assumed to be lon, lat, rad. Default is x, y, z.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("outllr")
|
|
||||||
.desc(
|
|
||||||
"Only used with ASCII or BINARY formats. If present, output values will be lon, lat, rad. Default is x, y, z.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("centerXYZ")
|
|
||||||
.hasArg()
|
|
||||||
.desc(
|
|
||||||
"Only used to generate OBJ output. Center output shape on supplied coordinates. Specify XYZ coordinates as three floating point numbers separated"
|
|
||||||
+ " by commas.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("centerLonLat")
|
|
||||||
.hasArg()
|
|
||||||
.desc(
|
|
||||||
                "Only used to generate OBJ output. Center output shape on supplied lon,lat. Specify lon,lat in degrees as floating point numbers separated"
                    + " by a comma. Shape will be centered on the point closest to this lon,lat pair.")
            .build());
    options.addOption(
        Option.builder("halfSize")
            .hasArg()
            .desc(
                "Only used to generate OBJ output. Used with -groundSampleDistance to resample to a uniform grid. Grid dimensions are (2*halfSize+1)x(2*halfSize+1).")
            .build());
    options.addOption(
        Option.builder("groundSampleDistance")
            .hasArg()
            .desc(
                "Only used to generate OBJ output. Used with -halfSize to resample to a uniform grid. Spacing between grid points. "
                    + "Units are the same as the input file, usually km.")
            .build());
    options.addOption(
        Option.builder("mapRadius")
            .hasArg()
            .desc(
                "Only used to generate OBJ output. Used with -centerXYZ to resample to a uniform grid. Only include points within "
                    + "mapRadius*groundSampleDistance*halfSize of centerXYZ. Default value is sqrt(2).")
            .build());
    options.addOption(
        Option.builder("gmtArgs")
            .hasArg()
            .longOpt("gmt-args")
            .desc(
                "Only used to generate OBJ output. Pass additional options to GMTSurface. May be used multiple times; use once per additional argument.")
            .build());
    options.addOption(
        Option.builder("clip")
            .hasArg()
            .desc(
                "Shrink bounding box to a relative size of <arg> and clip any points outside of it. Default is 1 (no clipping).")
            .build());
    return options;
  }

  public static void main(String[] args) {
    TerrasaurTool defaultOBJ = new PointCloudFormatConverter();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    NativeLibraryLoader.loadVtkLibraries();
    NativeLibraryLoader.loadSpiceLibraries();

    String inFile = cl.getOptionValue("inputFile");
    String outFile = cl.getOptionValue("outputFile");
    boolean inLLR = cl.hasOption("inllr");
    boolean outLLR = cl.hasOption("outllr");

    FORMATS inFormat =
        cl.hasOption("inputFormat")
            ? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
            : FORMATS.formatFromExtension(cl.getOptionValue("inputFile"));
    FORMATS outFormat =
        cl.hasOption("outputFormat")
            ? FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase())
            : FORMATS.formatFromExtension(cl.getOptionValue("outputFile"));

    PointCloudFormatConverter pcfc = new PointCloudFormatConverter(inFormat, outFormat);

    if (cl.hasOption("centerXYZ")) {
      String[] params = cl.getOptionValue("centerXYZ").split(",");
      double[] array = new double[3];
      for (int i = 0; i < 3; i++) array[i] = Double.parseDouble(params[i].trim());
      pcfc.setCenter(array);
    }

    if (cl.hasOption("clip")) {
      pcfc.setClip(Double.valueOf(cl.getOptionValue("clip")));
    }

    if (cl.hasOption("gmtArgs")) {
      StringBuilder gmtArgs = new StringBuilder();
      for (String arg : cl.getOptionValues("gmtArgs")) gmtArgs.append(String.format("%s ", arg));
      pcfc.setGMTArgs(gmtArgs.toString());
    }

    pcfc.read(inFile, inLLR);

    if (cl.hasOption("centerLonLat")) {
      String[] params = cl.getOptionValue("centerLonLat").split(",");
      Vector3D lcDir =
          new Vector3D(
              Math.toRadians(Double.parseDouble(params[0].trim())),
              Math.toRadians(Double.parseDouble(params[1].trim())));
      double[] center = null;
      double minSep = Double.MAX_VALUE;
      vtkPoints vtkPoints = pcfc.getPoints();
      for (int i = 0; i < vtkPoints.GetNumberOfPoints(); i++) {
        double[] pt = vtkPoints.GetPoint(i);
        double sep = Vector3D.angle(lcDir, new Vector3D(pt));
        if (sep < minSep) {
          minSep = sep;
          center = pt;
        }
      }
      pcfc.setCenter(center);
    }

    pcfc.setMapRadius(
        cl.hasOption("mapRadius") ? Double.parseDouble(cl.getOptionValue("mapRadius")) : Math.sqrt(2));

    if (cl.hasOption("halfSize") && cl.hasOption("groundSampleDistance")) {
      // resample on a uniform XY grid
      pcfc.setHalfSize(Integer.parseInt(cl.getOptionValue("halfSize")));
      pcfc.setGroundSampleDistance(Double.parseDouble(cl.getOptionValue("groundSampleDistance")));
    }

    pcfc.write(outFile, outLLR);
  }
}
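The `-centerLonLat` block above converts the supplied lon,lat to a unit direction and keeps the cloud point with the smallest angular separation from it. A minimal standalone sketch of that selection (not part of Terrasaur; `NearestToLonLat` and its helpers are hypothetical names, using plain arrays instead of the commons-math `Vector3D`):

```java
public class NearestToLonLat {

  // Unit vector from longitude and latitude, both in radians.
  static double[] toUnitVector(double lon, double lat) {
    return new double[] {
      Math.cos(lat) * Math.cos(lon), Math.cos(lat) * Math.sin(lon), Math.sin(lat)
    };
  }

  // Angular separation between two vectors via the normalized dot product.
  static double angle(double[] a, double[] b) {
    double dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    double na = Math.sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2]);
    double nb = Math.sqrt(b[0] * b[0] + b[1] * b[1] + b[2] * b[2]);
    return Math.acos(Math.max(-1, Math.min(1, dot / (na * nb))));
  }

  // Index of the point closest in angle to the lon,lat direction.
  static int nearestIndex(double[][] points, double lonRad, double latRad) {
    double[] dir = toUnitVector(lonRad, latRad);
    int best = -1;
    double minSep = Double.MAX_VALUE;
    for (int i = 0; i < points.length; i++) {
      double sep = angle(dir, points[i]);
      if (sep < minSep) {
        minSep = sep;
        best = i;
      }
    }
    return best;
  }

  public static void main(String[] args) {
    double[][] pts = {{1, 0, 0}, {0, 2, 0}, {0, 0, 1}};
    // lon = 90 degrees, lat = 0 points along +Y, so the +Y point wins
    System.out.println(nearestIndex(pts, Math.toRadians(90), 0)); // prints 1
  }
}
```

Note the comparison is purely angular, so point magnitude does not matter; the point need not lie on a sphere.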
399	src/main/java/terrasaur/apps/PointCloudOverlap.java	Normal file
@@ -0,0 +1,399 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.awt.geom.Path2D;
import java.awt.geom.Point2D;
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import nom.tam.fits.FitsException;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.commons.math3.geometry.euclidean.twod.Vector2D;
import org.apache.commons.math3.geometry.euclidean.twod.hull.MonotoneChain;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import picante.math.coords.LatitudinalVector;
import spice.basic.SpiceException;
import spice.basic.Vector3;
import terrasaur.enums.FORMATS;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.VectorStatistics;
import terrasaur.utils.tessellation.StereographicProjection;
import vtk.vtkPoints;

public class PointCloudOverlap implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Find points in a point cloud which overlap a reference point cloud.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "";
    String footer =
        "\nThis program finds all points in the input point cloud which overlap the points in a reference point cloud.\n\n"
            + "Supported input formats are ASCII, BINARY, L2, OBJ, and VTK. Supported output formats are ASCII, BINARY, L2, and VTK. "
            + "ASCII format is white space delimited x y z coordinates. BINARY files must contain double precision x y z coordinates.\n\n"
            + "A plane is fit to the reference point cloud and all points in each cloud are projected onto this plane. Any point in the "
            + "projected input cloud which falls within the outline of the projected reference cloud is considered to be overlapping.";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private Path2D.Double polygon;

  private StereographicProjection proj;

  /*- Useful for debugging
  private vtkPoints ref2DPoints;
  private vtkPoints input2DPoints;
  private vtkPolyData polygonPolyData;
  private vtkCellArray polygonCells;
  private vtkPoints polygonPoints;
  private vtkDoubleArray polygonSuccessArray;
  */

  public PointCloudOverlap(Collection<double[]> refPoints) {
    if (refPoints != null) {
      VectorStatistics vStats = new VectorStatistics();
      for (double[] pt : refPoints) vStats.add(new Vector3(pt));

      Vector3D centerXYZ = vStats.getMean();

      proj =
          new StereographicProjection(
              new LatitudinalVector(1, centerXYZ.getDelta(), centerXYZ.getAlpha()));

      createRefPolygon(refPoints);
    }
  }

  public StereographicProjection getProjection() {
    return proj;
  }

  private void createRefPolygon(Collection<double[]> refPoints) {
    List<Vector2D> stereographicPoints = new ArrayList<>();
    for (double[] refPt : refPoints) {
      Vector3D point3D = new Vector3D(refPt);
      Point2D point = proj.forward(point3D.getDelta(), point3D.getAlpha());
      stereographicPoints.add(new Vector2D(point.getX(), point.getY()));
    }

    /*-
    ref2DPoints = new vtkPoints();
    input2DPoints = new vtkPoints();
    polygonPolyData = new vtkPolyData();
    polygonCells = new vtkCellArray();
    polygonPoints = new vtkPoints();
    polygonSuccessArray = new vtkDoubleArray();
    polygonSuccessArray.SetName("success");
    polygonPolyData.SetPoints(polygonPoints);
    polygonPolyData.SetLines(polygonCells);
    polygonPolyData.GetCellData().AddArray(polygonSuccessArray);

    for (Vector2D refPoint : refPoints)
      ref2DPoints.InsertNextPoint(refPoint.getX(), refPoint.getY(), 0);
    */

    MonotoneChain mc = new MonotoneChain();
    List<Vector2D> vertices = new ArrayList<>(mc.findHullVertices(stereographicPoints));
    /*-
    for (int i = 1; i < vertices.size(); i++) {
      Vector2D lastPt = vertices.get(i - 1);
      Vector2D thisPt = vertices.get(i);
      System.out.printf("%f %f to %f %f distance %f\n", lastPt.getX(), lastPt.getY(), thisPt.getX(),
          thisPt.getY(), lastPt.distance(thisPt));
    }
    Vector2D lastPt = vertices.get(vertices.size() - 1);
    Vector2D thisPt = vertices.get(0);
    System.out.printf("%f %f to %f %f distance %f\n", lastPt.getX(), lastPt.getY(), thisPt.getX(),
        thisPt.getY(), lastPt.distance(thisPt));
    */
    // int id0 = 0;
    for (Vector2D vertex : vertices) {
      // int id1 = polygonPoints.InsertNextPoint(vertex.getX(), vertex.getY(), 0);

      if (polygon == null) {
        polygon = new Path2D.Double();
        polygon.moveTo(vertex.getX(), vertex.getY());
      } else {
        polygon.lineTo(vertex.getX(), vertex.getY());
        /*-
        vtkLine line = new vtkLine();
        line.GetPointIds().SetId(0, id0);
        line.GetPointIds().SetId(1, id1);
        polygonCells.InsertNextCell(line);
        */
      }
      // id0 = id1;
    }
    polygon.closePath();

    /*-
    vtkPolyDataWriter writer = new vtkPolyDataWriter();
    writer.SetInputData(polygonPolyData);
    writer.SetFileName("polygon2D.vtk");
    writer.SetFileTypeToBinary();
    writer.Update();

    writer = new vtkPolyDataWriter();
    polygonPolyData = new vtkPolyData();
    polygonPolyData.SetPoints(ref2DPoints);
    writer.SetInputData(polygonPolyData);
    writer.SetFileName("refPoints.vtk");
    writer.SetFileTypeToBinary();
    writer.Update();
    */
  }

  public boolean isEnclosed(double[] xyz) {
    Vector3D point = new Vector3D(xyz);
    Point2D projected = proj.forward(point.getDelta(), point.getAlpha());
    return polygon.contains(projected.getX(), projected.getY());
  }

  /**
   * @param inputPoints points to consider
   * @param scale scale factor
   * @return indices of all points inside the scaled polygon
   */
  public List<Integer> scalePoints(List<double[]> inputPoints, double scale) {

    List<Vector2D> projected = new ArrayList<>();
    for (double[] inputPoint : inputPoints) {
      Vector3D point = new Vector3D(inputPoint);
      Point2D projectedPoint = proj.forward(point.getDelta(), point.getAlpha());
      projected.add(new Vector2D(projectedPoint.getX(), projectedPoint.getY()));
    }

    Vector2D center = new Vector2D(0, 0);
    for (Vector2D inputPoint : projected) center = center.add(inputPoint);

    center = center.scalarMultiply(1. / inputPoints.size());

    List<Vector2D> translatedPoints = new ArrayList<>();
    for (Vector2D inputPoint : projected) translatedPoints.add(inputPoint.subtract(center));

    Path2D.Double thisPolygon = null;
    MonotoneChain mc = new MonotoneChain();
    Collection<Vector2D> vertices = mc.findHullVertices(translatedPoints);
    for (Vector2D vertex : vertices) {
      if (thisPolygon == null) {
        thisPolygon = new Path2D.Double();
        thisPolygon.moveTo(scale * vertex.getX(), scale * vertex.getY());
      } else {
        thisPolygon.lineTo(scale * vertex.getX(), scale * vertex.getY());
      }
    }
    thisPolygon.closePath();

    List<Integer> indices = new ArrayList<>();
    for (int i = 0; i < projected.size(); i++) {
      Vector2D inputPoint = projected.get(i);
      if (thisPolygon.contains(inputPoint.getX() - center.getX(), inputPoint.getY() - center.getY()))
        indices.add(i);
    }
    return indices;
  }

  private static Options defineOptions() {
    Options options = new Options();
    options.addOption(Option.builder("inputFormat")
        .hasArg()
        .desc("Format of input file. If not present, format will be inferred from file extension.")
        .build());
    options.addOption(Option.builder("inputFile")
        .required()
        .hasArg()
        .desc("Required. Name of input file.")
        .build());
    options.addOption(Option.builder("inllr")
        .desc(
            "If present, input values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
        .build());
    options.addOption(Option.builder("referenceFormat")
        .hasArg()
        .desc("Format of reference file. If not present, format will be inferred from file extension.")
        .build());
    options.addOption(Option.builder("referenceFile")
        .required()
        .hasArg()
        .desc("Required. Name of reference file.")
        .build());
    options.addOption(Option.builder("refllr")
        .desc(
            "If present, reference values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
        .build());
    options.addOption(Option.builder("outputFormat")
        .hasArg()
        .desc("Format of output file. If not present, format will be inferred from file extension.")
        .build());
    options.addOption(Option.builder("outputFile")
        .required()
        .hasArg()
        .desc("Required. Name of output file.")
        .build());
    options.addOption(Option.builder("outllr")
        .desc(
            "If present, output values will be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
        .build());
    options.addOption(Option.builder("scale")
        .hasArg()
        .desc("Value to scale bounding box containing intersect region. Default is 1.0.")
        .build());
    return options;
  }

  public static void main(String[] args)
      throws SpiceException, IOException, InterruptedException, FitsException {

    NativeLibraryLoader.loadVtkLibraries();
    NativeLibraryLoader.loadSpiceLibraries();

    TerrasaurTool defaultOBJ = new PointCloudOverlap(null);

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    // Read the reference file
    FORMATS refFormat = cl.hasOption("referenceFormat")
        ? FORMATS.valueOf(cl.getOptionValue("referenceFormat").toUpperCase())
        : FORMATS.formatFromExtension(cl.getOptionValue("referenceFile"));
    String refFile = cl.getOptionValue("referenceFile");
    boolean refLLR = cl.hasOption("refllr");

    PointCloudFormatConverter pcfc = new PointCloudFormatConverter(refFormat, FORMATS.VTK);
    pcfc.read(refFile, refLLR);
    vtkPoints referencePoints = pcfc.getPoints();
    logger.info("{} points read from {}", referencePoints.GetNumberOfPoints(), refFile);

    List<double[]> refPts = new ArrayList<>();
    for (int i = 0; i < referencePoints.GetNumberOfPoints(); i++) {
      refPts.add(referencePoints.GetPoint(i));
    }

    // create the overlap object and set the enclosing polygon
    PointCloudOverlap pco = new PointCloudOverlap(refPts);

    // Read the input point cloud
    FORMATS inFormat = cl.hasOption("inputFormat")
        ? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
        : FORMATS.formatFromExtension(cl.getOptionValue("inputFile"));
    String inFile = cl.getOptionValue("inputFile");
    boolean inLLR = cl.hasOption("inllr");

    pcfc = new PointCloudFormatConverter(inFormat, FORMATS.VTK);
    pcfc.read(inFile, inLLR);
    vtkPoints inputPoints = pcfc.getPoints();
    logger.info("{} points read from {}", inputPoints.GetNumberOfPoints(), inFile);

    List<Integer> enclosedIndices = new ArrayList<>();
    for (int i = 0; i < inputPoints.GetNumberOfPoints(); i++) {
      double[] pt = inputPoints.GetPoint(i);
      if (pco.isEnclosed(pt)) enclosedIndices.add(i);
    }

    if (cl.hasOption("scale")) {
      List<double[]> pts = new ArrayList<>();
      for (Integer i : enclosedIndices) pts.add(inputPoints.GetPoint(i));

      // this list includes which of the enclosed points are inside the scaled polygon
      List<Integer> theseIndices =
          pco.scalePoints(pts, Double.parseDouble(cl.getOptionValue("scale")));

      // now relate this list back to the original list of points
      List<Integer> newIndices = new ArrayList<>();
      for (Integer i : theseIndices) newIndices.add(enclosedIndices.get(i));
      enclosedIndices = newIndices;
    }

    VectorStatistics xyzStats = new VectorStatistics();
    VectorStatistics xyStats = new VectorStatistics();
    for (Integer i : enclosedIndices) {
      double[] thisPt = inputPoints.GetPoint(i);
      Vector3D thisPt3D = new Vector3D(thisPt);
      xyzStats.add(thisPt3D);
      Point2D projectedPt = pco.getProjection().forward(thisPt3D.getDelta(), thisPt3D.getAlpha());
      xyStats.add(new Vector3(projectedPt.getX(), projectedPt.getY(), 0));
    }

    logger.info(
        "Center XYZ: {}, {}, {}",
        xyzStats.getMean().getX(),
        xyzStats.getMean().getY(),
        xyzStats.getMean().getZ());
    Vector3D centerXYZ = xyzStats.getMean();
    logger.info(
        "Center lon, lat: {}, {}\n",
        Math.toDegrees(centerXYZ.getAlpha()),
        Math.toDegrees(centerXYZ.getDelta()));
    logger.info(
        "xmin/xmax/extent: {}/{}/{}\n",
        xyzStats.getMin().getX(),
        xyzStats.getMax().getX(),
        xyzStats.getMax().getX() - xyzStats.getMin().getX());
    logger.info(
        "ymin/ymax/extent: {}/{}/{}\n",
        xyzStats.getMin().getY(),
        xyzStats.getMax().getY(),
        xyzStats.getMax().getY() - xyzStats.getMin().getY());
    logger.info(
        "zmin/zmax/extent: {}/{}/{}\n",
        xyzStats.getMin().getZ(),
        xyzStats.getMax().getZ(),
        xyzStats.getMax().getZ() - xyzStats.getMin().getZ());

    FORMATS outFormat = cl.hasOption("outputFormat")
        ? FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase())
        : FORMATS.formatFromExtension(cl.getOptionValue("outputFile"));

    vtkPoints pointsToWrite = new vtkPoints();
    for (Integer i : enclosedIndices) pointsToWrite.InsertNextPoint(inputPoints.GetPoint(i));
    pcfc = new PointCloudFormatConverter(FORMATS.VTK, outFormat);
    pcfc.setPoints(pointsToWrite);
    String outputFilename = cl.getOptionValue("outputFile");
    pcfc.write(outputFilename, cl.hasOption("outllr"));
    if (new File(outputFilename).exists()) {
      logger.info("{} points written to {}", pointsToWrite.GetNumberOfPoints(), outputFilename);
    } else {
      logger.error("Could not write {}", outputFilename);
    }

    logger.info("Finished");
  }
}

// TODO write out center of output pointcloud
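The overlap test above boils down to: take the convex hull of the projected reference points, build a closed `Path2D.Double` from the hull vertices, and ask whether each projected input point is inside. A self-contained sketch of that idea using only the JDK (the `HullContainment` class and its monotone-chain hull are illustrative stand-ins for the commons-math `MonotoneChain` used by the tool):

```java
import java.awt.geom.Path2D;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class HullContainment {

  // Andrew's monotone chain convex hull over 2D points given as {x, y} arrays.
  static List<double[]> convexHull(List<double[]> pts) {
    List<double[]> sorted = new ArrayList<>(pts);
    sorted.sort((a, b) -> a[0] != b[0] ? Double.compare(a[0], b[0]) : Double.compare(a[1], b[1]));
    List<double[]> hull = new ArrayList<>();
    for (int pass = 0; pass < 2; pass++) {
      int start = hull.size();
      List<double[]> seq = new ArrayList<>(sorted);
      if (pass == 1) Collections.reverse(seq); // second pass builds the upper chain
      for (double[] p : seq) {
        while (hull.size() >= start + 2
            && cross(hull.get(hull.size() - 2), hull.get(hull.size() - 1), p) <= 0)
          hull.remove(hull.size() - 1); // drop clockwise turns
        hull.add(p);
      }
      hull.remove(hull.size() - 1); // endpoint repeats as the other chain's start
    }
    return hull;
  }

  // Cross product of (a - o) x (b - o); positive for a counter-clockwise turn.
  static double cross(double[] o, double[] a, double[] b) {
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0]);
  }

  // Build a closed polygon from the hull and test point containment.
  static boolean encloses(List<double[]> refPts, double x, double y) {
    Path2D.Double polygon = null;
    for (double[] v : convexHull(refPts)) {
      if (polygon == null) {
        polygon = new Path2D.Double();
        polygon.moveTo(v[0], v[1]);
      } else {
        polygon.lineTo(v[0], v[1]);
      }
    }
    polygon.closePath();
    return polygon.contains(x, y);
  }

  public static void main(String[] args) {
    // Unit square plus one interior point; the hull ignores the interior point.
    List<double[]> square = Arrays.asList(
        new double[] {0, 0}, new double[] {1, 0}, new double[] {1, 1},
        new double[] {0, 1}, new double[] {0.5, 0.5});
    System.out.println(encloses(square, 0.25, 0.25)); // prints true
    System.out.println(encloses(square, 2.0, 2.0));   // prints false
  }
}
```

Because only the hull outline matters, concavities in the reference cloud are filled in; points over such gaps still count as overlapping.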
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.awt.Rectangle;
@@ -42,348 +64,320 @@ import vtk.vtkPoints;
|
|||||||
|
|
||||||
public class PointCloudToPlane implements TerrasaurTool {
|
public class PointCloudToPlane implements TerrasaurTool {
|
||||||
|
|
||||||
private static final Logger logger = LogManager.getLogger();
|
private static final Logger logger = LogManager.getLogger();
|
||||||
|
|
||||||
@Override
|
@Override
|
||||||
public String shortDescription() {
|
public String shortDescription() {
|
||||||
return "Find a rotation and translation to transform a point cloud to a height field above the best fit plane.";
|
return "Find a rotation and translation to transform a point cloud to a height field above the best fit plane.";
|
||||||
}
|
}
|
||||||
|
|
||||||
@Override
|
@Override
|
||||||
public String fullDescription(Options options) {
|
public String fullDescription(Options options) {
|
||||||
String header = "";
|
String header = "";
|
||||||
String footer =
|
String footer =
|
||||||
"\nThis program finds a rotation and translation to transform a point cloud to a height field above the best fit plane. "
|
"\nThis program finds a rotation and translation to transform a point cloud to a height field above the best fit plane. "
|
||||||
+ "Supported input formats are ASCII, BINARY, L2, OBJ, and VTK.\n\n"
|
+ "Supported input formats are ASCII, BINARY, L2, OBJ, and VTK.\n\n"
|
||||||
+ "ASCII format is white spaced delimited x y z coordinates. BINARY files must contain double precision x y z coordinates. ";
|
+ "ASCII format is white spaced delimited x y z coordinates. BINARY files must contain double precision x y z coordinates. ";
|
||||||
return TerrasaurTool.super.fullDescription(options, header, footer);
|
return TerrasaurTool.super.fullDescription(options, header, footer);
|
||||||
}
|
}
|
||||||
|
|
||||||
private GMTGridUtil gmu;
|
private GMTGridUtil gmu;
|
||||||
|
|
||||||
public GMTGridUtil getGMU() {
|
public GMTGridUtil getGMU() {
|
||||||
return gmu;
|
return gmu;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void writeOutput(String outputFile) {
|
public void writeOutput(String outputFile) {
|
||||||
if (outputFile != null) {
|
if (outputFile != null) {
|
||||||
try (PrintStream ps = new PrintStream(outputFile)) {
|
try (PrintStream ps = new PrintStream(outputFile)) {
|
||||||
double[][] transformation = gmu.getTransformation();
|
double[][] transformation = gmu.getTransformation();
|
||||||
for (int i = 0; i < 4; i++) {
|
for (int i = 0; i < 4; i++) {
|
||||||
for (int j = 0; j < 4; j++) {
|
for (int j = 0; j < 4; j++) {
|
||||||
ps.printf("%24.16e ", transformation[i][j]);
|
ps.printf("%24.16e ", transformation[i][j]);
|
||||||
}
|
}
|
||||||
ps.println();
|
ps.println();
|
||||||
|
}
|
||||||
|
} catch (FileNotFoundException e) {
|
||||||
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
} catch (FileNotFoundException e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
private PointCloudToPlane() {}
|
|
||||||
|
|
||||||
public PointCloudToPlane(vtkPoints points) {
|
|
||||||
this(points, 0, 0.);
|
|
||||||
}
|
|
||||||
|
|
||||||
public PointCloudToPlane(vtkPoints points, int halfSize, double groundSampleDistance) {
|
|
||||||
double[] x = new double[(int) points.GetNumberOfPoints()];
|
|
||||||
double[] y = new double[(int) points.GetNumberOfPoints()];
|
|
||||||
double[] z = new double[(int) points.GetNumberOfPoints()];
|
|
||||||
|
|
||||||
for (int i = 0; i < points.GetNumberOfPoints(); i++) {
|
|
||||||
double[] thisPoint = points.GetPoint(i);
|
|
||||||
x[i] = thisPoint[0];
|
|
||||||
y[i] = thisPoint[1];
|
|
||||||
z[i] = thisPoint[2];
|
|
||||||
}
|
}
|
||||||
|
|
||||||
gmu = new GMTGridUtil(halfSize, groundSampleDistance);
|
private PointCloudToPlane() {}
|
||||||
gmu.setXYZ(x, y, z);
|
|
||||||
}
|
|
||||||
|
|
||||||
public BufferedImage makePlot(List<Vector3> points, String name) throws SpiceException {
|
public PointCloudToPlane(vtkPoints points) {
|
||||||
DescriptiveStatistics stats = new DescriptiveStatistics();
|
this(points, 0, 0.);
|
||||||
VectorStatistics vStats = new VectorStatistics();
|
|
||||||
DiscreteDataSet data = new DiscreteDataSet(name);
|
|
||||||
|
|
||||||
PlotConfig config = ImmutablePlotConfig.builder().title(name).build();
|
|
||||||
|
|
||||||
DiscreteDataPlot canvas;
|
|
||||||
boolean orthographic = false;
|
|
||||||
if (orthographic) {
|
|
||||||
for (Vector3 p : points) {
|
|
||||||
stats.addValue(p.getElt(2));
|
|
||||||
vStats.add(p);
|
|
||||||
LatitudinalCoordinates lc = new LatitudinalCoordinates(p);
|
|
||||||
data.add(lc.getLongitude(), lc.getLatitude(), 0, p.getElt(2));
|
|
||||||
}
|
|
||||||
|
|
||||||
double min = stats.getMin();
|
|
||||||
double max = stats.getMax();
|
|
||||||
ColorRamp ramp = ColorRamp.create(TYPE.CBSPECTRAL, min, max).createReverse();
|
|
||||||
data.setSymbol(new Circle().setSize(1.0));
|
|
||||||
data.setColorRamp(ramp);
|
|
||||||
|
|
||||||
Vector3 center = MathConversions.toVector3(vStats.getMean());
|
|
||||||
|
|
||||||
double halfExtent = 0;
|
|
||||||
for (Vector3 p : points) {
|
|
||||||
double dist = center.sep(p);
|
|
||||||
if (dist > halfExtent) halfExtent = dist;
|
|
||||||
}
|
|
||||||
|
|
||||||
ProjectionOrthographic p =
|
|
||||||
new ProjectionOrthographic(
|
|
||||||
config.width(),
|
|
||||||
config.height(),
|
|
||||||
CoordConverters.convertToLatitudinal(
|
|
||||||
new VectorIJK(center.getElt(0), center.getElt(1), center.getElt(2))));
|
|
||||||
p.setRadius(Math.max(0.5, .6 / halfExtent));
|
|
||||||
|
|
||||||
canvas = new MapPlot(config, p);
|
|
||||||
canvas.drawAxes();
|
|
||||||
canvas.plot(data);
|
|
||||||
((MapPlot) canvas).drawLatLonGrid(Math.toRadians(5), Math.toRadians(5), true);
|
|
||||||
|
|
||||||
canvas.drawColorBar(
|
|
||||||
ImmutableColorBar.builder()
|
|
||||||
.rect(new Rectangle(config.leftMargin(), 40, config.width(), 10))
|
|
||||||
.ramp(ramp)
|
|
||||||
.numTicks(5)
|
|
||||||
.tickFunction(StringFunctions.fixedFormat("%.3f"))
|
|
||||||
.build());
|
|
||||||
} else {
|
|
||||||
for (Vector3 p : points) {
|
|
||||||
stats.addValue(p.getElt(2));
|
|
||||||
vStats.add(p);
|
|
||||||
data.add(p.getElt(0), p.getElt(1), 0, p.getElt(2));
|
|
||||||
}
|
|
||||||
|
|
||||||
double min = stats.getMin();
|
|
||||||
double max = stats.getMax();
|
|
||||||
ColorRamp ramp = ColorRamp.create(TYPE.CBSPECTRAL, min, max).createReverse();
|
|
||||||
data.setSymbol(new Circle().setSize(1.0));
|
|
||||||
data.setColorRamp(ramp);
|
|
||||||
|
|
||||||
canvas = new DiscreteDataPlot(config);
|
|
||||||
AxisX xAxis = data.defaultXAxis("X");
|
|
||||||
AxisY yAxis = data.defaultYAxis("Y");
|
|
||||||
canvas.setAxes(xAxis, yAxis);
|
|
||||||
canvas.drawAxes();
|
|
||||||
canvas.plot(data);
|
|
||||||
|
|
||||||
canvas.drawColorBar(
|
|
||||||
ImmutableColorBar.builder()
|
|
||||||
.rect(new Rectangle(config.leftMargin(), 40, config.width(), 10))
|
|
||||||
.ramp(ramp)
|
|
||||||
.numTicks(5)
|
|
||||||
.tickFunction(StringFunctions.fixedFormat("%.3f"))
|
|
||||||
.build());
|
|
||||||
}
|
}
|
||||||
return canvas.getImage();
|
|
||||||
}
|
|
||||||
|
|
||||||
  public PointCloudToPlane(vtkPoints points, int halfSize, double groundSampleDistance) {
    double[] x = new double[(int) points.GetNumberOfPoints()];
    double[] y = new double[(int) points.GetNumberOfPoints()];
    double[] z = new double[(int) points.GetNumberOfPoints()];
    for (int i = 0; i < points.GetNumberOfPoints(); i++) {
      double[] thisPoint = points.GetPoint(i);
      x[i] = thisPoint[0];
      y[i] = thisPoint[1];
      z[i] = thisPoint[2];
    }

    gmu = new GMTGridUtil(halfSize, groundSampleDistance);
    gmu.setXYZ(x, y, z);
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(
        Option.builder("inputFormat")
            .hasArg()
            .desc(
                "Format of input file. If not present format is inferred from inputFile extension.")
            .build());
    options.addOption(
        Option.builder("inputFile")
            .required()
            .hasArg()
            .desc("Required. Name of input file.")
            .build());
    options.addOption(
        Option.builder("inllr")
            .desc(
                "If present, input values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
            .build());
    options.addOption(
        Option.builder("logFile")
            .hasArg()
            .desc("If present, save screen output to log file.")
            .build());
    StringBuilder sb = new StringBuilder();
    for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
    options.addOption(
        Option.builder("logLevel")
            .hasArg()
            .desc(
                "If present, print messages above selected priority. Valid values are "
                    + sb.toString().trim()
                    + ". Default is INFO.")
            .build());
    options.addOption(
        Option.builder("outputFile")
            .hasArg()
            .desc(
                "Name of output file to contain 4x4 transformation matrix. The top left 3x3 matrix is the rotation matrix. The top "
                    + "three entries in the right hand column are the translation vector. The bottom row is always 0 0 0 1.\nTo convert "
                    + "from global to local:\n transformed = rotation.mxv(point.sub(translation))")
            .build());
    options.addOption(
        Option.builder("translate")
            .hasArg()
            .desc(
                "Translate surface points and spacecraft position. "
                    + "Specify by three floating point numbers separated by commas. "
                    + "Default is to use centroid of input point cloud.")
            .build());
    options.addOption(
        Option.builder("plotXYZ")
            .hasArg()
            .desc(
                "Plot X vs Y (in the local frame) colored by Z. "
                    + "Argument is the name of PNG file to write.")
            .build());
    options.addOption(
        Option.builder("plotXYR")
            .hasArg()
            .desc(
                "Plot X vs Y (in the local frame) colored by R. "
                    + "Argument is the name of PNG file to write.")
            .build());
    options.addOption(
        Option.builder("slope")
            .desc(
                "Choose local coordinate frame such that Z points normal to the plane "
                    + "and X points along the direction of steepest descent.")
            .build());
    return options;
  }

  public static void main(String[] args) throws SpiceException {
    TerrasaurTool defaultOBJ = new PointCloudToPlane();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    NativeLibraryLoader.loadVtkLibraries();
    NativeLibraryLoader.loadSpiceLibraries();

    String inFile = cl.getOptionValue("inputFile");
    boolean inLLR = cl.hasOption("inllr");

    FORMATS inFormat =
        cl.hasOption("inputFormat")
            ? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
            : FORMATS.formatFromExtension(inFile);

    PointCloudFormatConverter pcfc = new PointCloudFormatConverter(inFormat, FORMATS.VTK);
    pcfc.read(inFile, inLLR);
    System.out.printf("%d points read from %s\n", pcfc.getPoints().GetNumberOfPoints(), inFile);

    int halfSize = 0;
    double groundSampleDistance = 0;

    vtkPoints points = pcfc.getPoints();
    PointCloudToPlane pctp = new PointCloudToPlane(points, halfSize, groundSampleDistance);

    Vector3 translation;
    if (cl.hasOption("translate")) {
      translation =
          MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("translate")));
      pctp.getGMU().setTranslation(translation.toArray());
    }
    pctp.getGMU().calculateTransformation();

    List<Vector3> globalPts = new ArrayList<>();
    for (int i = 0; i < points.GetNumberOfPoints(); i++)
      globalPts.add(new Vector3(points.GetPoint(i)));

    double[][] transformation = pctp.getGMU().getTransformation();
    StringBuilder sb =
        new StringBuilder(
            String.format(
                "translation vector:\n%24.16e%24.16e%24.16e\n",
                transformation[0][3], transformation[1][3], transformation[2][3]));
    logger.info(sb.toString());
    sb = new StringBuilder("rotation matrix:\n");
    for (int i = 0; i < 3; i++)
      sb.append(
          String.format(
              "%24.16e%24.16e%24.16e\n",
              transformation[i][0], transformation[i][1], transformation[i][2]));
    logger.info(sb.toString());

    Matrix33 rotation = new Matrix33(pctp.getGMU().getRotation());
    translation = new Vector3(pctp.getGMU().getTranslation());

    if (cl.hasOption("slope")) {
      Vector3 z = rotation.xpose().mxv(new Vector3(0, 0, 1));
      VectorStatistics vStats = new VectorStatistics();
      for (Vector3 pt : globalPts) vStats.add(pt);

      Vector3 r = MathConversions.toVector3(vStats.getMean());

      Vector3 y = r.cross(z).hat();
      Vector3 x = y.cross(z).hat();
      rotation = new Matrix33(x, y, z);
    }

    List<Vector3> localPts = new ArrayList<>();
    for (Vector3 p : globalPts) localPts.add(rotation.mxv(p.sub(translation)));

    VectorStatistics vStats = new VectorStatistics();
    for (Vector3 localPt : localPts) vStats.add(localPt);

    if (cl.hasOption("plotXYZ")) {
      BufferedImage image = pctp.makePlot(localPts, "Z (height above plane)");
      PlotCanvas.writeImage(cl.getOptionValue("plotXYZ"), image);
    }

    if (cl.hasOption("plotXYR")) {

      // rotate but don't translate
      List<Vector3> xyr = new ArrayList<>();
      for (Vector3 p : globalPts) {
        Vector3 v = rotation.mxv(p);
        xyr.add(new Vector3(v.getElt(0), v.getElt(1), v.norm()));
      }

      BufferedImage image = pctp.makePlot(xyr, "R");
      PlotCanvas.writeImage(cl.getOptionValue("plotXYR"), image);
    }

    logger.info("statistics on full set");
    logger.info(vStats);

    Vector3 mean = MathConversions.toVector3(vStats.getMean());
    Vector3 std = MathConversions.toVector3(vStats.getStandardDeviation());
    double scale = 5;
    List<Double> minList = new ArrayList<>();
    List<Double> maxList = new ArrayList<>();
    for (int i = 0; i < 3; i++) {
      minList.add(mean.getElt(i) - scale * std.getElt(i));
      maxList.add(mean.getElt(i) + scale * std.getElt(i));
    }

    vStats = new VectorStatistics();
    for (Vector3 v : localPts) {
      boolean addThis = true;
      for (int i = 0; i < 3; i++) {
        if (v.getElt(i) < minList.get(i) || v.getElt(i) > maxList.get(i)) {
          addThis = false;
          break;
        }
      }
      if (addThis) vStats.add(v);
    }

    logger.info("statistics on set without points more than 5 standard deviations from the mean:");
    logger.info(vStats);

    if (cl.hasOption("outputFile")) pctp.writeOutput(cl.getOptionValue("outputFile"));
  }
}
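The -outputFile option above documents the transform convention: the top-left 3x3 of the 4x4 matrix is the rotation, the right-hand column is the translation, and a global point maps into the local plane frame via transformed = rotation.mxv(point.sub(translation)). A minimal sketch of that convention, using plain double arrays in place of the tool's Matrix33/Vector3 classes (the class and method names below are illustrative, not part of Terrasaur):

```java
public class TransformSketch {

  /** Apply local = R * (p - t) for a 3x3 rotation R and translation t. */
  static double[] globalToLocal(double[][] rotation, double[] translation, double[] point) {
    // subtract the translation first...
    double[] d = new double[3];
    for (int i = 0; i < 3; i++) d[i] = point[i] - translation[i];
    // ...then rotate into the local frame
    double[] local = new double[3];
    for (int i = 0; i < 3; i++)
      for (int j = 0; j < 3; j++) local[i] += rotation[i][j] * d[j];
    return local;
  }

  public static void main(String[] args) {
    // Identity rotation with translation (1, 2, 3): the point (1, 2, 4) maps to (0, 0, 1).
    double[][] rot = {{1, 0, 0}, {0, 1, 0}, {0, 0, 1}};
    double[] t = {1, 2, 3};
    double[] local = globalToLocal(rot, t, new double[] {1, 2, 4});
    System.out.printf("%f %f %f%n", local[0], local[1], local[2]);
  }
}
```

The bottom row of the 4x4 matrix (0 0 0 1) exists only so the rotation and translation can be composed as a single homogeneous transform; it carries no data of its own.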
@@ -1,8 +1,29 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.util.ArrayList;
import java.util.Map;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;

@@ -23,50 +44,53 @@ import vtk.vtkPolyData;
 */
public class PrintShapeModelStatistics implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  private PrintShapeModelStatistics() {}

  @Override
  public String shortDescription() {
    return "Print statistics about a shape model.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "This program prints various statistics about a shape model in OBJ format.";
    String footer = "";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(
        Option.builder("objFile").required().hasArg().desc("Path to OBJ file.").build());
    return options;
  }

  public static void main(String[] args) throws Exception {
    TerrasaurTool defaultOBJ = new PrintShapeModelStatistics();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    String filename = cl.getOptionValue("objFile");

    NativeLibraryLoader.loadVtkLibraries();

    vtkPolyData polydata = PolyDataUtil.loadShapeModelAndComputeNormals(filename);

    PolyDataStatistics stat = new PolyDataStatistics(polydata);
    ArrayList<String> stats = stat.getShapeModelStats();
    for (String line : stats) {
      logger.info(line);
    }
  }
}
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.io.File;

@@ -24,384 +46,375 @@ import vtk.vtkIdList;
import vtk.vtkPolyData;

public class RangeFromSumFile implements TerrasaurTool {
  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Calculate range to surface from a sumfile.";
  }

  @Override
  public String fullDescription(Options options) {

    String header = "";
    String footer =
        """
        This program reads a sumfile along with a shape model and \
        calculates the range to the surface. NOTE: Spacecraft position is \
        assumed to be in kilometers. If not, use the -distanceScale option \
        to convert to km.
        """;
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private SumFile sumFile;
  private vtkPolyData polyData;
  private SmallBodyModel smallBodyModel;

  private int xOffset;
  private int yOffset;

  private long facet;

  private Vector3D scPos;
  private Vector3D sunXYZ;
  private Vector3D surfaceIntercept;

  private double tiltDeg;
  private double tiltDir;

  private double incidence;
  private double emission;
  private double phase;

  private double scAzimuth;
  private double scElevation;

  private double sunAzimuth;
  private double sunElevation;

  private DescriptiveStatistics stats;
  private double centerX, centerY;

  public DescriptiveStatistics getStats() {
    return stats;
  }

  private RangeFromSumFile() {}

  public RangeFromSumFile(SumFile sumFile, vtkPolyData polyData) {

    this.sumFile = sumFile;

    int nPixelsX = sumFile.imageWidth();
    int nPixelsY = sumFile.imageHeight();
    centerX = 0.5 * (nPixelsX - 1);
    centerY = 0.5 * (nPixelsY - 1);

    this.polyData = polyData;

    smallBodyModel = new SmallBodyModel(polyData);

    scPos = sumFile.scobj().negate();
    sunXYZ = sumFile.sunDirection();

    stats = new DescriptiveStatistics();
  }

  public void setDistanceScale(double distanceScale) {
    this.scPos = sumFile.scobj().scalarMultiply(distanceScale).negate();
  }

  /**
   * @param xOffset x offset in pixels
   * @param yOffset y offset in pixels
   * @return key is cell index, value is surface intercept for the desired pixel offset from the
   *     center of the image.
   */
  public Map.Entry<Long, Vector3D> findIntercept(int xOffset, int yOffset) {

    this.xOffset = xOffset;
    this.yOffset = yOffset;

    Vector3D lookDir = new Vector3D(1.0, sumFile.boresight());

    if (xOffset != 0) {
      Vector3D offset = new Vector3D(-xOffset, sumFile.xPerPixel());
      lookDir = lookDir.add(offset);
    }

    if (yOffset != 0) {
      Vector3D offset = new Vector3D(-yOffset, sumFile.yPerPixel());
      lookDir = lookDir.add(offset);
    }

    double[] tmp = new double[3];
    facet = smallBodyModel.computeRayIntersection(scPos.toArray(), lookDir.toArray(), tmp);

    if (facet == -1) {
      surfaceIntercept = null;
    } else {
      surfaceIntercept = new Vector3D(tmp);

      vtkIdList idList = new vtkIdList();
      double[] pt0 = new double[3];
      double[] pt1 = new double[3];
      double[] pt2 = new double[3];

      polyData.GetCellPoints(facet, idList);

      // get the ids for each point
      long id0 = idList.GetId(0);
      long id1 = idList.GetId(1);
      long id2 = idList.GetId(2);

      // get points that comprise the cell
      polyData.GetPoint(id0, pt0);
      polyData.GetPoint(id1, pt1);
      polyData.GetPoint(id2, pt2);

      TriangularFacet facet =
          new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));

      Vector3 center3 = MathConversions.toVector3(facet.getCenter());
      Vector3D center3D = MathConversions.toVector3D(facet.getCenter());
      Vector3 normal3 = MathConversions.toVector3(facet.getNormal());
      Vector3D normal3D = MathConversions.toVector3D(facet.getNormal());

      tiltDeg = Math.toDegrees(center3.sep(normal3));
      if (tiltDeg > 90) tiltDeg = 180 - tiltDeg;

      tiltDir = Tilts.basicTiltDirDeg(surfaceIntercept.getAlpha(), normal3D);
|
|
||||||
|
|
||||||
|
|
||||||
incidence = Vector3D.angle(sunXYZ, normal3D);
|
|
||||||
emission = Vector3D.angle(scPos, normal3D);
|
|
||||||
phase = Vector3D.angle(sunXYZ, scPos.subtract(center3D));
|
|
||||||
|
|
||||||
try {
|
|
||||||
// scPos is in body fixed coordinates
|
|
||||||
Plane p = new Plane(normal3, center3);
|
|
||||||
Vector3 projectedNorth = p.project(new Vector3(0, 0, 1).add(center3)).sub(center3);
|
|
||||||
Vector3 projected = p.project(MathConversions.toVector3(scPos)).sub(center3);
|
|
||||||
|
|
||||||
scAzimuth = projected.sep(projectedNorth);
|
|
||||||
if (projected.cross(projectedNorth).dot(center3) < 0) scAzimuth = 2 * Math.PI - scAzimuth;
|
|
||||||
scElevation = Math.PI / 2 - emission;
|
|
||||||
|
|
||||||
// sunXYZ is a unit vector pointing to the sun
|
|
||||||
projected = p.project(MathConversions.toVector3(sunXYZ).add(center3)).sub(center3);
|
|
||||||
|
|
||||||
sunAzimuth = projected.sep(projectedNorth);
|
|
||||||
if (projected.cross(projectedNorth).dot(center3) < 0) sunAzimuth = 2 * Math.PI - sunAzimuth;
|
|
||||||
sunElevation = Math.PI / 2 - incidence;
|
|
||||||
} catch (SpiceException e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
|
||||||
}
|
|
||||||
|
|
||||||
stats.addValue(scPos.distance(surfaceIntercept));
|
|
||||||
}
|
|
||||||
return new AbstractMap.SimpleEntry<>(facet, surfaceIntercept);
|
|
||||||
}
|
|
||||||
|
|
||||||
public String getHeader(String filename) {
|
|
||||||
StringBuffer sb = new StringBuffer();
|
|
||||||
sb.append("# x increases to the right and y increases down. Top left corner is 0, 0.\n");
|
|
||||||
sb.append(String.format("# %s\n", filename));
|
|
||||||
sb.append(String.format("%7s", "# x"));
|
|
||||||
sb.append(String.format("%7s", "y"));
|
|
||||||
sb.append(StringUtils.center("facet", 8));
|
|
||||||
sb.append(StringUtils.center("Tilt", 12));
|
|
||||||
sb.append(StringUtils.center("Tilt Dir", 12));
|
|
||||||
sb.append(StringUtils.center("s/c position XYZ", 36));
|
|
||||||
sb.append(StringUtils.center("surface intercept XYZ", 36));
|
|
||||||
sb.append(StringUtils.center("lon", 12));
|
|
||||||
sb.append(StringUtils.center("lat", 12));
|
|
||||||
sb.append(StringUtils.center("rad", 12));
|
|
||||||
sb.append(StringUtils.center("range", 12));
|
|
||||||
sb.append(StringUtils.center("inc", 12));
|
|
||||||
sb.append(StringUtils.center("ems", 12));
|
|
||||||
sb.append(StringUtils.center("phase", 12));
|
|
||||||
sb.append(StringUtils.center("s/c az", 12));
|
|
||||||
sb.append(StringUtils.center("s/c el", 12));
|
|
||||||
sb.append(StringUtils.center("sun az", 12));
|
|
||||||
sb.append(StringUtils.center("sun el", 12));
|
|
||||||
|
|
||||||
sb.append("\n");
|
|
||||||
sb.append(String.format("%7s", "# "));
|
|
||||||
sb.append(String.format("%7s", ""));
|
|
||||||
sb.append(String.format("%8s", ""));
|
|
||||||
sb.append(StringUtils.center("(deg)", 12));
|
|
||||||
sb.append(StringUtils.center("(deg)", 12));
|
|
||||||
sb.append(StringUtils.center("(km)", 36));
|
|
||||||
sb.append(StringUtils.center("(km)", 36));
|
|
||||||
sb.append(StringUtils.center("(deg)", 12));
|
|
||||||
sb.append(StringUtils.center("(deg)", 12));
|
|
||||||
sb.append(StringUtils.center("(km)", 12));
|
|
||||||
sb.append(StringUtils.center("(km)", 12));
|
|
||||||
sb.append(StringUtils.center("(deg)", 12));
|
|
||||||
sb.append(StringUtils.center("(deg)", 12));
|
|
||||||
sb.append(StringUtils.center("(deg)", 12));
|
|
||||||
sb.append(StringUtils.center("(deg)", 12));
|
|
||||||
sb.append(StringUtils.center("(deg)", 12));
|
|
||||||
sb.append(StringUtils.center("(deg)", 12));
|
|
||||||
sb.append(StringUtils.center("(deg)", 12));
|
|
||||||
return sb.toString();
|
|
||||||
}
|
|
||||||
|
|
||||||
@Override
|
|
||||||
public String toString() {
|
|
||||||
StringBuilder sb = new StringBuilder();
|
|
||||||
sb.append(String.format("%7.2f", xOffset + centerX));
|
|
||||||
sb.append(String.format("%7.2f", yOffset + centerY));
|
|
||||||
sb.append(String.format("%8d", facet));
|
|
||||||
sb.append(String.format("%12.6f", tiltDeg));
|
|
||||||
sb.append(String.format("%12.6f", tiltDir));
|
|
||||||
sb.append(String.format("%12.6f", scPos.getX()));
|
|
||||||
sb.append(String.format("%12.6f", scPos.getY()));
|
|
||||||
sb.append(String.format("%12.6f", scPos.getZ()));
|
|
||||||
sb.append(String.format("%12.6f", surfaceIntercept.getX()));
|
|
||||||
sb.append(String.format("%12.6f", surfaceIntercept.getY()));
|
|
||||||
sb.append(String.format("%12.6f", surfaceIntercept.getZ()));
|
|
||||||
|
|
||||||
double lon = Math.toDegrees(surfaceIntercept.getAlpha());
|
|
||||||
if (lon < 0) lon += 360;
|
|
||||||
sb.append(String.format("%12.6f", lon));
|
|
||||||
sb.append(String.format("%12.6f", Math.toDegrees(surfaceIntercept.getDelta())));
|
|
||||||
sb.append(String.format("%12.6f", surfaceIntercept.getNorm()));
|
|
||||||
|
|
||||||
sb.append(String.format("%12.6f", scPos.distance(surfaceIntercept)));
|
|
||||||
sb.append(String.format("%12.6f", Math.toDegrees(incidence)));
|
|
||||||
sb.append(String.format("%12.6f", Math.toDegrees(emission)));
|
|
||||||
sb.append(String.format("%12.6f", Math.toDegrees(phase)));
|
|
||||||
sb.append(String.format("%12.6f", Math.toDegrees(scAzimuth)));
|
|
||||||
sb.append(String.format("%12.6f", Math.toDegrees(scElevation)));
|
|
||||||
sb.append(String.format("%12.6f", Math.toDegrees(sunAzimuth)));
|
|
||||||
sb.append(String.format("%12.6f", Math.toDegrees(sunElevation)));
|
|
||||||
|
|
||||||
return sb.toString();
|
|
||||||
}
|
|
||||||
|
|
||||||
private static Options defineOptions() {
|
|
||||||
Options options = TerrasaurTool.defineOptions();
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("sumFile")
|
|
||||||
.required()
|
|
||||||
.hasArg()
|
|
||||||
.desc("Required. Name of sum file to read.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("objFile")
|
|
||||||
.required()
|
|
||||||
.hasArg()
|
|
||||||
.desc("Required. Name of OBJ shape file.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("pixelOffset")
|
|
||||||
.hasArg()
|
|
||||||
.desc(
|
|
||||||
"Pixel offset from center of image, given as a comma separated pair (no spaces). Default is 0,0. "
|
|
||||||
+ "x increases to the right and y increases down.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("xRange")
|
|
||||||
.hasArg()
|
|
||||||
.desc(
|
|
||||||
"Range of X pixel offsets from center of image, given as a comma separated triplet (xStart, xStop, xSpacing with no spaces). "
|
|
||||||
+ "For example -50,50,5.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("yRange")
|
|
||||||
.hasArg()
|
|
||||||
.desc(
|
|
||||||
"Range of Y pixel offsets from center of image, given as a comma separated triplet (yStart, yStop, ySpacing with no spaces). "
|
|
||||||
+ "For example -50,50,5.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("radius")
|
|
||||||
.hasArg()
|
|
||||||
.desc(
|
|
||||||
"Evaluate all pixels within specified distance (in pixels) of desired pixel. This value will be rounded to the nearest integer.")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("distanceScale")
|
|
||||||
.hasArg()
|
|
||||||
.desc(
|
|
||||||
"Spacecraft position is assumed to be in kilometers. If not, scale by this value (e.g. Use 0.001 if s/c pos is in meters).")
|
|
||||||
.build());
|
|
||||||
options.addOption(
|
|
||||||
Option.builder("stats")
|
|
||||||
.desc("Print out statistics about range to all selected pixels.")
|
|
||||||
.build());
|
|
||||||
return options;
|
|
||||||
}
|
|
||||||
|
|
||||||
public static void main(String[] args) throws Exception {
|
|
||||||
TerrasaurTool defaultOBJ = new RangeFromSumFile();
|
|
||||||
|
|
||||||
Options options = defineOptions();
|
|
||||||
|
|
||||||
CommandLine cl = defaultOBJ.parseArgs(args, options);
|
|
||||||
|
|
||||||
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
|
|
||||||
for (MessageLabel ml : startupMessages.keySet())
|
|
||||||
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
|
|
||||||
NativeLibraryLoader.loadSpiceLibraries();
|
|
||||||
NativeLibraryLoader.loadVtkLibraries();
|
|
||||||
|
|
||||||
SumFile sumFile = SumFile.fromFile(new File(cl.getOptionValue("sumFile")));
|
|
||||||
|
|
||||||
int xStart = 0;
|
|
||||||
int xStop = 1;
|
|
||||||
int xSpacing = 1;
|
|
||||||
int yStart = 0;
|
|
||||||
int yStop = 1;
|
|
||||||
int ySpacing = 1;
|
|
||||||
if (cl.hasOption("pixelOffset")) {
|
|
||||||
String[] parts = cl.getOptionValue("pixelOffset").split(",");
|
|
||||||
|
|
||||||
int x = Integer.parseInt(parts[0].trim());
|
|
||||||
int y = Integer.parseInt(parts[1].trim());
|
|
||||||
|
|
||||||
xStart = x;
|
|
||||||
xStop = x + 1;
|
|
||||||
yStart = y;
|
|
||||||
yStop = y + 1;
|
|
||||||
}
|
}
|
||||||
|
|
||||||
if (cl.hasOption("xRange")) {
|
public void setDistanceScale(double distanceScale) {
|
||||||
String[] parts = cl.getOptionValue("xRange").split(",");
|
this.scPos = sumFile.scobj().scalarMultiply(distanceScale).negate();
|
||||||
xStart = Integer.parseInt(parts[0].trim());
|
|
||||||
xStop = Integer.parseInt(parts[1].trim());
|
|
||||||
xSpacing = Integer.parseInt(parts[2].trim());
|
|
||||||
}
|
}
|
||||||
|
|
||||||
if (cl.hasOption("yRange")) {
|
/**
|
||||||
String[] parts = cl.getOptionValue("yRange").split(",");
|
* @param xOffset x offset in pixels
|
||||||
yStart = Integer.parseInt(parts[0].trim());
|
* @param yOffset y offset in pixels
|
||||||
yStop = Integer.parseInt(parts[1].trim());
|
* @return key is cell index, value is surface intercept for the desired pixel offset from the center of the image.
|
||||||
ySpacing = Integer.parseInt(parts[2].trim());
|
*/
|
||||||
}
|
public Map.Entry<Long, Vector3D> findIntercept(int xOffset, int yOffset) {
|
||||||
|
|
||||||
int checkRadius = 0;
|
this.xOffset = xOffset;
|
||||||
if (cl.hasOption("radius")) {
|
this.yOffset = yOffset;
|
||||||
checkRadius = (int) Math.round(Double.parseDouble(cl.getOptionValue("radius")));
|
|
||||||
xStart -= checkRadius;
|
|
||||||
xStop += checkRadius;
|
|
||||||
yStart -= checkRadius;
|
|
||||||
yStop += checkRadius;
|
|
||||||
}
|
|
||||||
|
|
||||||
String objFile = cl.getOptionValue("objFile");
|
Vector3D lookDir = new Vector3D(1.0, sumFile.boresight());
|
||||||
vtkPolyData polyData = PolyDataUtil.loadShapeModel(objFile);
|
|
||||||
RangeFromSumFile rfsf = new RangeFromSumFile(sumFile, polyData);
|
|
||||||
|
|
||||||
if (cl.hasOption("distanceScale"))
|
if (xOffset != 0) {
|
||||||
rfsf.setDistanceScale(Double.parseDouble(cl.getOptionValue("distanceScale")));
|
Vector3D offset = new Vector3D(-xOffset, sumFile.xPerPixel());
|
||||||
|
lookDir = lookDir.add(offset);
|
||||||
System.out.println(rfsf.getHeader(cl.getOptionValue("sumFile")));
|
|
||||||
|
|
||||||
for (int ix = xStart; ix < xStop; ix += xSpacing) {
|
|
||||||
for (int iy = yStart; iy < yStop; iy += ySpacing) {
|
|
||||||
if (checkRadius > 0) {
|
|
||||||
double midx = (xStart + xStop) / 2.;
|
|
||||||
double midy = (yStart + yStop) / 2.;
|
|
||||||
if ((ix - midx) * (ix - midx) + (iy - midy) * (iy - midy) > checkRadius * checkRadius)
|
|
||||||
continue;
|
|
||||||
}
|
}
|
||||||
long cellID = rfsf.findIntercept(ix, iy).getKey();
|
|
||||||
if (cellID > -1) System.out.println(rfsf);
|
if (yOffset != 0) {
|
||||||
}
|
Vector3D offset = new Vector3D(-yOffset, sumFile.yPerPixel());
|
||||||
|
lookDir = lookDir.add(offset);
|
||||||
|
}
|
||||||
|
|
||||||
|
double[] tmp = new double[3];
|
||||||
|
facet = smallBodyModel.computeRayIntersection(scPos.toArray(), lookDir.toArray(), tmp);
|
||||||
|
|
||||||
|
if (facet == -1) {
|
||||||
|
surfaceIntercept = null;
|
||||||
|
} else {
|
||||||
|
surfaceIntercept = new Vector3D(tmp);
|
||||||
|
|
||||||
|
vtkIdList idList = new vtkIdList();
|
||||||
|
double[] pt0 = new double[3];
|
||||||
|
double[] pt1 = new double[3];
|
||||||
|
double[] pt2 = new double[3];
|
||||||
|
|
||||||
|
polyData.GetCellPoints(facet, idList);
|
||||||
|
|
||||||
|
// get the ids for each point
|
||||||
|
long id0 = idList.GetId(0);
|
||||||
|
long id1 = idList.GetId(1);
|
||||||
|
long id2 = idList.GetId(2);
|
||||||
|
|
||||||
|
// get points that comprise the cell
|
||||||
|
polyData.GetPoint(id0, pt0);
|
||||||
|
polyData.GetPoint(id1, pt1);
|
||||||
|
polyData.GetPoint(id2, pt2);
|
||||||
|
|
||||||
|
TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
|
||||||
|
|
||||||
|
Vector3 center3 = MathConversions.toVector3(facet.getCenter());
|
||||||
|
Vector3D center3D = MathConversions.toVector3D(facet.getCenter());
|
||||||
|
Vector3 normal3 = MathConversions.toVector3(facet.getNormal());
|
||||||
|
Vector3D normal3D = MathConversions.toVector3D(facet.getNormal());
|
||||||
|
|
||||||
|
tiltDeg = Math.toDegrees(center3.sep(normal3));
|
||||||
|
if (tiltDeg > 90) tiltDeg = 180 - tiltDeg;
|
||||||
|
|
||||||
|
tiltDir = Tilts.basicTiltDirDeg(surfaceIntercept.getAlpha(), normal3D);
|
||||||
|
|
||||||
|
incidence = Vector3D.angle(sunXYZ, normal3D);
|
||||||
|
emission = Vector3D.angle(scPos, normal3D);
|
||||||
|
phase = Vector3D.angle(sunXYZ, scPos.subtract(center3D));
|
||||||
|
|
||||||
|
try {
|
||||||
|
// scPos is in body fixed coordinates
|
||||||
|
Plane p = new Plane(normal3, center3);
|
||||||
|
Vector3 projectedNorth =
|
||||||
|
p.project(new Vector3(0, 0, 1).add(center3)).sub(center3);
|
||||||
|
Vector3 projected = p.project(MathConversions.toVector3(scPos)).sub(center3);
|
||||||
|
|
||||||
|
scAzimuth = projected.sep(projectedNorth);
|
||||||
|
if (projected.cross(projectedNorth).dot(center3) < 0) scAzimuth = 2 * Math.PI - scAzimuth;
|
||||||
|
scElevation = Math.PI / 2 - emission;
|
||||||
|
|
||||||
|
// sunXYZ is a unit vector pointing to the sun
|
||||||
|
projected = p.project(MathConversions.toVector3(sunXYZ).add(center3))
|
||||||
|
.sub(center3);
|
||||||
|
|
||||||
|
sunAzimuth = projected.sep(projectedNorth);
|
||||||
|
if (projected.cross(projectedNorth).dot(center3) < 0) sunAzimuth = 2 * Math.PI - sunAzimuth;
|
||||||
|
sunElevation = Math.PI / 2 - incidence;
|
||||||
|
} catch (SpiceException e) {
|
||||||
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
|
}
|
||||||
|
|
||||||
|
stats.addValue(scPos.distance(surfaceIntercept));
|
||||||
|
}
|
||||||
|
return new AbstractMap.SimpleEntry<>(facet, surfaceIntercept);
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getHeader(String filename) {
|
||||||
|
StringBuffer sb = new StringBuffer();
|
||||||
|
sb.append("# x increases to the right and y increases down. Top left corner is 0, 0.\n");
|
||||||
|
sb.append(String.format("# %s\n", filename));
|
||||||
|
sb.append(String.format("%7s", "# x"));
|
||||||
|
sb.append(String.format("%7s", "y"));
|
||||||
|
sb.append(StringUtils.center("facet", 8));
|
||||||
|
sb.append(StringUtils.center("Tilt", 12));
|
||||||
|
sb.append(StringUtils.center("Tilt Dir", 12));
|
||||||
|
sb.append(StringUtils.center("s/c position XYZ", 36));
|
||||||
|
sb.append(StringUtils.center("surface intercept XYZ", 36));
|
||||||
|
sb.append(StringUtils.center("lon", 12));
|
||||||
|
sb.append(StringUtils.center("lat", 12));
|
||||||
|
sb.append(StringUtils.center("rad", 12));
|
||||||
|
sb.append(StringUtils.center("range", 12));
|
||||||
|
sb.append(StringUtils.center("inc", 12));
|
||||||
|
sb.append(StringUtils.center("ems", 12));
|
||||||
|
sb.append(StringUtils.center("phase", 12));
|
||||||
|
sb.append(StringUtils.center("s/c az", 12));
|
||||||
|
sb.append(StringUtils.center("s/c el", 12));
|
||||||
|
sb.append(StringUtils.center("sun az", 12));
|
||||||
|
sb.append(StringUtils.center("sun el", 12));
|
||||||
|
|
||||||
|
sb.append("\n");
|
||||||
|
sb.append(String.format("%7s", "# "));
|
||||||
|
sb.append(String.format("%7s", ""));
|
||||||
|
sb.append(String.format("%8s", ""));
|
||||||
|
sb.append(StringUtils.center("(deg)", 12));
|
||||||
|
sb.append(StringUtils.center("(deg)", 12));
|
||||||
|
sb.append(StringUtils.center("(km)", 36));
|
||||||
|
sb.append(StringUtils.center("(km)", 36));
|
||||||
|
sb.append(StringUtils.center("(deg)", 12));
|
||||||
|
sb.append(StringUtils.center("(deg)", 12));
|
||||||
|
sb.append(StringUtils.center("(km)", 12));
|
||||||
|
sb.append(StringUtils.center("(km)", 12));
|
||||||
|
sb.append(StringUtils.center("(deg)", 12));
|
||||||
|
sb.append(StringUtils.center("(deg)", 12));
|
||||||
|
sb.append(StringUtils.center("(deg)", 12));
|
||||||
|
sb.append(StringUtils.center("(deg)", 12));
|
||||||
|
sb.append(StringUtils.center("(deg)", 12));
|
||||||
|
sb.append(StringUtils.center("(deg)", 12));
|
||||||
|
sb.append(StringUtils.center("(deg)", 12));
|
||||||
|
return sb.toString();
|
||||||
|
}
|
||||||
|
|
||||||
|
@Override
|
||||||
|
public String toString() {
|
||||||
|
StringBuilder sb = new StringBuilder();
|
||||||
|
sb.append(String.format("%7.2f", xOffset + centerX));
|
||||||
|
sb.append(String.format("%7.2f", yOffset + centerY));
|
||||||
|
sb.append(String.format("%8d", facet));
|
||||||
|
sb.append(String.format("%12.6f", tiltDeg));
|
||||||
|
sb.append(String.format("%12.6f", tiltDir));
|
||||||
|
sb.append(String.format("%12.6f", scPos.getX()));
|
||||||
|
sb.append(String.format("%12.6f", scPos.getY()));
|
||||||
|
sb.append(String.format("%12.6f", scPos.getZ()));
|
||||||
|
sb.append(String.format("%12.6f", surfaceIntercept.getX()));
|
||||||
|
sb.append(String.format("%12.6f", surfaceIntercept.getY()));
|
||||||
|
sb.append(String.format("%12.6f", surfaceIntercept.getZ()));
|
||||||
|
|
||||||
|
double lon = Math.toDegrees(surfaceIntercept.getAlpha());
|
||||||
|
if (lon < 0) lon += 360;
|
||||||
|
sb.append(String.format("%12.6f", lon));
|
||||||
|
sb.append(String.format("%12.6f", Math.toDegrees(surfaceIntercept.getDelta())));
|
||||||
|
sb.append(String.format("%12.6f", surfaceIntercept.getNorm()));
|
||||||
|
|
||||||
|
sb.append(String.format("%12.6f", scPos.distance(surfaceIntercept)));
|
||||||
|
sb.append(String.format("%12.6f", Math.toDegrees(incidence)));
|
||||||
|
sb.append(String.format("%12.6f", Math.toDegrees(emission)));
|
||||||
|
sb.append(String.format("%12.6f", Math.toDegrees(phase)));
|
||||||
|
sb.append(String.format("%12.6f", Math.toDegrees(scAzimuth)));
|
||||||
|
sb.append(String.format("%12.6f", Math.toDegrees(scElevation)));
|
||||||
|
sb.append(String.format("%12.6f", Math.toDegrees(sunAzimuth)));
|
||||||
|
sb.append(String.format("%12.6f", Math.toDegrees(sunElevation)));
|
||||||
|
|
||||||
|
return sb.toString();
|
||||||
|
}
|
||||||
|
|
||||||
|
private static Options defineOptions() {
|
||||||
|
Options options = TerrasaurTool.defineOptions();
|
||||||
|
options.addOption(Option.builder("sumFile")
|
||||||
|
.required()
|
||||||
|
.hasArg()
|
||||||
|
.desc("Required. Name of sum file to read.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("objFile")
|
||||||
|
.required()
|
||||||
|
.hasArg()
|
||||||
|
.desc("Required. Name of OBJ shape file.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("pixelOffset")
|
||||||
|
.hasArg()
|
||||||
|
.desc(
|
||||||
|
"Pixel offset from center of image, given as a comma separated pair (no spaces). Default is 0,0. "
|
||||||
|
+ "x increases to the right and y increases down.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("xRange")
|
||||||
|
.hasArg()
|
||||||
|
.desc(
|
||||||
|
"Range of X pixel offsets from center of image, given as a comma separated triplet (xStart, xStop, xSpacing with no spaces). "
|
||||||
|
+ "For example -50,50,5.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("yRange")
|
||||||
|
.hasArg()
|
||||||
|
.desc(
|
||||||
|
"Range of Y pixel offsets from center of image, given as a comma separated triplet (yStart, yStop, ySpacing with no spaces). "
|
||||||
|
+ "For example -50,50,5.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("radius")
|
||||||
|
.hasArg()
|
||||||
|
.desc(
|
||||||
|
"Evaluate all pixels within specified distance (in pixels) of desired pixel. This value will be rounded to the nearest integer.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("distanceScale")
|
||||||
|
.hasArg()
|
||||||
|
.desc(
|
||||||
|
"Spacecraft position is assumed to be in kilometers. If not, scale by this value (e.g. Use 0.001 if s/c pos is in meters).")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("stats")
|
||||||
|
.desc("Print out statistics about range to all selected pixels.")
|
||||||
|
.build());
|
||||||
|
return options;
|
||||||
|
}
|
||||||
|
|
||||||
|
public static void main(String[] args) throws Exception {
|
||||||
|
TerrasaurTool defaultOBJ = new RangeFromSumFile();
|
||||||
|
|
||||||
|
Options options = defineOptions();
|
||||||
|
|
||||||
|
CommandLine cl = defaultOBJ.parseArgs(args, options);
|
||||||
|
|
||||||
|
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
|
||||||
|
for (MessageLabel ml : startupMessages.keySet())
|
||||||
|
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
|
||||||
|
NativeLibraryLoader.loadSpiceLibraries();
|
||||||
|
NativeLibraryLoader.loadVtkLibraries();
|
||||||
|
|
||||||
|
SumFile sumFile = SumFile.fromFile(new File(cl.getOptionValue("sumFile")));
|
||||||
|
|
||||||
|
int xStart = 0;
|
||||||
|
int xStop = 1;
|
||||||
|
int xSpacing = 1;
|
||||||
|
int yStart = 0;
|
||||||
|
int yStop = 1;
|
||||||
|
int ySpacing = 1;
|
||||||
|
if (cl.hasOption("pixelOffset")) {
|
||||||
|
String[] parts = cl.getOptionValue("pixelOffset").split(",");
|
||||||
|
|
||||||
|
int x = Integer.parseInt(parts[0].trim());
|
||||||
|
int y = Integer.parseInt(parts[1].trim());
|
||||||
|
|
||||||
|
xStart = x;
|
||||||
|
xStop = x + 1;
|
||||||
|
yStart = y;
|
||||||
|
yStop = y + 1;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (cl.hasOption("xRange")) {
|
||||||
|
String[] parts = cl.getOptionValue("xRange").split(",");
|
||||||
|
xStart = Integer.parseInt(parts[0].trim());
|
||||||
|
xStop = Integer.parseInt(parts[1].trim());
|
||||||
|
xSpacing = Integer.parseInt(parts[2].trim());
|
||||||
|
}
|
||||||
|
|
||||||
|
if (cl.hasOption("yRange")) {
|
||||||
|
String[] parts = cl.getOptionValue("yRange").split(",");
|
||||||
|
yStart = Integer.parseInt(parts[0].trim());
|
||||||
|
yStop = Integer.parseInt(parts[1].trim());
|
||||||
|
ySpacing = Integer.parseInt(parts[2].trim());
|
||||||
|
}
|
||||||
|
|
||||||
|
int checkRadius = 0;
|
||||||
|
if (cl.hasOption("radius")) {
|
||||||
|
checkRadius = (int) Math.round(Double.parseDouble(cl.getOptionValue("radius")));
|
||||||
|
xStart -= checkRadius;
|
||||||
|
xStop += checkRadius;
|
||||||
|
yStart -= checkRadius;
|
||||||
|
yStop += checkRadius;
|
||||||
|
}
|
||||||
|
|
||||||
|
String objFile = cl.getOptionValue("objFile");
|
||||||
|
vtkPolyData polyData = PolyDataUtil.loadShapeModel(objFile);
|
||||||
|
RangeFromSumFile rfsf = new RangeFromSumFile(sumFile, polyData);
|
||||||
|
|
||||||
|
if (cl.hasOption("distanceScale"))
|
||||||
|
rfsf.setDistanceScale(Double.parseDouble(cl.getOptionValue("distanceScale")));
|
||||||
|
|
||||||
|
System.out.println(rfsf.getHeader(cl.getOptionValue("sumFile")));
|
||||||
|
|
||||||
|
for (int ix = xStart; ix < xStop; ix += xSpacing) {
|
||||||
|
for (int iy = yStart; iy < yStop; iy += ySpacing) {
|
||||||
|
if (checkRadius > 0) {
|
||||||
|
double midx = (xStart + xStop) / 2.;
|
||||||
|
double midy = (yStart + yStop) / 2.;
|
||||||
|
if ((ix - midx) * (ix - midx) + (iy - midy) * (iy - midy) > checkRadius * checkRadius) continue;
|
||||||
|
}
|
||||||
|
long cellID = rfsf.findIntercept(ix, iy).getKey();
|
||||||
|
if (cellID > -1) System.out.println(rfsf);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (cl.hasOption("stats")) System.out.println("Range " + rfsf.getStats());
|
||||||
}
|
}
|
||||||
if (cl.hasOption("stats")) System.out.println("Range " + rfsf.getStats());
|
|
||||||
}
|
|
||||||
}
|
}
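The surface azimuths above are computed by projecting a direction (to the spacecraft or Sun) onto the facet plane and measuring its separation from projected north, with a cross product against the local up vector resolving the full 0&ndash;360&deg; range. A minimal self-contained sketch of that geometry, using plain arrays instead of the Terrasaur/SPICE vector classes (all names here are illustrative, not the tool's API):

```java
// Sketch of the plane-projection azimuth logic from findIntercept(),
// written with plain double[] vectors and java.lang.Math only.
public class AzimuthSketch {

  static double dot(double[] a, double[] b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
  }

  static double[] cross(double[] a, double[] b) {
    return new double[] {
      a[1] * b[2] - a[2] * b[1], a[2] * b[0] - a[0] * b[2], a[0] * b[1] - a[1] * b[0]
    };
  }

  static double norm(double[] a) {
    return Math.sqrt(dot(a, a));
  }

  // Remove the component of v along the unit normal n: projection onto the facet plane.
  static double[] projectOntoPlane(double[] v, double[] n) {
    double d = dot(v, n);
    return new double[] {v[0] - d * n[0], v[1] - d * n[1], v[2] - d * n[2]};
  }

  // Azimuth of dir, measured from north within the plane normal to up; the
  // cross-product sign test resolves angles beyond 180 degrees.
  static double azimuth(double[] dir, double[] north, double[] up) {
    double[] p = projectOntoPlane(dir, up);
    double[] pNorth = projectOntoPlane(north, up);
    double c = dot(p, pNorth) / (norm(p) * norm(pNorth));
    c = Math.max(-1.0, Math.min(1.0, c)); // guard acos against rounding
    double sep = Math.acos(c);
    if (dot(cross(p, pNorth), up) < 0) sep = 2 * Math.PI - sep;
    return sep;
  }

  public static void main(String[] args) {
    double[] up = {0, 0, 1};    // facet normal plays the role of local "up"
    double[] north = {0, 1, 0}; // direction projected as north
    double az = azimuth(new double[] {1, 0, 0.5}, north, up);
    System.out.println(Math.toDegrees(az));
  }
}
```

A direction due "east" of the facet comes out 90&deg; from projected north under this sign convention; flipping the cross-product test would yield 270&deg;.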
File diff suppressed because it is too large
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.io.File;
@@ -20,201 +42,228 @@ import terrasaur.templates.TerrasaurTool;
public class RotationConversion implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Convert rotations between different types.";
  }

  @Override
  public String fullDescription(Options options) {

    String header = "";
    String footer =
        """
        This program converts rotations between angle and axis, 3x3 matrix, quaternions, \
        and ZXZ rotation Euler angles. Note that the rotation modifies the frame; \
        the vector is considered to be fixed. To find the rotation that modifies the \
        vector in a fixed frame, take the transpose of this matrix.
        """;

    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("logFile").hasArg()
        .desc("If present, save screen output to log file.").build());
    StringBuilder sb = new StringBuilder();
    for (StandardLevel l : StandardLevel.values())
      sb.append(String.format("%s ", l.name()));
    options.addOption(Option.builder("logLevel").hasArg()
        .desc("If present, print messages above selected priority. Valid values are "
            + sb.toString().trim() + ". Default is INFO.")
        .build());

    options.addOption(Option.builder("angle").hasArg().desc("Rotation angle, in radians.").build());
    options.addOption(
        Option.builder("axis0").hasArg().desc("First element of rotation axis.").build());
    options.addOption(
        Option.builder("axis1").hasArg().desc("Second element of rotation axis.").build());
    options.addOption(
        Option.builder("axis2").hasArg().desc("Third element of rotation axis.").build());
    options.addOption(Option.builder("cardanXYZ1").hasArg()
        .desc("Cardan angle for the first rotation (about the X axis) in radians.").build());
    options.addOption(Option.builder("cardanXYZ2").hasArg()
        .desc("Cardan angle for the second rotation (about the Y axis) in radians.").build());
    options.addOption(Option.builder("cardanXYZ3").hasArg()
        .desc("Cardan angle for the third rotation (about the Z axis) in radians.").build());
    options.addOption(Option.builder("eulerZXZ1").hasArg()
        .desc("Euler angle for the first rotation (about the Z axis) in radians.").build());
    options.addOption(Option.builder("eulerZXZ2").hasArg()
        .desc("Euler angle for the second rotation (about the rotated X axis) in radians.")
        .build());
    options.addOption(Option.builder("eulerZXZ3").hasArg()
        .desc("Euler angle for the third rotation (about the rotated Z axis) in radians.").build());
    options.addOption(
        Option.builder("q0").hasArg().desc("Scalar term for quaternion: cos(theta/2)").build());
    options.addOption(Option.builder("q1").hasArg()
        .desc("First vector term for quaternion: sin(theta/2) * V[0]").build());
    options.addOption(Option.builder("q2").hasArg()
        .desc("Second vector term for quaternion: sin(theta/2) * V[1]").build());
    options.addOption(Option.builder("q3").hasArg()
        .desc("Third vector term for quaternion: sin(theta/2) * V[2]").build());
    options.addOption(Option.builder("matrix").hasArg()
        .desc("Name of file containing rotation matrix to convert to Euler angles. "
            + "Format is 3x3 array in plain text separated by white space.")
        .build());
    options.addOption(Option.builder("anglesInDegrees").desc(
        "If present, input angles in degrees and print output angles in degrees. Default is false.")
        .build());
    return options;
  }

  public static void main(String[] args) throws Exception {
    RotationConversion defaultOBJ = new RotationConversion();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    boolean inDegrees = cl.hasOption("anglesInDegrees");

    boolean axisAndAngle = cl.hasOption("angle") && cl.hasOption("axis0") && cl.hasOption("axis1")
        && cl.hasOption("axis2");
    boolean cardanXYZ =
        cl.hasOption("cardanXYZ1") && cl.hasOption("cardanXYZ2") && cl.hasOption("cardanXYZ3");
    // Note: the original checked eulerZXZ3 twice and never eulerZXZ2.
    boolean eulerZXZ =
        cl.hasOption("eulerZXZ1") && cl.hasOption("eulerZXZ2") && cl.hasOption("eulerZXZ3");
    boolean quaternion =
        cl.hasOption("q0") && cl.hasOption("q1") && cl.hasOption("q2") && cl.hasOption("q3");
    boolean matrix = cl.hasOption("matrix");

    if (!(axisAndAngle || cardanXYZ || eulerZXZ || quaternion || matrix)) {
      logger.warn(
||||||
"Must specify input rotation as axis and angle, Cardan or Euler angles, matrix, or quaternion.");
|
|
||||||
System.exit(0);
|
|
||||||
}
|
}
|
||||||
|
|
||||||
Rotation r = null;
|
private static Options defineOptions() {
|
||||||
if (matrix) {
|
Options options = TerrasaurTool.defineOptions();
|
||||||
List<String> lines =
|
options.addOption(Option.builder("logFile")
|
||||||
FileUtils.readLines(new File(cl.getOptionValue("matrix")), Charset.defaultCharset());
|
.hasArg()
|
||||||
double[][] m = new double[3][3];
|
.desc("If present, save screen output to log file.")
|
||||||
for (int i = 0; i < 3; i++) {
|
.build());
|
||||||
String[] parts = lines.get(i).trim().split("\\s+");
|
StringBuilder sb = new StringBuilder();
|
||||||
for (int j = 0; j < 3; j++)
|
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
|
||||||
m[i][j] = Double.parseDouble(parts[j].trim());
|
options.addOption(Option.builder("logLevel")
|
||||||
}
|
.hasArg()
|
||||||
r = new Rotation(m, 1e-10);
|
.desc("If present, print messages above selected priority. Valid values are "
|
||||||
|
+ sb.toString().trim() + ". Default is INFO.")
|
||||||
|
.build());
|
||||||
|
|
||||||
|
options.addOption(Option.builder("angle")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Rotation angle, in radians.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("axis0")
|
||||||
|
.hasArg()
|
||||||
|
.desc("First element of rotation axis.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("axis1")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Second element of rotation axis.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("axis2")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Third element of rotation axis.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("cardanXYZ1")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Cardan angle for the first rotation (about the X axis) in radians.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("cardanXYZ2")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Cardan angle for the second rotation (about the Y axis) in radians.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("cardanXYZ3")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Cardan angle for the third rotation (about the Z axis) in radians.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("eulerZXZ1")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Euler angle for the first rotation (about the Z axis) in radians.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("eulerZXZ2")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Euler angle for the second rotation (about the rotated X axis) in radians.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("eulerZXZ3")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Euler angle for the third rotation (about the rotated Z axis) in radians.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("q0")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Scalar term for quaternion: cos(theta/2)")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("q1")
|
||||||
|
.hasArg()
|
||||||
|
.desc("First vector term for quaternion: sin(theta/2) * V[0]")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("q2")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Second vector term for quaternion: sin(theta/2) * V[1]")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("q3")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Third vector term for quaternion: sin(theta/2) * V[2]")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("matrix")
|
||||||
|
.hasArg()
|
||||||
|
.desc("name of file containing rotation matrix to convert to Euler angles. "
|
||||||
|
+ "Format is 3x3 array in plain text separated by white space.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("anglesInDegrees")
|
||||||
|
.desc("If present, input angles in degrees and print output angles in degrees. Default is false.")
|
||||||
|
.build());
|
||||||
|
return options;
|
||||||
}
|
}
|
||||||
|
|
||||||
if (axisAndAngle) {
|
public static void main(String[] args) throws Exception {
|
||||||
double angle = Double.parseDouble(cl.getOptionValue("angle").trim());
|
RotationConversion defaultOBJ = new RotationConversion();
|
||||||
if (inDegrees)
|
|
||||||
angle = Math.toRadians(angle);
|
Options options = defineOptions();
|
||||||
r = new Rotation(
|
|
||||||
new Vector3D(Double.parseDouble(cl.getOptionValue("axis0").trim()),
|
CommandLine cl = defaultOBJ.parseArgs(args, options);
|
||||||
Double.parseDouble(cl.getOptionValue("axis1").trim()),
|
|
||||||
Double.parseDouble(cl.getOptionValue("axis2").trim())),
|
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
|
||||||
angle, RotationConvention.FRAME_TRANSFORM);
|
for (MessageLabel ml : startupMessages.keySet())
|
||||||
|
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
|
||||||
|
boolean inDegrees = cl.hasOption("anglesInDegrees");
|
||||||
|
|
||||||
|
boolean axisAndAngle =
|
||||||
|
cl.hasOption("angle") && cl.hasOption("axis0") && cl.hasOption("axis1") && cl.hasOption("axis2");
|
||||||
|
boolean cardanXYZ = cl.hasOption("cardanXYZ1") && cl.hasOption("cardanXYZ2") && cl.hasOption("cardanXYZ3");
|
||||||
|
boolean eulerZXZ = cl.hasOption("eulerZXZ1") && cl.hasOption("eulerZXZ3") && cl.hasOption("eulerZXZ3");
|
||||||
|
boolean quaternion = cl.hasOption("q0") && cl.hasOption("q1") && cl.hasOption("q2") && cl.hasOption("q3");
|
||||||
|
boolean matrix = cl.hasOption("matrix");
|
||||||
|
|
||||||
|
if (!(axisAndAngle || cardanXYZ || eulerZXZ || quaternion || matrix)) {
|
||||||
|
logger.warn(
|
||||||
|
"Must specify input rotation as axis and angle, Cardan or Euler angles, matrix, or quaternion.");
|
||||||
|
System.exit(0);
|
||||||
|
}
|
||||||
|
|
||||||
|
Rotation r = null;
|
||||||
|
if (matrix) {
|
||||||
|
List<String> lines = FileUtils.readLines(new File(cl.getOptionValue("matrix")), Charset.defaultCharset());
|
||||||
|
double[][] m = new double[3][3];
|
||||||
|
for (int i = 0; i < 3; i++) {
|
||||||
|
String[] parts = lines.get(i).trim().split("\\s+");
|
||||||
|
for (int j = 0; j < 3; j++) m[i][j] = Double.parseDouble(parts[j].trim());
|
||||||
|
}
|
||||||
|
r = new Rotation(m, 1e-10);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (axisAndAngle) {
|
||||||
|
double angle = Double.parseDouble(cl.getOptionValue("angle").trim());
|
||||||
|
if (inDegrees) angle = Math.toRadians(angle);
|
||||||
|
r = new Rotation(
|
||||||
|
new Vector3D(
|
||||||
|
Double.parseDouble(cl.getOptionValue("axis0").trim()),
|
||||||
|
Double.parseDouble(cl.getOptionValue("axis1").trim()),
|
||||||
|
Double.parseDouble(cl.getOptionValue("axis2").trim())),
|
||||||
|
angle,
|
||||||
|
RotationConvention.FRAME_TRANSFORM);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (cardanXYZ) {
|
||||||
|
double angle1 = Double.parseDouble(cl.getOptionValue("cardanXYZ1").trim());
|
||||||
|
double angle2 = Double.parseDouble(cl.getOptionValue("cardanXYZ2").trim());
|
||||||
|
double angle3 = Double.parseDouble(cl.getOptionValue("cardanXYZ3").trim());
|
||||||
|
if (inDegrees) {
|
||||||
|
angle1 = Math.toRadians(angle1);
|
||||||
|
angle2 = Math.toRadians(angle2);
|
||||||
|
angle3 = Math.toRadians(angle3);
|
||||||
|
}
|
||||||
|
r = new Rotation(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2, angle3);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (eulerZXZ) {
|
||||||
|
double angle1 = Double.parseDouble(cl.getOptionValue("eulerZXZ1").trim());
|
||||||
|
double angle2 = Double.parseDouble(cl.getOptionValue("eulerZXZ2").trim());
|
||||||
|
double angle3 = Double.parseDouble(cl.getOptionValue("eulerZXZ3").trim());
|
||||||
|
if (inDegrees) {
|
||||||
|
angle1 = Math.toRadians(angle1);
|
||||||
|
angle2 = Math.toRadians(angle2);
|
||||||
|
angle3 = Math.toRadians(angle3);
|
||||||
|
}
|
||||||
|
r = new Rotation(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2, angle3);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (quaternion) {
|
||||||
|
r = new Rotation(
|
||||||
|
Double.parseDouble(cl.getOptionValue("q0").trim()),
|
||||||
|
Double.parseDouble(cl.getOptionValue("q1").trim()),
|
||||||
|
Double.parseDouble(cl.getOptionValue("q2").trim()),
|
||||||
|
Double.parseDouble(cl.getOptionValue("q3").trim()),
|
||||||
|
true);
|
||||||
|
}
|
||||||
|
|
||||||
|
double[][] m = r.getMatrix();
|
||||||
|
String matrixString = String.format(
|
||||||
|
"rotation matrix:\n%24.16e %24.16e %24.16e\n%24.16e %24.16e %24.16e\n%24.16e %24.16e %24.16e",
|
||||||
|
m[0][0], m[0][1], m[0][2], m[1][0], m[1][1], m[1][2], m[2][0], m[2][1], m[2][2]);
|
||||||
|
System.out.println(matrixString);
|
||||||
|
|
||||||
|
String axisAndAngleString = inDegrees
|
||||||
|
? String.format(
|
||||||
|
"angle (degrees), axis:\n%g, %s",
|
||||||
|
Math.toDegrees(r.getAngle()), r.getAxis(RotationConvention.FRAME_TRANSFORM))
|
||||||
|
: String.format(
|
||||||
|
"angle (radians), axis:\n%g, %s", r.getAngle(), r.getAxis(RotationConvention.FRAME_TRANSFORM));
|
||||||
|
System.out.println(axisAndAngleString);
|
||||||
|
|
||||||
|
try {
|
||||||
|
double[] angles = r.getAngles(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM);
|
||||||
|
String cardanString = inDegrees
|
||||||
|
? String.format(
|
||||||
|
"Cardan XYZ angles (degrees):\n%g, %g, %g",
|
||||||
|
Math.toDegrees(angles[0]), Math.toDegrees(angles[1]), Math.toDegrees(angles[2]))
|
||||||
|
: String.format("Cardan XYZ angles (radians):\n%g, %g, %g", angles[0], angles[1], angles[2]);
|
||||||
|
System.out.println(cardanString);
|
||||||
|
} catch (CardanEulerSingularityException e) {
|
||||||
|
System.out.println("Cardan angles: encountered singularity, cannot solve");
|
||||||
|
}
|
||||||
|
|
||||||
|
try {
|
||||||
|
double[] angles = r.getAngles(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM);
|
||||||
|
String eulerString = inDegrees
|
||||||
|
? String.format(
|
||||||
|
"Euler ZXZ angles (degrees):\n%g, %g, %g",
|
||||||
|
Math.toDegrees(angles[0]), Math.toDegrees(angles[1]), Math.toDegrees(angles[2]))
|
||||||
|
: String.format("Euler ZXZ angles (radians):\n%g, %g, %g", angles[0], angles[1], angles[2]);
|
||||||
|
System.out.println(eulerString);
|
||||||
|
} catch (CardanEulerSingularityException e) {
|
||||||
|
System.out.println("Euler angles: encountered singularity, cannot solve");
|
||||||
|
}
|
||||||
|
|
||||||
|
System.out.printf("Quaternion:\n%g, %g, %g, %g\n", r.getQ0(), r.getQ1(), r.getQ2(), r.getQ3());
|
||||||
}
|
}
|
||||||
|
|
||||||
if (cardanXYZ) {
|
|
||||||
double angle1 = Double.parseDouble(cl.getOptionValue("cardanXYZ1").trim());
|
|
||||||
double angle2 = Double.parseDouble(cl.getOptionValue("cardanXYZ2").trim());
|
|
||||||
double angle3 = Double.parseDouble(cl.getOptionValue("cardanXYZ3").trim());
|
|
||||||
if (inDegrees) {
|
|
||||||
angle1 = Math.toRadians(angle1);
|
|
||||||
angle2 = Math.toRadians(angle2);
|
|
||||||
angle3 = Math.toRadians(angle3);
|
|
||||||
}
|
|
||||||
r = new Rotation(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2,
|
|
||||||
angle3);
|
|
||||||
}
|
|
||||||
|
|
||||||
if (eulerZXZ) {
|
|
||||||
double angle1 = Double.parseDouble(cl.getOptionValue("eulerZXZ1").trim());
|
|
||||||
double angle2 = Double.parseDouble(cl.getOptionValue("eulerZXZ2").trim());
|
|
||||||
double angle3 = Double.parseDouble(cl.getOptionValue("eulerZXZ3").trim());
|
|
||||||
if (inDegrees) {
|
|
||||||
angle1 = Math.toRadians(angle1);
|
|
||||||
angle2 = Math.toRadians(angle2);
|
|
||||||
angle3 = Math.toRadians(angle3);
|
|
||||||
}
|
|
||||||
r = new Rotation(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2,
|
|
||||||
angle3);
|
|
||||||
}
|
|
||||||
|
|
||||||
if (quaternion) {
|
|
||||||
r = new Rotation(Double.parseDouble(cl.getOptionValue("q0").trim()),
|
|
||||||
Double.parseDouble(cl.getOptionValue("q1").trim()),
|
|
||||||
Double.parseDouble(cl.getOptionValue("q2").trim()),
|
|
||||||
Double.parseDouble(cl.getOptionValue("q3").trim()), true);
|
|
||||||
}
|
|
||||||
|
|
||||||
double[][] m = r.getMatrix();
|
|
||||||
String matrixString = String.format(
|
|
||||||
"rotation matrix:\n%24.16e %24.16e %24.16e\n%24.16e %24.16e %24.16e\n%24.16e %24.16e %24.16e",
|
|
||||||
m[0][0], m[0][1], m[0][2], m[1][0], m[1][1], m[1][2], m[2][0], m[2][1], m[2][2]);
|
|
||||||
System.out.println(matrixString);
|
|
||||||
|
|
||||||
String axisAndAngleString = inDegrees
|
|
||||||
? String.format("angle (degrees), axis:\n%g, %s", Math.toDegrees(r.getAngle()),
|
|
||||||
r.getAxis(RotationConvention.FRAME_TRANSFORM))
|
|
||||||
: String.format("angle (radians), axis:\n%g, %s", r.getAngle(),
|
|
||||||
r.getAxis(RotationConvention.FRAME_TRANSFORM));
|
|
||||||
System.out.println(axisAndAngleString);
|
|
||||||
|
|
||||||
try {
|
|
||||||
double[] angles = r.getAngles(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM);
|
|
||||||
String cardanString = inDegrees
|
|
||||||
? String.format("Cardan XYZ angles (degrees):\n%g, %g, %g", Math.toDegrees(angles[0]),
|
|
||||||
Math.toDegrees(angles[1]), Math.toDegrees(angles[2]))
|
|
||||||
: String.format("Cardan XYZ angles (radians):\n%g, %g, %g", angles[0], angles[1],
|
|
||||||
angles[2]);
|
|
||||||
System.out.println(cardanString);
|
|
||||||
} catch (CardanEulerSingularityException e) {
|
|
||||||
System.out.println("Cardan angles: encountered singularity, cannot solve");
|
|
||||||
}
|
|
||||||
|
|
||||||
try {
|
|
||||||
double[] angles = r.getAngles(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM);
|
|
||||||
String eulerString = inDegrees
|
|
||||||
? String.format("Euler ZXZ angles (degrees):\n%g, %g, %g", Math.toDegrees(angles[0]),
|
|
||||||
Math.toDegrees(angles[1]), Math.toDegrees(angles[2]))
|
|
||||||
: String.format("Euler ZXZ angles (radians):\n%g, %g, %g", angles[0], angles[1],
|
|
||||||
angles[2]);
|
|
||||||
System.out.println(eulerString);
|
|
||||||
} catch (CardanEulerSingularityException e) {
|
|
||||||
System.out.println("Euler angles: encountered singularity, cannot solve");
|
|
||||||
}
|
|
||||||
|
|
||||||
System.out.printf("Quaternion:\n%g, %g, %g, %g\n", r.getQ0(), r.getQ1(), r.getQ2(), r.getQ3());
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -1,3 +1,25 @@
|
|||||||
|
/*
|
||||||
|
* The MIT License
|
||||||
|
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
|
||||||
|
*
|
||||||
|
* Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||||
|
* of this software and associated documentation files (the "Software"), to deal
|
||||||
|
* in the Software without restriction, including without limitation the rights
|
||||||
|
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||||
|
* copies of the Software, and to permit persons to whom the Software is
|
||||||
|
* furnished to do so, subject to the following conditions:
|
||||||
|
*
|
||||||
|
* The above copyright notice and this permission notice shall be included in
|
||||||
|
* all copies or substantial portions of the Software.
|
||||||
|
*
|
||||||
|
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||||
|
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||||
|
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||||
|
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||||
|
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||||
|
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
|
||||||
|
* THE SOFTWARE.
|
||||||
|
*/
|
||||||
package terrasaur.apps;
|
package terrasaur.apps;
|
||||||
|
|
||||||
import java.io.*;
|
import java.io.*;
|
||||||
@@ -22,7 +44,7 @@ import terrasaur.utils.math.MathConversions;
|
|||||||
|
|
||||||
public class SPKFromSumFile implements TerrasaurTool {
|
public class SPKFromSumFile implements TerrasaurTool {
|
||||||
|
|
||||||
private final static Logger logger = LogManager.getLogger();
|
private static final Logger logger = LogManager.getLogger();
|
||||||
|
|
||||||
@Override
|
@Override
|
||||||
public String shortDescription() {
|
public String shortDescription() {
|
||||||
@@ -32,7 +54,8 @@ public class SPKFromSumFile implements TerrasaurTool {
|
|||||||
@Override
|
@Override
|
||||||
public String fullDescription(Options options) {
|
public String fullDescription(Options options) {
|
||||||
String header = "";
|
String header = "";
|
||||||
String footer = """
|
String footer =
|
||||||
|
"""
|
||||||
Given three or more sumfiles, fit a parabola to the spacecraft
|
Given three or more sumfiles, fit a parabola to the spacecraft
|
||||||
trajectory in J2000 and create an input file for MKSPK.
|
trajectory in J2000 and create an input file for MKSPK.
|
||||||
""";
|
""";
|
||||||
@@ -48,10 +71,11 @@ public class SPKFromSumFile implements TerrasaurTool {
|
|||||||
private NavigableMap<Double, String> sumFilenames;
|
private NavigableMap<Double, String> sumFilenames;
|
||||||
private UnwritableInterval interval;
|
private UnwritableInterval interval;
|
||||||
|
|
||||||
private SPKFromSumFile(){}
|
private SPKFromSumFile() {}
|
||||||
|
|
||||||
private SPKFromSumFile(Body observer, Body target, ReferenceFrame bodyFixed, Map<String, Double> weightMap,
|
private SPKFromSumFile(
|
||||||
double extend) throws SpiceException {
|
Body observer, Body target, ReferenceFrame bodyFixed, Map<String, Double> weightMap, double extend)
|
||||||
|
throws SpiceException {
|
||||||
this.observer = observer;
|
this.observer = observer;
|
||||||
this.target = target;
|
this.target = target;
|
||||||
this.bodyFixed = bodyFixed;
|
this.bodyFixed = bodyFixed;
|
||||||
@@ -80,8 +104,9 @@ public class SPKFromSumFile implements TerrasaurTool {
|
|||||||
* @param velocityIsJ2000 if true, user-supplied velocity is in J2000 frame
|
* @param velocityIsJ2000 if true, user-supplied velocity is in J2000 frame
|
||||||
* @return command to run MKSPK
|
* @return command to run MKSPK
|
||||||
*/
|
*/
|
||||||
public String writeMKSPKFiles(String basename, List<String> comments, int degree, final Vector3 velocity,
|
public String writeMKSPKFiles(
|
||||||
boolean velocityIsJ2000) throws SpiceException {
|
String basename, List<String> comments, int degree, final Vector3 velocity, boolean velocityIsJ2000)
|
||||||
|
throws SpiceException {
|
||||||
|
|
||||||
String commentFile = basename + "-comments.txt";
|
String commentFile = basename + "-comments.txt";
|
||||||
String setupFile = basename + ".setup";
|
String setupFile = basename + ".setup";
|
||||||
@@ -90,23 +115,27 @@ public class SPKFromSumFile implements TerrasaurTool {
|
|||||||
try (PrintWriter pw = new PrintWriter(commentFile)) {
|
try (PrintWriter pw = new PrintWriter(commentFile)) {
|
||||||
StringBuilder sb = new StringBuilder();
|
StringBuilder sb = new StringBuilder();
|
||||||
if (!comments.isEmpty()) {
|
if (!comments.isEmpty()) {
|
||||||
for (String comment : comments)
|
for (String comment : comments) sb.append(comment).append("\n");
|
||||||
sb.append(comment).append("\n");
|
|
||||||
sb.append("\n");
|
sb.append("\n");
|
||||||
}
|
}
|
||||||
sb.append(String.format("This SPK for %s was generated by fitting a parabola to each component of the " + "SCOBJ vector from " + "the following sumfiles:\n", target));
|
sb.append(String.format(
|
||||||
|
"This SPK for %s was generated by fitting a parabola to each component of the "
|
||||||
|
+ "SCOBJ vector from " + "the following sumfiles:\n",
|
||||||
|
target));
|
||||||
for (String sumFile : sumFilenames.values()) {
|
for (String sumFile : sumFilenames.values()) {
|
||||||
sb.append(String.format("\t%s\n", sumFile));
|
sb.append(String.format("\t%s\n", sumFile));
|
||||||
}
|
}
|
||||||
sb.append("The SCOBJ vector was transformed to J2000 and an aberration correction ");
|
sb.append("The SCOBJ vector was transformed to J2000 and an aberration correction ");
|
||||||
sb.append(String.format("was applied to find the geometric position relative to %s before the parabola " + "fit. ", target.getName()));
|
sb.append(String.format(
|
||||||
sb.append(String.format("The period covered by this SPK is %s to %s.",
|
"was applied to find the geometric position relative to %s before the parabola " + "fit. ",
|
||||||
|
target.getName()));
|
||||||
|
sb.append(String.format(
|
||||||
|
"The period covered by this SPK is %s to %s.",
|
||||||
new TDBTime(interval.getBegin()).toUTCString("ISOC", 3),
|
new TDBTime(interval.getBegin()).toUTCString("ISOC", 3),
|
||||||
new TDBTime(interval.getEnd()).toUTCString("ISOC", 3)));
|
new TDBTime(interval.getEnd()).toUTCString("ISOC", 3)));
|
||||||
|
|
||||||
String allComments = sb.toString();
|
String allComments = sb.toString();
|
||||||
for (String comment : allComments.split("\\r?\\n"))
|
for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
|
||||||
pw.println(WordUtils.wrap(comment, 80));
|
|
||||||
} catch (FileNotFoundException e) {
|
} catch (FileNotFoundException e) {
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
}
|
}
|
||||||
@@ -174,23 +203,37 @@ public class SPKFromSumFile implements TerrasaurTool {
|
|||||||
PolynomialFunction zPos = new PolynomialFunction(zCoeff);
|
PolynomialFunction zPos = new PolynomialFunction(zCoeff);
|
||||||
PolynomialFunction zVel = zPos.polynomialDerivative();
|
PolynomialFunction zVel = zPos.polynomialDerivative();
|
||||||
|
|
||||||
logger.info("Polynomial fitting coefficients for geometric position of {} relative to {} in J2000:",
|
logger.info(
|
||||||
observer.getName(), target.getName());
|
"Polynomial fitting coefficients for geometric position of {} relative to {} in J2000:",
|
||||||
|
observer.getName(),
|
||||||
|
target.getName());
|
||||||
StringBuilder xMsg = new StringBuilder(String.format("X = %e ", xCoeff[0]));
|
StringBuilder xMsg = new StringBuilder(String.format("X = %e ", xCoeff[0]));
|
||||||
StringBuilder yMsg = new StringBuilder(String.format("Y = %e ", yCoeff[0]));
|
StringBuilder yMsg = new StringBuilder(String.format("Y = %e ", yCoeff[0]));
|
||||||
StringBuilder zMsg = new StringBuilder(String.format("Z = %e ", zCoeff[0]));
|
StringBuilder zMsg = new StringBuilder(String.format("Z = %e ", zCoeff[0]));
|
||||||
for (int i = 1; i <= degree; i++) {
|
for (int i = 1; i <= degree; i++) {
|
||||||
xMsg.append(xCoeff[i] < 0 ? "- " : "+ ").append(String.format("%e ", Math.abs(xCoeff[i]))).append("t").append(i > 1 ? "^" + i : "").append(" ");
|
xMsg.append(xCoeff[i] < 0 ? "- " : "+ ")
|
||||||
yMsg.append(yCoeff[i] < 0 ? "- " : "+ ").append(String.format("%e ", Math.abs(yCoeff[i]))).append("t").append(i > 1 ? "^" + i : "").append(" ");
|
.append(String.format("%e ", Math.abs(xCoeff[i])))
|
||||||
zMsg.append(zCoeff[i] < 0 ? "- " : "+ ").append(String.format("%e ", Math.abs(zCoeff[i]))).append("t").append(i > 1 ? "^" + i : "").append(" ");
|
.append("t")
|
||||||
|
.append(i > 1 ? "^" + i : "")
|
||||||
|
.append(" ");
|
||||||
|
yMsg.append(yCoeff[i] < 0 ? "- " : "+ ")
|
||||||
|
.append(String.format("%e ", Math.abs(yCoeff[i])))
|
||||||
|
.append("t")
|
||||||
|
.append(i > 1 ? "^" + i : "")
|
||||||
|
.append(" ");
|
||||||
|
zMsg.append(zCoeff[i] < 0 ? "- " : "+ ")
|
||||||
|
.append(String.format("%e ", Math.abs(zCoeff[i])))
|
||||||
|
.append("t")
|
||||||
|
.append(i > 1 ? "^" + i : "")
|
||||||
|
.append(" ");
|
||||||
}
|
}
|
||||||
logger.info(xMsg);
|
logger.info(xMsg);
|
||||||
logger.info(yMsg);
|
logger.info(yMsg);
|
||||||
logger.info(zMsg);
|
logger.info(zMsg);
|
||||||
|
|
||||||
logger.debug("");
|
logger.debug("");
|
||||||
logger.debug("NOTE: comparing aberration correction=LT+S positions from sumfile with aberration " +
|
logger.debug("NOTE: comparing aberration correction=LT+S positions from sumfile with aberration "
|
||||||
"correction=NONE for fit.");
|
+ "correction=NONE for fit.");
|
||||||
for (Double t : sumFiles.keySet()) {
|
for (Double t : sumFiles.keySet()) {
|
||||||
TDBTime tdb = new TDBTime(t);
|
TDBTime tdb = new TDBTime(t);
|
||||||
SumFile sumFile = sumFiles.get(t);
|
SumFile sumFile = sumFiles.get(t);
|
||||||
@@ -212,32 +255,34 @@ public class SPKFromSumFile implements TerrasaurTool {
|
|||||||
double vx = -xVel.value(t);
|
double vx = -xVel.value(t);
|
||||||
double vy = -yVel.value(t);
|
double vy = -yVel.value(t);
|
||||||
double vz = -zVel.value(t);
|
double vz = -zVel.value(t);
|
||||||
pw.printf("%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n", t, -xPos.value(t), -yPos.value(t),
|
pw.printf(
|
||||||
-zPos.value(t), vx, vy, vz);
|
"%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
|
||||||
|
t, -xPos.value(t), -yPos.value(t), -zPos.value(t), vx, vy, vz);
|
||||||
} else {
|
} else {
|
||||||
Vector3 thisVelocity = new Vector3(velocity);
|
Vector3 thisVelocity = new Vector3(velocity);
|
||||||
if (!velocityIsJ2000) {
|
if (!velocityIsJ2000) {
|
||||||
TDBTime tdb = new TDBTime(t);
|
TDBTime tdb = new TDBTime(t);
|
||||||
thisVelocity = bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
|
thisVelocity =
|
||||||
|
bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
|
||||||
}
|
}
|
||||||
double vx = thisVelocity.getElt(0);
|
double vx = thisVelocity.getElt(0);
|
||||||
double vy = thisVelocity.getElt(1);
|
double vy = thisVelocity.getElt(1);
|
||||||
double vz = thisVelocity.getElt(2);
|
double vz = thisVelocity.getElt(2);
|
||||||
pw.printf("%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n", t, -xPos.value(t), -yPos.value(t),
|
pw.printf(
|
||||||
-zPos.value(t), vx, vy, vz);
|
"%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
|
||||||
|
t, -xPos.value(t), -yPos.value(t), -zPos.value(t), vx, vy, vz);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
} catch (FileNotFoundException e) {
|
} catch (FileNotFoundException e) {
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
logger.error(e.getLocalizedMessage(), e);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
try (PrintWriter pw = new PrintWriter(basename + ".csv")) {
|
try (PrintWriter pw = new PrintWriter(basename + ".csv")) {
|
||||||
pw.println("# Note: fit quantities are without light time or aberration corrections");
|
pw.println("# Note: fit quantities are without light time or aberration corrections");
|
||||||
pw.println("# SCOBJ");
|
pw.println("# SCOBJ");
|
||||||
pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SCOBJ (body fixed) x, y, z, SCOBJ (J2000) x," +
|
pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SCOBJ (body fixed) x, y, z, SCOBJ (J2000) x,"
|
||||||
" y, z, SCOBJ (Geometric J2000) x, y, z, Fit SCOBJ (body fixed) x, y, z, Fit SCOBJ (Geometric " +
|
+ " y, z, SCOBJ (Geometric J2000) x, y, z, Fit SCOBJ (body fixed) x, y, z, Fit SCOBJ (Geometric "
|
||||||
"J2000) x, y, z");
|
+ "J2000) x, y, z");
|
||||||
for (Double t : sumFiles.keySet()) {
|
for (Double t : sumFiles.keySet()) {
|
||||||
SumFile sumFile = sumFiles.get(t);
|
SumFile sumFile = sumFiles.get(t);
|
||||||
pw.printf("%s,", sumFile.utcString());
|
pw.printf("%s,", sumFile.utcString());
|
||||||
@@ -278,8 +323,8 @@ public class SPKFromSumFile implements TerrasaurTool {
|
|||||||
pw.println();
|
pw.println();
|
||||||
}
|
}
|
||||||
pw.println("\n# Velocity");
|
pw.println("\n# Velocity");
|
||||||
pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SPICE (J2000) x, y, z, Fit (body fixed) x, " +
|
pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SPICE (J2000) x, y, z, Fit (body fixed) x, "
|
||||||
"y, z, Fit (J2000) x, y, z");
|
+ "y, z, Fit (J2000) x, y, z");
|
||||||
for (Double t : sumFiles.keySet()) {
|
for (Double t : sumFiles.keySet()) {
|
||||||
SumFile sumFile = sumFiles.get(t);
|
SumFile sumFile = sumFiles.get(t);
|
||||||
pw.printf("%s,", sumFile.utcString());
|
pw.printf("%s,", sumFile.utcString());
|
||||||
@@ -305,7 +350,8 @@ public class SPKFromSumFile implements TerrasaurTool {
|
|||||||
if (velocity != null) {
|
if (velocity != null) {
|
||||||
Vector3 thisVelocity = new Vector3(velocity);
|
Vector3 thisVelocity = new Vector3(velocity);
|
||||||
if (!velocityIsJ2000) {
|
if (!velocityIsJ2000) {
|
||||||
thisVelocity = bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
|
thisVelocity =
|
||||||
|
bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
|
||||||
}
|
}
|
||||||
double vx = thisVelocity.getElt(0);
|
double vx = thisVelocity.getElt(0);
|
||||||
double vy = thisVelocity.getElt(1);
|
double vy = thisVelocity.getElt(1);
|
||||||
@@ -313,8 +359,8 @@ public class SPKFromSumFile implements TerrasaurTool {
|
|||||||
velJ2000 = new Vector3(vx, vy, vz);
|
velJ2000 = new Vector3(vx, vy, vz);
|
||||||
}
|
}
|
||||||
|
|
||||||
StateVector stateJ2000 = new StateVector(new Vector3(xPos.value(t), yPos.value(t), zPos.value(t)),
|
StateVector stateJ2000 =
|
||||||
velJ2000);
|
new StateVector(new Vector3(xPos.value(t), yPos.value(t), zPos.value(t)), velJ2000);
|
||||||
velBodyFixed = j2000ToBodyFixed.mxv(stateJ2000).getVector3(1);
|
velBodyFixed = j2000ToBodyFixed.mxv(stateJ2000).getVector3(1);
|
||||||
|
|
||||||
pw.printf("%s, ", velBodyFixed.getElt(0));
|
pw.printf("%s, ", velBodyFixed.getElt(0));
|
||||||
@@ -335,19 +381,39 @@ public class SPKFromSumFile implements TerrasaurTool {
 
   private static Options defineOptions() {
     Options options = TerrasaurTool.defineOptions();
-    options.addOption(Option.builder("degree").hasArg().desc("Degree of polynomial used to fit sumFile locations" + "." + " Default is 2.").build());
-    options.addOption(Option.builder("extend").hasArg().desc("Extend SPK past the last sumFile by <arg> seconds. "
-        + " Default is zero.").build());
-    options.addOption(Option.builder("frame").hasArg().desc("Name of body fixed frame. This will default to the "
-        + "target's body fixed frame.").build());
-    options.addOption(Option.builder("logFile").hasArg().desc("If present, save screen output to log file.").build());
+    options.addOption(Option.builder("degree")
+        .hasArg()
+        .desc("Degree of polynomial used to fit sumFile locations" + "." + " Default is 2.")
+        .build());
+    options.addOption(Option.builder("extend")
+        .hasArg()
+        .desc("Extend SPK past the last sumFile by <arg> seconds. " + " Default is zero.")
+        .build());
+    options.addOption(Option.builder("frame")
+        .hasArg()
+        .desc("Name of body fixed frame. This will default to the " + "target's body fixed frame.")
+        .build());
+    options.addOption(Option.builder("logFile")
+        .hasArg()
+        .desc("If present, save screen output to log file.")
+        .build());
     StringBuilder sb = new StringBuilder();
-    for (StandardLevel l : StandardLevel.values())
-      sb.append(String.format("%s ", l.name()));
+    for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
-    options.addOption(Option.builder("logLevel").hasArg().desc("If present, print messages above selected " +
-        "priority. Valid values are " + sb.toString().trim() + ". Default is INFO.").build());
-    options.addOption(Option.builder("observer").required().hasArg().desc("Required. SPICE ID for the observer.").build());
-    options.addOption(Option.builder("sumFile").hasArg().required().desc("""
+    options.addOption(Option.builder("logLevel")
+        .hasArg()
+        .desc("If present, print messages above selected " + "priority. Valid values are "
+            + sb.toString().trim() + ". Default is INFO.")
+        .build());
+    options.addOption(Option.builder("observer")
+        .required()
+        .hasArg()
+        .desc("Required. SPICE ID for the observer.")
+        .build());
+    options.addOption(Option.builder("sumFile")
+        .hasArg()
+        .required()
+        .desc(
+            """
                 File listing sumfiles to read. This is a text file,
                 one per line. You can include an optional weight
                 after each filename. The default weight is 1.0.
@@ -363,14 +429,31 @@ public class SPKFromSumFile implements TerrasaurTool {
                 D717506131G0.SUM
                 # Weight this last image less than the others
                 D717506132G0.SUM 0.25
-                """).build());
-    options.addOption(Option.builder("spice").required().hasArgs().desc("Required. SPICE metakernel file " +
-        "containing body fixed frame and spacecraft kernels. Can specify more than one kernel, separated by "
-        + "whitespace.").build());
-    options.addOption(Option.builder("target").required().hasArg().desc("Required. SPICE ID for the target.").build());
-    options.addOption(Option.builder("velocity").hasArgs().desc("Spacecraft velocity relative to target in the " + "body fixed frame. If present, use this fixed velocity in the MKSPK input file. Default is to " + "take the derivative of the fit position. Specify as three floating point values in km/sec," + "separated by whitespace.").build());
-    options.addOption(Option.builder("velocityJ2000").desc("If present, argument to -velocity is in J2000 frame. "
-        + " Ignored if -velocity is not set.").build());
+                """)
+        .build());
+    options.addOption(Option.builder("spice")
+        .required()
+        .hasArgs()
+        .desc("Required. SPICE metakernel file "
+            + "containing body fixed frame and spacecraft kernels. Can specify more than one kernel, separated by "
+            + "whitespace.")
+        .build());
+    options.addOption(Option.builder("target")
+        .required()
+        .hasArg()
+        .desc("Required. SPICE ID for the target.")
+        .build());
+    options.addOption(Option.builder("velocity")
+        .hasArgs()
+        .desc("Spacecraft velocity relative to target in the "
+            + "body fixed frame. If present, use this fixed velocity in the MKSPK input file. Default is to "
+            + "take the derivative of the fit position. Specify as three floating point values in km/sec,"
+            + "separated by whitespace.")
+        .build());
+    options.addOption(Option.builder("velocityJ2000")
+        .desc("If present, argument to -velocity is in J2000 frame. " + " Ignored if -velocity is not set.")
+        .build());
     return options;
   }
 
   public static void main(String[] args) throws SpiceException {
@@ -386,11 +469,9 @@ public class SPKFromSumFile implements TerrasaurTool {
 
     NativeLibraryLoader.loadSpiceLibraries();
 
-
     final double extend = cl.hasOption("extend") ? Double.parseDouble(cl.getOptionValue("extend")) : 0;
 
-    for (String kernel : cl.getOptionValues("spice"))
-      KernelDatabase.load(kernel);
+    for (String kernel : cl.getOptionValues("spice")) KernelDatabase.load(kernel);
 
     Body observer = new Body(cl.getOptionValue("observer"));
     Body target = new Body(cl.getOptionValue("target"));
@@ -446,5 +527,4 @@ public class SPKFromSumFile implements TerrasaurTool {
 
     logger.info("Finished.");
   }
-
 }
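The `-sumFile` option above documents a simple list format: one sumfile per line, an optional trailing weight defaulting to 1.0, and `#` comment lines as in the embedded example. A minimal sketch of a parser for that format, assuming whitespace-delimited fields; the class name and comment handling here are illustrative, not the tool's actual reader:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of reading the -sumFile list format: "FILENAME [weight]" per line,
// weight defaulting to 1.0, '#' lines skipped. Illustrative only.
public class SumFileList {
  static Map<String, Double> parse(Iterable<String> lines) {
    Map<String, Double> weights = new LinkedHashMap<>();
    for (String line : lines) {
      String trimmed = line.trim();
      if (trimmed.isEmpty() || trimmed.startsWith("#")) continue; // comment or blank
      String[] parts = trimmed.split("\\s+");
      double weight = parts.length > 1 ? Double.parseDouble(parts[1]) : 1.0;
      weights.put(parts[0], weight);
    }
    return weights;
  }

  public static void main(String[] args) {
    Map<String, Double> w = parse(List.of(
        "D717506131G0.SUM",
        "# Weight this last image less than the others",
        "D717506132G0.SUM 0.25"));
    System.out.println(w);
  }
}
```

A lower weight de-emphasizes that sumfile's camera solution when the polynomial is fit through the landmark-derived positions.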
@@ -1,3 +1,25 @@
+/*
+ * The MIT License
+ * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
+ *
+ * Permission is hereby granted, free of charge, to any person obtaining a copy
+ * of this software and associated documentation files (the "Software"), to deal
+ * in the Software without restriction, including without limitation the rights
+ * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ * copies of the Software, and to permit persons to whom the Software is
+ * furnished to do so, subject to the following conditions:
+ *
+ * The above copyright notice and this permission notice shall be included in
+ * all copies or substantial portions of the Software.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+ * THE SOFTWARE.
+ */
 package terrasaur.apps;
 
 import java.io.File;
@@ -33,476 +55,452 @@ import vtk.vtkPolyData;

public class ShapeFormatConverter implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Transform a shape model to a new coordinate system.";
  }

  @Override
  public String fullDescription(Options options) {

    String header = "";
    String footer =
        "This program will rotate, translate, and/or scale a shape model. It can additionally transform a "
            + "single point, a sum file, or an SBMT ellipse file. For a sum file, the SCOBJ vector is "
            + "transformed and the cx, cy, cz, and sz vectors are rotated. For SBMT ellipse files, only "
            + "center of the ellipse is transformed. The size, orientation, and all other fields in the "
            + "file are unchanged.";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private enum COORDTYPE {
    LATLON,
    XYZ,
    POLYDATA
  }

  private enum FORMATS {
    ICQ,
    LLR,
    OBJ,
    PDS,
    PLT,
    PLY,
    STL,
    VTK,
    FITS,
    SUM,
    SBMT
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("centerOfRotation")
        .hasArg()
        .desc(
            "Subtract this point before applying rotation matrix, add back after. "
                + "Specify by three floating point numbers separated by commas. If not present default is (0,0,0).")
        .build());
    options.addOption(Option.builder("decimate")
        .hasArg()
        .desc("Reduce the number of facets in a shape model. The argument should be between 0 and 1. "
            + "For example, if a model has 100 facets and the argument to -decimate is 0.90, "
            + "there will be approximately 10 facets after the decimation.")
        .build());
    options.addOption(Option.builder("input")
        .required()
        .hasArg()
        .desc(
            "Required. Name of shape model to transform. Extension must be icq, fits, llr, obj, pds, plt, ply, sbmt, stl, sum, or vtk. "
                + "Alternately transform a single point using three floating point numbers separated "
                + "by commas to specify XYZ coordinates, or latitude, longitude in degrees separated by commas. "
                + "Transformed point will be written to stdout in the same format as the input string.")
        .build());
    options.addOption(Option.builder("inputFormat")
        .hasArg()
        .desc("Format of input file. If not present format will be inferred from inputFile extension.")
        .build());
    options.addOption(Option.builder("output")
        .hasArg()
        .desc("Required for all but single point input. Name of transformed file. "
            + "Extension must be obj, plt, sbmt, stl, sum, or vtk.")
        .build());
    options.addOption(Option.builder("outputFormat")
        .hasArg()
        .desc("Format of output file. If not present format will be inferred from outputFile extension.")
        .build());
    options.addOption(Option.builder("register")
        .hasArg()
        .desc("Use SVD to transform input file to best align with register file.")
        .build());
    options.addOption(Option.builder("rotate")
        .hasArg()
        .desc("Rotate surface points and spacecraft position. "
            + "Specify by an angle (degrees) and a 3 element rotation axis vector (XYZ) "
            + "separated by commas.")
        .build());
    options.addOption(Option.builder("rotateToPrincipalAxes")
        .desc("Rotate body to align along its principal axes of inertia.")
        .build());
    options.addOption(Option.builder("scale")
        .hasArg()
        .desc("Scale the shape model by <arg>. This can either be one value or three "
            + "separated by commas. One value scales all three axes uniformly, "
            + "three values scale the x, y, and z axes respectively. For example, "
            + "-scale 0.5,0.25,1.5 scales the model in the x dimension by 0.5, the "
            + "y dimension by 0.25, the z dimension by 1.5.")
        .build());
    options.addOption(Option.builder("translate")
        .hasArg()
        .desc("Translate surface points and spacecraft position. "
            + "Specify by three floating point numbers separated by commas.")
        .build());
    options.addOption(Option.builder("translateToCenter")
        .desc("Translate body so that its center of mass is at the origin.")
        .build());
    options.addOption(Option.builder("transform")
        .hasArg()
        .desc("Translate and rotate surface points and spacecraft position. "
            + "Specify a file containing a 4x4 combined translation/rotation matrix. The top left 3x3 matrix "
            + "is the rotation matrix. The top three entries in the right hand column are the translation "
            + "vector. The bottom row is always 0 0 0 1.")
        .build());
    return options;
  }

  public static void main(String[] args) throws Exception {

    TerrasaurTool defaultOBJ = new ShapeFormatConverter();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    NativeLibraryLoader.loadSpiceLibraries();
    NativeLibraryLoader.loadVtkLibraries();

    String filename = cl.getOptionValue("input");
    COORDTYPE coordType = COORDTYPE.POLYDATA;
    vtkPolyData polydata = null;
    SumFile sumFile = null;
    List<SBMTEllipseRecord> sbmtEllipse = null;

    String extension = null;
    if (cl.hasOption("inputFormat")) {
      try {
        extension = FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
            .name()
            .toLowerCase();
      } catch (IllegalArgumentException e) {
        logger.warn("Unsupported -inputFormat {}", cl.getOptionValue("inputFormat"));
      }
    }
    if (extension == null) extension = FilenameUtils.getExtension(filename).toLowerCase();
    switch (extension) {
      case "icq", "llr", "obj", "pds", "plt", "ply", "stl", "vtk" ->
          polydata = PolyDataUtil.loadShapeModel(filename, extension);
      case "fits" -> polydata = PolyDataUtil.loadFITShapeModel(filename);
      case "sum" -> {
        List<String> lines = FileUtils.readLines(new File(filename), Charset.defaultCharset());
        sumFile = SumFile.fromLines(lines);
      }
      case "sbmt" -> {
        sbmtEllipse = new ArrayList<>();
        vtkPoints points = new vtkPoints();
        polydata = new vtkPolyData();
        polydata.SetPoints(points);
        for (String line : FileUtils.readLines(new File(filename), Charset.defaultCharset())) {
          SBMTEllipseRecord record = SBMTEllipseRecord.fromString(line);
          sbmtEllipse.add(record);
          points.InsertNextPoint(record.x(), record.y(), record.z());
        }
      }
      default -> {
        // Single point
        String[] params = filename.split(",");
        vtkPoints points = new vtkPoints();
        polydata = new vtkPolyData();
        polydata.SetPoints(points);
        if (params.length == 2) {
          double[] array = new Vector3D(
                  Math.toRadians(Double.parseDouble(params[0].trim())),
                  Math.toRadians(Double.parseDouble(params[1].trim())))
              .toArray();
          points.InsertNextPoint(array);
          coordType = COORDTYPE.LATLON;
        } else if (params.length == 3) {
          double[] array = new double[3];
          for (int i = 0; i < 3; i++) array[i] = Double.parseDouble(params[i].trim());
          points.InsertNextPoint(array);
          coordType = COORDTYPE.XYZ;
        } else {
          logger.error("Can't read input shape model {} with format {}", filename, extension.toUpperCase());
          System.exit(0);
        }
      }
    }

    if (cl.hasOption("decimate") && polydata != null) {
      double reduction = Double.parseDouble(cl.getOptionValue("decimate"));
      if (reduction < 0) {
        logger.printf(Level.WARN, "Argument to -decimate is %.f! Setting to zero.", reduction);
        reduction = 0;
      }
      if (reduction > 1) {
        logger.printf(Level.WARN, "Argument to -decimate is %.f! Setting to one.", reduction);
        reduction = 1;
      }
      PolyDataUtil.decimatePolyData(polydata, reduction);
    }

    if (coordType == COORDTYPE.POLYDATA && !cl.hasOption("output")) {
      logger.error(String.format("No output file specified for input file %s", filename));
      System.exit(0);
    }

    Vector3 centerOfRotation = null;
    Matrix33 rotation = null;
    Vector3 translation = null;
    Vector3 scale = new Vector3(1., 1., 1.);
    for (Option option : cl.getOptions()) {
      if (option.getOpt().equals("centerOfRotation"))
        centerOfRotation =
            MathConversions.toVector3(
                VectorUtils.stringToVector3D(cl.getOptionValue("centerOfRotation")));

      if (option.getOpt().equals("rotate"))
        rotation =
            MathConversions.toMatrix33(RotationUtils.stringToRotation(cl.getOptionValue("rotate")));

      if (option.getOpt().equals("scale")) {
        String scaleString = cl.getOptionValue("scale");
        if (scaleString.contains(",")) {
          scale = MathConversions.toVector3(VectorUtils.stringToVector3D(scaleString));
        } else {
          scale = scale.scale(Double.parseDouble(scaleString));
        }
      }

      if (option.getOpt().equals("translate"))
        translation =
            MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("translate")));

      if (option.getOpt().equals("transform")) {
        List<String> lines =
            FileUtils.readLines(new File(cl.getOptionValue("transform")), Charset.defaultCharset());
        Pair<Vector3D, Rotation> pair = RotationUtils.stringToTransform(lines);
        translation = MathConversions.toVector3(pair.getKey());
        rotation = MathConversions.toMatrix33(pair.getValue());
      }
    }

    if (cl.hasOption("rotateToPrincipalAxes")) {
      if (polydata != null) {
        PolyDataStatistics stats = new PolyDataStatistics(polydata);
        if (stats.isClosed()) {
          ArrayList<double[]> axes = stats.getPrincipalAxes();
          // make X primary, Y secondary
          rotation = new Matrix33(new Vector3(axes.get(0)), 1, new Vector3(axes.get(1)), 2);
        } else {
          logger.warn("Shape is not closed, cannot determine principal axes.");
        }
      }
    }

    if (cl.hasOption("register")) {
      String register = cl.getOptionValue("register");
      vtkPolyData registeredPolydata = null;
      extension = FilenameUtils.getExtension(register).toLowerCase();
      if (extension.equals("llr")
          || extension.equals("obj")
          || extension.equals("pds")
          || extension.equals("plt")
          || extension.equals("ply")
          || extension.equals("stl")
          || extension.equals("vtk")) {
        registeredPolydata = PolyDataUtil.loadShapeModelAndComputeNormals(register);
      } else {
        logger.error(String.format("Can't read input shape model for registration: %s", register));
        System.exit(0);
      }

      if (registeredPolydata != null) {
        Vector3D centerA = PolyDataUtil.computePolyDataCentroid(polydata);
        Vector3D centerB = PolyDataUtil.computePolyDataCentroid(registeredPolydata);

        vtkPoints points = polydata.GetPoints();
        double[][] pointsA = new double[(int) points.GetNumberOfPoints()][3];
        for (int i = 0; i < points.GetNumberOfPoints(); i++)
          pointsA[i] = new Vector3D(points.GetPoint(i)).subtract(centerA).toArray();
        points = registeredPolydata.GetPoints();

        if (points.GetNumberOfPoints() != polydata.GetPoints().GetNumberOfPoints()) {
          logger.error("registered polydata does not have the same number of points as input.");
          System.exit(0);
        }

        double[][] pointsB = new double[(int) points.GetNumberOfPoints()][3];
        for (int i = 0; i < points.GetNumberOfPoints(); i++)
          pointsB[i] = new Vector3D(points.GetPoint(i)).subtract(centerB).toArray();

        double[][] H = new double[3][3];
        for (int ii = 0; ii < points.GetNumberOfPoints(); ii++) {
          for (int i = 0; i < 3; i++) {
            for (int j = 0; j < 3; j++) {
              H[i][j] += pointsA[ii][i] * pointsB[ii][j];
            }
          }
        }

        RealMatrix pointMatrix = new Array2DRowRealMatrix(H);
        SingularValueDecomposition svd = new SingularValueDecomposition(pointMatrix);
        RealMatrix uT = svd.getUT();
        RealMatrix v = svd.getV();
        RealMatrix R = v.multiply(uT);

        if (new LUDecomposition(R).getDeterminant() < 0) {
          for (int i = 0; i < 3; i++) {
            R.multiplyEntry(i, 2, -1);
          }
        }
        rotation = new Matrix33(R.getData());
        translation = MathConversions.toVector3(centerB);
        translation = translation.sub(rotation.mxv(MathConversions.toVector3(centerA)));
      }
    }

    if (sumFile != null) {
      if (rotation != null && translation != null)
        sumFile.transform(
            MathConversions.toVector3D(translation), MathConversions.toRotation(rotation));
    } else {

      Vector3 center;
      if (polydata.GetNumberOfPoints() > 1) {
        PolyDataStatistics stats = new PolyDataStatistics(polydata);
        center = new Vector3(stats.getCentroid());
      } else {
        center = new Vector3(polydata.GetPoint(0));
      }
      if (cl.hasOption("translateToCenter")) translation = center.negate();

      double[] values = new double[3];
      for (int j = 0; j < 3; j++) values[j] = center.getElt(j) * scale.getElt(j);
      Vector3 scaledCenter = new Vector3(values);

      vtkPoints points = polydata.GetPoints();
      for (int i = 0; i < points.GetNumberOfPoints(); i++) {
        Vector3 thisPoint = new Vector3(points.GetPoint(i));
        thisPoint = thisPoint.sub(center);
        for (int j = 0; j < 3; j++) values[j] = thisPoint.getElt(j) * scale.getElt(j);
        thisPoint = new Vector3(values);
        thisPoint = thisPoint.add(scaledCenter);

        if (rotation != null) {
          if (centerOfRotation == null) centerOfRotation = new Vector3();
          /*-
          else {
            System.out.printf("Center of rotation:\n%s\n", centerOfRotation.toString());
            System.out.printf("-centerOfRotation %f,%f,%f\n", centerOfRotation.getElt(0),
                centerOfRotation.getElt(1), centerOfRotation.getElt(2));
          }
          */
          thisPoint = rotation.mxv(thisPoint.sub(centerOfRotation)).add(centerOfRotation);
        }
        if (translation != null) thisPoint = thisPoint.add(translation);
        points.SetPoint(i, thisPoint.toArray());
      }
    }

    /*-
    if (rotation != null) {
      AxisAndAngle aaa = new AxisAndAngle(rotation);
      System.out.printf("Rotation:\n%s\n", rotation.toString());
      System.out.printf("-rotate %.5e,%.5e,%.5e,%.5e\n", Math.toDegrees(aaa.getAngle()),
          aaa.getAxis().getElt(0), aaa.getAxis().getElt(1), aaa.getAxis().getElt(2));
    }

    if (translation != null) {
      System.out.printf("Translation:\n%s\n", translation.toString());
      System.out.printf("-translate %.5e,%.5e,%.5e\n", translation.getElt(0), translation.getElt(1),
          translation.getElt(2));
    }
    */

    if (coordType == COORDTYPE.LATLON) {
      double[] pt = new double[3];
      polydata.GetPoint(0, pt);
      Vector3D point = new Vector3D(pt);
      double lon = Math.toDegrees(point.getAlpha());
      if (lon < 0) lon += 360;
      System.out.printf("%.16f,%.16f\n", Math.toDegrees(point.getDelta()), lon);
|
|
||||||
} else if (coordType == COORDTYPE.XYZ) {
|
|
||||||
double[] pt = new double[3];
|
|
||||||
polydata.GetPoint(0, pt);
|
|
||||||
System.out.printf("%.16f,%.16f,%.16f\n", pt[0], pt[1], pt[2]);
|
|
||||||
} else {
|
|
||||||
filename = cl.getOptionValue("output");
|
|
||||||
extension = null;
|
|
||||||
if (cl.hasOption("outputFormat")) {
|
|
||||||
try {
|
|
||||||
extension =
|
|
||||||
FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase()).name().toLowerCase();
|
|
||||||
} catch (IllegalArgumentException e) {
|
|
||||||
logger.warn("Unsupported -outputFormat {}", cl.getOptionValue("outputFormat"));
|
|
||||||
}
|
|
||||||
}
|
|
||||||
if (extension == null) extension = FilenameUtils.getExtension(filename).toLowerCase();
|
|
||||||
|
|
||||||
switch (extension) {
|
|
||||||
case "vtk" -> PolyDataUtil.saveShapeModelAsVTK(polydata, filename);
|
|
||||||
case "obj" -> PolyDataUtil.saveShapeModelAsOBJ(polydata, filename);
|
|
||||||
case "plt" -> PolyDataUtil.saveShapeModelAsPLT(polydata, filename);
|
|
||||||
case "stl" -> PolyDataUtil.saveShapeModelAsSTL(polydata, filename);
|
|
||||||
case "sum" -> {
|
|
||||||
try (PrintWriter pw = new PrintWriter(filename)) {
|
|
||||||
pw.print(sumFile.toString());
|
|
||||||
}
|
|
||||||
}
|
|
||||||
case "sbmt" -> {
|
|
||||||
if (sbmtEllipse == null) {
|
|
||||||
logger.error("No input SBMT ellipse specified!");
|
|
||||||
System.exit(0);
|
System.exit(0);
|
||||||
}
|
}
|
||||||
    Vector3 centerOfRotation = null;
    Matrix33 rotation = null;
    Vector3 translation = null;
    Vector3 scale = new Vector3(1., 1., 1.);
    for (Option option : cl.getOptions()) {
      if (option.getOpt().equals("centerOfRotation"))
        centerOfRotation =
            MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("centerOfRotation")));

      if (option.getOpt().equals("rotate"))
        rotation = MathConversions.toMatrix33(RotationUtils.stringToRotation(cl.getOptionValue("rotate")));

      if (option.getOpt().equals("scale")) {
        String scaleString = cl.getOptionValue("scale");
        if (scaleString.contains(",")) {
          scale = MathConversions.toVector3(VectorUtils.stringToVector3D(scaleString));
        } else {
          scale = scale.scale(Double.parseDouble(scaleString));
        }
      }

      if (option.getOpt().equals("translate"))
        translation = MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("translate")));

      if (option.getOpt().equals("transform")) {
        List<String> lines =
            FileUtils.readLines(new File(cl.getOptionValue("transform")), Charset.defaultCharset());
        Pair<Vector3D, Rotation> pair = RotationUtils.stringToTransform(lines);
        translation = MathConversions.toVector3(pair.getKey());
        rotation = MathConversions.toMatrix33(pair.getValue());
      }
    }

    if (cl.hasOption("rotateToPrincipalAxes")) {
      if (polydata != null) {
        PolyDataStatistics stats = new PolyDataStatistics(polydata);
        if (stats.isClosed()) {
          ArrayList<double[]> axes = stats.getPrincipalAxes();
          // make X primary, Y secondary
          rotation = new Matrix33(new Vector3(axes.get(0)), 1, new Vector3(axes.get(1)), 2);
        } else {
          logger.warn("Shape is not closed, cannot determine principal axes.");
        }
      }
    }

    if (cl.hasOption("register")) {
      String register = cl.getOptionValue("register");
      vtkPolyData registeredPolydata = null;
      extension = FilenameUtils.getExtension(register).toLowerCase();
      if (extension.equals("llr")
          || extension.equals("obj")
          || extension.equals("pds")
          || extension.equals("plt")
          || extension.equals("ply")
          || extension.equals("stl")
          || extension.equals("vtk")) {
        registeredPolydata = PolyDataUtil.loadShapeModelAndComputeNormals(register);
      } else {
        logger.error(String.format("Can't read input shape model for registration: %s", register));
        System.exit(0);
      }

      if (registeredPolydata != null) {
        Vector3D centerA = PolyDataUtil.computePolyDataCentroid(polydata);
        Vector3D centerB = PolyDataUtil.computePolyDataCentroid(registeredPolydata);

        vtkPoints points = polydata.GetPoints();
        double[][] pointsA = new double[(int) points.GetNumberOfPoints()][3];
        for (int i = 0; i < points.GetNumberOfPoints(); i++)
          pointsA[i] = new Vector3D(points.GetPoint(i)).subtract(centerA).toArray();
        points = registeredPolydata.GetPoints();

        if (points.GetNumberOfPoints() != polydata.GetPoints().GetNumberOfPoints()) {
          logger.error("registered polydata does not have the same number of points as input.");
          System.exit(0);
        }

        double[][] pointsB = new double[(int) points.GetNumberOfPoints()][3];
        for (int i = 0; i < points.GetNumberOfPoints(); i++)
          pointsB[i] = new Vector3D(points.GetPoint(i)).subtract(centerB).toArray();

        double[][] H = new double[3][3];
        for (int ii = 0; ii < points.GetNumberOfPoints(); ii++) {
          for (int i = 0; i < 3; i++) {
            for (int j = 0; j < 3; j++) {
              H[i][j] += pointsA[ii][i] * pointsB[ii][j];
            }
          }
        }

        RealMatrix pointMatrix = new Array2DRowRealMatrix(H);

        SingularValueDecomposition svd = new SingularValueDecomposition(pointMatrix);
        RealMatrix uT = svd.getUT();
        RealMatrix v = svd.getV();
        RealMatrix R = v.multiply(uT);

        if (new LUDecomposition(R).getDeterminant() < 0) {
          for (int i = 0; i < 3; i++) {
            R.multiplyEntry(i, 2, -1);
          }
        }
        rotation = new Matrix33(R.getData());
        translation = MathConversions.toVector3(centerB);
        translation = translation.sub(rotation.mxv(MathConversions.toVector3(centerA)));
      }
    }

    if (sumFile != null) {
      if (rotation != null && translation != null)
        sumFile.transform(MathConversions.toVector3D(translation), MathConversions.toRotation(rotation));
    } else {
      Vector3 center;
      if (polydata.GetNumberOfPoints() > 1) {
        PolyDataStatistics stats = new PolyDataStatistics(polydata);
        center = new Vector3(stats.getCentroid());
      } else {
        center = new Vector3(polydata.GetPoint(0));
      }
      if (cl.hasOption("translateToCenter")) translation = center.negate();

      double[] values = new double[3];
      for (int j = 0; j < 3; j++) values[j] = center.getElt(j) * scale.getElt(j);
      Vector3 scaledCenter = new Vector3(values);

      vtkPoints points = polydata.GetPoints();
      for (int i = 0; i < points.GetNumberOfPoints(); i++) {
        Vector3 thisPoint = new Vector3(points.GetPoint(i));
        thisPoint = thisPoint.sub(center);
        for (int j = 0; j < 3; j++) values[j] = thisPoint.getElt(j) * scale.getElt(j);
        thisPoint = new Vector3(values);
        thisPoint = thisPoint.add(scaledCenter);

        if (rotation != null) {
          if (centerOfRotation == null) centerOfRotation = new Vector3();
          /*-
          else {
            System.out.printf("Center of rotation:\n%s\n", centerOfRotation.toString());
            System.out.printf("-centerOfRotation %f,%f,%f\n", centerOfRotation.getElt(0),
                centerOfRotation.getElt(1), centerOfRotation.getElt(2));
          }
          */
          thisPoint = rotation.mxv(thisPoint.sub(centerOfRotation)).add(centerOfRotation);
        }
        if (translation != null) thisPoint = thisPoint.add(translation);
        points.SetPoint(i, thisPoint.toArray());
      }
    }

    /*-
    if (rotation != null) {
      AxisAndAngle aaa = new AxisAndAngle(rotation);
      System.out.printf("Rotation:\n%s\n", rotation.toString());
      System.out.printf("-rotate %.5e,%.5e,%.5e,%.5e\n", Math.toDegrees(aaa.getAngle()),
          aaa.getAxis().getElt(0), aaa.getAxis().getElt(1), aaa.getAxis().getElt(2));
    }

    if (translation != null) {
      System.out.printf("Translation:\n%s\n", translation.toString());
      System.out.printf("-translate %.5e,%.5e,%.5e\n", translation.getElt(0), translation.getElt(1),
          translation.getElt(2));
    }
    */

    if (coordType == COORDTYPE.LATLON) {
      double[] pt = new double[3];
      polydata.GetPoint(0, pt);
      Vector3D point = new Vector3D(pt);
      double lon = Math.toDegrees(point.getAlpha());
      if (lon < 0) lon += 360;
      System.out.printf("%.16f,%.16f\n", Math.toDegrees(point.getDelta()), lon);
    } else if (coordType == COORDTYPE.XYZ) {
      double[] pt = new double[3];
      polydata.GetPoint(0, pt);
      System.out.printf("%.16f,%.16f,%.16f\n", pt[0], pt[1], pt[2]);
    } else {
      filename = cl.getOptionValue("output");
      extension = null;
      if (cl.hasOption("outputFormat")) {
        try {
          extension =
              FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase()).name().toLowerCase();
        } catch (IllegalArgumentException e) {
          logger.warn("Unsupported -outputFormat {}", cl.getOptionValue("outputFormat"));
        }
      }
      if (extension == null) extension = FilenameUtils.getExtension(filename).toLowerCase();

      switch (extension) {
        case "vtk" -> PolyDataUtil.saveShapeModelAsVTK(polydata, filename);
        case "obj" -> PolyDataUtil.saveShapeModelAsOBJ(polydata, filename);
        case "plt" -> PolyDataUtil.saveShapeModelAsPLT(polydata, filename);
        case "stl" -> PolyDataUtil.saveShapeModelAsSTL(polydata, filename);
        case "sum" -> {
          try (PrintWriter pw = new PrintWriter(filename)) {
            pw.print(sumFile.toString());
          }
        }
        case "sbmt" -> {
          if (sbmtEllipse == null) {
            logger.error("No input SBMT ellipse specified!");
            System.exit(0);
          }
          double[] pt = new double[3];
          List<SBMTEllipseRecord> transformedRecords = new ArrayList<>();
          for (SBMTEllipseRecord record : sbmtEllipse) {
            polydata.GetPoint(0, pt);
            Vector3D point = new Vector3D(pt);
            double lon = Math.toDegrees(point.getAlpha());
            if (lon < 0) lon += 360;
            Builder builder = ImmutableSBMTEllipseRecord.builder().from(record);
            builder.x(point.getX());
            builder.y(point.getY());
            builder.z(point.getZ());
            builder.lat(Math.toDegrees(point.getDelta()));
            builder.lon(lon);
            builder.radius(point.getNorm());

            transformedRecords.add(builder.build());
          }
          try (PrintWriter pw = new PrintWriter(filename)) {
            for (SBMTEllipseRecord record : transformedRecords) pw.println(record.toString());
          }
        }
        default -> {
          logger.error("Can't write output shape model {} with format {}", filename, extension.toUpperCase());
          System.exit(0);
        }
      }
      logger.info("Wrote {}", filename);
    }
  }
}
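The point-transform loop above scales each vertex about the body center and then rotates about an optional center of rotation, i.e. p' = R(p - c) + c, with an optional translation applied last. A minimal Python sketch of that order of operations (illustrative only; `rotate_z`, `mxv`, and `transform_point` are hypothetical helpers, not Terrasaur code):

```python
import math

def rotate_z(deg):
    # 3x3 rotation matrix about +Z by deg degrees
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def mxv(m, v):
    # matrix-vector product for 3x3 m and length-3 v
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def transform_point(p, scale, center, rotation=None, rot_center=(0.0, 0.0, 0.0), translation=None):
    # per-component scale about the body center, as in the loop:
    # q = (p - center) * scale + center * scale
    q = [(p[i] - center[i]) * scale[i] + center[i] * scale[i] for i in range(3)]
    if rotation is not None:
        # rotate about rot_center: q' = R (q - c) + c
        d = [q[i] - rot_center[i] for i in range(3)]
        q = [x + c for x, c in zip(mxv(rotation, d), rot_center)]
    if translation is not None:
        q = [q[i] + translation[i] for i in range(3)]
    return q

# a 90 degree rotation about Z maps (1, 0, 0) to (0, 1, 0)
p = transform_point([1.0, 0.0, 0.0], [1.0, 1.0, 1.0], [0.0, 0.0, 0.0], rotate_z(90.0))
```

With unit scale and the rotation center at the origin this reduces to a plain rotation, which is the common case when only -rotate is given.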
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.io.File;

@@ -35,19 +57,19 @@ import terrasaur.utils.spice.SpiceBundle;
public class SumFilesFromFlyby implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Create sumfiles for a simplified flyby.";
  }

  @Override
  public String fullDescription(Options options) {

    String header = "";
    String footer =
        """

        This tool creates sumfiles at points along a straight line trajectory past a body to be imaged.

@@ -60,402 +82,366 @@ public class SumFilesFromFlyby implements TerrasaurTool {

        Given these assumptions, the trajectory can be specified using closest approach distance and phase along with speed.
        """;
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }
  private SumFile sumfile;
  private Function<Double, Vector3D> scPosFunc;
  private Function<Double, Vector3D> scVelFunc;

  private SumFilesFromFlyby() {}

  public SumFilesFromFlyby(SumFile sumfile, double distance, double phase, double speed) {
    this.sumfile = sumfile;

    // given phase angle p, closest approach point is (cos p, sin p)
    Vector3D closestApproach =
        new Vector3D(FastMath.cos(phase), FastMath.sin(phase), 0.).scalarMultiply(distance);
    Vector3D velocity =
        new Vector3D(-FastMath.sin(phase), FastMath.cos(phase), 0.).scalarMultiply(speed);

    /*-
     * Assumptions:
     *
     * Sun lies along the X axis
     * flyby is in the equatorial (XY) plane
     *
     */
    scPosFunc = t -> closestApproach.add(velocity.scalarMultiply(t));

    scVelFunc = t -> velocity;
  }

  public SumFile getSumFile(double t) {

    TimeConversion tc = TimeConversion.createUsingInternalConstants();

    Builder builder = ImmutableSumFile.builder().from(sumfile);
    double imageTime = t + tc.utcStringToTDB(sumfile.utcString());
    builder.picnm(String.format("%s%d", sumfile.picnm(), (int) Math.round(imageTime)));
    builder.utcString(tc.format("C").apply(imageTime));

    Vector3D scPos = scPosFunc.apply(t);

    builder.scobj(scPos.negate());

    Rotation bodyFixedToCamera = RotationUtils.KprimaryJsecondary(scPos.negate(), Vector3D.MINUS_K);
    builder.cx(bodyFixedToCamera.applyInverseTo(Vector3D.PLUS_I));
    builder.cy(bodyFixedToCamera.applyInverseTo(Vector3D.PLUS_J));
    builder.cz(bodyFixedToCamera.applyInverseTo(Vector3D.PLUS_K));

    builder.sz(Vector3D.PLUS_I.scalarMultiply(1e8));

    SumFile s = builder.build();

    logger.info(
        "{}: S/C position {}, phase {}",
        s.utcString(),
        s.scobj().negate(),
        Math.toDegrees(Vector3D.angle(s.scobj().negate(), s.sz())));

    return s;
  }

  private String writeMSOPCKFiles(
      String basename, IntervalSet intervals, int frameID, SpiceBundle bundle)
      throws SpiceException {

    File commentFile = new File(basename + "_msopck-comments.txt");
    if (commentFile.exists()) commentFile.delete();
    String setupFile = basename + "_msopck.setup";
    String inputFile = basename + "_msopck.inp";

    try (PrintWriter pw = new PrintWriter(commentFile)) {

      String allComments = "";
      for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
    } catch (FileNotFoundException e) {
      logger.error(e.getLocalizedMessage(), e);
    }
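The constructor above parameterizes the flyby entirely by closest-approach distance, phase, and speed: with the Sun along +X and the trajectory in the XY plane, closest approach is at distance·(cos p, sin p, 0) and the velocity is speed·(−sin p, cos p, 0). A quick numeric check of that geometry in Python (illustrative only; `flyby_state` is a hypothetical helper, not Terrasaur code):

```python
import math

def flyby_state(distance, phase_deg, speed, t):
    # closest approach at t = 0; Sun along +X, flyby in the XY plane
    p = math.radians(phase_deg)
    ca = (distance * math.cos(p), distance * math.sin(p), 0.0)
    vel = (-speed * math.sin(p), speed * math.cos(p), 0.0)
    pos = tuple(ca[i] + vel[i] * t for i in range(3))
    return pos, vel

pos, vel = flyby_state(distance=100.0, phase_deg=60.0, speed=5.0, t=0.0)

# at closest approach the position is perpendicular to the velocity,
# its norm is the flyby distance, and the Sun-body-spacecraft phase
# angle (the angle between +X and the position) is the requested phase
dot = sum(a * b for a, b in zip(pos, vel))
norm = math.sqrt(sum(a * a for a in pos))
phase = math.degrees(math.acos(pos[0] / norm))
```

Because the velocity is the derivative of a straight line, position at any other time is just the closest-approach point plus velocity·t, which is what `scPosFunc` returns.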
    File fk = bundle.findKernel(String.format(".*%s\\.tf", basename));
    File lsk = bundle.findKernel(".*naif[0-9]{4}\\.tls");

    Map<String, String> map = new TreeMap<>();
    map.put("LSK_FILE_NAME", "'" + lsk + "'");
    map.put("MAKE_FAKE_SCLK", String.format("'%s.tsc'", basename));
    map.put("CK_TYPE", "3");
    map.put("COMMENTS_FILE_NAME", String.format("'%s'", commentFile.getPath()));
    map.put("INSTRUMENT_ID", Integer.toString(frameID));
    map.put("REFERENCE_FRAME_NAME", "'J2000'");
    map.put("FRAMES_FILE_NAME", "'" + fk.getPath() + "'");
    map.put("ANGULAR_RATE_PRESENT", "'MAKE UP/NO AVERAGING'");
    map.put("INPUT_TIME_TYPE", "'UTC'");
    map.put("INPUT_DATA_TYPE", "'SPICE QUATERNIONS'");
    map.put("PRODUCER_ID", "'Hari.Nair@jhuapl.edu'");

    try (PrintWriter pw = new PrintWriter(setupFile)) {
      pw.println("\\begindata");
      for (String key : map.keySet()) {
        pw.printf("%s = %s\n", key, map.get(key));
      }
    } catch (FileNotFoundException e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    NavigableMap<Double, SpiceQuaternion> attitudeMap = new TreeMap<>();
    double t0 = bundle.getTimeConversion().utcStringToTDB(sumfile.utcString());
    for (UnwritableInterval interval : intervals) {
      for (double t = interval.getBegin(); t < interval.getEnd(); t += interval.getLength() / 100) {

        double imageTime = t + t0;

        Vector3D scPos = scPosFunc.apply(t);
        SpiceQuaternion q =
            new SpiceQuaternion(
                MathConversions.toMatrix33(
                    RotationUtils.KprimaryJsecondary(scPos.negate(), Vector3D.MINUS_K)));

        attitudeMap.put(imageTime, q);
      }
    }

    try (PrintWriter pw = new PrintWriter(new FileWriter(inputFile))) {
      for (double t : attitudeMap.keySet()) {
        SpiceQuaternion q = attitudeMap.get(t);
        Vector3 v = q.getVector();
        pw.printf(
            "%s %.14e %.14e %.14e %.14e\n",
            new TDBTime(t).toUTCString("ISOC", 6),
            q.getScalar(),
            v.getElt(0),
            v.getElt(1),
            v.getElt(2));
      }
    } catch (IOException e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    return String.format("msopck %s %s %s.bc", setupFile, inputFile, basename);
  }

  /**
   * @param basename file basename
   * @param intervals time intervals
   * @param centerID NAIF id of center body
   * @param bundle SPICE bundle
   * @return command to run MKSPK
   */
  private String writeMKSPKFiles(
      String basename, IntervalSet intervals, int centerID, SpiceBundle bundle) {

    String commentFile = basename + "_mkspk-comments.txt";
    String setupFile = basename + "_mkspk.setup";
    String inputFile = basename + "_mkspk.inp";
    try (PrintWriter pw = new PrintWriter(commentFile)) {

      String allComments = "";
      for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
    } catch (FileNotFoundException e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    File lsk = bundle.findKernel(".*naif[0-9]{4}.tls");

    Map<String, String> map = new TreeMap<>();
    map.put("INPUT_DATA_TYPE", "'STATES'");
    map.put("OUTPUT_SPK_TYPE", "13"); // hermite polynomial, unevenly spaced in time
    map.put("OBJECT_ID", "-999");
    map.put("CENTER_ID", String.format("%d", centerID));
    map.put("COMMENT_FILE", String.format("'%s'", commentFile));
    map.put("REF_FRAME_NAME", "'J2000'");
    map.put("PRODUCER_ID", "'Hari.Nair@jhuapl.edu'");
    map.put("DATA_ORDER", "'EPOCH X Y Z VX VY VZ'");
    map.put("DATA_DELIMITER", "' '");
    map.put("LEAPSECONDS_FILE", String.format("'%s'", lsk));
    map.put("TIME_WRAPPER", "'# ETSECONDS'");
    map.put("POLYNOM_DEGREE", "7");
    map.put("SEGMENT_ID", "'SPK_STATES_13'");
    map.put("LINES_PER_RECORD", "1");
    try (PrintWriter pw = new PrintWriter(setupFile)) {
      pw.println("\\begindata");
      for (String key : map.keySet()) {
        pw.printf("%s = %s\n", key, map.get(key));
      }
    } catch (FileNotFoundException e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    double t0 = bundle.getTimeConversion().utcStringToTDB(sumfile.utcString());

    try (PrintWriter pw = new PrintWriter(inputFile)) {
      for (UnwritableInterval interval : intervals) {
        for (double t = interval.getBegin(); t < interval.getEnd(); t += interval.getLength() / 100) {

          Vector3D scPos = scPosFunc.apply(t);
          Vector3D scVel = scVelFunc.apply(t);
          pw.printf(
              "%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
              t + t0,
              scPos.getX(),
              scPos.getY(),
              scPos.getZ(),
              scVel.getX(),
              scVel.getY(),
              scVel.getZ());
        }
      }
    } catch (FileNotFoundException e) {
      logger.error(e.getLocalizedMessage(), e);
    }
    return String.format("mkspk -setup %s -input %s -output %s.bsp", setupFile, inputFile, basename);
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(
        Option.builder("distance")
            .hasArg()
            .required()
            .desc("Required. Flyby distance from body center in km.")
            .build());
    options.addOption(
        Option.builder("logFile").hasArg().desc("If present, save screen output to log file.").build());
    StringBuilder sb = new StringBuilder();
    for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
    options.addOption(
        Option.builder("logLevel")
            .hasArg()
            .desc(
                "If present, print messages above selected priority. Valid values are "
                    + sb.toString().trim()
                    + ". Default is INFO.")
            .build());
    options.addOption(
        Option.builder("mk")
            .hasArg()
            .desc(
                "Path to NAIF metakernel. This should contain LSK, FK for the central body, and FK for the spacecraft. This is used by -mkspk and -msopck.")
            .build());
    options.addOption(
        Option.builder("mkspk")
            .hasArg()
            .desc(
                "If present, create input files for MKSPK. The argument is the NAIF id for the central body (e.g. 10 for the Sun). This option requires -lsk.")
            .build());
    options.addOption(
        Option.builder("msopck")
            .desc("If present, create input files for MSOPCK. This option requires -lsk.")
            .build());
    options.addOption(
        Option.builder("phase")
            .hasArg()
            .required()
            .desc("Required. Phase angle at closest approach in degrees.")
            .build());
    options.addOption(
        Option.builder("speed")
            .hasArg()
            .required()
            .desc("Required. Flyby speed at closest approach in km/s.")
            .build());
    options.addOption(
        Option.builder("template")
            .hasArg()
            .required()
            .desc(
                "Required. An existing sumfile to use as a template. Camera parameters are taken from this "
                    + "file, while camera position and orientation are calculated.")
            .build());
    options.addOption(
        Option.builder("times")
            .hasArgs()
            .desc(
                "If present, list of times separated by white space. In seconds, relative to closest approach.")
            .build());
    return options;
  }

  public static void main(String[] args) throws IOException, SpiceException {
    TerrasaurTool defaultOBJ = new SumFilesFromFlyby();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    double phase = Double.parseDouble(cl.getOptionValue("phase"));
    if (phase < 0 || phase > 180) {
      logger.error("Phase angle {} out of range [0, 180]", phase);
      System.exit(0);
    }

    String sumFileTemplate = cl.getOptionValue("template");

    String base = FilenameUtils.getBaseName(sumFileTemplate);
    String ext = FilenameUtils.getExtension(sumFileTemplate);
    SumFilesFromFlyby app =
        new SumFilesFromFlyby(
            SumFile.fromFile(new File(sumFileTemplate)),
            Double.parseDouble(cl.getOptionValue("distance")),
            Math.toRadians(phase),
            Double.parseDouble(cl.getOptionValue("speed")));

    NavigableSet<Double> times = new TreeSet<>();
    if (cl.hasOption("times")) {
      for (String s : cl.getOptionValues("times")) times.add(Double.parseDouble(s));
    } else times.add(0.);

    SpiceBundle bundle = null;
    if (cl.hasOption("mk")) {
      NativeLibraryLoader.loadSpiceLibraries();
|
|
||||||
bundle =
|
|
||||||
new SpiceBundle.Builder()
|
|
||||||
.addMetakernels(Collections.singletonList(cl.getOptionValue("mk")))
|
|
||||||
.build();
|
|
||||||
KernelDatabase.load(cl.getOptionValue("mk"));
|
|
||||||
}
|
|
||||||
|
|
||||||
TimeConversion tc =
|
|
||||||
bundle == null ? TimeConversion.createUsingInternalConstants() : bundle.getTimeConversion();
|
|
||||||
|
|
||||||
for (double t : times) {
|
|
||||||
SumFile s = app.getSumFile(t);
|
|
||||||
|
|
||||||
try (PrintWriter pw =
|
|
||||||
new PrintWriter(
|
|
||||||
String.format("%s_%d.%s", base, (int) tc.utcStringToTDB(s.utcString()), ext))) {
|
|
||||||
|
|
||||||
pw.println(s);
|
|
||||||
|
|
||||||
} catch (FileNotFoundException e) {
|
|
||||||
logger.error(e.getLocalizedMessage(), e);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
if (cl.hasOption("mkspk")) {
|
|
||||||
if (bundle == null) {
|
|
||||||
logger.error("Need -mk to use -mkspk!");
|
|
||||||
} else {
|
|
||||||
IntervalSet.Builder builder = IntervalSet.builder();
|
|
||||||
for (Double t : times) {
|
|
||||||
Double next = times.higher(t);
|
|
||||||
if (next != null) builder.add(new UnwritableInterval(t, next));
|
|
||||||
}
|
|
||||||
int centerID = Integer.parseInt(cl.getOptionValue("mkspk"));
|
|
||||||
|
|
||||||
String command = app.writeMKSPKFiles(base, builder.build(), centerID, bundle);
|
|
||||||
logger.info("Command to create SPK:\n{}", command);
|
|
||||||
}
|
|
||||||
}
    if (cl.hasOption("msopck")) {
      if (bundle == null) {
        logger.error("Need -mk to use -msopck!");
      } else {
        IntervalSet.Builder builder = IntervalSet.builder();
        for (Double t : times) {
          Double next = times.higher(t);
          if (next != null) builder.add(new UnwritableInterval(t, next));
        }

        final int scID = -999;
        final int frameID = scID * 1000;

        File spacecraftFK = new File(String.format("%s.tf", base));
        try (PrintWriter pw = new PrintWriter(spacecraftFK)) {
          pw.println("\\begindata");
          pw.printf("FRAME_%s_FIXED = %d\n", base, frameID);
          pw.printf("FRAME_%d_NAME = '%s_FIXED'\n", frameID, base);
          pw.printf("FRAME_%d_CLASS = 3\n", frameID);
          pw.printf("FRAME_%d_CENTER = %d\n", frameID, scID);
          pw.printf("FRAME_%d_CLASS_ID = %d\n", frameID, frameID);
          pw.println("\\begintext");
        }

        List<File> kernels = new ArrayList<>(bundle.getKernels());
        kernels.add(spacecraftFK);

        bundle = new SpiceBundle.Builder().addKernelList(kernels).build();

        File spacecraftSCLK = new File(String.format("%s.tsc", base));
        if (spacecraftSCLK.exists()) spacecraftSCLK.delete();

        String command = app.writeMSOPCKFiles(base, builder.build(), frameID, bundle);
        logger.info("Command to create CK:\n{}", command);
      }
    }
  }

    File fk = bundle.findKernel(String.format(".*%s\\.tf", basename));
    File lsk = bundle.findKernel(".*naif[0-9]{4}\\.tls");

    Map<String, String> map = new TreeMap<>();
    map.put("LSK_FILE_NAME", "'" + lsk + "'");
    map.put("MAKE_FAKE_SCLK", String.format("'%s.tsc'", basename));
    map.put("CK_TYPE", "3");
    map.put("COMMENTS_FILE_NAME", String.format("'%s'", commentFile.getPath()));
    map.put("INSTRUMENT_ID", Integer.toString(frameID));
    map.put("REFERENCE_FRAME_NAME", "'J2000'");
    map.put("FRAMES_FILE_NAME", "'" + fk.getPath() + "'");
    map.put("ANGULAR_RATE_PRESENT", "'MAKE UP/NO AVERAGING'");
    map.put("INPUT_TIME_TYPE", "'UTC'");
    map.put("INPUT_DATA_TYPE", "'SPICE QUATERNIONS'");
    map.put("PRODUCER_ID", "'Hari.Nair@jhuapl.edu'");

    try (PrintWriter pw = new PrintWriter(setupFile)) {
      pw.println("\\begindata");
      for (String key : map.keySet()) {
        pw.printf("%s = %s\n", key, map.get(key));
      }
    } catch (FileNotFoundException e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    NavigableMap<Double, SpiceQuaternion> attitudeMap = new TreeMap<>();
    double t0 = bundle.getTimeConversion().utcStringToTDB(sumfile.utcString());
    for (UnwritableInterval interval : intervals) {
      for (double t = interval.getBegin(); t < interval.getEnd(); t += interval.getLength() / 100) {
        double imageTime = t + t0;
        Vector3D scPos = scPosFunc.apply(t);
        SpiceQuaternion q = new SpiceQuaternion(
            MathConversions.toMatrix33(RotationUtils.KprimaryJsecondary(scPos.negate(), Vector3D.MINUS_K)));
        attitudeMap.put(imageTime, q);
      }
    }

    try (PrintWriter pw = new PrintWriter(new FileWriter(inputFile))) {
      for (double t : attitudeMap.keySet()) {
        SpiceQuaternion q = attitudeMap.get(t);
        Vector3 v = q.getVector();
        pw.printf(
            "%s %.14e %.14e %.14e %.14e\n",
            new TDBTime(t).toUTCString("ISOC", 6), q.getScalar(), v.getElt(0), v.getElt(1), v.getElt(2));
      }
    } catch (IOException e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    return String.format("msopck %s %s %s.bc", setupFile, inputFile, basename);
  }
  /**
   * @param basename file basename
   * @param intervals time intervals
   * @param centerID NAIF id of center body
   * @param bundle SPICE bundle
   * @return command to run MKSPK
   */
  private String writeMKSPKFiles(String basename, IntervalSet intervals, int centerID, SpiceBundle bundle) {

    String commentFile = basename + "_mkspk-comments.txt";
    String setupFile = basename + "_mkspk.setup";
    String inputFile = basename + "_mkspk.inp";

    try (PrintWriter pw = new PrintWriter(commentFile)) {

      String allComments = "";
      for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
    } catch (FileNotFoundException e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    File lsk = bundle.findKernel(".*naif[0-9]{4}.tls");

    Map<String, String> map = new TreeMap<>();
    map.put("INPUT_DATA_TYPE", "'STATES'");
    map.put("OUTPUT_SPK_TYPE", "13"); // hermite polynomial, unevenly spaced in time
    map.put("OBJECT_ID", "-999");
    map.put("CENTER_ID", String.format("%d", centerID));
    map.put("COMMENT_FILE", String.format("'%s'", commentFile));
    map.put("REF_FRAME_NAME", "'J2000'");
    map.put("PRODUCER_ID", "'Hari.Nair@jhuapl.edu'");
    map.put("DATA_ORDER", "'EPOCH X Y Z VX VY VZ'");
    map.put("DATA_DELIMITER", "' '");
    map.put("LEAPSECONDS_FILE", String.format("'%s'", lsk));
    map.put("TIME_WRAPPER", "'# ETSECONDS'");
    map.put("POLYNOM_DEGREE", "7");
    map.put("SEGMENT_ID", "'SPK_STATES_13'");
    map.put("LINES_PER_RECORD", "1");
    try (PrintWriter pw = new PrintWriter(setupFile)) {
      pw.println("\\begindata");
      for (String key : map.keySet()) {
        pw.printf("%s = %s\n", key, map.get(key));
      }
    } catch (FileNotFoundException e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    double t0 = bundle.getTimeConversion().utcStringToTDB(sumfile.utcString());

    try (PrintWriter pw = new PrintWriter(inputFile)) {
      for (UnwritableInterval interval : intervals) {
        for (double t = interval.getBegin(); t < interval.getEnd(); t += interval.getLength() / 100) {

          Vector3D scPos = scPosFunc.apply(t);
          Vector3D scVel = scVelFunc.apply(t);
          pw.printf(
              "%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
              t + t0, scPos.getX(), scPos.getY(), scPos.getZ(), scVel.getX(), scVel.getY(), scVel.getZ());
        }
      }
    } catch (FileNotFoundException e) {
      logger.error(e.getLocalizedMessage(), e);
    }
    return String.format("mkspk -setup %s -input %s -output %s.bsp", setupFile, inputFile, basename);
  }
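Both setup writers above emit SPICE text-kernel assignments inside a `\begindata` block by iterating a TreeMap, so keys come out sorted and each line is `KEY = value`. A minimal, self-contained sketch of that pattern (the class name and the key subset here are illustrative, not from the tool):

```java
import java.util.Map;
import java.util.TreeMap;

/** Minimal sketch of the SPICE text-kernel setup-file pattern used above. */
public class SetupSketch {

  /** Render key = value assignments inside a \begindata block. */
  static String format(Map<String, String> map) {
    StringBuilder sb = new StringBuilder("\\begindata\n");
    // TreeMap iteration order is sorted by key, matching the writers above
    for (Map.Entry<String, String> e : map.entrySet())
      sb.append(e.getKey()).append(" = ").append(e.getValue()).append('\n');
    return sb.toString();
  }

  public static void main(String[] args) {
    Map<String, String> map = new TreeMap<>();
    map.put("INPUT_DATA_TYPE", "'STATES'"); // illustrative subset of MKSPK keys
    map.put("OUTPUT_SPK_TYPE", "13");
    System.out.print(format(map));
  }
}
```

MKSPK and MSOPCK then read this setup file plus the separately written input file, which is why each method returns the ready-to-run command line.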
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.awt.Color;

@@ -42,298 +64,299 @@ import terrasaur.utils.tessellation.FibonacciSphere;
 */
public class TileLookup implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Locate tiles on the unit sphere.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "";
    String footer = "";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  /**
   * Given the base database name and a tile number, return the path to that database file. For
   * example:
   *
   * <pre>
   * System.out.println(TileLookup.getDBName("/path/to/database/ola.db", 6));
   * System.out.println(TileLookup.getDBName("./ola.db", 6));
   * System.out.println(TileLookup.getDBName("ola.db", 6));
   *
   * /path/to/database/ola.6.db
   * ./ola.6.db
   * ./ola.6.db
   * </pre>
   *
   * @param dbName basename for database (e.g. /path/to/database/ola.db)
   * @param tile tile index (e.g. 6)
   * @return path to database file (e.g. /path/to/database/ola.6.db)
   */
  public static String getDBName(String dbName, int tile) {
    String fullPath = FilenameUtils.getFullPath(dbName);
    if (fullPath.trim().isEmpty()) fullPath = ".";
    if (!fullPath.endsWith(File.separator)) fullPath += File.separator;
    return String.format(
        "%s%s.%d.%s",
        fullPath, FilenameUtils.getBaseName(dbName), tile, FilenameUtils.getExtension(dbName));
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(
        Option.builder("nTiles")
            .hasArg()
            .required()
            .desc("Number of points covering the sphere.")
            .build());
    options.addOption(
        Option.builder("printCoords")
            .desc(
                "Print a table of points along with their coordinates. Takes precedence over -printStats, -printDistance, and -png.")
            .build());
    options.addOption(
        Option.builder("printDistance")
            .hasArg()
            .desc(
                "Print a table of points sorted by distance from the input point. "
                    + "Format of the input point is longitude,latitude in degrees, comma separated without spaces. Takes precedence over -png.")
            .build());
    options.addOption(
        Option.builder("printStats")
            .desc(
                "Print statistics on the distances (in degrees) between each point and its nearest neighbor. Takes precedence over -printDistance and -png.")
            .build());
    options.addOption(
        Option.builder("png")
            .hasArg()
            .desc(
                "Plot points and distance to nearest point in degrees. Argument is the name of the PNG file to write.")
            .build());
    return options;
  }

  public static void main(String[] args) {
    TerrasaurTool defaultOBJ = new TileLookup();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    final int npts = Integer.parseInt(cl.getOptionValue("nTiles"));
    FibonacciSphere fs = new FibonacciSphere(npts);

    if (cl.hasOption("printCoords")) {
      String header = String.format("%7s, %10s, %9s", "# index", "longitude", "latitude");
      System.out.println(header);
      // System.out.printf("%7s, %10s, %9s, %6s\n", "# index", "longitude", "latitude", "mapola");
      for (int i = 0; i < npts; i++) {
        LatitudinalVector lv = fs.getTileCenter(i);
        double lon = Math.toDegrees(lv.getLongitude());
        if (lon < 0) lon += 360;
        double lat = Math.toDegrees(lv.getLatitude());
        System.out.printf("%7d, %10.5f, %9.5f\n", i, lon, lat);
      }
      System.exit(0);
    }
    if (cl.hasOption("printStats")) {
      System.out.println(
          "Statistics on distances between each point and its nearest neighbor (degrees):");
      System.out.println(fs.getDistanceStats());
      System.exit(0);
    }

    if (cl.hasOption("printDistance")) {
      String[] parts = cl.getOptionValue("printDistance").split(",");
      double lon = Math.toRadians(Double.parseDouble(parts[0].trim()));
      double lat = Math.toRadians(Double.parseDouble(parts[1].trim()));
      LatitudinalVector lv = new LatitudinalVector(1, lat, lon);
      NavigableMap<Double, Integer> distanceMap = fs.getDistanceMap(lv);
      System.out.printf("%11s, %5s, %10s, %9s\n", "# distance", "index", "longitude", "latitude");
      System.out.printf("%11s, %5s, %10s, %9s\n", "# (degrees)", "", "(degrees)", "(degrees)");
      for (Double dist : distanceMap.keySet()) {
        int index = distanceMap.get(dist);
        lv = fs.getTileCenter(index);
        System.out.printf(
            "%11.5f, %5d, %10.5f, %9.5f\n",
            Math.toDegrees(dist),
            index,
            Math.toDegrees(lv.getLongitude()),
            Math.toDegrees(lv.getLatitude()));
      }
      System.exit(0);
    }

    if (cl.hasOption("png")) {
      PlotConfig config = ImmutablePlotConfig.builder().width(1000).height(1000).build();

      String title = String.format("Fibonacci Sphere, n = %d, ", npts);

      Map<String, Projection> projections = new LinkedHashMap<>();
      projections.put(
          title + "Rectangular Projection",
          new ProjectionRectangular(config.width(), config.height() / 2));
      projections.put(
          title + "Orthographic Projection (0, 90)",
          new ProjectionOrthographic(
              config.width(), config.height(), new LatitudinalVector(1, Math.PI / 2, 0)));
      projections.put(
          title + "Orthographic Projection (0, 0)",
          new ProjectionOrthographic(
              config.width(), config.height(), new LatitudinalVector(1, 0, 0)));
      projections.put(
          title + "Orthographic Projection (90, 0)",
          new ProjectionOrthographic(
              config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI / 2)));
      projections.put(
          title + "Orthographic Projection (180, 0)",
          new ProjectionOrthographic(
              config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI)));
      projections.put(
          title + "Orthographic Projection (270, 0)",
          new ProjectionOrthographic(
              config.width(), config.height(), new LatitudinalVector(1, 0, 3 * Math.PI / 2)));
      projections.put(
          title + "Orthographic Projection (0, -90)",
          new ProjectionOrthographic(
              config.width(), config.height(), new LatitudinalVector(1, -Math.PI / 2, 0)));

      final int nColors = 6;
      ColorRamp ramp = ColorRamp.createLinear(1, nColors - 1);
      List<Color> colors = new ArrayList<>();
      colors.add(Color.BLACK);
      for (int i = 1; i < nColors; i++) colors.add(ramp.getColor(i));
      colors.add(Color.WHITE);
      ramp = ImmutableColorRamp.builder().min(0).max(nColors).colors(colors).build();

      double radius = fs.getDistanceStats().getMax();
      ramp = ColorRamp.createLinear(0, radius).addLimitColors();

      ArrayList<BufferedImage> images = new ArrayList<>();
      for (String t : projections.keySet()) {
        config = ImmutablePlotConfig.builder().from(config).title(t).build();
        Projection p = projections.get(t);

        if (p instanceof ProjectionRectangular)
          config = ImmutablePlotConfig.builder().from(config).height(500).build();
        else config = ImmutablePlotConfig.builder().from(config).height(1000).build();

        MapPlot canvas = new MapPlot(config, p);
        AxisX xLowerAxis = new AxisX(0, 360, "Longitude (degrees)", "%.0fE");
        AxisY yLeftAxis = new AxisY(-90, 90, "Latitude (degrees)", "%.0f");

        canvas.drawTitle();
        canvas.setAxes(xLowerAxis, yLeftAxis);
        // canvas.drawAxes();

        BufferedImage image = canvas.getImage();
        for (int i = 0; i < config.width(); i++) {
          for (int j = 0; j < config.height(); j++) {
            LatitudinalVector lv = p.pixelToSpherical(i, j);
            if (lv == null) continue;
            double closestDistance = Math.toDegrees(fs.getNearest(lv).getKey());
            // int numPoints = fs.getDistanceMap(lv).subMap(0., Math.toRadians(radius)).size();
            image.setRGB(
                config.leftMargin() + i,
                config.topMargin() + j,
                ramp.getColor(closestDistance).getRGB());
          }
        }

        DiscreteDataSet points = new DiscreteDataSet("");
        for (int i = 0; i < fs.getNumTiles(); i++) {
          LatitudinalVector lv = fs.getTileCenter(i);
          points.add(lv.getLongitude(), lv.getLatitude());
        }

        if (p instanceof ProjectionRectangular) {
          canvas.drawColorBar(
              ImmutableColorBar.builder()
                  .rect(
                      new Rectangle(
                          canvas.getPageWidth() - 60, config.topMargin(), 10, config.height()))
                  .ramp(ramp)
                  .numTicks(nColors + 1)
                  .tickFunction(aDouble -> String.format("%.1f", aDouble))
|
||||||
.build());
|
projections.put(
|
||||||
|
title + "Rectangular Projection", new ProjectionRectangular(config.width(), config.height() / 2));
|
||||||
// for (int i = 0; i < fs.getNumTiles(); i++) {
|
projections.put(
|
||||||
// LatitudinalVector lv = fs.getTileCenter(i);
|
title + "Orthographic Projection (0, 90)",
|
||||||
// canvas.drawCircle(lv, radius, Math.toRadians(1), Color.RED);
|
new ProjectionOrthographic(
|
||||||
// }
|
config.width(), config.height(), new LatitudinalVector(1, Math.PI / 2, 0)));
|
||||||
}
|
projections.put(
|
||||||
|
title + "Orthographic Projection (0, 0)",
|
||||||
for (int i = 0; i < fs.getNumTiles(); i++) {
|
new ProjectionOrthographic(config.width(), config.height(), new LatitudinalVector(1, 0, 0)));
|
||||||
LatitudinalVector lv = fs.getTileCenter(i);
|
projections.put(
|
||||||
canvas.addAnnotation(
|
title + "Orthographic Projection (90, 0)",
|
||||||
ImmutableAnnotation.builder().text(String.format("%d", i)).build(),
|
new ProjectionOrthographic(
|
||||||
lv.getLongitude(),
|
config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI / 2)));
|
||||||
lv.getLatitude());
|
projections.put(
|
||||||
}
|
title + "Orthographic Projection (180, 0)",
|
||||||
|
new ProjectionOrthographic(config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI)));
|
||||||
images.add(canvas.getImage());
|
projections.put(
|
||||||
}
|
title + "Orthographic Projection (270, 0)",
|
||||||
|
new ProjectionOrthographic(
|
||||||
int width = 2400;
|
config.width(), config.height(), new LatitudinalVector(1, 0, 3 * Math.PI / 2)));
|
||||||
int height = 2400;
|
projections.put(
|
||||||
BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);
|
title + "Orthographic Projection (0, -90)",
|
||||||
|
new ProjectionOrthographic(
|
||||||
Graphics2D g = image.createGraphics();
|
config.width(), config.height(), new LatitudinalVector(1, -Math.PI / 2, 0)));
|
||||||
g.setRenderingHint(
|
|
||||||
RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_BILINEAR);
|
final int nColors = 6;
|
||||||
int imageWidth = width;
|
ColorRamp ramp = ColorRamp.createLinear(1, nColors - 1);
|
||||||
int imageHeight = height / 3;
|
List<Color> colors = new ArrayList<>();
|
||||||
g.drawImage(
|
colors.add(Color.BLACK);
|
||||||
images.getFirst(),
|
for (int i = 1; i < nColors; i++) colors.add(ramp.getColor(i));
|
||||||
width / 6,
|
colors.add(Color.WHITE);
|
||||||
0,
|
ramp = ImmutableColorRamp.builder()
|
||||||
5 * width / 6,
|
.min(0)
|
||||||
imageHeight,
|
.max(nColors)
|
||||||
0,
|
.colors(colors)
|
||||||
0,
|
.build();
|
||||||
images.getFirst().getWidth(),
|
|
||||||
images.getFirst().getHeight(),
|
double radius = fs.getDistanceStats().getMax();
|
||||||
null);
|
ramp = ColorRamp.createLinear(0, radius).addLimitColors();
|
||||||
|
|
||||||
imageWidth = width / 3;
|
ArrayList<BufferedImage> images = new ArrayList<>();
|
||||||
for (int i = 1; i < 4; i++) {
|
for (String t : projections.keySet()) {
|
||||||
g.drawImage(
|
config = ImmutablePlotConfig.builder().from(config).title(t).build();
|
||||||
images.get(i),
|
Projection p = projections.get(t);
|
||||||
(i - 1) * imageWidth,
|
|
||||||
imageHeight,
|
if (p instanceof ProjectionRectangular)
|
||||||
i * imageWidth,
|
config = ImmutablePlotConfig.builder()
|
||||||
2 * imageHeight,
|
.from(config)
|
||||||
0,
|
.height(500)
|
||||||
0,
|
.build();
|
||||||
images.get(i).getWidth(),
|
else
|
||||||
images.get(i).getHeight(),
|
config = ImmutablePlotConfig.builder()
|
||||||
null);
|
.from(config)
|
||||||
}
|
.height(1000)
|
||||||
for (int i = 4; i < 7; i++) {
|
.build();
|
||||||
g.drawImage(
|
|
||||||
images.get(i),
|
MapPlot canvas = new MapPlot(config, p);
|
||||||
(i - 4) * imageWidth,
|
AxisX xLowerAxis = new AxisX(0, 360, "Longitude (degrees)", "%.0fE");
|
||||||
2 * imageHeight,
|
AxisY yLeftAxis = new AxisY(-90, 90, "Latitude (degrees)", "%.0f");
|
||||||
(i - 3) * imageWidth,
|
|
||||||
3 * imageHeight,
|
canvas.drawTitle();
|
||||||
0,
|
canvas.setAxes(xLowerAxis, yLeftAxis);
|
||||||
0,
|
// canvas.drawAxes();
|
||||||
images.get(i).getWidth(),
|
|
||||||
images.get(i).getHeight(),
|
BufferedImage image = canvas.getImage();
|
||||||
null);
|
for (int i = 0; i < config.width(); i++) {
|
||||||
}
|
for (int j = 0; j < config.height(); j++) {
|
||||||
g.dispose();
|
LatitudinalVector lv = p.pixelToSpherical(i, j);
|
||||||
|
if (lv == null) continue;
|
||||||
PlotCanvas.writeImage(cl.getOptionValue("png"), image);
|
double closestDistance =
|
||||||
|
Math.toDegrees(fs.getNearest(lv).getKey());
|
||||||
|
// int numPoints = fs.getDistanceMap(lv).subMap(0., Math.toRadians(radius)).size();
|
||||||
|
image.setRGB(
|
||||||
|
config.leftMargin() + i,
|
||||||
|
config.topMargin() + j,
|
||||||
|
ramp.getColor(closestDistance).getRGB());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
DiscreteDataSet points = new DiscreteDataSet("");
|
||||||
|
for (int i = 0; i < fs.getNumTiles(); i++) {
|
||||||
|
LatitudinalVector lv = fs.getTileCenter(i);
|
||||||
|
points.add(lv.getLongitude(), lv.getLatitude());
|
||||||
|
}
|
||||||
|
|
||||||
|
if (p instanceof ProjectionRectangular) {
|
||||||
|
canvas.drawColorBar(ImmutableColorBar.builder()
|
||||||
|
.rect(new Rectangle(canvas.getPageWidth() - 60, config.topMargin(), 10, config.height()))
|
||||||
|
.ramp(ramp)
|
||||||
|
.numTicks(nColors + 1)
|
||||||
|
.tickFunction(aDouble -> String.format("%.1f", aDouble))
|
||||||
|
.build());
|
||||||
|
|
||||||
|
// for (int i = 0; i < fs.getNumTiles(); i++) {
|
||||||
|
// LatitudinalVector lv = fs.getTileCenter(i);
|
||||||
|
// canvas.drawCircle(lv, radius, Math.toRadians(1), Color.RED);
|
||||||
|
// }
|
||||||
|
}
|
||||||
|
|
||||||
|
for (int i = 0; i < fs.getNumTiles(); i++) {
|
||||||
|
LatitudinalVector lv = fs.getTileCenter(i);
|
||||||
|
canvas.addAnnotation(
|
||||||
|
ImmutableAnnotation.builder()
|
||||||
|
.text(String.format("%d", i))
|
||||||
|
.build(),
|
||||||
|
lv.getLongitude(),
|
||||||
|
lv.getLatitude());
|
||||||
|
}
|
||||||
|
|
||||||
|
images.add(canvas.getImage());
|
||||||
|
}
|
||||||
|
|
||||||
|
int width = 2400;
|
||||||
|
int height = 2400;
|
||||||
|
BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);
|
||||||
|
|
||||||
|
Graphics2D g = image.createGraphics();
|
||||||
|
g.setRenderingHint(RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_BILINEAR);
|
||||||
|
int imageWidth = width;
|
||||||
|
int imageHeight = height / 3;
|
||||||
|
g.drawImage(
|
||||||
|
images.getFirst(),
|
||||||
|
width / 6,
|
||||||
|
0,
|
||||||
|
5 * width / 6,
|
||||||
|
imageHeight,
|
||||||
|
0,
|
||||||
|
0,
|
||||||
|
images.getFirst().getWidth(),
|
||||||
|
images.getFirst().getHeight(),
|
||||||
|
null);
|
||||||
|
|
||||||
|
imageWidth = width / 3;
|
||||||
|
for (int i = 1; i < 4; i++) {
|
||||||
|
g.drawImage(
|
||||||
|
images.get(i),
|
||||||
|
(i - 1) * imageWidth,
|
||||||
|
imageHeight,
|
||||||
|
i * imageWidth,
|
||||||
|
2 * imageHeight,
|
||||||
|
0,
|
||||||
|
0,
|
||||||
|
images.get(i).getWidth(),
|
||||||
|
images.get(i).getHeight(),
|
||||||
|
null);
|
||||||
|
}
|
||||||
|
for (int i = 4; i < 7; i++) {
|
||||||
|
g.drawImage(
|
||||||
|
images.get(i),
|
||||||
|
(i - 4) * imageWidth,
|
||||||
|
2 * imageHeight,
|
||||||
|
(i - 3) * imageWidth,
|
||||||
|
3 * imageHeight,
|
||||||
|
0,
|
||||||
|
0,
|
||||||
|
images.get(i).getWidth(),
|
||||||
|
images.get(i).getHeight(),
|
||||||
|
null);
|
||||||
|
}
|
||||||
|
g.dispose();
|
||||||
|
|
||||||
|
PlotCanvas.writeImage(cl.getOptionValue("png"), image);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|||||||
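The rendering code above colors each map pixel by its angular distance to the nearest point of a FibonacciSphere lattice. As a minimal, self-contained sketch of the golden-angle construction behind such a lattice (an illustration only: the `FibonacciLattice` class below is hypothetical and is not the Terrasaur `FibonacciSphere` API):

```java
import java.util.ArrayList;
import java.util.List;

/** Golden-angle (Fibonacci) lattice sketch; not the Terrasaur FibonacciSphere class. */
public class FibonacciLattice {

  /** Returns n {latitude, longitude} pairs in degrees, roughly evenly spaced on the unit sphere. */
  public static List<double[]> points(int n) {
    List<double[]> pts = new ArrayList<>();
    double goldenAngle = Math.PI * (3.0 - Math.sqrt(5.0)); // ~2.39996 rad
    for (int i = 0; i < n; i++) {
      double z = 1.0 - 2.0 * (i + 0.5) / n; // uniform in z gives uniform area per point
      double lat = Math.toDegrees(Math.asin(z));
      double lon = Math.toDegrees((i * goldenAngle) % (2.0 * Math.PI)); // already in [0, 360)
      pts.add(new double[] {lat, lon});
    }
    return pts;
  }

  public static void main(String[] args) {
    for (double[] p : points(100).subList(0, 3))
      System.out.printf("%9.5f, %10.5f%n", p[0], p[1]);
  }
}
```

Printing every point in the `%7d, %10.5f, %9.5f` style used above would reproduce the kind of table the `-printCoords` option emits.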
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.io.File;
@@ -28,127 +50,140 @@ import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.SPICEUtil;

public class TransformFrame implements TerrasaurTool {
  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Transform coordinates between reference frames.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "";
    String footer = "\nThis program transforms coordinates between reference frames.\n";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private NavigableMap<TDBTime, Vector3> pointsIn;
  private NavigableMap<TDBTime, Vector3> pointsOut;

  public TransformFrame() {}

  public void setPoints(NavigableMap<TDBTime, Vector3> pointsIn) {
    this.pointsIn = pointsIn;
  }

  public void transformCoordinates(String inFrame, String outFrame) {
    try {
      ReferenceFrame from = new ReferenceFrame(inFrame);
      ReferenceFrame to = new ReferenceFrame(outFrame);
      pointsOut = new TreeMap<>(SPICEUtil.tdbComparator);
      for (TDBTime t : pointsIn.keySet()) {
        Matrix33 transform = from.getPositionTransformation(to, t);
        pointsOut.put(t, transform.mxv(pointsIn.get(t)));
      }
    } catch (SpiceException e) {
      logger.error(e.getLocalizedMessage());
    }
  }

  public void write(String outFile) {
    try (PrintWriter pw = new PrintWriter(outFile)) {
      for (TDBTime t : pointsOut.keySet()) {
        Vector3 v = pointsOut.get(t);
        pw.printf("%.6f,%.6e,%.6e,%.6e\n", t.getTDBSeconds(), v.getElt(0), v.getElt(1), v.getElt(2));
      }
    } catch (FileNotFoundException | SpiceException e) {
      logger.error(e.getLocalizedMessage());
    }
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("inFile")
        .required()
        .hasArg()
        .desc("Required. Text file containing comma separated t, x, y, z values. Time is ET.")
        .build());
    options.addOption(Option.builder("inFrame")
        .required()
        .hasArg()
        .desc("Required. Name of inFile reference frame.")
        .build());
    options.addOption(Option.builder("outFile")
        .required()
        .hasArg()
        .desc("Required. Name of output file. It will be in the same format as inFile.")
        .build());
    options.addOption(Option.builder("outFrame")
        .required()
        .hasArg()
        .desc("Required. Name of outFile reference frame.")
        .build());
    options.addOption(Option.builder("spice")
        .required()
        .hasArg()
        .desc("Required. Name of SPICE metakernel containing kernels needed to make the frame transformation.")
        .build());
    options.addOption(Option.builder("logFile")
        .hasArg()
        .desc("If present, save screen output to log file.")
        .build());
    StringBuilder sb = new StringBuilder();
    for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
    options.addOption(Option.builder("logLevel")
        .hasArg()
        .desc("If present, print messages above selected priority. Valid values are "
            + sb.toString().trim() + ". Default is INFO.")
        .build());
    return options;
  }

  public static void main(String[] args) {
    TerrasaurTool defaultOBJ = new TransformFrame();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    TransformFrame tf = new TransformFrame();
    NavigableMap<TDBTime, Vector3> map = new TreeMap<>(SPICEUtil.tdbComparator);
    try {
      File f = new File(cl.getOptionValue("inFile"));
      List<String> lines = FileUtils.readLines(f, Charset.defaultCharset());
      for (String line : lines) {
        String trim = line.trim();
        if (trim.isEmpty() || trim.startsWith("#")) continue;
        String[] parts = trim.split(",");
        double et = Double.parseDouble(parts[0].trim());
        if (et > 0) {
          TDBTime t = new TDBTime(et);
          Vector3 v = new Vector3(
              Double.parseDouble(parts[1].trim()),
              Double.parseDouble(parts[2].trim()),
              Double.parseDouble(parts[3].trim()));
          map.put(t, v);
        }
      }
    } catch (IOException e) {
      logger.error(e.getLocalizedMessage());
    }

    tf.setPoints(map);

    NativeLibraryLoader.loadSpiceLibraries();
    try {
      KernelDatabase.load(cl.getOptionValue("spice"));
    } catch (SpiceErrorException e) {
      logger.error(e.getLocalizedMessage());
    }

    tf.transformCoordinates(cl.getOptionValue("inFrame"), cl.getOptionValue("outFrame"));
    tf.write(cl.getOptionValue("outFile"));
  }
}
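TransformFrame above reads comma-separated `t, x, y, z` records, rotates each vector with the epoch's SPICE position transformation, and writes the result back in the same format. A minimal sketch of that per-record round trip, with a fixed 90° rotation about +Z standing in for `ReferenceFrame.getPositionTransformation` (the `TransformSketch` class and its rotation are illustrative assumptions, not the tool's API):

```java
import java.util.Locale;

/** Sketch of TransformFrame's per-record I/O; the fixed rotation is a stand-in for SPICE. */
public class TransformSketch {

  /** 90-degree rotation about +Z: (x, y, z) -> (-y, x, z). Placeholder for the frame transform. */
  static double[] rotate(double[] v) {
    return new double[] {-v[1], v[0], v[2]};
  }

  /** Parses one "t,x,y,z" record, transforms it, and reformats it as the tool's write() does. */
  public static String transformLine(String line) {
    String[] parts = line.trim().split(",");
    double et = Double.parseDouble(parts[0].trim());
    double[] v = {
      Double.parseDouble(parts[1].trim()),
      Double.parseDouble(parts[2].trim()),
      Double.parseDouble(parts[3].trim())
    };
    double[] out = rotate(v);
    // Locale fixed so the output stays machine readable regardless of platform settings
    return String.format(Locale.ROOT, "%.6f,%.6e,%.6e,%.6e", et, out[0], out[1], out[2]);
  }

  public static void main(String[] args) {
    System.out.println(transformLine("10.0, 0.0, 2.0, 3.0"));
  }
}
```

Lines starting with `#` and non-positive epochs are skipped by the real tool before this step; the sketch only covers the parse/transform/format core.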
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.util.LinkedHashMap;
@@ -21,220 +43,232 @@ import terrasaur.utils.NativeLibraryLoader;

/**
 * Translate time between formats.
 *
 * @author nairah1
 */
public class TranslateTime implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Convert between different time systems.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "";
    String footer = "\nConvert between different time systems.\n";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private enum Types {
    JULIAN,
    SCLK,
    TDB,
    TDBCALENDAR,
    UTC
  }

  private Map<Integer, SCLK> sclkMap;

  private TranslateTime() {}

  public TranslateTime(Map<Integer, SCLK> sclkMap) {
    this.sclkMap = sclkMap;
  }

  private TDBTime tdb;

  public String toJulian() throws SpiceErrorException {
    return tdb.toString("JULIAND.######");
  }

  private SCLK sclkKernel;

  public SCLK getSCLKKernel() {
    return sclkKernel;
  }

  public void setSCLKKernel(int sclkID) {
    sclkKernel = sclkMap.get(sclkID);
    if (sclkKernel == null) {
      logger.error("SCLK {} is not loaded!", sclkID);
    }
  }

  public SCLKTime toSCLK() throws SpiceException {
    return new SCLKTime(sclkKernel, tdb);
  }

  public TDBTime toTDB() {
    return tdb;
  }

  public String toUTC() throws SpiceErrorException {
    return tdb.toUTCString("ISOC", 3);
  }

  public void setJulianDate(double julianDate) throws SpiceErrorException {
    tdb = new TDBTime(String.format("%.6f JDUTC", julianDate));
  }

  public void setSCLK(String sclkString) throws SpiceException {
    tdb = new TDBTime(new SCLKTime(sclkKernel, sclkString));
  }

  public void setTDB(double tdb) {
    this.tdb = new TDBTime(tdb);
  }

  public void setTDBCalendarString(String tdbString) throws SpiceErrorException {
    tdb = new TDBTime(String.format("%s TDB", tdbString));
  }

  public void setUTC(String utcStr) throws SpiceErrorException {
    tdb = new TDBTime(utcStr);
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("logFile").hasArg()
        .desc("If present, save screen output to log file.").build());
    StringBuilder sb = new StringBuilder();
    for (StandardLevel l : StandardLevel.values())
      sb.append(String.format("%s ", l.name()));
    options.addOption(Option.builder("logLevel").hasArg()
        .desc("If present, print messages above selected priority. Valid values are "
            + sb.toString().trim() + ". Default is INFO.")
        .build());
    options.addOption(Option.builder("sclk").hasArg().desc(
        "SPICE id of the sclk to use. Default is to use the first one found in the kernel pool.")
        .build());
    options.addOption(Option.builder("spice").required().hasArg()
        .desc("Required. SPICE metakernel containing leap second and SCLK.").build());
    options.addOption(Option.builder("gui").desc("Launch a GUI.").build());
    options.addOption(Option.builder("inputDate").hasArgs().desc("Date to translate.").build());
    sb = new StringBuilder();
    for (Types system : Types.values()) {
      sb.append(String.format("%s ", system.name()));
    }
    options.addOption(Option.builder("inputSystem").hasArg().desc(
        "Timesystem of inputDate. Valid values are " + sb.toString().trim() + ". Default is UTC.")
        .build());
    return options;
  }

  public static void main(String[] args) throws SpiceException {
    TerrasaurTool defaultOBJ = new TranslateTime();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    // This is to avoid java crashing due to inability to connect to an X display
    if (!cl.hasOption("gui")) System.setProperty("java.awt.headless", "true");

    NativeLibraryLoader.loadSpiceLibraries();

    for (String kernel : cl.getOptionValues("spice")) KernelDatabase.load(kernel);

    LinkedHashMap<Integer, SCLK> sclkMap = new LinkedHashMap<>();
    String[] sclk_data_type = KernelPool.getNames("SCLK_DATA_*");
    for (String s : sclk_data_type) {
      String[] parts = s.split("_");
      int sclkID = -Integer.parseInt(parts[parts.length - 1]);
      sclkMap.put(sclkID, new SCLK(sclkID));
    }

    SCLK sclk = null;
    if (cl.hasOption("sclk")) {
      int sclkID = Integer.parseInt(cl.getOptionValue("sclk"));
      if (sclkMap.containsKey(sclkID)) sclk = sclkMap.get(sclkID);
      else {
        logger.error("Cannot find SCLK {} in kernel pool!", sclkID);
        StringBuilder sb = new StringBuilder();
        for (Integer id : sclkMap.keySet()) sb.append(String.format("%d ", id));
        logger.error("Loaded IDs are {}", sb.toString());
      }
    } else {
      if (!sclkMap.values().isEmpty())
        // set the SCLK to the first one found
        sclk = sclkMap.values().stream().toList().get(0);
    }

    if (sclk == null) {
      logger.fatal("Cannot load SCLK");
      System.exit(0);
    }

    TranslateTime tt = new TranslateTime(sclkMap);

    if (cl.hasOption("gui")) {
      TranslateTimeFX.setTranslateTime(tt);
      TranslateTimeFX.setSCLKIDs(sclkMap.keySet());
      TranslateTimeFX.main(args);
      System.exit(0);
    } else {
      if (!cl.hasOption("inputDate")) {
        logger.fatal("Missing required option -inputDate!");
        System.exit(1);
      }
      tt.setSCLKKernel(sclk.getIDCode());
    }

    StringBuilder sb = new StringBuilder();
    for (String s : cl.getOptionValues("inputDate")) sb.append(String.format("%s ", s));
    String inputDate = sb.toString().trim();

    Types type =
        cl.hasOption("inputSystem") ? Types.valueOf(cl.getOptionValue("inputSystem").toUpperCase())
            : Types.UTC;

    switch (type) {
      case JULIAN:
        tt.setJulianDate(Double.parseDouble(inputDate));
        break;
      case SCLK:
        tt.setSCLK(inputDate);
        break;
      case TDB:
        tt.setTDB(Double.parseDouble(inputDate));
        break;
      case TDBCALENDAR:
        tt.setTDBCalendarString(inputDate);
        break;
      case UTC:
        tt.setUTC(inputDate);
        break;
    }

    System.out.printf("# input date %s (%s)\n", inputDate, type.name());
    System.out.printf("# UTC, TDB (Calendar), DOY, TDB, Julian Date, SCLK (%d)\n",
        sclk.getIDCode());

    String utcString = tt.toTDB().toUTCString("ISOC", 3);
    String tdbString = tt.toTDB().toString("YYYY-MM-DDTHR:MN:SC.### ::TDB");
    String doyString = tt.toTDB().toString("DOY");

    System.out.printf("%s, %s, %s, %.6f, %s, %s\n", utcString, tdbString, doyString,
        tt.toTDB().getTDBSeconds(), tt.toJulian(), tt.toSCLK().toString());
  }
|
||||||
|
tdb = new TDBTime(new SCLKTime(sclkKernel, sclkString));
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setTDB(double tdb) {
|
||||||
|
this.tdb = new TDBTime(tdb);
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setTDBCalendarString(String tdbString) throws SpiceErrorException {
|
||||||
|
tdb = new TDBTime(String.format("%s TDB", tdbString));
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setUTC(String utcStr) throws SpiceErrorException {
|
||||||
|
tdb = new TDBTime(utcStr);
|
||||||
|
}
|
||||||
|
|
||||||
|
private static Options defineOptions() {
|
||||||
|
Options options = TerrasaurTool.defineOptions();
|
||||||
|
options.addOption(Option.builder("logFile")
|
||||||
|
.hasArg()
|
||||||
|
.desc("If present, save screen output to log file.")
|
||||||
|
.build());
|
||||||
|
StringBuilder sb = new StringBuilder();
|
||||||
|
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
|
||||||
|
options.addOption(Option.builder("logLevel")
|
||||||
|
.hasArg()
|
||||||
|
.desc("If present, print messages above selected priority. Valid values are "
|
||||||
|
+ sb.toString().trim() + ". Default is INFO.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("sclk")
|
||||||
|
.hasArg()
|
||||||
|
.desc("SPICE id of the sclk to use. Default is to use the first one found in the kernel pool.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("spice")
|
||||||
|
.required()
|
||||||
|
.hasArg()
|
||||||
|
.desc("Required. SPICE metakernel containing leap second and SCLK.")
|
||||||
|
.build());
|
||||||
|
options.addOption(Option.builder("gui").desc("Launch a GUI.").build());
|
||||||
|
options.addOption(
|
||||||
|
Option.builder("inputDate").hasArgs().desc("Date to translate.").build());
|
||||||
|
sb = new StringBuilder();
|
||||||
|
for (Types system : Types.values()) {
|
||||||
|
sb.append(String.format("%s ", system.name()));
|
||||||
|
}
|
||||||
|
options.addOption(Option.builder("inputSystem")
|
||||||
|
.hasArg()
|
||||||
|
.desc("Timesystem of inputDate. Valid values are "
|
||||||
|
+ sb.toString().trim() + ". Default is UTC.")
|
||||||
|
.build());
|
||||||
|
return options;
|
||||||
|
}
|
||||||
|
|
||||||
|
public static void main(String[] args) throws SpiceException {
|
||||||
|
TerrasaurTool defaultOBJ = new TranslateTime();
|
||||||
|
|
||||||
|
Options options = defineOptions();
|
||||||
|
|
||||||
|
CommandLine cl = defaultOBJ.parseArgs(args, options);
|
||||||
|
|
||||||
|
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
|
||||||
|
for (MessageLabel ml : startupMessages.keySet())
|
||||||
|
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
|
||||||
|
|
||||||
|
// This is to avoid java crashing due to inability to connect to an X display
|
||||||
|
if (!cl.hasOption("gui")) System.setProperty("java.awt.headless", "true");
|
||||||
|
|
||||||
|
NativeLibraryLoader.loadSpiceLibraries();
|
||||||
|
|
||||||
|
for (String kernel : cl.getOptionValues("spice")) KernelDatabase.load(kernel);
|
||||||
|
|
||||||
|
LinkedHashMap<Integer, SCLK> sclkMap = new LinkedHashMap<>();
|
||||||
|
String[] sclk_data_type = KernelPool.getNames("SCLK_DATA_*");
|
||||||
|
for (String s : sclk_data_type) {
|
||||||
|
String[] parts = s.split("_");
|
||||||
|
int sclkID = -Integer.parseInt(parts[parts.length - 1]);
|
||||||
|
sclkMap.put(sclkID, new SCLK(sclkID));
|
||||||
|
}
|
||||||
|
|
||||||
|
SCLK sclk = null;
|
||||||
|
if (cl.hasOption("sclk")) {
|
||||||
|
int sclkID = Integer.parseInt(cl.getOptionValue("sclk"));
|
||||||
|
if (sclkMap.containsKey(sclkID)) sclk = sclkMap.get(sclkID);
|
||||||
|
else {
|
||||||
|
logger.error("Cannot find SCLK {} in kernel pool!", sclkID);
|
||||||
|
StringBuilder sb = new StringBuilder();
|
||||||
|
for (Integer id : sclkMap.keySet()) sb.append(String.format("%d ", id));
|
||||||
|
logger.error("Loaded IDs are {}", sb.toString());
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
if (!sclkMap.values().isEmpty())
|
||||||
|
// set the SCLK to the first one found
|
||||||
|
sclk = sclkMap.values().stream().toList().get(0);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (sclk == null) {
|
||||||
|
logger.fatal("Cannot load SCLK");
|
||||||
|
System.exit(0);
|
||||||
|
}
|
||||||
|
|
||||||
|
TranslateTime tt = new TranslateTime(sclkMap);
|
||||||
|
|
||||||
|
if (cl.hasOption("gui")) {
|
||||||
|
TranslateTimeFX.setTranslateTime(tt);
|
||||||
|
TranslateTimeFX.setSCLKIDs(sclkMap.keySet());
|
||||||
|
TranslateTimeFX.main(args);
|
||||||
|
System.exit(0);
|
||||||
|
} else {
|
||||||
|
if (!cl.hasOption("inputDate")) {
|
||||||
|
logger.fatal("Missing required option -inputDate!");
|
||||||
|
System.exit(1);
|
||||||
|
}
|
||||||
|
tt.setSCLKKernel(sclk.getIDCode());
|
||||||
|
}
|
||||||
|
|
||||||
|
StringBuilder sb = new StringBuilder();
|
||||||
|
for (String s : cl.getOptionValues("inputDate")) sb.append(String.format("%s ", s));
|
||||||
|
String inputDate = sb.toString().trim();
|
||||||
|
|
||||||
|
Types type = cl.hasOption("inputSystem")
|
||||||
|
? Types.valueOf(cl.getOptionValue("inputSystem").toUpperCase())
|
||||||
|
: Types.UTC;
|
||||||
|
|
||||||
|
switch (type) {
|
||||||
|
case JULIAN:
|
||||||
|
tt.setJulianDate(Double.parseDouble(inputDate));
|
||||||
|
break;
|
||||||
|
case SCLK:
|
||||||
|
tt.setSCLK(inputDate);
|
||||||
|
break;
|
||||||
|
case TDB:
|
||||||
|
tt.setTDB(Double.parseDouble(inputDate));
|
||||||
|
break;
|
||||||
|
case TDBCALENDAR:
|
||||||
|
tt.setTDBCalendarString(inputDate);
|
||||||
|
break;
|
||||||
|
case UTC:
|
||||||
|
tt.setUTC(inputDate);
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
|
||||||
|
System.out.printf("# input date %s (%s)\n", inputDate, type.name());
|
||||||
|
System.out.printf("# UTC, TDB (Calendar), DOY, TDB, Julian Date, SCLK (%d)\n", sclk.getIDCode());
|
||||||
|
|
||||||
|
String utcString = tt.toTDB().toUTCString("ISOC", 3);
|
||||||
|
String tdbString = tt.toTDB().toString("YYYY-MM-DDTHR:MN:SC.### ::TDB");
|
||||||
|
String doyString = tt.toTDB().toString("DOY");
|
||||||
|
|
||||||
|
System.out.printf(
|
||||||
|
"%s, %s, %s, %.6f, %s, %s\n",
|
||||||
|
utcString,
|
||||||
|
tdbString,
|
||||||
|
doyString,
|
||||||
|
tt.toTDB().getTDBSeconds(),
|
||||||
|
tt.toJulian(),
|
||||||
|
tt.toSCLK().toString());
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
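TranslateTime's Julian date handling goes through SPICE's TDBTime (the "JDUTC" parse and "JULIAND.######" output format). As a rough standalone cross-check, the UTC-to-Julian-Date arithmetic can be sketched in plain `java.time` — a sketch only, with no leap-second or TDB handling, so it agrees with the SPICE path only for UTC-based Julian dates:

```java
import java.time.Instant;

public class JulianDateSketch {
  // Julian Date of the Unix epoch, 1970-01-01T00:00:00Z
  static final double JD_UNIX_EPOCH = 2440587.5;

  // Convert a UTC instant to a (UTC-based) Julian Date
  static double toJulian(Instant utc) {
    return JD_UNIX_EPOCH + utc.getEpochSecond() / 86400.0;
  }

  public static void main(String[] args) {
    // 2000-01-01T12:00:00 UTC is JD 2451545.0 in the JDUTC sense
    System.out.println(toJulian(Instant.parse("2000-01-01T12:00:00Z"))); // 2451545.0
  }
}
```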
src/main/java/terrasaur/apps/TriAx.java (new file, 159 lines)
@@ -0,0 +1,159 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.io.*;
import java.util.Map;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.io.FilenameUtils;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.ICQUtils;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.PolyDataUtil;
import vtk.vtkPolyData;

public class TriAx implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  private TriAx() {}

  @Override
  public String shortDescription() {
    return "Generate a triaxial ellipsoid in ICQ format.";
  }

  @Override
  public String fullDescription(Options options) {

    String footer =
        "\nTriAx is an implementation of the SPC tool TRIAX, which generates a triaxial ellipsoid in ICQ format.\n";
    return TerrasaurTool.super.fullDescription(options, "", footer);
  }

  static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("A")
        .required()
        .hasArg()
        .desc("Long axis of the ellipsoid, arbitrary units (usually assumed to be km).")
        .build());
    options.addOption(Option.builder("B")
        .required()
        .hasArg()
        .desc("Medium axis of the ellipsoid, arbitrary units (usually assumed to be km).")
        .build());
    options.addOption(Option.builder("C")
        .required()
        .hasArg()
        .desc("Short axis of the ellipsoid, arbitrary units (usually assumed to be km).")
        .build());
    options.addOption(Option.builder("Q")
        .required()
        .hasArg()
        .desc("ICQ size parameter. This is conventionally but not necessarily a power of 2.")
        .build());
    options.addOption(Option.builder("saveOBJ")
        .desc("If present, save in OBJ format as well. "
            + "File will have the same name as ICQ file with an OBJ extension.")
        .build());
    options.addOption(Option.builder("output")
        .hasArg()
        .required()
        .desc("Name of ICQ file to write.")
        .build());

    return options;
  }

  static final int MAX_Q = 512;

  public static void main(String[] args) {
    TerrasaurTool defaultOBJ = new TriAx();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));

    int q = Integer.parseInt(cl.getOptionValue("Q"));
    String shapefile = cl.getOptionValue("output");

    double[] ax = new double[3];
    ax[0] = Double.parseDouble(cl.getOptionValue("A"));
    ax[1] = Double.parseDouble(cl.getOptionValue("B"));
    ax[2] = Double.parseDouble(cl.getOptionValue("C"));

    double[][][][] vec = new double[3][MAX_Q + 1][MAX_Q + 1][6];
    for (int f = 0; f < 6; f++) {
      for (int i = 0; i <= q; i++) {
        for (int j = 0; j <= q; j++) {

          double[] u = ICQUtils.xyf2u(q, i, j, f, ax);
          double z = 1
              / Math.sqrt(
                  Math.pow(u[0] / ax[0], 2) + Math.pow(u[1] / ax[1], 2) + Math.pow(u[2] / ax[2], 2));

          double[] v = new Vector3D(u).scalarMultiply(z).toArray();
          for (int k = 0; k < 3; k++) {
            vec[k][i][j][f] = v[k];
          }
        }
      }
    }

    ICQUtils.writeICQ(q, vec, shapefile);

    if (cl.hasOption("saveOBJ")) {

      String basename = FilenameUtils.getBaseName(shapefile);
      String dirname = FilenameUtils.getFullPath(shapefile);
      if (dirname.isEmpty()) dirname = ".";
      File obj = new File(dirname, basename + ".obj");

      NativeLibraryLoader.loadVtkLibraries();
      try {
        vtkPolyData polydata = PolyDataUtil.loadShapeModel(shapefile);
        if (polydata == null) {
          logger.error("Cannot read {}", shapefile);
          System.exit(0);
        }

        polydata = PolyDataUtil.removeDuplicatePoints(polydata);
        polydata = PolyDataUtil.removeUnreferencedPoints(polydata);
        polydata = PolyDataUtil.removeZeroAreaFacets(polydata);

        PolyDataUtil.saveShapeModelAsOBJ(polydata, obj.getPath());
      } catch (Exception e) {
        logger.error(e);
      }
    }
  }
}
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.time.Instant;
@@ -8,7 +30,6 @@ import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
@@ -24,238 +45,238 @@ import vtk.vtkOBJReader;
import vtk.vtkPolyData;

public class ValidateNormals implements TerrasaurTool {
  private static final Logger logger = LogManager.getLogger();

  private ValidateNormals() {}

  @Override
  public String shortDescription() {
    return "Check facet normal directions for an OBJ shape file.";
  }

  @Override
  public String fullDescription(Options options) {
    String footer = "\nThis program checks that the normals of the shape model are not pointing inward.\n";
    return TerrasaurTool.super.fullDescription(options, "", footer);
  }

  static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("fast")
        .desc("If present, only check for overhangs if center and normal point in opposite "
            + "directions. Default behavior is to always check for intersections between body center "
            + "and facet center.")
        .build());
    options.addOption(Option.builder("origin")
        .hasArg()
        .desc("If present, center of body in xyz coordinates. "
            + "Specify as three floating point values separated by commas. Default is to use the centroid of "
            + "the input shape model.")
        .build());
    options.addOption(Option.builder("obj")
        .required()
        .hasArg()
        .desc("Shape model to validate.")
        .build());
    options.addOption(Option.builder("output")
        .hasArg()
        .desc("Write out new OBJ file with corrected vertex orders for facets.")
        .build());
    options.addOption(Option.builder("numThreads")
        .hasArg()
        .desc("Number of threads to run. Default is 1.")
        .build());
    return options;
  }

  private vtkPolyData polyData;
  private ThreadLocal<vtkOBBTree> threadLocalsearchTree;
  private double[] origin;

  public ValidateNormals(vtkPolyData polyData) {
    this.polyData = polyData;

    PolyDataStatistics stats = new PolyDataStatistics(polyData);
    origin = stats.getCentroid();

    threadLocalsearchTree = new ThreadLocal<>();
  }

  public vtkOBBTree getOBBTree() {
    vtkOBBTree searchTree = threadLocalsearchTree.get();
    if (searchTree == null) {
      searchTree = new vtkOBBTree();
      searchTree.SetDataSet(polyData);
      searchTree.SetTolerance(1e-12);
      searchTree.BuildLocator();
      threadLocalsearchTree.set(searchTree);
    }
    return searchTree;
  }

  public void setOrigin(double[] origin) {
    this.origin = origin;
  }

  private class FlippedNormalFinder implements Callable<List<Long>> {

    private static final DateTimeFormatter defaultFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss z")
        .withLocale(Locale.getDefault())
        .withZone(ZoneId.systemDefault());

    private final long index0;
    private final long index1;
    private final boolean fast;

    public FlippedNormalFinder(long index0, long index1, boolean fast) {
      this.index0 = index0;
      this.index1 = index1;
      this.fast = fast;
    }

    @Override
    public List<Long> call() {

      logger.info("Thread {}: indices {} to {}", Thread.currentThread().threadId(), index0, index1);
      vtkIdList idList = new vtkIdList();
      vtkIdList cellIds = new vtkIdList();
      List<Long> flippedNormals = new ArrayList<>();

      final long startTime = Instant.now().getEpochSecond();
      final long numFacets = index1 - index0;
      for (int i = 0; i < numFacets; ++i) {

        if (i > 0 && i % (numFacets / 10) == 0) {
          double pctDone = i / (numFacets * .01);
          long elapsed = Instant.now().getEpochSecond() - startTime;
          long estimatedFinish = (long) (elapsed / (pctDone / 100) + startTime);
          String finish = defaultFormatter.format(Instant.ofEpochSecond(estimatedFinish));
          logger.info(String.format(
              "Thread %d: read %d of %d facets. %.0f%% complete, projected finish %s",
              Thread.currentThread().threadId(), index0 + i, index1, pctDone, finish));
        }

        long index = index0 + i;

        CellInfo ci = CellInfo.getCellInfo(polyData, index, idList);
        boolean isOpposite = (ci.center().dotProduct(ci.normal()) < 0);

        int numCrossings = 0;
        if (isOpposite || !fast) {
          // count up all crossings of the surface between the origin and the facet.
          getOBBTree().IntersectWithLine(origin, ci.center().toArray(), null, cellIds);
          for (int j = 0; j < cellIds.GetNumberOfIds(); j++) {
            if (cellIds.GetId(j) == index) break;
            numCrossings++;
          }
        }

        // if numCrossings is even, the radial and normal should point in the same direction. If it
        // is odd, the radial and normal should point in opposite directions.
        boolean shouldBeOpposite = (numCrossings % 2 == 1);

        // XOR operator - true if both conditions are different
        if (isOpposite ^ shouldBeOpposite) flippedNormals.add(index);
      }

      return flippedNormals;
    }
  }

  public void flipNormals(Collection<Long> facets) {
    vtkCellArray cells = new vtkCellArray();
    for (long i = 0; i < polyData.GetNumberOfCells(); ++i) {
      vtkIdList idList = new vtkIdList();
      polyData.GetCellPoints(i, idList);
      if (facets.contains(i)) {
        long id0 = idList.GetId(0);
        long id1 = idList.GetId(1);
        long id2 = idList.GetId(2);
        idList.SetId(0, id0);
        idList.SetId(1, id2);
        idList.SetId(2, id1);
      }
      cells.InsertNextCell(idList);
    }
    polyData.SetPolys(cells);
  }

  public static void main(String[] args) throws Exception {
    TerrasaurTool defaultOBJ = new ValidateNormals();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));

    NativeLibraryLoader.loadVtkLibraries();

    // PolyDataUtil's OBJ reader messes with the normals - not reliable for a local obj
    vtkOBJReader smallBodyReader = new vtkOBJReader();
    smallBodyReader.SetFileName(cl.getOptionValue("obj"));
    smallBodyReader.Update();
    vtkPolyData polyData = new vtkPolyData();
    polyData.ShallowCopy(smallBodyReader.GetOutput());

    smallBodyReader.Delete();

    ValidateNormals app = new ValidateNormals(polyData);

    logger.info("Read {} facets from {}", polyData.GetNumberOfCells(), cl.getOptionValue("obj"));

    if (cl.hasOption("origin")) {
      String[] parts = cl.getOptionValue("origin").split(",");
      double[] origin = new double[3];
      for (int i = 0; i < 3; i++) origin[i] = Double.parseDouble(parts[i]);
      app.setOrigin(origin);
    }

    Set<Long> flippedNormals = new HashSet<>();

    boolean fast = cl.hasOption("fast");
    int numThreads = cl.hasOption("numThreads") ? Integer.parseInt(cl.getOptionValue("numThreads")) : 1;
    try (ExecutorService executor = Executors.newFixedThreadPool(numThreads)) {
      List<Future<List<Long>>> futures = new ArrayList<>();

      long numFacets = polyData.GetNumberOfCells() / numThreads;
      for (int i = 0; i < numThreads; i++) {
        long fromIndex = i * numFacets;
        long toIndex = Math.min(polyData.GetNumberOfCells(), fromIndex + numFacets);

        FlippedNormalFinder fnf = app.new FlippedNormalFinder(fromIndex, toIndex, fast);
        futures.add(executor.submit(fnf));
      }

      for (Future<List<Long>> future : futures) flippedNormals.addAll(future.get());

      executor.shutdown();
|
|
||||||
futures.add(executor.submit(fnf));
|
|
||||||
}
|
|
||||||
|
|
||||||
for (Future<List<Long>> future : futures) flippedNormals.addAll(future.get());
|
|
||||||
|
|
||||||
executor.shutdown();
|
|
||||||
}
|
|
||||||
|
|
||||||
logger.info(
|
|
||||||
"Found {} flipped normals out of {} facets",
|
|
||||||
flippedNormals.size(),
|
|
||||||
polyData.GetNumberOfCells());
|
|
||||||
|
|
||||||
if (cl.hasOption("output")) {
|
|
||||||
NavigableSet<Long> sorted = new TreeSet<>(flippedNormals);
|
|
||||||
String header = "";
|
|
||||||
if (!flippedNormals.isEmpty()) {
|
|
||||||
header = "# The following indices were flipped from " + cl.getOptionValue("obj") + ":\n";
|
|
||||||
StringBuilder sb = new StringBuilder("# ");
|
|
||||||
for (Long index : sorted) {
|
|
||||||
sb.append(String.format("%d", index));
|
|
||||||
if (index < sorted.last()) sb.append(", ");
|
|
||||||
}
|
}
|
||||||
sb.append("\n");
|
|
||||||
header += WordUtils.wrap(sb.toString(), 80, "\n# ", false);
|
|
||||||
logger.info(header);
|
|
||||||
}
|
|
||||||
|
|
||||||
app.flipNormals(flippedNormals);
|
logger.info("Found {} flipped normals out of {} facets", flippedNormals.size(), polyData.GetNumberOfCells());
|
||||||
PolyDataUtil.saveShapeModelAsOBJ(app.polyData, cl.getOptionValue("output"), header);
|
|
||||||
logger.info("wrote OBJ file {}", cl.getOptionValue("output"));
|
if (cl.hasOption("output")) {
|
||||||
|
NavigableSet<Long> sorted = new TreeSet<>(flippedNormals);
|
||||||
|
String header = "";
|
||||||
|
if (!flippedNormals.isEmpty()) {
|
||||||
|
header = "# The following indices were flipped from " + cl.getOptionValue("obj") + ":\n";
|
||||||
|
StringBuilder sb = new StringBuilder("# ");
|
||||||
|
for (Long index : sorted) {
|
||||||
|
sb.append(String.format("%d", index));
|
||||||
|
if (index < sorted.last()) sb.append(", ");
|
||||||
|
}
|
||||||
|
sb.append("\n");
|
||||||
|
header += WordUtils.wrap(sb.toString(), 80, "\n# ", false);
|
||||||
|
logger.info(header);
|
||||||
|
}
|
||||||
|
|
||||||
|
app.flipNormals(flippedNormals);
|
||||||
|
PolyDataUtil.saveShapeModelAsOBJ(app.polyData, cl.getOptionValue("output"), header);
|
||||||
|
logger.info("wrote OBJ file {}", cl.getOptionValue("output"));
|
||||||
|
}
|
||||||
|
|
||||||
|
logger.info("ValidateNormals done");
|
||||||
}
|
}
|
||||||
|
|
||||||
logger.info("ValidateNormals done");
|
|
||||||
}
|
|
||||||
}
|
}
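The flipped-normal test above XORs two conditions: whether `numCrossings` is odd (in which case the radial and normal vectors should point in opposite directions) and whether `center · normal` is actually negative; a flagged facet is then repaired in `flipNormals` by swapping two vertex ids, which reverses the winding order and thereby negates the computed normal. Both ideas can be sketched in a minimal standalone form (a hypothetical demo class, not Terrasaur code; the 2-D `normalZ` helper is an illustration, not the VTK computation):

```java
// Hypothetical demo of the parity test and winding-order fix used above.
public class FlippedNormalDemo {

    // A facet is flagged when its actual orientation (sign of radial . normal)
    // disagrees with what the ray-crossing parity says it should be.
    static boolean isFlipped(int numCrossings, double radialDotNormal) {
        boolean shouldBeOpposite = (numCrossings % 2 == 1);
        boolean isOpposite = (radialDotNormal < 0);
        return isOpposite ^ shouldBeOpposite; // XOR: true if the two disagree
    }

    // z component of the normal of triangle (a, b, c) lying in the XY plane.
    static double normalZ(double[] a, double[] b, double[] c) {
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]);
    }

    public static void main(String[] args) {
        // Even crossings, outward normal: consistent, not flipped.
        System.out.println(isFlipped(0, 1.0));  // false
        // Even crossings but normal points inward: flagged as flipped.
        System.out.println(isFlipped(2, -1.0)); // true

        double[] p0 = {0, 0}, p1 = {1, 0}, p2 = {0, 1};
        double ccw = normalZ(p0, p1, p2); // counterclockwise winding: +1
        double cw = normalZ(p0, p2, p1);  // last two vertices swapped: -1
        System.out.println(ccw == -cw);   // reversing the winding negates the normal
    }
}
```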
ValidateOBJ.java
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.apps;

import java.util.*;
@@ -29,374 +51,361 @@ import vtk.vtkPolyData;
 */
public class ValidateOBJ implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Check a closed shape file in OBJ format for errors.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "";
    String footer =
        """
        This program checks that a shape model has the correct number of facets and vertices. \
        It will also check for duplicate vertices, vertices that are not referenced by any facet, and zero area facets.
        """;
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private vtkPolyData polyData;
  private String validationMsg;

  private ValidateOBJ() {}

  public ValidateOBJ(vtkPolyData polyData) {
    this.polyData = polyData;
  }

  /**
   * @return {@link vtkPolyData#GetNumberOfCells()}
   */
  public long facetCount() {
    return polyData.GetNumberOfCells();
  }

  /**
   * @return {@link vtkPolyData#GetNumberOfPoints()}
   */
  public long vertexCount() {
    return polyData.GetNumberOfPoints();
  }

  /**
   * @return description of test result
   */
  public String getMessage() {
    return validationMsg;
  }

  /**
   * @return true if number of facets in the shape model satisfies 3*4^n where n is an integer
   */
  public boolean testFacets() {
    boolean meetsCondition = facetCount() % 3 == 0;

    if (meetsCondition) {
      long facet3 = facetCount() / 3;
      double logFacet3 = Math.log(facet3) / Math.log(4);
      if (Math.ceil(logFacet3) != Math.floor(logFacet3)) meetsCondition = false;
    }

    int n = (int) (Math.log(facetCount() / 3.) / Math.log(4.0) + 0.5);
    if (meetsCondition) {
      validationMsg =
          String.format("Model has %d facets. This satisfies f = 3*4^n with n = %d.", facetCount(), n);
    } else {
      validationMsg = String.format(
          "Model has %d facets. This does not satisfy f = 3*4^n. A shape model with %.0f facets has n = %d.",
          facetCount(), 3 * Math.pow(4, n), n);
    }

    return meetsCondition;
  }

  /**
   * @return true if number of vertices in the shape model satisfies v=f/2+2
   */
  public boolean testVertices() {
    boolean meetsCondition = (facetCount() + 4) / 2 == vertexCount();
    if (meetsCondition)
      validationMsg = String.format(
          "Model has %d vertices and %d facets. This satisfies v = f/2+2.", vertexCount(), facetCount());
    else
      validationMsg = String.format(
          "Model has %d vertices and %d facets. This does not satisfy v = f/2+2. Number of vertices should be %d.",
          vertexCount(), facetCount(), facetCount() / 2 + 2);

    return meetsCondition;
  }

  /**
   * @return key is vertex id, value is a list of vertices within a hard coded distance of 1e-10.
   */
  public NavigableMap<Long, List<Long>> findDuplicateVertices() {

    SmallBodyModel sbm = new SmallBodyModel(polyData);

    double[] iPt = new double[3];

    NavigableMap<Long, List<Long>> map = new TreeMap<>();
    double tol = 1e-10;
    for (long i = 0; i < vertexCount(); i++) {
      polyData.GetPoint(i, iPt);

      List<Long> closestVertices = new ArrayList<>();
      for (Long id : sbm.findClosestVerticesWithinRadius(iPt, tol)) if (id > i) closestVertices.add(id);
      if (!closestVertices.isEmpty()) map.put(i, closestVertices);

      if (map.containsKey(i) && !map.get(i).isEmpty()) {
        StringBuilder sb = new StringBuilder();
        sb.append(String.format("Duplicates for vertex %d: ", i + 1));
        for (Long dupId : map.get(i)) sb.append(String.format("%d ", dupId + 1));
        logger.debug(sb.toString());
      }
    }

    validationMsg = String.format("%d vertices have duplicates", map.size());

    return map;
  }

  /**
   * @return a list of vertex indices where one or more of the coordinates fail {@link
   *     Double#isFinite(double)}.
   */
  public List<Integer> findMalformedVertices() {
    double[] iPt = new double[3];
    NavigableSet<Integer> vertexIndices = new TreeSet<>();
    for (int i = 0; i < vertexCount(); i++) {
      polyData.GetPoint(i, iPt);
      for (int j = 0; j < 3; j++) {
        if (!Double.isFinite(iPt[j])) {
          logger.debug("Vertex {}: {} {} {}", i, iPt[0], iPt[1], iPt[2]);
          vertexIndices.add(i);
          break;
        }
      }
    }
    validationMsg = String.format("%d malformed vertices ", vertexIndices.size());
    return new ArrayList<>(vertexIndices);
  }

  /**
   * @return a list of vertex indices that are not referenced by any facet
   */
  public List<Long> findUnreferencedVertices() {
    NavigableSet<Long> vertexIndices = new TreeSet<>();
    for (long i = 0; i < polyData.GetNumberOfPoints(); i++) {
      vertexIndices.add(i);
    }

    vtkIdList idList = new vtkIdList();
    for (int i = 0; i < facetCount(); ++i) {
      polyData.GetCellPoints(i, idList);
      long id0 = idList.GetId(0);
      long id1 = idList.GetId(1);
      long id2 = idList.GetId(2);

      vertexIndices.remove(id0);
      vertexIndices.remove(id1);
      vertexIndices.remove(id2);
    }

    if (!vertexIndices.isEmpty()) {
      double[] pt = new double[3];
      for (long id : vertexIndices) {
        polyData.GetPoint(id, pt);
        logger.debug("Unreferenced vertex {} [{}, {}, {}]", id + 1, pt[0], pt[1], pt[2]);
        // note OBJ vertices are numbered from 1 but VTK uses 0
      }
    }

    validationMsg = String.format("%d unreferenced vertices found", vertexIndices.size());

    return new ArrayList<>(vertexIndices);
  }

  /**
   * @return a list of facet indices where the facet has zero area
   */
  public List<Integer> findZeroAreaFacets() {
    List<Integer> zeroAreaFacets = new ArrayList<>();
    vtkIdList idList = new vtkIdList();
    double[] pt0 = new double[3];
    double[] pt1 = new double[3];
    double[] pt2 = new double[3];

    for (int i = 0; i < facetCount(); ++i) {
      polyData.GetCellPoints(i, idList);
      long id0 = idList.GetId(0);
      long id1 = idList.GetId(1);
      long id2 = idList.GetId(2);
      polyData.GetPoint(id0, pt0);
      polyData.GetPoint(id1, pt1);
      polyData.GetPoint(id2, pt2);

      // would be faster to check if id0==id1||id0==id2||id1==id2 but there may be
      // duplicate vertices
      TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
      double area = facet.getArea();
      if (area == 0) {
        zeroAreaFacets.add(i);
        logger.debug(
            "Facet {} has zero area. Vertices are {} [{}, {}, {}], {} [{}, {}, {}], and {} [{}, {}, {}]",
            i + 1,
            id0 + 1,
            pt0[0],
            pt0[1],
            pt0[2],
            id1 + 1,
            pt1[0],
            pt1[1],
            pt1[2],
            id2 + 1,
            pt2[0],
            pt2[1],
            pt2[2]);
      }
    }

    validationMsg = String.format("%d zero area facets found", zeroAreaFacets.size());
    return zeroAreaFacets;
  }

  /**
   * @return statistics on the angle between the facet radial and normal vectors
   */
  public DescriptiveStatistics normalStats() {
    DescriptiveStatistics stats = new DescriptiveStatistics();

    VectorStatistics cStats = new VectorStatistics();
    VectorStatistics nStats = new VectorStatistics();

    vtkIdList idList = new vtkIdList();
    double[] pt0 = new double[3];
    double[] pt1 = new double[3];
    double[] pt2 = new double[3];
    for (int i = 0; i < facetCount(); ++i) {
      polyData.GetCellPoints(i, idList);
      long id0 = idList.GetId(0);
      long id1 = idList.GetId(1);
      long id2 = idList.GetId(2);
      polyData.GetPoint(id0, pt0);
      polyData.GetPoint(id1, pt1);
      polyData.GetPoint(id2, pt2);

      // would be faster to check if id0==id1||id0==id2||id1==id2 but there may be
      // duplicate vertices
      TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
      if (facet.getArea() > 0) {
        stats.addValue(facet.getCenter().createUnitized().getDot(facet.getNormal()));
        cStats.add(facet.getCenter());
        nStats.add(facet.getNormal());
      }
    }

    validationMsg = String.format(
        "Using %d non-zero area facets: Mean angle between radial and normal is %f degrees, "
            + "angle between mean radial and mean normal is %f degrees",
        stats.getN(),
        Math.toDegrees(Math.acos(stats.getMean())),
        Math.toDegrees(Vector3D.angle(cStats.getMean(), nStats.getMean())));

    return stats;
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("obj")
        .required()
        .hasArg()
        .desc("Shape model to validate.")
        .build());
    options.addOption(Option.builder("logFile")
        .hasArg()
        .desc("If present, save screen output to log file.")
        .build());
    StringBuilder sb = new StringBuilder();
    for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
    options.addOption(Option.builder("logLevel")
        .hasArg()
        .desc("If present, print messages above selected priority. Valid values are "
            + sb.toString().trim()
            + ". Default is INFO.")
        .build());
    options.addOption(Option.builder("output")
        .hasArg()
        .desc("Filename for output OBJ.")
        .build());
    options.addOption(Option.builder("removeDuplicateVertices")
        .desc("Remove duplicate vertices. Use with -output to save OBJ.")
        .build());
    options.addOption(Option.builder("removeUnreferencedVertices")
        .desc("Remove unreferenced vertices. Use with -output to save OBJ.")
        .build());
    options.addOption(Option.builder("removeZeroAreaFacets")
        .desc("Remove facets with zero area. Use with -output to save OBJ.")
        .build());
    options.addOption(Option.builder("cleanup")
        .desc("Combines -removeDuplicateVertices, -removeUnreferencedVertices, and -removeZeroAreaFacets.")
        .build());
    return options;
  }

  public static void main(String[] args) throws Exception {
    TerrasaurTool defaultOBJ = new ValidateOBJ();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));

    NativeLibraryLoader.loadVtkLibraries();
    vtkPolyData polyData = PolyDataUtil.loadShapeModel(cl.getOptionValue("obj"));

    if (polyData == null) {
      logger.error("Cannot read {}, exiting.", cl.getOptionValue("obj"));
      System.exit(0);
    }

    ValidateOBJ vo = new ValidateOBJ(polyData);

    logger.log(vo.testFacets() ? Level.INFO : Level.WARN, vo.getMessage());
    logger.log(vo.testVertices() ? Level.INFO : Level.WARN, vo.getMessage());

    DescriptiveStatistics stats = vo.normalStats();
    logger.log(stats.getMean() > 0 ? Level.INFO : Level.WARN, vo.getMessage());

    List<Integer> mfv = vo.findMalformedVertices();
    logger.log(!mfv.isEmpty() ? Level.WARN : Level.INFO, vo.getMessage());

    NavigableMap<Long, List<Long>> dv = vo.findDuplicateVertices();
    logger.log(!dv.isEmpty() ? Level.WARN : Level.INFO, vo.getMessage());

    List<Long> urv = vo.findUnreferencedVertices();
    logger.log(!urv.isEmpty() ? Level.WARN : Level.INFO, vo.getMessage());

    List<Integer> zaf = vo.findZeroAreaFacets();
    logger.log(!zaf.isEmpty() ? Level.WARN : Level.INFO, vo.getMessage());

    final boolean cleanup = cl.hasOption("cleanup");
    final boolean removeDuplicateVertices = cleanup || cl.hasOption("removeDuplicateVertices");
    final boolean removeUnreferencedVertices = cleanup || cl.hasOption("removeUnreferencedVertices");
    final boolean removeZeroAreaFacets = cleanup || cl.hasOption("removeZeroAreaFacets");

    if (removeDuplicateVertices) polyData = PolyDataUtil.removeDuplicatePoints(polyData);

    if (removeUnreferencedVertices) polyData = PolyDataUtil.removeUnreferencedPoints(polyData);

    if (removeZeroAreaFacets) polyData = PolyDataUtil.removeZeroAreaFacets(polyData);

    if (cl.hasOption("output")) {
      PolyDataUtil.saveShapeModelAsOBJ(polyData, cl.getOptionValue("output"));
      logger.info("Wrote OBJ file {}", cl.getOptionValue("output"));
    }
  }
}
||||||
|
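The -cleanup handling in main() above folds one umbrella flag into three specific removal flags. A minimal sketch of that flag-composition pattern, using a hypothetical Set of parsed option names in place of Apache Commons CLI's CommandLine:

```java
import java.util.Set;

/** Sketch of the -cleanup flag composition used in ValidateOBJ.main. Hypothetical helper. */
public class CleanupFlags {
  /** True if the named option is enabled directly or implied by the umbrella -cleanup flag. */
  public static boolean enabled(Set<String> cliOptions, String name) {
    return cliOptions.contains("cleanup") || cliOptions.contains(name);
  }

  public static void main(String[] args) {
    Set<String> opts = Set.of("cleanup");
    // -cleanup alone implies every removal pass
    System.out.println(enabled(opts, "removeDuplicateVertices")); // true
    System.out.println(enabled(Set.of(), "removeDuplicateVertices")); // false
  }
}
```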
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
/**
 * Applications that can be run on the command line. Running each without arguments gives a usage
 * summary.
@@ -1,32 +1,57 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.config;

import jackfruit.annotations.Comment;
import jackfruit.annotations.DefaultValue;
import jackfruit.annotations.Jackfruit;
import java.util.List;

@Jackfruit
public interface CKFromSumFileConfig {

  @Comment(
      """
      Body fixed frame for the target body. If blank, use SPICE-defined
      body fixed frame. This will be the reference frame unless the J2000
      parameter is set to true.""")
  @DefaultValue("IAU_DIMORPHOS")
  String bodyFrame();

  @Comment("Target body name.")
  @DefaultValue("DIMORPHOS")
  String bodyName();

  @Comment(
      """
      Extend CK past the last sumFile by this number of seconds. Default
      is zero. Attitude is assumed to be fixed to the value given by the
      last sumfile.""")
  @DefaultValue("0")
  double extend();

  @Comment(
      """
      SPC defines the camera X axis to be increasing to the right, Y to
      be increasing down, and Z to point into the page:

@@ -52,48 +77,48 @@ public interface CKFromSumFileConfig {
      (flipX, flipY, flipZ) = ( 2,-1, 3) SPICE frame is camera frame rotated 90 degrees about Z.
      (flipX, flipY, flipZ) = (-2, 1, 3) SPICE frame is camera frame rotated -90 degrees about Z.
      (flipX, flipY, flipZ) = ( 1,-2,-3) rotates the image 180 degrees about X.""")
  @DefaultValue("-1")
  int flipX();

  @Comment("Map the camera Y axis to a SPICE axis. See flipX for details.")
  @DefaultValue("2")
  int flipY();

  @Comment("Map the camera Z axis to a SPICE axis. See flipX for details.")
  @DefaultValue("-3")
  int flipZ();

  @Comment(
      """
      Supply this frame kernel to MSOPCK. Only needed if the reference frame
      (set by bodyFrame or J2000) is not built into SPICE""")
  @DefaultValue("/project/dart/data/SPICE/flight/fk/didymos_system_001.tf")
  String fk();

  @Comment("Instrument frame name")
  @DefaultValue("DART_DRACO")
  String instrumentFrameName();

  @Comment("If set to true, use J2000 as the reference frame")
  @DefaultValue("true")
  boolean J2000();

  @Comment("Path to leapseconds kernel.")
  @DefaultValue("/project/dart/data/SPICE/flight/lsk/naif0012.tls")
  String lsk();

  @Comment("Path to spacecraft SCLK file.")
  @DefaultValue("/project/dart/data/SPICE/flight/sclk/dart_sclk_0204.tsc")
  String sclk();

  @Comment("Name of spacecraft frame.")
  @DefaultValue("DART_SPACECRAFT")
  String spacecraftFrame();

  @Comment(
      """
      SPICE metakernel to read. This may be specified more than once
      for multiple metakernels.""")
  @DefaultValue("/project/dart/data/SPICE/flight/mk/current.tm")
  List<String> metakernel();
}
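The flipX/flipY/flipZ convention documented in the config comments above (a code of ±n maps a camera axis to the ±n-th SPICE axis) can be sketched as a small permutation-matrix builder. This is an illustrative stand-in, not Terrasaur code; the class and method names are hypothetical:

```java
/**
 * Sketch: build a camera-to-SPICE axis mapping matrix from (flipX, flipY, flipZ)
 * codes as described in the CKFromSumFileConfig comments. A code of +-n maps the
 * corresponding camera axis to the +-n-th SPICE axis. Hypothetical helper.
 */
public class AxisFlip {
  public static double[][] matrix(int flipX, int flipY, int flipZ) {
    int[] flips = {flipX, flipY, flipZ};
    double[][] m = new double[3][3];
    for (int col = 0; col < 3; col++) {
      int axis = Math.abs(flips[col]) - 1;    // target SPICE axis index (0, 1, or 2)
      m[axis][col] = Math.signum(flips[col]); // sign carries the flip
    }
    return m;
  }

  public static void main(String[] args) {
    // Default DRACO-style mapping (-1, 2, -3): flip X and Z, keep Y.
    double[][] m = matrix(-1, 2, -3);
    System.out.println(m[0][0] + " " + m[1][1] + " " + m[2][2]); // -1.0 1.0 -1.0
  }
}
```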
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.config;

import java.io.File;
@@ -7,230 +29,233 @@ import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.spi.StandardLevel;
import terrasaur.utils.Log4j2Configurator;
import terrasaur.utils.saaPlotLib.colorMaps.ColorRamp;

public class CommandLineOptions {

  private static final Logger logger = LogManager.getLogger();

  /**
   * Configuration file to load
   *
   * @return
   */
  public static Option addConfig() {
    return Option.builder("config")
        .hasArg()
        .required()
        .desc("Configuration file to load")
        .build();
  }

  /**
   * Color ramp style. See {@link ColorRamp.TYPE} for valid values.
   *
   * @param defaultCRType
   * @return
   */
  public static Option addColorRamp(ColorRamp.TYPE defaultCRType) {
    StringBuilder sb = new StringBuilder();
    for (ColorRamp.TYPE t : ColorRamp.TYPE.values()) sb.append(String.format("%s ", t.name()));
    return Option.builder("colorRamp")
        .hasArg()
        .desc("Color ramp style. Valid values are "
            + sb.toString().trim()
            + ". Default is "
            + defaultCRType.name()
            + ". Run the ColorMaps application to see all supported color ramps.")
        .build();
  }

  /**
   * Return a color ramp type or the default value.
   *
   * @param cl
   * @param defaultCRType
   * @return
   */
  public static ColorRamp.TYPE getColorRamp(CommandLine cl, ColorRamp.TYPE defaultCRType) {
    ColorRamp.TYPE crType = cl.hasOption("colorRamp")
        ? ColorRamp.TYPE.valueOf(
            cl.getOptionValue("colorRamp").toUpperCase().strip())
        : defaultCRType;
    return crType;
  }

  /**
   * Hard lower limit for color bar. If the color bar minimum is set dynamically it will not be
   * lower than hardMin.
   *
   * @return
   */
  public static Option addHardMin() {
    return Option.builder("hardMin")
        .hasArg()
        .desc(
            "Hard lower limit for color bar. If the color bar minimum is set dynamically it will not be lower than hardMin.")
        .build();
  }

  /**
   * Hard upper limit for color bar. If the color bar maximum is set dynamically it will not be
   * higher than hardMax.
   *
   * @return
   */
  public static Option addHardMax() {
    return Option.builder("hardMax")
        .hasArg()
        .desc(
            "Hard upper limit for color bar. If the color bar maximum is set dynamically it will not be higher than hardMax.")
        .build();
  }

  /**
   * Get the hard minimum for the colorbar.
   *
   * @param cl
   * @return
   */
  public static double getHardMin(CommandLine cl) {
    return cl.hasOption("hardMin") ? Double.parseDouble(cl.getOptionValue("hardMin")) : Double.NaN;
  }

  /**
   * Get the hard maximum for the colorbar.
   *
   * @param cl
   * @return
   */
  public static double getHardMax(CommandLine cl) {
    return cl.hasOption("hardMax") ? Double.parseDouble(cl.getOptionValue("hardMax")) : Double.NaN;
  }

  /**
   * If present, save screen output to log file.
   *
   * @return
   */
  public static Option addLogFile() {
    return Option.builder("logFile")
        .hasArg()
        .desc("If present, save screen output to log file.")
        .build();
  }

  /**
   * If present, print messages above selected priority. See {@link StandardLevel} for valid values.
   *
   * @return
   */
  public static Option addLogLevel() {
    StringBuilder sb = new StringBuilder();
    for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
    return Option.builder("logLevel")
        .hasArg()
        .desc("If present, print messages above selected priority. Valid values are "
            + sb.toString().trim()
            + ". Default is INFO.")
        .build();
  }

  /**
   * Set the logging level from the command line option.
   *
   * @param cl
   */
  public static void setLogLevel(CommandLine cl) {
    Log4j2Configurator lc = Log4j2Configurator.getInstance();
    if (cl.hasOption("logLevel"))
      lc.setLevel(
          Level.valueOf(cl.getOptionValue("logLevel").toUpperCase().trim()));
  }

  /**
   * Log to file named on the command line as well as others
   *
   * @param cl
   * @param others
   */
  public static void setLogFile(CommandLine cl, String... others) {
    Log4j2Configurator lc = Log4j2Configurator.getInstance();
    if (cl.hasOption("logFile")) lc.addFile(cl.getOptionValue("logFile"));
    for (String other : others) lc.addFile(other);
  }

  /**
   * Maximum number of simultaneous threads to execute.
   *
   * @return
   */
  public static Option addNumCPU() {
    return Option.builder("numCPU")
        .hasArg()
        .desc(
            "Maximum number of simultaneous threads to execute. Default is numCPU value in configuration file.")
        .build();
  }

  /**
   * Directory to place output files. Default is the working directory.
   *
   * @return
   */
  public static Option addOutputDir() {
    return Option.builder("outputDir")
        .hasArg()
        .desc("Directory to place output files. Default is the working directory.")
        .build();
  }

  /**
   * Set the output dir from the command line argument.
   *
   * @param cl
   * @return
   */
  public static String setOutputDir(CommandLine cl) {
    String path = cl.hasOption("outputDir") ? cl.getOptionValue("outputDir") : ".";
    File parent = new File(path);
    if (!parent.exists()) parent.mkdirs();
    return path;
  }

  /**
   * Minimum value to plot.
   *
   * @return
   */
  public static Option addPlotMin() {
    return Option.builder("plotMin").hasArg().desc("Min value to plot.").build();
  }

  /**
   * Get plot min from command line argument
   *
   * @param cl
   * @return
   */
  public static double getPlotMin(CommandLine cl) {
    return cl.hasOption("plotMin") ? Double.parseDouble(cl.getOptionValue("plotMin")) : Double.NaN;
  }

  /**
   * Maximum value to plot.
   *
   * @return
   */
  public static Option addPlotMax() {
    return Option.builder("plotMax").hasArg().desc("Max value to plot.").build();
  }

  /**
   * Get plot max from command line argument
   *
   * @param cl
   * @return
   */
  public static double getPlotMax(CommandLine cl) {
    return cl.hasOption("plotMax") ? Double.parseDouble(cl.getOptionValue("plotMax")) : Double.NaN;
  }
}
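The getHardMin/getHardMax/getPlotMin/getPlotMax methods above all use the same pattern: an absent optional numeric option parses to Double.NaN, which downstream code can detect with Double.isNaN. A minimal sketch, using a plain Map of option values as a hypothetical stand-in for Commons CLI's CommandLine:

```java
import java.util.Map;

/** Sketch of the NaN-sentinel pattern used by CommandLineOptions getters. Hypothetical helper. */
public class OptionalDouble {
  /** Parse the named option if present, otherwise return NaN as an "unset" sentinel. */
  public static double parseOrNaN(Map<String, String> cl, String name) {
    String v = cl.get(name);
    return v != null ? Double.parseDouble(v) : Double.NaN;
  }

  public static void main(String[] args) {
    System.out.println(parseOrNaN(Map.of("plotMin", "0.5"), "plotMin")); // 0.5
    System.out.println(Double.isNaN(parseOrNaN(Map.of(), "plotMin")));   // true
  }
}
```

NaN makes a convenient sentinel here because no legitimate parsed value can equal it, though callers must remember to test with Double.isNaN rather than ==.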
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.config;

import jackfruit.annotations.Comment;
@@ -8,16 +30,16 @@ import jackfruit.annotations.Jackfruit;
@Jackfruit
public interface ConfigBlock {

  String introLines =
      """
      ###############################################################################
      # GENERAL PARAMETERS
      ###############################################################################
      """;

  @Comment(
      introLines
          + """
          Set the logging level. Valid values in order of increasing detail:
          OFF
          FATAL
@@ -28,28 +50,28 @@ public interface ConfigBlock {
          TRACE
          ALL
          See org.apache.logging.log4j.Level.""")
  @DefaultValue("INFO")
  String logLevel();

  @Comment(
      "Format for log messages. See https://logging.apache.org/log4j/2.x/manual/layouts.html#PatternLayout for more details.")
  @DefaultValue("%highlight{%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%c{1}:%L] %msg%n%throwable}")
  String logFormat();

  @Comment(
      """
      Format for time strings. Allowed values are:
      C (e.g. 1986 APR 12 16:31:09.814)
      D (e.g. 1986-102 // 16:31:12.814)
      J (e.g. 2446533.18834276)
      ISOC (e.g. 1986-04-12T16:31:12.814)
      ISOD (e.g. 1986-102T16:31:12.814)""")
  @DefaultValue("ISOC")
  String timeFormat();

  @Include
  MissionBlock missionBlock();

  @Include
  SPCBlock spcBlock();
}
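The ISOC and ISOD styles listed under timeFormat above correspond to ISO calendar-date and day-of-year forms. The java.time patterns below reproduce the documented examples; they are illustrative only, not Terrasaur's actual time-formatting code (which uses SPICE-style format codes):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

/** Sketch: ISOC (calendar date) and ISOD (day of year) time strings via java.time. */
public class TimeFormats {
  static final DateTimeFormatter ISOC = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS");
  static final DateTimeFormatter ISOD = DateTimeFormatter.ofPattern("yyyy-DDD'T'HH:mm:ss.SSS");

  public static void main(String[] args) {
    // The example epoch from the config comment: 1986 April 12 is day of year 102.
    LocalDateTime t = LocalDateTime.of(1986, 4, 12, 16, 31, 12, 814_000_000);
    System.out.println(ISOC.format(t)); // 1986-04-12T16:31:12.814
    System.out.println(ISOD.format(t)); // 1986-102T16:31:12.814
  }
}
```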
|||||||
@@ -1,37 +1,58 @@
|
|||||||
|
/*
|
||||||
|
* The MIT License
|
||||||
|
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
|
||||||
|
*
|
||||||
|
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.config;

import jackfruit.annotations.Comment;
import jackfruit.annotations.DefaultValue;
import jackfruit.annotations.Jackfruit;

import java.util.List;

@Jackfruit(prefix = "mission")
public interface MissionBlock {

  String introLines =
      """
      ###############################################################################
      # MISSION PARAMETERS
      ###############################################################################
      """;

  @Comment(introLines + "Mission name (e.g. DART)")
  @DefaultValue("mission")
  String missionName();

  @Comment(
      """
      SPICE metakernel to read. This may be specified more than once
      for multiple metakernels (e.g. /project/dart/data/SPICE/flight/mk/current.tm)""")
  @DefaultValue("metakernel.tm")
  List<String> metakernel();

  @Comment("Name of spacecraft frame (e.g. DART_SPACECRAFT)")
  @DefaultValue("SPACECRAFT_FRAME")
  String spacecraftFrame();

  @Comment("Instrument frame name (e.g. DART_DRACO)")
  @DefaultValue("INSTRUMENT_FRAME")
  String instrumentFrameName();
}
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.config;

import jackfruit.annotations.Comment;
@@ -7,14 +29,16 @@ import jackfruit.annotations.Jackfruit;
@Jackfruit(prefix = "spc")
public interface SPCBlock {

  String introLines =
      """
      ###############################################################################
      # SPC PARAMETERS
      ###############################################################################
      """;

  @Comment(
      introLines
          + """
      SPC defines the camera X axis to be increasing to the right, Y to
      be increasing down, and Z to point into the page:

@@ -50,5 +74,4 @@ public interface SPCBlock {
  @Comment("Map the camera Z axis to a SPICE axis. See flipX for details.")
  @DefaultValue("-3")
  int flipZ();
}
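The SPCBlock comments above describe flipX/flipY/flipZ as signed axis codes mapping each camera axis to a SPICE axis; the flipZ default of -3 suggests the magnitude (1, 2, 3) selects X, Y, or Z and the sign flips the direction. If that reading is right, the convention can be sketched as below; `FlipAxisSketch` and `flipToVector` are illustrative names, not part of Terrasaur:

```java
public class FlipAxisSketch {

  /**
   * Convert a signed axis code (±1, ±2, ±3) to a unit vector: the magnitude
   * selects the X, Y, or Z component and the sign flips its direction.
   * Illustrative only; this mirrors the convention described in the SPCBlock
   * comments, not the actual Terrasaur implementation.
   */
  public static double[] flipToVector(int code) {
    if (code == 0 || Math.abs(code) > 3)
      throw new IllegalArgumentException("code must be ±1, ±2, or ±3");
    double[] v = new double[3];
    v[Math.abs(code) - 1] = Math.signum(code);
    return v;
  }

  public static void main(String[] args) {
    // flipZ = -3: camera Z maps to the negative SPICE Z axis
    System.out.println(java.util.Arrays.toString(flipToVector(-3))); // [0.0, 0.0, -1.0]
  }
}
```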
@@ -1,3 +1,25 @@
package terrasaur.config;

import java.io.IOException;
@@ -19,71 +41,68 @@ import terrasaur.utils.AppVersion;

public class TerrasaurConfig {

  private static final Logger logger = LogManager.getLogger();

  private static TerrasaurConfig instance = null;

  private TerrasaurConfig() {}

  private ConfigBlock configBlock;

  public static ConfigBlock getConfig() {
    if (instance == null) {
      logger.error("Configuration has not been loaded! Returning null.");
      return null;
    }
    return instance.configBlock;
  }

  public static ConfigBlock getTemplate() {
    if (instance == null) {
      instance = new TerrasaurConfig();
      ConfigBlockFactory factory = new ConfigBlockFactory();
      instance.configBlock = factory.getTemplate();
    }
    return instance.configBlock;
  }

  public static ConfigBlock load(Path filename) {
    if (!Files.exists(filename)) {
      System.err.println("Cannot load configuration file " + filename);
      Thread.dumpStack();
      System.exit(1);
    }
    if (instance == null) {
      instance = new TerrasaurConfig();

      try {
        PropertiesConfiguration config = new Configurations().properties(filename.toFile());
        instance.configBlock = new ConfigBlockFactory().fromConfig(config);
      } catch (ConfigurationException e) {
        e.printStackTrace();
      }
    }
    return instance.configBlock;
  }

  @Override
  public String toString() {
    StringWriter string = new StringWriter();
    try (PrintWriter pw = new PrintWriter(string)) {
      PropertiesConfiguration config = new ConfigBlockFactory().toConfig(instance.configBlock);
      PropertiesConfigurationLayout layout = config.getLayout();

      String now = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
          .withLocale(Locale.getDefault())
          .withZone(ZoneOffset.UTC)
          .format(Instant.now());
      layout.setHeaderComment(
          String.format("Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));

      config.write(pw);
    } catch (ConfigurationException | IOException e) {
      e.printStackTrace();
    }
    return string.toString();
  }
}
File diff suppressed because it is too large
@@ -1,60 +1,88 @@
package terrasaur.enums;

import org.apache.commons.io.FilenameUtils;

public enum FORMATS {
  ASCII(true),
  BIN3(true),
  BIN4(true),
  BIN7(true),
  FITS(false),
  ICQ(false),
  OBJ(false),
  PLT(false),
  PLY(false),
  VTK(false);

  /** True if this format contains no facet information */
  public boolean pointsOnly;

  private FORMATS(boolean pointsOnly) {
    this.pointsOnly = pointsOnly;
  }

  /**
   * Guess the format from the (case-insensitive) filename extension.
   * <p>
   * ASCII: ascii, txt, xyz
   * <p>
   * BINARY: binary, bin
   * <p>
   * FITS: fits, fit
   * <p>
   * L2: l2, dat
   * <p>
   * OBJ: obj
   * <p>
   * PLT: plt
   * <p>
   * PLY: ply
   * <p>
   * VTK: vtk
   *
   * @param filename
   * @return matched format type, or null if a match is not found
   */
  public static FORMATS formatFromExtension(String filename) {
    String extension = FilenameUtils.getExtension(filename);
    for (FORMATS f : FORMATS.values()) {
      if (f.name().equalsIgnoreCase(extension)) {
        return f;
      }
    }

    switch (extension.toUpperCase()) {
      case "TXT":
      case "XYZ":
        return FORMATS.ASCII;
      case "BIN":
        return FORMATS.BIN3;
      case "FIT":
        return FORMATS.FITS;
    }

    return null;
  }
}
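The FORMATS.formatFromExtension javadoc above describes a two-stage lookup: first an exact (case-insensitive) match of the extension against an enum constant name, then a fallback alias table (txt/xyz → ASCII, bin → BIN3, fit → FITS). A self-contained sketch of that lookup order, using a trimmed-down stand-in enum and a hand-rolled extension splitter rather than the real FORMATS and FilenameUtils:

```java
import java.util.Locale;

public class FormatLookupSketch {

  // Trimmed-down stand-in for terrasaur.enums.FORMATS (illustrative only)
  enum Format { ASCII, BIN3, FITS, OBJ }

  /** Guess a format from a filename: exact (case-insensitive) name match first, then aliases. */
  static Format fromExtension(String filename) {
    int dot = filename.lastIndexOf('.');
    String ext = dot < 0 ? "" : filename.substring(dot + 1);
    // Stage 1: extension matches an enum constant name
    for (Format f : Format.values()) {
      if (f.name().equalsIgnoreCase(ext)) return f;
    }
    // Stage 2: known aliases, mirroring the switch in formatFromExtension
    switch (ext.toUpperCase(Locale.ROOT)) {
      case "TXT":
      case "XYZ":
        return Format.ASCII;
      case "BIN":
        return Format.BIN3;
      case "FIT":
        return Format.FITS;
    }
    return null; // no match, as in the original method
  }

  public static void main(String[] args) {
    System.out.println(fromExtension("shape.obj"));  // OBJ (name match)
    System.out.println(fromExtension("cloud.xyz"));  // ASCII (alias)
    System.out.println(fromExtension("image.png"));  // null
  }
}
```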
@@ -1,33 +1,62 @@
package terrasaur.enums;

/**
 * Enums for a given fits header format. This is used to keep fits headers for the different types
 * separately configurable.
 *
 * @author espirrc1
 *
 */
public enum FitsHeaderType {
  ANCIGLOBALGENERIC,
  ANCILOCALGENERIC,
  ANCIGLOBALALTWG,
  ANCIG_FACETRELATION_ALTWG,
  ANCILOCALALTWG,
  DTMGLOBALALTWG,
  DTMLOCALALTWG,
  DTMGLOBALGENERIC,
  DTMLOCALGENERIC,
  NFTMLN;

  public static boolean isGLobal(FitsHeaderType hdrType) {

    boolean globalProduct;
    switch (hdrType) {
      case ANCIGLOBALALTWG:
      case ANCIGLOBALGENERIC:
      case DTMGLOBALALTWG:
      case DTMGLOBALGENERIC:
        globalProduct = true;
        break;

      default:
        globalProduct = false;
    }

    return globalProduct;
  }
}
@@ -1,3 +1,25 @@
package terrasaur.enums;

import java.util.ArrayList;
@@ -10,147 +32,143 @@ import nom.tam.fits.HeaderCardException;
 * Enumeration containing the values and comments to use for FITS tags describing data stored in
 * FITS data cubes. The enumeration name references the type of data stored in a given plane. This
 * way the user can choose their own value for the FITS keyword (i.e. "PLANE1" or "PLANE10").
 *
 * @author espirrc1
 *
 */
public enum PlaneInfo {

  // @formatter:off
  LAT("Latitude of vertices", "[deg]", "deg"),
  LON("Longitude of vertices", "[deg]", "deg"),
  RAD("Radius of vertices", "[km]", "km"),
  X("X coordinate of vertices", "[km]", "km"),
  Y("Y coordinate of vertices", "[km]", "km"),
  Z("Z coordinate of vertices", "[km]", "km"),
  NORM_VECTOR_X("Normal vector X", null, null),
  NORM_VECTOR_Y("Normal vector Y", null, null),
  NORM_VECTOR_Z("Normal vector Z", null, null),
  GRAV_VECTOR_X("Gravity vector X", "[m/s^2]", "m/s**2"),
  GRAV_VECTOR_Y("Gravity vector Y", "[m/s^2]", "m/s**2"),
  GRAV_VECTOR_Z("Gravity vector Z", "[m/s^2]", "m/s**2"),
  GRAV_MAG("Gravitational magnitude", "[m/s^2]", "m/s**2"),
  GRAV_POT("Gravitational potential", "[J/kg]", "J/kg"),
  ELEV("Elevation", "[m]", "m"),
  AREA("Area", "[km^2]", "km**2"),

  // no longer needed! same as HEIGHT!
  // ELEV_NORM("Elevation relative to normal plane", "[m]", "m"),
  SLOPE("Slope", "[deg]", "deg"),
  SHADE("Shaded relief", null, null),
  TILT("Facet tilt", "[deg]", "deg"),
  TILT_DIRECTION("Facet tilt direction", "[deg]", "deg"),
  TILT_AVERAGE("Mean tilt", "[deg]", "deg"),
  TILT_VARIATION("Tilt variation", "[deg]", "deg"),
  TILT_AVERAGE_DIRECTION("Mean tilt direction", "[deg]", "deg"),
  TILT_DIRECTION_VARIATION("Tilt direction variation", "[deg]", "deg"),
  TILT_RELATIVE("Relative tilt", "[deg]", "deg"),
  TILT_RELATIVE_DIRECTION("Relative tilt direction", "[deg]", "deg"),
  TILT_UNCERTAINTY("Tilt Uncertainty", "[deg]", "deg"),
  FACETRAD("Facet radius", "[m]", "m"),
  HEIGHT("Height relative to normal plane", "[km]", "km"),
  RELATIVE_HEIGHT("Max relative height", "[km]", "km"),
  ALBEDO("Relative albedo", null, null),
  INTENSITY("Return Intensity", null, null),
  SIGMA("Sigma", null, null),
  QUALITY("Quality", null, null),
  SHADED("Shaded relief", null, null),
  NUMPOINTS("Number of OLA points used", null, null),
  HEIGHT_RESIDUAL("Mean of residual between points and fitted height", "[km]", "km"),
  HEIGHT_STDDEV("Std deviation of residual between points and fitted height", "[km]", "km"),
  HAZARD("Hazard", "1 indicates a hazard to the spacecraft", null);
  // @formatter:on

  private String keyValue; // value associated with FITS keyword
  private String comment; // comment associated with FITS keyword
  private String units; // units associated with the plane. Usually in PDS4 nomenclature

  PlaneInfo(String keyVal, String comment, String units) {
    this.keyValue = keyVal;
    this.comment = comment;
    this.units = units;
  }

  public String value() {
    return keyValue;
  }

  public String comment() {
    return comment;
  }

  public String units() {
    return units;
  }

  /**
   * Try to parse the enum from the given Keyval string. Needs to match exactly (but case
   * insensitive)!
   *
   * @param keyVal
   * @return
   */
  public static PlaneInfo keyVal2Plane(String keyVal) {
    for (PlaneInfo planeName : values()) {
      if ((planeName.value() != null) && (planeName.value().equalsIgnoreCase(keyVal))) {
        return planeName;
      }
    }
    return null;
  }

  public static PlaneInfo planeFromString(String plane) {
    for (PlaneInfo planeName : values()) {
      if (planeName.toString().equals(plane)) {
        return planeName;
      }
    }
    return null;
  }

  /*
   * Create enumeration set for the first 6 planes. These are the initial planes created from the
   * Osiris Rex netCDF file.
   */
  public static final EnumSet<PlaneInfo> first6HTags = EnumSet.range(PlaneInfo.LAT, PlaneInfo.Z);

  public static List<PlaneInfo> coreTiltPlanes() {

    List<PlaneInfo> coreTilts = new ArrayList<PlaneInfo>();
    coreTilts.add(PlaneInfo.TILT_AVERAGE);
    coreTilts.add(PlaneInfo.TILT_VARIATION);
    coreTilts.add(PlaneInfo.TILT_AVERAGE_DIRECTION);
    coreTilts.add(PlaneInfo.TILT_DIRECTION_VARIATION);
    coreTilts.add(PlaneInfo.TILT_RELATIVE);
    coreTilts.add(PlaneInfo.TILT_RELATIVE_DIRECTION);
    coreTilts.add(PlaneInfo.RELATIVE_HEIGHT);

    return coreTilts;
  }

  /**
   * Convert List<PlaneInfo> to List<HeaderCard> where each HeaderCard in List follows the
   * convention: for each "thisPlane" in List<PlaneInfo> HeaderCard = new HeaderCard("PLANE" + cc,
   * thisPlane.value(), thisPlane.comment()) The order in List<HeaderCard> follows the order in
   * List<PlaneInfo>
   *
   * @param planeList
   * @return
   * @throws HeaderCardException
   */
  public static List<HeaderCard> planesToHeaderCard(List<PlaneInfo> planeList) throws HeaderCardException {
    List<HeaderCard> planeHeaders = new ArrayList<HeaderCard>();
    String plane = "PLANE";
    int cc = 1;
    for (PlaneInfo thisPlane : planeList) {

      planeHeaders.add(new HeaderCard(plane + cc, thisPlane.value(), thisPlane.comment()));
      cc++;
    }
    return planeHeaders;
  }
}
|
|||||||
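The "PLANE" + n keyword convention used by planesToHeaderCard can be sketched self-contained. The `Card` record below is a hypothetical stand-in for nom.tam.fits.HeaderCard, and the plane values are illustrative; only the numbering scheme mirrors the method above.

```java
import java.util.ArrayList;
import java.util.List;

public class PlaneCardsSketch {
  // Stand-in for nom.tam.fits.HeaderCard: keyword, value, comment.
  public record Card(String key, String value, String comment) {}

  // Mirrors planesToHeaderCard: the n-th plane (1-based) becomes keyword "PLANE" + n,
  // and the output order follows the input order.
  public static List<Card> planesToCards(List<String> planeValues) {
    List<Card> cards = new ArrayList<>();
    int cc = 1;
    for (String value : planeValues) {
      cards.add(new Card("PLANE" + cc, value, "plane " + cc));
      cc++;
    }
    return cards;
  }

  public static void main(String[] args) {
    List<Card> cards = planesToCards(List.of("Latitude", "Longitude", "Radius"));
    for (Card c : cards) {
      System.out.println(c.key() + " = " + c.value()); // PLANE1 = Latitude, PLANE2 = ...
    }
  }
}
```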
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.enums;

import com.google.common.base.Strings;

/**
 * Enum defining the types of sigma files that can be loaded and utilized by the pipeline. This
 * allows the pipeline to load and parse different formats of sigma files.
 *
 * @author espirrc1
 */
public enum SigmaFileType {
  SPCSIGMA {
    @Override
    public String commentSymbol() {
      return "";
    }

    @Override
    public String stringArg() {
      return "spc";
    }

    @Override
    public int sigmaCol() {
      return 3;
    }
  },

  ERRORFROMSQLSIGMA {
    @Override
    public String commentSymbol() {
      return "#";
    }

    @Override
    public String stringArg() {
      return "errorfromsql";
    }

    // should be the Standard Deviation column in the ErrorFromSQL file
    @Override
    public int sigmaCol() {
      return 8;
    }
  },

  NOMATCH {
    @Override
    public String commentSymbol() {
      return "NAN";
    }

    @Override
    public String stringArg() {
      return "NAN";
    }

    @Override
    public int sigmaCol() {
      return -1;
    }
  };

  /** Symbol used to denote comment lines. */
  public abstract String commentSymbol();

  /** Input argument to match. */
  public abstract String stringArg();

  /** Column number where sigma values are stored. */
  public abstract int sigmaCol();

  public static SigmaFileType getFileType(String sigmaFileType) {
    if (!Strings.isNullOrEmpty(sigmaFileType)) {
      for (SigmaFileType thisType : values()) {
        if (sigmaFileType.toLowerCase().equals(thisType.stringArg())) {
          return thisType;
        }
      }
    }
    return NOMATCH;
  }

  /**
   * Return the SigmaFileType associated with the SrcProductType.
   *
   * @param srcType source product type
   * @return matching sigma file type, or NOMATCH if there is none
   */
  public static SigmaFileType sigmaTypeFromSrcType(SrcProductType srcType) {
    SigmaFileType sigmaType;
    switch (srcType) {
      case SPC:
        sigmaType = SigmaFileType.SPCSIGMA;
        break;
      case OLA:
        sigmaType = SigmaFileType.ERRORFROMSQLSIGMA;
        break;
      default:
        sigmaType = SigmaFileType.NOMATCH;
        break;
    }
    return sigmaType;
  }
}

package terrasaur.enums;

/**
 * Enumeration storing the source product type: the product type of the source data used in creation
 * of an ALTWG product.
 *
 * @author espirrc1
 */
public enum SrcProductType {
  SFM {
    @Override
    public String getAltwgFrag() {
      return "sfm";
    }
  },

  SPC {
    @Override
    public String getAltwgFrag() {
      return "spc";
    }
  },

  // OLA altimetry
  OLA {
    @Override
    public String getAltwgFrag() {
      return "alt";
    }
  },

  // SPC-OLA
  SPO {
    @Override
    public String getAltwgFrag() {
      return "spo";
    }
  },

  TRUTH {
    @Override
    public String getAltwgFrag() {
      return "tru";
    }
  },

  UNKNOWN {
    @Override
    public String getAltwgFrag() {
      return "unk";
    }
  };

  public static SrcProductType getType(String value) {
    value = value.toUpperCase();
    for (SrcProductType srcType : values()) {
      if (srcType.toString().equals(value)) {
        return srcType;
      }
    }
    return UNKNOWN;
  }

  /**
   * Returns the string fragment associated with the source product type. Follows the ALTWG naming
   * convention.
   *
   * @return ALTWG filename fragment
   */
  public abstract String getAltwgFrag();

  /**
   * Return the SrcProductType whose getAltwgFrag() string matches stringFrag. Returns UNKNOWN if a
   * match is not found.
   *
   * @param stringFrag filename fragment to match
   * @return matching source product type, or UNKNOWN
   */
  public static SrcProductType fromAltwgFrag(String stringFrag) {
    for (SrcProductType prodType : SrcProductType.values()) {
      if (prodType.getAltwgFrag().equals(stringFrag)) {
        return prodType;
      }
    }
    return UNKNOWN;
  }
}
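The lookup idioms in SigmaFileType and SrcProductType above (match a user-supplied string against each constant, fall back to a sentinel constant instead of returning null) can be sketched self-contained. The trimmed `Src` enum below is illustrative, not the full constant set:

```java
public class EnumLookupSketch {
  // Trimmed stand-in for SrcProductType: each constant carries an ALTWG
  // filename fragment, and both lookups fall back to UNKNOWN.
  public enum Src {
    SPC("spc"), OLA("alt"), UNKNOWN("unk");

    private final String frag;

    Src(String frag) {
      this.frag = frag;
    }

    public String getAltwgFrag() {
      return frag;
    }

    // mirrors SrcProductType.getType: case-insensitive match on the constant's name
    public static Src getType(String value) {
      value = value.toUpperCase();
      for (Src s : values()) {
        if (s.toString().equals(value)) {
          return s;
        }
      }
      return UNKNOWN;
    }

    // mirrors SrcProductType.fromAltwgFrag: exact match on the filename fragment
    public static Src fromAltwgFrag(String frag) {
      for (Src s : values()) {
        if (s.getAltwgFrag().equals(frag)) {
          return s;
        }
      }
      return UNKNOWN;
    }
  }

  public static void main(String[] args) {
    // Note the name/fragment asymmetry: OLA's fragment is "alt", not "ola".
    System.out.println(Src.getType("ola"));       // OLA
    System.out.println(Src.fromAltwgFrag("alt")); // OLA
    System.out.println(Src.fromAltwgFrag("ola")); // UNKNOWN
  }
}
```

The sentinel-instead-of-null design lets callers switch on the result without a null check, which is exactly how sigmaTypeFromSrcType consumes SrcProductType.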
package terrasaur.fits;

import java.util.EnumSet;
import java.util.Map;

public enum AltPipelnEnum {

  // settings enums. Used to determine type of product to create.
  ANCIGLOBAL,

  // optional: tells code to skip generation of products for highest res shape model
  SKIPORIGINALSHP,

  // controls whether or not certain global products get created.
  // REQUIRED TO BE IN CONFIG FILE:
  DOGLOBALSHAPE,
  OBJTOFITS,
  ADDOBJCOMMENTS,
  GLOBALRES0,
  DUMBERVALUES,
  DOGRAVGLOBAL,
  DOGLOBALGRAV_ANCI,
  DOGLOBALTILT_ANCI,
  DOGLOBALMISC_ANCI,
  DOGLOBALTILT,

  // controls number of slots per job to use when running global distributed gravity
  // in grid engine mode. Does not apply if running in local parallel mode.
  GLOBALGRAVSLOTS,

  // controls number of slots per job to use when running local distributed gravity
  // in grid engine mode. Does not apply if running in local parallel mode.
  LOCALGRAVSLOTS,

  // full path to SPICE metakernel to use when generating DSK products
  DSKERNEL,

  // enable/disable the creation of global and local DSK files
  DOGLOBALDSK,
  DOLOCALDSK,

  // global tilt semi-major axis in km. Now a required variable.
  GTILTMAJAXIS,

  // settings for every local product generated by the pipeline
  USEOLDMAPLETS,
  DODTMFITSOBJ,
  DOGRAVLOCAL,
  GENLOCALGRAV,
  DOLOCALTILT,
  DOLOCAL_GRAVANCI,
  DOLOCAL_TILTANCI,
  DOLOCAL_MISCANCI,

  // controls whether to use average gravitational potential for global and local gravity
  // calculations. 0 = use minimum reference potential, 1 = use average reference potential
  AVGGRAVPOTGLOBAL,
  AVGGRAVPOTLOCAL,

  // controls RunBigMap.
  // Integrate slope to height if required. Defaults to "n" if this enum does not exist in the
  // config file; otherwise the value is evaluated: 0 = "n", 1 = "y"
  INTSLOPE,

  // use grotesque model in RunBigMap. Defaults to not using it if this enum does not exist in
  // the config file; otherwise the value is evaluated:
  // 0 = do not use grotesque model, 1 = use grotesque model
  USEGROTESQUE,

  // controls the source data, product destination, naming convention, and process flow.
  DATASOURCE,
  REPORTTYPE,
  INDATADIR,
  OUTPUTDIR,
  PDSNAMING,
  RENAMEGLOBALOBJ,
  USEBIGMAPS,
  DOREPORT,

  // shape model density and rotation rate are now required variables. This way we can easily spot
  // what we are using as defaults.
  SMDENSITY,
  SMRRATE,

  // stores type of naming convention, e.g. AltProduct, AltNFTMLN, DartProduct.
  NAMINGCONVENTION,

  // set values that cannot be derived from data.
  REPORTBASENAME,
  VERSION,

  // everything after this is not a required keyword

  // (Optional) controls whether there is an external body that needs to be accounted for when
  // running the gravity code. The value should be a csv string with no spaces:
  // mass(kg),x,y,z where x,y,z are the body-fixed coordinates in km.
  // e.g. 521951167,1.19,0,0
  EXTERNALBODY,

  // (Optional). If the keyword exists and value is 1 then no global DTMs are assumed to be
  // created. For example, in the DART derived product set we are not creating g_*dtm*.fits files.
  NOGLOBALDTM,

  // (Optional). If the keyword exists then evaluate the shapes to process by parsing the
  // comma-separated values. E.g. if the values are 0,1,2 then the pipeline will assume it has to
  // process shape0, shape1, shape2. The pipeline will also disregard the values in
  // DUMBERVALUES that otherwise determine how many shape files to process.
  SHAPE2PROC,

  // (optional) controls whether or not STL files get generated. If these do not exist in the
  // pipeConfig file then they will NOT get generated!
  GLOBALSTL,
  LOCALSTL,

  // keywords for local products
  //
  // MAPSmallerSZ: resize local DTMs to a different half size. For the pipeline we may want to
  // generate DTMs at halfsize + tilt radius then resize the DTMs to halfsize in order to have
  // tilts evaluated with the full range of points at the edges.
  //
  // MAPFILE: contains pointer to map centers file (optional). Used by TileShapeModelWithBigmaps;
  // defaults to auto-generated tiles if this is not specified. Allows for pointers to different
  // files for 30cm and 10cm map products.
  DOLOCALMAP,
  MAPDIR,
  MAPSCALE,
  MAPHALFSZ,
  REPORT,
  MAPSmallerSZ,
  MAPFILE,
  ISTAGSITE,
  LTILTMAJAXIS,
  LTILTMINAXIS,
  LTILTPA,
  MAXSCALE,

  // settings for local tag sites. Note TAGSFILE is not optional:
  // it contains the tag site name and lat,lon of the tag site tile center.
  TAGDIR,
  TAGSCALE,
  TAGHALFSZ,
  TAGSFILE,
  REPORTTAG,

  // pointer to OLA database. Only required if DATASOURCE is OLA.
  OLADBPATH,

  // force sigma files to all be NaN
  FORCESIGMA_NAN,

  // global sigma scale factor
  SIGMA_SCALEFACTOR,

  // local sigma scale factor
  LOCAL_SIGMA_SCALEFACTOR,

  // sigma file type. No longer tied to DATASOURCE!
  SIGMAFILE_TYPE,

  // force the report page to be HTML. Default is to create it as PHP.
  REPORTASHTML,

  /*
   * The following are used to change default values used by the pipeline: the shape model
   * density, rotation rate, gravitational algorithm, gravitational constant, global average
   * reference potential, and local average reference potential. Added values defining the tilt
   * ellipse to use when evaluating tilts. Note the different enums for global versus local tilt
   * ellipse parameters. The pipeline will use default values for these enums if they are not
   * defined in the pipeline configuration file.
   */
  GALGORITHM,
  GRAVCONST,
  GTILTMINAXIS,
  GTILTPA,
  MASSUNCERT,
  VSEARCHRAD_PCTGSD,
  FSEARCHRAD_PCTGSD,
  PXPERDEG,

  // The following are options to subvert normal pipeline operations or to configure the pipeline
  // for other missions.

  // global OBJs are supplied at all resolutions as the starting point.
  // This means we can skip the ICQ2PLT, ICQDUMBER, and PLT2OBJ calls.
  OBJASINPUT,

  // gzip the OBJ files to save space
  DOGZIP,

  // specify the queue to use in the grid engine
  GRIDQUEUE,

  // default mode for local product creation is to parallelize DistributedGravity for each tile,
  // with processing for each job done in local mode. Set this flag to 1 to submit
  // DistributedGravity for each tile sequentially and have each job spawn to the grid engine.
  DISTGRAVITY_USEGRID,

  // override grid engine mode and use local parallel mode with the specified number of cores
  LOCALPARALLEL,

  // when creating local gravity products skip creation of gravity files that already exist
  USEOLDGRAV,

  // override ancillary FITS table default setting (binary). Set to ASCII instead.
  ANCITXTTABLE,

  // contains pointer to FITS header config file (optional)
  FITSCONFIGFILE,

  // contains pointer to OBJ comments header file (optional). Will only
  // add comments if the ADDOBJCOMMENTS flag is set.
  OBJCOMHEADER,

  // identifies whether there are Poisson reconstruct data products to include in the webpage
  // report
  LOCALPOISSON,
  GLOBALPOISSON;

  /*
   * The following enumerations are required to exist in the altwg pipeline config file.
   */
  public static final EnumSet<AltPipelnEnum> reqTags = EnumSet.range(DOGLOBALSHAPE, VERSION);

  /*
   * The following enumerations do not have to be present in the config file. If they are not,
   * the pipeline should use the default values associated with the enums.
   */
  public static final EnumSet<AltPipelnEnum> overrideTags = EnumSet.range(SMDENSITY, GTILTPA);

  public static String mapToString(Map<AltPipelnEnum, String> pipeConfig) {
    StringBuilder sb = new StringBuilder();
    for (Map.Entry<AltPipelnEnum, String> entry : pipeConfig.entrySet()) {
      sb.append(String.format("%s:%s\n", entry.getKey().toString(), entry.getValue()));
    }
    return sb.toString();
  }

  public static AltPipelnEnum fromString(String textString) {
    for (AltPipelnEnum anciType : AltPipelnEnum.values()) {
      if (anciType.toString().equals(textString)) {
        return anciType;
      }
    }
    return null;
  }

  /**
   * Convenience method for evaluating a configuration parameter whose integer value indicates
   * whether the parameter is true or false. Assumes 0 or less = false, 1 or more = true. If the
   * key does not exist then returns false.
   *
   * @param pipeConfig pipeline configuration map
   * @param key configuration key to evaluate
   * @return true if the key exists and its value indicates true
   */
  public static boolean isTrue(Map<AltPipelnEnum, String> pipeConfig, AltPipelnEnum key) {
    boolean returnFlag = false;
int parsedVal = 0;
|
int parsedVal = 0;
|
||||||
if (pipeConfig.containsKey(key)) {
|
if (pipeConfig.containsKey(key)) {
|
||||||
try {
|
try {
|
||||||
parsedVal = Integer.valueOf(pipeConfig.get(key));
|
parsedVal = Integer.valueOf(pipeConfig.get(key));
|
||||||
} catch (NumberFormatException e) {
|
} catch (NumberFormatException e) {
|
||||||
System.err.println("ERROR! Could not parse integer value for pipeConfig line:");
|
System.err.println("ERROR! Could not parse integer value for pipeConfig line:");
|
||||||
System.err.println(key.toString() + "," + pipeConfig.get(key));
|
System.err.println(key.toString() + "," + pipeConfig.get(key));
|
||||||
System.err.println("Stopping with error!");
|
System.err.println("Stopping with error!");
|
||||||
System.exit(1);
|
System.exit(1);
|
||||||
}
|
}
|
||||||
|
|
||||||
if (parsedVal > 0) {
|
|
||||||
returnFlag = true;
|
|
||||||
}
|
|
||||||
|
|
||||||
|
if (parsedVal > 0) {
|
||||||
|
returnFlag = true;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return returnFlag;
|
||||||
}
|
}
|
||||||
return returnFlag;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Checks to see whether key exists. If so then return value mapped to key. Otherwise return empty
|
* Checks to see whether key exists. If so then return value mapped to key. Otherwise return empty
|
||||||
* string.
|
* string.
|
||||||
*
|
*
|
||||||
* @param pipeConfig
|
* @param pipeConfig
|
||||||
* @param key
|
* @param key
|
||||||
* @return
|
* @return
|
||||||
*/
|
*/
|
||||||
public static String checkAndGet(Map<AltPipelnEnum, String> pipeConfig, AltPipelnEnum key) {
|
public static String checkAndGet(Map<AltPipelnEnum, String> pipeConfig, AltPipelnEnum key) {
|
||||||
String value = "";
|
String value = "";
|
||||||
if (pipeConfig.containsKey(key)) {
|
if (pipeConfig.containsKey(key)) {
|
||||||
value = pipeConfig.get(key);
|
value = pipeConfig.get(key);
|
||||||
|
}
|
||||||
|
return value;
|
||||||
}
|
}
|
||||||
return value;
|
|
||||||
}
|
|
||||||
|
|
||||||
/*
|
/*
|
||||||
* Some enums will have a default value, e.g. the ones in the overrideTags EnumSet. It is easier
|
* Some enums will have a default value, e.g. the ones in the overrideTags EnumSet. It is easier
|
||||||
* to keep them as string values then convert them to other primitives as needed. Sometimes other
|
* to keep them as string values then convert them to other primitives as needed. Sometimes other
|
||||||
* executables will be called w/ the default values, so it is better to keep them as strings to
|
* executables will be called w/ the default values, so it is better to keep them as strings to
|
||||||
* avoid double conversion.
|
* avoid double conversion.
|
||||||
*/
|
*/
|
||||||
public static String getDefault(AltPipelnEnum thisEnum) {
|
public static String getDefault(AltPipelnEnum thisEnum) {
|
||||||
|
|
||||||
if (overrideTags.contains(thisEnum)) {
|
if (overrideTags.contains(thisEnum)) {
|
||||||
|
|
||||||
switch (thisEnum) {
|
switch (thisEnum) {
|
||||||
|
|
||||||
// shape model density and rotation rate must now be explicitly defined in the configuration
|
// shape model density and rotation rate must now be explicitly defined in the configuration
|
||||||
// file!
|
// file!
|
||||||
// case SMDENSITY:
|
// case SMDENSITY:
|
||||||
// return "1.186";
|
// return "1.186";
|
||||||
//
|
//
|
||||||
// case SMRRATE:
|
// case SMRRATE:
|
||||||
// return "0.00040626";
|
// return "0.00040626";
|
||||||
|
|
||||||
case GALGORITHM:
|
case GALGORITHM:
|
||||||
return "werner";
|
return "werner";
|
||||||
|
|
||||||
case GRAVCONST:
|
case GRAVCONST:
|
||||||
return "6.67408e-11";
|
return "6.67408e-11";
|
||||||
|
|
||||||
case LTILTMAJAXIS:
|
case LTILTMAJAXIS:
|
||||||
return "0.0125";
|
return "0.0125";
|
||||||
|
|
||||||
case GTILTMINAXIS:
|
case GTILTMINAXIS:
|
||||||
return "0.0125";
|
return "0.0125";
|
||||||
|
|
||||||
case GTILTPA:
|
case GTILTPA:
|
||||||
case LTILTPA:
|
case LTILTPA:
|
||||||
return "0.0";
|
return "0.0";
|
||||||
|
|
||||||
case MASSUNCERT:
|
case MASSUNCERT:
|
||||||
return "0.01";
|
return "0.01";
|
||||||
|
|
||||||
case VSEARCHRAD_PCTGSD:
|
case VSEARCHRAD_PCTGSD:
|
||||||
return "0.25";
|
return "0.25";
|
||||||
|
|
||||||
case FSEARCHRAD_PCTGSD:
|
case FSEARCHRAD_PCTGSD:
|
||||||
return "0.5";
|
return "0.5";
|
||||||
|
|
||||||
default:
|
default:
|
||||||
return "NA";
|
return "NA";
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
return "NA";
|
||||||
}
|
}
|
||||||
return "NA";
|
;
|
||||||
|
|
||||||
};
|
|
||||||
|
|
||||||
}
|
}
|
||||||
|
|||||||
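The integer-truthiness convention used by isTrue above (0 or less = false, 1 or more = true, missing key = false) can be sketched standalone. ConfigFlags and the plain Map<String, String> below are illustrative stand-ins for the actual AltPipelnEnum API, and the sketch returns false on a parse failure where the real pipeline exits with an error:

```java
import java.util.HashMap;
import java.util.Map;

/** Standalone sketch of the pipeline's integer-valued boolean flag convention. */
public class ConfigFlags {

    /** Missing keys and non-positive or unparseable values are false; 1 or more is true. */
    public static boolean isTrue(Map<String, String> config, String key) {
        String raw = config.get(key);
        if (raw == null) return false;
        try {
            return Integer.parseInt(raw.trim()) > 0;
        } catch (NumberFormatException e) {
            return false; // the real pipeline prints an error and exits here instead
        }
    }

    public static void main(String[] args) {
        Map<String, String> config = new HashMap<>();
        config.put("DOGLOBALSHAPE", "1");
        config.put("LOCALPOISSON", "0");
        System.out.println(isTrue(config, "DOGLOBALSHAPE")); // true
        System.out.println(isTrue(config, "LOCALPOISSON"));  // false
        System.out.println(isTrue(config, "MISSINGKEY"));    // false
    }
}
```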
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

import java.util.ArrayList;
@@ -8,53 +30,49 @@ import terrasaur.enums.FitsHeaderType;

public class AltwgAnciGlobal extends AnciTableFits implements AnciFitsHeader {

  public AltwgAnciGlobal(FitsHdr fitsHeader) {
    super(fitsHeader, FitsHeaderType.ANCIGLOBALALTWG);
  }

  // methods below override the concrete methods in AnciTableFits abstract class or
  // are specific to this class

  /**
   * Create fits header as a list of HeaderCard. List contains the keywords in the order of
   * appearance in the ALTWG fits header. Overrides default implementation in AnciTableFits.
   */
  @Override
  public List<HeaderCard> createFitsHeader() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();

    headers.addAll(getHeaderInfo("header information"));
    headers.addAll(getMissionInfo("mission information"));
    headers.addAll(getIDInfo("identification info"));
    headers.addAll(getMapDataSrc("shape data source"));
    headers.addAll(getProcInfo("processing information"));
    headers.addAll(getMapInfo("map specific information"));
    headers.addAll(getSpatialInfo("summary spatial information"));
    headers.addAll(getSpecificInfo("product specific"));

    return headers;
  }

  /**
   * Contains OREX-SPOC specific keywords.
   */
  @Override
  public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SPOC_ID));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPAREA));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPDESC));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));

    return headers;
  }
}
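The section-assembly pattern these classes share in createFitsHeader — each section getter returns an ordered card list, prefixed by a COMMENT card when a non-empty section comment is supplied, and the sections are concatenated in a fixed order — can be sketched without the underlying FITS library. HeaderSketch, Card, and the sample keyword values below are illustrative stand-ins, not the repository's API:

```java
import java.util.ArrayList;
import java.util.List;

/** Illustrative stand-in for assembling an ordered FITS-style header card list. */
public class HeaderSketch {

    record Card(String key, String value) {}

    /** Each section starts with an optional COMMENT card, mirroring getIDInfo and friends. */
    static List<Card> section(String comment, Card... cards) {
        List<Card> out = new ArrayList<>();
        if (!comment.isEmpty()) {
            out.add(new Card("COMMENT", comment));
        }
        out.addAll(List.of(cards));
        return out;
    }

    /** Concatenate the per-section lists in a fixed order, as createFitsHeader does. */
    static List<Card> createHeader() {
        List<Card> headers = new ArrayList<>();
        headers.addAll(section("header information", new Card("HDRVERS", "1.0")));
        headers.addAll(section("identification info", new Card("MPHASE", "ops")));
        return headers;
    }

    public static void main(String[] args) {
        for (Card c : createHeader()) {
            System.out.println(c.key() + " = " + c.value());
        }
    }
}
```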
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

import java.util.ArrayList;
@@ -8,67 +30,66 @@ import terrasaur.enums.FitsHeaderType;

public class AltwgAnciGlobalFacetRelation extends AnciTableFits implements AnciFitsHeader {

  public AltwgAnciGlobalFacetRelation(FitsHdr fitsHeader) {

    super(fitsHeader, FitsHeaderType.ANCIG_FACETRELATION_ALTWG);
  }

  // methods below override the concrete methods in AnciTableFits abstract class or
  // are specific to this class

  /**
   * Create fits header as a list of HeaderCard. List contains the keywords in the order of
   * appearance in the ALTWG fits header. Overrides default implementation in AnciTableFits.
   */
  @Override
  public List<HeaderCard> createFitsHeader() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();

    headers.addAll(getHeaderInfo("header information"));
    headers.addAll(getMissionInfo("mission information"));
    headers.addAll(getIDInfo("identification info"));
    headers.addAll(getMapDataSrc("shape data source"));
    headers.addAll(getProcInfo("processing information"));
    headers.addAll(getMapInfo("map specific information"));
    headers.addAll(getSpatialInfo("summary spatial information"));
    headers.addAll(getSpecificInfo("product specific"));

    return headers;
  }

  /**
   * Return the HeaderCards associated with a specific product. By default we use the ALTWG specific
   * product keywords
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  @Override
  public List<HeaderCard> getSpecificInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }

    headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJINDX));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSDINDX));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSDINDXI));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SIGMA));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SIG_DEF));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_1));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_2));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DSIG_DEF));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DENSITY));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ROT_RATE));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.REF_POT));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_MAJ));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_MIN));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_PA));

    return headers;
  }
}
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

import java.util.ArrayList;
@@ -8,53 +30,49 @@ import terrasaur.enums.FitsHeaderType;

public class AltwgAnciLocal extends AnciTableFits implements AnciFitsHeader {

  public AltwgAnciLocal(FitsHdr fitsHeader) {
    super(fitsHeader, FitsHeaderType.ANCILOCALALTWG);
  }

  // methods below override the concrete methods in AnciTableFits abstract class or
  // are specific to this class

  /**
   * Create fits header as a list of HeaderCard. List contains the keywords in the order of
   * appearance in the ALTWG fits header. Overrides default implementation in AnciTableFits.
   */
  @Override
  public List<HeaderCard> createFitsHeader() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();

    headers.addAll(getHeaderInfo("header information"));
    headers.addAll(getMissionInfo("mission information"));
    headers.addAll(getIDInfo("identification info"));
    headers.addAll(getMapDataSrc("shape data source"));
    headers.addAll(getProcInfo("processing information"));
    headers.addAll(getMapInfo("map specific information"));
    headers.addAll(getSpatialInfo("summary spatial information"));
    headers.addAll(getSpecificInfo("product specific"));

    return headers;
  }

  /**
   * Contains OREX-SPOC specific keywords.
   */
  @Override
  public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SPOC_ID));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPAREA));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPDESC));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));

    return headers;
  }
}
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

import java.util.ArrayList;
@@ -11,77 +33,73 @@ import terrasaur.utils.DTMHeader;
 * Contains methods for building fits header corresponding to ALTWG Global DTM. Methods that are
 * specific to the ALTWG Global DTM fits header are contained here. Default methods contained in
 * DTMFits class.
 *
 * @author espirrc1
 *
 */
public class AltwgGlobalDTM extends DTMFits implements DTMHeader {

  public AltwgGlobalDTM(FitsHdr fitsHeader) {
    super(fitsHeader, FitsHeaderType.DTMGLOBALALTWG);
  }

  /**
   * Fits header block containing observation or ID related information. Includes keywords specific
   * to OREX-SPOC
   *
   * @return
   * @throws HeaderCardException
   */
  @Override
  public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SPOC_ID));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPAREA));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPDESC));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));

    return headers;
  }

  /**
   * return Fits header block that contains information about the fits header itself. Custom to
   * OREX-SPOC
   *
   * @return
   * @throws HeaderCardException
   */
  @Override
  public List<HeaderCard> getHeaderInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));

    return headers;
  }

  /**
   * Added GSDI - specific to OREX-SPOC.
   */
  @Override
  public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_NAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_VER));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_TYPE));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSD));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSDI));

    return headers;
  }
}
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

import java.util.ArrayList;

@@ -11,122 +33,119 @@ import terrasaur.utils.DTMHeader;
 * Contains methods for building fits header corresponding to ALTWG local DTM. Methods that are
 * specific to the ALTWG Local DTM fits header are contained here. Default methods are contained in
 * DTMFits class.
 *
 * @author espirrc1
 *
 */
public class AltwgLocalDTM extends DTMFits implements DTMHeader {

  public AltwgLocalDTM(FitsHdr fitsHeader) {
    super(fitsHeader, FitsHeaderType.DTMLOCALALTWG);
  }

  /**
   * Fits header block containing observation or ID related information. Includes keywords specific
   * to OREX-SPOC
   *
   * @return
   * @throws HeaderCardException
   */
  @Override
  public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SPOC_ID));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPAREA));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPDESC));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));

    return headers;
  }

  @Override
  public List<HeaderCard> getSpecificInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }

    headers.add(fitsHdr.getHeaderCardD(HeaderTag.SIGMA));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SIG_DEF));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DQUAL_1));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DQUAL_2));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.PXPERDEG));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.DENSITY));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.ROT_RATE));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.REF_POT));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.TILT_MAJ));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.TILT_MIN));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.TILT_PA));

    return headers;
  }

  /**
   * Include corner points, center vector and ux,uy,uz describing local plane
   */
  @Override
  public List<HeaderCard> getSpatialInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }

    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));

    headers.addAll(getCornerCards());
    headers.addAll(getCenterVec());
    headers.addAll(getUX());
    headers.addAll(getUY());
    headers.addAll(getUZ());

    return headers;
  }

  /**
   * return Fits header block that contains information about the fits header itself. Custom to
   * OREX-SPOC
   *
   * @return
   * @throws HeaderCardException
   */
  @Override
  public List<HeaderCard> getHeaderInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));

    return headers;
  }

  /**
   * Added GSDI - specific to OREX-SPOC.
   */
  @Override
  public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_NAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_VER));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_TYPE));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSD));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSDI));

    return headers;
  }
}
@@ -1,12 +1,32 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

import java.util.List;

import nom.tam.fits.HeaderCard;
import nom.tam.fits.HeaderCardException;

public interface AnciFitsHeader {

  public List<HeaderCard> createFitsHeader() throws HeaderCardException;

}
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

import java.util.ArrayList;

@@ -10,282 +32,272 @@ import terrasaur.enums.FitsHeaderType;
 * Abstract generic class with concrete methods and attributes for creating a FITS table with
 * generalized fits header. Specific implementations can be written to create custom fits headers as
 * needed.
 *
 * @author espirrc1
 *
 */
public abstract class AnciTableFits {

  public final String COMMENT = "COMMENT";
  FitsHdr fitsHdr;
  public final FitsHeaderType fitsHeaderType;

  public AnciTableFits(FitsHdr fitsHdr, FitsHeaderType fitsHeaderType) {
    this.fitsHdr = fitsHdr;
    this.fitsHeaderType = fitsHeaderType;
  }

  public List<HeaderCard> createFitsHeader() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();

    headers.addAll(getHeaderInfo("header information"));
    headers.addAll(getMissionInfo("mission information"));
    headers.addAll(getIDInfo("identification info"));
    headers.addAll(getMapDataSrc("shape data source"));
    headers.addAll(getProcInfo("processing information"));
    headers.addAll(getMapInfo("map specific information"));
    headers.addAll(getSpatialInfo("summary spatial information"));

    return headers;
  }

  /**
   * return Fits header block that contains information about the fits header itself.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getHeaderInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));

    return headers;
  }

  /**
   * Fits header block containing information about the mission.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getMissionInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MISSION));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HOSTNAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.TARGET));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));

    return headers;
  }

  /**
   * Fits header block containing observation or ID related information.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));

    return headers;
  }

  /**
   * Fits header block containing information about the source data used to create the map.
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getMapDataSrc(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRC));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCV));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCD));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.CREATOR));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJ_FILE));
    return headers;
  }

  public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_NAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_VER));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_TYPE));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSD));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSDI));

    return headers;
  }

  /**
   * Fits header block containing information about the software processing done to generate the
   * product.
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getProcInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODNAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATEPRD));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFTWARE));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));

    return headers;
  }

  public List<HeaderCard> getSpatialInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }

    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));

    headers.addAll(getCornerCards());

    // add CNTR_V_X,Y,Z
    headers.addAll(getCenterVec());

    // add UX_X,Y,Z
    headers.addAll(getUX());

    // add UY_X,Y,Z
    headers.addAll(getUY());

    // add UZ_X,Y,Z
    headers.addAll(getUZ());

    return headers;
  }

  /**
   * Return the HeaderCards associated with a specific product. By default we use the ALTWG specific
   * product keywords
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getSpecificInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }

    headers.add(fitsHdr.getHeaderCard(HeaderTag.SIGMA));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SIG_DEF));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_1));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_2));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DSIG_DEF));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DENSITY));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ROT_RATE));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.REF_POT));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_MAJ));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_MIN));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_PA));

    return headers;
  }

  /**
   * Return headercards associated with the upper/lower left/right corners of the image.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getCornerCards() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    String fmtS = "%18.13f";
    headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLNG, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLAT, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLNG, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLAT, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLNG, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLAT, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLNG, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLAT, fmtS));

    return headers;
  }

  /**
   * Return headercards for vector to center of image.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getCenterVec() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_X));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_Y));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_Z));

    return headers;
  }

  public List<HeaderCard> getUX() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_X));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Y));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Z));

    return headers;
  }

  public List<HeaderCard> getUY() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_X));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Y));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Z));

    return headers;
  }

  public List<HeaderCard> getUZ() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_X));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Y));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Z));

    return headers;
  }
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLNG, fmtS));
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLAT, fmtS));
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLNG, fmtS));
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLAT, fmtS));
|
||||||
|
|
||||||
|
return headers;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Return headercards for vector to center of image.
|
||||||
|
*
|
||||||
|
* @return
|
||||||
|
* @throws HeaderCardException
|
||||||
|
*/
|
||||||
|
public List<HeaderCard> getCenterVec() throws HeaderCardException {
|
||||||
|
|
||||||
|
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_X));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_Y));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_Z));
|
||||||
|
|
||||||
|
return headers;
|
||||||
|
}
|
||||||
|
|
||||||
|
public List<HeaderCard> getUX() throws HeaderCardException {
|
||||||
|
|
||||||
|
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_X));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Y));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Z));
|
||||||
|
|
||||||
|
return headers;
|
||||||
|
}
|
||||||
|
|
||||||
|
public List<HeaderCard> getUY() throws HeaderCardException {
|
||||||
|
|
||||||
|
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_X));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Y));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Z));
|
||||||
|
|
||||||
|
return headers;
|
||||||
|
}
|
||||||
|
|
||||||
|
public List<HeaderCard> getUZ() throws HeaderCardException {
|
||||||
|
|
||||||
|
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_X));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Y));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Z));
|
||||||
|
|
||||||
|
return headers;
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
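The corner-card pattern above pairs each keyword (LLCLNG, LLCLAT, ..., ULCLAT) with an angle formatted to a fixed `%18.13f` width. As a minimal self-contained sketch of that pattern — `CornerCardSketch` and `SimpleCard` are hypothetical stand-ins, not terrasaur or nom-tam-fits classes:

```java
import java.util.ArrayList;
import java.util.List;

public class CornerCardSketch {

  // Minimal keyword/value pair standing in for nom.tam.fits.HeaderCard.
  record SimpleCard(String key, String value) {}

  // Emit the eight corner cards in the same LL/UR/LR/UL order as the
  // getCornerCards() method above, using the same "%18.13f" numeric format.
  static List<SimpleCard> cornerCards(double[] lons, double[] lats) {
    String fmt = "%18.13f";
    String[] tags = {"LLCLNG", "LLCLAT", "URCLNG", "URCLAT",
                     "LRCLNG", "LRCLAT", "ULCLNG", "ULCLAT"};
    List<SimpleCard> cards = new ArrayList<>();
    for (int i = 0; i < 4; i++) {
      cards.add(new SimpleCard(tags[2 * i], String.format(fmt, lons[i])));
      cards.add(new SimpleCard(tags[2 * i + 1], String.format(fmt, lats[i])));
    }
    return cards;
  }
}
```

The fixed-width format keeps the value field of each FITS card aligned regardless of the angle's magnitude.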
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

import java.util.ArrayList;
@@ -10,342 +32,332 @@ import terrasaur.enums.FitsHeaderType;
/**
 * Abstract generic class with concrete methods and attributes for creating a FITS DTM cube with a
 * generalized fits header. Specific implementations can be written to create custom fits headers as
 * needed.
 *
 * @author espirrc1
 *
 */
public abstract class DTMFits {

  public final String COMMENT = "COMMENT";
  final FitsHdr fitsHdr;
  private FitsData fitsData;
  private boolean dataContained = false;
  public final FitsHeaderType fitsHeaderType;

  public DTMFits(FitsHdr fitsHdr, FitsHeaderType fitsHeaderType) {
    this.fitsHdr = fitsHdr;
    this.fitsHeaderType = fitsHeaderType;
  }

  public void setData(FitsData fitsData) {
    this.fitsData = fitsData;
    dataContained = true;
  }

  public List<HeaderCard> createFitsHeader(List<HeaderCard> planeList) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();

    headers.addAll(getHeaderInfo("header information"));
    headers.addAll(getMissionInfo("mission information"));
    headers.addAll(getIDInfo("identification info"));
    headers.addAll(getMapDataSrc("data source"));
    headers.addAll(getProcInfo("processing information"));
    headers.addAll(getMapInfo("map specific information"));
    headers.addAll(getSpatialInfo("summary spatial information"));
    headers.addAll(getPlaneInfo("plane information", planeList));
    headers.addAll(getSpecificInfo("product specific"));

    // end keyword
    headers.add(getEnd());

    return headers;
  }

  /**
   * return Fits header block that contains information about the fits header itself. No string
   * passed, so no comment in header.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getHeaderInfo() throws HeaderCardException {
    return getHeaderInfo("");
  }

  /**
   * return Fits header block that contains information about the fits header itself. This is a
   * custom section and so is left empty here. It can be defined in the concrete classes that extend
   * this class.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getHeaderInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    return headers;
  }

  /**
   * Fits header block containing information about the mission.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getMissionInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MISSION));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HOSTNAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.TARGET));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));

    return headers;
  }

  /**
   * Fits header block containing observation or ID related information.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));

    return headers;
  }

  /**
   * Fits header block containing information about the source data used to create the map.
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getMapDataSrc(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRC));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCV));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCD));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJ_FILE));

    return headers;
  }

  public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_NAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_VER));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_TYPE));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSD));

    return headers;
  }

  /**
   * Fits header block containing information about the software processing done to generate the
   * product.
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getProcInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODNAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATEPRD));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFTWARE));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));

    return headers;
  }

  /**
   * Creates header block containing spatial information for the DTM, e.g. corner locations, vector
   * to center, Ux, Uy, Uz.
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getSpatialInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));
    headers.addAll(getCornerCards());

    // remove these keywords. They are specific to local and MLNs
    // headers.addAll(getCenterVec());
    // headers.addAll(getUX());
    // headers.addAll(getUY());
    // headers.addAll(getUZ());

    return headers;
  }

  /**
   * Return the HeaderCards describing each DTM plane. Used to build the portion of the fits header
   * that contains information about the planes in the DTM cube. Checks to see that all data planes
   * are described by comparing size of planeList against length of fits data.
   *
   * @param comment
   * @param planeList
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getPlaneInfo(String comment, List<HeaderCard> planeList) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.addAll(planeList);

    if (!dataContained) {
      String errMesg =
          "ERROR! Cannot return keywords describing the DTM cube without " + "having the actual data!";
      throw new RuntimeException(errMesg);
    }

    // check if planeList describes all the planes in data, throw runtime exception if not.
    if (planeList.size() != fitsData.getData().length) {
      System.out.println("Error: plane List has " + planeList.size() + " planes but datacube has "
          + fitsData.getData().length + " planes");
      for (HeaderCard thisPlane : planeList) {
        System.out.println(thisPlane.getKey() + ":" + thisPlane.getValue());
      }
      String errMesg = "Error: plane List has " + planeList.size() + " planes but datacube " + " has "
          + fitsData.getData().length + " planes";
      throw new RuntimeException(errMesg);
    }

    return headers;
  }

  /**
   * Return the HeaderCards associated with a specific product.
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getSpecificInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }

    headers.add(fitsHdr.getHeaderCardD(HeaderTag.SIGMA));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SIG_DEF));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.PXPERDEG));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.DENSITY));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.ROT_RATE));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.REF_POT));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.TILT_MAJ));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.TILT_MIN));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.TILT_PA));

    return headers;
  }

  /**
   * Return headercards associated with the upper/lower left/right corners of the image.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getCornerCards() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    String fmtS = "%18.13f";
    headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLNG, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLAT, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLNG, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLAT, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLNG, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLAT, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLNG, fmtS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLAT, fmtS));

    return headers;
  }

  /**
   * Return headercards for vector to center of image.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getCenterVec() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_X));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_Y));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_Z));

    return headers;
  }

  public List<HeaderCard> getUX() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_X));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Y));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Z));

    return headers;
  }

  public List<HeaderCard> getUY() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_X));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Y));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Z));

    return headers;
  }

  public List<HeaderCard> getUZ() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_X));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Y));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Z));

    return headers;
  }

  public HeaderCard getEnd() throws HeaderCardException {
    return new HeaderCard(HeaderTag.END.toString(), HeaderTag.END.value(), HeaderTag.END.comment());
  }
}
||||||
|
|||||||
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

import terrasaur.enums.AltwgDataType;
@@ -5,186 +27,184 @@ import terrasaur.enums.SrcProductType;

public class FitsData {

  private final double[][][] data;
  private final double[] V;
  private final double[] ux;
  private final double[] uy;
  private final double[] uz;
  private final double scale;
  private final double gsd;
  private final boolean hasV;
  private final boolean hasUnitv;
  private final boolean hasGsd;
  private final boolean isGlobal;
  private final boolean hasAltType;
  private final AltwgDataType altProd;
  private final String dataSource;

  private FitsData(FitsDataBuilder b) {
    this.data = b.data;
    this.V = b.V;
    this.ux = b.ux;
    this.uy = b.uy;
    this.uz = b.uz;
    this.scale = b.scale;
    this.gsd = b.gsd;
    this.hasV = b.hasV;
    this.hasUnitv = b.hasUnitv;
    this.hasGsd = b.hasGsd;
    this.hasAltType = b.hasAltType;
    this.isGlobal = b.isGlobal;
    this.altProd = b.altProd;
    this.dataSource = b.dataSource;
  }

  public AltwgDataType getAltProdType() {
    return this.altProd;
  }

  public String getSrcProdType() {
    return this.dataSource;
  }

  public double[][][] getData() {
    return this.data;
  }

  public boolean hasV() {
    return this.hasV;
  }

  public double[] getV() {
    return this.V;
  }

  public boolean hasUnitv() {
    return this.hasUnitv;
  }

  public double[] getUnit(UnitDir udir) {
    switch (udir) {
      case UX:
        return this.ux;

      case UY:
        return this.uy;

      case UZ:
        return this.uz;

      default:
        throw new RuntimeException();
    }
  }

  public double getScale() {
    return this.scale;
  }

  public boolean hasGsd() {
    return this.hasGsd;
  }

  public boolean hasAltType() {
    return this.hasAltType;
  }

  public boolean isGlobal() {
    return this.isGlobal;
  }

  public double getGSD() {
    if (this.hasGsd) {
      return this.gsd;
    } else {
      String errMesg = "ERROR! fitsData does not have gsd!";
      throw new RuntimeException(errMesg);
    }
  }

  public static class FitsDataBuilder {

    private final double[][][] data;
    private double[] V = null;
    private double[] ux = null;
    private double[] uy = null;
    private double[] uz = null;
    private boolean hasV = false;
    private boolean hasUnitv = false;
    private boolean hasGsd = false;
    private boolean isGlobal = false;
    private boolean hasAltType = false;
    private double scale = Double.NaN;
    private double gsd = Double.NaN;
    private AltwgDataType altProd = null;
    private String dataSource = SrcProductType.UNKNOWN.toString();

    /**
     * Constructor. isGlobal used to fill out fits keyword describing whether data is local or
     * global. May also be used for fits naming convention.
     *
     * @param data
     * @param isGlobal
     */
    public FitsDataBuilder(double[][][] data, boolean isGlobal) {
      this.data = data;
      this.isGlobal = isGlobal;
    }

    public FitsDataBuilder setAltProdType(AltwgDataType altProd) {
      this.altProd = altProd;
      this.hasAltType = true;
      return this;
    }

    public FitsDataBuilder setDataSource(String dataSource) {
      this.dataSource = dataSource;
      return this;
    }

    public FitsDataBuilder setV(double[] V) {
      this.V = V;
      this.hasV = true;
      return this;
    }

    public FitsDataBuilder setU(double[] uvec, UnitDir udir) {
      switch (udir) {
        case UX:
          this.ux = uvec;
          this.hasUnitv = true;
          break;

        case UY:
          this.uy = uvec;
          this.hasUnitv = true;
          break;

        case UZ:
          this.uz = uvec;
          this.hasUnitv = true;
          break;

        default:
          throw new RuntimeException();
      }
      return this;
    }

    public FitsDataBuilder setScale(double scale) {
      this.scale = scale;
      return this;
    }

    public FitsDataBuilder setGSD(double gsd) {
      this.gsd = gsd;
      this.hasGsd = true;
      return this;
    }

    public FitsData build() {
      return new FitsData(this);
    }
  }
}
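FitsData is built through its nested FitsDataBuilder: required fields (the data cube and the local/global flag) go into the constructor, optional planes and vectors are added through chainable setters that also flip the matching presence flag, and build() freezes everything into the immutable product. A minimal, self-contained sketch of that pattern follows; it is a trimmed stand-in with hypothetical values, not the real class (which also carries vectors, AltwgDataType, and data-source fields).

```java
// Trimmed stand-in mirroring the FitsData/FitsDataBuilder pattern above.
// Field and method names follow the class; the data values are hypothetical.
public class FitsDataDemo {

  static final class Data {
    final double[][][] planes;
    final double scale;
    final double gsd;
    final boolean hasGsd;

    private Data(Builder b) {
      this.planes = b.planes;
      this.scale = b.scale;
      this.gsd = b.gsd;
      this.hasGsd = b.hasGsd;
    }
  }

  static final class Builder {
    private final double[][][] planes;
    private double scale = Double.NaN;
    private double gsd = Double.NaN;
    private boolean hasGsd = false;

    Builder(double[][][] planes) {
      this.planes = planes;
    }

    Builder setScale(double scale) {
      this.scale = scale;
      return this;
    }

    Builder setGSD(double gsd) {
      this.gsd = gsd;
      this.hasGsd = true; // setting GSD also flips the presence flag, as above
      return this;
    }

    Data build() {
      return new Data(this);
    }
  }

  // Build a one-plane 2x2 product with a hypothetical 0.05 ground sample distance.
  static Data example() {
    return new Builder(new double[1][2][2]).setScale(0.05).setGSD(0.05).build();
  }

  public static void main(String[] args) {
    Data d = example();
    System.out.println(d.hasGsd + " " + d.gsd);
  }
}
```

The presence flags (hasGsd, hasV, hasUnitv) let readers like getGSD() fail fast with an exception instead of silently returning the NaN defaults.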
File diff suppressed because it is too large
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

import java.util.ArrayList;
@@ -11,203 +33,193 @@ import terrasaur.utils.DTMHeader;
/**
 * Factory class that returns List<HeaderCard> where the HeaderCard objects are in the correct order
 * for writing to a fits header. Also contains methods for creating List<HeaderCard> for different
 * sections of a fits header
 *
 * @author espirrc1
 *
 */
public class FitsHeaderFactory {

  private static final String PLANE = "PLANE";
  private static final String COMMENT = "COMMENT";

  public static DTMHeader getDTMHeader(FitsHdr fitsHdr, FitsHeaderType headerType) {
    switch (headerType) {
      case NFTMLN:
        return new NFTmln(fitsHdr);

      case DTMLOCALALTWG:
        return new AltwgLocalDTM(fitsHdr);

      case DTMGLOBALALTWG:
        return new AltwgGlobalDTM(fitsHdr);

      case DTMGLOBALGENERIC:
        return new GenericGlobalDTM(fitsHdr);

      case DTMLOCALGENERIC:
        return new GenericLocalDTM(fitsHdr);

      default:
        return null;
    }
  }

  public static AnciFitsHeader getAnciHeader(FitsHdr fitsHdr, FitsHeaderType headerType) {
    switch (headerType) {
      case ANCIGLOBALGENERIC:
        return new GenericAnciGlobal(fitsHdr);

      case ANCILOCALGENERIC:
        return new GenericAnciLocal(fitsHdr);

      case ANCIGLOBALALTWG:
        return new AltwgAnciGlobal(fitsHdr);

      case ANCIG_FACETRELATION_ALTWG:
        return new AltwgAnciGlobalFacetRelation(fitsHdr);

      case ANCILOCALALTWG:
        return new AltwgAnciLocal(fitsHdr);

      default:
        return null;
    }
  }

  /**
   * Fits Header block that contains information about the fits header itself. Ex. Header version
   * number.
   *
   * @param fitsHdr
   * @return
   * @throws HeaderCardException
   */
  public static List<HeaderCard> getHeaderInfo(FitsHeader fitsHdr) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(new HeaderCard(COMMENT, "header information", false));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));

    return headers;
  }

  /**
   * Fits header block that contains information about the mission Ex. MISSION name, HOST name,
   * Target name.
   *
   * @param fitsHdr
   * @return
   * @throws HeaderCardException
   */
  public static List<HeaderCard> getMissionInfo(FitsHeader fitsHdr) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(new HeaderCard(COMMENT, "mission information", false));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MISSION));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HOSTNAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.TARGET));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));

    return headers;
  }

  /**
   * Fits header block that contains ID information, i.e. information that would uniquely identify
   * the data product.
   *
   * @param fitsHdr
   * @return
   * @throws HeaderCardException
   */
  public static List<HeaderCard> getIdInfo(FitsHeader fitsHdr) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(new HeaderCard(COMMENT, "identification info", false));

    // check latest Map Format SIS revision to see if SPOC handles these keywords
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SPOC_ID));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPAREA));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPDESC));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));

    return headers;
  }

  /**
   * Fits header block that contains information about the source shape model used to create the
   * fits file.
   *
   * @param fitsHdr
   * @return
   * @throws HeaderCardException
   */
  public static List<HeaderCard> getShapeSourceInfo(FitsHeader fitsHdr) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(new HeaderCard(COMMENT, "shape data source", false));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRC));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCV));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCD));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJ_FILE));

    return headers;
  }

  public static List<HeaderCard> getProcInfo(FitsHeader fitsHdr) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(new HeaderCard(COMMENT, "processing information", false));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODNAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATEPRD));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFTWARE));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));

    return headers;
  }

  public static List<HeaderCard> getMapSpecificInfo(FitsHeader fitsHdr) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(new HeaderCard(COMMENT, "map specific information", false));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_NAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_VER));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_TYPE));

    return headers;

    // check latest Map Format SIS revision to see if SPOC handles these keywords
    // MAP_PROJ*
    // GSD*
    // GSDI*
  }

  public static List<HeaderCard> getSummarySpatialInfo(FitsHeader fitsHdr,
      FitsHeaderType fitsHeaderType) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();

    // this section common to all fitsHeaderTypes
    headers.add(new HeaderCard(COMMENT, "summary spatial information", false));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.CLON));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.CLAT));

    switch (fitsHeaderType) {
      case ANCILOCALGENERIC:
        headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLNG));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLAT));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLNG));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLAT));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLNG));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLAT));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLNG));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLAT));
        break;

      default:
        // default does nothing because switch only handles specific cases.
        break;
    }
    return headers;
  }
}
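The two factory methods above dispatch on a header-type enum, return one concrete header implementation per handled case, and fall through to null for anything else. A self-contained sketch of that dispatch shape follows; the enum constants and the stand-in DTMHeader interface here are simplified placeholders, not the real terrasaur types.

```java
// Stand-in illustrating the enum-dispatch factory shape used by FitsHeaderFactory.
// HeaderType and DTMHeader are simplified placeholders for the real types.
public class HeaderFactoryDemo {

  enum HeaderType { NFTMLN, DTMLOCALALTWG, OTHER }

  interface DTMHeader {
    String name();
  }

  // Mirrors getDTMHeader: one concrete header per enum case, null when unhandled.
  static DTMHeader getDTMHeader(HeaderType type) {
    switch (type) {
      case NFTMLN:
        return () -> "NFTmln";
      case DTMLOCALALTWG:
        return () -> "AltwgLocalDTM";
      default:
        return null;
    }
  }

  public static void main(String[] args) {
    System.out.println(getDTMHeader(HeaderType.NFTMLN).name());
  }
}
```

Callers of the real factory must be prepared for the null return on unhandled types, since the default branch deliberately does not throw.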
File diff suppressed because it is too large
@@ -1,39 +1,60 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

/**
 * Container class for storing the value and comment associated with a given fits keyword
 *
 * @author espirrc1
 *
 */
public class FitsValCom {
  private String value;
  private String comment;

  public FitsValCom(String value, String comment) {
    this.value = value;
    this.comment = comment;
  }

  public String getV() {
    return value;
  }

  public String getC() {
    return comment;
  }

  public void setV(String newVal) {
    this.value = newVal;
  }

  public void setC(String newComment) {
    this.comment = newComment;
  }

  public void setVC(String newVal, String newComment) {
    setV(newVal);
    setC(newComment);
  }
}
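FitsValCom is a small mutable value/comment pair keyed implicitly by the FITS keyword it is stored under, with setVC updating both halves in one call. A runnable sketch of that round trip follows; it embeds a local copy of the class so the snippet compiles on its own, and the radius strings are hypothetical example values.

```java
// Self-contained demo of FitsValCom; the nested class is a local copy of the
// container above, and the keyword value/comment strings are hypothetical.
public class FitsValComDemo {

  static class FitsValCom {
    private String value;
    private String comment;

    FitsValCom(String value, String comment) {
      this.value = value;
      this.comment = comment;
    }

    String getV() { return value; }
    String getC() { return comment; }

    void setV(String newVal) { this.value = newVal; }
    void setC(String newComment) { this.comment = newComment; }

    // Update value and comment together, as in the class above.
    void setVC(String newVal, String newComment) {
      setV(newVal);
      setC(newComment);
    }
  }

  // Store a keyword's value, then replace value and comment in one call.
  static String roundTrip() {
    FitsValCom vc = new FitsValCom("0.246", "nominal radius, km");
    vc.setVC("0.2625", "volumetric mean radius, km");
    return vc.getV() + "|" + vc.getC();
  }

  public static void main(String[] args) {
    System.out.println(roundTrip());
  }
}
```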
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

import java.util.ArrayList;
@@ -9,38 +31,36 @@ import terrasaur.enums.FitsHeaderType;
/**
 * Contains methods for building fits header corresponding to the Generic Ancillary Global fits
 * header as specified in the Map Formats SIS.
 *
 * See concrete methods and attributes in AnciTableFits unless overridden. Overridden or
||||||
* implementation methods specific to this fits type are here.
|
* implementation methods specific to this fits type are here.
|
||||||
*
|
*
|
||||||
* @author espirrc1
|
* @author espirrc1
|
||||||
*
|
*
|
||||||
*/
|
*/
|
||||||
public class GenericAnciGlobal extends AnciTableFits implements AnciFitsHeader {
|
public class GenericAnciGlobal extends AnciTableFits implements AnciFitsHeader {
|
||||||
|
|
||||||
public GenericAnciGlobal(FitsHdr fitsHeader) {
|
public GenericAnciGlobal(FitsHdr fitsHeader) {
|
||||||
super(fitsHeader, FitsHeaderType.ANCIGLOBALGENERIC);
|
super(fitsHeader, FitsHeaderType.ANCIGLOBALGENERIC);
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Build fits header portion that contains the spatial information of the Generic Anci Global
|
|
||||||
* product. Overrides the default implementation in AnciTableFits.
|
|
||||||
*/
|
|
||||||
@Override
|
|
||||||
public List<HeaderCard> getSpatialInfo(String comment) throws HeaderCardException {
|
|
||||||
|
|
||||||
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
|
||||||
if (comment.length() > 0) {
|
|
||||||
headers.add(new HeaderCard(COMMENT, comment, false));
|
|
||||||
}
|
}
|
||||||
|
|
||||||
// the generic Global Anci product ONLY CONTAINS THE CENTER LAT LON!
|
/**
|
||||||
// per map_format_fits_header_normalization_09212917_V02.xlsx
|
* Build fits header portion that contains the spatial information of the Generic Anci Global
|
||||||
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
|
* product. Overrides the default implementation in AnciTableFits.
|
||||||
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));
|
*/
|
||||||
|
@Override
|
||||||
|
public List<HeaderCard> getSpatialInfo(String comment) throws HeaderCardException {
|
||||||
|
|
||||||
return headers;
|
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
||||||
}
|
if (comment.length() > 0) {
|
||||||
|
headers.add(new HeaderCard(COMMENT, comment, false));
|
||||||
|
}
|
||||||
|
|
||||||
|
// the generic Global Anci product ONLY CONTAINS THE CENTER LAT LON!
|
||||||
|
// per map_format_fits_header_normalization_09212917_V02.xlsx
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));
|
||||||
|
|
||||||
|
return headers;
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
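The getSpatialInfo override above follows a simple pattern: an optional leading COMMENT card, then exactly the two center-coordinate cards. A rough sketch of that control flow, with plain strings standing in for the nom.tam.fits HeaderCard objects (the `SpatialInfoSketch` class and its string entries are illustrative assumptions, not the repository's API):

```java
import java.util.ArrayList;
import java.util.List;

public class SpatialInfoSketch {
  // Mirrors the control flow of GenericAnciGlobal.getSpatialInfo: an optional
  // leading COMMENT entry, then only the CLON and CLAT cards.
  static List<String> getSpatialInfo(String comment) {
    List<String> headers = new ArrayList<>();
    if (comment.length() > 0) {
      headers.add("COMMENT " + comment);
    }
    headers.add("CLON");
    headers.add("CLAT");
    return headers;
  }

  public static void main(String[] args) {
    System.out.println(getSpatialInfo("Spatial Information"));
    System.out.println(getSpatialInfo(""));  // no COMMENT entry when empty
  }
}
```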
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 * (remainder of the standard MIT license text, identical to the header above)
 */
package terrasaur.fits;

import terrasaur.enums.FitsHeaderType;

@@ -5,18 +27,16 @@ import terrasaur.enums.FitsHeaderType;
/**
 * Contains methods for building fits header corresponding to the Generic Ancillary Local fits
 * header as specified in the Map Formats SIS.
 *
 * See concrete methods and attributes in AnciTableFits unless overridden. Overridden or
 * implementation methods specific to this fits type are here.
 *
 * @author espirrc1
 */
public class GenericAnciLocal extends AnciTableFits implements AnciFitsHeader {

  public GenericAnciLocal(FitsHdr fitsHeader) {
    super(fitsHeader, FitsHeaderType.ANCILOCALGENERIC);
  }
}
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 * (remainder of the standard MIT license text, identical to the header above)
 */
package terrasaur.fits;

import terrasaur.enums.FitsHeaderType;

@@ -6,14 +28,13 @@ import terrasaur.utils.DTMHeader;
/**
 * Contains methods for building generic Global DTM fits header. Generic Global DTM header will
 * contain only those keywords that are deemed common to all Global DTM fits files.
 *
 * @author espirrc1
 */
public class GenericGlobalDTM extends DTMFits implements DTMHeader {

  public GenericGlobalDTM(FitsHdr fitsHeader) {
    super(fitsHeader, FitsHeaderType.DTMGLOBALGENERIC);
  }
}
@@ -1,3 +1,25 @@
|
|||||||
|
/*
|
||||||
|
* The MIT License
|
||||||
|
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
|
||||||
|
*
|
||||||
|
* Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||||
|
* of this software and associated documentation files (the "Software"), to deal
|
||||||
|
* in the Software without restriction, including without limitation the rights
|
||||||
|
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||||
|
* copies of the Software, and to permit persons to whom the Software is
|
||||||
|
* furnished to do so, subject to the following conditions:
|
||||||
|
*
|
||||||
|
* The above copyright notice and this permission notice shall be included in
|
||||||
|
* all copies or substantial portions of the Software.
|
||||||
|
*
|
||||||
|
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||||
|
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||||
|
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||||
|
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||||
|
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||||
|
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
|
||||||
|
* THE SOFTWARE.
|
||||||
|
*/
|
||||||
package terrasaur.fits;
|
package terrasaur.fits;
|
||||||
|
|
||||||
import terrasaur.enums.FitsHeaderType;
|
import terrasaur.enums.FitsHeaderType;
|
||||||
@@ -6,14 +28,13 @@ import terrasaur.utils.DTMHeader;
|
|||||||
/**
|
/**
|
||||||
* Contains methods for building generic Local DTM Fits Header. DTM header will contain only those
|
* Contains methods for building generic Local DTM Fits Header. DTM header will contain only those
|
||||||
* keywords that are deemed common to all Local DTM fits files.
|
* keywords that are deemed common to all Local DTM fits files.
|
||||||
*
|
*
|
||||||
* @author espirrc1
|
* @author espirrc1
|
||||||
*
|
*
|
||||||
*/
|
*/
|
||||||
public class GenericLocalDTM extends DTMFits implements DTMHeader {
|
public class GenericLocalDTM extends DTMFits implements DTMHeader {
|
||||||
|
|
||||||
public GenericLocalDTM(FitsHdr fitsHeader) {
|
public GenericLocalDTM(FitsHdr fitsHeader) {
|
||||||
super(fitsHeader, FitsHeaderType.DTMLOCALGENERIC);
|
super(fitsHeader, FitsHeaderType.DTMLOCALGENERIC);
|
||||||
}
|
}
|
||||||
|
|
||||||
}
|
}
|
||||||
|
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 * (remainder of the standard MIT license text, identical to the header above)
 */
package terrasaur.fits;

import java.util.EnumSet;

@@ -9,153 +31,177 @@ import java.util.EnumSet;
 * enumeration. A few keywords may have overlap. For example, SIGMA is defined here to represent the
 * global uncertainty of the data. It is also in PlaneInfo to define an entire image plane
 * consisting of sigma values.
 *
 * @author espirrc1
 */
public enum HeaderTag {

  // fits data tags. Included here in case we want to have values or comments
  // that are not null. Some of the default values obviously need to be updated when actual fits
  // file is created.
  SIMPLE("T", null),
  BITPIX(null, null),
  NAXIS("3", null),
  NAXIS1(null, null),
  NAXIS2(null, null),
  NAXIS3(null, null),
  EXTEND("T", null),
  HDRVERS("1.0.0", null),
  NPRDVERS("1.0.0", null),
  MISSION("OSIRIS-REx", "Name of mission"),
  HOSTNAME("OREX", "PDS ID"),
  TARGET("101955 BENNU", "Target object"),
  ORIGIN("OREXSPOC", null),
  SPOC_ID("SPOCUPLOAD", null),
  SDPAREA("SPOCUPLOAD", null),
  SDPDESC("SPOCUPLOAD", null),
  MPHASE("FillMeIn", "Mission phase."),
  DATASRC("FillMeIn", "Shape model data source, i.e. 'SPC' or 'OLA'"),
  DATASRCF("FillMeIn", "Source shape model data filename"),
  DATASRCV("FillMeIn", "Name and version of shape model"),
  DATASRCD("FillMeIn", "Creation date of shape model in UTC"),
  DATASRCS("N/A", "[m/pix] Shpe model plt scale"),
  SOFTWARE("FillMeIn", "Software used to create map data"),
  SOFT_VER("FillMeIn", "Version of software used to create map data"),
  DATEPRD("1701-10-09", "Date this product was produced in UTC"),
  DATENPRD("1701-10-09", "Date this NFT product was produced in UTC"),
  PRODNAME("FillMeIn", "Product filename"),
  PRODVERS("1.0.0", "Product version number"),
  MAP_VER("999", "Product version number."),
  CREATOR("ALT-pipeline", "Name of software that created this product"),
  AUTHOR("Espiritu", "Name of person that compiled this product"),
  PROJECTN("Equirectangular", "Simple cylindrical projection"),
  CLON("-999", "[deg] longitude at center of image"),
  CLAT("-999", "[deg] latitude at center of image"),
  MINLON(null, "[deg] minimum longitude of global DTM"),
  MAXLON(null, "[deg] maximum longitude of global DTM"),
  MINLAT(null, "[deg] minimum latitude of global DTM"),
  MAXLAT(null, "[deg] maximum latitude of global DTM"),
  PXPERDEG("-999", "[pixel per degree] grid spacing of global map."),
  LLCLAT("-999", "[deg]"),
  LLCLNG("-999", "[deg]"),
  ULCLAT("-999", "[deg]"),
  ULCLNG("-999", "[deg]"),
  URCLAT("-999", "[deg]"),
  URCLNG("-999", "[deg]"),
  LRCLAT("-999", "[deg]"),
  LRCLNG("-999", "[deg]"),
  CNTR_V_X("-999", "[km]"),
  CNTR_V_Y("-999", "[km]"),
  CNTR_V_Z("-999", "[km]"),
  UX_X("-999", "[m]"),
  UX_Y("-999", "[m]"),
  UX_Z("-999", "[m]"),
  UY_X("-999", "[m]"),
  UY_Y("-999", "[m]"),
  UY_Z("-999", "[m]"),
  UZ_X("-999", "[m]"),
  UZ_Y("-999", "/[m]"),
  UZ_Z("-999", "[m]"),
  GSD("-999", "[mm] grid spacing in units/pixel"),
  GSDI("-999", "[unk] Ground sample distance integer"),
  SIGMA("-999", "Global uncertainty of the data [m]"),
  SIG_DEF("Uncertainty", "SIGMA uncertainty metric"),
  DQUAL_1("-999", "Data quality metric; incidence directions"),
  DQUAL_2("-999", "Data quality metric; emission directions"),
  DSIG_DEF("UNK", "Defines uncertainty metric in ancillary file"),
  END(null, null),

  // additional fits tags describing gravity derived values
  DENSITY("-999", "[kgm^-3] density of body"),
  ROT_RATE("-999", "[rad/sec] rotation rate of body"),
  REF_POT("-999", "[J/kg] reference potential of body"),

  // additional fits tags describing tilt derived values
  TILT_RAD("-999", "[m]"),
  TILT_MAJ("-999", "[m] semi-major axis of ellipse for tilt calcs"),
  TILT_MIN("-999", "[m] semi-minor axis of ellipse for tilt calcs"),
  TILT_PA("-999", "[deg] position angle of ellipse for tilt calcs"),

  // Additional fits tags specific to ancillary fits files
  MAP_NAME("FillMeIn", "Map data type"),
  MAP_TYPE("FillMeIn", "Defines whether this is a global or local map"),
  OBJ_FILE("TEMPLATEENTRY", null),

  // keywords specific to facet mapping ancillary file
  OBJINDX("UNK", "OBJ indexed to OBJ_FILE"),
  GSDINDX("-999", "[unk] Ground sample distance of OBJINDX"),
  GSDINDXI("-999", "[unk] GSDINDX integer"),

  // return this enum when no match is found
  NOMATCH(null, "could not determine");

  // PIXPDEG(null,"pixels per degree","pixel/deg"),
  // PIX_SZ(null, "mean size of pixels at equator (meters)","m");

  private FitsValCom fitsValCom;

  private HeaderTag(String value, String comment) {
    this.fitsValCom = new FitsValCom(value, comment);
  }

  public String value() {
    return this.fitsValCom.getV();
  }

  public String comment() {
    return this.fitsValCom.getC();
  }

  /**
   * Contains all valid Fits keywords for this Enum. 'NOMATCH' is not a valid Fits keyword
   */
  public static final EnumSet<HeaderTag> fitsKeywords = EnumSet.range(HeaderTag.SIMPLE, HeaderTag.GSDINDXI);

  public static final EnumSet<HeaderTag> globalDTMFitsData = EnumSet.of(HeaderTag.CLAT, HeaderTag.CLON);

  /**
   * Return the HeaderTag associated with a given string. returns NOMATCH enum if no match found.
   *
   * @param value
   * @return
   */
  public static HeaderTag tagFromString(String value) {
    for (HeaderTag tagName : values()) {
      if (tagName.toString().equals(value)) {
        return tagName;
      }
    }
    return NOMATCH;
  }

  /**
   * Return the "Source Data Product" (SDP) string to be used in a product file naming convention.
   * The SDP string may not always be the same as the data source.
   *
   * For example, for ALTWG products, the dataSource could be "OLA" but the SDP string in the
   * filename is supposed to be "alt".
   *
   * @param dataSource
   * @return
   */
  public static String getSDP(String dataSource) {
    String sdp = dataSource;
    if (dataSource.equals("ola")) {
      sdp = "alt";
    }
    return sdp;
  }
}
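The lookup helpers at the bottom of the enum are plain linear scans and string mappings. A trimmed-down sketch of their behavior (the three-constant `HeaderTagSketch` enum and the `HeaderTagDemo` driver are illustrative; the real enum also carries a FitsValCom per constant):

```java
enum HeaderTagSketch {
  SIMPLE, CLON, CLAT, NOMATCH;

  // Same scan as HeaderTag.tagFromString: compare each constant's name to the
  // input string, falling back to NOMATCH when nothing matches.
  static HeaderTagSketch tagFromString(String value) {
    for (HeaderTagSketch t : values()) {
      if (t.toString().equals(value)) {
        return t;
      }
    }
    return NOMATCH;
  }

  // Same mapping as HeaderTag.getSDP: "ola" data maps to the "alt" SDP string,
  // everything else passes through unchanged.
  static String getSDP(String dataSource) {
    return dataSource.equals("ola") ? "alt" : dataSource;
  }
}

public class HeaderTagDemo {
  public static void main(String[] args) {
    System.out.println(HeaderTagSketch.tagFromString("CLAT"));  // CLAT
    System.out.println(HeaderTagSketch.tagFromString("FOO"));   // NOMATCH
    System.out.println(HeaderTagSketch.getSDP("ola"));          // alt
  }
}
```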
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 * (remainder of the standard MIT license text, identical to the header above)
 */
package terrasaur.fits;

import java.util.ArrayList;

@@ -13,157 +35,150 @@ import terrasaur.utils.DTMHeader;
 * HeaderCardFactory static methods. This is because the NFT MLN header updates are independent of
 * updates to the ALTWG product fits headers or the Map Formats headers. static methods in the
 * HeaderCardFactory
 *
 * @author espirrc1
 */
public class NFTmln extends DTMFits implements DTMHeader {

  // private FitsData fitsData;
  private boolean dataContained = false;

  public NFTmln(FitsHdr fitsHeader) {
    super(fitsHeader, FitsHeaderType.NFTMLN);
  }

  public List<HeaderCard> createFitsHeader(List<HeaderCard> planeList) throws HeaderCardException {

    List<HeaderCard> headerCards = new ArrayList<HeaderCard>();

    headerCards.addAll(getHeaderInfo("Header Information"));
    headerCards.addAll(getMissionInfo("Mission Information"));
    headerCards.addAll(getIDInfo("Observation Information"));
    // headerCards.addAll(getShapeSrcInfo());
    headerCards.addAll(getMapDataSrc("Data Source"));
    headerCards.addAll(getTimingInfo());
    headerCards.addAll(getProcInfo("Processing Information"));
    headerCards.addAll(getSpecificInfo("Product Specific Information"));
    headerCards.addAll(getPlaneInfo("", planeList));

    // end keyword
    headerCards.add(getEnd());

    return headerCards;
  }

  /**
   * Custom header block.
   *
   * @return
   * @throws HeaderCardException
   */
  @Override
  public List<HeaderCard> getHeaderInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.NPRDVERS));

    return headers;
  }

  /**
   * Fits header block containing information about the source data used for the shape model. This
   * method does not exist in the parent class.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getShapeSrcInfo() throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(new HeaderCard(COMMENT, "Shape Data Source", false));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));

    return headers;
  }

  /**
   * Custom source map data block.
   */
  @Override
  public List<HeaderCard> getMapDataSrc(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRC));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCV));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFTWARE));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));

    return headers;
  }

  /**
   * Custom product specific info data block
   *
   * @throws HeaderCardException
   */
  public List<HeaderCard> getSpecificInfo(String comment) throws HeaderCardException {

    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }

    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));
    headers.addAll(getCornerCards());
    headers.addAll(getCenterVec());
    headers.addAll(getUX());
    headers.addAll(getUY());
    headers.addAll(getUZ());
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSD));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.SIGMA));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SIG_DEF));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_1));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_2));
|
|
||||||
|
|
||||||
return headers;
|
List<HeaderCard> headerCards = new ArrayList<HeaderCard>();
|
||||||
|
|
||||||
}
|
headerCards.addAll(getHeaderInfo("Header Information"));
|
||||||
|
headerCards.addAll(getMissionInfo("Mission Information"));
|
||||||
|
headerCards.addAll(getIDInfo("Observation Information"));
|
||||||
|
// headerCards.addAll(getShapeSrcInfo());
|
||||||
|
headerCards.addAll(getMapDataSrc("Data Source"));
|
||||||
|
headerCards.addAll(getTimingInfo());
|
||||||
|
headerCards.addAll(getProcInfo("Processing Information"));
|
||||||
|
headerCards.addAll(getSpecificInfo("Product Specific Information"));
|
||||||
|
headerCards.addAll(getPlaneInfo("", planeList));
|
||||||
|
|
||||||
/**
|
// end keyword
|
||||||
* Fits header block containing information about when products were created. This method does not
|
headerCards.add(getEnd());
|
||||||
* exist in the parent class.
|
|
||||||
*
|
|
||||||
* @return
|
|
||||||
* @throws HeaderCardException
|
|
||||||
*/
|
|
||||||
public List<HeaderCard> getTimingInfo() throws HeaderCardException {
|
|
||||||
|
|
||||||
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
return headerCards;
|
||||||
headers.add(new HeaderCard(COMMENT, "Timing Information", false));
|
|
||||||
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCD));
|
|
||||||
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATENPRD));
|
|
||||||
|
|
||||||
return headers;
|
|
||||||
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Custom processing info block.
|
|
||||||
*/
|
|
||||||
public List<HeaderCard> getProcInfo(String comment) throws HeaderCardException {
|
|
||||||
|
|
||||||
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
|
||||||
if (comment.length() > 0) {
|
|
||||||
headers.add(new HeaderCard(COMMENT, comment, false));
|
|
||||||
}
|
}
|
||||||
headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODNAME));
|
|
||||||
headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODVERS));
|
|
||||||
headers.add(fitsHdr.getHeaderCard(HeaderTag.CREATOR));
|
|
||||||
|
|
||||||
return headers;
|
/**
|
||||||
|
* Custom header block.
|
||||||
|
*
|
||||||
|
* @return
|
||||||
|
* @throws HeaderCardException
|
||||||
|
*/
|
||||||
|
@Override
|
||||||
|
public List<HeaderCard> getHeaderInfo(String comment) throws HeaderCardException {
|
||||||
|
|
||||||
}
|
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
||||||
|
if (comment.length() > 0) {
|
||||||
|
headers.add(new HeaderCard(COMMENT, comment, false));
|
||||||
|
}
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.NPRDVERS));
|
||||||
|
|
||||||
|
return headers;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Fits header block containing information about the source data used for the shape model. This
|
||||||
|
* method does not exist in the parent class.
|
||||||
|
*
|
||||||
|
* @return
|
||||||
|
* @throws HeaderCardException
|
||||||
|
*/
|
||||||
|
public List<HeaderCard> getShapeSrcInfo() throws HeaderCardException {
|
||||||
|
|
||||||
|
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
||||||
|
headers.add(new HeaderCard(COMMENT, "Shape Data Source", false));
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));
|
||||||
|
|
||||||
|
return headers;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Custom source map data block.
|
||||||
|
*/
|
||||||
|
@Override
|
||||||
|
public List<HeaderCard> getMapDataSrc(String comment) throws HeaderCardException {
|
||||||
|
|
||||||
|
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
||||||
|
if (comment.length() > 0) {
|
||||||
|
headers.add(new HeaderCard(COMMENT, comment, false));
|
||||||
|
}
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRC));
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCV));
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFTWARE));
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
|
||||||
|
|
||||||
|
return headers;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Custom product specific info data block
|
||||||
|
*
|
||||||
|
* @throws HeaderCardException
|
||||||
|
*/
|
||||||
|
public List<HeaderCard> getSpecificInfo(String comment) throws HeaderCardException {
|
||||||
|
|
||||||
|
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
||||||
|
if (comment.length() > 0) {
|
||||||
|
headers.add(new HeaderCard(COMMENT, comment, false));
|
||||||
|
}
|
||||||
|
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));
|
||||||
|
headers.addAll(getCornerCards());
|
||||||
|
headers.addAll(getCenterVec());
|
||||||
|
headers.addAll(getUX());
|
||||||
|
headers.addAll(getUY());
|
||||||
|
headers.addAll(getUZ());
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSD));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.SIGMA));
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.SIG_DEF));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_1));
|
||||||
|
headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_2));
|
||||||
|
|
||||||
|
return headers;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Fits header block containing information about when products were created. This method does not
|
||||||
|
* exist in the parent class.
|
||||||
|
*
|
||||||
|
* @return
|
||||||
|
* @throws HeaderCardException
|
||||||
|
*/
|
||||||
|
public List<HeaderCard> getTimingInfo() throws HeaderCardException {
|
||||||
|
|
||||||
|
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
||||||
|
headers.add(new HeaderCard(COMMENT, "Timing Information", false));
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCD));
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATENPRD));
|
||||||
|
|
||||||
|
return headers;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Custom processing info block.
|
||||||
|
*/
|
||||||
|
public List<HeaderCard> getProcInfo(String comment) throws HeaderCardException {
|
||||||
|
|
||||||
|
List<HeaderCard> headers = new ArrayList<HeaderCard>();
|
||||||
|
if (comment.length() > 0) {
|
||||||
|
headers.add(new HeaderCard(COMMENT, comment, false));
|
||||||
|
}
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODNAME));
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODVERS));
|
||||||
|
headers.add(fitsHdr.getHeaderCard(HeaderTag.CREATOR));
|
||||||
|
|
||||||
|
return headers;
|
||||||
|
}
|
||||||
}
|
}
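The block-building methods above all share one shape: an optional COMMENT card naming the block, followed by that block's keyword cards, with `createFitsHeader` concatenating the blocks in a fixed order. A minimal standalone sketch of that pattern, using plain key/value pairs in place of nom.tam.fits `HeaderCard` objects (the `HeaderBlockDemo` class and its methods are illustrative only, not part of Terrasaur):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.ArrayList;
import java.util.List;
import java.util.Map.Entry;

public class HeaderBlockDemo {

  // Stand-in for a FITS header card: a bare keyword/value pair.
  static Entry<String, String> card(String key, String value) {
    return new SimpleEntry<>(key, value);
  }

  // Mirrors getMapDataSrc/getProcInfo: an optional COMMENT card naming the
  // block, followed by the block's keyword cards.
  static List<Entry<String, String>> block(String comment, List<Entry<String, String>> cards) {
    List<Entry<String, String>> out = new ArrayList<>();
    if (comment.length() > 0) {
      out.add(card("COMMENT", comment));
    }
    out.addAll(cards);
    return out;
  }

  public static void main(String[] args) {
    // Mirrors createFitsHeader: concatenate the blocks in a fixed order,
    // then terminate the header with an END card.
    List<Entry<String, String>> header = new ArrayList<>();
    header.addAll(block("Data Source", List.of(card("DATASRC", "spc"), card("SOFT_VER", "1.1.0"))));
    header.addAll(block("Processing Information", List.of(card("CREATOR", "demo"))));
    header.add(card("END", ""));
    for (Entry<String, String> c : header) {
      System.out.println(c.getKey() + " = " + c.getValue());
    }
  }
}
```

The real methods delegate card construction to `fitsHdr.getHeaderCard(...)`, which carries the value and comment already associated with each `HeaderTag`; only the block grouping and ordering live here.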
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

import java.io.File;
@@ -12,12 +34,12 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.TimeZone;
import nom.tam.fits.BasicHDU;
import nom.tam.fits.Fits;
import nom.tam.fits.FitsException;
import nom.tam.fits.HeaderCard;
import nom.tam.util.Cursor;
import org.apache.commons.io.FilenameUtils;
import terrasaur.enums.FitsHeaderType;
import terrasaur.enums.PlaneInfo;
import terrasaur.fits.FitsHdr.FitsHdrBuilder;
@@ -31,371 +53,360 @@ import terrasaur.utils.xml.AsciiFile;
 */
public class ProductFits {

  public static final String PLANE = "PLANE";
  public static final String COMMENT = "COMMENT";
  /**
   * Open the fits file and extract fits keywords which start with "PLANE". The convention is that
   * these are the keywords which describe the planes in the fits datacube.
   *
   * @param fitsFile
   * @return
   * @throws FitsException
   * @throws IOException
   */
  public static List<HeaderCard> getPlaneHeaderCards(String fitsFile)
      throws FitsException, IOException {
    Fits inFits = new Fits(fitsFile);
    List<HeaderCard> planeHeaders = getPlaneHeaderCards(inFits);
    return planeHeaders;
  }

  /**
   * Open the fits object and extract fits keywords which start with "PLANE". The convention is
   * that these are the keywords which describe the planes in the fits datacube.
   *
   * @param inFitsFile
   * @return
   * @throws FitsException
   * @throws IOException
   */
  public static List<HeaderCard> getPlaneHeaderCards(Fits inFitsFile)
      throws FitsException, IOException {
    BasicHDU inHdu = inFitsFile.getHDU(0);
    List<HeaderCard> planeHeaders = getPlaneHeaderCards(inHdu);
    return planeHeaders;
  }

  /**
   * Parse the HeaderDataUnit (HDU) and extract fits keywords which start with "PLANE". The
   * convention is that these are the keywords which describe the planes in the fits datacube.
   *
   * @param inHdu
   * @return
   */
  public static List<HeaderCard> getPlaneHeaderCards(BasicHDU inHdu) {
    List<HeaderCard> planeHeaders = new ArrayList<HeaderCard>();
    Cursor cursor = inHdu.getHeader().iterator();
    while (cursor.hasNext()) {
      HeaderCard hc = (HeaderCard) cursor.next();
      if (hc.getKey().startsWith(PLANE)) planeHeaders.add(hc);
    }
    return planeHeaders;
  }

  /**
   * Parse fits header and determine min/max latitude and longitude. For global fits files this
   * will just parse the keywords that directly contain the min/max lat/lon values. For regional
   * fits files it will parse the latlon corner keywords and determine the min/max lat/lon values.
   *
   * @param fitsFile
   * @return Map<String, Double> where the key is the .toString() of HeaderTags MINLON, MAXLON,
   *     MINLAT, MAXLAT.
   * @throws IOException
   * @throws FitsException
   */
  public static Map<String, Double> minMaxLLFromFits(File fitsFile)
      throws FitsException, IOException {

    System.out.println("Determining minmax lat lon from fits file:" + fitsFile.getAbsolutePath());

    // initialize output
    Map<String, Double> minMaxLL = new HashMap<String, Double>();

    Map<String, HeaderCard> fitsHeaders = FitsUtil.getFitsHeaderAsMap(fitsFile.getAbsolutePath());

    // check whether MINLON keyword exists.
    String keyword = HeaderTag.MINLON.toString();
    boolean failOnNull = false;
    HeaderCard thisCard = FitsUtil.getCard(fitsHeaders, keyword, failOnNull);
    if (thisCard == null) {

      // assume LatLon corner keywords exist
      // Map<String, HeaderCard> cornerCards = latLonCornersFromMap(fitsHeaders);

      // iterate through Lat corners and find min/max values
      double minLat = 999D;
      double maxLat = -999D;
      failOnNull = true;
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.LLCLAT.toString(), failOnNull);
      double thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLat = Math.min(minLat, thisValue);
      maxLat = Math.max(maxLat, thisValue);

      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.ULCLAT.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLat = Math.min(minLat, thisValue);
      maxLat = Math.max(maxLat, thisValue);

      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.URCLAT.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLat = Math.min(minLat, thisValue);
      maxLat = Math.max(maxLat, thisValue);

      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.LRCLAT.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLat = Math.min(minLat, thisValue);
      maxLat = Math.max(maxLat, thisValue);

      // double check center lat. For polar observations the center latitude could be
      // either the minimum or maximum latitude
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.CLAT.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLat = Math.min(minLat, thisValue);
      maxLat = Math.max(maxLat, thisValue);

      minMaxLL.put(HeaderTag.MINLAT.toString(), minLat);
      minMaxLL.put(HeaderTag.MAXLAT.toString(), maxLat);

      // iterate through Lon corners and find min/max values
      double minLon = 999D;
      double maxLon = -999D;
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.LLCLNG.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLon = Math.min(minLon, thisValue);
      maxLon = Math.max(maxLon, thisValue);

      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.ULCLNG.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLon = Math.min(minLon, thisValue);
      maxLon = Math.max(maxLon, thisValue);

      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.URCLNG.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLon = Math.min(minLon, thisValue);
      maxLon = Math.max(maxLon, thisValue);

      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.LRCLNG.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLon = Math.min(minLon, thisValue);
      maxLon = Math.max(maxLon, thisValue);

      // double check center lon
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.CLON.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLon = Math.min(minLon, thisValue);
      maxLon = Math.max(maxLon, thisValue);

      minMaxLL.put(HeaderTag.MINLON.toString(), minLon);
      minMaxLL.put(HeaderTag.MAXLON.toString(), maxLon);

    } else {

      // assume min/max lat/lon keywords exist

      // already queried for MINLON
      double thisVal = thisCard.getValue(Double.class, Double.NaN);
      minMaxLL.put(HeaderTag.MINLON.toString(), thisVal);

      // get MAXLON
      failOnNull = true;
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.MAXLON.toString(), failOnNull);
      thisVal = thisCard.getValue(Double.class, Double.NaN);
      minMaxLL.put(HeaderTag.MAXLON.toString(), thisVal);

      // get MINLAT
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.MINLAT.toString(), failOnNull);
      thisVal = thisCard.getValue(Double.class, Double.NaN);
      minMaxLL.put(HeaderTag.MINLAT.toString(), thisVal);

      // get MAXLAT
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.MAXLAT.toString(), failOnNull);
      thisVal = thisCard.getValue(Double.class, Double.NaN);
      minMaxLL.put(HeaderTag.MAXLAT.toString(), thisVal);
    }

    return minMaxLL;
  }

  /**
   * Parse and extract list of PlaneInfo enumerations that describe the planes contained in the
   * fits file. Returns an empty list if none of the planes match.
   *
   * @param fitsFile
   * @return
   * @throws IOException
   * @throws FitsException
   */
  public static List<PlaneInfo> planesFromFits(String fitsFile) throws FitsException, IOException {

    List<PlaneInfo> planes = new ArrayList<PlaneInfo>();

    // extract header cards
    List<HeaderCard> planeCards = getPlaneHeaderCards(fitsFile);

    /*
     * planeCards are assumed to follow the fits keyword naming convention of PLANEN, where N goes
     * from 0 to the number of planes in the fits file - 1. They correspond directly to the order
     * in which the data is stored in the data cube. For example, PLANE0 = "Latitude" indicates
     * that the 0th plane in the fits datacube contains the latitude values.
     */
    for (HeaderCard thisPlane : planeCards) {
      PlaneInfo thisPlaneInfo = PlaneInfo.keyVal2Plane(thisPlane.getValue());
      if (thisPlaneInfo != null) {
        planes.add(thisPlaneInfo);
      }
    }
    return planes;
  }

  /**
   * Save data to a Fits Datacube. Assumes hdrBuilder is pre-loaded with all the keyword values
   * that can be known prior to this method. An example of a keyword value that cannot be known
   * prior to this method is the ALTWG filename. This can only be created within this method by
   * using metadata contained in the FitsHdrBuilder. The map information in fitsData will also be
   * used to populate hdrBuilder if it is available. If it is not available then the method assumes
   * the information is already in hdrBuilder. Ex. if fitsData was populated by reading a fits file
   * then it may not have a vector to center or unit vectors describing the reference frame.
   * However, that information may be in the fits header, which should be captured by hdrBuilder.
   *
   * @param fitsData
   * @param planeList
   * @param outFname
   * @param hdrBuilder
   * @param hdrType
   * @param crossrefFile - if not null then method will create a cross-reference file. The
   *     cross-reference file allows the pipeline to cross reference an original filenaming
   *     convention with the mission specific one.
   * @throws FitsException
   * @throws IOException
   */
  public static void saveDataCubeFits(
      FitsData fitsData,
      List<PlaneInfo> planeList,
      String outFname,
      FitsHdrBuilder hdrBuilder,
      FitsHeaderType hdrType,
      File crossrefFile)
      throws FitsException, IOException {

    hdrBuilder.setByFitsData(fitsData);

    String fitsFname = outFname;

    if (crossrefFile != null) {
      // save outfile name in cross-reference file, for future reference
      String path = FilenameUtils.getFullPath(outFname);
      if (path.length() == 0) path = ".";
      String outBaseName =
          String.format("%s%s%s", path, File.pathSeparator, FilenameUtils.getBaseName(outFname));
      AsciiFile crfFile = new AsciiFile(crossrefFile.getAbsolutePath());
      crfFile.streamSToFile(outBaseName, 0);
      crfFile.closeFile();
    }

    // set date this product was produced
    hdrBuilder.setDateprod();

    DTMHeader fitsHeader = FitsHeaderFactory.getDTMHeader(hdrBuilder.build(), hdrType);
    fitsHeader.setData(fitsData);
    List<HeaderCard> headers = fitsHeader.createFitsHeader(PlaneInfo.planesToHeaderCard(planeList));

    System.out.println("saving to fits file:" + fitsFname);
    FitsUtil.saveFits(fitsData.getData(), fitsFname, headers);
  }

  /**
   * General method for saving a 3D double array with a fits header as defined in hdrBuilder.
   * Assumes the FitsHdrBuilder contains all the keywords that will be written to the fits header,
   * in the order that they should be written.
   *
   * @param dataCube
   * @param hdrBuilder
   * @throws IOException
   * @throws FitsException
   */
  public static void saveDataCubeFits(
      double[][][] dataCube,
      FitsHdrBuilder hdrBuilder,
      List<HeaderCard> planeHeaders,
      String outFile)
      throws FitsException, IOException {

    FitsHdr fitsHdr = hdrBuilder.build();
    List<HeaderCard> headers = new ArrayList<HeaderCard>();

    for (String keyword : fitsHdr.fitsKV.keySet()) {
      headers.add(fitsHdr.fitsKV.get(keyword));
    }

    saveDataCubeFits(dataCube, headers, planeHeaders, outFile);
  }

  /**
   * General method for writing a 3D double array to a fits file, with fits headers as defined in a
   * List<HeaderCard> and the plane keywords defining the planes of the dataCube in a separate
   * List<HeaderCard>.
   *
   * @param dataCube
   * @param headers
   * @param planeKeywords - can be null. If so then assumes headers contains all the keyword
   *     information to be written to the file.
   * @param outFname
   * @throws IOException
   * @throws FitsException
   */
  public static void saveDataCubeFits(
      double[][][] dataCube,
      List<HeaderCard> headers,
      List<HeaderCard> planeKeywords,
      String outFname)
      throws FitsException, IOException {

    // append planeHeaders
    if (!planeKeywords.isEmpty()) {
      headers.addAll(planeKeywords);
    }
    FitsUtil.saveFits(dataCube, outFname, headers);
  }

  /**
   * Save NFT MLN.
   *
   * @param fitsData
   * @param planeList
   * @param outfile
   * @param hdrBuilder
   * @param hdrType
   * @param crossrefFile
   * @throws FitsException
   * @throws IOException
   */
  public static void saveNftFits(
      FitsData fitsData,
      List<PlaneInfo> planeList,
      String outfile,
      FitsHdrBuilder hdrBuilder,
      FitsHeaderType hdrType,
      File crossrefFile)
      throws FitsException, IOException {

    String outNftFile = outfile;
    Path outPath = Paths.get(outfile);
    String nftFitsName = outPath.getFileName().toString();

    // need to replace the product name in the headers list. No need to change comments.
    hdrBuilder.setVbyHeaderTag(HeaderTag.PRODNAME, nftFitsName);

    // set date this product was produced. Uses NFT specific keyword
    DateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss");
    dateFormat.setTimeZone(TimeZone.getTimeZone("GMT"));
    Date date = new Date();
    hdrBuilder.setVbyHeaderTag(HeaderTag.DATENPRD, dateFormat.format(date));

    FitsHdr fitsHeader = hdrBuilder.build();

    DTMHeader nftFitsHeader = FitsHeaderFactory.getDTMHeader(fitsHeader, FitsHeaderType.NFTMLN);
    nftFitsHeader.setData(fitsData);

    List<HeaderCard> headers =
        nftFitsHeader.createFitsHeader(PlaneInfo.planesToHeaderCard(planeList));

    System.out.println("Saving to " + outNftFile);
    FitsUtil.saveFits(fitsData.getData(), outNftFile, headers);
  }
|
// already queried for MINLON
|
||||||
|
double thisVal = thisCard.getValue(Double.class, Double.NaN);
|
||||||
|
minMaxLL.put(HeaderTag.MINLON.toString(), thisVal);
|
||||||
|
|
||||||
|
// get MAXLON
|
||||||
|
failOnNull = true;
|
||||||
|
thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.MAXLON.toString(), failOnNull);
|
||||||
|
thisVal = thisCard.getValue(Double.class, Double.NaN);
|
||||||
|
minMaxLL.put(HeaderTag.MAXLON.toString(), thisVal);
|
||||||
|
|
||||||
|
// get MINLAT
|
||||||
|
thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.MINLAT.toString(), failOnNull);
|
||||||
|
thisVal = thisCard.getValue(Double.class, Double.NaN);
|
||||||
|
minMaxLL.put(HeaderTag.MINLAT.toString(), thisVal);
|
||||||
|
|
||||||
|
// get MAXLAT
|
||||||
|
thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.MAXLAT.toString(), failOnNull);
|
||||||
|
thisVal = thisCard.getValue(Double.class, Double.NaN);
|
||||||
|
minMaxLL.put(HeaderTag.MAXLAT.toString(), thisVal);
|
||||||
|
}
|
||||||
|
|
||||||
|
return minMaxLL;
|
||||||
|
}
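The corner scan above repeats the same get-card/min/max triple for each keyword. A minimal standalone sketch of that fold (hypothetical class and method names, not the FitsUtil API), including the center-latitude check that matters for polar observations:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Standalone sketch (not the Terrasaur API) of the corner-scan logic above:
// fold the four corner latitudes plus the center latitude into MINLAT/MAXLAT.
public class CornerMinMax {

    static Map<String, Double> minMaxLat(double llc, double ulc, double urc, double lrc, double center) {
        double minLat = 999D;   // sentinel above any valid latitude
        double maxLat = -999D;  // sentinel below any valid latitude
        for (double v : new double[] {llc, ulc, urc, lrc, center}) {
            minLat = Math.min(minLat, v);
            maxLat = Math.max(maxLat, v);
        }
        Map<String, Double> minMaxLL = new LinkedHashMap<>();
        minMaxLL.put("MINLAT", minLat);
        minMaxLL.put("MAXLAT", maxLat);
        return minMaxLL;
    }

    public static void main(String[] args) {
        // polar case: the center latitude exceeds every corner latitude
        Map<String, Double> m = minMaxLat(85.0, 86.0, 86.5, 85.5, 90.0);
        System.out.println(m.get("MINLAT") + " " + m.get("MAXLAT"));
    }
}
```

Without the center check, the polar case above would report MAXLAT = 86.5 instead of 90.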
  /**
   * Parse and extract list of PlaneInfo enumerations that describe the planes contained in the fits
   * file. Return an empty list if none of the planes match.
   *
   * @param fitsFile
   * @return
   * @throws IOException
   * @throws FitsException
   */
  public static List<PlaneInfo> planesFromFits(String fitsFile) throws FitsException, IOException {

    List<PlaneInfo> planes = new ArrayList<PlaneInfo>();

    // extract header cards
    List<HeaderCard> planeCards = getPlaneHeaderCards(fitsFile);

    /*
     * planeCards are assumed to follow the fits keyword naming convention of PLANEN, where N goes
     * from 0 to the number of planes in the fits file - 1. They correspond directly to the order in
     * which the data is stored in the data cube. For example, PLANE0 = "Latitude" indicates that
     * the 0th plane in the fits datacube contains the latitude values.
     */
    for (HeaderCard thisPlane : planeCards) {
      PlaneInfo thisPlaneInfo = PlaneInfo.keyVal2Plane(thisPlane.getValue());
      if (thisPlaneInfo != null) {
        planes.add(thisPlaneInfo);
      }
    }
    return planes;
  }

  /**
   * Save data to a Fits Datacube. Assumes hdrBuilder is pre-loaded with all the keyword values that
   * can be known prior to this method. An example of a keyword value that cannot be known prior to
   * this method is the ALTWG filename. This can only be created within this method by using
   * metadata contained in the FitsHdrBuilder. The map information in fitsData will also be used to
   * populate hdrBuilder if it is available. If it is not available then the method assumes the
   * information is already in hdrBuilder. Ex. if fitsData was populated by reading a fits file then
   * it may not have vector to center or unit vectors describing the reference frame. However, that
   * information may be in the fits header, which should be captured by hdrBuilder.
   *
   * @param fitsData
   * @param planeList
   * @param outFname
   * @param hdrBuilder
   * @param hdrType
   * @param crossrefFile - if not null then method will create a cross-reference file. The
   *        cross-reference file allows the pipeline to cross reference an original filenaming
   *        convention with the mission specific one.
   * @throws FitsException
   * @throws IOException
   */
  public static void saveDataCubeFits(
      FitsData fitsData,
      List<PlaneInfo> planeList,
      String outFname,
      FitsHdrBuilder hdrBuilder,
      FitsHeaderType hdrType,
      File crossrefFile)
      throws FitsException, IOException {

    hdrBuilder.setByFitsData(fitsData);

    String fitsFname = outFname;

    if (crossrefFile != null) {
      // save outfile name in cross-reference file, for future reference
      String path = FilenameUtils.getFullPath(outFname);
      if (path.length() == 0) path = ".";
      // note: File.separator (path component delimiter), not File.pathSeparator
      String outBaseName =
          String.format("%s%s%s", path, File.separator, FilenameUtils.getBaseName(outFname));
      AsciiFile crfFile = new AsciiFile(crossrefFile.getAbsolutePath());
      crfFile.streamSToFile(outBaseName, 0);
      crfFile.closeFile();
    }

    // set date this product was produced
    hdrBuilder.setDateprod();

    DTMHeader fitsHeader = FitsHeaderFactory.getDTMHeader(hdrBuilder.build(), hdrType);
    fitsHeader.setData(fitsData);
    List<HeaderCard> headers = fitsHeader.createFitsHeader(PlaneInfo.planesToHeaderCard(planeList));

    System.out.println("saving to fits file:" + fitsFname);
    FitsUtil.saveFits(fitsData.getData(), fitsFname, headers);
  }

  /**
   * General method for saving a 3D double array with a fits header as defined in hdrBuilder.
   * Assumes FitsHdrBuilder contains all the keywords that will be written to the fits header in the
   * order that they should be written.
   *
   * @param dataCube
   * @param hdrBuilder
   * @throws IOException
   * @throws FitsException
   */
  public static void saveDataCubeFits(
      double[][][] dataCube, FitsHdrBuilder hdrBuilder, List<HeaderCard> planeHeaders, String outFile)
      throws FitsException, IOException {

    FitsHdr fitsHdr = hdrBuilder.build();
    List<HeaderCard> headers = new ArrayList<HeaderCard>();

    for (String keyword : fitsHdr.fitsKV.keySet()) {
      headers.add(fitsHdr.fitsKV.get(keyword));
    }

    saveDataCubeFits(dataCube, headers, planeHeaders, outFile);
  }
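This overload writes one card per keyword in map-iteration order, so the builder's keyword map must preserve insertion order. A self-contained sketch of that contract (hypothetical HeaderOrder class, plain strings standing in for HeaderCard):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch (hypothetical class, not a Terrasaur type) of the ordering contract the
// overload above relies on: header cards are emitted in keyword-map iteration
// order, so the map must preserve insertion order (e.g. a LinkedHashMap).
public class HeaderOrder {

    static List<String> cardsInOrder(Map<String, String> fitsKV) {
        List<String> headers = new ArrayList<>();
        for (String keyword : fitsKV.keySet()) {
            headers.add(fitsKV.get(keyword));
        }
        return headers;
    }

    public static void main(String[] args) {
        Map<String, String> fitsKV = new LinkedHashMap<>();
        fitsKV.put("SIMPLE", "T");
        fitsKV.put("BITPIX", "-64");
        fitsKV.put("NAXIS", "3");
        System.out.println(cardsInOrder(fitsKV));  // values come out in insertion order
    }
}
```

A plain HashMap here would scramble the header card order, which matters for FITS files where keywords like SIMPLE and BITPIX must lead.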
  /**
   * General method for writing a 3D double array to a fits file, with fits headers as defined in
   * one List<HeaderCard> and plane keywords defining the planes of the dataCube in a separate
   * List<HeaderCard>.
   *
   * @param dataCube
   * @param headers
   * @param planeKeywords - can be null. If so then assumes headers contains all the keyword
   *        information to be written to the file.
   * @param outFname
   * @throws IOException
   * @throws FitsException
   */
  public static void saveDataCubeFits(
      double[][][] dataCube, List<HeaderCard> headers, List<HeaderCard> planeKeywords, String outFname)
      throws FitsException, IOException {

    // append planeHeaders (may be null per the contract above)
    if (planeKeywords != null && !planeKeywords.isEmpty()) {
      headers.addAll(planeKeywords);
    }
    FitsUtil.saveFits(dataCube, outFname, headers);
  }

  /**
   * Save NFT MLN.
   *
   * @param fitsData
   * @param planeList
   * @param outfile
   * @param hdrBuilder
   * @param hdrType
   * @param crossrefFile
   * @throws FitsException
   * @throws IOException
   */
  public static void saveNftFits(
      FitsData fitsData,
      List<PlaneInfo> planeList,
      String outfile,
      FitsHdrBuilder hdrBuilder,
      FitsHeaderType hdrType,
      File crossrefFile)
      throws FitsException, IOException {

    String outNftFile = outfile;
    Path outPath = Paths.get(outfile);
    String nftFitsName = outPath.getFileName().toString();

    // need to replace the product name in the headers list. No need to change comments.
    hdrBuilder.setVbyHeaderTag(HeaderTag.PRODNAME, nftFitsName);

    // set date this product was produced. Uses NFT specific keyword
    DateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss");
    dateFormat.setTimeZone(TimeZone.getTimeZone("GMT"));
    Date date = new Date();
    hdrBuilder.setVbyHeaderTag(HeaderTag.DATENPRD, dateFormat.format(date));

    FitsHdr fitsHeader = hdrBuilder.build();

    DTMHeader nftFitsHeader = FitsHeaderFactory.getDTMHeader(fitsHeader, FitsHeaderType.NFTMLN);
    nftFitsHeader.setData(fitsData);

    List<HeaderCard> headers = nftFitsHeader.createFitsHeader(PlaneInfo.planesToHeaderCard(planeList));

    System.out.println("Saving to " + outNftFile);
    FitsUtil.saveFits(fitsData.getData(), outNftFile, headers);
  }
}
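The PLANEN keyword convention documented in planesFromFits can be sketched standalone (hypothetical names; plain strings stand in for PlaneInfo and HeaderCard):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Standalone sketch (hypothetical names, not the Terrasaur PlaneInfo API) of the
// PLANEN keyword convention: PLANE0..PLANEn-1 name the planes of the data cube in
// storage order, and unrecognized values are skipped.
public class PlaneKeywords {

    static final List<String> KNOWN = List.of("Latitude", "Longitude", "Radius");

    static List<String> planesFromCards(Map<String, String> cards) {
        List<String> planes = new ArrayList<>();
        // walk PLANE0, PLANE1, ... until a keyword is missing
        for (int n = 0; cards.containsKey("PLANE" + n); n++) {
            String value = cards.get("PLANE" + n);
            if (KNOWN.contains(value)) {  // analogous to keyVal2Plane() returning non-null
                planes.add(value);
            }
        }
        return planes;
    }

    public static void main(String[] args) {
        Map<String, String> cards = new LinkedHashMap<>();
        cards.put("PLANE0", "Latitude");
        cards.put("PLANE1", "Longitude");
        cards.put("PLANE2", "Albedo");  // not a known plane; skipped
        System.out.println(planesFromCards(cards));
    }
}
```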
@@ -1,26 +1,47 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.fits;

public enum UnitDir {
  UX {
    public int getAxis() {
      return 1;
    }
  },

  UY {
    public int getAxis() {
      return 2;
    }
  },

  UZ {
    public int getAxis() {
      return 3;
    }
  };

  public abstract int getAxis();
}
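UnitDir uses the constant-specific-method enum idiom. A self-contained copy of the pattern, runnable outside the terrasaur.fits package:

```java
// Self-contained copy of the UnitDir pattern above for illustration: each enum
// constant overrides the abstract getAxis() to map unit directions to FITS axes.
public class UnitDirDemo {

    enum UnitDir {
        UX { public int getAxis() { return 1; } },
        UY { public int getAxis() { return 2; } },
        UZ { public int getAxis() { return 3; } };

        public abstract int getAxis();
    }

    public static void main(String[] args) {
        for (UnitDir d : UnitDir.values()) {
            System.out.println(d + " -> axis " + d.getAxis());
        }
    }
}
```

The abstract method forces every constant to supply an axis, so adding a new direction without an axis mapping fails at compile time.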
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.gui;

import java.net.URL;
@@ -14,118 +36,108 @@ import javafx.scene.control.ChoiceBox;
import javafx.scene.control.Label;
import javafx.scene.control.TextField;
import javafx.stage.Stage;
import spice.basic.SpiceException;
import terrasaur.apps.TranslateTime;
import terrasaur.utils.AppVersion;

public class TranslateTimeController implements Initializable {

  private TranslateTime tt;

  public TranslateTimeController(Stage stage) {
    this.tt = TranslateTimeFX.tt;
  }

  @Override
  public void initialize(URL location, ResourceBundle resources) {
    this.title.setText("Translate Time");
    this.version.setText(AppVersion.getVersionString());

    // populate SCLK menu
    NavigableSet<Integer> sclkIDs = new TreeSet<>(TranslateTimeFX.sclkIDs);
    this.sclkChoice.getItems().addAll(sclkIDs);
    this.sclkChoice.getSelectionModel().selectedIndexProperty().addListener(new ChangeListener<Number>() {

      @Override
      public void changed(ObservableValue<? extends Number> observable, Number oldValue, Number newValue) {
        tt.setSCLKKernel(sclkChoice.getItems().get((Integer) newValue));
      }
    });
    this.sclkChoice.getSelectionModel().select(0);

    try {
      String zTime = ZonedDateTime.now(ZoneOffset.UTC).toString().strip();
      // strip off the final "Z"
      this.tt.setUTC(zTime.substring(0, zTime.length() - 1));
      updateTime();
    } catch (SpiceException e) {
      e.printStackTrace();
    }
  }

  @FXML
  private Label title;

  @FXML
  private Label version;

  @FXML
  private TextField julianString;

  @FXML
  private void setJulian() throws NumberFormatException, SpiceException {
    if (julianString.getText().trim().length() > 0) tt.setJulianDate(Double.parseDouble(julianString.getText()));
    updateTime();
  }

  @FXML
  private ChoiceBox<Integer> sclkChoice;

  @FXML
  private TextField sclkString;

  @FXML
  private void setSCLK() throws SpiceException {
    if (sclkString.getText().trim().length() > 0) tt.setSCLK(sclkString.getText());
    updateTime();
  }

  @FXML
  private TextField tdbString;

  @FXML
  private void setTDB() throws SpiceException {
    if (tdbString.getText().trim().length() > 0) tt.setTDB(Double.parseDouble(tdbString.getText()));
    updateTime();
  }

  @FXML
  private TextField tdbCalendarString;

  @FXML
  private void setTDBCalendar() throws SpiceException {
    if (tdbCalendarString.getText().trim().length() > 0) tt.setTDBCalendarString(tdbCalendarString.getText());
    updateTime();
  }

  @FXML
  private TextField utcString;

  @FXML
  private Label utcLabel;

  @FXML
  private void setUTC() throws SpiceException {
    if (utcString.getText().trim().length() > 0) tt.setUTC(utcString.getText());
    updateTime();
  }

  private void updateTime() throws SpiceException {
    julianString.setText(tt.toJulian());
    sclkString.setText(tt.toSCLK().toString());
    tdbString.setText(String.format("%.6f", tt.toTDB().getTDBSeconds()));
    tdbCalendarString.setText(tt.toTDB().toString("YYYY-MM-DDTHR:MN:SC.### ::TDB"));
    utcString.setText(tt.toTDB().toUTCString("ISOC", 3));
    utcLabel.setText(String.format("UTC (DOY %s)", tt.toTDB().toString("DOY")));
  }
}
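The controller's initialize() seeds the GUI with the current UTC time by stripping the trailing "Z" zone designator before handing the string to the time translator. A standalone sketch of just that string handling (hypothetical class and method names):

```java
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

// Sketch of the timestamp handling in initialize(): a ZonedDateTime at
// ZoneOffset.UTC renders in ISO-8601 with a trailing "Z", which is stripped
// before the string is used as a plain UTC epoch.
public class UtcStrip {

    static String utcWithoutZone(ZonedDateTime t) {
        String zTime = t.toString().strip();
        // strip off the final "Z"
        return zTime.substring(0, zTime.length() - 1);
    }

    public static void main(String[] args) {
        ZonedDateTime now = ZonedDateTime.now(ZoneOffset.UTC);
        System.out.println(now + " -> " + utcWithoutZone(now));
    }
}
```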
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.gui;

import java.util.Collection;
@@ -11,34 +33,33 @@ import terrasaur.apps.TranslateTime;

public class TranslateTimeFX extends Application {

  // package private so it's visible from TranslateTimeController
  static TranslateTime tt;
  static Collection<Integer> sclkIDs;

  public static void setSCLKIDs(Collection<Integer> sclkIDs) {
    TranslateTimeFX.sclkIDs = sclkIDs;
  }

  public static void setTranslateTime(TranslateTime tt) {
    TranslateTimeFX.tt = tt;
  }

  public static void main(String[] args) {
    launch(args);
    Platform.exit();
  }

  @Override
  public void start(Stage stage) throws Exception {
    FXMLLoader loader = new FXMLLoader();
    loader.setLocation(getClass().getResource("/terrasaur/gui/TranslateTime.fxml"));
    TranslateTimeController controller = new TranslateTimeController(stage);
    loader.setController(controller);

    Parent root = loader.load();
    Scene scene = new Scene(root);
    stage.setScene(scene);
    stage.show();
  }
}
@@ -1,6 +1,27 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.smallBodyModel;

import picante.math.intervals.Interval;
import picante.math.intervals.UnwritableInterval;
import picante.math.vectorspace.UnwritableVectorIJK;
@@ -9,226 +30,227 @@ import picante.math.vectorspace.VectorIJK;
/**
 * The BoundingBox class is a data structure for storing the bounding box enclosing a
 * 3-dimensional box-shaped region.
 *
 * @author kahneg1
 * @version 1.0
 */
public class BoundingBox {
  private Interval xRange;
  private Interval yRange;
  private Interval zRange;

  /**
   * Return a BoundingBox with each range set to {@link Interval#ALL_DOUBLES}.
   */
  public BoundingBox() {
    this(Interval.ALL_DOUBLES, Interval.ALL_DOUBLES, Interval.ALL_DOUBLES);
  }

  /**
   * Return a bounding box with its X limits set to elements 0 and 1, Y limits set to elements 2 and
   * 3, and Z limits set to elements 4 and 5 of the input array.
   *
   * @param bounds
   */
  public BoundingBox(double[] bounds) {
    this(new Interval(bounds[0], bounds[1]), new Interval(bounds[2], bounds[3]),
        new Interval(bounds[4], bounds[5]));
  }

  /**
   * Return a BoundingBox with the supplied dimensions.
   *
   * @param xRange
   * @param yRange
   * @param zRange
   */
  public BoundingBox(UnwritableInterval xRange, UnwritableInterval yRange,
      UnwritableInterval zRange) {
    this.xRange = new Interval(xRange);
    this.yRange = new Interval(yRange);
    this.zRange = new Interval(zRange);
  }

  /**
   * Return a new BoundingBox with the same center as this instance but with each side's length
   * scaled by scale.
   *
   * @param scale
   * @return
   */
  public BoundingBox getScaledBoundingBox(double scale) {
    double center = xRange.getMiddle();
    double length = xRange.getLength() * scale;
    Interval newXRange = new Interval(center - length / 2, center + length / 2);

    center = yRange.getMiddle();
    length = yRange.getLength() * scale;
    Interval newYRange = new Interval(center - length / 2, center + length / 2);

    center = zRange.getMiddle();
    length = zRange.getLength() * scale;
    Interval newZRange = new Interval(center - length / 2, center + length / 2);

    return new BoundingBox(newXRange, newYRange, newZRange);
  }

  /** Set the X dimension */
  public void setXRange(UnwritableInterval range) {
    this.xRange.setTo(range);
  }

  /** Set the Y dimension */
  public void setYRange(UnwritableInterval range) {
    this.yRange.setTo(range);
  }

  /** Set the Z dimension */
  public void setZRange(UnwritableInterval range) {
    this.zRange.setTo(range);
  }

  /** Return the X dimension */
  public UnwritableInterval getXRange() {
    return new UnwritableInterval(xRange);
  }

  /** Return the Y dimension */
  public UnwritableInterval getYRange() {
    return new UnwritableInterval(yRange);
  }

  /** Return the Z dimension */
  public UnwritableInterval getZRange() {
    return new UnwritableInterval(zRange);
  }

  /**
   * Expand the bounding box to contain the supplied point if it is not already contained.
   *
   * @param point
   */
  public void update(UnwritableVectorIJK point) {
    xRange.set(Math.min(point.getI(), xRange.getBegin()), Math.max(point.getI(), xRange.getEnd()));
    yRange.set(Math.min(point.getJ(), yRange.getBegin()), Math.max(point.getJ(), yRange.getEnd()));
    zRange.set(Math.min(point.getK(), zRange.getBegin()), Math.max(point.getK(), zRange.getEnd()));
  }

  /**
   * Check if this instance intersects the other. The intersection test in each dimension is
   * {@link UnwritableInterval#closedIntersects(UnwritableInterval)}.
   *
   * @param other
   * @return
   */
  public boolean intersects(BoundingBox other) {
    return xRange.closedIntersects(other.xRange) && yRange.closedIntersects(other.yRange)
        && zRange.closedIntersects(other.zRange);
  }

  /**
   * @return the length of the largest side of the box
   */
  public double getLargestSide() {
    return Math.max(xRange.getLength(), Math.max(yRange.getLength(), zRange.getLength()));
  }

  /**
   * @return the center of the box
   */
  public VectorIJK getCenterPoint() {
    return new VectorIJK(xRange.getMiddle(), yRange.getMiddle(), zRange.getMiddle());
  }

  /**
   * @return the diagonal length of the box
   */
  public double getDiagonalLength() {
    VectorIJK vec = new VectorIJK(xRange.getLength(), yRange.getLength(), zRange.getLength());
    return vec.getLength();
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Returns whether or not the given point is contained in the box. The contains test in each
|
|
||||||
* dimension is {@link UnwritableInterval#closedContains(double)}.
|
|
||||||
*
|
|
||||||
* @param pt
|
|
||||||
* @return
|
|
||||||
*/
|
|
||||||
public boolean contains(UnwritableVectorIJK pt) {
|
|
||||||
return xRange.closedContains(pt.getI()) && yRange.closedContains(pt.getJ())
|
|
||||||
&& zRange.closedContains(pt.getK());
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Returns whether or not the given point is contained in the box. The contains test in each
|
|
||||||
* dimension is {@link UnwritableInterval#closedContains(double)}.
|
|
||||||
*
|
|
||||||
* @param pt
|
|
||||||
* @return
|
|
||||||
*/
|
|
||||||
public boolean contains(double[] pt) {
|
|
||||||
return contains(new VectorIJK(pt));
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* expand the X range by length on each side.
|
|
||||||
*
|
|
||||||
* @param length
|
|
||||||
*/
|
|
||||||
public void expandX(double length) {
|
|
||||||
xRange.set(xRange.getBegin() - length, xRange.getEnd() + length);
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* expand the Y range by length on each side.
|
|
||||||
*
|
|
||||||
* @param length
|
|
||||||
*/
|
|
||||||
public void expandY(double length) {
|
|
||||||
yRange.set(yRange.getBegin() - length, yRange.getEnd() + length);
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* expand the Z range by length on each side.
|
|
||||||
*
|
|
||||||
* @param length
|
|
||||||
*/
|
|
||||||
public void expandZ(double length) {
|
|
||||||
zRange.set(zRange.getBegin() - length, zRange.getEnd() + length);
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Increase the size of the bounding box by adding (subtracting) to each side a specified
|
|
||||||
* percentage of the bounding box diagonal
|
|
||||||
*
|
|
||||||
* @param fractionOfDiagonalLength must be positive
|
|
||||||
*/
|
|
||||||
public void increaseSize(double fractionOfDiagonalLength) {
|
|
||||||
if (fractionOfDiagonalLength > 0.0) {
|
|
||||||
double size = fractionOfDiagonalLength * getDiagonalLength();
|
|
||||||
xRange.set(xRange.getBegin() - size, xRange.getEnd() + size);
|
|
||||||
yRange.set(yRange.getBegin() - size, yRange.getEnd() + size);
|
|
||||||
zRange.set(zRange.getBegin() - size, zRange.getEnd() + size);
|
|
||||||
}
|
}
|
||||||
}
|
|
||||||
|
|
||||||
@Override
|
/**
|
||||||
public String toString() {
|
* Return a bounding box with its X limits set to elements 0 and 1, Y limits set to elements 2 and
|
||||||
return "xmin: " + xRange.getBegin() + " xmax: " + xRange.getEnd() + " ymin: "
|
* 3, and Z limits set to elements 4 and 5 of the input array.
|
||||||
+ yRange.getBegin() + " ymax: " + yRange.getEnd() + " zmin: " + zRange.getBegin()
|
*
|
||||||
+ " zmax: " + zRange.getEnd();
|
* @param bounds
|
||||||
}
|
*/
|
||||||
|
public BoundingBox(double[] bounds) {
|
||||||
|
this(
|
||||||
|
new Interval(bounds[0], bounds[1]),
|
||||||
|
new Interval(bounds[2], bounds[3]),
|
||||||
|
new Interval(bounds[4], bounds[5]));
|
||||||
|
}
|
||||||
|
|
||||||
@Override
|
/**
|
||||||
public boolean equals(Object obj) {
|
* Return a BoundingBox with the supplied dimensions.
|
||||||
BoundingBox b = (BoundingBox) obj;
|
*
|
||||||
return xRange.equals(b.xRange) && yRange.equals(b.yRange) && zRange.equals(b.zRange);
|
* @param xRange
|
||||||
}
|
* @param yRange
|
||||||
|
* @param zRange
|
||||||
|
*/
|
||||||
|
public BoundingBox(UnwritableInterval xRange, UnwritableInterval yRange, UnwritableInterval zRange) {
|
||||||
|
this.xRange = new Interval(xRange);
|
||||||
|
this.yRange = new Interval(yRange);
|
||||||
|
this.zRange = new Interval(zRange);
|
||||||
|
}
|
||||||
|
|
||||||
@Override
|
/**
|
||||||
public Object clone() {
|
* Return a new BoundingBox with the same center as this instance but with each side's length
|
||||||
return new BoundingBox(xRange, yRange, zRange);
|
* scaled by scale.
|
||||||
}
|
*
|
||||||
|
* @param scale
|
||||||
|
* @return
|
||||||
|
*/
|
||||||
|
public BoundingBox getScaledBoundingBox(double scale) {
|
||||||
|
double center = xRange.getMiddle();
|
||||||
|
double length = xRange.getLength() * scale;
|
||||||
|
Interval newXRange = new Interval(center - length / 2, center + length / 2);
|
||||||
|
|
||||||
|
center = yRange.getMiddle();
|
||||||
|
length = yRange.getLength() * scale;
|
||||||
|
Interval newYRange = new Interval(center - length / 2, center + length / 2);
|
||||||
|
|
||||||
|
center = zRange.getMiddle();
|
||||||
|
length = zRange.getLength() * scale;
|
||||||
|
Interval newZRange = new Interval(center - length / 2, center + length / 2);
|
||||||
|
|
||||||
|
return new BoundingBox(newXRange, newYRange, newZRange);
|
||||||
|
}
|
||||||
|
|
||||||
|
/** Set the X dimension */
|
||||||
|
public void setXRange(UnwritableInterval range) {
|
||||||
|
this.xRange.setTo(range);
|
||||||
|
}
|
||||||
|
|
||||||
|
/** Set the Y dimension */
|
||||||
|
public void setYRange(UnwritableInterval range) {
|
||||||
|
this.yRange.setTo(range);
|
||||||
|
}
|
||||||
|
|
||||||
|
/** Set the Z dimension */
|
||||||
|
public void setZRange(UnwritableInterval range) {
|
||||||
|
this.zRange.setTo(range);
|
||||||
|
}
|
||||||
|
|
||||||
|
/** Return the X dimension */
|
||||||
|
public UnwritableInterval getXRange() {
|
||||||
|
return new UnwritableInterval(xRange);
|
||||||
|
}
|
||||||
|
|
||||||
|
/** Return the Y dimension */
|
||||||
|
public UnwritableInterval getYRange() {
|
||||||
|
return new UnwritableInterval(yRange);
|
||||||
|
}
|
||||||
|
|
||||||
|
/** Return the Z dimension */
|
||||||
|
public UnwritableInterval getZRange() {
|
||||||
|
return new UnwritableInterval(zRange);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Expand the bounding box to contain the supplied point if it is not already contained.
|
||||||
|
*
|
||||||
|
* @param point
|
||||||
|
*/
|
||||||
|
public void update(UnwritableVectorIJK point) {
|
||||||
|
xRange.set(Math.min(point.getI(), xRange.getBegin()), Math.max(point.getI(), xRange.getEnd()));
|
||||||
|
yRange.set(Math.min(point.getJ(), yRange.getBegin()), Math.max(point.getJ(), yRange.getEnd()));
|
||||||
|
zRange.set(Math.min(point.getK(), zRange.getBegin()), Math.max(point.getK(), zRange.getEnd()));
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Check if this instance intersects the other. The intersection test in each dimension is
|
||||||
|
* {@link UnwritableInterval#closedIntersects(UnwritableInterval)}.
|
||||||
|
*
|
||||||
|
* @param other
|
||||||
|
* @return
|
||||||
|
*/
|
||||||
|
public boolean intersects(BoundingBox other) {
|
||||||
|
return xRange.closedIntersects(other.xRange)
|
||||||
|
&& yRange.closedIntersects(other.yRange)
|
||||||
|
&& zRange.closedIntersects(other.zRange);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* @return the length of the largest side of the box
|
||||||
|
*/
|
||||||
|
public double getLargestSide() {
|
||||||
|
return Math.max(xRange.getLength(), Math.max(yRange.getLength(), zRange.getLength()));
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* @return the center of the box
|
||||||
|
*/
|
||||||
|
public VectorIJK getCenterPoint() {
|
||||||
|
return new VectorIJK(xRange.getMiddle(), yRange.getMiddle(), zRange.getMiddle());
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* @return the diagonal length of the box
|
||||||
|
*/
|
||||||
|
public double getDiagonalLength() {
|
||||||
|
VectorIJK vec = new VectorIJK(xRange.getLength(), yRange.getLength(), zRange.getLength());
|
||||||
|
return vec.getLength();
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Returns whether or not the given point is contained in the box. The contains test in each
|
||||||
|
* dimension is {@link UnwritableInterval#closedContains(double)}.
|
||||||
|
*
|
||||||
|
* @param pt
|
||||||
|
* @return
|
||||||
|
*/
|
||||||
|
public boolean contains(UnwritableVectorIJK pt) {
|
||||||
|
return xRange.closedContains(pt.getI()) && yRange.closedContains(pt.getJ()) && zRange.closedContains(pt.getK());
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Returns whether or not the given point is contained in the box. The contains test in each
|
||||||
|
* dimension is {@link UnwritableInterval#closedContains(double)}.
|
||||||
|
*
|
||||||
|
* @param pt
|
||||||
|
* @return
|
||||||
|
*/
|
||||||
|
public boolean contains(double[] pt) {
|
||||||
|
return contains(new VectorIJK(pt));
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* expand the X range by length on each side.
|
||||||
|
*
|
||||||
|
* @param length
|
||||||
|
*/
|
||||||
|
public void expandX(double length) {
|
||||||
|
xRange.set(xRange.getBegin() - length, xRange.getEnd() + length);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* expand the Y range by length on each side.
|
||||||
|
*
|
||||||
|
* @param length
|
||||||
|
*/
|
||||||
|
public void expandY(double length) {
|
||||||
|
yRange.set(yRange.getBegin() - length, yRange.getEnd() + length);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* expand the Z range by length on each side.
|
||||||
|
*
|
||||||
|
* @param length
|
||||||
|
*/
|
||||||
|
public void expandZ(double length) {
|
||||||
|
zRange.set(zRange.getBegin() - length, zRange.getEnd() + length);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Increase the size of the bounding box by adding (subtracting) to each side a specified
|
||||||
|
* percentage of the bounding box diagonal
|
||||||
|
*
|
||||||
|
* @param fractionOfDiagonalLength must be positive
|
||||||
|
*/
|
||||||
|
public void increaseSize(double fractionOfDiagonalLength) {
|
||||||
|
if (fractionOfDiagonalLength > 0.0) {
|
||||||
|
double size = fractionOfDiagonalLength * getDiagonalLength();
|
||||||
|
xRange.set(xRange.getBegin() - size, xRange.getEnd() + size);
|
||||||
|
yRange.set(yRange.getBegin() - size, yRange.getEnd() + size);
|
||||||
|
zRange.set(zRange.getBegin() - size, zRange.getEnd() + size);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
@Override
|
||||||
|
public String toString() {
|
||||||
|
return "xmin: " + xRange.getBegin() + " xmax: " + xRange.getEnd() + " ymin: "
|
||||||
|
+ yRange.getBegin() + " ymax: " + yRange.getEnd() + " zmin: " + zRange.getBegin()
|
||||||
|
+ " zmax: " + zRange.getEnd();
|
||||||
|
}
|
||||||
|
|
||||||
|
@Override
|
||||||
|
public boolean equals(Object obj) {
|
||||||
|
BoundingBox b = (BoundingBox) obj;
|
||||||
|
return xRange.equals(b.xRange) && yRange.equals(b.yRange) && zRange.equals(b.zRange);
|
||||||
|
}
|
||||||
|
|
||||||
|
@Override
|
||||||
|
public Object clone() {
|
||||||
|
return new BoundingBox(xRange, yRange, zRange);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
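The `intersects` method above reduces box intersection to three per-axis closed-interval tests. A minimal self-contained sketch of that per-axis logic, with plain doubles standing in for picante's `Interval` classes (the `IntervalSketch` class and its method names are illustrative, not part of Terrasaur):

```java
// Sketch of the per-axis closed-interval test behind BoundingBox.intersects.
// IntervalSketch is a hypothetical stand-in; picante's Interval API is not reimplemented here.
public class IntervalSketch {

    // Closed intervals [aBegin, aEnd] and [bBegin, bEnd] intersect when each
    // begins no later than the other ends; touching endpoints count as intersecting.
    static boolean closedIntersects(double aBegin, double aEnd, double bBegin, double bEnd) {
        return aBegin <= bEnd && bBegin <= aEnd;
    }

    // A 3-D box intersection is the conjunction of the three axis tests,
    // mirroring the structure of BoundingBox.intersects.
    static boolean boxesIntersect(double[] aMin, double[] aMax, double[] bMin, double[] bMax) {
        for (int axis = 0; axis < 3; axis++) {
            if (!closedIntersects(aMin[axis], aMax[axis], bMin[axis], bMax[axis])) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        double[] aMin = {0, 0, 0}, aMax = {1, 1, 1};
        double[] bMin = {1, 0, 0}, bMax = {2, 1, 1}; // boxes touch at the x = 1 face
        System.out.println(boxesIntersect(aMin, aMax, bMin, bMax)); // true: the closed test counts touching faces
    }
}
```

This is also why `expandX`/`increaseSize` are useful before an intersection query: growing each range slightly turns a near-miss under floating-point error into a hit.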
LocalModelCollection.java:

/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.smallBodyModel;

import com.google.common.collect.HashMultimap;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.NavigableMap;
import java.util.Set;
import java.util.TreeMap;
import org.apache.commons.io.FilenameUtils;
import org.apache.commons.math3.geometry.euclidean.threed.Rotation;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import picante.math.vectorspace.VectorIJK;
import terrasaur.utils.PolyDataStatistics;
import terrasaur.utils.PolyDataUtil;
import terrasaur.utils.math.MathConversions;
import terrasaur.utils.tessellation.FibonacciSphere;
import vtk.vtkPoints;
import vtk.vtkPolyData;

/**
 * Hold a collection of local shape models
 *
 * @author Hari.Nair@jhuapl.edu
 */
public class LocalModelCollection {

  private static final Logger logger = LogManager.getLogger();

  static class LocalModel {
    final Vector3D center;
    final String filename;

    LocalModel(Vector3D center, String filename) {
      this.center = center;
      this.filename = filename;
    }
  }

  // key is tile index, value is collection of localModels
  private HashMultimap<Long, LocalModel> localModelMap;
  private FibonacciSphere tessellation;
  // key is filename, value is shape model
  private ThreadLocal<Map<String, SmallBodyModel>> localModels;

  private Double scale;
  private Rotation rotation;

  /**
   *
   * @param numTiles total number of tiles to use for sorting local models
   */
  public LocalModelCollection(int numTiles, Double scale, Rotation rotation) {
    localModelMap = HashMultimap.create();
    tessellation = new FibonacciSphere(numTiles);
    localModels = new ThreadLocal<>();
    this.scale = scale;
    this.rotation = rotation;
  }

  /**
   * Add a shape model. Models are stored in a map with the center as the key, so an entry with the
   * same center as an existing entry will overwrite the existing one.
   *
   * @param latInRadians
   * @param lonInRadians
   * @param filename
   */
  public void addModel(double latInRadians, double lonInRadians, String filename) {
    Vector3D center = new Vector3D(lonInRadians, latInRadians);
    long tileIndex = tessellation.getTileIndex(MathConversions.toVectorIJK(center));
    LocalModel lm = new LocalModel(center, filename);
    localModelMap.put(tileIndex, lm);
  }

  /**
   * Return a shape model containing the supplied point. This may not be the only shape model that
   * contains this point, just the first one found.
   *
   * @param point
   * @return
   */
  public SmallBodyModel get(Vector3D point) {
    List<String> filenames = getFilenames(point);
    if (filenames.size() == 0) logger.error("No shape models cover {}", point.toString());
    double[] origin = new double[3];
    double[] intersectPoint = new double[3];
    for (String filename : filenames) {
      SmallBodyModel sbm = load(filename);
      long intersect = sbm.computeRayIntersection(origin, point.toArray(), 2 * point.getNorm(), intersectPoint);

      if (intersect != -1) return sbm;
    }
    logger.debug(
        "Failed intersection for lon {}, lat {}",
        Math.toDegrees(point.getAlpha()),
        Math.toDegrees(point.getDelta()));
    return null;
  }

  /**
   * Load a shape model after applying any rotation or scaling
   *
   * @param filename
   * @return
   */
  private SmallBodyModel load(String filename) {
    Map<String, SmallBodyModel> map = localModels.get();
    if (map == null) {
      map = new HashMap<>();
      localModels.set(map);
    }
    SmallBodyModel sbm = map.get(filename);
    if (sbm == null) {

      logger.debug("Thread {}: Loading {}", Thread.currentThread().getId(), FilenameUtils.getBaseName(filename));
      try {
        vtkPolyData model = PolyDataUtil.loadShapeModel(filename);
        if (scale != null || rotation != null) {
          PolyDataStatistics stats = new PolyDataStatistics(model);
          Vector3D center = new Vector3D(stats.getCentroid());

          vtkPoints points = model.GetPoints();
          for (int i = 0; i < points.GetNumberOfPoints(); i++) {
            Vector3D thisPoint = new Vector3D(points.GetPoint(i));
            if (scale != null)
              thisPoint = thisPoint.subtract(center).scalarMultiply(scale).add(center);
            if (rotation != null)
              thisPoint = rotation.applyTo(thisPoint.subtract(center)).add(center);
            points.SetPoint(i, thisPoint.toArray());
          }
        }

        sbm = new SmallBodyModel(model);
      } catch (Exception e) {
        logger.error(e.getLocalizedMessage());
      }

      map.put(filename, sbm);
    }
    return map.get(filename);
  }

  /**
   * Return the local model with the closest center to point
   *
   * @param point
   * @return null if no models have been loaded
   */
  private List<String> getFilenames(Vector3D point) {
    VectorIJK ijk = MathConversions.toVectorIJK(point);

    // A sorted map of tiles by distance
    NavigableMap<Double, Integer> distanceMap = tessellation.getDistanceMap(ijk);

    List<String> smallBodyModels = new ArrayList<>();
    for (Double dist : distanceMap.keySet()) {

      // A set of local models with centers in this tile
      Set<LocalModel> localModelSet = localModelMap.get((long) distanceMap.get(dist));

      if (localModelSet.size() > 0) {
        NavigableMap<Double, LocalModel> localDistanceMap = new TreeMap<>();
        for (LocalModel localModel : localModelSet) {
          double thisDist = Vector3D.angle(localModel.center, point);
          localDistanceMap.put(thisDist, localModel);
        }
        // add all local models with centers within PI/4 of point
        for (double localDist : localDistanceMap.headMap(Math.PI / 4, true).keySet()) {
          smallBodyModels.add(localDistanceMap.get(localDist).filename);
        }
      }
    }

    return smallBodyModels;
  }
}
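The candidate selection in `getFilenames` above sorts local models by angular distance into a `TreeMap` and keeps everything within π/4 of the query point via the inclusive `headMap` view. A self-contained sketch of just that selection step (the class name, model filenames, and angles here are made up for illustration):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.NavigableMap;
import java.util.TreeMap;

// Sketch of the headMap-based selection used by LocalModelCollection.getFilenames:
// candidates sorted by angular distance, then truncated at PI/4 (inclusive).
public class NearestModelSketch {

    static List<String> withinQuarterPi(NavigableMap<Double, String> byAngularDistance) {
        // headMap(PI/4, true) is the sorted view of entries with key <= PI/4
        return new ArrayList<>(byAngularDistance.headMap(Math.PI / 4, true).values());
    }

    public static void main(String[] args) {
        NavigableMap<Double, String> byDist = new TreeMap<>();
        byDist.put(0.10, "tileA.obj"); // hypothetical model files and distances in radians
        byDist.put(0.70, "tileB.obj");
        byDist.put(1.20, "tileC.obj"); // beyond PI/4 (~0.785), excluded
        System.out.println(withinQuarterPi(byDist)); // [tileA.obj, tileB.obj]
    }
}
```

Note that, as in the original, keying the `TreeMap` on distance silently drops a candidate whose distance exactly ties another; with real model centers that is vanishingly unlikely.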
SBMTStructure.java:

/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.smallBodyModel;

import java.awt.Color;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.immutables.value.Value;
import terrasaur.smallBodyModel.ImmutableSBMTStructure.Builder;

/**
 *
 * <pre>
 * # SBMT Structure File
 * # type,point
 * # ------------------------------------------------------------------------------
 * # File consists of a list of structures on each line.
 * #
 * # Each line is defined by 17 columns with the following:
 * # &lt;id&gt; &lt;name&gt; &lt;centerXYZ[3]&gt; &lt;centerLLR[3]&gt; &lt;coloringValue[4]&gt; &lt;diameter&gt; &lt;flattening&gt; &lt;regularAngle&gt; &lt;colorRGB&gt; &lt;label&gt;*
 * #
 * # id: Id of the structure
 * # name: Name of the structure
 * # centerXYZ[3]: 3 columns that define the structure center in 3D space
 * # centerLLR[3]: 3 columns that define the structure center in lat,lon,radius
 * # coloringValue[4]: 4 columns that define the ellipse "standard" colorings. The
 * # colorings are: slope (NA), elevation (NA), acceleration (NA), potential (NA)
 * # diameter: Diameter of (semimajor) axis of ellipse
 * # flattening: Flattening factor of ellipse. Range: [0.0, 1.0]
 * # regularAngle: Angle between the semimajor axis and the line of longitude
 * # as projected onto the surface
 * # colorRGB: 1 column (of RGB values [0, 255] separated by commas with no
 * # spaces). This column appears as a single textual column.
 * # label: Label of the structure
 * #
 * #
 * # Please note the following:
 * # - Each line is composed of columns separated by a tab character.
 * # - Blank lines or lines that start with '#' are ignored.
 * # - Angle units: degrees
 * # - Length units: kilometers
 * </pre>
 *
 * @author Hari.Nair@jhuapl.edu
 */
@Value.Immutable
public abstract class SBMTStructure {

  abstract int id();

  abstract String name();

  public abstract Vector3D centerXYZ();

  abstract String slopeColoring();

  abstract String elevationColoring();

  abstract String accelerationColoring();

  abstract String potentialColoring();

  abstract double diameter();

  abstract double flattening();

  abstract double regularAngle();

  abstract Color rgb();

  abstract String label();

  @Override
  public String toString() {
    StringBuilder sb = new StringBuilder();
    sb.append(String.format("%d\t", id()));
    sb.append(String.format("%s\t", name()));
sb.append(String.format("%.16f\t", centerXYZ().getX()));
|
sb.append(String.format("%.16f\t", centerXYZ().getX()));
|
||||||
sb.append(String.format("%.16f\t", centerXYZ().getY()));
|
sb.append(String.format("%.16f\t", centerXYZ().getY()));
|
||||||
sb.append(String.format("%.16f\t", centerXYZ().getZ()));
|
sb.append(String.format("%.16f\t", centerXYZ().getZ()));
|
||||||
sb.append(String.format("%.16f\t", Math.toDegrees(centerXYZ().getDelta())));
|
sb.append(String.format("%.16f\t", Math.toDegrees(centerXYZ().getDelta())));
|
||||||
sb.append(String.format("%.16f\t", Math.toDegrees(centerXYZ().getAlpha())));
|
sb.append(String.format("%.16f\t", Math.toDegrees(centerXYZ().getAlpha())));
|
||||||
sb.append(String.format("%.16f\t", centerXYZ().getNorm()));
|
sb.append(String.format("%.16f\t", centerXYZ().getNorm()));
|
||||||
sb.append(String.format("%s\t", slopeColoring()));
|
sb.append(String.format("%s\t", slopeColoring()));
|
||||||
sb.append(String.format("%s\t", elevationColoring()));
|
sb.append(String.format("%s\t", elevationColoring()));
|
||||||
sb.append(String.format("%s\t", accelerationColoring()));
|
sb.append(String.format("%s\t", accelerationColoring()));
|
||||||
sb.append(String.format("%s\t", potentialColoring()));
|
sb.append(String.format("%s\t", potentialColoring()));
|
||||||
sb.append(String.format("%f\t", diameter()));
|
sb.append(String.format("%f\t", diameter()));
|
||||||
sb.append(String.format("%f\t", flattening()));
|
sb.append(String.format("%f\t", flattening()));
|
||||||
sb.append(String.format("%f\t", regularAngle()));
|
sb.append(String.format("%f\t", regularAngle()));
|
||||||
sb.append(String.format("%d,%d,%d\t", rgb().getRed(), rgb().getGreen(), rgb().getBlue()));
|
sb.append(String.format("%d,%d,%d\t", rgb().getRed(), rgb().getGreen(), rgb().getBlue()));
|
||||||
sb.append(label());
|
sb.append(label());
|
||||||
return sb.toString();
|
return sb.toString();
|
||||||
}
|
}
|
||||||
|
|
||||||
public static SBMTStructure fromString(String line) {
|
|
||||||
String[] parts = line.split("\\s+");
|
|
||||||
int id = Integer.parseInt(parts[0]);
|
|
||||||
String name = parts[1];
|
|
||||||
Vector3D centerXYZ = new Vector3D(Double.parseDouble(parts[2]), Double.parseDouble(parts[3]),
|
|
||||||
Double.parseDouble(parts[4]));
|
|
||||||
String slopeColoring = parts[8];
|
|
||||||
String elevationColoring = parts[9];
|
|
||||||
String accelerationColoring = parts[10];
|
|
||||||
String potentialColoring = parts[11];
|
|
||||||
double diameter = Double.parseDouble(parts[12]);
|
|
||||||
double flattening = Double.parseDouble(parts[13]);
|
|
||||||
double regularAngle = Double.parseDouble(parts[14]);
|
|
||||||
String[] colorParts = parts[15].split(",");
|
|
||||||
Color rgb = new Color(Integer.parseInt(colorParts[0]), Integer.parseInt(colorParts[1]),
|
|
||||||
Integer.parseInt(colorParts[2]));
|
|
||||||
String label = parts[16];
|
|
||||||
|
|
||||||
Builder builder = ImmutableSBMTStructure.builder();
|
|
||||||
builder.id(id);
|
|
||||||
builder.name(name);
|
|
||||||
builder.centerXYZ(centerXYZ);
|
|
||||||
builder.slopeColoring(slopeColoring);
|
|
||||||
builder.elevationColoring(elevationColoring);
|
|
||||||
builder.accelerationColoring(accelerationColoring);
|
|
||||||
builder.potentialColoring(potentialColoring);
|
|
||||||
builder.diameter(diameter);
|
|
||||||
builder.flattening(flattening);
|
|
||||||
builder.regularAngle(regularAngle);
|
|
||||||
builder.rgb(rgb);
|
|
||||||
builder.label(label);
|
|
||||||
return builder.build();
|
|
||||||
}
|
|
||||||
|
|
||||||
|
public static SBMTStructure fromString(String line) {
|
||||||
|
String[] parts = line.split("\\s+");
|
||||||
|
int id = Integer.parseInt(parts[0]);
|
||||||
|
String name = parts[1];
|
||||||
|
Vector3D centerXYZ =
|
||||||
|
new Vector3D(Double.parseDouble(parts[2]), Double.parseDouble(parts[3]), Double.parseDouble(parts[4]));
|
||||||
|
String slopeColoring = parts[8];
|
||||||
|
String elevationColoring = parts[9];
|
||||||
|
String accelerationColoring = parts[10];
|
||||||
|
String potentialColoring = parts[11];
|
||||||
|
double diameter = Double.parseDouble(parts[12]);
|
||||||
|
double flattening = Double.parseDouble(parts[13]);
|
||||||
|
double regularAngle = Double.parseDouble(parts[14]);
|
||||||
|
String[] colorParts = parts[15].split(",");
|
||||||
|
Color rgb = new Color(
|
||||||
|
Integer.parseInt(colorParts[0]), Integer.parseInt(colorParts[1]), Integer.parseInt(colorParts[2]));
|
||||||
|
String label = parts[16];
|
||||||
|
|
||||||
|
Builder builder = ImmutableSBMTStructure.builder();
|
||||||
|
builder.id(id);
|
||||||
|
builder.name(name);
|
||||||
|
builder.centerXYZ(centerXYZ);
|
||||||
|
builder.slopeColoring(slopeColoring);
|
||||||
|
builder.elevationColoring(elevationColoring);
|
||||||
|
builder.accelerationColoring(accelerationColoring);
|
||||||
|
builder.potentialColoring(potentialColoring);
|
||||||
|
builder.diameter(diameter);
|
||||||
|
builder.flattening(flattening);
|
||||||
|
builder.regularAngle(regularAngle);
|
||||||
|
builder.rgb(rgb);
|
||||||
|
builder.label(label);
|
||||||
|
return builder.build();
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
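The tab-separated record layout documented in the SBMTStructure javadoc can be exercised with a small standalone sketch. Everything below is hypothetical illustration, not Terrasaur code: the class name, helper method, and sample field values are made up, and only the column order (id, name, centerXYZ[3], centerLLR[3], coloringValue[4], diameter, flattening, regularAngle, colorRGB, label) comes from the documentation above.

```java
import java.awt.Color;

// Hypothetical sketch: build and parse one structure record following the
// documented 17-column, tab-separated layout. Sample values are invented.
public class StructureLineSketch {

  // Split a record into its whitespace-separated columns.
  static String[] splitRecord(String line) {
    return line.split("\\s+");
  }

  public static void main(String[] args) {
    String line = String.join("\t",
        "1", "crater1",          // id, name
        "1.0", "2.0", "3.0",     // centerXYZ
        "30.0", "45.0", "3.74",  // centerLLR (lat, lon, radius)
        "NA", "NA", "NA", "NA",  // slope, elevation, acceleration, potential colorings
        "0.5", "0.0", "0.0",     // diameter, flattening, regularAngle
        "255,0,0",               // colorRGB: one textual column, no spaces
        "crater1");              // label

    String[] parts = splitRecord(line);
    int id = Integer.parseInt(parts[0]);
    double diameter = Double.parseDouble(parts[12]);
    String[] colorParts = parts[15].split(",");
    Color rgb = new Color(Integer.parseInt(colorParts[0]),
        Integer.parseInt(colorParts[1]), Integer.parseInt(colorParts[2]));

    System.out.println(id + " " + parts[1] + " " + diameter + " " + rgb.getRed());
  }
}
```

Note that splitting on whitespace, as `fromString` does, assumes the name and label columns contain no embedded spaces.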
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
package terrasaur.smallBodyModel;

import java.util.ArrayList;

@@ -10,175 +32,172 @@ import vtk.vtkPolyData;
/**
 * This class is used to subdivide the bounding box of a shape model into a contiguous grid of 3D
 * cubes (sort of like voxels).
 *
 * @author kahneg1
 * @version 1.0
 */
public class SmallBodyCubes {

  private static final Logger logger = LogManager.getLogger(SmallBodyCubes.class);

  private BoundingBox boundingBox;
  private ArrayList<BoundingBox> allCubes = new ArrayList<BoundingBox>();
  private final double cubeSize;
  private final double buffer;

  /**
   * Create a cube set structure for the given model, where each cube has side <tt>cubeSize</tt> and
   * <tt>buffer</tt> is added to all sides of the bounding box of the model. Cubes that do not
   * intersect the asteroid at all are removed.
   *
   * @param smallBodyPolyData
   * @param cubeSize
   * @param buffer
   */
  public SmallBodyCubes(vtkPolyData smallBodyPolyData, double cubeSize, double buffer, boolean removeEmptyCubes) {
    this.cubeSize = cubeSize;
    this.buffer = buffer;

    initialize(smallBodyPolyData);

    if (removeEmptyCubes) removeEmptyCubes(smallBodyPolyData);
  }

  private void initialize(vtkPolyData smallBodyPolyData) {
    smallBodyPolyData.ComputeBounds();
    double[] bounds = smallBodyPolyData.GetBounds();
    boundingBox = new BoundingBox(
        new Interval(bounds[0] - buffer, bounds[1] + buffer),
        new Interval(bounds[2] - buffer, bounds[3] + buffer),
        new Interval(bounds[4] - buffer, bounds[5] + buffer));

    int numCubesX = (int) Math.ceil(boundingBox.getXRange().getLength() / cubeSize);
    int numCubesY = (int) Math.ceil(boundingBox.getYRange().getLength() / cubeSize);
    int numCubesZ = (int) Math.ceil(boundingBox.getZRange().getLength() / cubeSize);

    for (int k = 0; k < numCubesZ; ++k) {
      double zmin = boundingBox.getZRange().getBegin() + k * cubeSize;
      double zmax = boundingBox.getZRange().getBegin() + (k + 1) * cubeSize;
      for (int j = 0; j < numCubesY; ++j) {
        double ymin = boundingBox.getYRange().getBegin() + j * cubeSize;
        double ymax = boundingBox.getYRange().getBegin() + (j + 1) * cubeSize;
        for (int i = 0; i < numCubesX; ++i) {
          double xmin = boundingBox.getXRange().getBegin() + i * cubeSize;
          double xmax = boundingBox.getXRange().getBegin() + (i + 1) * cubeSize;
          BoundingBox bb = new BoundingBox(
              new Interval(xmin, xmax), new Interval(ymin, ymax), new Interval(zmin, zmax));
          allCubes.add(bb);
        }
      }
    }
  }

  private void removeEmptyCubes(vtkPolyData smallBodyPolyData) {
    logger.info("total cubes before reduction = {}", allCubes.size());

    // Remove from allCubes all cubes that do not intersect the asteroid
    // long t0 = System.currentTimeMillis();
    TreeSet<Integer> intersectingCubes = getIntersectingCubes(smallBodyPolyData);
    // System.out.println("Time elapsed: " +
    // ((double)System.currentTimeMillis()-t0)/1000.0);

    ArrayList<BoundingBox> tmpCubes = new ArrayList<BoundingBox>();
    for (Integer i : intersectingCubes) {
      tmpCubes.add(allCubes.get(i));
    }
    allCubes = tmpCubes;

    logger.info("finished initializing cubes, total = {}", allCubes.size());
  }

  public BoundingBox getCube(int cubeId) {
    return allCubes.get(cubeId);
  }

  /**
   * Get all the cubes that intersect with <tt>polydata</tt>
   *
   * @param polydata
   * @return
   */
  public TreeSet<Integer> getIntersectingCubes(vtkPolyData polydata) {
    TreeSet<Integer> cubeIds = new TreeSet<Integer>();

    // Iterate through each cube and check if it intersects
    // with the bounding box of any of the polygons of the polydata
    BoundingBox polydataBB = new BoundingBox(polydata.GetBounds());

    long numberPolygons = polydata.GetNumberOfCells();

    // Store all the bounding boxes of all the individual polygons in an
    // array first since the call to GetCellBounds is very slow.
    double[] cellBounds = new double[6];
    ArrayList<BoundingBox> polyCellsBB = new ArrayList<BoundingBox>();
    for (int j = 0; j < numberPolygons; ++j) {
      polydata.GetCellBounds(j, cellBounds);
      polyCellsBB.add(new BoundingBox(cellBounds));
    }

    int numberCubes = allCubes.size();
    for (int i = 0; i < numberCubes; ++i) {
      // Before checking each polygon individually, first see if the
      // polydata as a whole intersects the cube
      BoundingBox cube = getCube(i);
      if (cube.intersects(polydataBB)) {
        for (int j = 0; j < numberPolygons; ++j) {
          BoundingBox bb = polyCellsBB.get(j);
          if (cube.intersects(bb)) {
            cubeIds.add(i);
            break;
          }
        }
      }
    }

    return cubeIds;
  }

  /**
   * Get all the cubes that intersect with BoundingBox <tt>bb</tt>
   *
   * @param bb
   * @return
   */
  public TreeSet<Integer> getIntersectingCubes(BoundingBox bb) {
    TreeSet<Integer> cubeIds = new TreeSet<Integer>();

    int numberCubes = allCubes.size();
    for (int i = 0; i < numberCubes; ++i) {
      BoundingBox cube = getCube(i);
      if (cube.intersects(bb)) {
        cubeIds.add(i);
      }
    }

    return cubeIds;
  }

  /**
   * Get the id of the cube containing <tt>point</tt>
   *
   * @param point
   * @return
   */
  public int getCubeId(double[] point) {
    if (!boundingBox.contains(point)) return -1;

    int numberCubes = allCubes.size();
    for (int i = 0; i < numberCubes; ++i) {
      BoundingBox cube = getCube(i);
      if (cube.contains(point)) return i;
    }

    // If we reach here something is wrong
    System.err.println("Error: could not find cube");

    return -1;
  }
}
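Because `SmallBodyCubes` lays its cubes out on a regular grid, the containing cube can in principle be found arithmetically instead of by the linear scan in `getCubeId`. The standalone sketch below illustrates that indexing math only; the class and method names are hypothetical, and the shortcut stops being valid once empty cubes are removed, since removal renumbers the surviving cubes (which is presumably why the real class scans).

```java
// Hypothetical sketch: direct O(1) cube indexing on a regular 3D grid,
// matching the k-outer, j-middle, i-inner construction order used by
// SmallBodyCubes.initialize(). Not valid after empty cubes are removed.
public class CubeGridSketch {

  // Flattened index of the grid cell containing (x, y, z), or -1 if outside.
  static int cubeId(double x, double y, double z,
                    double xmin, double ymin, double zmin,
                    double cubeSize, int nx, int ny, int nz) {
    int i = (int) Math.floor((x - xmin) / cubeSize);
    int j = (int) Math.floor((y - ymin) / cubeSize);
    int k = (int) Math.floor((z - zmin) / cubeSize);
    if (i < 0 || i >= nx || j < 0 || j >= ny || k < 0 || k >= nz) return -1;
    return (k * ny + j) * nx + i;
  }

  public static void main(String[] args) {
    // A 4x4x4 grid of unit cubes with one corner at the origin
    System.out.println(cubeId(0.5, 0.5, 0.5, 0, 0, 0, 1.0, 4, 4, 4)); // 0
    System.out.println(cubeId(2.5, 1.5, 3.5, 0, 0, 0, 1.0, 4, 4, 4)); // 54
    System.out.println(cubeId(5.0, 0.0, 0.0, 0, 0, 0, 1.0, 4, 4, 4)); // -1
  }
}
```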
File diff suppressed because it is too large
@@ -1,3 +1,25 @@
/*
 * The MIT License
 * Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */
/**
 * Classes to work with a shape model
 */
Some files were not shown because too many files have changed in this diff